Backup articles for students
There Was No Big Bang Singularity, Ethan Siegel, Forbes, 7/27/2018
Almost everyone has heard the story of the Big Bang. But if you ask anyone, from a layperson to a cosmologist, to finish the following sentence, “In the beginning, there was…” you’ll get a slew of different answers. One of the most common ones is “a singularity,” which refers to an instant where all the matter and energy in the Universe was concentrated into a single point. The temperatures, densities, and energies of the Universe would be arbitrarily, infinitely large, and could even coincide with the birth of time and space itself.
But this picture isn’t just wrong, it’s nearly 40 years out of date! We are absolutely certain there was no singularity associated with the hot Big Bang, and there may not have even been a birth to space and time at all. Here’s what we know and how we know it.
When we look out at the Universe today, we see that it’s full of galaxies in all directions at a wide variety of distances. On average, we also find that the more distant a galaxy is, the faster it appears to be receding from us. This isn’t due to the actual motions of the individual galaxies through space, though; it’s due to the fact that the fabric of space itself is expanding.
This was a prediction that was first teased out of General Relativity in 1922 by Alexander Friedmann, and was observationally confirmed by the work of Edwin Hubble and others in the 1920s. It means that, as time goes on, the matter within it spreads out and becomes less dense, since the volume of the Universe increases. It also means that, if we look to the past, the Universe was denser, hotter, and more uniform.
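The relationship is linear: recession velocity equals a constant times distance, v = H₀d. As a rough numerical sketch (the value H₀ ≈ 70 km/s per megaparsec is an assumed round figure, not one quoted in this article):

```python
# Hubble's law: recession velocity grows linearly with distance, v = H0 * d.
H0 = 70.0  # assumed Hubble constant, km/s per megaparsec (a typical modern value)

def recession_velocity(d_mpc):
    """Recession velocity in km/s for a galaxy d_mpc megaparsecs away."""
    return H0 * d_mpc

# A galaxy twice as far away recedes twice as fast:
v_near = recession_velocity(100.0)  # 7000 km/s
v_far = recession_velocity(200.0)   # 14000 km/s
```

The linearity is the signature of uniform expansion of space itself: every observer, in every galaxy, measures the same proportionality.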
If you were to extrapolate back farther and farther in time, you’d begin to notice a few major changes to the Universe. In particular:
- you’d come to an era where gravitation hasn’t had enough time to pull matter into large enough clumps to have stars and galaxies,
- you’d come to a place where the Universe was so hot you couldn’t form neutral atoms,
- and then where even atomic nuclei were blasted apart,
- where matter-antimatter pairs would spontaneously form,
- and where individual protons and neutrons would be dissociated into quarks and gluons.
Each step represents the Universe when it was younger, smaller, denser, and hotter. Eventually, if you kept on extrapolating, you’d see those densities and temperatures rise to infinite values, as all the matter and energy in the Universe was contained within a single point: a singularity.
The hot Big Bang, as it was first conceived, wasn’t just a hot, dense, expanding state, but represented an instant where the laws of physics break down. It was the birth of space and time: a way to get the entire Universe to spontaneously pop into existence. It was the ultimate act of creation: the singularity associated with the Big Bang.
Yet, if this were correct, and the Universe had achieved arbitrarily high temperatures in the past, there would be a number of clear signatures of this we could observe today. There would be temperature fluctuations in the Big Bang’s leftover glow that would have tremendously large amplitudes. The fluctuations that we see would be limited by the speed of light; they would only appear on scales of the cosmic horizon and smaller. There would be leftover, high-energy cosmic relics from earlier times, like magnetic monopoles.
And yet, the temperature fluctuations are only 1-part-in-30,000, thousands of times smaller than a singular Big Bang predicts. Super-horizon fluctuations are real, robustly confirmed by both WMAP and Planck. And the constraints on magnetic monopoles and other ultra-high-energy relics are incredibly tight. These missing signatures have a huge implication: the Universe never reached these arbitrarily large temperatures.
Instead, there must have been a cutoff. We cannot extrapolate back arbitrarily far, to a hot-and-dense state that reaches whatever energies we can dream of. There’s a limit to how far we can go and still validly describe our Universe.
In the early 1980s, it was theorized that, before our Universe was hot, dense, expanding, cooling, and full of matter and radiation, it was inflating. A phase of cosmic inflation would mean the Universe was:
- filled with energy inherent to space itself,
- which causes a rapid, exponential expansion,
- that stretches the Universe flat,
- gives it the same properties everywhere,
- with small-amplitude quantum fluctuations,
- that get stretched to all scales (even super-horizon ones),
and then inflation comes to an end.
When it does, it converts that energy, which was previously inherent to space itself, into matter and radiation, which leads to the hot Big Bang. It doesn’t lead to an arbitrarily hot Big Bang, however, but rather to one that achieved a maximum temperature at least hundreds of times smaller than the scale at which a singularity could emerge. In other words, it leads to a hot Big Bang that arises from an inflationary state, not a singularity.
The information that exists in our observable Universe, that we can access and measure, only corresponds to the final ~10⁻³³ seconds of inflation, and everything that came after. If you want to ask the question of how long inflation lasted, we simply have no idea. It lasted at least a little bit longer than 10⁻³³ seconds, but whether it lasted a little longer, a lot longer, or for an infinite amount of time is not only unknown, but unknowable.
So what happened to start inflation off? There’s a tremendous amount of research and speculation about it, but nobody knows. There is no evidence we can point to; no observations we can make; no experiments we can perform. Some people (wrongly) say something akin to:
Well, we had a Big Bang singularity give rise to the hot, dense, expanding Universe before we knew about inflation, and inflation just represents an intermediate step. Therefore, it goes: singularity, inflation, and then the hot Big Bang.
There are even some very famous graphics put out by top cosmologists that illustrate this picture. But that doesn’t mean this is right.
In fact, there are very good reasons to believe that this isn’t right! One thing that we can mathematically demonstrate, in fact, is that it’s impossible for an inflating state to arise from a singularity.
Here’s why: space expands at an exponential rate during inflation. Think about how an exponential works: after a certain amount of time goes by, the Universe doubles in size. Wait twice as long, and it doubles twice, making it four times as large. Wait three times as long, it doubles thrice, making it eight times as large. And if you wait 10 or 100 times as long, those doublings make the Universe 2¹⁰ or 2¹⁰⁰ times as large.
Which means if we go backwards in time by that same amount, or twice, or thrice, or 10 or 100 times, the Universe would be smaller, but would never reach a size of 0. Respectively, it would be half, a quarter, an eighth, 2⁻¹⁰, or 2⁻¹⁰⁰ times its original size. But no matter how far back you go, you never achieve a singularity.
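The doubling argument above can be checked with a few lines of arithmetic; the doubling time here is an arbitrary unit, chosen purely for illustration:

```python
# During inflation the scale factor doubles every fixed interval:
# a(t) = a0 * 2**(t / t_double). Running the clock backwards by N doubling
# times shrinks the Universe by 2**N, but the size stays strictly positive
# no matter how large N gets.
a0 = 1.0  # size "now", in arbitrary units

def size(n_doublings):
    """Scale factor after n_doublings doubling times (negative = into the past)."""
    return a0 * 2.0 ** n_doublings

forward = [size(n) for n in (1, 2, 3, 10, 100)]    # 2x, 4x, 8x, 2**10, 2**100 larger
backward = [size(-n) for n in (1, 2, 3, 10, 100)]  # 1/2, 1/4, 1/8, 2**-10, 2**-100
```

Even after 100 doublings backwards the size is astonishingly tiny, yet still nonzero: an exponential never passes through zero.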
There is a theorem, famous among cosmologists, showing that an inflationary state is past-timelike-incomplete. What this means, explicitly, is that if you have any particles that exist in an inflating Universe, they will eventually meet if you extrapolate back in time.
This doesn’t, however, mean that there must have been a singularity, but rather that inflation doesn’t describe everything that occurred in the history of the Universe, like its birth. We also know, for example, that inflation cannot arise from a singular state, because an inflating region must always begin from a finite size.
Every time you see a diagram, an article, or a story talking about the “big bang singularity” or any sort of big bang/singularity existing before inflation, know that you’re dealing with an outdated method of thinking.
The idea of a Big Bang singularity went out the window as soon as we realized we had a different state — that of cosmic inflation — preceding and setting up the early, hot-and-dense state of the Big Bang.
There may have been a singularity at the very beginning of space and time, with inflation arising after that, but there’s no guarantee. In science, there are the things we can test, measure, predict, and confirm or refute, like an inflationary state giving rise to a hot Big Bang. Everything else? It’s nothing more than speculation.
Many researchers believe that physics will not be complete until it can explain not just the behaviour of space and time, but where these entities come from.
Zeeya Merali, Nature, 28 August 2013
“Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.
This ‘holographic principle’ is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account for the existence of space and time. Neither does string theory, which describes elementary threads of energy.
Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense ‘singularity’ at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists’ efforts for generations.
“All our experiences tell us we shouldn’t have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.
Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts.
NIK SPENCER/NATURE; Panel 4 adapted from Budd, T. & Loll, R. Phys. Rev. D 88, 024015 (2013)
Gravity as thermodynamics
One of the most obvious questions to ask is whether this endeavour is a fool’s errand. Where is the evidence that there actually is anything more fundamental than space and time?
A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.
In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it were hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.
A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking’s 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy.
But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole’s entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
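The entropy–area relationship has a standard explicit form, the Bekenstein–Hawking formula S = k_B c³A / (4Għ); it is not quoted in the article, but evaluating it for a solar-mass black hole (the constants and mass below are assumed round values) makes the area scaling concrete:

```python
import math

# Physical constants (SI, rounded)
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8         # speed of light, m/s
hbar = 1.055e-34    # reduced Planck constant, J s
M_sun = 1.989e30    # solar mass, kg

# Schwarzschild radius and horizon area for a solar-mass black hole
r_s = 2 * G * M_sun / c**2   # ~3 km
A = 4 * math.pi * r_s**2     # horizon area, m^2

# Bekenstein-Hawking entropy in units of Boltzmann's constant:
# it scales with the horizon AREA, not with the volume inside.
S_over_kB = c**3 * A / (4 * G * hbar)   # ~1e77
```

A solar-mass hole has a horizon only about 3 km in radius, yet an entropy of roughly 10⁷⁷ in units of Boltzmann’s constant, all of it accounted for by that two-dimensional surface.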
In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny ‘black-hole horizon’ that also obeys the entropy–area relationship. From that, he found, the mathematics yielded Einstein’s equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time [1].
“This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.
In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed [2] that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton’s law of gravitational attraction.
And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed [3] that Einstein’s equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe’s expansion.
Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10−35 metres, or some 20 orders of magnitude smaller than a proton.
But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly.
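The kind of delay described here is often modelled, to first order, as growing linearly with photon energy and with distance travelled, Δt ≈ (E/E_Planck) × (D/c). This linear ansatz is an illustration added here, not the specific analysis the article reports:

```python
# Toy model of Planck-scale dispersion: the delay grows linearly with
# photon energy and distance. This linear form is one commonly used
# phenomenological ansatz, not a prediction of any specific theory.
E_PLANCK_GEV = 1.22e19       # Planck energy, GeV
SECONDS_PER_GLY = 3.156e16   # light-travel time of one billion light-years, s

def delay_seconds(photon_energy_gev, distance_gly):
    """First-order energy-dependent delay relative to low-energy photons."""
    return (photon_energy_gev / E_PLANCK_GEV) * distance_gly * SECONDS_PER_GLY

# A 10 GeV gamma-ray photon from a burst 1 billion light-years away would
# lag behind its low-energy companions by a few hundredths of a second.
dt = delay_seconds(10.0, 1.0)   # ~0.026 s
```

A few hundredths of a second is small, but gamma-ray bursts switch on sharply enough that such a lag is, in principle, measurable, which is why these sources are the favoured test bed.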
Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found [4] hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”
Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed [5] a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature http://doi.org/njf; 2012).
Loop quantum gravity
Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?
One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider’s web of strands that carry information about the quantized areas and volumes of the regions they pass through [6]. The individual strands of the web must eventually join their ends to form loops — hence the theory’s name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.
Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.
One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein’s equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.
In 2006, Ashtekar and his colleagues reported [7] a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein’s equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.
This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported [8] a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe’s massive expansion in its infant moments.
Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.
Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn’t make sense to ask, ‘What’s the temperature of a single atom?’ You need a collection for the concept to have meaning.”
In the late 1980s, Sorkin used this framework to estimate [9] the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here’s a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”
Causal dynamical triangulations
That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.
The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation’s gluing rules allowed them to combine freely. The result was a series of bizarre ‘universes’ that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.
But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own [10].
Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.
Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published [11] his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.
Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.
In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are ‘entangled’ — meaning that measurements made on one inevitably affect the other [12]. He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.
Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”
Nature 500, 516–519 (29 August 2013) doi:10.1038/500516a
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)
This lesson is copyright © Michael Richmond. This work is licensed under a Creative Commons License.
The Big Bang Model
Let’s review the observational evidence:
- Distance/velocity relationship: distant galaxies are moving away from us, with speeds which increase linearly with distance
- Chemistry: the universe is almost entirely hydrogen and helium, in a mixture of roughly 12 H atoms to 1 He atom
- Cosmic Microwave Background: no matter where we look in the universe, we see radio waves which look like those radiated by a blackbody at about 2.7 degrees above absolute zero. There are tiny (one part in 10,000) variations in the brightness of this radiation on scales of a degree or so
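Wien’s displacement law, λ_peak = b/T, connects that 2.7-degree temperature to the “microwave” in the name; the law is not stated in this lesson, but it is the standard blackbody result:

```python
# Wien's displacement law: a blackbody at temperature T (kelvin) emits most
# strongly at wavelength b / T.
WIEN_B = 2.898e-3   # Wien's displacement constant, metre-kelvins

def peak_wavelength_m(temperature_k):
    """Peak emission wavelength, in metres, of a blackbody at temperature_k."""
    return WIEN_B / temperature_k

# The 2.7 K background peaks near 1 mm: squarely in the microwave band.
lam_cmb = peak_wavelength_m(2.725)      # ~1.06e-3 m
# A 3000 K gas, by contrast, peaks near 1 micrometre (near-infrared).
lam_hot = peak_wavelength_m(3000.0)     # ~9.7e-7 m
```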
Is there any way to tie all these pieces of data together? Yes! One model which can explain them all is called the Big Bang model. The name was coined by astronomer Fred Hoyle, who didn’t like the theory and tried to make it sound silly.
Fundamentals of the Big Bang Model
The Big Bang is built upon three main tenets:
- the universe used to be very hot
- the universe used to be very dense
- the universe is expanding (which is why it isn’t so hot or dense anymore)
Note that the basic Big Bang Model does NOT say anything about the following questions:
- will the universe collapse again, or expand forever?
- is space curved or flat?
- how old is the universe?
- what is the matter density in the universe?
- what about dark matter?
- is there some mysterious “repulsive” force on large scales?
- how did galaxies form?
Some of these questions depend upon the values of certain parameters in the model, which we may derive from observations. Others have nothing to do with the Big Bang itself.
Our understanding of the laws of nature permits us to track the physical state of the universe back to a certain point, when the density and temperature were REALLY high. Beyond that point, we don’t know exactly how matter and radiation behave. Let’s call that moment the starting point. It doesn’t mean that the universe “began” at that time, it just means that we don’t know what happened before that point.
Big Bang Nucleosynthesis
One of the primary successes of the Big Bang theory is its explanation for the chemical composition of the universe. Recall that the universe is mostly hydrogen and helium, with very small amounts of heavier elements. How does this relate to the Big Bang?
Well, a long time ago, the universe was hot and dense. When the temperature is high enough (a few thousand degrees), atoms lose all their electrons; we call this state of matter, a mix of nuclei and electrons, a fully-ionized plasma. If the temperature is even higher (millions of degrees), then the nuclei break up into fundamental particles, and one is left with a “soup” of fundamental particles.
Now, if the “soup” is very dense, then these particles will collide with each other frequently. Occasionally, groups of protons and neutrons will stick together to form nuclei of light elements … but under extremely high pressure and temperature, the nuclei are broken up by subsequent collisions. The Big Bang theory postulates that the entire universe was so hot at one time that it was filled with this proton-neutron-electron “soup.”
But the Big Bang theory then states that, as the universe expanded, both the density and temperature dropped. As the temperature and density fell, collisions between particles became less violent, and less frequent. There was a brief “window of opportunity” when protons and neutrons could collide hard enough to stick together and form light nuclei, yet not suffer so many subsequent collisions that the nuclei would be destroyed. This “window” appeared about three minutes after the starting point, and lasted for a bit less than a minute.
Which nuclei would form under these conditions? Experiments with particle colliders have shown us that most of the possible nuclei are unstable, meaning they break up all by themselves, or fragile, meaning they are easily broken by collisions.
Helium (the ordinary sort, with 2 protons and 2 neutrons) is by far the most stable and robust compound nucleus. Deuterium (one proton and one neutron) is easily destroyed, and so is helium-3 (2 protons, one neutron).
So, it seems that this period of hot, dense plasma would create a lot of helium. Could it create other, heavier elements, too?
It turns out that none of the heavier nuclei which are easily made by collisions of single particles with helium nuclei, or helium nuclei with each other, are stable or robust. Almost all nuclei heavier than helium are likely to be destroyed by subsequent collisions. The only heavier nucleus which might possibly survive is lithium-7 (3 protons and 4 neutrons), but it requires the collision of a helium nucleus plus 2 or 3 other particles simultaneously, which isn’t very likely.
Detailed models of Big Bang nucleosynthesis predict that the brief “window of opportunity” lasted only a minute or two. After that, about three and a half minutes after the starting point, the temperature and density dropped so much that collisions between particles were rare, and of such low energy that the electric forces of repulsion between positively-charged nuclei prevented fusion. The result is
- lots of hydrogen
- some helium (ordinary helium-4)
- tiny bits of deuterium
- tiny bits of lithium
- not much else
The relative amounts of hydrogen, helium, deuterium and lithium depend very sensitively on the exact density of matter in the universe during this window of opportunity. We’ll discuss this later.
The Cosmic Microwave Background
So, during the first few minutes after the starting point, the universe was hot enough to fuse particles into helium nuclei. The result was a ratio of about 12 hydrogen nuclei to 1 helium nucleus; that’s equivalent to saying that three quarters of the mass of the universe was hydrogen nuclei, and one quarter of the mass was helium nuclei.
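The equivalence between those two statements is quick arithmetic, sketched below (the helper function is ours, not part of the original text; it uses approximate masses of 1 and 4 atomic mass units for hydrogen and helium-4):

```python
# Check: a number ratio of ~12 hydrogen nuclei per helium nucleus
# corresponds to roughly 3/4 hydrogen and 1/4 helium by mass.
# (Approximate masses: H = 1, He-4 = 4, in atomic mass units.)

def mass_fractions(n_h, n_he, m_h=1.0, m_he=4.0):
    """Return (hydrogen, helium) mass fractions for given number counts."""
    total = n_h * m_h + n_he * m_he
    return n_h * m_h / total, n_he * m_he / total

x_h, x_he = mass_fractions(12, 1)
print(f"hydrogen: {x_h:.2f}, helium: {x_he:.2f}")  # hydrogen: 0.75, helium: 0.25
```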
But these nuclei were totally ionized: they lacked the normal collection of electrons surrounding them. The electrons were free to fly around space on their own. Free electrons are very efficient at scattering photons. Any light rays or radio waves or X-rays in this ionized plasma were scattered before they could travel far. The universe was opaque.
As the universe continued to expand and cool, the temperature eventually reached a critical point. About 100,000 years after the starting point, the temperature dropped to about 3,000 Kelvin. At this point, hydrogen nuclei (protons) were able to capture electrons, and hold them against collisions. We call this process of capturing electrons recombination (even though it was really the first “combination”, not a re-“combination”).
The universe became largely neutral, with electrons bound into hydrogen and helium atoms. Neutral atoms are nearly transparent to light rays and radio waves. Suddenly, the universe became transparent.
What this meant was that light rays which were produced by the hot, 3,000-degree gas were free to fly throughout the universe without being scattered or absorbed. What kind of photons were they? Since they were produced by a hot gas, they had a blackbody spectrum appropriate for a temperature of about 3,000 Kelvin.
As the universe continued to expand, the wavelength of these photons increased: each time the universe doubled in size, the wavelength of the photons doubled. However, since the wavelength of each photon increased by the same factor, the relative wavelengths of the photons remained fixed. It turns out that when one increases all the wavelengths of a blackbody spectrum by the same factor, one gets another blackbody spectrum, but one which corresponds to a lower temperature than the original.
The universe has expanded by a factor of about 1,000 since the time of recombination, which means that the wavelengths of these blackbody photons have increased by a factor of about 1,000 as they have flown through the cosmos. The Big Bang theory predicts that we should be able to detect these stretched-out photons, if we look in the right part of the electromagnetic spectrum:
The peak of the shifted blackbody spectrum now falls in the microwave range, with a wavelength of about one millimeter. This corresponds to a blackbody temperature of about 3 Kelvin: a factor of 1,000 lower than the original 3,000 Kelvin when the gas became neutral. It is reasonable to say that as the universe expands, its “temperature” drops in sync: expansion by a factor of 2 means the temperature drops by a factor of 2, expansion by a factor of 10 means the temperature drops by a factor of 10, and so on.
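That inverse scaling can be written down in a few lines (the function below is our own sketch of the rule just described):

```python
# Sketch of the scaling described above: the blackbody temperature of the
# relic radiation falls in inverse proportion to the expansion factor.

def cmb_temperature(t_initial, expansion_factor):
    """Temperature (K) after the universe expands by the given factor."""
    return t_initial / expansion_factor

# Starting from ~3000 K at recombination, expansion by a factor of ~1000
# brings the radiation down to ~3 K, as observed today.
print(cmb_temperature(3000.0, 1000.0))  # 3.0
print(cmb_temperature(3000.0, 2.0))    # 1500.0
```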
And when we look at the universe with radio telescopes in the microwave range, we see a spectrum which is exactly that of a blackbody, with a peak at a wavelength of 1.869 millimeters, corresponding to a temperature of 2.726 Kelvin.
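As a consistency check (not in the original text), the quoted peak can be reproduced with Wien's displacement law in its frequency form. Note that the more familiar wavelength form of Wien's law gives a different "peak" (about 1.06 mm at 2.726 K); the 1.869 mm figure corresponds to the peak of the spectrum when plotted per unit frequency:

```python
# Cross-check of the quoted peak using Wien's displacement law in its
# frequency form: nu_peak ≈ 5.879e10 Hz per Kelvin of blackbody temperature.

C_LIGHT = 2.998e8      # speed of light, m/s
WIEN_FREQ = 5.879e10   # Wien displacement constant (frequency form), Hz/K

def peak_wavelength_mm(temperature_k):
    """Wavelength (mm) of the frequency-space blackbody peak."""
    nu_peak = WIEN_FREQ * temperature_k   # peak frequency, Hz
    return C_LIGHT / nu_peak * 1000.0     # convert meters to millimeters

print(f"{peak_wavelength_mm(2.726):.3f} mm")  # ~1.871 mm
```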
Does the temperature of the microwave background change with time?
The Big Bang model predicts that this temperature drops as the universe expands. If we could somehow measure the temperature of the CMB at some time long ago, we ought to find a temperature higher than 2.7 Kelvin. And … we can!
- some clouds of gas contain atoms which can be excited by absorbing a photon from the CMB; the atoms then decay into their ground states by emitting several secondary photons
- by detecting these secondary photons, and measuring atomic properties of these atoms very carefully in a lab on Earth, we can deduce properties of the photons from the CMB which excited the atoms
- if we look at such clouds of gas in our own Milky Way, we can calculate that the radiation absorbed by the atoms has a spectrum corresponding to a blackbody at a temperature of about 2.7 Kelvin
- if we look at such clouds in very distant galaxies, then we can determine properties of the CMB spectrum long ago in time
The measurements are very difficult: they involve taking high-resolution spectra of very distant quasars, and looking at absorption by intervening material in galaxies that just happen to lie between us and the quasar. One recent attempt used the ratio of several lines of neutral carbon.
Astronomers have tried to use such measurements to calculate the temperature of the CMB as a function of redshift (and, in turn, as a function of time). Most of the attempts have yielded only upper limits to the CMB temperature in the past, but in December 2000, a team of astronomers using one of the 8.2-meter Very Large Telescopes in Chile succeeded in making an honest-to-goodness determination: a temperature 6 K < T < 14 K at a redshift z = 2.34.
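The standard prediction being tested here is that the CMB temperature scales with redshift as T(z) = T0 × (1 + z). A quick evaluation (our own sketch) shows that the measured range brackets the expected value:

```python
# Predicted CMB temperature at redshift z: T(z) = T0 * (1 + z).
# Checking the VLT measurement quoted above (6 K < T < 14 K at z = 2.34).

T0 = 2.726  # present-day CMB temperature, Kelvin

def cmb_temp_at_redshift(z, t0=T0):
    """Predicted CMB temperature (K) at redshift z."""
    return t0 * (1.0 + z)

t_pred = cmb_temp_at_redshift(2.34)
print(f"{t_pred:.2f} K")  # ~9.10 K, comfortably inside the 6-14 K range
```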
Why is the CMB lumpy?
Now, if the universe was perfectly uniform at the time of recombination — same density everywhere, same temperature everywhere — then we should see a perfectly uniform microwave background. But, if the gas in the early universe had gathered into clumps by the time of recombination, even very fluffy clumps, the radiation it produced would be slightly more intense in the clumpy areas, and less intense in between them. The amplitude and size of fluctuations in the microwave background can tell us a lot about the conditions of the hot gas just 100,000 years after the starting point. We’ll discuss this later…
The distance/velocity connection
The Big Bang theory states that the universe is expanding, though it does not explain why the universe should behave in this way. As a result, objects which are subject to no forces should move away from each other. In a uniformly-expanding universe, the rate at which objects move away from each other depends linearly on their distance from each other.
And that linear relationship between distance and radial velocity is just what we see when we look at distant galaxies.
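That linear relationship is Hubble's law, v = H0 × d. The text quotes no value for the constant of proportionality, so the sketch below assumes a representative modern value of H0 ≈ 70 km/s per megaparsec purely for illustration:

```python
# Minimal sketch of the linear distance-velocity relation (Hubble's law),
# v = H0 * d. The value of H0 below is an assumption for illustration;
# the article itself quotes no specific number.

H0 = 70.0  # km/s per megaparsec (assumed)

def recession_velocity(distance_mpc, h0=H0):
    """Recession velocity (km/s) for a galaxy at the given distance (Mpc)."""
    return h0 * distance_mpc

# Doubling the distance doubles the apparent recession velocity.
for d in (10, 100, 1000):
    print(f"{d:5d} Mpc -> {recession_velocity(d):8.0f} km/s")
```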
But … wait a minute. Does this expansion occur on all scales? What about
- the distance between two people on opposite sides of the room?
- the distance between the Earth and the Sun?
- the distance between the Sun and the center of the Milky Way?
- the distance between the Milky Way and the Andromeda Galaxy?
If there are “significant” attractive forces between objects, they do not move away from each other as time goes by.
These attractive forces may be
- "chemical" forces between neighboring molecules or atoms (these are really due to electric forces), important on microscopic, human, and planet-sized scales
- gravitational forces between large bodies of matter, important on solar-system, galaxy, and galaxy-cluster scales
So the distance between the Earth and Sun has not increased over the past 4 billion years.
Nor does the length of a meterstick grow due to the expansion of the universe.
Only on the very largest scales, distances between isolated galaxies or clusters of galaxies, are the attractive forces so weak that the expansion of the universe is able to move objects apart.
The rate of expansion depends sensitively on the exact amount of matter in the universe. If the density of matter is high, long-range gravitational forces can slow down the expansion, or even stop it. If the density of matter is very low, the expansion will go on forever. We will discuss this in greater detail later.
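The dividing line between those two fates is the critical density. The article defers the details, but the standard expression is rho_c = 3 H0² / (8 π G); a quick evaluation (our own, again assuming H0 ≈ 70 km/s/Mpc) shows just how tenuous that dividing line is:

```python
# Evaluating the critical density rho_c = 3 H0^2 / (8 pi G),
# assuming H0 ~ 70 km/s/Mpc (an illustrative value, not from the article).
import math

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
MPC_IN_M = 3.086e22              # one megaparsec in meters
H0 = 70.0 * 1000.0 / MPC_IN_M    # 70 km/s/Mpc converted to 1/s

rho_crit = 3.0 * H0**2 / (8.0 * math.pi * G)
print(f"critical density ~ {rho_crit:.2e} kg/m^3")  # ~9e-27 kg/m^3
```

That works out to only a few hydrogen atoms per cubic meter, averaged over the whole universe.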
The Big Bang theory explains
- the relative amounts of light elements (depends on conditions a few minutes after the starting point)
- the cosmic microwave background (depends on conditions 100,000 years after the starting point)
- the expansion of distant galaxies away from each other (depends on the density of matter in the universe, and on the cosmological constant, if it is not zero)
It doesn’t provide answers to all our questions, but it does a better job than any alternative which has yet been proposed. All but a handful of astronomers have adopted the Big Bang as the standard theory for the evolution of the universe.
For more information,
- Ned Wright’s cosmology tutorial
- Ned Wright’s explanation of microwave power spectrum plots
- Introduction to cosmology from the MAP project, which will launch a satellite in June, 2001, to map the microwave background.
- Big Bang tutorial from the Center for Astrophysics and Space Sciences at the University of San Diego. They provide a nice timeline showing important events in the evolution of the universe.
- VLT measurements of CMBR temperature at z=2.34
Copyright © Michael Richmond. This work is licensed under a Creative Commons license.
Thermodynamics is an essential component of physics and chemistry: Science standards for thermodynamics
FactCheck.Org ran this analysis:
Ben Carson claimed that prevailing theories of how the universe began and how planets and stars formed violate the second law of thermodynamics. His comments represent a misunderstanding of scientific concepts. Carson, a retired pediatric neurosurgeon and Republican presidential candidate, spoke at a rally on Sept. 22 at Cedarville University — an Ohio school that describes itself as a “Christ-centered, Baptist institution.” Carson began his discussion of science by explaining — correctly — that many studies have debunked the notion that vaccines cause autism. “That’s why we have science and scientific studies to look at these kinds of things,” he said.
He then went on to say “science is not always correct,” and claimed that the Big Bang theory is one such example (at the 1:03:13 mark):
Carson, Sept. 22: Now you’re saying, there’s a Big Bang, a big explosion, and our solar system and our universe come into perfect alignment. Now I said you also believe in the second law of thermodynamics, entropy, right? “Yeah.” And I said, that states that things move toward a state of disorganization, right? “Yeah.” I said, so, how is there a Big Bang and instead of things moving toward disorganization they become perfectly organized to the point where we can predict 70 years hence when a comet is coming. How does that work? “Well. We don’t understand everything.”
The second law of thermodynamics says that in any isolated system, the entropy of that system will increase or remain the same — not decrease.
Carson claims that the Big Bang theory violates the second law of thermodynamics, since the solar system has moved to what he calls a “perfectly organized” point, instead of becoming more disorganized.
But the two concepts aren’t in contradiction. A small part of a system can become more ordered, while the rest of the system sees a decrease in order in the process.
One good example of this is an ice tray in a freezer. The molecules in liquid water move into a more ordered state when they freeze into a solid. On its own then, water turning to ice appears to be a violation of the second law. But the ice in the freezer is not a closed system: The freezer also generates heat as it runs, which is radiated out into your kitchen. That heat increases entropy more than the water turning to ice decreases it.
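The ice-tray argument can be made rough and numerical. The figures below are illustrative assumptions, not from the article: 1 kg of water, the standard latent heat of fusion, and a freezer that (like any real heat pump) rejects somewhat more heat into the room than it extracts from the water:

```python
# Rough numerical version of the ice-tray example: the entropy decrease of
# freezing water is outweighed by the entropy increase of the heat the
# freezer dumps into the kitchen. All numbers are illustrative assumptions.

LATENT_HEAT = 334e3   # J/kg, latent heat of fusion of water
T_FREEZE = 273.0      # K, freezing point of water
T_ROOM = 293.0        # K, assumed kitchen temperature
MASS = 1.0            # kg of water (assumed)

Q_EXTRACTED = MASS * LATENT_HEAT
Q_REJECTED = 1.2 * Q_EXTRACTED   # assumed: extracted heat plus compressor work

dS_water = -Q_EXTRACTED / T_FREEZE   # entropy lost by the freezing water
dS_room = Q_REJECTED / T_ROOM        # entropy gained by the warm kitchen
print(f"net entropy change: {dS_water + dS_room:+.0f} J/K")  # positive
```

The water's entropy drops, but the kitchen's rises by more, so the total entropy of the full system still increases, exactly as the second law requires.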
FactCheck.org put Carson's claim to theoretical physicist Brian Greene, who offered a more familiar example:

Greene, Sept. 23: How do you take a messy room and make it ordered? That would seem to be decreasing the disorder – it was a mess, now it’s not a mess. It was disordered, now it’s ordered. How could anybody do that? It seems to violate the second law of thermodynamics!
But the answer is: you have to take into account all of the sources of order and disorder, including the body of the human who is cleaning up the room, the heat that they are generating, the fat that’s being burned as they undertake this exercise. And when you take into account everything – the molecules of air that get excited by the sweat forming on the brow of the individual doing the cleaning – when you take into account all of these features, the amount of disorder generated overly compensates – always – for the amount of order that’s being created in the room.
Moving outward to the solar system scale, the situation is the same. The increasing entropy is not violated by the formation of planets, stars and comets due to arrive in 70 years. All the factors that go into the formation of these celestial bodies work to increase disorder rather than decrease it.
As Greene told us: “The formation of a star is an entropically increasing phenomenon. It is not decreasing the amount of disorder, it is increasing the amount of disorder, even though it looks so darn ordered relative to, say, the swirling gas cloud from which it emerged.”
Planets and stars form when gases and dust in space slow down and begin to clump together, at which point gravity helps pull these clumps together and draw in more dust and gas, until those big objects are formed. “That process as we understand it is completely consistent with the second law of thermodynamics,” Greene said.
From a universe-wide perspective, the overall increasing entropy is measurable based on the leftover heat from the Big Bang, known as the cosmic microwave background radiation. According to the Big Bang theory, at the point of the initial explosion all the energy in the universe was concentrated in a state of very low entropy — an almost completely ordered state.
Ever since that explosion, that energy has been spreading out, a continually rising degree of disorder. The observed level of the background radiation is consistent with the predictions of modern cosmology. In short, Big Bang theory predicts the existence of and the specific amounts of background radiation as a result of the rising entropy of the entire system, and observations actually bear that out. “The calculations agree with the observations to fantastic precision,” Greene said.
Carson went on to claim that the presence of stars and planets is related to the existence of multiple Big Bangs that eventually might produce an ordered universe:
Carson: And then they go to the probability theory, and they say “but if there’s enough big bangs over a long enough period of time, one of them will be the perfect big bang and everything will be perfectly organized.” And I said, so you’re telling me if I blow a hurricane through a junkyard enough times over a long enough period of time after one of them there will be a 747 fully formed and ready to fly?
That is not an accurate reflection of the Big Bang theory. Though some theories of the origin of the universe suggest that the Big Bang was only one of many such explosions, these theories do not state that the currently ordered existence is a spontaneous result of one of these repeated Big Bangs.
Greene called this a “red herring,” and said the concept of multiple Big Bangs has nothing to do with how stars and planets form in this current universe. Instead, those theories involve the idea that the universe goes through cycles over many billions of years: Big Bang, expansion, contraction, “Big Crunch,” followed by another Big Bang. How the stars and planets form between each of those bangs and crunches is a separate issue.
Although there is still much to be learned about the origins of the universe, the fact is science has extremely thorough explanations for how planets and stars form, and they mesh perfectly with the laws of thermodynamics.
Editor’s Note: SciCheck is made possible by a grant from the Stanton Foundation. – Dave Levitan. Original article : Ben Carson rewrites the laws of thermodynamics