Category Archives: thermodynamics
Zeroth law of thermodynamics
People normally think of the three laws of thermodynamics. But there is one idea that they all depend on, so basic that it often gets overlooked: the zeroth law.
This idea works like the transitive property of equality in algebra:
If A = B and B = C, then A = C.
If the temperature of object A = the temperature of object B,
and the temperature of object B = the temperature of object C,
then the temperature of object A = the temperature of object C.
All three systems are therefore in thermal equilibrium with one another.
Let’s watch three different materials fulfill this law by coming into thermal equilibrium.
Animation by Charles Xie
Thermal equilibrium (in this example) is reached when the temperature of all of the pieces equals 13.4 degrees C.
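The equilibrium temperature in a simulation like this can be estimated with a short script. This is a minimal sketch, not the model behind the animation: it assumes three bodies with made-up masses and specific heats, perfect thermal contact, and no heat lost to the surroundings, so the final temperature is just the heat-capacity-weighted average of the starting temperatures.

```python
# Equilibrium temperature of bodies in thermal contact, assuming no heat
# is lost to the surroundings. The masses, specific heats, and starting
# temperatures below are made-up illustrative values, not the ones used
# in the animation.

def equilibrium_temp(bodies):
    """Heat-capacity-weighted average: sum(m*c*T) / sum(m*c)."""
    total_heat = sum(m * c * t for m, c, t in bodies)
    total_capacity = sum(m * c for m, c, t in bodies)
    return total_heat / total_capacity

# (mass in kg, specific heat in J/(kg*K), initial temperature in deg C)
bodies = [(1.0, 900.0, 80.0),   # aluminium block, hot
          (2.0, 4186.0, 5.0),   # water, cold
          (0.5, 450.0, 20.0)]   # iron, at room temperature

T_eq = equilibrium_temp(bodies)
print(round(T_eq, 1))  # about 12.5 deg C for these assumed values
```

Materials with a large heat capacity (big m times c, like the water here) pull the equilibrium temperature toward their own starting temperature.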
http://weelookang.blogspot.sg/2012/09/the-zeroth-law-of-thermodynamics.html
Also see https://www.grc.nasa.gov/www/k-12/airplane/thermo0.html
Another way to view this:
“When body A is placed in thermal contact with body B, there will be a flow of thermal energy between the two bodies. Thermal energy will flow from the body at a higher temperature, to the one at a lower temperature, until thermal equilibrium between the two bodies is reached.”
– Loo Kang Lawrence
The nature of time
What is time?
What is time? Where does time come from?
In what ways is time objective (something actually out there)?
In what ways is time subjective (just a way that humans describe our perception of the universe)?
Why does time never go backward?
The answer apparently lies not in the laws of nature, which hardly distinguish between past and future, but in the conditions prevailing in the early universe.
The Arrow of Time, Scientific American article. David Layzer
Is there a relationship between time and the second law of thermodynamics?
Before reading further, note that these topics require at least some familiarity with the laws of thermodynamics.
“According to many, there might be a link between what we perceive as the arrow of time and a quantity called entropy…. [but] as far as we can tell, the second law of thermodynamics is true: entropy never decreases for any closed system in the Universe, including for the entirety of the observable Universe itself. It’s also true that time always runs in one direction only, forward, for all observers. What many don’t appreciate is that these two types of arrows — the thermodynamic arrow of entropy and the perceptive arrow of time — are not interchangeable.”
No, Thermodynamics Does Not Explain Our Perceived Arrow Of Time, Starts With A Bang, Ethan Siegel, Forbes
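To see concretely why entropy climbs in a closed system, here is a toy statistical model (my illustration, not from Siegel's article): N gas particles in a box, where the macrostate is simply how many particles sit in the left half, and the entropy is the logarithm of the number of microstates consistent with that count. Starting far from equilibrium, random single-particle moves drive the entropy toward its maximum.

```python
import math
import random

# Toy 'gas in a box' model: N particles, each in the left or right half.
# A microstate says which half each particle is in; the macrostate is
# just the count n on the left. Entropy S = ln(multiplicity) = ln C(N, n).
# Starting far from equilibrium (all particles on the left), random
# single-particle moves push S up toward its maximum at n = N/2.

random.seed(1)
N = 100
n_left = N  # every particle starts on the left: S = ln C(100, 100) = 0

def entropy(n):
    return math.log(math.comb(N, n))

history = [entropy(n_left)]
for _ in range(5000):
    # pick a random particle; it hops to the other side of the box
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    history.append(entropy(n_left))

print(history[0])       # 0.0 -- lowest possible entropy
print(max(history))     # near ln C(100, 50), about 66.8
```

The entropy occasionally dips (it is a statistical law, not an absolute one for small systems), but it never strays far below the maximum once equilibrium is reached.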
Is time (and perhaps space) quantized?
Ethan Siegel leads us in a fascinating discussion:
The idea that space (or space and time, since they’re inextricably linked by Einstein’s theories of relativity) could be quantized goes way back to Heisenberg himself.
Famous for the Uncertainty Principle, which fundamentally limits how precisely we can measure certain pairs of quantities (like position and momentum), Heisenberg realized that certain quantities diverged, or went to infinity, when you tried to calculate them in quantum field theory….
It’s possible that the problems that we perceive now, on the other hand, aren’t insurmountable problems, but are rather artifacts of having an incomplete theory of the quantum Universe.
It’s possible that space and time are really continuous backgrounds, and even though they’re quantum in nature, they cannot be broken up into fundamental units. It might be a foamy kind of spacetime, with large energy fluctuations on tiny scales, but there might not be a smallest scale. When we do successfully find a quantum theory of gravity, it may have a continuous-but-quantum fabric, after all.
Are Space And Time Quantized? Maybe Not, Says Science
Even In A Quantum Universe, Space And Time Might Be Continuous, Not Discrete
Theoretical physics: The origins of space and time
Can we stop a hurricane?
Can we stop a hurricane? It sounds like something out of science fiction, a proposal befitting a mad scientist, right?
Remember Hurricane Katrina? In August 2005, this destructive Category 5 Atlantic hurricane caused over 1,800 fatalities and $125 billion in damage, devastating New Orleans and the surrounding area. What if there had been a way to shift its course, or reduce its intensity?
In Hurricane Forcing: Can Tropical Cyclones Be Stopped? by Christopher Mims, Scientific American, October 23, 2009, we read
This past June, a plan to reduce the severity and frequency of hurricanes leaked to the public in the form of a patent application under Bill Gates’s name (along with many others), resuscitating speculation about a scheme that has been proposed off and on since the 1960s. The core of the idea remains the same: mixing the warm surface waters that fuel tropical cyclones with cooler waters below to drain storms of their energy. But now Stephen Salter, an emeritus professor of engineering design at the University of Edinburgh, proposes a new—and possibly more realistic—method of mixing.
Salter has outlined in an engineering paper the design for a floating structure 100 meters in diameter—basically a circular raft of lashed-together used tires (to reduce cost). It would support a thin plastic tube 100 meters in diameter and 200 meters in length.
When deployed in the open ocean, the tube would hang vertically, descending through the warm, well-mixed upper reaches of the ocean and terminating in a deeper part of the water column known as the thermocline, where water temperatures drop precipitously.
The point of this design is to transfer warm surface water into the deeper, cooler reaches of the ocean, mixing the two together and, hopefully, cooling the sea surface. Salter’s design is relatively simple, using a minimum of material in order to make the construction of each of his devices cheap (millions of used tires are thrown away each year, worldwide); his scheme would also require the deployment of hundreds of these devices.
Using horizontal wave action at the ocean surface, passive no-return valves would capture energy by closing after a wave has passed through them, allowing the circular interior of each device to raise the level of the seawater within the device by, on average, 20 centimeters. The weight of the gathered warm water would thereby create downward pressure, pushing it down the tube.
The idea is that hundreds of these floating wave-powered seawater pumps would be deployed year-round in areas, such as the eastern tropical Atlantic and the Gulf of Mexico, where hurricanes typically spawn or grow in intensity. (The devices would not, as widely speculated, be deployed only in the path of a hurricane that already formed.) …
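The numbers in Salter's proposal invite a quick back-of-envelope check. Using the figures quoted above (a 100-meter tube diameter and an average 20-centimeter head of raised water) and an assumed seawater density, the driving pressure and the mass of raised water come out as follows. This is only an illustrative estimate, not Salter's own engineering analysis.

```python
import math

# Back-of-envelope numbers for Salter's wave pump, using the figures
# quoted above (100 m tube diameter, 20 cm average head). The seawater
# density is an assumed typical value, not from the article.

rho = 1025.0   # kg/m^3, typical seawater density (assumed)
g = 9.81       # m/s^2
head = 0.20    # m, average raised water level quoted in the article

# Hydrostatic pressure driving warm water down the tube: p = rho * g * h
delta_p = rho * g * head
print(f"driving pressure: {delta_p:.0f} Pa")  # about 2 kPa

# Mass of the raised "slug" of warm water sitting above sea level
radius = 50.0                    # m (100 m diameter)
area = math.pi * radius**2       # m^2, about 7,854 m^2
raised_mass = rho * area * head  # kg
print(f"raised water mass: {raised_mass / 1000:.0f} tonnes")
```

A pressure of roughly 2 kPa is tiny compared with the ambient pressure at depth, which is why the scheme relies on hundreds of devices pumping continuously rather than on any single powerful pump.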
In “Can Science Halt Hurricanes?” we read
Until recently, the U.S. Department of Homeland Security has been investigating whether seeding storm clouds with pollution-size aerosols (particles suspended in gas) might help slow tropical cyclones. Computer models suggest that deploying aerosols can have “an appreciable impact on tropical cyclone intensity,” writes William Cotton, an atmospheric scientist at Colorado State University. He and his colleagues recently reviewed such work in the Journal of Weather Modification. In fact, human pollution may already be weakening storms, including August’s Hurricane Irene. “[Computer] models all predicted that the intensity of Irene would be much greater than it was,” Cotton notes. “Was that because they did not include aerosol effects?”…
In Business Insider, Kelly Dickerson writes
Engineers could stop hurricanes with the ‘sunglasses effect’ — but it’d require a huge sacrifice
According to new research published in the journal Proceedings of the National Academy of Sciences, if we pumped sulfate gases into our planet’s upper atmosphere, we could cool down our oceans enough to cut the number of Katrina-force hurricanes in half over the next 50 years. It’d require about 10 billion tons of sulfates to get the job done, which is tens or hundreds of times the sulfates a typical volcanic eruption can form.
From Stanford University we read
Computer simulations by Professor Mark Z. Jacobson have shown that offshore wind farms with thousands of wind turbines could have sapped the power of three real-life hurricanes, significantly decreasing their winds and accompanying storm surge, and possibly preventing billions of dollars in damages…. he found that the wind turbines could disrupt a hurricane enough to reduce peak wind speeds by up to 92 mph and decrease storm surge by up to 79 percent.
The study, conducted by Jacobson, and Cristina Archer and Willett Kempton of the University of Delaware, was published online in Nature Climate Change….
Taming Hurricanes With Arrays of Offshore Wind Turbines (Nature Climate Change, 2014)
In this intriguing discussion, science fiction writers look into the real physics of the question: What would we need to stop a hurricane? (Worldbuilding @ Stack Exchange)
Also see these great topics at Hurricane Research Division NOAA, National Oceanic and Atmospheric Administration
Tropical Cyclone Modification and Myths
Theoretical physics: The origins of space and time
Many researchers believe that physics will not be complete until it can explain not just the behaviour of space and time, but where these entities come from.
Zeeya Merali, Nature, 28 August 2013
“Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.
This ‘holographic principle’ is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account for the existence of space and time. Neither does string theory, which describes elementary threads of energy.
Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense ‘singularity’ at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists’ efforts for generations.
“All our experiences tell us we shouldn’t have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.
Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts.
NIK SPENCER/NATURE; Panel 4 adapted from Budd, T. & Loll, R. Phys. Rev. D 88, 024015 (2013)
Gravity as thermodynamics
One of the most obvious questions to ask is whether this endeavour is a fool’s errand. Where is the evidence that there actually is anything more fundamental than space and time?
A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.
In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it was hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.
A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking’s 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy.
But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole’s entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
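The area law described above has an explicit formula, the Bekenstein-Hawking entropy S = k·A·c³/(4Għ). As an illustration (using standard rounded values for the physical constants), here is that entropy evaluated for a black hole of one solar mass.

```python
import math

# Bekenstein-Hawking entropy: S = k_B * A * c^3 / (4 * G * hbar),
# proportional to horizon area A, not to volume. Evaluated for a
# solar-mass black hole with Schwarzschild radius r = 2GM/c^2.
# Constants are standard rounded values.

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J s
k_B = 1.381e-23    # J/K
M_sun = 1.989e30   # kg

r = 2 * G * M_sun / c**2             # Schwarzschild radius, ~2.95 km
A = 4 * math.pi * r**2               # horizon area, m^2
S = k_B * A * c**3 / (4 * G * hbar)  # entropy, J/K

print(f"r = {r:.3e} m")
print(f"S/k_B = {S / k_B:.3e}")      # about 1e77 -- an enormous number
```

Note the scaling: doubling the mass doubles the radius, which quadruples the area and hence the entropy, quite unlike the volume scaling of ordinary matter.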
In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny ‘black-hole horizon’ that also obeys the entropy–area relationship. From that, he found, the mathematics yielded Einstein’s equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time1.
“This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.
In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed2 that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton’s law of gravitational attraction.
And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed3 that Einstein’s equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe’s expansion.
Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10−35 metres, or some 20 orders of magnitude smaller than a proton.
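The Planck scale quoted here is not an arbitrary number: it is the unique length you can build from the three constants involved (hbar, G, and c). A few lines of arithmetic verify both the 10^-35 m figure and the "some 20 orders of magnitude smaller than a proton" comparison, taking an approximate proton charge radius as an assumed input.

```python
import math

# The Planck length combines the three constants that any quantum
# theory of gravity must contain: l_P = sqrt(hbar * G / c^3).

G = 6.674e-11      # m^3 kg^-1 s^-2
c = 2.998e8        # m/s
hbar = 1.055e-34   # J s

l_planck = math.sqrt(hbar * G / c**3)
proton_radius = 8.4e-16  # m, approximate proton charge radius (assumed)

print(f"Planck length: {l_planck:.3e} m")  # about 1.6e-35 m
print(f"orders of magnitude below a proton: "
      f"{math.log10(proton_radius / l_planck):.1f}")  # roughly 20
```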
But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly.
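The size of such a delay can be estimated with the simple linear parameterization often used in quantum-gravity phenomenology, delta-t ≈ (E / E_Planck) × (D / c). The linear energy scaling and the specific photon energy and distance below are illustrative assumptions, not the model tested in the work described here.

```python
# Rough first-order estimate of the energy-dependent photon delay often
# used in quantum-gravity phenomenology: dt ~ (E / E_Planck) * (D / c).
# The linear scaling and the numbers below are illustrative assumptions;
# specific models differ.

c = 2.998e8             # m/s
E_planck_GeV = 1.22e19  # Planck energy in GeV

E_photon_GeV = 30.0     # an energetic gamma-ray-burst photon (assumed)
D_m = 1.0e26            # ~10 billion light-years, order of magnitude

dt = (E_photon_GeV / E_planck_GeV) * (D_m / c)
print(f"delay ~ {dt:.2f} seconds over the whole journey")
```

A delay of well under a second, accumulated over billions of years of travel, shows why these searches demand both very energetic photons and very distant, sharply timed sources.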
Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found4 hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”
Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed5 a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature http://doi.org/njf; 2012).
Loop quantum gravity
Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?
One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider’s web of strands that carry information about the quantized areas and volumes of the regions they pass through6. The individual strands of the web must eventually join their ends to form loops — hence the theory’s name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.
Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.
One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein’s equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.
In 2006, Ashtekar and his colleagues reported7 a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein’s equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.
This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported8 a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe’s massive expansion in its infant moments.
Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.
Causal sets
Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn’t make sense to ask, ‘What’s the temperature of a single atom?’ You need a collection for the concept to have meaning.”
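Sorkin's picture is easy to prototype. The sketch below (an illustration, not Sorkin's actual construction) sprinkles random points into a patch of flat 1+1-dimensional spacetime and links each pair where one point lies inside the other's future light cone; the resulting relation is exactly the kind of partial order a causal set is built from.

```python
import random

# Minimal causal-set sketch: sprinkle random points into a square patch
# of 1+1-dimensional flat (Minkowski) spacetime, then record the causal
# order: p precedes q when q lies inside p's future light cone,
# i.e. t_q - t_p > |x_q - x_p| in units where c = 1.

random.seed(0)
points = [(random.random(), random.random()) for _ in range(50)]  # (t, x)

def precedes(p, q):
    dt = q[0] - p[0]
    dx = abs(q[1] - p[1])
    return dt > dx  # q is inside p's future light cone

relations = {(i, j)
             for i, p in enumerate(points)
             for j, q in enumerate(points)
             if precedes(p, q)}

# A causal set is a partial order; verify transitivity explicitly:
# if i precedes j and j precedes k, then i must precede k.
for (i, j) in relations:
    for (j2, k) in relations:
        if j2 == j:
            assert (i, k) in relations

print(f"{len(points)} points, {len(relations)} causal relations")
```

Everything geometric (distances, dimension, curvature) must then be recovered statistically from this bare web of before-and-after links, which is both the appeal and the difficulty of the approach.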
In the late 1980s, Sorkin used this framework to estimate9 the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here’s a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”
Causal dynamical triangulations
That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.
The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation’s gluing rules allowed them to combine freely. The result was a series of bizarre ‘universes’ that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.
But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own10.
Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.
Holography
Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published11 his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.
Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.
In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are ‘entangled’ — meaning that measurements made on one inevitably affect the other12. He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.
Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”
Nature 500, 516–519 (29 August 2013) doi:10.1038/500516a
http://www.nature.com/news/theoretical-physics-the-origins-of-space-and-time-1.13613
__________
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)
__________________________________________________
Time’s Arrow Traced to Quantum Source
A new theory explains the seemingly irreversible arrow of time while yielding insights into entropy, quantum computers, black holes, and the past-future divide.
Natalie Wolchover, Senior Writer, Quanta Magazine, April 16, 2014
Coffee cools, buildings crumble, eggs break and stars fizzle out in a universe that seems destined to degrade into a state of uniform drabness known as thermal equilibrium. The astronomer-philosopher Sir Arthur Eddington in 1927 cited the gradual dispersal of energy as evidence of an irreversible “arrow of time.”
But to the bafflement of generations of physicists, the arrow of time does not seem to follow from the underlying laws of physics, which work the same going forward in time as in reverse. By those laws, it seemed that if someone knew the paths of all the particles in the universe and flipped them around, energy would accumulate rather than disperse: Tepid coffee would spontaneously heat up, buildings would rise from their rubble and sunlight would slink back into the sun.
“In classical physics, we were struggling,” said Sandu Popescu, a professor of physics at the University of Bristol in the United Kingdom. “If I knew more, could I reverse the event, put together all the molecules of the egg that broke? Why am I relevant?”
Surely, he said, time’s arrow is not steered by human ignorance. And yet, since the birth of thermodynamics in the 1850s, the only known approach for calculating the spread of energy was to formulate statistical distributions of the unknown trajectories of particles, and show that, over time, the ignorance smeared things out.
Now, physicists are unmasking a more fundamental source for the arrow of time: Energy disperses and objects equilibrate, they say, because of the way elementary particles become intertwined when they interact — a strange effect called “quantum entanglement.”
“Finally, we can understand why a cup of coffee equilibrates in a room,” said Tony Short, a quantum physicist at Bristol. “Entanglement builds up between the state of the coffee cup and the state of the room.”
Popescu, Short and their colleagues Noah Linden and Andreas Winter reported the discovery in the journal Physical Review E in 2009, arguing that objects reach equilibrium, or a state of uniform energy distribution, within an infinite amount of time by becoming quantum mechanically entangled with their surroundings. Similar results by Peter Reimann of the University of Bielefeld in Germany appeared several months earlier in Physical Review Letters.
Short and a collaborator strengthened the argument in 2012 by showing that entanglement causes equilibration within a finite time. And, in work that was posted on the scientific preprint site arXiv.org in February, two separate groups have taken the next step, calculating that most physical systems equilibrate rapidly, on time scales proportional to their size. “To show that it’s relevant to our actual physical world, the processes have to be happening on reasonable time scales,” Short said.
The tendency of coffee — and everything else — to reach equilibrium is “very intuitive,” said Nicolas Brunner, a quantum physicist at the University of Geneva. “But when it comes to explaining why it happens, this is the first time it has been derived on firm grounds by considering a microscopic theory.”
If the new line of research is correct, then the story of time’s arrow begins with the quantum mechanical idea that, deep down, nature is inherently uncertain. An elementary particle lacks definite physical properties and is defined only by probabilities of being in various states. For example, at a particular moment, a particle might have a 50 percent chance of spinning clockwise and a 50 percent chance of spinning counterclockwise. An experimentally tested theorem by the Northern Irish physicist John Bell says there is no “true” state of the particle; the probabilities are the only reality that can be ascribed to it.
Quantum uncertainty then gives rise to entanglement, the putative source of the arrow of time.
When two particles interact, they can no longer even be described by their own, independently evolving probabilities, called “pure states.” Instead, they become entangled components of a more complicated probability distribution that describes both particles together. It might dictate, for example, that the particles spin in opposite directions. The system as a whole is in a pure state, but the state of each individual particle is “mixed” with that of its acquaintance. The two could travel light-years apart, and the spin of each would remain correlated with that of the other, a feature Albert Einstein famously described as “spooky action at a distance.”
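The pure-whole, mixed-parts picture described above can be checked with a few lines of linear algebra. Below is a minimal sketch in plain NumPy (my illustration, not any quantum-computing library): it builds the two-particle “opposite spins” state, then traces out one particle to show that the pair as a whole is pure while each member alone is a 50/50 mixture.

```python
import numpy as np

# The "opposite spins" singlet state (|01> - |10>)/sqrt(2), written as a
# 4-component vector over the two-particle basis |00>, |01>, |10>, |11>.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

# Density matrix of the pair: a pure state of the whole system.
rho = np.outer(psi, psi)

# Reduce to particle A by tracing out particle B.
rho_A = np.einsum('abcb->ac', rho.reshape(2, 2, 2, 2))

def vn_entropy(r):
    """Von Neumann entropy S = -Tr(r log2 r), in bits."""
    evals = np.linalg.eigvalsh(r)
    evals = evals[evals > 1e-12]          # drop numerical zeros
    return float(abs(np.sum(-evals * np.log2(evals))))

print(vn_entropy(rho))    # ~0 bits: the pair is in a pure state
print(vn_entropy(rho_A))  # ~1 bit: one particle alone is maximally mixed
print(rho_A)              # identity/2: a 50/50 mixture of the two spins
```

The reduced state of either particle is the identity matrix divided by two, which is exactly the “50 percent clockwise, 50 percent counterclockwise” uncertainty described above, even though the two-particle state leaves nothing uncertain.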
“Entanglement is in some sense the essence of quantum mechanics,” or the laws governing interactions on the subatomic scale, Brunner said. The phenomenon underlies quantum computing, quantum cryptography and quantum teleportation.
The idea that entanglement might explain the arrow of time first occurred to Seth Lloyd about 30 years ago, when he was a 23-year-old philosophy graduate student at Cambridge University with a Harvard physics degree. Lloyd realized that quantum uncertainty, and the way it spreads as particles become increasingly entangled, could replace human uncertainty in the old classical proofs as the true source of the arrow of time.
Using an obscure approach to quantum mechanics that treated units of information as its basic building blocks, Lloyd spent several years studying the evolution of particles in terms of shuffling 1s and 0s. He found that as the particles became increasingly entangled with one another, the information that originally described them (a “1” for clockwise spin and a “0” for counterclockwise, for example) would shift to describe the system of entangled particles as a whole. It was as though the particles gradually lost their individual autonomy and became pawns of the collective state. Eventually, the correlations contained all the information, and the individual particles contained none. At that point, Lloyd discovered, particles arrived at a state of equilibrium, and their states stopped changing, like coffee that has cooled to room temperature.
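Lloyd’s bookkeeping of 1s and 0s can be illustrated with a toy calculation. The sketch below is a simplification of my own, not his formalism: it entangles two qubits with a standard CNOT gate and computes the mutual information S(A) + S(B) - S(AB), the number of bits that live in the correlations rather than in either particle.

```python
import numpy as np

def entropy_bits(rho):
    """Von Neumann entropy -Tr(rho log2 rho), in bits."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(abs(np.sum(-evals * np.log2(evals))))

def reduced(rho, keep):
    """Partial trace of a 2-qubit density matrix down to one qubit."""
    r = rho.reshape(2, 2, 2, 2)
    return np.einsum('abcb->ac', r) if keep == 0 else np.einsum('abac->bc', r)

# Start unentangled: qubit A in the superposition (|0>+|1>)/sqrt(2), B in |0>.
psi = np.kron(np.array([1.0, 1.0]) / np.sqrt(2), np.array([1.0, 0.0]))

# Entangle with a CNOT (flip B exactly when A is 1), giving (|00>+|11>)/sqrt(2).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
rho = np.outer(cnot @ psi, cnot @ psi)

s_a = entropy_bits(reduced(rho, 0))   # ~1 bit: A alone looks random
s_b = entropy_bits(reduced(rho, 1))   # ~1 bit: B alone looks random
s_ab = entropy_bits(rho)              # ~0 bits: the pair is still pure
print(s_a + s_b - s_ab)               # ~2 bits: all information is correlational
```

After the gate, the individual particles carry maximal entropy while the whole pair carries none: the information that originally described each qubit now describes only the system as a whole, which is the shift Lloyd observed.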
“What’s really going on is things are becoming more correlated with each other,” Lloyd recalls realizing. “The arrow of time is an arrow of increasing correlations.”
The idea, presented in his 1988 doctoral thesis, fell on deaf ears. When he submitted it to a journal, he was told that there was “no physics in this paper.” Quantum information theory “was profoundly unpopular” at the time, Lloyd said, and questions about time’s arrow “were for crackpots and Nobel laureates who have gone soft in the head,” he remembers one physicist telling him.
“I was darn close to driving a taxicab,” Lloyd said.
Advances in quantum computing have since turned quantum information theory into one of the most active branches of physics. Lloyd is now a professor at the Massachusetts Institute of Technology, recognized as one of the founders of the discipline, and his overlooked idea has resurfaced in a stronger form in the hands of the Bristol physicists. The newer proofs are more general, researchers say, and hold for virtually any quantum system.
“When Lloyd proposed the idea in his thesis, the world was not ready,” said Renato Renner, head of the Institute for Theoretical Physics at ETH Zurich. “No one understood it. Sometimes you have to have the idea at the right time.”
In 2009, the Bristol group’s proof resonated with quantum information theorists, opening up new uses for their techniques. It showed that as objects interact with their surroundings — as the particles in a cup of coffee collide with the air, for example — information about their properties “leaks out and becomes smeared over the entire environment,” Popescu explained. This local information loss causes the state of the coffee to stagnate even as the pure state of the entire room continues to evolve. Except for rare, random fluctuations, he said, “its state stops changing in time.”
Consequently, a tepid cup of coffee does not spontaneously warm up. In principle, as the pure state of the room evolves, the coffee could suddenly become unmixed from the air and enter a pure state of its own. But there are so many more mixed states than pure states available to the coffee that this practically never happens — one would have to outlive the universe to witness it. This statistical unlikelihood gives time’s arrow the appearance of irreversibility. “Essentially entanglement opens a very large space for you,” Popescu said. “It’s like you are at the park and you start next to the gate, far from equilibrium. Then you enter and you have this enormous place and you get lost in it. And you never come back to the gate.”
In the new story of the arrow of time, it is the loss of information through quantum entanglement, rather than a subjective lack of human knowledge, that drives a cup of coffee into equilibrium with the surrounding room. The room eventually equilibrates with the outside environment, and the environment drifts even more slowly toward equilibrium with the rest of the universe. The giants of 19th century thermodynamics viewed this process as a gradual dispersal of energy that increases the overall entropy, or disorder, of the universe. Today, Lloyd, Popescu and others in their field see the arrow of time differently. In their view, information becomes increasingly diffuse, but it never disappears completely. So, they assert, although entropy increases locally, the overall entropy of the universe stays constant at zero.
“The universe as a whole is in a pure state,” Lloyd said. “But individual pieces of it, because they are entangled with the rest of the universe, are in mixtures.”
One aspect of time’s arrow remains unsolved. “There is nothing in these works to say why you started at the gate,” Popescu said, referring to the park analogy. “In other words, they don’t explain why the initial state of the universe was far from equilibrium.” He said this is a question about the nature of the Big Bang.
Despite the recent progress in calculating equilibration time scales, the new approach has yet to make headway as a tool for parsing the thermodynamic properties of specific things, like coffee, glass or exotic states of matter. (Several traditional thermodynamicists reported being only vaguely aware of the new approach.) “The thing is to find the criteria for which things behave like window glass and which things behave like a cup of tea,” Renner said. “I would see the new papers as a step in this direction, but much more needs to be done.”
Some researchers expressed doubt that this abstract approach to thermodynamics will ever be up to the task of addressing the “hard nitty-gritty of how specific observables behave,” as Lloyd put it. But the conceptual advance and new mathematical formalism is already helping researchers address theoretical questions about thermodynamics, such as the fundamental limits of quantum computers and even the ultimate fate of the universe.
“We’ve been thinking more and more about what we can do with quantum machines,” said Paul Skrzypczyk of the Institute of Photonic Sciences in Barcelona. “Given that a system is not yet at equilibrium, we want to get work out of it. How much useful work can we extract? How can I intervene to do something interesting?”
Sean Carroll, a theoretical cosmologist at the California Institute of Technology, is employing the new formalism in his latest work on time’s arrow in cosmology. “I’m interested in the ultra-long-term fate of cosmological space-times,” said Carroll, author of “From Eternity to Here: The Quest for the Ultimate Theory of Time.” “That’s a situation where we don’t really know all of the relevant laws of physics, so it makes sense to think on a very abstract level, which is why I found this basic quantum-mechanical treatment useful.”
Twenty-six years after Lloyd’s big idea about time’s arrow fell flat, he is pleased to be witnessing its rise and has been applying the ideas in recent work on the black hole information paradox. “I think now the consensus would be that there is physics in this,” he said.
Not to mention a bit of philosophy.
According to the scientists, our ability to remember the past but not the future, another historically confounding manifestation of time’s arrow, can also be understood as a buildup of correlations between interacting particles. When you read a message on a piece of paper, your brain becomes correlated with it through the photons that reach your eyes. Only from that moment on will you be capable of remembering what the message says. As Lloyd put it: “The present can be defined by the process of becoming correlated with our surroundings.”
The backdrop for the steady growth of entanglement throughout the universe is, of course, time itself. The physicists stress that despite great advances in understanding how changes in time occur, they have made no progress in uncovering the nature of time itself or why it seems different (both perceptually and in the equations of quantum mechanics) than the three dimensions of space. Popescu calls this “one of the greatest unknowns in physics.”
“We can discuss the fact that an hour ago, our brains were in a state that was correlated with fewer things,” he said. “But our perception that time is flowing — that is a different matter altogether. Most probably, we will need a further revolution in physics that will tell us about that.”
https://www.quantamagazine.org/20150428-how-quantum-pairs-stitch-space-time/
The Quantum Thermodynamics Revolution
As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.
Natalie Wolchover, Senior Writer, Quanta Magazine, May 2, 2017
https://www.quantamagazine.org/quantum-thermodynamics-revolution/
In his 1824 book, Reflections on the Motive Power of Fire, the 28-year-old French engineer Sadi Carnot worked out a formula for how efficiently steam engines can convert heat — now known to be a random, diffuse kind of energy — into work, an orderly kind of energy that might push a piston or turn a wheel. To Carnot’s surprise, he discovered that a perfect engine’s efficiency depends only on the temperatures of the engine’s heat source (typically a fire) and its heat sink (typically the outside air). Work is a byproduct, Carnot realized, of heat naturally passing to a colder body from a warmer one.
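Carnot’s formula itself is compact: a reversible engine running between absolute temperatures T_hot and T_cold converts at most a fraction 1 - T_cold/T_hot of the heat into work. A small sketch (the boiler and air temperatures are illustrative numbers, not from the article):

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat a reversible engine can turn into work:
    eta = 1 - T_cold / T_hot, with temperatures in kelvin."""
    if not t_hot_k > t_cold_k > 0:
        raise ValueError("need t_hot_k > t_cold_k > 0 (absolute temperatures)")
    return 1.0 - t_cold_k / t_hot_k

# A boiler at 450 K exhausting to 300 K air: at most a third of the heat
# can become work, no matter how cleverly the engine is built.
print(carnot_efficiency(450.0, 300.0))  # ~0.333
```

Note that the formula uses the ratio of absolute temperatures, which is why no engine with a finite heat sink can be perfectly efficient.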
Carnot died of cholera eight years later, before he could see his efficiency formula develop over the 19th century into the theory of thermodynamics: a set of universal laws dictating the interplay among temperature, heat, work, energy and entropy — a measure of energy’s incessant spreading from more- to less-energetic bodies. The laws of thermodynamics apply not only to steam engines but also to everything else: the sun, black holes, living beings and the entire universe. The theory is so simple and general that Albert Einstein deemed it likely to “never be overthrown.”
Yet since the beginning, thermodynamics has held a singularly strange status among the theories of nature.
“If physical theories were people, thermodynamics would be the village witch,” the physicist Lídia del Rio and co-authors wrote last year in Journal of Physics A. “The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no one dares to contradict her.”
Unlike, say, the Standard Model of particle physics, which tries to get at what exists, the laws of thermodynamics only say what can and can’t be done. But one of the strangest things about the theory is that these rules seem subjective. A gas made of particles that in aggregate all appear to be the same temperature — and therefore unable to do work — might, upon closer inspection, have microscopic temperature differences that could be exploited after all. As the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”
In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply.
They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured. “Quantum thermodynamics” is a field in the making, marked by a typical mix of exuberance and confusion.
“We are entering a brave new world of thermodynamics,” said Sandu Popescu, a physicist at the University of Bristol who is one of the leaders of the research effort. “Although it was very good as it started,” he said, referring to classical thermodynamics, “by now we are looking at it in a completely new way.”
Entropy as Uncertainty
In an 1867 letter to his fellow Scotsman Peter Tait, Maxwell described his now-famous paradox hinting at the connection between thermodynamics and information. The paradox concerned the second law of thermodynamics — the rule that entropy always increases — which Sir Arthur Eddington would later say “holds the supreme position among the laws of nature.” According to the second law, energy becomes ever more disordered and less useful as it spreads to colder bodies from hotter ones and differences in temperature diminish. (Recall Carnot’s discovery that you need a hot body and a cold body to do work.) Fires die out, cups of coffee cool and the universe rushes toward a state of uniform temperature known as “heat death,” after which no more work can be done.
The great Austrian physicist Ludwig Boltzmann showed that energy disperses, and entropy increases, as a simple matter of statistics: There are many more ways for energy to be spread among the particles in a system than concentrated in a few, so as particles move around and interact, they naturally tend toward states in which their energy is increasingly shared.
But Maxwell’s letter described a thought experiment in which an enlightened being — later called Maxwell’s demon — uses its knowledge to lower entropy and violate the second law. The demon knows the positions and velocities of every molecule in a container of gas. By partitioning the container and opening and closing a small door between the two chambers, the demon lets only fast-moving molecules enter one side, while allowing only slow molecules to go the other way. The demon’s actions divide the gas into hot and cold, concentrating its energy and lowering its overall entropy. The once useless gas can now be put to work.
Maxwell and others wondered how a law of nature could depend on one’s knowledge — or ignorance — of the positions and velocities of molecules. If the second law of thermodynamics depends subjectively on one’s information, in what sense is it true?
A century later, the American physicist Charles Bennett, building on work by Leo Szilard and Rolf Landauer, resolved the paradox by formally linking thermodynamics to the young science of information. Bennett argued that the demon’s knowledge is stored in its memory, and memory has to be cleaned, which takes work. (In 1961, Landauer calculated that at room temperature, it takes at least 2.9 zeptojoules of energy for a computer to erase one bit of stored information.) In other words, as the demon organizes the gas into hot and cold and lowers the gas’s entropy, its brain burns energy and generates more than enough entropy to compensate. The overall entropy of the gas-demon system increases, satisfying the second law of thermodynamics.
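Landauer’s number is just k_B·T·ln 2 evaluated at room temperature, which is straightforward to verify:

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant in joules per kelvin (exact SI value)

def landauer_limit(temp_k: float) -> float:
    """Minimum energy, in joules, to erase one bit at temperature temp_k."""
    return K_B * temp_k * log(2)

# At room temperature (~300 K) this gives ~2.87e-21 J, i.e. roughly the
# 2.9 zeptojoules per bit that Landauer calculated.
print(landauer_limit(300.0))
```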
The findings revealed that, as Landauer put it, “Information is physical.” The more information you have, the more work you can extract. Maxwell’s demon can wring work out of a single-temperature gas because it has far more information than the average user.
But it took another half century and the rise of quantum information theory, a field born in pursuit of the quantum computer, for physicists to fully explore the startling implications.
Over the past decade, Popescu and his Bristol colleagues, along with other groups, have argued that energy spreads to cold objects from hot ones because of the way information spreads between particles. According to quantum theory, the physical properties of particles are probabilistic; instead of being representable as 1 or 0, they can have some probability of being 1 and some probability of being 0 at the same time. When particles interact, they can also become entangled, joining together the probability distributions that describe both of their states. A central pillar of quantum theory is that the information — the probabilistic 1s and 0s representing particles’ states — is never lost. (The present state of the universe preserves all information about the past.)
Over time, however, as particles interact and become increasingly entangled, information about their individual states spreads and becomes shuffled and shared among more and more particles. Popescu and his colleagues believe that the arrow of increasing quantum entanglement underlies the expected rise in entropy — the thermodynamic arrow of time. A cup of coffee cools to room temperature, they explain, because as coffee molecules collide with air molecules, the information that encodes their energy leaks out and is shared by the surrounding air.
Understanding entropy as a subjective measure allows the universe as a whole to evolve without ever losing information. Even as parts of the universe, such as coffee, engines and people, experience rising entropy as their quantum information dilutes, the global entropy of the universe stays forever zero.
Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”
Moreover, the idea that energy has two forms, useless heat and useful work, “made sense for steam engines,” Renner said. “In the new way, there is a whole spectrum in between — energy about which we have partial information.”
Entropy and thermodynamics are “much less of a mystery in this new view,” he said. “That’s why people like the new view better than the old one.”
Thermodynamics From Symmetry
The relationship among information, energy and other “conserved quantities,” which can change hands but never be destroyed, took a new turn in two papers published simultaneously last July in Nature Communications, one by the Bristol team and another by a team that included Jonathan Oppenheim at University College London. Both groups conceived of a hypothetical quantum system that uses information as a sort of currency for trading between the other, more material resources.
Imagine a vast container, or reservoir, of particles that possess both energy and angular momentum (they’re both moving around and spinning). This reservoir is connected to both a weight, which takes energy to lift, and a turntable, which takes angular momentum to speed up or slow down. Normally, a single reservoir can’t do any work — this goes back to Carnot’s discovery about the need for hot and cold reservoirs. But the researchers found that a reservoir containing multiple conserved quantities follows different rules. “If you have two different physical quantities that are conserved, like energy and angular momentum,” Popescu said, “as long as you have a bath that contains both of them, then you can trade one for another.”
In the hypothetical weight-reservoir-turntable system, the weight can be lifted as the turntable slows down, or, conversely, lowering the weight causes the turntable to spin faster. The researchers found that the quantum information describing the particles’ energy and spin states can act as a kind of currency that enables trading between the reservoir’s energy and angular momentum supplies. The notion that conserved quantities can be traded for one another in quantum systems is brand new. It may suggest the need for a more complete thermodynamic theory that would describe not only the flow of energy, but also the interplay between all the conserved quantities in the universe.
The fact that energy has dominated the thermodynamics story up to now might be circumstantial rather than profound, Oppenheim said. Carnot and his successors might have developed a thermodynamic theory governing the flow of, say, angular momentum to go with their engine theory, if only there had been a need. “We have energy sources all around us that we want to extract and use,” Oppenheim said. “It happens to be the case that we don’t have big angular momentum heat baths around us. We don’t come across huge gyroscopes.”
Popescu, who won a Dirac Medal last year for his insights in quantum information theory and quantum foundations, said he and his collaborators work by “pushing quantum mechanics into a corner,” gathering at a blackboard and reasoning their way to a new insight, after which it’s easy to derive the associated equations. Some realizations are in the process of crystallizing. In one of several phone conversations in March, Popescu discussed a new thought experiment that illustrates a distinction between information and other conserved quantities — and indicates how symmetries in nature might set them apart.
“Suppose that you and I are living on different planets in remote galaxies,” he said, and suppose that he, Popescu, wants to communicate where you should look to find his planet. The only problem is, this is physically impossible: “I can send you the story of Hamlet. But I cannot indicate for you a direction.”
There’s no way to express in a string of pure, directionless 1s and 0s which way to look to find each other’s galaxies because “nature doesn’t provide us with [a reference frame] that is universal,” Popescu said. If it did — if, for instance, tiny arrows were sewn everywhere in the fabric of the universe, indicating its direction of motion — this would violate “rotational invariance,” a symmetry of the universe. Turntables would start turning faster when aligned with the universe’s motion, and angular momentum would not appear to be conserved. The early-20th-century mathematician Emmy Noether showed that every symmetry comes with a conservation law: The rotational symmetry of the universe reflects the preservation of a quantity we call angular momentum. Popescu’s thought experiment suggests that the impossibility of expressing spatial direction with information “may be related to the conservation law,” he said.
The seeming inability to express everything about the universe in terms of information could be relevant to the search for a more fundamental description of nature. In recent years, many theorists have come to believe that space-time, the bendy fabric of the universe, and the matter and energy within it might be a hologram that arises from a network of entangled quantum information. “One has to be careful,” Oppenheim said, “because information does behave differently than other physical properties, like space-time.”
Knowing the logical links between the concepts could also help physicists reason their way inside black holes, mysterious space-time-swallowing objects that are known to have temperatures and entropies, and which somehow radiate information. “One of the most important aspects of the black hole is its thermodynamics,” Popescu said. “But the type of thermodynamics that they discuss in the black holes, because it’s such a complicated subject, is still more of a traditional type. We are developing a completely novel view on thermodynamics.” It’s “inevitable,” he said, “that these new tools that we are developing will then come back and be used in the black hole.”
What to Tell Technologists
Janet Anders, a quantum information scientist at the University of Exeter, takes a technology-driven approach to understanding quantum thermodynamics. “If we go further and further down [in scale], we’re going to hit a region that we don’t have a good theory for,” Anders said. “And the question is, what do we need to know about this region to tell technologists?”
In 2012, Anders conceived of and co-founded a European research network devoted to quantum thermodynamics that now has 300 members. With her colleagues in the network, she hopes to discover the rules governing the quantum transitions of quantum engines and fridges, which could someday drive or cool computers or be used in solar panels, bioengineering and other applications. Already, researchers are getting a better sense of what quantum engines might be capable of. In 2015, Raam Uzdin and colleagues at the Hebrew University of Jerusalem calculated that quantum engines can outpower classical engines. These probabilistic engines still follow Carnot’s efficiency formula in terms of how much work they can derive from energy passing between hot and cold bodies. But they’re sometimes able to extract the work much more quickly, giving them more power. An engine made of a single ion was experimentally demonstrated and reported in Science in April 2016, though it didn’t harness the power-enhancing quantum effect.
Popescu, Oppenheim, Renner and their cohorts are also pursuing more concrete discoveries. In March, Oppenheim and his former student, Lluis Masanes, published a paper deriving the third law of thermodynamics — a historically confusing statement about the impossibility of reaching absolute-zero temperature — using quantum information theory. They showed that the “cooling speed limit” preventing you from reaching absolute zero arises from the limit on how fast information can be pumped out of the particles in a finite-size object. The speed limit might be relevant to the cooling abilities of quantum fridges, like the one reported in a preprint in February. In 2015, Oppenheim and other collaborators showed that the second law of thermodynamics is replaced, on quantum scales, by a panoply of second “laws” — constraints on how the probability distributions defining the physical states of particles evolve, including in quantum engines.
As the field of quantum thermodynamics grows quickly, spawning a range of approaches and findings, some traditional thermodynamicists see a mess. Peter Hänggi, a vocal critic at the University of Augsburg in Germany, thinks the importance of information is being oversold by ex-practitioners of quantum computing, who he says mistake the universe for a giant quantum information processor instead of a physical thing. He accuses quantum information theorists of confusing different kinds of entropy — the thermodynamic and information-theoretic kinds — and using the latter in domains where it doesn’t apply. Maxwell’s demon “gets on my nerves,” Hänggi said. When asked about Oppenheim and company’s second “laws” of thermodynamics, he said, “You see why my blood pressure rises.”
While Hänggi is seen as too old-fashioned in his critique (quantum-information theorists do study the connections between thermodynamic and information-theoretic entropy), other thermodynamicists said he makes some valid points. For instance, when quantum information theorists conjure up abstract quantum machines and see if they can get work out of them, they sometimes sidestep the question of how, exactly, you extract work from a quantum system, given that measuring it destroys its simultaneous quantum probabilities. Anders and her collaborators have recently begun addressing this issue with new ideas about quantum work extraction and storage. But the theoretical literature is all over the place.
“Many exciting things have been thrown on the table, a bit in disorder; we need to put them in order,” said Valerio Scarani, a quantum information theorist and thermodynamicist at the National University of Singapore who was part of the team that reported the quantum fridge. “We need a bit of synthesis. We need to understand your idea fits there; mine fits here. We have eight definitions of work; maybe we should try to figure out which one is correct in which situation, not just come up with a ninth definition of work.”
Oppenheim and Popescu fully agree with Hänggi that there’s a risk of downplaying the universe’s physicality. “I’m wary of information theorists who believe everything is information,” Oppenheim said. “When the steam engine was being developed and thermodynamics was in full swing, there were people positing that the universe was just a big steam engine.” In reality, he said, “it’s much messier than that.” What he likes about quantum thermodynamics is that “you have these two fundamental quantities — energy and quantum information — and these two things meet together. That to me is what makes it such a beautiful theory.”
__________________________
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)