Author Archives: New England Blogger
Data needs an interpretation to have meaning
Lesson: “Data has no meaning without a physical interpretation”
Content objectives:
1. SWBAT identify trends in data (apparent linear plots; apparently linear data plus noise; and simple harmonic motion).
Thesis: Raw data by itself doesn’t tell us anything about a physical phenomenon. We first need to know what physical phenomenon we are analyzing before we can interpret the data.
Tier III vocabulary: Simple harmonic motion
Launch: Students are given graph paper and data. They plot the given data points and connect the dots in whatever way they think is logical.
Question: Justify why you connected the dots in that way. Why not in some other way?
Direct Instruction/guided practice
Teacher instructions
Create a sine wave. I do so here using Desmos – desmos.com/calculator/xxmkiptej7
I modified this function to be Y = 4•sin(1.5x)
Don’t tell the students yet: this sine wave is a position-versus-time graph of an object in the real world undergoing simple harmonic motion.
The y-axis can be interpreted as height; the x-axis is time.
Let’s get some data points from this function. Draw a straight line across it, from upper right to lower left.
The line will intersect the sine wave at many points.
Overlay some semi-transparent graph paper on top of this, and plot these points. Or, as I have done here, do it on an app. In this example we have seven data points.
Give the students the coordinates for these points but do not show them the graph! Just give them the data. Ask them to interpret it, plot it, and hypothesize about what the data could mean.
Tag six more points on the sine wave that are not on the original straight line.
Here I chose some data points that we could sample from actual motion, if we happened to be sampling at just the right time interval.
Again, give students these coordinates without showing them the graph. Ask them to interpret it, plot it, and hypothesize about what the data could mean.
If one were to plot only these points then they would appear as a straight line.
A naïve reading of the raw data would lead one (mistakenly) to believe that we are studying some kind of linear phenomenon.
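The construction above can be sketched numerically: intersect a straight line with the hidden sine curve and keep only the intersection points. By construction those points lie exactly on the line, so a student who sees only the coordinates has no reason to suspect a sine wave. A minimal Python sketch (the line’s slope is an arbitrary choice for illustration; a shallower line yields more intersections):

```python
import numpy as np

def f(x):
    return 4 * np.sin(1.5 * x)   # the hidden "true" function

def g(x):
    return 0.3 * x               # a straight line drawn across the sine

# Locate intersections of f and g via sign changes of f - g,
# then refine each crossing with bisection.
xs = np.linspace(-5, 5, 200001)
d = f(xs) - g(xs)
idx = np.where(np.sign(d[:-1]) != np.sign(d[1:]))[0]

points = []
for i in idx:
    lo, hi = xs[i], xs[i + 1]
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if (f(lo) - g(lo)) * (f(mid) - g(mid)) <= 0:
            hi = mid
        else:
            lo = mid
    x = 0.5 * (lo + hi)
    points.append((x, f(x)))

# A student who plots only these coordinates sees a perfect straight line:
coeffs = np.polyfit([p[0] for p in points], [p[1] for p in points], 1)
print(len(points), coeffs)  # slope ~0.3, intercept ~0
```

Handing students only the `points` coordinates reproduces the launch activity: a linear fit matches the data perfectly, even though every point came from a sine curve.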
Very few students will quickly see that these points fit a sine curve; they will offer all sorts of answers.
When we are done with all of these examples, we can show students the original sine curve and each of the graphs, and explain how all the different data came from the same underlying phenomenon.
Part A: Justify your choice: What real-world motion would produce such a function? Think-Pair-Share
After the discussion, the teacher reveals what produces such data: SHM, Simple Harmonic Motion:
Summative question, tying this all together:
Why couldn’t most students plot the data correctly, even after the final data points were added?
Answer: Unless you know what kind of phenomenon you are studying, you have no idea whether the data is supposed to be linear, harmonic, exponential, etc. Data – by itself – has no meaning without a physical interpretation.
Closure: Query multiple students: Where do you experience SHM in your own life?
Possible answers: Moving back-and-forth on a swing, pendulum of a clock, automobile suspension system
Something more to think about:

Image from https://m.xkcd.com/2048/
Learning standards
A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (2012)
Dimension 1: Scientific and Engineering Practices: Practice 4: Analyzing and Interpreting Data.
“Once collected, data must be presented in a form that can reveal any patterns and relationships and that allows results to be communicated to others. Because raw data as such have little meaning, a major practice of scientists is to organize and interpret data through tabulating, graphing, or statistical analysis. Such analysis can bring out the meaning of data—and their relevance—so that they may be used as evidence.”
Escape Slide Parachute
MythBusters and the Scientific method
Episode 37, “Escape Slide Parachute” (the story of Vesna Vulović)
Textbook: Chapter 3, Accelerated Motion. Free Fall.
https://kaiserscience.wordpress.com/physics/physics-in-films/mythbusters/
- What is the myth?
A person can survive free-fall from over 20,000 feet
How far is this in meters? Convert feet to meters: 1 foot = 0.3048 meters.
20,000 feet × 0.3048 meters/foot ≈ 6100 m = 6.1 × 10³ m
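The conversion can be checked in a couple of lines, using the exact definition of the international foot:

```python
FT_TO_M = 0.3048          # exact by definition of the international foot

altitude_ft = 20_000
altitude_m = altitude_ft * FT_TO_M
print(f"{altitude_ft} ft = {altitude_m:.0f} m")  # 20000 ft = 6096 m
```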
- Why would people believe that myth is true?
(Discuss)
Newspaper articles: Vesna Vulović, a Serbian flight attendant, survived the explosion of a DC-9 from 30,000 feet in 1972.
- Testable prediction: A crash-test dummy, dropped from a high altitude inside a section of an airplane, will sustain survivable damage
- How do they test their prediction?
* Purchase used airplane
* Cut away the rear section of the plane.
* Strap a crash-test dummy into flight attendant’s seat.
* Lift section up, thousands of feet high, with a helicopter.
* Drop section in free-fall.
- Evaluation of the results?
* Buster had a terrible amount of damage.
* A person in that position would not survive this drop.
* Yet it has been confirmed that Vesna Vulović (a Serbian flight attendant) survived the explosion of a DC-9 from 30,000 feet in 1972.
* The section of the plane that the MythBusters dropped was only partially destroyed; other parts were still intact.
* Conclusion: Since we know such a fall is theoretically survivable, something was not adequate about the experiment.
* Did they need to form a new hypothesis? Yes – they should re-test with multiple crash dummies, seated in various sections of the airplane, and the entire test could be repeated several times.
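For context, the kinematics behind the myth can be sketched in a few lines. Ignoring air resistance (as in the textbook’s free-fall chapter) gives a wildly unrealistic impact speed; in reality, drag limits a falling body to a roughly constant terminal velocity, for which ~55 m/s is a commonly cited ballpark (an assumption here, not a figure from the episode):

```python
import math

g = 9.8       # free-fall acceleration, m/s^2
h = 6096.0    # 20,000 ft expressed in meters

# Impact speed with no air resistance: v = sqrt(2 g h)
v_no_drag = math.sqrt(2 * g * h)

# Rough terminal velocity of a falling human body (assumed ballpark value)
v_terminal = 55.0

print(f"no drag: {v_no_drag:.0f} m/s; with drag: roughly {v_terminal:.0f} m/s")
```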

Theoretical physics: The origins of space and time
Many researchers believe that physics will not be complete until it can explain not just the behaviour of space and time, but where these entities come from.
Zeeya Merali, Nature, 28 August 2013
“Imagine waking up one day and realizing that you actually live inside a computer game,” says Mark Van Raamsdonk, describing what sounds like a pitch for a science-fiction film. But for Van Raamsdonk, a physicist at the University of British Columbia in Vancouver, Canada, this scenario is a way to think about reality. If it is true, he says, “everything around us — the whole three-dimensional physical world — is an illusion born from information encoded elsewhere, on a two-dimensional chip”. That would make our Universe, with its three spatial dimensions, a kind of hologram, projected from a substrate that exists only in lower dimensions.
This ‘holographic principle’ is strange even by the usual standards of theoretical physics. But Van Raamsdonk is one of a small band of researchers who think that the usual ideas are not yet strange enough. If nothing else, they say, neither of the two great pillars of modern physics — general relativity, which describes gravity as a curvature of space and time, and quantum mechanics, which governs the atomic realm — gives any account for the existence of space and time. Neither does string theory, which describes elementary threads of energy.
Van Raamsdonk and his colleagues are convinced that physics will not be complete until it can explain how space and time emerge from something more fundamental — a project that will require concepts at least as audacious as holography. They argue that such a radical reconceptualization of reality is the only way to explain what happens when the infinitely dense ‘singularity’ at the core of a black hole distorts the fabric of space-time beyond all recognition, or how researchers can unify atomic-level quantum theory and planet-level general relativity — a project that has resisted theorists’ efforts for generations.
“All our experiences tell us we shouldn’t have two dramatically different conceptions of reality — there must be one huge overarching theory,” says Abhay Ashtekar, a physicist at Pennsylvania State University in University Park.
Finding that one huge theory is a daunting challenge. Here, Nature explores some promising lines of attack — as well as some of the emerging ideas about how to test these concepts.
NIK SPENCER/NATURE; Panel 4 adapted from Budd, T. & Loll, R. Phys. Rev. D 88, 024015 (2013)
Gravity as thermodynamics
One of the most obvious questions to ask is whether this endeavour is a fool’s errand. Where is the evidence that there actually is anything more fundamental than space and time?
A provocative hint comes from a series of startling discoveries made in the early 1970s, when it became clear that quantum mechanics and gravity were intimately intertwined with thermodynamics, the science of heat.
In 1974, most famously, Stephen Hawking of the University of Cambridge, UK, showed that quantum effects in the space around a black hole will cause it to spew out radiation as if it was hot. Other physicists quickly determined that this phenomenon was quite general. Even in completely empty space, they found, an astronaut undergoing acceleration would perceive that he or she was surrounded by a heat bath. The effect would be too small to be perceptible for any acceleration achievable by rockets, but it seemed to be fundamental. If quantum theory and general relativity are correct — and both have been abundantly corroborated by experiment — then the existence of Hawking radiation seemed inescapable.
A second key discovery was closely related. In standard thermodynamics, an object can radiate heat only by decreasing its entropy, a measure of the number of quantum states inside it. And so it is with black holes: even before Hawking’s 1974 paper, Jacob Bekenstein, now at the Hebrew University of Jerusalem, had shown that black holes possess entropy.
But there was a difference. In most objects, the entropy is proportional to the number of atoms the object contains, and thus to its volume. But a black hole’s entropy turned out to be proportional to the surface area of its event horizon — the boundary out of which not even light can escape. It was as if that surface somehow encoded information about what was inside, just as a two-dimensional hologram encodes a three-dimensional image.
In 1995, Ted Jacobson, a physicist at the University of Maryland in College Park, combined these two findings, and postulated that every point in space lies on a tiny ‘black-hole horizon’ that also obeys the entropy–area relationship. From that, he found, the mathematics yielded Einstein’s equations of general relativity — but using only thermodynamic concepts, not the idea of bending space-time1.
“This seemed to say something deep about the origins of gravity,” says Jacobson. In particular, the laws of thermodynamics are statistical in nature — a macroscopic average over the motions of myriad atoms and molecules — so his result suggested that gravity is also statistical, a macroscopic approximation to the unseen constituents of space and time.
In 2010, this idea was taken a step further by Erik Verlinde, a string theorist at the University of Amsterdam, who showed2 that the statistical thermodynamics of the space-time constituents — whatever they turned out to be — could automatically generate Newton’s law of gravitational attraction.
And in separate work, Thanu Padmanabhan, a cosmologist at the Inter-University Centre for Astronomy and Astrophysics in Pune, India, showed3 that Einstein’s equations can be rewritten in a form that makes them identical to the laws of thermodynamics — as can many alternative theories of gravity. Padmanabhan is currently extending the thermodynamic approach in an effort to explain the origin and magnitude of dark energy: a mysterious cosmic force that is accelerating the Universe’s expansion.
Testing such ideas empirically will be extremely difficult. In the same way that water looks perfectly smooth and fluid until it is observed on the scale of its molecules — a fraction of a nanometre — estimates suggest that space-time will look continuous all the way down to the Planck scale: roughly 10−35 metres, or some 20 orders of magnitude smaller than a proton.
But it may not be impossible. One often-mentioned way to test whether space-time is made of discrete constituents is to look for delays as high-energy photons travel to Earth from distant cosmic events such as supernovae and γ-ray bursts. In effect, the shortest-wavelength photons would sense the discreteness as a subtle bumpiness in the road they had to travel, which would slow them down ever so slightly.
Giovanni Amelino-Camelia, a quantum-gravity researcher at the University of Rome, and his colleagues have found4 hints of just such delays in the photons from a γ-ray burst recorded in April. The results are not definitive, says Amelino-Camelia, but the group plans to expand its search to look at the travel times of high-energy neutrinos produced by cosmic events. He says that if theories cannot be tested, “then to me, they are not science. They are just religious beliefs, and they hold no interest for me.”
Other physicists are looking at laboratory tests. In 2012, for example, researchers from the University of Vienna and Imperial College London proposed5 a tabletop experiment in which a microscopic mirror would be moved around with lasers. They argued that Planck-scale granularities in space-time would produce detectable changes in the light reflected from the mirror (see Nature http://doi.org/njf; 2012).
Loop quantum gravity
Even if it is correct, the thermodynamic approach says nothing about what the fundamental constituents of space and time might be. If space-time is a fabric, so to speak, then what are its threads?
One possible answer is quite literal. The theory of loop quantum gravity, which has been under development since the mid-1980s by Ashtekar and others, describes the fabric of space-time as an evolving spider’s web of strands that carry information about the quantized areas and volumes of the regions they pass through6. The individual strands of the web must eventually join their ends to form loops — hence the theory’s name — but have nothing to do with the much better-known strings of string theory. The latter move around in space-time, whereas strands actually are space-time: the information they carry defines the shape of the space-time fabric in their vicinity.
Because the loops are quantum objects, however, they also define a minimum unit of area in much the same way that ordinary quantum mechanics defines a minimum ground-state energy for an electron in a hydrogen atom. This quantum of area is a patch roughly one Planck scale on a side. Try to insert an extra strand that carries less area, and it will simply disconnect from the rest of the web. It will not be able to link to anything else, and will effectively drop out of space-time.
One welcome consequence of a minimum area is that loop quantum gravity cannot squeeze an infinite amount of curvature onto an infinitesimal point. This means that it cannot produce the kind of singularities that cause Einstein’s equations of general relativity to break down at the instant of the Big Bang and at the centres of black holes.
In 2006, Ashtekar and his colleagues reported7 a series of simulations that took advantage of that fact, using the loop quantum gravity version of Einstein’s equations to run the clock backwards and visualize what happened before the Big Bang. The reversed cosmos contracted towards the Big Bang, as expected. But as it approached the fundamental size limit dictated by loop quantum gravity, a repulsive force kicked in and kept the singularity open, turning it into a tunnel to a cosmos that preceded our own.
This year, physicists Rodolfo Gambini at the Uruguayan University of the Republic in Montevideo and Jorge Pullin at Louisiana State University in Baton Rouge reported8 a similar simulation for a black hole. They found that an observer travelling deep into the heart of a black hole would encounter not a singularity, but a thin space-time tunnel leading to another part of space. “Getting rid of the singularity problem is a significant achievement,” says Ashtekar, who is working with other researchers to identify signatures that would have been left by a bounce, rather than a bang, on the cosmic microwave background — the radiation left over from the Universe’s massive expansion in its infant moments.
Loop quantum gravity is not a complete unified theory, because it does not include any other forces. Furthermore, physicists have yet to show how ordinary space-time would emerge from such a web of information. But Daniele Oriti, a physicist at the Max Planck Institute for Gravitational Physics in Golm, Germany, is hoping to find inspiration in the work of condensed-matter physicists, who have produced exotic phases of matter that undergo transitions described by quantum field theory. Oriti and his colleagues are searching for formulae to describe how the Universe might similarly change phase, transitioning from a set of discrete loops to a smooth and continuous space-time. “It is early days and our job is hard because we are fishes swimming in the fluid at the same time as trying to understand it,” says Oriti.
Causal sets
Such frustrations have led some investigators to pursue a minimalist programme known as causal set theory. Pioneered by Rafael Sorkin, a physicist at the Perimeter Institute in Waterloo, Canada, the theory postulates that the building blocks of space-time are simple mathematical points that are connected by links, with each link pointing from past to future. Such a link is a bare-bones representation of causality, meaning that an earlier point can affect a later one, but not vice versa. The resulting network is like a growing tree that gradually builds up into space-time. “You can think of space emerging from points in a similar way to temperature emerging from atoms,” says Sorkin. “It doesn’t make sense to ask, ‘What’s the temperature of a single atom?’ You need a collection for the concept to have meaning.”
In the late 1980s, Sorkin used this framework to estimate9 the number of points that the observable Universe should contain, and reasoned that they should give rise to a small intrinsic energy that causes the Universe to accelerate its expansion. A few years later, the discovery of dark energy confirmed his guess. “People often think that quantum gravity cannot make testable predictions, but here’s a case where it did,” says Joe Henson, a quantum-gravity researcher at Imperial College London. “If the value of dark energy had been larger, or zero, causal set theory would have been ruled out.”
Causal dynamical triangulations
That hardly constituted proof, however, and causal set theory has offered few other predictions that could be tested. Some physicists have found it much more fruitful to use computer simulations. The idea, which dates back to the early 1990s, is to approximate the unknown fundamental constituents with tiny chunks of ordinary space-time caught up in a roiling sea of quantum fluctuations, and to follow how these chunks spontaneously glue themselves together into larger structures.
The earliest efforts were disappointing, says Renate Loll, a physicist now at Radboud University in Nijmegen, the Netherlands. The space-time building blocks were simple hyper-pyramids — four-dimensional counterparts to three-dimensional tetrahedrons — and the simulation’s gluing rules allowed them to combine freely. The result was a series of bizarre ‘universes’ that had far too many dimensions (or too few), and that folded back on themselves or broke into pieces. “It was a free-for-all that gave back nothing that resembles what we see around us,” says Loll.
But, like Sorkin, Loll and her colleagues found that adding causality changed everything. After all, says Loll, the dimension of time is not quite like the three dimensions of space. “We cannot travel back and forth in time,” she says. So the team changed its simulations to ensure that effects could not come before their cause — and found that the space-time chunks started consistently assembling themselves into smooth four-dimensional universes with properties similar to our own10.
Intriguingly, the simulations also hint that soon after the Big Bang, the Universe went through an infant phase with only two dimensions — one of space and one of time. This prediction has also been made independently by others attempting to derive equations of quantum gravity, and even some who suggest that the appearance of dark energy is a sign that our Universe is now growing a fourth spatial dimension. Others have shown that a two-dimensional phase in the early Universe would create patterns similar to those already seen in the cosmic microwave background.
Holography
Meanwhile, Van Raamsdonk has proposed a very different idea about the emergence of space-time, based on the holographic principle. Inspired by the hologram-like way that black holes store all their entropy at the surface, this principle was first given an explicit mathematical form by Juan Maldacena, a string theorist at the Institute for Advanced Study in Princeton, New Jersey, who published11 his influential model of a holographic universe in 1998. In that model, the three-dimensional interior of the universe contains strings and black holes governed only by gravity, whereas its two-dimensional boundary contains elementary particles and fields that obey ordinary quantum laws without gravity.
Hypothetical residents of the three-dimensional space would never see this boundary, because it would be infinitely far away. But that does not affect the mathematics: anything happening in the three-dimensional universe can be described equally well by equations in the two-dimensional boundary, and vice versa.
In 2010, Van Raamsdonk studied what that means when quantum particles on the boundary are ‘entangled’ — meaning that measurements made on one inevitably affect the other12. He discovered that if every particle entanglement between two separate regions of the boundary is steadily reduced to zero, so that the quantum links between the two disappear, the three-dimensional space responds by gradually dividing itself like a splitting cell, until the last, thin connection between the two halves snaps. Repeating that process will subdivide the three-dimensional space again and again, while the two-dimensional boundary stays connected. So, in effect, Van Raamsdonk concluded, the three-dimensional universe is being held together by quantum entanglement on the boundary — which means that in some sense, quantum entanglement and space-time are the same thing.
Or, as Maldacena puts it: “This suggests that quantum is the most fundamental, and space-time emerges from it.”
Nature 500, 516–519 (29 August 2013) doi:10.1038/500516a
http://www.nature.com/news/theoretical-physics-the-origins-of-space-and-time-1.13613
__________
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (added pub. l 94-553, Title I, 101, Oct 19, 1976, 90 Stat 2546)
__________________________________________________
Power (electrical)
If you look carefully at a stereo, hair dryer, or other household appliance, you find that most devices list a “power rating” that tells how many watts the appliance uses. In this section you will learn what these power ratings mean, and how to figure out the electricity costs of using various appliances.
The three electrical quantities

We have now learned three important electrical quantities:
* Current, measured in amperes (amps)
* Voltage, measured in volts
* Resistance, measured in ohms
Paying for electricity
Electric bills sent out by utility companies don’t charge by the volt, the amp, or the ohm. You may have noticed that electrical appliances in your home usually include another unit – the watt. Most appliances have a label that lists the number of watts or kilowatts. You may have purchased 60-watt light bulbs, or a 900-watt hair dryer, or a 1500-watt toaster oven. Electric companies charge for the energy you use, which depends on how many watts each appliance consumes in a given month.
A watt is a unit of power
The watt is a unit of power. Power, in the scientific sense, has a precise meaning: it is the rate at which energy is flowing. Energy is measured in joules, so power is measured in joules per second, and one joule per second is equal to one watt. A 100-watt light bulb uses 100 joules of energy every second.
Where does the electrical power go?
Electrical power can be easily transformed into many different forms. An electric motor takes electrical power and makes mechanical power. A light bulb turns electrical power into light, and a toaster oven turns the power into heat. The same unit (watts) applies to all forms of energy flow, including light, motion, electrical, thermal, and many others.
Power in a circuit can be measured using the tools we already have. Remember that one watt equals an energy flow of one joule per second.
Amps = a flow of 1 coulomb of charge per second
Volts = an energy of 1 joule per coulomb of charge
If these two quantities are multiplied together, the units of coulombs cancel out, leaving the equation we want for power:
(1 joule/coulomb) × (1 coulomb/second) = 1 joule/second = 1 watt
Watts equal joules per second, so we can calculate electrical power in a circuit by multiplying voltage times current:
P = VI
where power is measured in watts, voltage in volts, and current in amps.
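As a quick check of P = VI: a hypothetical 120 V household circuit drawing 12.5 A delivers exactly the 1500 W of the toaster oven mentioned above (120 V is a typical North American line voltage, not a figure from the text):

```python
def power_watts(volts, amps):
    """Electrical power P = V * I, in watts."""
    return volts * amps

print(power_watts(120, 12.5))  # 1500.0
```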
A larger unit of power is sometimes needed. A kilowatt (kW) is equal to 1000 watts, or 1000 joules per second. A 1500-watt toaster oven may be labeled 1.5 kW.
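Since utilities bill per kilowatt-hour, the cost of running an appliance follows directly. The $0.15/kWh rate below is an illustrative assumption, not a figure from the text:

```python
def energy_kwh(power_watts, hours):
    """Energy in kilowatt-hours = power (W) / 1000 * time (h)."""
    return power_watts / 1000 * hours

RATE = 0.15                      # assumed price per kWh, in dollars
kwh = energy_kwh(1500, 0.5)      # toaster oven running for half an hour
print(kwh, kwh * RATE)           # 0.75 kWh, costing about 11 cents
```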
Horsepower is another common unit of power, often seen on electric motors: 1 horsepower = 746 watts. Electric motors you find around the house range in size from 1/25th of a horsepower (about 30 watts) for a small electric fan to 2 horsepower (1492 watts) for an electric saw.
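The horsepower figures above follow directly from the 746 W conversion:

```python
HP_TO_W = 746  # 1 horsepower = 746 watts

fan_w = HP_TO_W / 25   # 1/25 hp electric fan -> about 30 W
saw_w = 2 * HP_TO_W    # 2 hp electric saw   -> 1492 W
print(round(fan_w), saw_w)  # 30 1492
```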
Time’s Arrow Traced to Quantum Source
A new theory explains the seemingly irreversible arrow of time while yielding insights into entropy, quantum computers, black holes, and the past-future divide.
Natalie Wolchover, Senior Writer, Quanta Magazine, April 16, 2014
Coffee cools, buildings crumble, eggs break and stars fizzle out in a universe that seems destined to degrade into a state of uniform drabness known as thermal equilibrium. The astronomer-philosopher Sir Arthur Eddington in 1927 cited the gradual dispersal of energy as evidence of an irreversible “arrow of time.”
But to the bafflement of generations of physicists, the arrow of time does not seem to follow from the underlying laws of physics, which work the same going forward in time as in reverse. By those laws, it seemed that if someone knew the paths of all the particles in the universe and flipped them around, energy would accumulate rather than disperse: Tepid coffee would spontaneously heat up, buildings would rise from their rubble and sunlight would slink back into the sun.
“In classical physics, we were struggling,” said Sandu Popescu, a professor of physics at the University of Bristol in the United Kingdom. “If I knew more, could I reverse the event, put together all the molecules of the egg that broke? Why am I relevant?”
Surely, he said, time’s arrow is not steered by human ignorance. And yet, since the birth of thermodynamics in the 1850s, the only known approach for calculating the spread of energy was to formulate statistical distributions of the unknown trajectories of particles, and show that, over time, the ignorance smeared things out.
Now, physicists are unmasking a more fundamental source for the arrow of time: Energy disperses and objects equilibrate, they say, because of the way elementary particles become intertwined when they interact — a strange effect called “quantum entanglement.”
“Finally, we can understand why a cup of coffee equilibrates in a room,” said Tony Short, a quantum physicist at Bristol. “Entanglement builds up between the state of the coffee cup and the state of the room.”
Popescu, Short and their colleagues Noah Linden and Andreas Winter reported the discovery in the journal Physical Review E in 2009, arguing that objects reach equilibrium, or a state of uniform energy distribution, within an infinite amount of time by becoming quantum mechanically entangled with their surroundings. Similar results by Peter Reimann of the University of Bielefeld in Germany appeared several months earlier in Physical Review Letters.
Short and a collaborator strengthened the argument in 2012 by showing that entanglement causes equilibration within a finite time. And, in work that was posted on the scientific preprint site arXiv.org in February, two separate groups have taken the next step, calculating that most physical systems equilibrate rapidly, on time scales proportional to their size. “To show that it’s relevant to our actual physical world, the processes have to be happening on reasonable time scales,” Short said.
The tendency of coffee — and everything else — to reach equilibrium is “very intuitive,” said Nicolas Brunner, a quantum physicist at the University of Geneva. “But when it comes to explaining why it happens, this is the first time it has been derived on firm grounds by considering a microscopic theory.”
If the new line of research is correct, then the story of time’s arrow begins with the quantum mechanical idea that, deep down, nature is inherently uncertain. An elementary particle lacks definite physical properties and is defined only by probabilities of being in various states. For example, at a particular moment, a particle might have a 50 percent chance of spinning clockwise and a 50 percent chance of spinning counterclockwise. An experimentally tested theorem by the Northern Irish physicist John Bell says there is no “true” state of the particle; the probabilities are the only reality that can be ascribed to it.
Quantum uncertainty then gives rise to entanglement, the putative source of the arrow of time.
When two particles interact, they can no longer even be described by their own, independently evolving probabilities, called “pure states.” Instead, they become entangled components of a more complicated probability distribution that describes both particles together. It might dictate, for example, that the particles spin in opposite directions. The system as a whole is in a pure state, but the state of each individual particle is “mixed” with that of its acquaintance. The two could travel light-years apart, and the spin of each would remain correlated with that of the other, a feature Albert Einstein famously described as “spooky action at a distance.”
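The pure-versus-mixed distinction the paragraph describes can be made concrete with a two-qubit Bell state: the pair as a whole is in a pure state, but tracing out one particle leaves the other in a maximally mixed state. A small NumPy sketch:

```python
import numpy as np

# Bell state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.zeros(4)
phi[0] = phi[3] = 1 / np.sqrt(2)
rho = np.outer(phi, phi)     # density matrix of the pair: a pure state

# Partial trace over the second particle gives the first particle's state
rho_a = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(rho_a)                       # 0.5 * identity: maximally mixed
print(np.trace(rho_a @ rho_a))     # purity 0.5 < 1, so the state is mixed
```

The purity of the joint state is 1 (pure), while each particle alone has purity 1/2: each particle’s individual description is “mixed” with that of its partner, exactly as described above.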
“Entanglement is in some sense the essence of quantum mechanics,” or the laws governing interactions on the subatomic scale, Brunner said. The phenomenon underlies quantum computing, quantum cryptography and quantum teleportation.
The idea that entanglement might explain the arrow of time first occurred to Seth Lloyd about 30 years ago, when he was a 23-year-old philosophy graduate student at Cambridge University with a Harvard physics degree. Lloyd realized that quantum uncertainty, and the way it spreads as particles become increasingly entangled, could replace human uncertainty in the old classical proofs as the true source of the arrow of time.
Using an obscure approach to quantum mechanics that treated units of information as its basic building blocks, Lloyd spent several years studying the evolution of particles in terms of shuffling 1s and 0s. He found that as the particles became increasingly entangled with one another, the information that originally described them (a “1” for clockwise spin and a “0” for counterclockwise, for example) would shift to describe the system of entangled particles as a whole. It was as though the particles gradually lost their individual autonomy and became pawns of the collective state. Eventually, the correlations contained all the information, and the individual particles contained none. At that point, Lloyd discovered, particles arrived at a state of equilibrium, and their states stopped changing, like coffee that has cooled to room temperature.
“What’s really going on is things are becoming more correlated with each other,” Lloyd recalls realizing. “The arrow of time is an arrow of increasing correlations.”
The idea, presented in his 1988 doctoral thesis, fell on deaf ears. When he submitted it to a journal, he was told that there was “no physics in this paper.” Quantum information theory “was profoundly unpopular” at the time, Lloyd said, and questions about time’s arrow “were for crackpots and Nobel laureates who have gone soft in the head,” he remembers one physicist telling him.
“I was darn close to driving a taxicab,” Lloyd said.
Advances in quantum computing have since turned quantum information theory into one of the most active branches of physics. Lloyd is now a professor at the Massachusetts Institute of Technology, recognized as one of the founders of the discipline, and his overlooked idea has resurfaced in a stronger form in the hands of the Bristol physicists. The newer proofs are more general, researchers say, and hold for virtually any quantum system.
“When Lloyd proposed the idea in his thesis, the world was not ready,” said Renato Renner, head of the Institute for Theoretical Physics at ETH Zurich. “No one understood it. Sometimes you have to have the idea at the right time.”
In 2009, the Bristol group’s proof resonated with quantum information theorists, opening up new uses for their techniques. It showed that as objects interact with their surroundings — as the particles in a cup of coffee collide with the air, for example — information about their properties “leaks out and becomes smeared over the entire environment,” Popescu explained. This local information loss causes the state of the coffee to stagnate even as the pure state of the entire room continues to evolve. Except for rare, random fluctuations, he said, “its state stops changing in time.”
Consequently, a tepid cup of coffee does not spontaneously warm up. In principle, as the pure state of the room evolves, the coffee could suddenly become unmixed from the air and enter a pure state of its own. But there are so many more mixed states than pure states available to the coffee that this practically never happens — one would have to outlive the universe to witness it. This statistical unlikelihood gives time’s arrow the appearance of irreversibility. “Essentially entanglement opens a very large space for you,” Popescu said. “It’s like you are at the park and you start next to the gate, far from equilibrium. Then you enter and you have this enormous place and you get lost in it. And you never come back to the gate.”
In the new story of the arrow of time, it is the loss of information through quantum entanglement, rather than a subjective lack of human knowledge, that drives a cup of coffee into equilibrium with the surrounding room. The room eventually equilibrates with the outside environment, and the environment drifts even more slowly toward equilibrium with the rest of the universe. The giants of 19th century thermodynamics viewed this process as a gradual dispersal of energy that increases the overall entropy, or disorder, of the universe. Today, Lloyd, Popescu and others in their field see the arrow of time differently. In their view, information becomes increasingly diffuse, but it never disappears completely. So, they assert, although entropy increases locally, the overall entropy of the universe stays constant at zero.
“The universe as a whole is in a pure state,” Lloyd said. “But individual pieces of it, because they are entangled with the rest of the universe, are in mixtures.”
One aspect of time’s arrow remains unsolved. “There is nothing in these works to say why you started at the gate,” Popescu said, referring to the park analogy. “In other words, they don’t explain why the initial state of the universe was far from equilibrium.” He said this is a question about the nature of the Big Bang.
Despite the recent progress in calculating equilibration time scales, the new approach has yet to make headway as a tool for parsing the thermodynamic properties of specific things, like coffee, glass or exotic states of matter. (Several traditional thermodynamicists reported being only vaguely aware of the new approach.) “The thing is to find the criteria for which things behave like window glass and which things behave like a cup of tea,” Renner said. “I would see the new papers as a step in this direction, but much more needs to be done.”
Some researchers expressed doubt that this abstract approach to thermodynamics will ever be up to the task of addressing the “hard nitty-gritty of how specific observables behave,” as Lloyd put it. But the conceptual advance and new mathematical formalism is already helping researchers address theoretical questions about thermodynamics, such as the fundamental limits of quantum computers and even the ultimate fate of the universe.
“We’ve been thinking more and more about what we can do with quantum machines,” said Paul Skrzypczyk of the Institute of Photonic Sciences in Barcelona. “Given that a system is not yet at equilibrium, we want to get work out of it. How much useful work can we extract? How can I intervene to do something interesting?”
Sean Carroll, a theoretical cosmologist at the California Institute of Technology, is employing the new formalism in his latest work on time’s arrow in cosmology. “I’m interested in the ultra-long-term fate of cosmological space-times,” said Carroll, author of “From Eternity to Here: The Quest for the Ultimate Theory of Time.” “That’s a situation where we don’t really know all of the relevant laws of physics, so it makes sense to think on a very abstract level, which is why I found this basic quantum-mechanical treatment useful.”
Twenty-six years after Lloyd’s big idea about time’s arrow fell flat, he is pleased to be witnessing its rise and has been applying the ideas in recent work on the black hole information paradox. “I think now the consensus would be that there is physics in this,” he said.
Not to mention a bit of philosophy.
According to the scientists, our ability to remember the past but not the future, another historically confounding manifestation of time’s arrow, can also be understood as a buildup of correlations between interacting particles. When you read a message on a piece of paper, your brain becomes correlated with it through the photons that reach your eyes. Only from that moment on will you be capable of remembering what the message says. As Lloyd put it: “The present can be defined by the process of becoming correlated with our surroundings.”
The backdrop for the steady growth of entanglement throughout the universe is, of course, time itself. The physicists stress that despite great advances in understanding how changes in time occur, they have made no progress in uncovering the nature of time itself or why it seems different (both perceptually and in the equations of quantum mechanics) than the three dimensions of space. Popescu calls this “one of the greatest unknowns in physics.”
“We can discuss the fact that an hour ago, our brains were in a state that was correlated with fewer things,” he said. “But our perception that time is flowing — that is a different matter altogether. Most probably, we will need a further revolution in physics that will tell us about that.”
https://www.quantamagazine.org/20150428-how-quantum-pairs-stitch-space-time/
The Quantum Thermodynamics Revolution
As physicists extend the 19th-century laws of thermodynamics to the quantum realm, they’re rewriting the relationships among energy, entropy and information.
Natalie Wolchover, Senior Writer, Quanta Magazine, May 2, 2017
https://www.quantamagazine.org/quantum-thermodynamics-revolution/
In his 1824 book, Reflections on the Motive Power of Fire, the 28-year-old French engineer Sadi Carnot worked out a formula for how efficiently steam engines can convert heat — now known to be a random, diffuse kind of energy — into work, an orderly kind of energy that might push a piston or turn a wheel. To Carnot’s surprise, he discovered that a perfect engine’s efficiency depends only on the difference in temperature between the engine’s heat source (typically a fire) and its heat sink (typically the outside air). Work is a byproduct, Carnot realized, of heat naturally passing to a colder body from a warmer one.
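Carnot's result is the standard formula η = 1 − T_cold/T_hot, with temperatures in kelvin. A quick sketch of the formula (the specific temperatures below are illustrative, not from the article):

```python
def carnot_efficiency(t_hot_k, t_cold_k):
    """Maximum fraction of heat a perfect engine can convert to work,
    operating between a hot source and a cold sink (temperatures in kelvin)."""
    return 1 - t_cold_k / t_hot_k

# e.g. steam at 450 K exhausting to 300 K outside air
print(round(carnot_efficiency(450, 300), 3))  # → 0.333
# no temperature difference means no work at all:
print(carnot_efficiency(300, 300))  # → 0.0
```

Note that only the two temperatures appear, just as Carnot found: the working substance and the engine's design drop out entirely.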
Carnot died of cholera eight years later, before he could see his efficiency formula develop over the 19th century into the theory of thermodynamics: a set of universal laws dictating the interplay among temperature, heat, work, energy and entropy — a measure of energy’s incessant spreading from more- to less-energetic bodies. The laws of thermodynamics apply not only to steam engines but also to everything else: the sun, black holes, living beings and the entire universe. The theory is so simple and general that Albert Einstein deemed it likely to “never be overthrown.”
Yet since the beginning, thermodynamics has held a singularly strange status among the theories of nature.
“If physical theories were people, thermodynamics would be the village witch,” the physicist Lídia del Rio and co-authors wrote last year in Journal of Physics A. “The other theories find her somewhat odd, somehow different in nature from the rest, yet everyone comes to her for advice, and no one dares to contradict her.”
Unlike, say, the Standard Model of particle physics, which tries to get at what exists, the laws of thermodynamics only say what can and can’t be done. But one of the strangest things about the theory is that these rules seem subjective. A gas made of particles that in aggregate all appear to be the same temperature — and therefore unable to do work — might, upon closer inspection, have microscopic temperature differences that could be exploited after all. As the 19th-century physicist James Clerk Maxwell put it, “The idea of dissipation of energy depends on the extent of our knowledge.”

In recent years, a revolutionary understanding of thermodynamics has emerged that explains this subjectivity using quantum information theory — “a toddler among physical theories,” as del Rio and co-authors put it, that describes the spread of information through quantum systems. Just as thermodynamics initially grew out of trying to improve steam engines, today’s thermodynamicists are mulling over the workings of quantum machines. Shrinking technology — a single-ion engine and three-atom fridge were both experimentally realized for the first time within the past year — is forcing them to extend thermodynamics to the quantum realm, where notions like temperature and work lose their usual meanings, and the classical laws don’t necessarily apply.
They’ve found new, quantum versions of the laws that scale up to the originals. Rewriting the theory from the bottom up has led experts to recast its basic concepts in terms of its subjective nature, and to unravel the deep and often surprising relationship between energy and information — the abstract 1s and 0s by which physical states are distinguished and knowledge is measured. “Quantum thermodynamics” is a field in the making, marked by a typical mix of exuberance and confusion.
“We are entering a brave new world of thermodynamics,” said Sandu Popescu, a physicist at the University of Bristol who is one of the leaders of the research effort. “Although it was very good as it started,” he said, referring to classical thermodynamics, “by now we are looking at it in a completely new way.”
Entropy as Uncertainty
In an 1867 letter to his fellow Scotsman Peter Tait, Maxwell described his now-famous paradox hinting at the connection between thermodynamics and information. The paradox concerned the second law of thermodynamics — the rule that entropy always increases — which Sir Arthur Eddington would later say “holds the supreme position among the laws of nature.” According to the second law, energy becomes ever more disordered and less useful as it spreads to colder bodies from hotter ones and differences in temperature diminish. (Recall Carnot’s discovery that you need a hot body and a cold body to do work.) Fires die out, cups of coffee cool and the universe rushes toward a state of uniform temperature known as “heat death,” after which no more work can be done.
The great Austrian physicist Ludwig Boltzmann showed that energy disperses, and entropy increases, as a simple matter of statistics: There are many more ways for energy to be spread among the particles in a system than concentrated in a few, so as particles move around and interact, they naturally tend toward states in which their energy is increasingly shared.
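Boltzmann's counting argument can be made concrete with the standard stars-and-bars tally of microstates (the textbook "Einstein solid" model, used here as an illustration rather than anything from the article): concentrated energy corresponds to very few arrangements, shared energy to very many.

```python
from math import comb

def microstates(quanta, particles):
    """Number of ways to distribute indistinguishable energy quanta
    among distinguishable particles (stars and bars: C(q + N - 1, q))."""
    return comb(quanta + particles - 1, quanta)

# All 6 quanta piled onto a single particle: exactly 1 arrangement.
print(microstates(6, 1))  # → 1
# The same 6 quanta shared among 6 particles: far more arrangements.
print(microstates(6, 6))  # → 462
```

Random interactions wander among arrangements at random, so the system overwhelmingly ends up in the shared-energy configurations, which is just entropy increasing.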
But Maxwell’s letter described a thought experiment in which an enlightened being — later called Maxwell’s demon — uses its knowledge to lower entropy and violate the second law. The demon knows the positions and velocities of every molecule in a container of gas. By partitioning the container and opening and closing a small door between the two chambers, the demon lets only fast-moving molecules enter one side, while allowing only slow molecules to go the other way. The demon’s actions divide the gas into hot and cold, concentrating its energy and lowering its overall entropy. The once useless gas can now be put to work.
Maxwell and others wondered how a law of nature could depend on one’s knowledge — or ignorance — of the positions and velocities of molecules. If the second law of thermodynamics depends subjectively on one’s information, in what sense is it true?

A century later, the American physicist Charles Bennett, building on work by Leo Szilard and Rolf Landauer, resolved the paradox by formally linking thermodynamics to the young science of information. Bennett argued that the demon’s knowledge is stored in its memory, and memory has to be cleaned, which takes work. (In 1961, Landauer calculated that at room temperature, it takes at least 2.9 zeptojoules of energy for a computer to erase one bit of stored information.) In other words, as the demon organizes the gas into hot and cold and lowers the gas’s entropy, its brain burns energy and generates more than enough entropy to compensate. The overall entropy of the gas-demon system increases, satisfying the second law of thermodynamics.
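Landauer's bound is E = k_B · T · ln 2 per erased bit. A quick check that this reproduces the 2.9-zeptojoule figure quoted above, taking room temperature to be roughly 300 K:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit(temp_k):
    """Minimum energy in joules to erase one bit of information at
    temperature temp_k, per Landauer's principle: E = k_B * T * ln 2."""
    return k_B * temp_k * math.log(2)

zeptojoule = 1e-21
print(round(landauer_limit(300) / zeptojoule, 2))  # → 2.87, i.e. ~2.9 zJ
```

The bound scales linearly with temperature, which is why cold computing hardware can, in principle, erase information more cheaply.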
The findings revealed that, as Landauer put it, “Information is physical.” The more information you have, the more work you can extract. Maxwell’s demon can wring work out of a single-temperature gas because it has far more information than the average user.
But it took another half century and the rise of quantum information theory, a field born in pursuit of the quantum computer, for physicists to fully explore the startling implications.
Over the past decade, Popescu and his Bristol colleagues, along with other groups, have argued that energy spreads to cold objects from hot ones because of the way information spreads between particles. According to quantum theory, the physical properties of particles are probabilistic; instead of being representable as 1 or 0, they can have some probability of being 1 and some probability of being 0 at the same time. When particles interact, they can also become entangled, joining together the probability distributions that describe both of their states. A central pillar of quantum theory is that the information — the probabilistic 1s and 0s representing particles’ states — is never lost. (The present state of the universe preserves all information about the past.)
Over time, however, as particles interact and become increasingly entangled, information about their individual states spreads and becomes shuffled and shared among more and more particles. Popescu and his colleagues believe that the arrow of increasing quantum entanglement underlies the expected rise in entropy — the thermodynamic arrow of time. A cup of coffee cools to room temperature, they explain, because as coffee molecules collide with air molecules, the information that encodes their energy leaks out and is shared by the surrounding air.
Understanding entropy as a subjective measure allows the universe as a whole to evolve without ever losing information. Even as parts of the universe, such as coffee, engines and people, experience rising entropy as their quantum information dilutes, the global entropy of the universe stays forever zero.
Renato Renner, a professor at ETH Zurich in Switzerland, described this as a radical shift in perspective. Fifteen years ago, “we thought of entropy as a property of a thermodynamic system,” he said. “Now in information theory, we wouldn’t say entropy is a property of a system, but a property of an observer who describes a system.”
Moreover, the idea that energy has two forms, useless heat and useful work, “made sense for steam engines,” Renner said. “In the new way, there is a whole spectrum in between — energy about which we have partial information.”
Entropy and thermodynamics are “much less of a mystery in this new view,” he said. “That’s why people like the new view better than the old one.”
Thermodynamics From Symmetry
The relationship among information, energy and other “conserved quantities,” which can change hands but never be destroyed, took a new turn in two papers published simultaneously last July in Nature Communications, one by the Bristol team and another by a team that included Jonathan Oppenheim at University College London. Both groups conceived of a hypothetical quantum system that uses information as a sort of currency for trading between the other, more material resources.
Imagine a vast container, or reservoir, of particles that possess both energy and angular momentum (they’re both moving around and spinning). This reservoir is connected to both a weight, which takes energy to lift, and a turning turntable, which takes angular momentum to speed up or slow down. Normally, a single reservoir can’t do any work — this goes back to Carnot’s discovery about the need for hot and cold reservoirs. But the researchers found that a reservoir containing multiple conserved quantities follows different rules. “If you have two different physical quantities that are conserved, like energy and angular momentum,” Popescu said, “as long as you have a bath that contains both of them, then you can trade one for another.”
In the hypothetical weight-reservoir-turntable system, the weight can be lifted as the turntable slows down, or, conversely, lowering the weight causes the turntable to spin faster. The researchers found that the quantum information describing the particles’ energy and spin states can act as a kind of currency that enables trading between the reservoir’s energy and angular momentum supplies. The notion that conserved quantities can be traded for one another in quantum systems is brand new. It may suggest the need for a more complete thermodynamic theory that would describe not only the flow of energy, but also the interplay between all the conserved quantities in the universe.
The fact that energy has dominated the thermodynamics story up to now might be circumstantial rather than profound, Oppenheim said. Carnot and his successors might have developed a thermodynamic theory governing the flow of, say, angular momentum to go with their engine theory, if only there had been a need. “We have energy sources all around us that we want to extract and use,” Oppenheim said. “It happens to be the case that we don’t have big angular momentum heat baths around us. We don’t come across huge gyroscopes.”

Popescu, who won a Dirac Medal last year for his insights in quantum information theory and quantum foundations, said he and his collaborators work by “pushing quantum mechanics into a corner,” gathering at a blackboard and reasoning their way to a new insight after which it’s easy to derive the associated equations. Some realizations are in the process of crystallizing. In one of several phone conversations in March, Popescu discussed a new thought experiment that illustrates a distinction between information and other conserved quantities — and indicates how symmetries in nature might set them apart.
“Suppose that you and I are living on different planets in remote galaxies,” he said, and suppose that he, Popescu, wants to communicate where you should look to find his planet. The only problem is, this is physically impossible: “I can send you the story of Hamlet. But I cannot indicate for you a direction.”
There’s no way to express in a string of pure, directionless 1s and 0s which way to look to find each other’s galaxies because “nature doesn’t provide us with [a reference frame] that is universal,” Popescu said. If it did — if, for instance, tiny arrows were sewn everywhere in the fabric of the universe, indicating its direction of motion — this would violate “rotational invariance,” a symmetry of the universe. Turntables would start turning faster when aligned with the universe’s motion, and angular momentum would not appear to be conserved. The early-20th-century mathematician Emmy Noether showed that every symmetry comes with a conservation law: The rotational symmetry of the universe reflects the preservation of a quantity we call angular momentum. Popescu’s thought experiment suggests that the impossibility of expressing spatial direction with information “may be related to the conservation law,” he said.
The seeming inability to express everything about the universe in terms of information could be relevant to the search for a more fundamental description of nature. In recent years, many theorists have come to believe that space-time, the bendy fabric of the universe, and the matter and energy within it might be a hologram that arises from a network of entangled quantum information. “One has to be careful,” Oppenheim said, “because information does behave differently than other physical properties, like space-time.”
Knowing the logical links between the concepts could also help physicists reason their way inside black holes, mysterious space-time swallowing objects that are known to have temperatures and entropies, and which somehow radiate information. “One of the most important aspects of the black hole is its thermodynamics,” Popescu said. “But the type of thermodynamics that they discuss in the black holes, because it’s such a complicated subject, is still more of a traditional type. We are developing a completely novel view on thermodynamics.” It’s “inevitable,” he said, “that these new tools that we are developing will then come back and be used in the black hole.”
What to Tell Technologists
Janet Anders, a quantum information scientist at the University of Exeter, takes a technology-driven approach to understanding quantum thermodynamics. “If we go further and further down [in scale], we’re going to hit a region that we don’t have a good theory for,” Anders said. “And the question is, what do we need to know about this region to tell technologists?”
In 2012, Anders conceived of and co-founded a European research network devoted to quantum thermodynamics that now has 300 members. With her colleagues in the network, she hopes to discover the rules governing the quantum transitions of quantum engines and fridges, which could someday drive or cool computers or be used in solar panels, bioengineering and other applications. Already, researchers are getting a better sense of what quantum engines might be capable of. In 2015, Raam Uzdin and colleagues at the Hebrew University of Jerusalem calculated that quantum engines can outpower classical engines. These probabilistic engines still follow Carnot’s efficiency formula in terms of how much work they can derive from energy passing between hot and cold bodies. But they’re sometimes able to extract the work much more quickly, giving them more power. An engine made of a single ion was experimentally demonstrated and reported in Science in April 2016, though it didn’t harness the power-enhancing quantum effect.
Popescu, Oppenheim, Renner and their cohorts are also pursuing more concrete discoveries. In March, Oppenheim and his former student, Lluis Masanes, published a paper deriving the third law of thermodynamics — a historically confusing statement about the impossibility of reaching absolute-zero temperature — using quantum information theory. They showed that the “cooling speed limit” preventing you from reaching absolute zero arises from the limit on how fast information can be pumped out of the particles in a finite-size object. The speed limit might be relevant to the cooling abilities of quantum fridges, like the one reported in a preprint in February. In 2015, Oppenheim and other collaborators showed that the second law of thermodynamics is replaced, on quantum scales, by a panoply of second “laws” — constraints on how the probability distributions defining the physical states of particles evolve, including in quantum engines.
As the field of quantum thermodynamics grows quickly, spawning a range of approaches and findings, some traditional thermodynamicists see a mess. Peter Hänggi, a vocal critic at the University of Augsburg in Germany, thinks the importance of information is being oversold by ex-practitioners of quantum computing, who he says mistake the universe for a giant quantum information processor instead of a physical thing. He accuses quantum information theorists of confusing different kinds of entropy — the thermodynamic and information-theoretic kinds — and using the latter in domains where it doesn’t apply. Maxwell’s demon “gets on my nerves,” Hänggi said. When asked about Oppenheim and company’s second “laws” of thermodynamics, he said, “You see why my blood pressure rises.”

While Hänggi is seen as too old-fashioned in his critique (quantum-information theorists do study the connections between thermodynamic and information-theoretic entropy), other thermodynamicists said he makes some valid points. For instance, when quantum information theorists conjure up abstract quantum machines and see if they can get work out of them, they sometimes sidestep the question of how, exactly, you extract work from a quantum system, given that measuring it destroys its simultaneous quantum probabilities. Anders and her collaborators have recently begun addressing this issue with new ideas about quantum work extraction and storage. But the theoretical literature is all over the place.
“Many exciting things have been thrown on the table, a bit in disorder; we need to put them in order,” said Valerio Scarani, a quantum information theorist and thermodynamicist at the National University of Singapore who was part of the team that reported the quantum fridge. “We need a bit of synthesis. We need to understand your idea fits there; mine fits here. We have eight definitions of work; maybe we should try to figure out which one is correct in which situation, not just come up with a ninth definition of work.”
Oppenheim and Popescu fully agree with Hänggi that there’s a risk of downplaying the universe’s physicality. “I’m wary of information theorists who believe everything is information,” Oppenheim said. “When the steam engine was being developed and thermodynamics was in full swing, there were people positing that the universe was just a big steam engine.” In reality, he said, “it’s much messier than that.” What he likes about quantum thermodynamics is that “you have these two fundamental quantities — energy and quantum information — and these two things meet together. That to me is what makes it such a beautiful theory.”
__________________________
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)
Virtual lab: Series and Parallel circuits
Learn about electrical circuits with the PhET Circuit construction kit
* Briefly play with the app, learning the drag-and-drop components
* Follow the instructions. Carefully write answers in your notebook.
* Accurately answer questions in complete sentences, at a high school level.
This must be completed in class to get credit. Unless you have an excused absence, you can’t make up the lab.
Learning Goals:
Develop a general rule regarding how resistance affects current flow when the voltage is constant.
Learn how changing resistance values affect current flow in both series and parallel circuits.
Series Circuit A

Right-click on the resistor, change its value, and observe what happens to the rate at which the electrons move through it. This rate is called current. Current is measured in amperes (A).
Make a general rule about the relationship between current and resistance.
– 10 points for circuit and accurate answer.
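For the teacher's reference, the trend students should discover follows Ohm's law, I = V/R, which the simulation models. A quick numeric check (the battery voltage and resistor values below are illustrative, not the app's defaults):

```python
def current(voltage_v, resistance_ohm):
    """Ohm's law: I = V / R."""
    return voltage_v / resistance_ohm

V = 9.0  # hold the battery voltage fixed (value is illustrative)
for R in (10, 20, 40, 80):
    print(f"R = {R:2d} ohm -> I = {current(V, R):.3f} A")
# each doubling of resistance halves the current
```

The expected general rule: at constant voltage, current is inversely proportional to resistance.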
——————-
Parallel Circuit B

Make observations and draw conclusions. Right-click on the resistors and change their values, making one very high and one very low, and vice versa.
Look for what happens to the current flow through the different resistors.
With regard to circuit B:
(a) Describe the current at different locations in the circuit, noting the rate of current flow and the value of each resistor.
(b) Explain your observations of the current flow in terms of the water-tank model of electricity given to you in class.
(c) Describe how your general rule from step 2 relates to your observations
– 20 points for circuit and accurate answer.
______________________________
Circuit C

Change the values of the resistors, making one very high and one very low, and vice versa.
(a) Look for what happens to the current flow through the different resistors.
(b) Describe current at different locations in the circuit.
(c) Explain observations of the current flow in terms of the water-flow analogy.
(d) Describe how your general rule from the beginning relates to your observations.
Water flow analogies for electrical current
– 20 points for circuit and accurate answer.
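For the teacher's reference, the behavior students should see in circuits B and C follows from each parallel branch seeing the full supply voltage, so each branch draws its own current independently. A minimal sketch with illustrative values (not the app's defaults):

```python
def branch_currents(voltage_v, resistances_ohm):
    """Each branch of an ideal parallel circuit sees the full supply
    voltage, so each branch current is simply V / R."""
    return [voltage_v / r for r in resistances_ohm]

V = 9.0                   # illustrative supply voltage
branches = [5.0, 500.0]   # one very low and one very high resistor
for r, i in zip(branches, branch_currents(V, branches)):
    print(f"R = {r:6.1f} ohm -> I = {i:.3f} A")
# the battery supplies the sum of the branch currents
```

In the water analogy, each branch is a separate pipe from the same tank: narrowing one pipe starves only that pipe, not its neighbors.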
=============================================================
Circuit D: voltage in a series circuit
Build the series circuit shown below. On the left-hand menu, click voltmeter. You can drag-and-drop the red and black leads.
In your notebook, add the following definitions:
A lead is an electrical connection that comes from some device. Some are used to transfer power; ours are used to probe circuits.
A multimeter is a measuring instrument that combines multiple meters (measuring devices) into one. Typical multimeters include:
ammeter = measures I (current)
metric unit of current is amperes (A)
ohmmeter = measures R (resistance)
metric unit of resistance is ohms (Ω)
Ω is the Greek letter omega.
voltmeter = measures V (voltage) in a battery,
or the voltage drop across a part of a circuit.
metric unit of voltage is the volt (V).

With the knife-switch closed, what is the voltage drop across:
- the battery
- the light bulb
- the knife-switch
- the resistor
With the knife-switch open, what is the voltage drop across:
- the battery
- the light bulb
- the knife-switch
- the resistor
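A useful cross-check for the closed-switch readings: in a series loop the same current flows through every element, so the individual voltage drops must add back up to the battery voltage. Here is a small sketch (the 12 V battery and resistor values are illustrative, not the circuit above):

```python
def series_drops(v_battery, resistances):
    """Voltage drop across each series resistor: V_k = I * R_k,
    where the common current is I = V_battery / R_total."""
    i = v_battery / sum(resistances)
    return [i * r for r in resistances]

drops = series_drops(12.0, [2.0, 4.0, 6.0])
# The drops sum back to the battery voltage (Kirchhoff's voltage law).
print(drops, "sum =", sum(drops))
```

If students' measured drops don't sum (approximately) to the battery voltage, something in their circuit or probe placement is off.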
____________________________
Circuit E: voltage in a parallel circuit
Build the parallel circuit shown below. On the left-hand menu, click voltmeter.
You can drag-and-drop the red and black leads.
What is the voltage drop across:
- the 2 batteries
- the resistor in the middle
- the light-bulb
- Points A and B on the wires.
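In contrast with the series case, every parallel branch should read the full battery voltage; what changes when branches are added is the equivalent resistance, and with it the total current drawn. A sketch of the standard formula (function name is mine):

```python
def parallel_resistance(resistances):
    """Equivalent resistance of parallel resistors: 1/R_eq = sum(1/R_k).
    Every branch sees the full source voltage, so adding a branch
    lowers R_eq and raises the total current from the battery."""
    return 1.0 / sum(1.0 / r for r in resistances)

# Two equal resistors in parallel give half the resistance of one.
print(parallel_resistance([2.0, 2.0]))
```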

=============================================================
Circuit F: Measuring both I and V
Build the circuit shown here. Use the voltmeter to measure voltage, and the ammeter to measure current. Carefully fill in the 2 data tables. After you have taken the data, answer the following:
(a) Compare the voltage readings from before you changed the resistance to those from after.
(b) Look just at the left column (default values) for current. Compare your numbers to their locations on the circuit: what is the relationship between the amount of current in one part of the circuit and another? (Thinking of the water-flow analogy may be helpful.)
(c) Look at the right column for current. How did changing the value of one resistor affect the circuit (if at all)?




Learning Standards
Massachusetts 2016 Science and Technology/Engineering (STE) Standards
HS-PS2-9(MA). Evaluate simple series and parallel circuits to predict changes to voltage, current, or resistance when simple changes are made to a circuit.
Technology/Engineering Progression Grades 9-10
The use of electrical circuits and electricity is critical to most technological systems in society. Electrical systems can be AC or DC, rely on a variety of key components, and are designed for specific voltage, current, and/or power.
Emergent phenomenon
Thomas T. Thomas writes:
From our perspective at the human scale, a tabletop is a flat plane.

But at the atomic level, the flat surface disappears into a lumpy swarm of molecules.
Aficionados of fractal imagery will understand this perfectly: any natural feature like the slope of a hill or shore of a coast can be broken down into smaller and smaller curves and angles, endlessly subject to refinement. In fractal geometry, which is driven by simple equations, the large curves mirror the small curves ad infinitum.
The emergent property is not an illusion… The flatness of the tabletop is just as real—and more useful for setting out silverware and plates—as the churning atoms that actually compose it. The hill and its slope are just as real—and more useful for climbing—as the myriad tiny angles and curves, the surfaces of the grains of sand and bits of rock, that underlie the slope.
Emergent property works on greater scales, too. From space the Earth presents as a nearly perfect sphere, a blue-white marble decorated with flashes of green and brown, but still quite smooth. That spherical shape only becomes apparent from a great distance. Viewed from the surface, it’s easy enough for the eye to see a flat plane bounded by the horizon and to focus on hills and valleys as objects of great stature which, from a distance of millions of miles, do not even register as wrinkles.
Emergent properties come into play only when the actions of thousands, millions, or billions of separate and distinct elements are perceived and treated as a single entity. “Forest” is an emergent property of thousands of individual trees. The concept of emergent properties can be extremely useful to describe some of the situations and events that we wrestle with daily.
The Human Condition: Emergent Properties, Thomas T. Thomas, 8/11/2013
also
NOVA ScienceNow Emergence, PBS
Examples
Conway’s game of life
https://en.wikipedia.org/wiki/Conway%27s_Game_of_Life
http://emergentuniverse.wikia.com/wiki/Conway%27s_Game_of_Life
http://www.scholarpedia.org/article/Game_of_Life
http://www.conwaylife.com/
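The whole game fits in a few lines, which is what makes it such a striking example of emergence. Here is a minimal sketch using a set of live cells (variable and function names are my own, not from the sites above):

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life.
    `live` is a set of (x, y) coordinates of live cells."""
    # Count how many live neighbors each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1)
                     for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 live neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The "blinker" oscillates between a horizontal and a vertical bar.
blinker = {(0, 0), (1, 0), (2, 0)}
print(sorted(step(blinker)))
```

Gliders, oscillators, and even self-replicating patterns all emerge from nothing but this one update rule applied over and over.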
BOIDS: Birds flocking
Boids Background and Update by Craig Reynolds
http://www.red3d.com/cwr/behave.html
http://www.emergentmind.com/boids
Coding: 3 Simple Rules of Flocking Behaviors: Alignment, Cohesion, and Separation
https://en.wikipedia.org/wiki/Flocking_(behavior)
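The three flocking rules can also be sketched in a few lines. This is a bare-bones illustration (the weights, time step, and names are my own choices, not Reynolds' original parameters):

```python
def boid_step(boids, dt=0.1, radius=2.0, wc=0.1, wa=0.5, ws=0.05):
    """One update of the three flocking rules.
    boids: list of (x, y, vx, vy) tuples; returns a new list."""
    out = []
    for i, (x, y, vx, vy) in enumerate(boids):
        nbrs = [b for j, b in enumerate(boids)
                if j != i and (b[0] - x) ** 2 + (b[1] - y) ** 2 < radius ** 2]
        ax = ay = 0.0
        if nbrs:
            n = len(nbrs)
            # Cohesion: steer toward the neighbors' center of mass.
            ax += wc * (sum(b[0] for b in nbrs) / n - x)
            ay += wc * (sum(b[1] for b in nbrs) / n - y)
            # Alignment: match the neighbors' average velocity.
            ax += wa * (sum(b[2] for b in nbrs) / n - vx)
            ay += wa * (sum(b[3] for b in nbrs) / n - vy)
            # Separation: push away from each nearby neighbor.
            for bx, by, _, _ in nbrs:
                ax -= ws * (bx - x)
                ay -= ws * (by - y)
        vx, vy = vx + ax * dt, vy + ay * dt
        out.append((x + vx * dt, y + vy * dt, vx, vy))
    return out
```

No boid knows anything about the flock as a whole; the flock shape is the emergent property.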
Classical physics
Classical physics is an emergent property of quantum mechanics
TBA
External links
Online Interactive Science Museum about Emergence
How Complex Wholes Emerge From Simple Parts Quanta magazine
Learning Standards
2016 Massachusetts Science and Technology/Engineering Curriculum Framework
Appendix VIII Value of Crosscutting Concepts and Nature of Science in Curricula
In grades 9–12, students can observe patterns in systems at different scales and cite patterns as empirical evidence for causality in supporting their explanations of phenomena. They recognize that classifications or explanations used at one scale may not be useful or need revision using a different scale, thus requiring improved investigations and experiments. They use mathematical representations to identify certain patterns and analyze patterns of performance in order to re-engineer and improve a designed system.
Next Gen Science Standards HS-PS2 Motion and Stability
Crosscutting Concepts: Different patterns may be observed at each of the scales at which a system is studied and can provide evidence for causality in explanations of phenomena. (HS-PS2-4)
A Framework for K-12 Science Education
Scale, proportion, and quantity. In considering phenomena, it is critical to recognize what is relevant at different measures of size, time, and energy and to recognize how changes in scale, proportion, or quantity affect a system’s structure or performance…. The understanding of relative magnitude is only a starting point. As noted in Benchmarks for Science Literacy, “The large idea is that the way in which things work may change with scale. Different aspects of nature change at different rates with changes in scale, and so the relationships among them change, too.” Appropriate understanding of scale relationships is critical as well to engineering—no structure could be conceived, much less constructed, without the engineer’s precise sense of scale.
Dimension 2, Crosscutting Concepts, A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas (2012)
http://necsi.edu/guide/concepts/emergence.html
Quantum teleportation
Big step for quantum teleportation won’t bring us any closer to Star Trek. Here’s why
By Adrian Cho, Sep. 19, 2016 , Science (AAAS)
Two teams have set new distance records for quantum teleportation: using the weirdness of quantum mechanics to instantly transfer the condition or “state” of one quantum particle to another one in a different location. One group used the trick to send the state of a quantum particle of light, or photon, 6.2 kilometers across Calgary, Canada, using an optical fiber, while the other teleported the states of photons over 14.7 kilometers across Shanghai, China.
Both advances, reported today in Nature Photonics, could eventually lead to an unhackable quantum internet. But what else is quantum teleportation good for? And will we ever be able to use it to zip painlessly to work on a frigid January morning?
When will this stuff enable us to travel by teleportation?
Sorry to disappoint, but the answer is never. In spite of its name, quantum teleportation has nothing to do with the type of teleportation depicted in the television show Star Trek and other science fiction stories. Such teleportation generally involves disintegrating a material object, somehow beaming the contents through space, and instantly and perfectly reassembling the object in some distant location. In quantum teleportation, nothing is disintegrated and reassembled and no matter travels anywhere. What’s more, the process works only at the level of individual quantum particles: photons, electrons, atoms, etc. Long and short, quantum teleportation and “real” teleportation have nothing in common but the name.
But if quantum teleportation doesn’t move things, then what does it do?
Compared with sending an away team to a planet’s surface, quantum teleportation aims to do something both much less ambitious and much more subtle. Quantum teleportation instantly transfers the condition or “state” of one quantum particle to another distant one without sending the particle itself. It’s a bit like transferring the reading on one clock to a distant one.
What’s so impressive about reading one clock and setting a second the same way?
The quantum state of a particle like a photon is more complex and far more delicate than the reading of a clock. Whereas you can simply read the clock and then set the other clock to the same time, you generally cannot measure the state of a quantum particle without changing it. And you cannot simply “clone” the state of one quantum particle onto another. The rules of quantum mechanics don’t allow it. Instead, what you need to do is find a way to transfer the state of one quantum particle to another without ever actually measuring that state. To continue with the clock analogy, it’s as if you’re transferring the setting of one clock to another without ever looking at the first clock.
How could that possibly work?
It’s a bit complicated. To get a feel for it you need to know something about quantum states. Consider a single photon. A photon is a fundamental bit of an electromagnetic wave, so it can be “polarized” so that its electric field points vertically or horizontally. Thanks to the weirdness of quantum mechanics, the photon can also be in both states at once—so the photon can literally be polarized both vertically and horizontally at the same time. The amounts of vertical and horizontal help define the state of the photon.
But it gets even more complicated than that. In addition to the mixture of vertical and horizontal, the photon’s state is defined by a second parameter, which is a kind of angle called the “phase.” So the actual state of the photon consists of both the mixture of vertical and horizontal and the phase. It can be visualized with the help of an abstract sphere or globe, on which the north pole stands for the pure vertical state and the south pole stands for the pure horizontal state.
The precise state of the photon is then a point on the globe, with the latitude giving the balance of vertical and horizontal in the state and the longitude giving the phase. Thus, for example, every point on the equator stands for a state in which the photon is in an equal mixture of vertical and horizontal, but in which the phase, which can be probed in certain more complicated measurements, is different.
So why can’t you just read the point off the globe?
You can’t because measurements of quantum particles provide only limited information. Given a photon in some unknown state, you cannot ask what the “coordinates” of the state on the globe are. Instead, you must perform an either/or measurement. The simplest would be: Is the photon polarized vertically or horizontally? That measurement will give one result or the other with probabilities that depend on the exact mixture of vertical and horizontal in the state. But it won’t tell you the phase. And it will “collapse” the original state, so that the photon is left pointing at one pole or the other, in a state that is either purely vertical or horizontal. That disturbance of the original state is unavoidable in quantum theory.
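The globe picture maps directly onto a pair of complex amplitudes. Here is a small sketch (function names are mine) that builds a photon state from a latitude θ and longitude φ, and computes the probability of the simple vertical/horizontal measurement, which indeed tells you nothing about the phase:

```python
import cmath
import math

def photon_state(theta, phi):
    """Point (theta, phi) on the globe -> amplitudes (a_vertical, a_horizontal).
    theta = 0 is pure vertical; theta = pi is pure horizontal."""
    return (math.cos(theta / 2),
            cmath.exp(1j * phi) * math.sin(theta / 2))

def p_vertical(state):
    """Probability that a vertical/horizontal measurement reads 'vertical'.
    The phase phi drops out of this probability entirely."""
    return abs(state[0]) ** 2
```

For any point on the equator (θ = π/2) the measurement comes up vertical half the time, no matter what φ is; that is exactly why a single either/or measurement cannot pin down the full state.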
A photon’s state is described by a point on a “Bloch sphere.” The point’s latitude (angle θ) determines the mixture of horizontal and vertical polarization. The longitude (angle φ) has no classical analog but leads to many weird quantum effects.

But if you can’t measure the exact state of the photon, how do you transfer it?
You need more photons and another weird bit of quantum mechanics. Two photons can be linked through a subtle connection called “entanglement.” When two photons are entangled, the state of each photon is completely uncertain but the two states are correlated. So, on our abstract globe, the position of each photon remains completely undetermined—it is literally pointing in every direction at once. But, in spite of that uncertainty, the states of the two photons can be correlated so that they are guaranteed to be, say, identical. That is, if you did a fancy measurement that collapsed one photon in the direction on our globe of 40º north, 80º west, you would know the second one would instantly collapse into the same state, no matter how far away it is. Such pairs are crucial to quantum teleportation.
Here’s how it works. Suppose you have two people, Alice and Bob, with a third, Charlie, in the middle. Alice prepares a photon that she wants to teleport—that is, she sets its position on the abstract globe. She sends it down an optical fiber to Charlie. At the same time, Charlie prepares a pair of entangled photons. He keeps one and sends the second one on to Bob.
Now, here’s the tricky part. When Charlie receives Alice’s photon he can take it and the one he’s kept and do a particular type of “joint” measurement on them both. Because quantum measurements collapse the states of photons, Charlie’s measurement actually forces those two photons into an entangled state. (Charlie’s measurement actually asks the either/or question: Are the photons in one particular entangled state or a complementary one?)
But as soon as Charlie does the entangling measurement on the two photons he has—the one he got from Alice and the one he kept from the original entangled pair—a striking thing happens. The photon he sent to Bob instantly collapses into the state of Alice’s original photon. That is, the globe setting of Alice’s photon has been teleported to Bob’s even if Bob is kilometers away from Charlie—as he was in these two experiments.
But why does that happen?
The experiment depends crucially on the correlations inherent in entanglement. Beyond that, to see why the state of Alice’s photon ends up transferred to Bob’s, you pretty much have to go back and work through the math. Once you get used to the notation, anybody who has taken high school algebra can do the calculation. That is one of the things algebra is good for.
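That algebra can also be checked mechanically. Below is a minimal sketch (the qubit bookkeeping and all names are my own; real experiments manipulate physical photons, not arrays) that tracks the eight amplitudes of the three-photon system, applies Charlie's joint measurement as a CNOT plus Hadamard, and verifies that after Bob's two possible corrections his photon carries Alice's original state for every measurement outcome:

```python
import math

def kron(a, b):
    """Tensor product of two amplitude lists."""
    return [x * y for x in a for y in b]

def cnot_q0_q1(psi):
    """CNOT on a 3-qubit state: qubit 0 (most significant bit) controls qubit 1."""
    out = list(psi)
    for i in range(8):
        if i & 0b100:
            out[i] = psi[i ^ 0b010]
    return out

def h_q0(psi):
    """Hadamard gate on qubit 0 of a 3-qubit state."""
    s = 1 / math.sqrt(2)
    out = [0j] * 8
    for i in range(8):
        lo = i & 0b011
        if i & 0b100:
            out[i] = s * (psi[lo] - psi[0b100 | lo])
        else:
            out[i] = s * (psi[lo] + psi[0b100 | lo])
    return out

def teleport(alpha, beta):
    """Teleport the state (alpha, beta) from Alice's photon to Bob's.
    Returns Bob's corrected state for each of the 4 measurement outcomes."""
    s = 1 / math.sqrt(2)
    # Charlie's entangled pair (|00> + |11>)/sqrt(2) lives on qubits 1 and 2.
    psi = kron([alpha, beta], [s, 0.0, 0.0, s])
    # Charlie's joint (Bell-basis) measurement = CNOT + Hadamard + readout.
    psi = h_q0(cnot_q0_q1(psi))
    results = []
    for m0 in (0, 1):          # outcome of measuring qubit 0
        for m1 in (0, 1):      # outcome of measuring qubit 1
            base = (m0 << 2) | (m1 << 1)
            bob = [psi[base], psi[base | 1]]
            if m1:
                bob = [bob[1], bob[0]]   # Bob applies an X (bit-flip) correction
            if m0:
                bob = [bob[0], -bob[1]]  # Bob applies a Z (sign-flip) correction
            norm = math.sqrt(abs(bob[0]) ** 2 + abs(bob[1]) ** 2)
            results.append([b / norm for b in bob])
    return results
```

For any normalized (alpha, beta), all four outcome branches hand Bob the original pair, which is the content of the teleportation identity the article describes.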
Is this what the physicists actually did?
Close. The only difference is that they used two slightly different arrival times for the basic states of the photons, not different polarizations. The hard part in the experiments was guaranteeing that the two photons sent to Bob arrived at the same general time and were identical in color and polarization. If they were distinguishable, then the experiment wouldn’t work. Those were the technical challenges to teleportation over such long distances.
So what is this possibly good for?
Even though it’s abstract, quantum teleportation could be used to make a quantum internet. This would be like today’s internet, but would enable users to transfer quantum states and the information they contain instead of classical information, which is essentially strings of 0s and 1s.
Currently, physicists and engineers have built partially quantum networks in which secure messages can be sent over optical fibers. Those technologies work by using single photons to distribute the numerical keys for locking and unlocking coded messages. They take advantage of the fact that an eavesdropper could not measure those photons without disturbing them and revealing his presence. But right now, those networks aren’t fully quantum mechanical in that the message needs to be decoded and encoded at every node in the network, making the nodes susceptible to hacking.
With quantum teleportation, physicists and engineers might be able to establish an entanglement connection between distant nodes on a network. In principle, this would enable users at those nodes to pass encoded messages that could not be decoded at intermediary nodes and would be essentially unhackable. And if physicists ever succeed in building a general-purpose quantum computer—which would use “qubits” that can be set to 0, 1, or both 0 and 1 to do certain calculations that overwhelm a conventional computer—then such a quantum network might enable users to load in the computer’s initial settings from remote terminals.
When is that going to happen?
Who knows? But a quantum internet seems likely to show up a lot earlier than a general-purpose quantum computer.
Huh. Cool! But no beaming to work during the winter?
Sorry, you’ll still have to bundle up and face the cold.
___________

Possible habitat for life on Enceladus, a moon of Saturn

– This graphic illustrates how Cassini scientists think water interacts with rock at the bottom of the ocean of Saturn’s icy moon Enceladus, producing hydrogen gas. Credit: NASA/JPL-Caltech
____________________________________
Two veteran NASA missions are providing new details about icy, ocean-bearing moons of Jupiter and Saturn, further heightening the scientific interest of these and other “ocean worlds” in our solar system and beyond. The findings are presented in papers published Thursday by researchers with NASA’s Cassini mission to Saturn and Hubble Space Telescope.
In the papers, Cassini scientists announce that a form of chemical energy that life can feed on appears to exist on Saturn’s moon Enceladus, and Hubble researchers report additional evidence of plumes erupting from Jupiter’s moon Europa.
“This is the closest we’ve come, so far, to identifying a place with some of the ingredients needed for a habitable environment,” said Thomas Zurbuchen, associate administrator for NASA’s Science Mission Directorate at Headquarters in Washington. ”These results demonstrate the interconnected nature of NASA’s science missions that are getting us closer to answering whether we are indeed alone or not.”
The paper from researchers with the Cassini mission, published in the journal Science, indicates hydrogen gas, which could potentially provide a chemical energy source for life, is pouring into the subsurface ocean of Enceladus from hydrothermal activity on the seafloor.
The presence of ample hydrogen in the moon’s ocean means that microbes – if any exist there – could use it to obtain energy by combining the hydrogen with carbon dioxide dissolved in the water. This chemical reaction, known as “methanogenesis” because it produces methane as a byproduct, is at the root of the tree of life on Earth, and could even have been critical to the origin of life on our planet.
Life as we know it requires three primary ingredients: liquid water; a source of energy for metabolism; and the right chemical ingredients, primarily carbon, hydrogen, nitrogen, oxygen, phosphorus and sulfur.
With this finding, Cassini has shown that Enceladus – a small, icy moon a billion miles farther from the sun than Earth – has nearly all of these ingredients for habitability. Cassini has not yet shown phosphorus and sulfur are present in the ocean, but scientists suspect them to be, since the rocky core of Enceladus is thought to be chemically similar to meteorites that contain the two elements.
“Confirmation that the chemical energy for life exists within the ocean of a small moon of Saturn is an important milestone in our search for habitable worlds beyond Earth,” said Linda Spilker, Cassini project scientist at NASA’s Jet Propulsion Laboratory in Pasadena, California.
The Cassini spacecraft detected the hydrogen in the plume of gas and icy material spraying from Enceladus during its last, and deepest, dive through the plume on Oct. 28, 2015. Cassini also sampled the plume’s composition during flybys earlier in the mission. From these observations scientists have determined that nearly 98 percent of the gas in the plume is water, about 1 percent is hydrogen and the rest is a mixture of other molecules including carbon dioxide, methane and ammonia.
The measurement was made using Cassini’s Ion and Neutral Mass Spectrometer (INMS) instrument, which sniffs gases to determine their composition. INMS was designed to sample the upper atmosphere of Saturn’s moon Titan. After Cassini’s surprising discovery of a towering plume of icy spray in 2005, emanating from hot cracks near the south pole, scientists turned its detectors toward the small moon.
Cassini wasn’t designed to detect signs of life in the Enceladus plume – indeed, scientists didn’t know the plume existed until after the spacecraft arrived at Saturn.
“Although we can’t detect life, we’ve found that there’s a food source there for it. It would be like a candy store for microbes,” said Hunter Waite, lead author of the Cassini study.
The new findings are an independent line of evidence that hydrothermal activity is taking place in the Enceladus ocean. Previous results, published in March 2015, suggested hot water is interacting with rock beneath the sea; the new findings support that conclusion and add that the rock appears to be reacting chemically to produce the hydrogen.







