KaiserScience

Start here


A timescale for the origin and evolution of all of life on Earth


August 20, 2018, University of Bristol

Palaeontologists have long sought to understand ancient life and the shared evolutionary history of life as a whole. However, the fossil record of early life is extremely fragmented, and its quality significantly deteriorates further back in time towards the Archaean eon, more than 2.5 billion years ago, when the Earth’s crust had cooled enough to allow the formation of continents and the only life forms were microbes.

Holly Betts, lead author of the study, from the University of Bristol’s School of Earth Sciences, said: “There are few fossils from the Archaean and they generally cannot be unambiguously assigned to the lineages we are familiar with, like the blue-green algae or the salt-loving archaebacteria that colour salt-marshes pink all around the world.

“The problem with the early fossil record of life is that it is so limited and difficult to interpret—careful reanalysis of some of the very oldest fossils has shown them to be crystals, not fossils at all.”

Fossil evidence for the early history of life is so fragmented and difficult to evaluate that new discoveries and reinterpretations of known fossils have led to a proliferation of conflicting ideas about the timescale of the early history of life.

Read more at: https://phys.org/news/2018-08-timescale-evolution-life-earth.html#jCp

Timeline origin of life

Co-author Professor Philip Donoghue, also from Bristol’s School of Earth Sciences, added: “Fossils do not represent the only line of evidence to understand the past. A second record of life exists, preserved in the genomes of all living creatures.”

Co-author Dr. Tom Williams, from Bristol’s School of Biological Sciences, said: “Combining fossil and genomic information, we can use an approach called the ‘molecular clock’ which is loosely based on the idea that the number of differences in the genomes of two living species (say a human and a bacterium) are proportional to the time since they shared a common ancestor.”
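The molecular-clock idea Dr. Williams describes can be sketched in a few lines of code. This is only a toy illustration of the principle (the study itself used far more sophisticated Bayesian relaxed-clock methods), and every number below is made up for the example:

```python
# Toy molecular-clock estimate: divergence time from sequence differences.
# Purely an illustration of the idea; not the method used in the study.

def divergence_time(differences, sites, subs_per_site_per_gyr):
    """Estimate time since two lineages shared a common ancestor.

    differences: number of differing positions between two aligned sequences
    sites: total aligned positions compared
    subs_per_site_per_gyr: assumed substitution rate (per site, per billion
        years) -- a calibration that in practice comes from dated fossils
    """
    distance = differences / sites  # observed fraction of sites that differ
    # Divide by 2: changes accumulate along BOTH lineages since the split.
    return distance / (2 * subs_per_site_per_gyr)

# Hypothetical numbers, purely for illustration:
t = divergence_time(differences=120, sites=1000, subs_per_site_per_gyr=0.02)
print(f"Estimated divergence: {t:.1f} billion years ago")
```

In practice the substitution rate is unknown and must be calibrated against fossils of known age, which is exactly why the Bristol team combined the genomic and fossil records.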

By making use of this method, the team at Bristol and Mark Puttick from the University of Bath were able to derive a timescale for the history of life on Earth that did not rely on the ever-changing age of the oldest accepted fossil evidence of life.

Co-author Professor Davide Pisani said: “Using this approach we were able to show that the Last Universal Common Ancestor of all cellular life, ‘LUCA’, existed very early in Earth’s history, almost 4.5 billion years ago—not long after Earth was impacted by the planet Theia, the event which sterilised Earth and led to the formation of the Moon.

“This is significantly earlier than the currently accepted oldest fossil evidence would suggest.

“Our results indicate that two “primary” lineages of life emerged from LUCA (the Eubacteria and the Archaebacteria), approximately one billion years after LUCA.

“This result is testament to the power of genomic data, as it is impossible, based on the available fossil information, to discriminate between the oldest eubacterial and archaebacterial fossil remains.”

The study confirms modern views that the eukaryotes, the lineage to which human life belongs (together with the plants and the fungi, for example), is not a primary lineage of life. Professor Pisani added: “It is rather humbling to think we belong to a lineage that is billions of years younger than life itself.”


More information: Holly C. Betts et al, Integrated genomic and fossil evidence illuminates life’s early evolution and eukaryote origin, Nature Ecology & Evolution (2018). DOI: 10.1038/s41559-018-0644-x

Also see Integrated genomic and fossil evidence illuminates life’s early evolution and eukaryote origin.

Holly C. Betts, Mark N. Puttick, James W. Clark, Tom A. Williams, Philip C. J. Donoghue & Davide Pisani, Nature Ecology & Evolution (2018)

We derive a timescale of life, combining a reappraisal of the fossil material with new molecular clock analyses. We find the last universal common ancestor of cellular life to have predated the end of late heavy bombardment (>3.9 billion years ago (Ga)).

The crown clades of the two primary divisions of life, Eubacteria and Archaebacteria, emerged much later (<3.4 Ga), relegating the oldest fossil evidence for life to their stem lineages.

The Great Oxidation Event significantly predates the origin of modern Cyanobacteria, indicating that oxygenic photosynthesis evolved within the cyanobacterial stem lineage.

Modern eukaryotes do not constitute a primary lineage of life and emerged late in Earth’s history (<1.84 Ga), falsifying the hypothesis that the Great Oxidation Event facilitated their radiation.

The symbiotic origin of mitochondria at 2.053–1.21 Ga reflects a late origin of the total-group Alphaproteobacteria to which the free-living ancestor of mitochondria belonged.

https://www.nature.com/articles/s41559-018-0644-x
_______________________

This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use. Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include: the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes; the nature of the copyrighted work; the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and the effect of the use upon the potential market for or value of the copyrighted work. (added pub. l 94-553, Title I, 101, Oct 19, 1976, 90 Stat 2546)


The mechanics of the Nazaré Canyon wave

The Portuguese town of Nazaré can deliver 100-foot (30.5-meter) waves.

How can we explain the Nazaré Canyon geomorphologic phenomenon?

In the 16th century, the Portuguese people and army defended Nazaré from pirate attacks from the Promontório do Sítio, the cliff-top area located 110 meters above the beach.

Nazare North Canyon with transparent ocean

A screenshot from the short film “Nazaré – Entre a Terra e o Mar”, showing what the canyon would look like if the sea were very clear and transparent.

Today, from this unique site, it is possible to watch the power of the Atlantic Ocean. If you face the salt water from the nearby castle, you can easily spot the famous big waves that pump the quiet village.

What are the mechanics of the Nazaré Canyon? Is there a clear explanation for the size of the local waves? First of all, let us underline the most common swell direction in the region: West and Northwest.

A few miles off the coast of Nazaré, there are drastic differences of depth between the continental shelf and the canyon. When swell heads to shore, it is quickly amplified where the two geomorphologic variables meet, causing the formation of big waves.

Furthermore, a water current is channeled by the shore – from North to South – in the direction of the incoming waves, additionally contributing to wave height. Nazaré holds the Guinness World Record for the largest wave ever surfed.

In conclusion, the difference in depths increases wave height, the canyon amplifies and converges the swell, and the local water current helps build the biggest wave in the world. Add a perfect wind speed and direction and welcome to Nazaré.

The Mechanics of the Nazaré Canyon Wave:

1. Swell refraction: difference of depths between the continental shelf and the canyon change swell speed and direction;
2. Rapid depth reduction: as the seabed shallows abruptly, wave height builds;
3. Converging wave: the wave from the canyon and the wave from the continental shelf meet and form a higher one;
4. Local water channel: a seashore channel drives water towards the incoming waves to increase their height;
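The depth effect in steps 1 and 2 can be sketched with Green’s law, a standard shallow-water result which says wave height grows roughly as depth to the minus one-quarter power as a wave moves into shallower water. The depths and wave height below are illustrative assumptions, not measurements of Nazaré itself:

```python
# Green's law for shoaling: a shallow-water wave moving from depth h_deep
# to depth h_shallow grows in height by a factor of (h_deep/h_shallow)^(1/4).

def shoaled_height(h_deep, h_shallow, height_deep):
    """Wave height (same units as height_deep) after moving into shallower water."""
    return height_deep * (h_deep / h_shallow) ** 0.25

# A 5 m swell riding up out of the canyon (assumed ~200 m deep) onto the
# shelf (assumed ~50 m) grows noticeably before it even starts to break:
print(f"{shoaled_height(h_deep=200, h_shallow=50, height_deep=5):.2f} m")
```

This is only one ingredient; the refraction, convergence, and local current described above stack on top of it.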

Nazaré Canyon wave off Portugal surfing

a) Wave fronts,   b) Head of the Nazaré Canyon,   c) Praia do Norte

Article from Surfer Today, surfertoday.com/surfing/8247-the-mechanics-of-the-nazare-canyon-wave

____________________________

From telegraph.co.uk/news/earth/earthnews/10411252/How-a-100-foot-wave-is-created.html

Currents through the canyon combine with swell driven by winds from further out in the Atlantic to create waves that propagate at different speeds.

They converge as the canyon narrows and drive the swell directly towards the lighthouse that sits on the edge of Nazaré.

From the headwall to the coastline, the seabed rises gradually from around 32 feet to become shallow enough for the swell to break. Tidal conditions also help to increase the wave height.

According to Mr McNamara’s website charting the project he has been conducting, the waves produced here are “probably the biggest in all the world” for a sandy sea bed.

On Monday the 80-mile-an-hour winds created by the St Jude’s Atlantic storm whipped up the swell to monstrous proportions, leading to waves of up to 100 feet tall.

The previous day as the storm gathered pace, waves of up to 80 feet high formed and British surfer Andrew Cotton managed to ride one of these.

Nazaré Canyon Portugal Wave

Image from How a 100 foot wave is created, The Telegraph (UK).

_____________________________


Blueberry Earth

Here’s a gedankenexperiment (that’s German for “thought experiment”) that ought to interest you.

A gedankenexperiment is a way that physicists ask questions about how something in our universe works, for the joy of working out its consequences. The experiments don’t need to be practical, although many do lead to advances in physics. Famous examples of gedankenexperiments that led to new ideas in physics include Schrödinger’s cat and Maxwell’s demon.

Blueberry Earth: The Delicious Thought Experiment That’s Roiling Planetary Scientists
“A roaring ocean of boiling jam, with the geysers of released air and steam likely ejecting at least a few berries into orbit.”

Sarah Zhang, The Atlantic, 8/2/2018

Blueberries

Image from pxhere.com, 517756, CC0 Public Domain

Sarah Zhang, in The Atlantic, 8/2/2018, writes:

Can I offer you a thought experiment on what would happen if the Earth were replaced by “an equal volume of closely packed but uncompressed blueberries”? When Anders Sandberg saw this question, he could not let it go. The asker was one “billybodega,” who posted the scenario on Physics Stack Exchange. (Though the question was originally posed on Twitter by writer Sandra Newman.)

A moderator of the usually staid forum closed the discussion before Sandberg could reply. That didn’t matter. Sandberg, a researcher at Oxford’s Future of Humanity Institute, wrote a lengthy answer on his blog and then an even lengthier paper that he posted to arxiv.org, a repository for physics preprints that have not yet been peer reviewed. The result is a brilliant explanation of how planets form.

To begin: The 1.5 × 10²⁵ pounds of “closely packed but uncompressed” berries will start to collapse onto themselves and crush the berries deeper than 11.4 meters (37 feet) into a pulp. “Enormous amounts of air will be pushing out from the pulp as bubbles and jets, producing spectacular geysers,” writes Sandberg. What’s more, this rapid shrinking will release a huge amount of gravitational energy—equal to, according to Sandberg’s calculations, the energy output of the sun over 20 minutes. It’s enough to make the pulp boil. Behold:

“The result is that blueberry earth will turn into a roaring ocean of boiling jam, with the geysers of released air and steam likely ejecting at least a few berries into orbit. As the planet evolves a thick atmosphere of released steam will add to the already considerable air from the berries. It is not inconceivable that the planet may heat up further due to a water vapour greenhouse effect, turning into a very odd Venusian world.”

Deep under the roiling jam waves, the pressure is high enough that even the warm jam will turn to ice. Blueberry Earth will have an ice core 4,000 miles wide, by Sandberg’s calculations. “The end result is a world that has a steam atmosphere covering an ocean of jam on top of warm blueberry granita,” he writes.
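It is easy to check that the energy scale Sandberg quotes is plausible with a back-of-envelope calculation. Everything below (the bulk density, the amount of collapse, the uniform-sphere formula) is a rough assumption made for illustration; see his arXiv paper for the real treatment:

```python
# Back-of-envelope check of the gravitational energy released, in the
# spirit of Sandberg's calculation. All inputs are rough assumptions.
import math

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
R_EARTH = 6.371e6      # Earth radius, m
SUN_LUMINOSITY = 3.8e26  # watts

def binding_energy(mass_kg, radius_m):
    """Self-gravitational binding energy of a uniform sphere, in joules."""
    return 3 * G * mass_kg**2 / (5 * radius_m)

# Assume a blueberry sphere of Earth's volume with bulk density ~700 kg/m^3.
mass = 700 * (4 / 3) * math.pi * R_EARTH**3

# Assume squeezing the air out shrinks the volume by ~25%, i.e. the radius
# shrinks to 0.75^(1/3) of its starting value.
r_after = (0.75) ** (1 / 3) * R_EARTH
released = binding_energy(mass, r_after) - binding_energy(mass, R_EARTH)

print(f"Energy released: {released:.1e} J")
print(f"~ Sun's output over {released / SUN_LUMINOSITY / 60:.0f} minutes")
```

With these crude inputs the answer lands within a factor of a few of the “sun over 20 minutes” figure quoted above, which is all a sanity check like this can hope for.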

The process is not so different from the birth of a planet out of a disc of rotating debris. The coalescing, the emergence of an atmosphere, the formation of a dense core—all of these happened at one point to the real Earth. And it is currently happening elsewhere in the universe, as exoplanets are forming around other stars in other galaxies.

What Happens If the Earth Instantly Turned Into a Mass of Blueberries?, The Atlantic

An interview with the author on Slate.com

Article by Anders on his blog

Blueberry Earth by Anders Sandberg, on Arxiv

___________________


What did Earth look like millions of years ago?

Ever wonder what the Earth looked like before humans came along?

Ancient Earth Globe

The 3D interactive website called Ancient Earth Globe lets you glimpse the world from space during the age of the dinosaurs — and more. Seeing the Earth at various points in geological history, from 750 million years ago to today, is an eye-opening activity to say the least. The website allows you to see the entire globe as it slowly rotates, or zoom in to see closer details of land and oceans. There’s also an option to remove clouds for an even better look.

(Text by Bonnie Burton, Cnet, 8/7/18, See what Earth looked like from space when it was ruled by dinosaurs)

You’re better than your last report card

Check out this disastrous report card. Yet John Gurdon went on to do well in college, and later won the Nobel Prize in Physiology or Medicine!

John Gurdon Report Card then Nobel Prize

That’s because he had grit, moxie, steadfastness, backbone. When you tap into those qualities, you can achieve great things! Nick Collins writes:

At the age of 15, Prof Sir John Gurdon ranked last out of the 250 boys in his Eton year group at biology, and was in the bottom set in every other science subject.

Sixty-four years later he has been recognised as one of the finest minds of his generation after being awarded the £750,000 annual prize, which he shares with Japanese stem cell researcher Shinya Yamanaka.

Speaking after learning of his award in London on Monday, Sir John revealed that his school report still sits above his desk at the Gurdon Institute in Cambridge, which is named in his honour. While it might be less than complimentary, noting that for him to study science at University would be a “sheer waste of time”, Sir John said it is the only item he has ever framed.

… After receiving the report Sir John said he switched his attention to classics and was offered a place to study at Christ Church, Oxford, but was allowed to switch courses and read zoology instead because of a mix-up in the admissions office.

It was at Oxford as a postgraduate student that he published his groundbreaking research on genetics and proved for the first time that every cell in the body contains the same genes. He did so by taking a cell from an adult frog’s intestine, removing its genes and implanting them into an egg cell, which grew into a clone of the adult frog.

The idea was controversial at the time because it contradicted previous studies by much more senior scientists, and it was a decade before the then-graduate student’s work became widely accepted. But it later led directly to the cloning of Dolly the Sheep by Prof Ian Wilmut in 1996, and to the subsequent discovery by Prof Yamanaka that adult cells can be “reprogrammed” into stem cells for use in medicine. This means that cells from someone’s skin can be made into stem cells which in turn can turn into any type of tissue in the body, meaning they can replace diseased or damaged tissue in patients.
– The Telegraph (UK), Nick Collins, Oct 8, 2012

Great things happened because John had indefatigability –  sustained enthusiastic action with unflagging vitality.

John Bertrand Gurdon Nobel prize winner

Jonathan Player. Rex Features/AP, 2003


Why Old Physics Still Matters

By Chad Orzel, Forbes, 7/30/18

(The following is an approximation of what I will say in my invited talk at the 2018 Summer Meeting of the American Association of Physics Teachers. They encourage sharing of slides from the talks, but my slides for this talk are done in what I think of as a TED style, with minimal text, meaning that they’re not too comprehensible by themselves. So, I thought I would turn the talk into a blog post, too, maximizing the ratio of birds to stones.)

The full title of the talk is Why “Old Physics” Still Matters: History as an Aid to Understanding, and the abstract I sent in is:

A common complaint about physics curricula is that too much emphasis is given to “old physics,” phenomena that have been understood for decades, and that curricula should spend less time on the history of physics in order to emphasize topics of more current interest. Drawing on experience both in the classroom and in writing books for a general audience, I will argue that discussing the historical development of the subject is an asset rather than an impediment. Historical presentation is particularly useful in the context of quantum mechanics and relativity, where it helps to ground the more exotic and counter-intuitive aspects of those theories in a concrete process of observation and discovery.


The title of this talk refers to a very common complaint made about the teaching of physics, namely that we spend way too much time on “old physics,” and never get to anything truly modern. This is perhaps best encapsulated by Henry Reich of MinutePhysics, who made a video open letter to Barack Obama after his re-election noting that the most modern topics on the AP Physics exam date from about 1905.

This is a reflection of the default physics curriculum, which generally starts college students off with a semester of introductory Newtonian physics, which was cutting-edge stuff in the 1600s. The next course in the usual sequence is introductory E&M, which was nailed down in the 1800s, and shortly after that comes a course on “modern physics,” which describes work from the 1900s.

Within the usual “modern physics” course, the usual approach is also historical: we start out with the problem of blackbody radiation, solved by Max Planck in 1900, then move on to the photoelectric effect, explained by Albert Einstein in 1905, and then to Niels Bohr’s model of the hydrogen atom from 1913, and eventually matter waves and the Schrödinger equation, bringing us all the way up to the late 1920s.

It’s almost become cliché to note that “modern physics” richly deserves to be in scare quotes. A typical historically-ordered curriculum never gets past 1950, and doesn’t deal with any of the stuff that is exciting about quantum physics today.

This is the root of the complaint about “old physics,” and it doesn’t necessarily have to be this way. There are approaches to the subject that are, well, more modern. John Townsend’s textbook for example, starts with the quantum physics of two-state systems, using electron spins as an example, and works things out from there. This is a textbook aimed at upper-level majors, but Leonard Susskind and Art Friedman’s Theoretical Minimum book uses essentially the same approach for a non-scientific audience. Looking at the table of contents of this, you can see that it deals with the currently hot topic of entanglement a few chapters before getting to particle-wave duality, flipping the historical order of stuff around, and getting to genuinely modern approaches earlier.

There’s a lot to like about these books that abandon the historical approach, but when I sat down and wrote my forthcoming general-audience book on quantum physics, I ended up taking the standard historical approach: if you look at the table of contents, you’ll see it starts with Planck’s blackbody model, then Einstein’s introduction of photons, then the Bohr model, and so on.

This is not a decision made from inertia or ignorance, but a deliberate choice, because I think the historical approach offers some big advantages not only in terms of making the specific physics content more understandable, but for boosting science more broadly. While there are good things to take away from the ahistorical approaches, they have to open with blatant assertions regarding the existence of spins. They’re presenting these as facts that simply have to be accepted as a starting point, and I think that not only loses some readers who will get hung up on that call, it goes a bit against the nature of science, as a process for generating knowledge, not a collection of facts.

This historical approach gets to the weird stuff, but grounds it in very concrete concerns. Planck didn’t start off by asserting the existence of quantized energy, he started with a very classical attack on a universal phenomenon, namely the spectrum of light emitted by a hot object. Only after he failed to explain the spectrum by classical means did he resort to the quantum, assigning a characteristic energy to light that depends on the frequency. At high frequencies, the heat energy available to produce light is less than one “quantum” of light, which cuts off the light emitted at those frequencies, rescuing the model from the “ultraviolet catastrophe” that afflicted classical approaches to the problem.
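The failure Planck was patching is easy to see numerically. Here is a minimal sketch comparing the classical Rayleigh-Jeans law with Planck’s formula; these are the standard textbook expressions for spectral energy density, with the temperature and frequencies chosen arbitrarily for the comparison:

```python
# The "ultraviolet catastrophe": the classical Rayleigh-Jeans law grows
# without bound at high frequency, while Planck's formula cuts off once
# a quantum of light (hf) exceeds the available thermal energy (~kT).
import math

H = 6.626e-34   # Planck constant, J s
K = 1.381e-23   # Boltzmann constant, J/K
C = 2.998e8     # speed of light, m/s

def rayleigh_jeans(freq, temp):
    """Classical spectral energy density (per unit frequency)."""
    return 8 * math.pi * freq**2 * K * temp / C**3

def planck(freq, temp):
    """Planck's quantum spectral energy density (per unit frequency)."""
    return (8 * math.pi * H * freq**3 / C**3) / math.expm1(H * freq / (K * temp))

# At 5000 K the two agree at low frequency but diverge wildly in the UV:
for f in (1e12, 1e14, 1e15, 3e15):
    print(f"{f:.0e} Hz  classical {rayleigh_jeans(f, 5000):.2e}  Planck {planck(f, 5000):.2e}")
```

At low frequency the quantum is small compared with kT and the two formulas agree; in the ultraviolet, Planck’s exponential cutoff rescues the spectrum from the classical divergence.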

Planck used this quantum idea as a desperate trick, but Einstein picked it up and ran with it, arguing that the quantum hypothesis Planck resorted to from desperation could explain another phenomenon, the photoelectric effect. Einstein’s simple “heuristic” works brilliantly, and was what officially won him the Nobel Prize. Niels Bohr took these quantum ideas and applied them to atoms, making the first model that could begin to explain the absorption and emission of light by atoms, which used discrete energy states for electrons within atoms, and light with a characteristic energy proportional to the frequency. And quantum physics was off and running.

This history is useful because it grounds an exceptionally weird subject in concrete solutions to concrete problems. Nobody woke up one morning and asserted the existence of particles that behave like waves and vice versa. Instead, physicists were led to the idea, somewhat reluctantly but inevitably, by rigorously working out the implications of specific experiments. Going through the history makes the weird end result more plausible, and gives future physicists something to hold on to as they start on the journey for themselves.

This historical approach also has educational benefits when applied to the other great pillar of “modern physics” classes, namely Einstein’s theory of special relativity. This is another subject that is often introduced in very abstract ways – envisioning a universe filled with clocks and meter sticks and pondering the meaning of simultaneity, or considering the geometry of spacetime. Again, there are good things to take away from this – I learned some great stuff from Takeuchi’s Illustrated Guide to Relativity and Cox and Forshaw’s Why Does E=mc²?. But for a lot of students, the abstraction of this approach leads to them thinking “Why in hell are we talking about this nonsense?”

Some of those concerns can be addressed by a historical approach. The most standard way of doing this is to go back to the Michelson-Morley experiment, started while Einstein was in diapers, that proved that the speed of light was constant. But more than that, I think it’s useful to bring in some actual history – I’ve found it helpful to draw on Peter Galison’s argument in Einstein’s Clocks, Poincaré’s Maps.

Galison notes that the abstract concerns about simultaneity that connect to relativity arise very directly from considering very concrete problems of timekeeping and telegraphy, used in surveying the planet to determine longitude, and establishing the modern system of time zones to straighten out the chaos that multiple incompatible local times created for railroads.

Poincaré was deeply involved in work on longitude and timekeeping, and these practical issues led him to think very philosophically about the nature of time and simultaneity, several years before Einstein’s relativity. Einstein, too, was in an environment where practical timekeeping issues would’ve come up with some regularity, which naturally leads to similar thoughts. And it wasn’t only those two – Hendrik Lorentz and George FitzGerald worked out much of the necessary mathematics for relativity on their own.

So, adding some history to discussions of relativity helps both ground what is otherwise a very abstract process and also helps reinforce a broader understanding of science as a process. Relativity, seen through a historical perspective, is not merely the work of a lone genius who was bored by his job in the patent office, but the culmination of a process involving many people thinking about issues of practical importance.

Bringing in some history can also have benefits when discussing topics that are modern enough to be newsworthy. There’s a big argument going on at the moment about dark matter, with tempers running a little high. On the one hand, some physicists question whether it’s time to consider alternative explanations, while other observations bolster the theory.

Dark matter is a topic that might very well find its way into classroom discussions, and it’s worth introducing a bit of the history to explore this. Specifically, it’s good to go back to the initial observations of galaxy rotation curves. The spectral lines emitted by stars and hot gas are redshifted by the overall motion of the galaxy, but also bent into a sort of S-shape by the fact that stars on one side tend to be moving toward us due to the galaxy’s rotation, and stars on the other side tend to be moving away. The difference between these lets you find the velocity of rotation as a function of distance from the center of the galaxy, and this turns out to be higher than can be explained by the mass we can see and the normal behavior of gravity.
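Both pieces of reasoning in that paragraph can be sketched numerically: a Doppler-shifted spectral line gives the rotation speed, and Newtonian gravity applied to the visible mass predicts that speed should fall off with radius, which is not what rotation curves show. The numbers below are invented for illustration, not real galaxy data:

```python
# 1) Doppler shift of a spectral line gives line-of-sight velocity.
# 2) Newtonian gravity from the visible mass alone predicts v ~ 1/sqrt(r),
#    whereas measured rotation curves stay roughly flat -- the discrepancy
#    that dark matter was invoked to explain.
import math

C = 3.0e5    # speed of light, km/s
G = 4.3e-6   # gravitational constant in kpc (km/s)^2 per solar mass

def doppler_velocity(observed_nm, rest_nm):
    """Line-of-sight velocity (km/s) from a shifted spectral line."""
    return C * (observed_nm - rest_nm) / rest_nm

def keplerian_speed(visible_mass_msun, radius_kpc):
    """Predicted circular speed if only the visible mass were present."""
    return math.sqrt(G * visible_mass_msun / radius_kpc)

# H-alpha (rest 656.3 nm) observed at 656.74 nm on one side of a galaxy:
print(f"rotation speed ~ {doppler_velocity(656.74, 656.3):.0f} km/s")

# With ~1e11 solar masses of visible matter, the speed should drop with radius:
for r in (5, 10, 20, 40):
    print(f"r = {r:2d} kpc: predicted {keplerian_speed(1e11, r):.0f} km/s")
```

The point of the classroom exercise is the mismatch: the observed speeds at large radius stay near the inner value instead of falling off like the prediction.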

This work is worth introducing not only because these galaxy rotations are the crux of the matter for the current argument, but because they help make an important point about science in context. The initial evidence for something funny about these rotation curves came largely from work by Vera Rubin, who was a remarkable person. As a woman in a male-dominated field, she had to overcome many barriers along the course of her career.

Bringing up the history of dark matter observations is a natural means to discuss science in a broader social context, and the issues that Rubin faced and overcame, and how those resonate today. Talking about her work and history allows both a better grounding for the current dark matter fights, and also a chance to make clear that science takes place within and is affected by a larger societal context. That’s probably at least as important an issue to drive home as any particular aspect of the dark matter debate.

So, those are some examples of areas in which a historical approach to physics is actively helpful to students, not just a way to delay the teaching of more modern topics. By grounding abstract issues in concrete problems, making the collaborative and cumulative nature of science clear, and placing scientific discoveries in a broader social context, adding a bit of history to the classroom helps students get a better grasp on specific physics topics, and also on science as a whole.

About the author: Chad Orzel is Associate Professor in the Department of Physics and Astronomy at Union College

_______________________________________________________


 

The Momentum Principle Vs Newton’s 2nd Law

Practical problem solving: When do we use conservation of momentum to solve a problem? When do we use Newton’s laws of motion?

Force Newton's second

Sometimes we need only one or the other; other times either is equally useful; and some problems require both approaches. Rhett Allain on Wired.com discusses this in “Physics Face Off: The Momentum Principle Vs Newton’s 2nd Law”.

__________________________

CONSIDER THE FOLLOWING physics problem.

An object with a mass of 1 kg and a velocity of 1 m/s in the x-direction has a net force of 1 Newton pushing on it (also in the x-direction). What will the velocity of the object be after 1 second? (Yes, I am using simple numbers—because the numbers aren’t the point.)

Let’s solve this simple problem two different ways. For the first method, I will use Newton’s Second Law. In one dimension, I can write this as:

F_net,x = m · a_x

Using this equation, I can get the acceleration of the object (in the x-direction). I’ll skip the details, but it should be fairly easy to see that it would have an acceleration of 1 m/s². Next, I need the definition of acceleration (in the x-direction). Oh, and just to be clear—I’m trying to be careful about these equations since they are inherently vector equations.

a_x = Δv_x / Δt

The article continues here:

Physics Face Off: The Momentum Principle Vs Newton’s 2nd Law
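For readers who want to check the answer before clicking through, here is the problem finished both ways. This is a sketch of the standard calculation, not code from Allain’s article:

```python
# The worked example, solved via Newton's second law and via the
# momentum principle, to show the two approaches agree.
m, v0, F, t = 1.0, 1.0, 1.0, 1.0   # kg, m/s, N, s

# Method 1: Newton's second law, then the definition of acceleration.
a = F / m                  # a_x = F_net,x / m = 1 m/s^2
v_newton = v0 + a * t      # v = v0 + a*t

# Method 2: the momentum principle, p = p0 + F_net * dt.
p = m * v0 + F * t         # final momentum, kg m/s
v_momentum = p / m

print(v_newton, v_momentum)   # both give 2.0 m/s
```

With constant mass the two methods are algebraically identical; the momentum principle earns its keep when mass changes or when forces act impulsively.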