Tag Archives: disease
Mysterious link between immune system and mental illness
He Got Schizophrenia. He Got Cancer. And Then He Got Cured.
A bone-marrow transplant treated a patient’s leukemia — and his delusions, too. Some doctors think they know why.
By Moises Velasquez-Manoff
Mr. Velasquez-Manoff is a science writer.
The man was 23 when the delusions came on. He became convinced that his thoughts were leaking out of his head and that other people could hear them. When he watched television, he thought the actors were signaling him, trying to communicate. He became irritable and anxious and couldn’t sleep.
Dr. Tsuyoshi Miyaoka, a psychiatrist treating him at the Shimane University School of Medicine in Japan, eventually diagnosed paranoid schizophrenia. He then prescribed a series of antipsychotic drugs. None helped. The man’s symptoms were, in medical parlance, “treatment resistant.”
A year later, the man’s condition worsened. He developed fatigue, fever and shortness of breath, and it turned out he had a cancer of the blood called acute myeloid leukemia. He’d need a bone-marrow transplant to survive. After the procedure came the miracle. The man’s delusions and paranoia almost completely disappeared. His schizophrenia seemingly vanished.
Years later, “he is completely off all medication and shows no psychiatric symptoms,” Dr. Miyaoka told me in an email. Somehow the transplant cured the man’s schizophrenia.
A bone-marrow transplant essentially reboots the immune system. Chemotherapy kills off your old white blood cells, and new ones sprout from the donor’s transplanted blood stem cells. It’s unwise to extrapolate too much from a single case study, and it’s possible it was the drugs the man took as part of the transplant procedure that helped him. But his recovery suggests that his immune system was somehow driving his psychiatric symptoms.
At first glance, the idea seems bizarre — what does the immune system have to do with the brain? — but it jibes with a growing body of literature suggesting that the immune system is involved in psychiatric disorders from depression to bipolar disorder.
The theory has a long, if somewhat overlooked, history. In the late 19th century, physicians noticed that when infections tore through psychiatric wards, the resulting fevers seemed to cause an improvement in some mentally ill and even catatonic patients.
Inspired by these observations, the Austrian physician Julius Wagner-Jauregg developed a method of deliberate infection of psychiatric patients with malaria to induce fever. Some of his patients died from the treatment, but many others recovered. He won a Nobel Prize in 1927.
One much more recent case study relates how a woman’s psychotic symptoms — she had schizoaffective disorder, which combines symptoms of schizophrenia and a mood disorder such as depression — were gone after a severe infection with high fever.
Modern doctors have also observed that people who suffer from certain autoimmune diseases, like lupus, can develop what looks like psychiatric illness. These symptoms probably result from the immune system attacking the central nervous system or from a more generalized inflammation that affects how the brain works.
Indeed, in the past 15 years or so, a new field has emerged called autoimmune neurology. Some two dozen autoimmune diseases of the brain and nervous system have been described. The best known is probably anti-NMDA-receptor encephalitis, made famous by Susannah Cahalan’s memoir “Brain on Fire.” These disorders can resemble bipolar disorder, epilepsy, even dementia — and that’s often how they’re diagnosed initially. But when promptly treated with powerful immune-suppressing therapies, what looks like dementia often reverses. Psychosis evaporates. Epilepsy stops. Patients who just a decade ago might have been institutionalized, or even died, get better and go home.
Admittedly, these diseases are exceedingly rare, but their existence suggests there could be other immune disorders of the brain and nervous system we don’t know about yet.
Dr. Robert Yolken, a professor of developmental neurovirology at Johns Hopkins, estimates that about a third of schizophrenia patients show some evidence of immune disturbance. “The role of immune activation in serious psychiatric disorders is probably the most interesting new thing to know about these disorders,” he told me.
Studies on the role of genes in schizophrenia also suggest immune involvement, a finding that, for Dr. Yolken, helps to resolve an old puzzle. People with schizophrenia tend not to have many children. So how have the genes that increase the risk of schizophrenia, assuming they exist, persisted in populations over time? One possibility is that we retain genes that might increase the risk of schizophrenia because those genes helped humans fight off pathogens in the past. Some psychiatric illness may be an inadvertent consequence, in part, of having an aggressive immune system.
Which brings us back to Dr. Miyaoka’s patient. There are other possible explanations for his recovery. Dr. Andrew McKeon, a neurologist at the Mayo Clinic in Rochester, Minn., a center of autoimmune neurology, points out that he could have suffered from a condition called paraneoplastic syndrome. That’s when a cancer patient’s immune system attacks a tumor — in this case, the leukemia — but because some molecule in the central nervous system happens to resemble one on the tumor, the immune system also attacks the brain, causing psychiatric or neurological problems. This condition was important historically because it pushed researchers to consider the immune system as a cause of neurological and psychiatric symptoms. Eventually they discovered that the immune system alone, unprompted by malignancy, could cause psychiatric symptoms.
Another case study from the Netherlands highlights this still-mysterious relationship. In this study, on which Dr. Yolken is a co-author, a man with leukemia received a bone-marrow transplant from a schizophrenic brother. He beat the cancer but developed schizophrenia. Once he had the same immune system, he developed similar psychiatric symptoms.
The bigger question is this: If so many syndromes can produce schizophrenia-like symptoms, should we examine more closely the entity we call schizophrenia?
Some psychiatrists long ago posited that many “schizophrenias” existed — different paths that led to what looked like one disorder. Perhaps one of those paths is autoinflammatory or autoimmune.
If this idea pans out, what can we do about it? Bone marrow transplant is an extreme and risky intervention, and even if the theoretical basis were completely sound — which it’s not yet — it’s unlikely to become a widespread treatment for psychiatric disorders. Dr. Yolken says that for now, doctors treating leukemia patients who also have psychiatric illnesses should monitor their psychiatric progress after transplantation, so that we can learn more.
And there may be other, softer interventions. A decade ago, Dr. Miyaoka accidentally discovered one. He treated two schizophrenia patients who were both institutionalized, and practically catatonic, with minocycline, an old antibiotic usually used for acne. Both completely normalized on the antibiotic. When Dr. Miyaoka stopped it, their psychosis returned. So he prescribed the patients a low dose on a continuing basis and discharged them.
Minocycline has since been studied by others. Larger trials suggest that it’s an effective add-on treatment for schizophrenia. Some have argued that it works because it tamps down inflammation in the brain. But it’s also possible that it affects the microbiome — the community of microbes in the human body — and thus changes how the immune system works.
Dr. Yolken and colleagues recently explored this idea with a different tool: probiotics, microbes thought to improve immune function. He focused on patients with mania, which has a relatively clear immunological signal. During manic episodes, many patients have elevated levels of cytokines, molecules secreted by immune cells. He had 33 mania patients who’d previously been hospitalized take a probiotic prophylactically. Over 24 weeks, patients who took the probiotic (along with their usual medications) were 75 percent less likely to be admitted to the hospital for manic attacks compared with patients who didn’t.
The study is preliminary, but it suggests that targeting immune function may improve mental health outcomes and that tinkering with the microbiome might be a practical, cost-effective way to do this.
Watershed moments occasionally come along in medical history when previously intractable or even deadly conditions suddenly become treatable or preventable. They are sometimes accompanied by a shift in how scientists understand the disorders in question.
We now seem to have reached such a threshold with certain rare autoimmune diseases of the brain. Not long ago, they could be a death sentence or warrant institutionalization. Now, with aggressive treatment directed at the immune system, patients can recover. Does this group encompass a larger chunk of psychiatric disorders? No one knows the answer yet, but it’s an exciting time to watch the question play out.
Moises Velasquez-Manoff, the author of “An Epidemic of Absence: A New Way of Understanding Allergies and Autoimmune Diseases” and an editor at Bay Nature magazine, is a contributing opinion writer.
Related readings
https://en.wikipedia.org/wiki/Neuroimmunology
Emerging Subspecialties in Neurology: Autoimmune neurology
https://education.questdiagnostics.com/insights/104
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5499978/
6 page PDF article. http://www.med.or.jp/english/pdf/2004_09/425_430.pdf
https://www.quora.com/What-are-some-autoimmune-neurological-disorders-How-are-they-treated
Probiotics and human health
What are probiotics?
Probiotics are live microorganisms that are intended to have health benefits. Products sold as probiotics include foods (such as yogurt), dietary supplements, and products that aren’t used orally, such as skin creams.
Although people often think of bacteria and other microorganisms as harmful “germs,” many microorganisms help our bodies function properly. For example, bacteria that are normally present in our intestines help digest food, destroy disease-causing microorganisms, and produce vitamins. Large numbers of microorganisms live on and in our bodies. Many of the microorganisms in probiotic products are the same as or similar to microorganisms that naturally live in our bodies.
What Kinds of Microorganisms Are In Probiotics?
The most common are bacteria that belong to groups called Lactobacillus and Bifidobacterium. Each of these two broad groups includes many types of bacteria. Other bacteria may also be used as probiotics, and so may yeasts such as Saccharomyces boulardii.
Probiotics, Prebiotics, and Synbiotics
The term “prebiotics” refers to dietary substances that favor the growth of beneficial bacteria over harmful ones.
The term “synbiotics” refers to products that combine probiotics and prebiotics.
How Popular Are Probiotics?
Data from the 2012 National Health Interview Survey (NHIS) show that about 4 million (1.6 percent) U.S. adults had used probiotics or prebiotics in the past 30 days. Among adults, probiotics or prebiotics were the third most commonly used dietary supplement other than vitamins and minerals, and the use of probiotics quadrupled between 2007 and 2012.
What the Science Says About the Effectiveness of Probiotics
Researchers have studied probiotics to find out whether they might help prevent or treat a variety of health problems, including:
- Digestive disorders such as diarrhea caused by infections, antibiotic-associated diarrhea, irritable bowel syndrome, and inflammatory bowel disease
- Allergic disorders such as atopic dermatitis (eczema) and allergic rhinitis (hay fever)
- Tooth decay, periodontal disease, and other oral health problems
- Colic in infants
- Liver disease
- The common cold
- Prevention of necrotizing enterocolitis in very low birth weight infants.
There’s preliminary evidence that some probiotics are helpful in preventing diarrhea caused by infections and antibiotics and in improving symptoms of irritable bowel syndrome, but more needs to be learned. We still don’t know which probiotics are helpful and which are not. We also don’t know how much of the probiotic people would have to take or who would most likely benefit from taking probiotics. Even for the conditions that have been studied the most, researchers are still working toward finding the answers to these questions.
Probiotics are not all alike. For example, if a specific kind of Lactobacillus helps prevent an illness, that doesn’t necessarily mean that another kind of Lactobacillus would have the same effect or that any of the Bifidobacterium probiotics would do the same thing.
Although some probiotics have shown promise in research studies, strong scientific evidence to support specific uses of probiotics for most health conditions is lacking. The U.S. Food and Drug Administration (FDA) has not approved any probiotics for preventing or treating any health problem. Some experts have cautioned that the rapid growth in marketing and use of probiotics may have outpaced scientific research for many of their proposed uses and benefits.
How might they work? (What is their causal mechanism?)
Probiotics may have a variety of effects in the body, and different probiotics may act in different ways.
Probiotics might:
- Help to maintain a desirable community of microorganisms
- Stabilize the digestive tract’s barriers against undesirable microorganisms or produce substances that inhibit their growth
- Help the community of microorganisms in the digestive tract return to normal after being disturbed (for example, by an antibiotic or a disease)
- Outcompete undesirable microorganisms
- Stimulate the immune response.
What science says about the safety of probiotics
Whether probiotics are likely to be safe for you depends on the state of your health.
- In people who are generally healthy, probiotics have a good safety record. Side effects, if they occur at all, usually consist only of mild digestive symptoms such as gas.
- On the other hand, there have been reports linking probiotics to severe side effects, such as dangerous infections, in people with serious underlying medical problems. The people who are most at risk of severe side effects include critically ill patients, those who have had surgery, very sick infants, and people with weakened immune systems.
Even for healthy people, there are uncertainties about the safety of probiotics. Because many research studies on probiotics haven’t looked closely at safety, there isn’t enough information right now to answer some safety questions. Most of our knowledge about safety comes from studies of Lactobacillus and Bifidobacterium; less is known about other probiotics. Information on the long-term safety of probiotics is limited, and safety may differ from one type of probiotic to another.
Quality Concerns About Probiotic Products
Some probiotic products have been found to contain smaller numbers of live microorganisms than expected. In addition, some products have been found to contain bacterial strains other than those listed on the label.
Source of info: US Dept of Health and Human Services, NIH, NCCIH Pub No. D345
=================================
Where did the idea of using probiotics first develop?
The idea came from Nobel laureate Élie Metchnikoff. He postulated that yogurt-consuming Bulgarian peasants lived longer lives because of that custom. He suggested in 1907 that “the dependence of the intestinal microbes on the food makes it possible to adopt measures to modify the microbiota in our bodies and to replace the harmful microbes by useful microbes”.
There is a growing body of peer-reviewed science which indeed shows that there is a link between our gut flora (varieties of bacteria that live in our gut) and our health. But this link is complex, and it may vary widely from person to person, depending on their genes, and their gut biome.
Studies on gut bacteria and physical health
tba
Studies on gut bacteria and mental health
tba
Studies which show that treatment should be personalized
Senior author Eran Elinav, an immunologist at the Weizmann Institute of Science in Israel, and colleagues found that many people’s gastrointestinal tracts reject generic probiotics before they can get to work. Even worse, Elinav’s team found that microbial competition from off-the-shelf probiotics can prevent natural gut bacteria from reestablishing themselves after being wiped out by antibiotic drugs.
“I think our findings call for a fundamental change from the currently utilized one-size-fits-all paradigm, in which we go to the supermarket and buy a formulation of probiotics which is designed by some company, to a new method which is personalized,” Elinav says. “By measuring people in a data-driven way, one would be much better able to harness different probiotic combinations in different clinical contexts.”
… Elinav’s group isn’t claiming that probiotic supplements don’t carry heavy doses of beneficial gut bacteria. In fact, the studies confirm that they do. Because many probiotics are sold as dietary supplements, and thus aren’t subject to approval and regulation by many national drug agencies, including the U.S. Food and Drug Administration, the team first set out to ensure that the probiotic supplements in the study actually contained the 11 main strains they were supposed to deliver.
“All those strains were present and viable to consumption and beyond, following the passage through the GI tract, and even in stool, and they were still viable,” Elinav says.
But uncovering what impact these strains of bacteria have on the people who consume them required more digging, poking through patients’ stool and even inside their guts.
The authors set out to directly measure gut colonization by first finding 25 volunteers to undergo upper endoscopies and colonoscopies to map their baseline microbiomes in different parts of the gut. “Nobody has done anything quite like this before,” says Matthew Ciorba, a gastroenterologist at Washington University in Saint Louis School of Medicine unaffiliated with the study. “This takes some devoted volunteers and some very convincing researchers to get this done.”
Some of the volunteers took generic probiotics, and others a placebo, before undergoing the same procedures two months later. This truly insider’s look at the gut microbiome showed some people were “persisters,” whose guts were successfully colonized by off-the-shelf probiotics, while others, called “resisters,” expelled them before they could become established. The research suggests two reasons for the variability in the natural response of different gastrointestinal tracts to probiotics.
First and foremost is each person’s indigenous microbiome, or the unique assembly of gut bacteria that helps dictate which new strains will or won’t be able to join the party. The authors took gut microbiomes from resistant and persistent humans alike and transferred them into germ-free mice, which had no microbiome of their own. All the mice were then given the same probiotic preparation.
“We were quite surprised to see that the mice that harbored the resistant microbiome resisted the probiotics that were given to them, while mice that were given the permissive microbiome allowed much more of the probiotics to colonize their gastrointestinal tract,” Elinav explains. “This provides evidence that the microbiome contributes to a given person’s resistance or permissiveness to given probiotics.”
The second factor affecting an individual’s response to probiotics was each host’s gene expression profile. Before the probiotics were administered, volunteers who ended up being resistant were shown to have a unique gene signature in their guts—specifically, a more activated state of autoimmune response than those who were permissive to the supplements.
“So it’s probably a combination of the indigenous microbiome and the human immune system profile that team up to determine a person’s specific state of resistance or colonization to probiotics,” Elinav says. These factors were so clear that the team even found that they could predict whether an individual would be resistant or permissive by looking at their baseline microbiome and gut gene expression profile.
This unusual in situ gastrointestinal tract sampling also turned out to be key, because in a number of cases the microbiota composition found in a patient’s stool was only partially correlated with what was found inside the gut. In other words, simply using stool samples as a proxy can be misleading.
tba
The benefits of probiotics may not be so clear cut

Related articles
Do Probiotics Really Work? Scientific American
Learning Standards
TBA
Lack of exercise is a major cause of chronic diseases

U.S. Air Force photo by Staff Sgt. Christopher Hubenthal
Chronic diseases are major killers in the modern era. Physical inactivity is a primary cause of most chronic diseases.
The initial third of the article considers: activity and prevention definitions; historical evidence showing physical inactivity is detrimental to health and normal organ functional capacities; cause vs. treatment; physical activity and inactivity mechanisms differ; gene-environment interaction [including aerobic training adaptations, personalized medicine, and co-twin physical activity]; and specificity of adaptations to type of training.
Next, physical activity/exercise is examined as primary prevention against 35 chronic conditions
[Accelerated biological aging/premature death, low cardiorespiratory fitness (VO2 max), sarcopenia, metabolic syndrome, obesity, insulin resistance, prediabetes, type 2 diabetes, non-alcoholic fatty liver disease, coronary heart disease, peripheral artery disease, hypertension, stroke, congestive heart failure, endothelial dysfunction, arterial dyslipidemia, hemostasis,
deep vein thrombosis, cognitive dysfunction, depression and anxiety, osteoporosis, osteoarthritis, balance, bone fracture/falls, rheumatoid arthritis, colon cancer, breast cancer, endometrial cancer, gestational diabetes, preeclampsia, polycystic ovary syndrome, erectile dysfunction, pain, diverticulitis, constipation, and gallbladder diseases].
The article ends with consideration of deterioration of risk factors in longer-term sedentary groups; clinical consequences of inactive childhood/adolescence; and public policy. In summary, the body rapidly maladapts to insufficient physical activity, and if this continues, the result is a substantial decrease in both total and quality years of life. Taken together, conclusive evidence exists that physical inactivity is one important cause of most chronic diseases. In addition, physical activity primarily prevents, or delays, chronic diseases, implying that chronic disease need not be an inevitable outcome during life.
Source
Lack of exercise is a major cause of chronic diseases
Frank W. Booth, Ph.D., Christian K. Roberts, Ph.D., and Matthew J. Laye, Ph.D.
Comprehensive Physiology, 2012 Apr; 2(2): 1143–1211 (available in PMC 2014 Nov 23).
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4241367/
https://onlinelibrary.wiley.com/doi/abs/10.1002/cphy.c110025
Americapox
DNA evidence offers proof of North American native population decline due to arrival of Europeans
by Bob Yirka, Phys.org
Most history books report that Native American populations in North America declined significantly after European colonizers appeared, subsequent to the “discovery” of the new world by Christopher Columbus in 1492, with numbers falling by half or more in some cases. Most attribute this decline in population to the introduction of new diseases, primarily smallpox, and to warfare.
To back up such claims, historians have relied on archaeological evidence and written documents by new world settlers. Up to now, however, no physical evidence has been available to nail down specifics regarding population declines, such as when they actually occurred and what caused them. Now, however, three researchers with backgrounds in anthropological and genome sciences have banded together to undertake a study based on mitochondrial DNA evidence, and have found, as they report in their study published in the Proceedings of the National Academy of Sciences, that native populations in North America did indeed decline by roughly fifty percent, some five hundred years ago.
What’s perhaps most interesting in the study is the implication that the sudden drop in population appears to have occurred almost immediately after the arrival of Europeans, which means before settlement began. The decline would therefore have come about almost exclusively as a result of disease sweeping naturally through native communities, rather than from warfare or mass slaughter as some have suggested, and stories of settlers using smallpox as a weapon may be exaggerated.
Also of interest is that the researchers found that the native population peaked some 5,000 years ago, and held steady, or even declined slightly, until the arrival of Europeans, and that the population decline that occurred was transient, meaning that the population gradually rebounded as those Native Americans who survived the initial wave of smallpox passed on their hardy genes to the next generation.
The results of this research also seem to settle the argument of whether the massive loss of life due to disease was regional, as some historians have argued, or widespread as others have claimed; siding firmly with the latter.
In studying the DNA of both pre-European-arrival native population samples and that of their descendants alive today, the researchers noted that those alive today are more genetically similar to one another than were their ancestors, which suggests a population decline and then resurgence, and that is how, by backtracking, they came to conclude that the decline occurred roughly five hundred years ago. The authors are quick to point out, however, that the margin of error in their work does allow for the possibility that the population decline occurred somewhat later than their results showed, and note that further research will need to be done to create a more precise timeline of events.
Native Americans experienced a strong population bottleneck coincident with European contact, Brendan D. O’Fallon and Lars Fehren-Schmitz
PNAS, Published online before print December 5, 2011, doi: 10.1073/pnas.1112563108
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Americapox: The Missing Plague
By CGP Grey, an educational YouTuber. He produces explanatory videos on science, politics, geography, economics, and history. This is a transcript of his video Americapox: The Missing Plague, www.cgpgrey.com/blog/americapox
Between the first modern Europeans arriving in 1492 and the Victorian age, the indigenous population of the new world [native American Indians] dropped by at least 90%.

The cause?
Not the conquistadores and company — they killed lots of people, but their death count is nothing compared to what they brought with them: smallpox, typhus, tuberculosis, influenza, bubonic plague, cholera, mumps, measles and more leapt from those first explorers to the coastal tribes, and then onward the microscopic invaders spread through a hemisphere of people with no defenses against them. Tens of millions died.
These germs decided the fate of these battles long before the fighting started.
Now ask yourself: why didn’t Europeans get sick?
If new-worlders were vulnerable to old-world diseases, then surely old-worlders would be vulnerable to new world diseases.
Yet there was no Americapox spreading eastward, infecting Europe and cutting the population from 90 million to 9 million. Had Americapox existed, it would have rather dampened Europeans’ ability for transatlantic expansion.
To answer why this didn’t happen: we need first to distinguish regular diseases — like the common cold — from what we’ll call plagues.
1) They spread quickly between people.
Sneezes spread plagues faster than handshakes, which are faster than mere closeness. Plagues rely on the faster routes of transmission.
2) They kill you quickly or you become immune.
Catch a plague and you’re dead within seven to thirty days. Survive and you’ll never get it again: your body has learned to fight it. You might still carry it — the plague lives in you, and you can still spread it — but it can’t hurt you.
The surface answer to this question isn’t that Europeans had better immune systems to fight off new-world plagues — it’s that the new world didn’t have plagues for them to catch. They had regular diseases, but there was no Americapox to carry.
These are history’s biggest killers, and they all come from the old world.
But why?
Let’s dig deeper, and talk cholera, a plague that spreads if your civilization does a bad job of separating drinking water from pooping water. London was terrible at this, making it the cholera capital of the world. Cholera can rip through dense neighborhoods, killing swaths of the population, before moving onward. But that’s the key: it has to move on.
In a small, isolated group, a plague like cholera cannot survive — it kills all available victims, leaving only the immune, and then there’s nowhere to go — it’s a fire that burns through its fuel.
But a city — shining city on the hill — to which rural migrants flock, where hundreds of babies are born a day: this is sanctuary for the fire of plague; fresh kindling comes to it. The plague flares and smolders and flares and smolders again — impossible to extinguish.
Historically, within city borders, plagues killed faster than people could breed. Cities grew because more people moved to them than died inside of them. Cities only started growing from their own population in the 1900s, when medicine finally left its leeches-and-bloodletting phase and entered its soap-and-soup phase — giving humans some tools to slow death.
But before that, a city was an unintentional playground for plagues and a grim machine to sort the immune from the rest.
So the deeper answer is that the New World didn’t have plagues because the New World didn’t have big, dense, terribly sanitized, deeply interconnected cities for plagues to thrive in.
OK, but The New World wasn’t completely barren of cities. And tribes weren’t completely isolated, otherwise the newly-arrived smallpox in the 1400s couldn’t have spread.
Cities are only part of the puzzle: they’re required for plagues, but cities don’t make the germs that start the plagues — those germs come from the missing piece.
Now, most germs don’t want to kill you, for the same reason you don’t want to burn down your house: germs live in you. Chronic diseases like leprosy are terrible because they’re very good at not killing you.
Plague lethality is an accident, a misunderstanding, because the germs that cause plagues don’t know they’re in humans — they’re germs that think they’re in some other animal.
Plagues come from animals.
Whooping cough comes from pigs, as does flu, which also comes from birds. Our friend the cow alone is responsible for measles, tuberculosis, and smallpox.
For the cow these diseases are no big deal — like colds for us. But when cow germs get into humans, the things they do to make the cow a little sick make humans very sick. Deadly sick.
Germs jumping species like this is extraordinarily rare. That’s why generations of humans can spend time around animals just fine. Being the patient zero of a new animal-to-human plague is winning a terrible lottery.
But a colonial-age city raises the odds: there used to be animals everywhere, horses, herds of livestock in the streets, open slaughterhouses, meat markets pre-refrigeration, and a river of literal human and animal excrement running through it all.
A more perfect environment for diseases to jump species could hardly be imagined.
So the deeper answer is that plagues come from animals, but so rarely that you have to raise the odds with many chances for infection and give the newborn plague a fertile environment to grow in. The old world had the necessary pieces in abundance.
But, why was a city like London filled with sheep and pigs and cows and Tenochtitlan wasn’t?
This brings us to the final level. (For this video anyway)
Some animals can be put to human use — this is what domestication means, animals you can breed, not just hunt.
Forget for the moment the modern world: go back to 10,000 BC, when tribes of humans reached just about everywhere. If you were in one of these tribes, what local animals could you capture, alive, and successfully pen to breed?
Maybe you’re in North Dakota and thinking about catching a buffalo: an unpredictable, violent tank on hooves that can outrun you across the plains, leap over your head and travels in herds thousands strong.
Oh, and you have no horses to help you — because there are no horses on the continent. Horses live in the old world — and won’t be brought over until too late.
It’s just you, a couple buddies, and stone-based tools. American Indians didn’t fail to domesticate buffalo because they couldn’t figure it out. They failed because it’s a buffalo. No one could do it — buffalo would have been an amazing creature to put to human work back in BC, but it’s not going to happen — humans have only barely domesticated buffalo with all our modern tools.
The New World didn’t have good animal candidates for domestication. Almost everything big enough to be useful is also too dangerous, or too agile.
Meanwhile the Fertile Crescent through to central Europe had cows and pigs and sheep and goats: comparatively easy animals, practically begging to be domesticated.
A wild boar is something to contend with if you only have stone tools, but it’s possible to catch and pen and breed and feed to eat — because pigs can’t leap to the sky or crush all resistance beneath their hooves.
In the New World the only native domestication contestant was the llama. They’re better than nothing, which is probably why the biggest cities existed in South America — but they’re no cow. Ever try to manage a herd of llamas in the mountains of Peru? Yeah, you can do it, but it’s not fun. Nothing but drama, these llamas.
These might seem cherry-picked examples, because aren’t there hundreds of thousands of species of animals? Yes, but when you’re stuck at the bottom of the tech tree, almost none of them can be domesticated. From the dawn of man until this fateful meeting, humans domesticated maybe a baker’s dozen of unique species the world over, and even to get that high a number you need to stretch it to include honeybees and silkworms. Nice to have, but you can’t build a civilization on a foundation of honey alone.
These early tribes weren’t smarter, or better at domestication. The old world had more valuable and easy animals. With dogs, herding sheep and cattle is easier. Now humans have a buddy to keep an eye on the clothing factory, and the milk and cheeseburger machine, and the plow-puller. Now farming is easier, which means there’s more benefit to staying put, which means more domestication, which means more food which means more people and more density and oh look where we’re going. Citiesville, population lots, bring your animals, plagues welcome.
That is the full answer: the lack of new-world animals to domesticate limited not only exposure to germ sources but also food production, which limited population growth, which limited cities, which made plagues in the New World almost an impossibility. In the old world, exactly the reverse. And thus a continent full of plague and a continent devoid of it.
So when ships landed in the new world there was no Americapox to bring back.
The game of civilization has nothing to do with the players, and everything to do with the map. Access to domesticated animals, in numbers and diversity, is the key resource for bootstrapping a complex society from nothing — and that complexity brings with it, unintentionally, a passive biological weaponry devastating to outsiders.
Start the game again but move the domesticable animals across the sea and history’s arrow of disease and death flows in the opposite direction.
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Don’t Blame Columbus for All the Indians’ Ills
By JOHN NOBLE WILFORD, OCT. 29, 2002, The New York Times
Europeans first came to the Western Hemisphere armed with guns, the cross and, unknowingly, pathogens. Against the alien agents of disease, the indigenous people never had a chance. Their immune systems were unprepared to fight smallpox and measles, malaria and yellow fever.
The epidemics that resulted have been well documented. What had not been clearly recognized until now, though, is that the general health of Native Americans had apparently been deteriorating for centuries before 1492.
That is the conclusion of a team of anthropologists, economists and paleopathologists who have completed a wide-ranging study of the health of people living in the Western Hemisphere in the last 7,000 years.
The researchers, whose work is regarded as the most comprehensive yet, say their findings in no way diminish the dreadful impact Old World diseases had on the people of the New World. But the findings suggest that the New World was hardly a healthful Eden.
More than 12,500 skeletons from 65 sites in North and South America — slightly more than half of them from pre-Columbians — were analyzed for evidence of infections, malnutrition and other health problems in various social and geographical settings.
The researchers used standardized criteria to rate the incidence and degree of these health factors by time and geography. Some trends leapt out from the resulting index. The healthiest sites for Native Americans were typically the oldest sites, predating Columbus by more than 1,000 years. Then came a marked decline.
“Our research shows that health was on a downward trajectory long before Columbus arrived,” Dr. Richard H. Steckel and Dr. Jerome C. Rose, study leaders, wrote in “The Backbone of History: Health and Nutrition in the Western Hemisphere,” a book they edited. It was published in August.
Dr. Steckel, an economist and anthropologist at Ohio State University, and Dr. Rose, an anthropologist at the University of Arkansas, stressed in interviews that their findings in no way mitigated the responsibility of Europeans as bearers of disease devastating to native societies. Yet the research, they said, should correct a widely held misperception that the New World was virtually free of disease before 1492.
In an epilogue to the book, Dr. Philip D. Curtin, an emeritus professor of history at Johns Hopkins University, said the skeletal evidence of the physical well-being of pre-Columbians “shows conclusively that however much it may have deteriorated on contact with the outer world, it was far from paradisiacal before the Europeans and Africans arrived.”
About 50 scientists and scholars joined in the research and contributed chapters to the book. One of them, Dr. George J. Armelagos of Emory University, a pioneer in the field of paleopathology, said in an interview that the research provided an “evolutionary history of disease in the New World.”
The surprise, Dr. Armelagos said, was not the evidence of many infectious diseases, but that the pre-Columbians were not better nourished and in general healthier.
Others said the research, supported by the National Science Foundation and Ohio State, would be the talk of scholarly seminars for years to come and the foundation for more detailed investigations of pre-Columbian health. Dr. Steckel is considering conducting a similar study of health patterns well into European prehistory.
“Although some of the authors occasionally appear to overstate the strength of the case they can make, they are also careful to indicate the limitations of the evidence,” Dr. Curtin wrote of the Steckel-Rose research. “They recognize that skeletal material is the best comparative evidence we have for the human condition over such a long period of time, but it is not perfect.”
The research team gathered evidence on seven basic indicators of chronic physical conditions that can be detected in skeletons — namely, degenerative joint disease, dental health, stature, anemia, arrested tissue development, infections and trauma from injuries. Dr. Steckel and Dr. Rose called this “by far the largest comparable data set of this type ever created.”
The researchers attributed the widespread decline in health in large part to the rise of agriculture and urban living. People in South and Central America began domesticating crops more than 5,000 years ago, and the rise of cities there began more than 2,000 years ago.
These were mixed blessings. Farming tended to limit the diversity of diets, and the congestion of towns and cities contributed to the rapid spread of disease. In the widening inequalities of urban societies, hard work on low-protein diets left most people vulnerable to illness and early death.
Similar signs of deleterious health effects have been found in the ancient Middle East, where agriculture started some 10,000 years ago. But the health consequences of farming and urbanism, Dr. Rose said, appeared to have been more abrupt in the New World.
The more mobile, less densely settled populations were usually the healthiest pre-Columbians. They were taller and had fewer signs of infectious lesions in their bones than residents of large settlements. Their diet was sufficiently rich and varied, the researchers said, for them to largely avoid the symptoms of childhood deprivation, like stunting and anemia. Even so, in the simplest hunter-gatherer societies, few people survived past age 50. In the healthiest cultures in the 1,000 years before Columbus, a life span of no more than 35 years might be usual.
In examining the skeletal evidence, paleopathologists rated the healthiest pre-Columbians to be people living 1,200 years ago on the coast of Brazil, where they had access to ample food from land and sea. Their relative isolation protected them from most infectious diseases.
Conditions also must have been salubrious along the coasts of South Carolina and Southern California, as well as among the farming and hunting societies in what is now the Midwest. Indian groups occupied the top 14 spots of the health index, and 11 of these sites predate the arrival of Europeans.
The least healthy people in the study were from the urban cultures of Mexico and Central America, notably where the Maya civilization flourished presumably at great cost to life and limb, and the Zuni of New Mexico. The Zuni lived at a 400-year-old site, Hawikku, a crowded, drought-prone farming pueblo that presumably met its demise before European settlers made contact.
It was their hard lot, Dr. Rose said, to be farmers “on the boundaries of sustainable environments.”
“Pre-Columbian populations were among the healthiest and the least healthy in our sample,” Dr. Steckel and Dr. Rose said. “While pre-Columbian natives may have lived in a disease environment substantially different from that in other parts of the globe, the original inhabitants also brought with them, or evolved with, enough pathogens to create chronic conditions of ill health under conditions of systematic agriculture and urban living.”
In recent examinations of 1,000-year-old Peruvian mummies, for example, paleopathologists discovered clear traces of tuberculosis in their lungs, more evidence that native Americans might already have been infected with some of the diseases that were thought to have been brought to the New World by European explorers.
Tuberculosis bears another message: as an opportunistic disease, it strikes when times are tough, often overwhelming the bodies of people already weakened by malnutrition, poor sanitation in urban centers and debilitated immune systems.
The Steckel-Rose research extended the survey to the health consequences of the first contacts with American Indians by Europeans and Africans and the health of European-Americans and African-Americans up to the early 20th century.
Not surprisingly, African-American slaves were near the bottom of the health index. An examination of plantation slaves buried in South Carolina, Dr. Steckel said, revealed that their poor health compared to that of “pre-Columbian Indian populations threatened with extinction.”
On the other hand, blacks buried at Philadelphia’s African Church in the 1800s were in the top half of the health index. Their general conditions were apparently superior to those of small-town, middle-class whites, Dr. Steckel said.
The researchers found one exception to the rule that the healthiest sites for Native Americans were the oldest sites. Equestrian nomads of the Great Plains of North America in the 19th century seemed to enjoy excellent health, near the top of the index. They were not fenced in to farms or cities.
In a concluding chapter of their book, Dr. Steckel and Dr. Rose said the study showed that “the health decline was precipitous with the changes in ecological environments where people lived.” It is not a new idea in anthropology, they conceded, “but scholars in general have yet to absorb it.”
=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
Related articles
The Great Dying 1616-1619, Ipswich Historical Commission
_______________________
Fair use
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include:
the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes;
the nature of the copyrighted work;
the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and
the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)
Scientists argue that addiction is not a disease
Addiction is not a disease
A neuroscientist argues that it’s time to change our minds on the roots of substance abuse, Laura Miller, for Salon. 6/27/15
A psychologist and former addict insists that the illness model for addiction is wrong, and dangerously so.
The mystery of addiction — what it is, what causes it and how to end it — threads through most of our lives. Experts estimate that one in 10 Americans is dependent on alcohol and other drugs, and if we concede that behaviors like gambling, overeating and playing video games can be addictive in similar ways, it’s likely that everyone has a relative or friend who’s hooked on some form of fun to a destructive degree. But what exactly is wrong with them? For several decades now, it’s been a commonplace to say that addicts have a disease. However, the very same scientists who once seemed to back up that claim have begun tearing it down.
Once, addictions were viewed as failures of character and morals, and society responded to drunks and junkies with shaming, scolding and calls for more “will power.” This proved spectacularly ineffective, although, truth be told, most addicts do quit without any form of treatment. Nevertheless, many do not, and in the mid-20th century, the recovery movement, centered around the 12-Step method developed by the founders of Alcoholics Anonymous, became a godsend for those unable to quit drinking or drugging on their own. The approach spread to so-called “behavioral addictions,” like gambling or sex, activities that don’t even involve the ingestion of any kind of mind-altering substance.
Much of the potency of AA comes from its acknowledgement that willpower isn’t enough to beat this devil and that blame, rather than whipping the blamed person into shape, is counterproductive. The first Step requires admitting one’s helplessness in the face of addiction….
…. Another factor promoting the disease model is that it has ushered addiction under the aegis of the healthcare industry, whether in the form of an illness whose treatment can be charged to an insurance company or as the focus of profit-making rehab centers.
….The recovery movement and rehab industry (two separate things, although the latter often employs the techniques of the former) have always had their critics, but lately some of the most vocal have been the neuroscientists whose findings once lent them credibility.
One of those neuroscientists is Marc Lewis, a psychologist and former addict himself, and the author of a new book, “The Biology of Desire: Why Addiction is Not a Disease.”
Lewis’s argument is actually fairly simple: The disease theory, and the science sometimes used to support it, fail to take into account the plasticity of the human brain. Of course, “the brain changes with addiction,” he writes. “But the way it changes has to do with learning and development — not disease.” All significant and repeated experiences change the brain; adaptability and habit are the brain’s secret weapons. The changes wrought by addiction are not, however, permanent, and while they are dangerous, they’re not abnormal.
Through a combination of a difficult emotional history, bad luck and the ordinary operations of the brain itself, an addict is someone whose brain has been transformed, but also someone who can be pushed further along the road toward healthy development. (Lewis doesn’t like the term “recovery” because it implies a return to the addict’s state before the addiction took hold.)
“The Biology of Desire” is grouped around several case studies, each one illustrating a unique path to dependency. A striving Australian entrepreneur becomes caught up in the “clarity, power and potential” he feels after smoking meth, along with his ability to work long hours while on the drug. A social worker who behaves selflessly in her job and marriage constructs a defiant, selfish, secret life around stealing and swallowing prescription opiates. A shy Irishman who started drinking as a way to relax in social situations slowly comes to see social situations as an occasion to drink and then drinking as a reason to hole up in his apartment for days on end.
Each of these people, Lewis argues, had a particular “emotional wound” the substance helped them handle, but once they started using it, the habit itself eventually became self-perpetuating and in most cases ultimately served to deepen the wound.
Each case study focuses on a different part of the brain involved in addiction and illustrates how the function of each part — desire, emotion, impulse, automatic behavior — becomes shackled to a single goal: consuming the addictive substance. The brain is built to learn and change, Lewis points out, but it’s also built to form pathways for repetitive behavior, everything from brushing your teeth to stomping on the brake pedal, so that you don’t have to think about everything you do consciously. The brain is self-organizing. Those are all good properties, but addiction shanghais them for a bad cause.
As Lewis sees it, addiction really is habit; we just don’t appreciate how deeply habit can be engraved on the brain itself. “Repeated (motivating) experience” — i.e., the sensation of having one’s worries wafted away by the bliss of heroin — “produce brain changes that define future experiences… So getting drunk a lot will sculpt the synapses that determine future drinking patterns.”
More and more experiences and activities get looped into the addiction experience and trigger cravings and expectations like the bells that made Pavlov’s dogs salivate, from the walk home past a favorite bar to the rituals of shooting up. The world becomes a host of signs all pointing you in the same direction and activating powerful unconscious urges to follow them. At a certain point, the addictive behavior becomes compulsive, seemingly as irresistibly automatic as a reflex. You may not even want the drug anymore, but you’ve forgotten how to do anything else besides seek it out and take it.
Yet all of the addicts Lewis interviewed for “The Biology of Desire” are sober now, some through tried-and-true 12-Step programs, others through self-designed regimens, like the heroin addict who taught herself how to meditate in prison. Perhaps it’s no surprise that a psychologist would argue for some form of talk therapy addressing the underlying emotional motivations for turning to drugs. But Lewis is far from the only expert to voice this opinion, or to recommend cognitive behavioral therapy as a way to reshape the brain and redirect its systems into less self-destructive patterns.
Without a doubt, AA and similar programs have helped a lot of people. But they’ve also failed others. One size does not fit all, and there’s a growing body of evidence that empowering addicts, rather than insisting that they embrace their powerlessness and the impossibility of ever fully shedding their addiction, can be a road to health as well.
If addiction is a form of learning gone tragically wrong, it is also possible that it can be unlearned, that the brain’s native changeability can be set back on track. “Addicts aren’t diseased,” Lewis writes, “and they don’t need medical intervention in order to change their lives. What they need is sensitive, intelligent social scaffolding to hold the pieces of their imagined future in place — while they reach toward it.”
Further reading
The Irrationality of Alcoholics Anonymous
Its faith-based 12-step program dominates treatment in the United States. But researchers have debunked central tenets of AA doctrine and found dozens of other treatments more effective. By Gabrielle Glaser, The Atlantic 4/2015 The Irrationality of Alcoholics Anonymous, The Atlantic
The Surprising Failures of 12 Steps
How a pseudoscientific, religious organization birthed the most trusted method of addiction treatment. By Jake Flanagan 3/25/2014
https://www.theatlantic.com/health/archive/2014/03/the-surprising-failures-of-12-steps/284616/
Why the Disease Definition of Addiction Does Far More Harm Than Good.
Among other problems, it has obstructed other channels of investigation, including the social, psychological and societal roots of addiction. By Marc Lewis on February 9, 2018
…Viewing addiction as pathology has other, more direct detriments. If you feel that your addiction results from an underlying pathology, as implied by the brain disease model, and if that pathology is chronic, as highlighted by both NIDA and the 12-step movement, then you are less likely to believe that you will ever be free of it or that recovery can result from your own efforts. This characterization of addiction flies in the face of research indicating that a great majority of those addicted to any substance or behavior do in fact recover, and most of those who recover do so without professional care.
Why the Disease Definition of Addiction Does Far More Harm Than Good. Scientific American.
Addiction and the Brain: Development, Not Disease
By Marc Lewis, Neuroethics, April 2017, Volume 10, Issue 1, pp 7–18
I review the brain disease model of addiction promoted by medical, scientific, and clinical authorities in the US and elsewhere. I then show that the disease model is flawed because brain changes in addiction are similar to those generally observed when recurrent, highly motivated goal seeking results in the development of deep habits, Pavlovian learning, and prefrontal disengagement. This analysis relies on concepts of self-organization, neuroplasticity, personality development, and delay discounting. It also highlights neural and behavioral parallels between substance addictions, behavioral addictions, normative compulsive behaviors, and falling in love. I note that the short duration of addictive rewards leads to negative emotions that accelerate the learning cycle, but cortical reconfiguration in recovery should also inform our understanding of addiction. I end by showing that the ethos of the disease model makes it difficult to reconcile with a developmental-learning orientation.
Addiction and the Brain: Development, Not Disease. Neuroethics (journal)
The chronic disease concept of addiction: Helpful or harmful?
By Thomas K. Wiens & Lawrence J. Walker, Addiction Research & Theory, Volume 23, Issue 4, 2015
This study provides empirical support for the notion that framing addiction within a biological conceptualisation, as opposed to a psychological and social framework, weakens perceptions of agency in relation to drinking. Likewise, no evidence was found to support the common assertion that the disease model reduces feelings of stigma and shame.
Probability and predictors of remission from lifetime nicotine, alcohol, cannabis, or cocaine dependence
Results from the National Epidemiologic Survey on Alcohol and Related Conditions
By Catalina Lopez-Quintero, M.D., M.P.H., Deborah S. Hasin, Ph.D., […], and Carlos Blanco, M.D., Ph.D. Addiction. 2011 Mar; 106(3): 657–669.
Most People With Addiction Simply Grow Out of It: Why Is This Widely Denied?
By Maia Szalavitz, Addictionblog.org, June 22, 2015
The idea that addiction is typically a chronic, progressive disease that requires treatment is false, the evidence shows. Yet the “aging out” experience of the majority is ignored by treatment providers and journalists.
Most of Us Still Don’t Get It: Addiction Is a Learning Disorder
By Maia Szalavitz
Addiction is not about our brains being “hijacked” by drugs or experiences—it’s about learned patterns of behavior. Our inability to understand this leads to no end of absurdities.
5 Addiction Myths. A book review of Unbroken Brain: A Revolutionary New Way of Understanding Addiction. By Laurel Sindewald, Handshake Media, June 20, 2016
The learned-behavior model also explains a wide array of human behaviors, including political anger
Author David Brin writes:
“For years I’ve followed advances that investigate reinforcement processes in the human brain, especially those involving dopamine and other messenger chemicals that are active in mediating pleasure response. One might call this topic chemically-mediated states of arousal that self-reinforce patterns of behavior.
Of course, what this boils down to — at one level — is addiction. But not only in the sense of illegal drug abuse. In very general terms, “addiction” may include desirable things, like bonding with our children and “getting high on life.” These good patterns share with drug addiction the property of being reinforced by repeated chemical stimulus, inside the brain…
Consider studies of gambling. Researchers led by Dr. Hans Breiter of Massachusetts General Hospital examined with functional magnetic resonance imaging (fMRI) which brain regions activate when volunteers won games of chance — regions that overlapped with those responding to cocaine!…
Moving along the spectrum toward activity that we consider more “normal” — neuroscientists at Harvard have found a striking similarity between the brain-states of people trying to predict financial rewards (e.g., via the stock market) and the brains of cocaine and morphine users.
… researchers at Emory University monitored brain activity while asking staunch party members, from both left and right, to evaluate information that threatened their preferred candidate prior to the 2004 Presidential election. “We did not see any increased activation of the parts of the brain normally engaged during reasoning,” said Drew Westen, Emory’s director of clinical psychology. “Instead, a network of emotion circuits lit up… reaching biased conclusions by ignoring information that could not rationally be discounted. Significantly, activity spiked in circuits involved in reward, similar to what addicts experience when they get a fix,” Westen explained.
Addicted to Self-Righteousness? An Open Letter to Researchers In the Fields of Addiction, Brain Chemistry, and Social Psychology
Indignation, addiction and hope — does it help to be “mad as hell?”: David Brin at TEDxUCSD
Fair use
This website is educational. Materials within it are being used in accord with the Fair Use doctrine, as defined by United States law.
§107. Limitations on Exclusive Rights: Fair Use
Notwithstanding the provisions of section 106, the fair use of a copyrighted work, including such use by reproduction in copies or phonorecords or by any other means specified by that section, for purposes such as criticism, comment, news reporting, teaching (including multiple copies for classroom use), scholarship, or research, is not an infringement of copyright. In determining whether the use made of a work in any particular case is a fair use, the factors to be considered shall include: (1) the purpose and character of the use, including whether such use is of a commercial nature or is for nonprofit educational purposes; (2) the nature of the copyrighted work; (3) the amount and substantiality of the portion used in relation to the copyrighted work as a whole; and (4) the effect of the use upon the potential market for or value of the copyrighted work. (Added Pub. L. 94-553, Title I, §101, Oct. 19, 1976, 90 Stat. 2546)
Are we consuming too little salt?
…Cutting back on salt can reduce blood pressure, but the change is often small. According to the American Heart Association, a person who cuts sodium intake from the median level (around 3,400 milligrams per day) to the federally recommended level (no more than 2,300 mg) typically sees a drop of only 1% to 2% in blood pressure.
Also, other factors affect blood pressure. For example, blood pressure increases with weight gain and decreases with weight loss. So, keeping a healthy weight can help prevent high blood pressure. Eating foods high in potassium also seems to counter some of the effects of high salt consumption on blood pressure.
Studies comparing salt intake in different countries worldwide have not found a clear connection between salt intake and high blood pressure. Societies that eat lower levels of salt do not necessarily have less heart disease than those that eat a lot of salt.
…Surprisingly little is known about how much salt we need. U.S. residents consume, on average, about 3,400 milligrams of sodium per day. For decades, the U.S. government and organizations such as the American Heart Association have recommended that people consume less salt. Current dietary guidelines recommend no more than 2,300 mg of sodium (about a teaspoon of salt) per day for teens and adults. No more than 1,500 mg per day is recommended for groups at higher risk of heart disease, including African Americans and everyone over the age of 50.
The U.S. dietary guidelines were established in the 1970s when relatively little information was available about dietary salt and health. The guidelines were the best guess, given the information available at the time. …
Some scientists now say that the average amount of sodium U.S. residents consume (3,400 mg per day) is safe and may even be healthier than the lower intake the government recommends.
In fact, a study of approximately 150,000 people in 17 countries, published in the New England Journal of Medicine, found that people who meet the U.S. recommended sodium limit (2,300 mg per day) have more heart trouble than those who consume more.
Scientists challenging the current guidelines say people should consume at least 3,000 mg and no more than 6,000 mg of sodium per day. The new research suggests that a low-sodium diet may stimulate the production of renin, an enzyme released by the kidneys. Renin helps regulate the body’s water balance and blood pressure, but too much renin may harm blood vessels; a higher-sodium diet would keep renin production lower….
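For readers who want to sanity-check the figures above, here is a minimal Python sketch (not from any of the cited articles) that converts the quoted sodium amounts into table-salt equivalents and shows how small a 1% to 2% blood-pressure change is in absolute terms. The sodium-to-salt conversion factor, the 5.7 g-per-teaspoon estimate, and the 120 mm Hg baseline systolic reading are assumptions chosen for illustration.

```python
# Illustrative arithmetic only; the conversion factors and the baseline
# blood-pressure value are assumptions for the example, not values taken
# from the cited studies.

SODIUM_FRACTION_OF_SALT = 22.99 / 58.44  # Na share of NaCl by mass, ~0.39
SALT_GRAMS_PER_TEASPOON = 5.7            # rough mass of a level teaspoon of salt

def sodium_mg_to_salt_grams(sodium_mg: float) -> float:
    """Convert milligrams of sodium to grams of table salt (NaCl)."""
    return sodium_mg / SODIUM_FRACTION_OF_SALT / 1000

# Sodium figures quoted in the text, in mg per day.
for label, sodium_mg in [("average U.S. intake", 3400),
                         ("federal guideline", 2300),
                         ("high-risk guideline", 1500)]:
    salt_g = sodium_mg_to_salt_grams(sodium_mg)
    teaspoons = salt_g / SALT_GRAMS_PER_TEASPOON
    print(f"{label}: {sodium_mg} mg sodium is about {salt_g:.1f} g salt "
          f"(roughly {teaspoons:.1f} teaspoons)")

# How small is a 1-2% drop in blood pressure? Assume a systolic reading
# of 120 mm Hg, a typical value not taken from the article.
baseline_systolic = 120
for pct in (0.01, 0.02):
    print(f"{pct:.0%} drop is about {baseline_systolic * pct:.1f} mm Hg")
```

Running this shows that the 2,300 mg sodium guideline works out to roughly one teaspoon of salt, consistent with the text above, and that a 1% to 2% drop in blood pressure corresponds to only about 1 to 2 mm Hg.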

