If Only 19th-Century America Had Listened to a Woman Scientist

Where might the US be if it heeded her discovery of global warming’s source?
By Sidney Perkowitz November 28, 2019

Human-induced climate change may seem a purely modern phenomenon. Even in ancient Greece, however, people understood that human activities can change climate. Later, the early United States became a laboratory for observing this as its settlers altered nature. By 1800 it was known that the mass clearing of forests raised temperatures in the Eastern U.S., and that climatic changes followed the pioneers as they spread west.

The causes for such changes, and the understanding that they could have global scope, came from eminent European scientists. Yet an amateur 19th-century American researcher, a woman named Eunice Foote, made a first crucial discovery about global climate change. Her story gives insight into early American science, women in science, and how the understanding of climate has changed. It also reveals how that understanding might have evolved differently to better deal with today’s climate problems.

“Foote went from a farmer’s daughter to one of the greats in the science of climate change,” says John Perlin, a visiting scholar at the University of California, Santa Barbara, and author of books on solar power and other topics. Perlin is currently writing a book about Foote. His extensive archival research and a visit to the locales where she lived and worked have yielded new understanding of her environment and development.

The story of climate change begins in the 1820s, when the French scientist Joseph Fourier discovered the greenhouse effect by recognizing that atmospheric gases must trap the sun’s heat. In 1859 the Irish physicist John Tyndall identified water vapor and carbon dioxide (CO2) as the atmospheric gases chiefly responsible for absorbing thermal radiation. But in a lost piece of scientific history recovered only in 2011, Foote scooped Tyndall by three years: her experiments in 1856 first revealed the roles of water vapor and CO2. Atmospheric CO2 levels at the time were only about 290 parts per million (ppm), and global climate change was not yet a known issue. Nevertheless, Foote predicted that changing CO2 levels could change global temperatures, as we see today with CO2 at over 400 ppm.

Who was Eunice Foote? At the time of her discovery she was a 37-year-old wife and mother. Born Eunice Newton in 1819 on a farm in Connecticut and raised largely by an older sister, she lived much of her life in upper New York state—a life marked by events that contributed to her scientific success.


One factor was her excellent schooling. From age 17 to 19, Foote attended the Troy Female Seminary in Troy, New York. This “mecca for women” had been founded by feminist Emma Willard in 1824 as the first women’s prep school. It shared facilities with the nearby Rensselaer School (later Rensselaer Polytechnic Institute), whose co-founder Amos Eaton was a particular proponent of hands-on science training. Along with science-heavy curricula, according to Perlin, the schools featured the only two chemistry labs in the world then meant just for students. It was here, he says, that Foote learned lab technique and how to frame and carry out a research project.

Foote was also influenced by a famous neighbor, Elizabeth Cady Stanton, a major figure in early women’s rights and suffrage movements who later worked with Susan B. Anthony. Stanton co-organized the first woman’s rights convention held in the U.S., which met in Seneca Falls, NY, in 1848. Foote attended, and signed the convention’s “Declaration of Sentiments” that stated the societal changes necessary to fully include women. More than that, she helped prepare the conference proceedings. Perlin believes she was especially motivated by a resolution that all the professions should be open to women.

As Perlin points out, next to Foote’s signature on the Declaration of Sentiments you would have seen her husband Elisha’s signature. He was one of only a few male attendees who actually signed, a demonstration of his support that must have been meaningful to Eunice. Elisha also shared her scientific interests. He knew Joseph Henry, the leading American scientist of the time and the founding secretary of the Smithsonian Institution. Elisha carried out meteorological work for the Smithsonian and together with Eunice read Henry’s writings about climate.

THE FIRST WORD ON GLOBAL WARMING: In 1856, in The American Journal of Science and Arts, Eunice Foote wrote, “The highest effect of the sun’s rays I have found to be in carbonic acid gas.” That prescient line was the result of her experiments showing that the proportion of carbon dioxide (then “carbonic acid gas”) in the atmosphere altered its temperature.

All these influences contributed to Foote’s breakthrough paper, “Circumstances Affecting the Heat of the Sun’s Rays.” It was presented at the 1856 meeting of the American Association for the Advancement of Science (AAAS)—but not by her. Elisha read his own research paper whereas Eunice’s was introduced and read by Henry. Unlike the other reports at that meeting, it was not published in the proceedings. Nevertheless, it became somewhat known. It was briefly described in a few newspapers. Scientific American wrote that Eunice’s work offered “abundant evidence of the ability of woman to investigate any subject with originality and precision.” Fortunately, it did appear in full under Foote’s name in the 1856 issue of The American Journal of Science and Arts.

But then her discovery became lost. It might have remained so except for retired petroleum geologist Raymond Sorenson and a dose of what he calls “blind luck.” He collected old copies of the Annual of Scientific Discovery, a yearly summary of scientific research. In the volume for 1857, Sorenson came across a description of Foote’s work, and quickly realized that he was seeing the first reported connection between CO2 and climate change. In 2011 he presented his findings in a journal published by his professional society. “I’ve had more response to that than anything else I’ve ever written,” he has said, attention that continues as scientists and historians absorb the revelation of Foote’s work—especially in 2019, the 200th anniversary of her birth.


Foote’s research used simple apparatus: two cylinders (presumably made of clear glass), 4 inches across by 30 inches long, each holding two thermometers. She filled these tubes with different gases and compared their thermal responses to sunlight as judged by their temperature rises. Foote found that moist air absorbs more solar radiation than dry air, and that what she called carbonic acid gas, that is, CO2, absorbs more radiation than ordinary air. That tube reached a temperature of 120 degrees Fahrenheit compared to 100 degrees Fahrenheit for ordinary air, and cooled more slowly. She also found that the tube with CO2 became much hotter than tubes filled with hydrogen or oxygen under equal conditions.

Foote unquestionably preceded Tyndall in discovering the major roles of CO2 and water vapor. This was remarkable for an amateur from the young American scientific community, compared to an educated professional working within a mature European scientific setting. Tyndall, however, established a critical feature that Foote had not. Using an infrared source in his well-equipped lab, he showed the greenhouse effect is not triggered by direct sunlight but by infrared radiation arising from the earth’s warmed surface. Foote’s experiment was not designed to distinguish between the two modes (although present-day analysis shows evidence of the infrared effect in her data). Tyndall deserves credit for uncovering this essential part of the greenhouse mechanism.

One lesson from Foote’s life is that original research arises from access to a scientific education and opportunities for mentorship and discussions. Multiple studies show the same holds true today. It’s also important to reflect on how Foote’s research was received. Why couldn’t she present her own work at AAAS, and why was it omitted from the published proceedings? Why didn’t her work receive wider attention? As Leila McNeill underscored in Smithsonian.com, bias against women is a large part of the answer. Women could join the AAAS in that era, but science historian Margaret Rossiter notes an internal hierarchy in the organization that rated “professional” men over “amateur” women. This may account for the downgrading of Foote’s presentation.

Later, Tyndall made a more serious omission. The papers he wrote about his discovery in 1859 and then in 1861—now five years after Foote’s work—did not mention her results, although he cited those of Fourier and others. Priority of discovery has always been a serious matter in science. With no citation to Foote’s work, Tyndall took center stage as “the father of climate science.” Equally important in the long run, his omission also kept Foote’s work from contributing to climate science as it developed in Europe.

Sexism colored Tyndall’s perception of women as scientists, as revealed in a passage from the biography, The Ascent of John Tyndall. The book’s author Roland Jackson writes that Tyndall “often exhibited surprise at women’s intellectual capabilities, and though he imagined that women could understand anything revealed by the savants, he did not believe they had the same powers of imagination and discovery.”

Was this attitude enough to make Tyndall purposely ignore Foote’s work? In an article published this year, Jackson denies the possibility. He argues that poor communication between American and European science makes it likely that Tyndall simply did not know of Foote’s research. He adds that covering up her work would not have been in Tyndall’s character. Perlin, however, believes the omission was deliberate. He argues that Foote’s research appeared in publications where Tyndall was virtually certain to have seen it. Perlin also notes a separate incident where Tyndall failed to credit earlier work by another American researcher (who happened to be Joseph Henry).

Whatever the explanation for Tyndall’s treatment of Foote’s research, his attitude echoed what many of his male colleagues thought about women in science. Despite this barrier, Foote is an inspiration for female scientists today. One of them, Leila Carvalho, is a professor of meteorology and climate sciences at the University of California, Santa Barbara. Responding to a recent seminar about Foote held at the university, Carvalho wrote that Foote’s accomplishments “trembled my very core,” and added, “I have been wondering how many Eunice Footes are out there to be discovered, and how much of their legacy has been veiled or even discredited because they cannot resist the pressure against gender, ethnicity, and race.”

Carvalho underlines a lost opportunity for science. Had Foote’s results been quickly and widely recognized in the U.S. and Europe, they might have generated American interest in climate science. That science could have been applied to climate change, as old tree rings and other data showed a rapid climb in Northern Hemisphere temperatures after the 1860s. With such a head start, the U.S. might have developed a deeper appreciation of the dangers of climate change, and a stronger response to them, than we see today from our government. That would have benefited the whole world, all of it flowing from the efforts of an American woman scientist.

Sidney Perkowitz, Candler Professor of Physics Emeritus at Emory University, writes frequently about science for general readers. His stage play Glory Enough (2005) is about Rosalind Franklin and the discovery of the structure of DNA. His latest books are Physics: A Very Short Introduction, and Real Scientists Don’t Wear Ties. @physp

Kindness is key to health and happiness, and it’s free!

Robby Robin's Journey

Today is Thanksgiving in the U.S. and, just as with Thanksgiving in Canada (which is a little earlier, when travel is more predictable), it’s a time for many people to consider all that they have to be thankful for and to be reminded that gratitude is good for our health. In fact it’s very good for our health. Just google “gratitude and health” and you’ll find out.

As it turns out, being kind to others is also good for your health, maybe even more so. You can google that as well! Engaging in kindness has all kinds of positive physical effects. Ongoing research shows that kindness can actually extend your life. It lowers your blood pressure, reduces anxiety and depression, and helps the immune system. Research shows that kindness can help you live longer and better, both in the giving of kindness and in being the recipient of kindness. And…


An ode to the humble tui

Love the flora and fauna of NZ.

Matthew Wright

I have tūī in my back garden. Prosthemadera novaeseelandiae to scientists. There are at least three, possibly more, which live in the area and drop in every so often to snack on harakeke (flax) nectar. They also squabble and sing. Loudly. And all of that is a great luxury to have in the back yard. New Zealand’s indigenous flora and fauna are unique in the world, and they’ve been through some tough times. We were very much the ‘lost world’ of Professor Challenger, a snapshot of how things were in the last age of the dinosaurs. Sort of.

I managed to get this picture of a plump, noisy tūī in the harakeke in my back yard.

I have to say ‘sort of’ because, while a lot of the trees and plants are the ones that flourished in Cretaceous-era Gondwanaland, the dinosaurs are what was left after the extinction…


The One Reality

I can't believe it!

If you’re following the plot of my philosophically inclined posts, you will see my dismissal of materialists as modern flat-earthers. So what basic philosophical stance do I regard as more appropriate? In his book The Flip, Jeffrey Kripal suggests five possible perspectives, as follows.

  • Panpsychism. Everything has mind/has some level of consciousness/is alive.
  • Dual-Aspect Monism. Mind and matter are aspects of a single underlying reality.
  • Quantum Mind. Quantum mechanics applies at the level of real-world objects; mind is an expression of the quantum wave function. (Alexander Wendt)
  • Cosmopsychism/panentheism. All conscious subjects are partial aspects of the more fundamental whole.
  • Idealism. Mind is fundamental and matter is a manifestation thereof.

This is all very interesting as theory, and no doubt enthusiasts of the various viewpoints could spend many an hour debating their differences. But in essence, if you don’t…


The ‘OK boomer’ revolution will not be televised

Johanna Schneller
Special to The Globe and Mail
Published 16 hours ago
Updated November 24, 2019

Patricia Heaton’s new vehicle Carol’s Second Act might as well be called OK Carol.


“OK boomer” is so ubiquitous a retort that at this point, even mentioning it is enough to elicit an “OK boomer.” But I was born in 1962, the dusty end of the boom, so I’ve had a way-backseat view of the high jinks for as long as I’ve been alive, and hey, we mystify me, too. You would think the largest group of people in the history of the world to enter old age at the same time would have some important truths to impart, right? I’ve been waiting for the boomers to be honest about themselves in their films and television series, and I’m mostly still waiting.

(For the five of you who may not know about “OK boomer” – you must be boomers – here’s the Urban Dictionary definition: “When a baby boomer says some dumb [stuff] and you can’t even begin to explain why he’s wrong because that would be deconstructing decades of misinformation and ignorance so you just brush it off and say okay.”)

For the generations that followed, boomers destroyed the planet, bankrupted social security, seared a corrosive divide into politics, made the housing market unreachable, and now spend their twilight afternoons either shouting on the internet about Snowflakes or being startled to find that they, who were sure they were going to fix stuff, have instead ruined everything.

So are our pop-culture creators coming to an understanding that all the Botox injections in the world can’t mask one’s regrets and mistakes, that no matter how many spinning classes you crush, you can’t harden yourself against failure and loss? Are they conveying how that feels in their work?

Not so much. Instead of being bravely honest, many boomer shows and films focus on how we’re not loosening our grip on power, not admitting we’re aging. Will we be the first generation to grow older without growing wiser?

Look at the sitcom Carol’s Second Act (on Global), a star vehicle for Patricia Heaton (Everybody Loves Raymond). She plays a teacher turned doctor, and every episode is exactly the same: The twentysomething interns think Carol is too old to do X, but boy, she sure shows them, she ends up doing X better than even her boss because she has more Life Experience. It might as well be called OK Carol. A genuinely wise person does not go around yammering about how much wisdom they have, just as a genuine artist doesn’t go on and on about how artistic they are. Wisdom is not hoarding or trumpeting your smarts; it’s paying it forward.

I had some hopes for Catherine the Great, the four-hour HBO miniseries starring Helen Mirren. But, yikes. Mirren is an impressive woman, Mirren is a force, but Mirren is 74, and all the digital softening in the world can’t make her believable in Episode 1 as the mother of a 15-year-old. You may argue that men have played younger for years, but I’ve always thought that was ludicrous, too. Catherine the Great is fake-good, a sumptuous wrapper around a stale candy. The way it keeps insisting, “Wow, we’re so enlightened to show Catherine being sexual at her age,” is insulting.

It feigns wisdom: “When I was young, I dreamed of freedom,” Catherine says in the final episode. “But as you get older, your choices narrow.” Fine so far. But then she adds, “So instead, I gave us an empire.” Tooting one’s own horn kind of drowns out the life lesson. So does having Catherine’s much-younger lover insist, “I’m never conscious of your age. You just are. You’re at the centre of things. You’re young. You always will be.” OK Catherine writers.

On the other hand, Martin Scorsese’s new film, The Irishman (opened in selected theatres Nov. 8, on Netflix on Nov. 27), is a moving exploration of aging and rue. He opens with a gut punch for his fans: He recreates the famous entering-the-nightclub pan from Goodfellas – only this time, he’s entering an old age home. Instead of gliding through the kitchen to burst out the stage door, he takes us down carpeted hallways past walkers, wheelchairs and people dying alone alone alone.

Scorsese and writer Steven Zaillian turn this true story of Jimmy Hoffa (Al Pacino), his steadfast lieutenant Frank Sheeran (Robert De Niro) and the mobster Russell Bufalino (Joe Pesci, headed for an Oscar) into a three-hour-plus rumination that in the midst of life, we are in death. (Most every mook is introduced with a freeze-frame explaining the circumstances of his demise.) The flashback scenes employ digital de-aging, and at first they are off-putting – we know what De Niro looked like when he was young, and it wasn’t this. But eventually, you overlook them, because the glimpses of the past make the present more poignant.

Scene after scene speaks to the miseries of getting older-not-wiser, complete with pillboxes, canes and yellowed photos of family members who no longer speak to each other. Zaillian understands that the line “You don’t know how fast time goes until you get there” is both true and whiny, an attempt to justify selfishness. The moment near the end, when De Niro asks his daughter, “Can I do anything now to make it up to anybody?” and she weeps at his too-little-too-lateness, is an “OK boomer” elevated to Shakespeare-level sadness.

On a lighter note, Bette Midler and Judith Light do sashay very satisfyingly into the final episode of The Politician (Netflix) to show the youngsters what’s what. And every woman over 40 is singing hosannas to the scene in Fleabag Season 2 (Amazon Prime) where Kristin Scott Thomas delivers a kick-ass speech about menopause while downing a martini. (I want her line reading of “I’m 58” to be my ringtone.)

But I revere even more the second half of that scene, where Scott Thomas admits to Fleabag (Phoebe Waller-Bridge) that she misses flirting. “There’s nothing more exciting than a roomful of people,” she says.

Except most people are rotten, Fleabag counters.

“Look at me,” Scott Thomas says. “Listen to me. People are all we’ve got.”

Not a new idea, but an excellent argument for why 58 is wiser than 33. Waller-Bridge then uncovers another truth. Fleabag propositions Scott Thomas, and Scott Thomas turns her away. “Why?” Fleabag asks.

“Honestly? I can’t be assed [bothered],” Scott Thomas replies. She’s been here before; she can see it all unfolding, from the machinations of new sex to the inevitable heartbreak, and she opts to go home instead. We need people, but sometimes they’re a lot of work.

To me, one movie about aging gets every bit of it right: Pain and Glory, Pedro Almodóvar’s autobiographical portrait of a filmmaker (Antonio Banderas) who finally confronts what he’s been running away from in himself. Banderas, never better, plays him as needy, egomaniacal and magnificently human. In an interview with The Globe and Mail’s Kate Taylor, Banderas said, “I told Pedro, ‘We are getting older. There is only space for truth in our lives.’”

Apparently Almodóvar listened. His movie shows us that we are the things we’ve done, yes – but we’re also the things we wish we’d done. You’re not okay, boomer. But that’s okay.


New study shows the right workout routine can help fight dementia

Alex Hutchinson
Special to The Globe and Mail
Published 16 hours ago

To understand why a new study from researchers at McMaster University’s NeuroFitLab is making waves, it helps to look back at one of its previous findings.

In 2017, a team led by the lab’s director, Jennifer Heisz, published a five-year study of more than 1,600 adults older than 65 that concluded that genetics and exercise habits contribute roughly equally to the risk of eventually developing dementia. Only one of those two factors is under your control, so researchers around the world have been striving to pin down exactly what sort of workout routine will best nourish your neurons.

Heisz’s latest study, published last month in the journal Applied Physiology, Nutrition, and Metabolism, offers a tentative answer to this much-debated question. Older adults who sweated through 12 weeks of high-intensity interval training improved their performance on a memory test by 30 per cent compared with those who did a more moderate exercise routine.

But the real significance of the findings may have less to do with the specific details of the workout routine than with the fact that even subjects as old as 88 were able to stick with a challenging exercise routine and improve their memory. Because in the quest for brain health, as in the parallel quest for physical health, the real challenge is finding ways of making exercise accessible and sustainable.

The new study involved 64 sedentary adults older than 60 who were divided into three groups that each met three times a week for 12 weeks. Before and after the training period, their physical fitness and cognitive performance were assessed.

The interval group warmed up and then did four-minute bouts of hard treadmill walking at 90 per cent to 95 per cent of their maximum heart rate, repeated four times, with three minutes of easy walking for recovery. The moderate exercise group walked at 70 to 75 per cent of max heart rate for 47 minutes, which burned the same total number of calories as the interval group. The control group, meanwhile, did 30 minutes of relaxed stretching.
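
For readers who want a concrete sense of those intensity targets, here is a minimal sketch that converts them into heart-rate ranges. It leans on the common “220 minus age” rule of thumb for maximum heart rate, which is an assumption of this sketch rather than something the study relied on; the numbers are only illustrative.

```python
# Rough illustration only: translate the study's intensity targets into
# beats-per-minute ranges using the common "220 minus age" estimate of
# maximum heart rate. This estimate is an assumption of the sketch; the
# researchers assessed participants' fitness directly.

def hr_zone(age, low_frac, high_frac):
    est_max_hr = 220 - age  # crude population-level estimate, not a measured value
    return round(est_max_hr * low_frac), round(est_max_hr * high_frac)

age = 70
print("Interval bouts (90-95% of max):", hr_zone(age, 0.90, 0.95))    # roughly 135-142 bpm
print("Moderate walking (70-75% of max):", hr_zone(age, 0.70, 0.75))  # roughly 105-112 bpm
```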

Perhaps the most worrying observation was that many people in the control group showed measurable declines in both physical fitness and memory. Staying active isn’t just about boosting your performance; it’s necessary just to keep what you’ve got.

And while the headline result was that the interval group did better, Heisz points out that previous work by her group and others has found that more moderate exercise can also help, albeit over a longer time frame. Adding intensity – “this can be as simple as adding hills to [a] daily walk or picking up the pace between light posts,” she says – just speeds things up.

In fact, when you zoom out to survey the larger body of literature in this area, the picture gets even murkier. Yes, both moderate and intense aerobic exercises such as walking or running seem to work. But so does resistance training, as do “gross motor skills” exercises that engage your brain as well as your body, such as yoga or ball games or simply walking in a rich and unpredictable outdoor environment rather than on a treadmill.

Louis Bherer, a researcher who studies exercise and brain health at the University of Montreal, points to a recent report by the Global Council on Brain Health, which concludes that there simply isn’t enough evidence yet to pinpoint a “best” form of exercise for warding off cognitive decline.

With that in mind, Bherer suggests aiming for a mix of different types of exercise – and, more important, finding something you enjoy. “The name of the game is to sustain it,” he says. “As soon as you stop, you start losing the benefits.”

In Heisz’s study, the most consistent feedback was how much subjects liked the social aspects of group training. The ingredients of an enjoyable exercise routine might be different for you, but figuring that out is probably more important than fretting over your heart rate zone.

“The positive message from this study,” Heisz says, “is that it is never too late to start.”

Alex Hutchinson is the author of Endure: Mind, Body, and the Curiously Elastic Limits of Human Performance. Follow him on Twitter @sweatscience.

Is Grassfed Meat and Dairy Better for Human and Environmental Health?

The health of livestock, humans, and environments is tied to plant diversity—and associated phytochemical richness—across landscapes. Health is enhanced when livestock forage on phytochemically rich landscapes, is reduced when livestock forage on simple mixture or monoculture pastures or consume high-grain rations in feedlots, and is greatly reduced for people who eat highly processed diets. Circumstantial evidence supports the hypothesis that phytochemical richness of herbivore diets enhances biochemical richness of meat and dairy, which is linked with human and environmental health. Among many roles they play in health, phytochemicals in herbivore diets protect meat and dairy from protein oxidation and lipid peroxidation that cause low-grade systemic inflammation implicated in heart disease and cancer in humans. Yet, epidemiological and ecological studies critical of red meat consumption do not discriminate among meats from livestock fed high-grain rations as opposed to livestock foraging on landscapes of increasing phytochemical richness. The global shift away from phytochemically and biochemically rich wholesome foods to highly processed diets enabled 2.1 billion people to become overweight or obese and increased the incidence of type II diabetes, heart disease, and cancer. Unimpeded, these trends will add to a projected substantial increase in greenhouse gas emissions (GHGE) from producing food and clearing land by 2050. While agriculture contributes one quarter of GH…

Source: Is Grassfed Meat and Dairy Better for Human and Environmental Health?

Reflections on our climate emergency #1

This mirrors my angst in many ways.

Abigail Hopewell

Whoever we are, whatever we do, and wherever we live, every single one of us will be increasingly affected by anthropogenic climate change and the world’s unfolding sixth mass extinction.

My wake-up call

I remember being concerned about the health and wellbeing of planet earth from a relatively young age. My first proper wake-up call occurred in the late 1980s: as a pre-teen I learned about the escalating worries over the effects of CFC gases on the ozone layer, I cared deeply about the plight of endangered animals, and I felt upset by humanity’s cruel and destructive actions on the planet and her inhabitants. Resolute that I had to do something, I joined Friends of the Earth, became a vegetarian, and got into politics. So began an interest and passion for the environment and natural world that has never really gone away, albeit one that has ebbed and flowed over the years.

Fast…


Think before you post.

The Dark Psychology of Social Networks

Why it feels like everything is going haywire

Story by Jonathan Haidt and Tobias Rose-Stockwell

Suppose that the biblical story of Creation were true: God created the universe in six days, including all the laws of physics and all the physical constants that apply throughout the universe. Now imagine that one day, in the early 21st century, God became bored and, just for fun, doubled the gravitational constant. What would it be like to live through such a change? We’d all be pulled toward the floor; many buildings would collapse; birds would fall from the sky; the Earth would move closer to the sun, reestablishing orbit in a far hotter zone.

Let’s rerun this thought experiment in the social and political world, rather than the physical one. The U.S. Constitution was an exercise in intelligent design. The Founding Fathers knew that most previous democracies had been unstable and short-lived. But they were excellent psychologists, and they strove to create institutions and procedures that would work with human nature to resist the forces that had torn apart so many other attempts at self-governance.

For example, in “Federalist No. 10,” James Madison wrote about his fear of the power of “faction,” by which he meant strong partisanship or group interest that “inflamed [men] with mutual animosity” and made them forget about the common good. He thought that the vastness of the United States might offer some protection from the ravages of factionalism, because it would be hard for anyone to spread outrage over such a large distance. Madison presumed that factious or divisive leaders “may kindle a flame within their particular States, but will be unable to spread a general conflagration through the other States.” The Constitution included mechanisms to slow things down, let passions cool, and encourage reflection and deliberation.

Madison’s design has proved durable. But what would happen to American democracy if, one day in the early 21st century, a technology appeared that—over the course of a decade—changed several fundamental parameters of social and political life? What if this technology greatly increased the amount of “mutual animosity” and the speed at which outrage spread? Might we witness the political equivalent of buildings collapsing, birds falling from the sky, and the Earth moving closer to the sun?

America may be going through such a time right now.

What Social Media Changed

Facebook’s early mission was “to make the world more open and connected”—and in the first days of social media, many people assumed that a huge global increase in connectivity would be good for democracy. As social media has aged, however, optimism has faded and the list of known or suspected harms has grown: Online political discussions (often among anonymous strangers) are experienced as angrier and less civil than those in real life; networks of partisans co-create worldviews that can become more and more extreme; disinformation campaigns flourish; violent ideologies lure recruits.

The problem may not be connectivity itself but rather the way social media turns so much communication into a public performance. We often think of communication as a two-way street. Intimacy builds as partners take turns, laugh at each other’s jokes, and make reciprocal disclosures. What happens, though, when grandstands are erected along both sides of that street and then filled with friends, acquaintances, rivals, and strangers, all passing judgment and offering commentary?

The social psychologist Mark Leary coined the term sociometer to describe the inner mental gauge that tells us, moment by moment, how we’re doing in the eyes of others. We don’t really need self-esteem, Leary argued; rather, the evolutionary imperative is to get others to see us as desirable partners for various kinds of relationships. Social media, with its displays of likes, friends, followers, and retweets, has pulled our sociometers out of our private thoughts and posted them for all to see.

If you constantly express anger in your private conversations, your friends will likely find you tiresome, but when there’s an audience, the payoffs are different—outrage can boost your status. A 2017 study by William J. Brady and other researchers at NYU measured the reach of half a million tweets and found that each moral or emotional word used in a tweet increased its virality by 20 percent, on average. Another 2017 study, by the Pew Research Center, showed that posts exhibiting “indignant disagreement” received nearly twice as much engagement—including likes and shares—as other types of content on Facebook.
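
To put the first of those findings in concrete terms, here is a back-of-the-envelope sketch. It assumes, purely for illustration, that the roughly 20 percent per-word effect compounds multiplicatively; the study reports an average per-word increase, not an exact formula.

```python
# Illustrative arithmetic for the NYU finding: each moral or emotional word
# raised a tweet's expected reach by roughly 20 percent on average. Assuming
# (as a simplification) that the effect compounds multiplicatively:

PER_WORD_BOOST = 1.20  # ~20% increase per moral/emotional word

def relative_reach(n_words: int) -> float:
    return PER_WORD_BOOST ** n_words

for n in range(5):
    print(f"{n} moral/emotional words -> {relative_reach(n):.2f}x baseline reach")
# e.g., 3 such words -> about 1.73x baseline reach; 4 -> about 2.07x
```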

The philosophers Justin Tosi and Brandon Warmke have proposed the useful phrase moral grandstanding to describe what happens when people use moral talk to enhance their prestige in a public forum. Like a succession of orators speaking to a skeptical audience, each person strives to outdo previous speakers, leading to some common patterns. Grandstanders tend to “trump up moral charges, pile on in cases of public shaming, announce that anyone who disagrees with them is obviously wrong, or exaggerate emotional displays.” Nuance and truth are casualties in this competition to gain the approval of the audience. Grandstanders scrutinize every word spoken by their opponents—and sometimes even their friends—for the potential to evoke public outrage. Context collapses. The speaker’s intent is ignored.

Human beings evolved to gossip, preen, manipulate, and ostracize. We are easily lured into this new gladiatorial circus, even when we know that it can make us cruel and shallow. As the Yale psychologist Molly Crockett has argued, the normal forces that might stop us from joining an outrage mob—such as time to reflect and cool off, or feelings of empathy for a person being humiliated—are attenuated when we can’t see the person’s face, and when we are asked, many times a day, to take a side by publicly “liking” the condemnation.

In other words, social media turns many of our most politically engaged citizens into Madison’s nightmare: arsonists who compete to create the most inflammatory posts and images, which they can distribute across the country in an instant while their public sociometer displays how far their creations have traveled.

Upgrading the Outrage Machine

At its inception, social media felt very different than it does today. Friendster, Myspace, and Facebook all appeared between 2002 and 2004, offering tools that helped users connect with friends. The sites encouraged people to post highly curated versions of their lives, but they offered no way to spark contagious outrage. This changed with a series of small steps, designed to improve user experience, that collectively altered the way news and anger spread through American society. In order to fix social media—and reduce its harm to democracy—we must try to understand this evolution.

When Twitter arrived in 2006, its primary innovation was the timeline: a constant stream of 140-character updates that users could view on their phone. The timeline was a new way of consuming information—an unending stream of content that, to many, felt like drinking from a fire hose.

Later that year, Facebook launched its own version, called the News Feed. In 2009, it added the “Like” button, for the first time creating a public metric for the popularity of content. Then it added another transformative innovation: an algorithm that determined which posts a user would see, based on predicted “engagement”—the likelihood of an individual interacting with a given post, figuring in the user’s previous likes. This innovation tamed the fire hose, turning it into a curated stream.
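
As a rough sketch of what that kind of curation amounts to (not Facebook’s actual system, whose details are proprietary), an engagement-ranked feed boils down to sorting content by a predicted-interaction score rather than by time or source:

```python
# Minimal, hypothetical sketch of engagement-based feed ranking. The real
# News Feed algorithm is proprietary and far more complex; this only
# illustrates the basic idea of ordering content by predicted engagement.

from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_engagement: float  # model's estimate that this user will interact

def rank_feed(posts):
    # Chronology and source credibility play no role here: whatever is
    # predicted to generate the most engagement rises to the top of the feed.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("local newspaper", "City council passes annual budget", 0.02),
    Post("anonymous blog", "You won't believe what they're hiding!", 0.35),
])
for post in feed:
    print(f"{post.author}: {post.text}")
```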

The News Feed’s algorithmic ordering of content flattened the hierarchy of credibility. Any post by any producer could stick to the top of our feeds as long as it generated engagement. “Fake news” would later flourish in this environment, as a personal blog post was given the same look and feel as a story from The New York Times. 


Twitter also made a key change in 2009, adding the “Retweet” button. Until then, users had to copy and paste older tweets into their status updates, a small obstacle that required a few seconds of thought and attention. The Retweet button essentially enabled the frictionless spread of content. A single click could pass someone else’s tweet on to all of your followers—and let you share in the credit for contagious content. In 2012, Facebook offered its own version of the retweet, the “Share” button, to its fastest-growing audience: smartphone users.

Chris Wetherell was one of the engineers who created the Retweet button for Twitter. He admitted to BuzzFeed earlier this year that he now regrets it. As Wetherell watched the first Twitter mobs use his new tool, he thought to himself: “We might have just handed a 4-year-old a loaded weapon.”

The coup de grâce came in 2012 and 2013, when Upworthy and other sites began to capitalize on this new feature set, pioneering the art of testing headlines across dozens of variations to find the version that generated the highest click-through rate. This was the beginning of “You won’t believe …” articles and their ilk, paired with images tested and selected to make us click impulsively. These articles were not usually intended to cause outrage (the founders of Upworthy were more interested in uplift). But the strategy’s success ensured the spread of headline testing, and with it emotional story-packaging, through new and old media alike; outrageous, morally freighted headlines proliferated in the following years. In Esquire, Luke O’Neil reflected on the changes wrought on mainstream media and declared 2013 to be “The Year We Broke the Internet.” The next year, Russia’s Internet Research Agency began mobilizing its network of fake accounts, across every major social-media platform—exploiting the new outrage machine in order to inflame partisan divisions and advance Russian goals.

The internet, of course, does not bear sole responsibility for the pitch of political anger today. The media have been fomenting division since Madison’s time, and political scientists have traced a portion of today’s outrage culture to the rise of cable television and talk radio in the 1980s and ’90s. A multiplicity of forces are pushing America toward greater polarization. But social media in the years since 2013 has become a powerful accelerant for anyone who wants to start a fire.

The Decline of Wisdom

Even if social media could be cured of its outrage-enhancing effects, it would still raise problems for the stability of democracy. One such problem is the degree to which the ideas and conflicts of the present moment dominate and displace older ideas and the lessons of the past. As children grow up in America, rivers of information flow continually into their eyes and ears—a mix of ideas, narratives, songs, images, and more. Suppose we could capture and quantify three streams in particular: information that is new (created within the past month), middle-aged (created 10 to 50 years ago, by the generations that include the child’s parents and grandparents), and old (created more than 100 years ago).

Whatever the balance of these categories was in the 18th century, the balance in the 20th century surely shifted toward the new as radios and television sets became common in American homes. And that shift almost certainly became still more pronounced, and quickly so, in the 21st century. When the majority of Americans began using social media regularly, around 2012, they hyper-connected themselves to one another in a way that massively increased their consumption of new information—entertainment such as cat videos and celebrity gossip, yes, but also daily or hourly political outrages and hot takes on current events—while reducing the share of older information. What might the effect of that shift be?

In 1790, the Anglo-Irish philosopher and statesman Edmund Burke wrote, “We are afraid to put men to live and trade each on his own private stock of reason; because we suspect that this stock in each man is small, and that the individuals would do better to avail themselves of the general bank and capital of nations and of ages.” Thanks to social media, we are embarking on a global experiment that will test whether Burke’s fear is valid. Social media pushes people of all ages toward a focus on the scandal, joke, or conflict of the day, but the effect may be particularly profound for younger generations, who have had less opportunity to acquire older ideas and information before plugging themselves into the social-media stream.

Our cultural ancestors were probably no wiser than us, on average, but the ideas we inherit from them have undergone a filtration process. We mostly learn of ideas that a succession of generations thought were worth passing on. That doesn’t mean these ideas are always right, but it does mean that they are more likely to be valuable, in the long run, than most content generated within the past month. Even though they have unprecedented access to all that has ever been written and digitized, members of Gen Z (those born after 1995 or so) may find themselves less familiar with the accumulated wisdom of humanity than any recent generation, and therefore more prone to embrace ideas that bring social prestige within their immediate network yet are ultimately misguided.

For example, a few right-wing social-media platforms have enabled the most reviled ideology of the 20th century to draw in young men hungry for a sense of meaning and belonging and willing to give Nazism a second chance. Left-leaning young adults, in contrast, seem to be embracing socialism and even, in some cases, communism with an enthusiasm that at times seems detached from the history of the 20th century. And polling suggests that young people across the political spectrum are losing faith in democracy.

Is There Any Way Back?

Social media has changed the lives of millions of Americans with a suddenness and force that few expected. The question is whether those changes might invalidate assumptions made by Madison and the other Founders as they designed a system of self-governance. Compared with Americans in the 18th century—and even the late 20th century—citizens are now more connected to one another, in ways that increase public performance and foster moral grandstanding, on platforms that have been designed to make outrage contagious, all while focusing people’s minds on immediate conflicts and untested ideas, untethered from traditions, knowledge, and values that previously exerted a stabilizing effect. This, we believe, is why many Americans—and citizens of many other countries, too—experience democracy as a place where everything is going haywire.


It doesn’t have to be this way. Social media is not intrinsically bad, and has the power to do good—as when it brings to light previously hidden harms and gives voice to previously powerless communities. Every new communication technology brings a range of constructive and destructive effects, and over time, ways are found to improve the balance. Many researchers, legislators, charitable foundations, and tech-industry insiders are now working together in search of such improvements. We suggest three types of reform that might help:

(1) Reduce the frequency and intensity of public performance. If social media creates incentives for moral grandstanding rather than authentic communication, then we should look for ways to reduce those incentives. One such approach already being evaluated by some platforms is “demetrication,” the process of obscuring like and share counts so that individual pieces of content can be evaluated on their own merit, and so that social-media users are not subject to continual, public popularity contests.

(2) Reduce the reach of unverified accounts. Bad actors—trolls, foreign agents, and domestic provocateurs—benefit the most from the current system, where anyone can create hundreds of fake accounts and use them to manipulate millions of people. Social media would immediately become far less toxic, and democracies less hackable, if the major platforms required basic identity verification before anyone could open an account—or at least an account type that allowed the owner to reach large audiences. (Posting itself could remain anonymous, and registration would need to be done in a way that protected the information of users who live in countries where the government might punish dissent. For example, verification could be done in collaboration with an independent nonprofit organization.)

(3) Reduce the contagiousness of low-quality information. Social media has become more toxic as friction has been removed. Adding some friction back in has been shown to improve the quality of content. For example, just after a user submits a comment, AI can identify text that’s similar to comments previously flagged as toxic and ask, “Are you sure you want to post this?” This extra step has been shown to help Instagram users rethink hurtful messages. The quality of information that is spread by recommendation algorithms could likewise be improved by giving groups of experts the ability to audit the algorithms for harms and biases.
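
A minimal sketch of that kind of friction, assuming a hypothetical toxicity_score() classifier (the models platforms actually use are not public), might look like this:

```python
# Illustrative sketch of the "Are you sure you want to post this?" nudge.
# toxicity_score() is a hypothetical stand-in for a trained classifier that
# returns a probability-like score between 0 and 1.

def toxicity_score(text: str) -> float:
    # Placeholder heuristic: real systems compare new comments against ones
    # previously flagged as toxic, using a learned model.
    flagged_phrases = ["idiot", "shut up", "you people"]
    return 1.0 if any(p in text.lower() for p in flagged_phrases) else 0.0

def submit_comment(text: str, confirm) -> bool:
    """Post the comment unless the user withdraws it at the friction prompt."""
    if toxicity_score(text) > 0.8:
        if not confirm("Are you sure you want to post this?"):
            return False  # user thought better of it
    return True

# Example: simulate a user who declines the prompt and withdraws the comment.
posted = submit_comment("Shut up, you idiot", confirm=lambda prompt: False)
print("Posted:", posted)
```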

Many Americans may think that the chaos of our time has been caused by the current occupant of the White House, and that things will return to normal whenever he leaves. But if our analysis is correct, this will not happen. Too many fundamental parameters of social life have changed. The effects of these changes were apparent by 2014, and these changes themselves facilitated the election of Donald Trump.

If we want our democracy to succeed—indeed, if we want the idea of democracy to regain respect in an age when dissatisfaction with democracies is rising—we’ll need to understand the many ways in which today’s social-media platforms create conditions that may be hostile to democracy’s success. And then we’ll have to take decisive action to improve social media.

This article appears in the December 2019 print edition with the headline “Why It Feels Like Everything Is Going Haywire.”

Jonathan Haidt is a social psychologist at the New York University Stern School of Business. He is the author of The Righteous Mind and the co-author of The Coddling of the American Mind, which originated as a September 2015 Atlantic story.

The Seabird’s Cry

It is disturbing but essential to read, and a call to act now!

I can't believe it!

I’ve always enjoyed time spent by the sea, and particularly Britain’s cliffs and the plethora of seabirds to be seen there. Beeston Cliffs, St Abbs Head, South Stack, Duncansby Head, Summer Isles, cliffs of the South West of England and Wales, and more… So many places. Until recently I never questioned if these great massings of seabirds would ever not be there. Yet they are in perilous decline and danger, as are seabird colonies the world over. Industrial fishing, pollution and climate breakdown are presenting insuperable problems to many species. The spectre of multiple extinctions looms.

In his magnificent, illustrated book The Seabird’s Cry, Adam Nicolson takes us through the glory of common species of seabirds, the threats they face and the effects on populations, mostly declining. It is a story beautifully told, yet almost impossible to bear.

A few of my notes will…
