Big Meat just can’t quit antibiotics

News Feed
Sunday, January 8, 2023

Several cattle eat from a trough.
Cattle at the Texana Feeders feedlot in Floresville, Texas. | Daniel Acker/Bloomberg via Getty Images

Meat production is making lifesaving drugs less effective. Where’s the FDA?

The US Food and Drug Administration (FDA) knew that America’s meat industry had a drug problem.

For decades, evidence had amassed that the widespread use of antibiotics to help chickens, pigs, and cattle grow faster — and survive the crowded conditions of factory farms — was causing bacteria to mutate and develop resistance to antibiotics. By 2009, US agriculture companies were buying up two-thirds of what are termed medically important antibiotics — those used in human medicine. This in turn has made those precious, lifesaving drugs less effective for people.

Over time, once easily treatable human infections, like sepsis, urinary tract infections, and tuberculosis, became harder or sometimes impossible to treat. A foundational component of modern medicine was starting to crumble. But it wasn’t until the mid-2010s that the FDA finally took the basic steps of requiring farmers to get veterinary prescriptions for antibiotics and banning the use of antibiotics to make animals grow faster — steps that some European regulators had taken a decade or more prior.

Thanks to those two actions alone, sales of medically important antibiotics for livestock plummeted 42 percent from 2015 to 2017. But according to Matthew Wellington of the Public Interest Research Group, the FDA’s reforms went after the low-hanging fruit, and they didn’t go nearly far enough. Now, in a concerning course reversal, antibiotic sales for use in livestock ticked back up 7 percent from 2017 to 2021, per a new FDA report. The chicken industry, which had led the pack in reducing antibiotic use on farms, bought 12 percent more antibiotics in 2021 than in 2020.

It’s a sobering turn of events with life-and-death implications. In 2019, antibiotic-resistant bacteria directly killed over 1.2 million people, including 35,000 Americans, and more than 3 million others died from diseases where antibiotic resistance played a role — far more than the global toll of HIV/AIDS or malaria, leading the World Health Organization to call antibiotic resistance “one of the biggest threats to global health, food security, and development today.”

Public health advocates want to see the FDA take the threat much more seriously, and often point to Europe as a role model. From 2011 to 2021, antibiotic sales for use in livestock fell by almost half across the European Union, and use per animal is now around half that of the US. Last year, the EU implemented perhaps its most significant reform yet: banning the routine use of antibiotics to prevent disease, reserving their use for only when animals are actually sick. That critical step is expected to slash the continent’s antibiotic use further.

A big, fake syringe is injecting a green liquid into a big, fake piece of meat. John MacDougall/AFP via Getty Images
Activists with the environmental organization Greenpeace campaign against the excessive use of antibiotics in livestock farming in front of an outlet of discount food retailer Lidl, in Berlin on July 25, 2017.

It’s unlikely the FDA will follow in Europe’s footsteps any time soon. Asked about an EU-style ban on preventive use of antibiotics, an FDA spokesperson responded, “The laws in the US and our livestock population are not the same as that of the EU or other countries. The FDA’s initiatives to promote judicious use and reduce AMR [antimicrobial resistance] were devised specifically for the US and the conditions we face with the aim of maximizing effectiveness and cooperation of drug sponsors, veterinarians, and animal producers.”

The FDA and the US food industry have proven that they can make progress on the issue — but to keep antibiotics working, they need to do a lot more. That will require them to tackle beef and pork, two of the more stubborn and complex sectors of America’s meat system that just can’t seem to quit antibiotics, in part because quitting could demand substantive changes to how animals are farmed for food.

The American antibiotic-free revolution that wasn’t

It wasn’t just the FDA’s new rules that caused antibiotic sales for livestock to plunge in a two-year period — Big Chicken played a part too.

In the early 2000s, the nation’s fourth-largest chicken producer, Perdue Farms, began efforts to wean its birds off antibiotics, a goal it achieved in 2016 by changing chickens’ diets and replacing antibiotics with vaccines and probiotics. At first, chicken raised without antibiotics cost 50 percent more, but the company says it has since all but closed the cost differential.

In the mid-2010s, while Perdue was making progress, activists leveraged the momentum and successfully convinced McDonald’s to source chicken raised without medically important antibiotics. Tyson Foods, the nation’s largest poultry producer, then committed to reducing antibiotic use, contributing to a “domino effect” in which producers and restaurants made further pledges to reduce antibiotics in poultry, said Wellington.

By 2020, a little over half of America’s 9 billion chickens farmed for meat were raised without antibiotics, according to an industry survey.

The sea change in chicken production demonstrated it was possible to quickly scale down antibiotics in farming, but it didn’t do much to reduce overall use: the chicken industry accounted for only 6 percent of agricultural antibiotic use in 2016. And the momentum didn’t spread to other parts of the meat business, like beef and pork, which together account for over 80 percent of the medically important antibiotics fed to farmed animals.

Some of the lack of progress in beef and pork comes down to the simple fact that pigs and cattle are raised differently than chickens. Chickens are slaughtered at just six or seven weeks old, so the chance they’ll get sick is lower than it is for pigs, who are slaughtered at six months old, or for cattle, slaughtered at around three years of age.

The chicken industry is also vertically integrated, meaning a company like Tyson or Perdue controls virtually every link in the supply chain, so making big changes like cutting out antibiotics is easier than in the more decentralized supply chain of beef. For example, the typical steer will change hands several times before slaughter, going from a breeder to pasture grazing to a feedlot, all of which make it harder to coordinate an antibiotic-free regimen. In the last few months of their lives, cattle are also fed a high-grain diet that they aren’t adapted to digest, which increases the chance they’ll develop a liver abscess, a condition that’s prevented with — you guessed it — antibiotics.

The pork sector, like poultry, is also vertically integrated, but the industry has largely opposed animal welfare, environmental, and antibiotic reforms. Antibiotics in pig production shot up 25 percent from 2017 to 2021.

There’s also no pork or beef giant that’s taken the antibiotic-free leap like Perdue did for chicken. That could change in the years ahead: McDonald’s, the world’s largest beef purchaser, announced at the end of 2022 that it plans to reduce antibiotic use in its beef supply chain. However, the announcement didn’t come with a timeline, which worries advocates like Wellington, and the company has failed to make good on other pledges.

Although voluntary change can move the needle, without regulation, industry has little incentive to make the dramatic reductions needed to safeguard antibiotics. While the FDA has prohibited meat producers from using antibiotics to speed up growth — their original purpose in agriculture — some of the antibiotics that promote growth, like tylosin, are still allowed for disease prevention, a loophole that disincentivizes producers from reducing antibiotics, Wellington said: “Our concern has always been that they’re just putting a different name on the same kind of use, which is a problem.”

An aerial shot of a few dozen cattle outside in a feedlot. Daniel Acker/Bloomberg via Getty Images
Cattle at a feedlot in Texas.

In response to this concern, an FDA spokesperson said, “Veterinarians are on the front lines and as prescribers, they’re in the best position to ensure that both medically important and non-medically important antimicrobials are being used appropriately.”

Aside from outright banning the routine use of medically important antibiotics to prevent disease, Wellington said he’d like to see the FDA take three actions: set a target of reducing antibiotic use by 50 percent by the end of 2025 (based on 2010 levels); publish data on antibiotic use, not just sales; and limit the duration of antibiotic courses for farmed animals.

An FDA spokesperson said specific reduction targets weren’t possible because the agency doesn’t know how many antibiotics farmers are using: “We cannot effectively monitor antimicrobial use without first putting a system in place for determining [a] baseline and assessing trends over time.” The agency right now only collects sales data, and it’s been exploring a voluntary public-private approach to collect and report real-world use data.

Some states haven’t waited on federal regulators: Maryland and California have both restricted the use of antibiotics on farms.

How the Europeans — and some Americans — are quitting antibiotics on the farm

Just because it’s difficult to reduce antibiotics in beef and pork production doesn’t mean it’s impossible, as the story of Iowa pig farmers Tim and Deleana Roseland demonstrates.

In 2005, they switched from raising pigs in the conventional manner — tightly cramped and fed a steady diet of antibiotics — to raising pigs for Niman Ranch, a higher-welfare meat company now owned by Perdue. That required the Roselands to ditch the routine use of antibiotics.

“I was nervous about it at first but as it turned out, it was no big deal whatsoever,” Tim Roseland said. But he added that it wouldn’t have been possible with his old setup: “There’s too much overcrowding, small pens, too many pigs crammed into a little area.”

Their newer system gives each pig more space in larger pens, along with bedding to root through and chew on; pigs packed into factory farms instead chew on each other. The Roselands also give the pigs more vaccines and feed them probiotics.

And there’s a lot to learn from the Europeans: Denmark, the continent’s second-largest pork producer, has become the de facto case study in how to wean Big Meat off antibiotics. In the early 1990s, it started phasing out antibiotics in pigs with little impact on the industry. From 1992 to 2008, antibiotic use per pig fell by over 50 percent, and while pig mortality went up in the short term, by 2008 it had dropped back to near-1992 levels.

About 10 pigs sleeping together inside a barn. Tom Stoddart/Getty Images
Pigs pictured at a farm in Tilsbaek, Denmark, producing 18,000 piglets per year mainly for the domestic market.

The small country’s transformation wasn’t a matter of rocket science, but a suite of smart management practices: more frequent barn cleaning, better ventilation, later piglet weaning, more space per pig, extra vaccines, and experimenting with feed and additives.

All this comes with difficult tradeoffs: antibiotic-free pork costs more and requires more land, which increases its carbon footprint. But we can’t expect to have cheap meat forever without a cost to public health, an uncomfortable truth that’s led many environmental and public health groups to champion a message of “less but better” meat.

“I think the fact that Denmark, despite very low antibiotic use since 1995, is still one of the biggest pork exporters in the world, already speaks for itself,” said Francesca Chiara, a director at the University of Minnesota’s Center for Infectious Disease Research and Policy.

Given the projected rise of global antibiotic sales for agriculture, Denmark’s example may not be speaking loudly enough. But it’s time we listen — nothing less than the future of human medicine is at stake.

The doomers are wrong about humanity’s future — and its past

Tyler Comrie for Vox

The necessity of progress.

If I wanted to convince you of the reality of human progress, of the fact that we as a species have advanced materially, morally, and politically over our time on this planet, I could quote you chapter and verse from a thick stack of development statistics. I could tell you that a little more than 200 years ago, nearly half of all children born died before they reached their 15th birthday, and that today it’s less than 5 percent globally. I could tell you that in pre-industrial times, starvation was a constant specter and life expectancy was in the 30s at best. I could tell you that at the dawn of the 19th century, barely more than one person in 10 was literate, while today that ratio has been nearly reversed. I could tell you that today is, on average, the best time to be alive in human history.

But that doesn’t mean you’ll be convinced. In one 2017 Pew poll, a plurality of Americans — people who, perhaps more than anywhere else, are heirs to the benefits of centuries of material and political progress — reported that life was better 50 years ago than it is today. A 2015 survey of thousands of adults in nine rich countries found that 10 percent or fewer believed that the world was getting better. On the internet, a strange nostalgia persists for the supposedly better times before industrialization, when ordinary people supposedly worked less and life was allegedly simpler and healthier. (They didn’t and it wasn’t.)

Looking backward, we imagine a halcyon past that never was; looking forward, it seems to many as if, in the words of young environmental activist Greta Thunberg, “the world is getting more and more grim every day.” So it’s boom times for doom times.

But the apocalyptic mindset that has gripped so many of us not only understates how far we’ve come, but how much further we can still go. The real story of progress today is its remarkable expansion to the rest of the world in recent decades. In 1950, life expectancy in Africa was just 40; today, it’s past 62. Meanwhile, more than 1 billion people have moved out of extreme poverty since 1990 alone.

But there’s more to do — much more. That hundreds of millions of people still go without the benefit of electricity or live in states still racked by violence and injustice isn’t so much an indictment of progress as it is an indication that there is still more low-hanging fruit to harvest.

The world hasn’t become a better place for nearly everyone who lives on it because we wished it so. The astounding economic and technological progress made over the past 200 years has been the result of deliberate policies, a drive to invent and innovate, one advance building upon another. And as our material condition improved, so, for the most part, did our morals and politics — not as a side effect, but as a direct consequence. It’s simply easier to be good when the world isn’t zero-sum.

Which isn’t to say that the record of progress is one of unending wins. For every problem it solved — the lack of usable energy in the pre-fossil fuel days, for instance — it often created a new one, like climate change. But just as a primary way climate change is being addressed is through innovation that has drastically reduced the price of clean energy, so progress tends to be the best route to solving the problems that progress itself can create.

Though historians still argue over what the writer Jason Crawford calls “the roots of progress,” the fundamental swerve was the belief that, after eons of relatively little meaningful change, the future could actually be different, and better. But the doomerism that risks overtaking us erodes that belief, and undercuts the policies that give it life. The biggest danger we face today, if we care about actually making the future a more perfect place, isn’t that industrial civilization will choke on its own exhaust or that democracy will crumble or that AI will rise up and overthrow us all. It’s that we will cease believing in the one force that raised humanity out of tens of thousands of years of general misery: the very idea of progress.

How progress solves the problems we didn’t know were problems

Progress may be about where we’re going, but it’s impossible to understand without returning to where we’ve been. So let’s take a trip back to the foreign country that was the early years of the 19th century.

In 1820, according to data compiled by the historian Michail Moatsos, about three-quarters of the world’s population earned so little that they could not afford even a tiny living space, some heat and, hopefully, enough food to stave off malnutrition. It was a state that we would now call “extreme poverty,” except that for most people back then, it wasn’t extreme — it was simply life.

What matters here for the story of progress isn’t the fact that the overwhelming majority of humankind lived in destitution. It’s that this was the norm, and had been the norm since essentially… forever. Poverty, illiteracy, premature death — these weren’t problems, as we would come to define them in our time. They were simply the background reality of being human, as largely unchangeable as birth and death itself. And there were only the slightest inklings at the time that this could or should change.

But those inklings were there, and over time, they began to take root. The Scientific Revolution began in the 1500s, as figures like the English philosopher Francis Bacon introduced the idea that through trial and experimentation, scientific knowledge could be advanced, and with it, the human condition itself. Over time, the abstract concepts and discoveries of the Scientific Revolution led to the machines and raw power of what would be dubbed the Industrial Revolution; to James Watt’s steam engine and Michael Faraday’s electric generator and Richard Arkwright’s Cromford Mill, the progenitor of the modern factory. Advances in our ability to generate energy, advances in our ability to harness that energy for work, and advances in our ability to create an economic system that got the most out of both of those factors all intermingled.

And that is when human life began to truly change, in a way that was so massive and, eventually, so all-encompassing, that we still struggle to grasp its sheer scale. Economic historian Deirdre McCloskey has simply dubbed it “the Great Fact.”

In his 2022 book Slouching Towards Utopia: An Economic History of the 20th Century, the economic historian Brad DeLong uses a simple data point to describe just how much progress occurred after 1870, once the advances of the Industrial Revolution had been fully consolidated and political improvements began to follow economic ones. In 1870, an average unskilled male worker in London could earn enough per day to buy 5,000 calories worth of food for himself and his family. That was more than in 1600, but not significantly more, and not enough to easily feed everyone consistently, given that mean household size in England at the time was just under five people. By 2010 — the end of what DeLong in his book called “the long twentieth century” — that same worker could afford to buy the equivalent of 2.4 million calories of food per day, a nearly 50,000 percent increase.

The simplest fact about the Great Fact might be this: Without it, chances are I wouldn’t be writing this article and you wouldn’t be reading it. Between 10,000 BCE and 1700, the average global population growth rate was just 0.04 percent per year. And that wasn’t because human beings weren’t having babies. They were simply dying, in great numbers: at birth, giving birth, in childhood from now-preventable diseases, and in young adulthood from now-preventable wars and violence.

We were stuck in the Malthusian Trap, named after the 18th-century English cleric and economist Thomas Malthus: any increase in food production or other resources that allowed the population to grow was quickly consumed by that increased population, which then led to food shortages and population decline. (It’s striking that one of the few real spikes in wages and standard of living in pre-industrial times came in the aftermath of the Black Death, which killed off perhaps 30 percent of Europe’s population. Those who survived were able to command higher wages to work empty land — but a deadly pandemic is no reasonable person’s idea of a sustainable economic growth program.)

Viewed from one angle, the human population before the Industrial Revolution was in an ecological balance of the sort we might aim to preserve if humanity were just another wild species plying its environmental niche, its numbers kept in check by violence, disease, and starvation. But that nearly flat line of population growth, century after century, hides an untellable story of misery and suffering, one of children dead before their time, of families snuffed out by starvation, of potential and of people that would never have the chance to be realized. It was a story, as the writer Bill Bryson has put it, of “tiny coffins.”

It was only with the progress of industrialization that we broke out of the Malthusian Trap, producing enough food to feed the mounting billions, enough scientific breakthroughs to conquer old killers like smallpox and the measles, and enough political advances to drive down violent death. Between 1800 and today, our numbers grew from around 1 billion to 8 billion. And that 8 billion aren’t just healthier, richer, and better educated. On average, they can expect to live more than twice as long.

The writer Steven Johnson has called this achievement humanity’s “extra life” — but that extra isn’t just the decades that have been added to our lifespans. It’s the extra people that have been added to our numbers. I’m probably one of them, and you probably are too.

The Malthusian Trap isn’t easy to escape, and the progress we’ve earned has hardly been uninterrupted or perfectly distributed. The past two centuries have seen by far the bloodiest conflicts in human history, punctuated by the invention of weapons that could conceivably end humanity. Well into the 20th century, billions still lived lives that were materially little different from their impoverished ancestors. But if progress hasn’t yet fully broken the Malthusian Trap physically, it has done so psychologically. Once we could prove in practice that the lot of humanity didn’t have to be hand-to-mouth existence, we could see that progress could continue to expand.

The long twentieth century came late to the Global South, but it did get there. Between 1960 and today, India and China, together home to nearly one in every three people alive today, have seen life expectancy rise from 45 to 70 and 33 to 78, respectively. Per-capita GDP over those years rose some 2,600 percent for India and an astounding 13,400 percent for China, with the latter lifting an estimated 800 million people out of extreme poverty.

In the poorer countries of sub-Saharan Africa, progress has been slower and later, but shouldn’t be underestimated. When we see the drastic decline in child mortality — which has fallen since 1990 from 18.1 percent of all children in that region to 7.4 percent in 2021 — or the more than 20 million measles deaths that have been prevented since 2000 in Africa alone, this is progress continuing to happen now, with the benefits overwhelmingly flowing to the poorest among us.

The simplest argument for why we need to continue to build on that legacy is found in the places where it has continued to fall short. The fact that as of 2016 some 13 percent of the world still lacked access to electricity — the invisible foundation of modernity — is just as worthy of our worry. The fact that 85 percent of the world — a little less than 7 billion people — lives on less than $30 a day should keep us awake at night. Because that, as much as any existential challenge we fear hurtling toward us, shouldn’t simply be accepted as inevitable.

How progress can solve the problem of being human

On January 6, 1941 — 11 months before Pearl Harbor — President Franklin Delano Roosevelt gave his State of the Union speech. But it’s better known by another name: the Four Freedoms speech. As much of the world was engulfed in what would become the greatest and bloodiest conflict in human history, Roosevelt told Congress that “we look forward to a world founded upon four essential human freedoms”: freedom of speech, freedom of worship, freedom from want, and freedom from fear.

There are human values that can’t be captured in dry economic statistics: life, liberty, the pursuit of happiness. If our world had somehow become as rich and as long-lived as it is today with a system of political liberties and human rights frozen in 1820, we might barely consider it progress at all. Except that we have seen startling improvements in everything from political liberty to democratic representation to human rights to even the way we treat some (if not all) animals.

In 1800, according to Our World in Data, zero — none, nada, zip — people lived in what we would now classify as a liberal democracy. Just 22 million people — about 2 percent of the global population — lived in what the site classifies as “electoral autocracies,” meaning that what democracy they had was limited, and limited to a subset of the population. One hundred years later, things weren’t much better — there were actual liberal democracies, but fewer than 1 percent of the world’s population lived in them.

But in the decades that followed FDR’s “Four Freedoms” speech, things changed radically, thanks to the defeat of fascist powers, the spread of civil rights within existing democracies, and eventually, the collapse of the communist world. Today, just 2 billion people live in countries that are classified as closed autocracies — relatively few legal rights, no real electoral democracy — and most of them are in China.

That doesn’t mean that the liberal democracies that exist are perfect by any means, the US very much included. Nor does it mean that periods of advancement weren’t followed by periods of retrenchment or worse. Progress, especially in politics and morals, doesn’t flow as steadily as a calendar — just compare Germany in 1929 to Germany in 1939. But all you have to do is roll the clock back a few decades to see the way that rights, on the whole, have been extended wider and wider: to LGBTQ citizens, to people of color, to women.

The fundamental fact is that as much as the technological and economic world of 2023 would be unrecognizable to people in 1800, the same is true of the political world. Nor can you disentangle that political progress from material progress. Take the gradual but definitive emancipation of women. That has been a hard-fought, ongoing battle, chiefly waged by women who saw the inherent unfairness of a male-dominated society. But it was aided by the invention of labor-saving technologies in the home like washing machines and refrigerators that primarily gave time back to women and made it easier for them to move into the workforce.

These are all examples of the expansion of the circle of moral concern — the enlargement of who and what is considered worthy of respect and rights, from the foundation of the family or tribe all the way to humans around the world (and increasingly non-human animals as well). And it can’t be separated from the hard fact of material progress.

The pre-industrial world was a zero-sum one — that, ultimately, is what the Malthusian Trap means. In a zero-sum world, you advance only at the expense of others, by taking from a set stock, not by adding, which is why wars of conquest between great powers were so common hundreds of years ago, or why homicide between neighbors was so much more frequent in the pre-industrial era. We have obviously not eradicated violence, including by the state itself. But a society that can produce more of what it needs and wants is one that will be less inclined to fight over what it has, either with its neighbors or with itself. It’s not that the humans of 2023 are necessarily better, more moral, than their ancestors 200 or more years ago. It’s that war and violence cease to make economic sense.

But just as every bloody day of the Russia-Ukraine war demonstrates that moral and political progress hasn’t eradicated our violent tendencies, the material progress that has helped meet our most basic needs has opened the door to new, knottier problems: to climate change caused by industrialization, to the ills of a longer-lived society, to the mental health challenges that arise once we no longer need to worry about surviving and instead need to worry about living. And with it comes the temptation to turn back and give in.

How progress creates new problems — and solves them anew

On September 7, 1898, just as the world was finally escaping the Malthusian Trap, the chemist William Crookes told the British Association for the Advancement of Science that we were in danger of falling back into it. According to Crookes, the UK’s rapidly growing population was at risk of running out of food. There wasn’t more room for additional farmland on the isles, which meant the only way to increase the food supply was to boost agricultural productivity. That required nitrogen fertilizer, but existing supplies of nitrogen at the time came from natural sources like guano deposits in Peru, and they were running out. The world faced, he said, “a life-and-death question for generations to come.”

On the face of it, this appeared to be Malthus’s revenge. The British population had exploded, but now it was meeting its natural limits, and nature’s correction was coming. But Crookes had a solution, one he believed we could literally pull out of thin air. The Earth’s atmosphere is 78 percent nitrogen. Crookes challenged his audience to develop a way to artificially fix that atmospheric nitrogen into synthetic fertilizer, and with it, produce enough food for Britain and the world.

Less than 20 years later, two German scientists managed to do just that, developing the Haber-Bosch process to synthesize ammonia out of atmospheric nitrogen and hydrogen, which became a cornerstone of synthetic fertilizer. Combined with the increasing mechanization of agriculture, food production kept growing, and now it’s estimated that half the population alive today is dependent on the existence of synthetic fertilizer.

Viewed from this angle, the story of synthetic fertilizer from Crookes to Haber and Bosch is one of just-so progress, of technological advancement rising to meet growing need. And so it is. But the story doesn’t stop there. The synthetic fertilizer industry produces about 2.6 gigatons of carbon per year — more than aviation and shipping combined. Its abundance has led to overapplication, so much so that about two-thirds of the nitrogen farmers apply to crops isn’t used by plants at all, but instead runs off into the surrounding environment as a pollutant, causing dead zones like the massive one found in the Gulf of Mexico. Oh, and Fritz Haber himself later dedicated his career to developing chemical weapons that would kill thousands in World War I.

So the Haber-Bosch process solved one problem, while creating new ones. That’s the story of progress as well. In fact, you’d be hard pressed to find a single scientific or technological advancement that doesn’t create an element of blowback. Breakthroughs in nuclear physics made possible the creation of zero-carbon nuclear plants — and also, the nuclear bombs that still hold the world hostage. The introduction of antibiotics may have added as much as eight years to global life expectancy, but the more they’re used, the more resistance builds up, paving the way for the next potential pandemic of antibiotic-resistant infections. Striking gains in agricultural productivity have eliminated the threat of famine in all but the poorest countries, but also contributed to a problem that was basically unheard of until recently: widespread obesity. Advances in animal breeding and diet have made meat cheap and widespread, but at the cost of local pollution and the creation of a factory farming system that sentences billions of domesticated animals to lives of terrible suffering.

Above all else is climate change. If there was a single material ingredient to the Industrial Revolution and all that followed, it was coal. Coal fired the factories, coal fueled the railroads, coal made industrialization happen. It’s still coal, along with other fossil fuels like oil and natural gas, that provides the bulk of global energy consumption — energy consumption that has risen more than 3,000 percent since 1800. And coal and its fellow fossil fuels are by far the top contributors to climate change. Coal helped create progress, and coal helped create climate change.

And if the story ended there, it might be reasonable to look at progress very differently. It might even be reasonable to agree with de-growthers whose demand, at the end of the day, is that we must stop economic growth and throw it in reverse, or face doom.

But the story doesn’t end there. The solutions produced by progress create new problems, but almost every time, we’ve managed to find new solutions. The Haber-Bosch process created the new problem of fertilizer overapplication and pollution, but smart agriculture can get the same or greater crop yields with less fertilizer — a change that is already underway — while synthetic biology offers the promise of engineering crops that can effectively fertilize themselves. Obesity has proven to be a stubbornly resistant health problem, but new drugs and surgical treatments are poised to make it easier to lose weight even in an environment where food is everywhere. Climate change will be the most difficult challenge of all, but progress is bringing down the cost of renewable energy, reducing energy waste, and putting new forms of cleaner energy on the horizon. Progress — and that is the word — on climate policy and innovation has already bent the curve away from the worst-case climate scenarios. We’re not on course for utopia, but we no longer appear to be headed toward doom either. (At least, not this particular doom.)

Before the Industrial Age, we lived in balance, but that balance, for the bulk of humanity, was a terrible place to be — worse, by many measures, than any future we could fear. With industrialization, after tens of thousands of years on this planet, we began to change that. But we also began a race: Could we keep inventing new technologies, new approaches, that would keep us ahead of the new challenges that progress created? Could we keep solving the problems of success?

So far, the answer has been a qualified yes. Past performance is no guarantee of future results, but we have every reason to believe that we have far more race to run. That race can feel exhausting, even pathological — the endless sentence of a species that can never seem to be satisfied. Doomerism, at its heart, may be that exhaustion made manifest. But just as we need continued advances in clean tech or biosecurity to protect ourselves from some of the existential threats we’ve inadvertently created, so do we need continued progress to address the problems that have been with us always: of want, of freedom, even of mortality.

Nothing can dispel the terminal exhaustion that seems endemic in 2023 better than the idea that there is so much more left to do to lift millions out of poverty and misery while protecting the future — which is possible, thanks to the path of the progress we’ve made. And we’ll know we’re successful if our descendants can one day look back on the present with the same mix of sympathy and relief with which we should look back on our past.

How, they’ll wonder, did they ever live like that?

Climate Activists Are Turning Their Attention to Hollywood

If TV can change Americans’ views on gay marriage, why not the environment?

On a warm, windy fall night in Los Angeles, I stood in a conference room at the Warner Bros. Discovery television-production offices, straightened my spine, and stared down my showrunner, preparing to defend my idea for a minor character in our near-future science-fiction series.“This character needs a backstory, and switching jobs because she wants to work in renewable energy and not for an oil company fits perfectly,” I told the unsmiling head honcho.His face twisted, as if his assistant had delivered the wrong lunch. “Too complicated. That just feels like a lot of information to cram into a backstory. What if her story is that she wants this job because it’s near where her brother was killed in a terrorist attack? We’d just need to invent a terrorist attack.”As I tried to come up with a response, I looked at the writers, lawyers, agents, and camera operators surrounding us. I was taking part in a workshop organized by the Climate Ambassadors Network (CAN), a group of young climate activists working in Hollywood. Along with three other workshop participants, I had received a yellow index card with a mission: to convince this pretend showrunner—a documentary filmmaker in real life—that a character in the series needed a climate-related backstory.Ali Weinstein, a 28-year-old wearing a flowered jumpsuit and a dimpled grin, leaned in to hear my answer. She was all too familiar with this situation: When working as a showrunner’s assistant, she had often suggested climate story lines to her bosses, only to be rebuffed. Now, Weinstein is using that experience to help others make a stronger case for climate stories. The goal of CAN is to “infiltrate every part of the industry with climate knowledge,” Weinstein, who is now a television writer, told the group. “Hollywood is a huge cultural influence, and so if we are starting change within Hollywood, we can change a lot of other industries as well.”While Weinstein’s “infiltration” is hardly sinister, her mission is still a provocative one. The world urgently needs to slow the destructive march of climate change, but using entertainment to send social messages can be a fraught endeavor (as well as the source of a lot of cringe television). And the industry has little experience with climate stories: A collaboration between USC’s Media Impact Project and a nonprofit story consultancy called Good Energy found that 2.8 percent of the 37,453 film and television scripts that aired in the United States and were written between 2016 and 2020 used any climate change key words. Ten percent of stories that depicted “extreme weather events” such as hurricanes and wildfires tied them to any form of climate change.Weinstein and her allies argue that it’s time for the industry to tell more—and more varied—climate stories, not only to nudge societal attitudes but to create better, more believable entertainment. Television, with its ability to tell stories on a human scale, might have an especially important role to play: Recent research by the Yale Program on Climate Change Communication found that while 65 percent of American adults said they were worried about the climate crisis, only 35 percent reported discussing the topic even occasionally. Could the stories we see on-screen, in the intimacy of our homes, get us talking about the realities of life in an altered climate?Anna Jane Joyner, who founded the Good Energy story agency, says climate stories are infinitely more varied than writers and audiences might assume. 
When researchers from her agency and USC asked 2,000 people for examples of climate-themed movies or television shows, the most frequent answers were The Day After Tomorrow, which is almost 20 years old, and 2012, which is about the end of the world, not climate change.In an effort to expand Hollywood’s definition of a climate story, Joyner’s group created the Good Energy Playbook, a guide for writers who want to integrate climate change into their scripts. The playbook encourages writers to think beyond apocalypse, and instead approach climate change, in all its awful manifestations, as an opportunity for more inventive scriptwriting. What would a climate story look like as a Hallmark holiday movie? Could a rom-com be set at a ski resort that can no longer depend on snow and has to pivot to another business model? How might a hotter summer impact the Mafia’s waste-disposal work—and would Tony Soprano talk about it in his therapy sessions?In October 2021, the medical drama Grey’s Anatomy aired an episode called “Hotter than Hell” that depicted a heat wave in Seattle. It was a story proverbially ripped from the headlines: The previous summer, a record-shattering heat dome had enveloped the Pacific Northwest, causing ecological turmoil and human misery. Zoanne Clack, a former ER physician who is an executive producer of Grey’s Anatomy and Station Eleven, wanted to feature a disease caused by climate change, but none of the possibilities were acute enough to work within the show. She opted for a failing HVAC system that created dangerously high temperatures in the hospital’s operating rooms.Clack says climate change is now so familiar to viewers that it can serve as a convenient cheat code for scriptwriters. While Grey’s Anatomy strained viewers’ credulity in its fifth season, in 2008, with an episode about a freak ice storm—an unusual occurrence in Seattle, where the show is set—Clack says now she can attribute all kinds of wild, injury-inducing weather to climate change. “You don’t have to explain anything or go into big discussions about it, how weird it is. If you say those two words, people get it.”Dorothy Fortenberry, a writer and producer on The Handmaid’s Tale and the upcoming climate-themed anthology Extrapolations, says writers are only beginning to explore the creative possibilities. “If all the climate stories are the same, and the same type of view, it will be boring and bad,” she says. “My hope is every creative person takes this in the direction that is fruitful for the narrative and we end up with a real panoply of narratives.”Faced with the realities of climate change, some people switch abruptly from complacency to doomerism—perhaps because certainty of any kind feels safer than the muddle of a looming crisis. Joyner says climate-themed stories can help audiences navigate between these extremes, by either offering examples of courage and creativity or providing opportunities to process grief. There’s nothing wrong with apocalypse narratives, she says, but other approaches offer more hope: “Stories help create a world that isn’t the apocalypse, but shows us something more positive or somewhere in between—a vision for something we’re working towards.”Screenwriters have reason to believe that even passing mentions of climate change can transform public attitudes. Americans watch an average of three hours of television every day, meaning that they spend almost a fifth of their waking lives in the worlds it creates. 
History shows that issues raised on television can lead to real-world change—for better, and for worse.Producers of the show Cheers, which aired from 1982 to 1993, helped to popularize the concept of a “designated driver” by showing the patrons of the show’s eponymous bar calling cabs or getting a ride home after drinking. The term, which Harvard’s Center for Health Communication began promoting in 1988 in an effort to prevent alcohol-related deaths, became so common after appearing in the show’s dialogue that in 1991, it was listed in the Random House Webster’s College Dictionary. Seven years later, a poll showed that a majority of adults who drank had either been a designated driver or been driven home by one—and the uptake of the practice was strongest among the youngest drinkers. Between 1988 and 1992, alcohol-related traffic fatalities dropped by 25 percent, a decrease researchers attributed in part to the messages of shows like Cheers.Will & Grace, which first appeared on NBC in 1998, was the first popular sitcom with two gay lead characters. At the height of the show’s popularity, 17.3 million people tuned in each week to watch two successful men living openly as a couple. In 2006, the final year of the show’s original run, a study analyzed attitudes around homosexuality. “For those viewers with the fewest direct gay contacts, exposure to Will & Grace appears to have the strongest potential influence on reducing sexual prejudice,” the authors wrote, “while for those with many gay friends, there is no significant relationship between levels of prejudice and their exposure to the show.” In 2012, then Vice President Joe Biden cited the show as one reason for Americans’ support of marriage equality—cementing the show’s legacy as a landmark influence.The power of television isn’t always harnessed for health and equity. A recent study that compared tobacco use in cities that had access to TV in the 1940s, when it first appeared, with those that didn’t concluded that television increased the share of smokers in the population by 5 to 15 percent, creating 11 million additional smokers in the U.S. In 2023, the industry-funded Propane Education and Research Council plans to spend $13 million on its anti-electrification campaign, including $600,000 in fees to “influencers” such as cable-TV home-makeover stars who extol the virtues of propane as they remodel houses. Meanwhile, shows ranging from Lifestyles of the Rich and Famous to The Kardashians glamorize private planes, huge homes, and ways of living that are far from sustainable.When I asked Weinstein about the frequent characterization of climate content as a form of propaganda, she shrugged. Every detail in a TV show is a choice, and in her view, show creators can use those details, and the stories that surround them, to address climate change and its potential solutions—or they can choose not to. Those who choose not to, Weinstein and her allies argue, risk being left behind by their audiences. Most viewers surveyed by the USC and Good Energy researchers believed that they care more about climate change than the characters they see on television and in film. 
Half of the respondents wanted to see more climate-related stories in scripted entertainment, and another quarter were open to them.

Joyner acknowledges that major studios are still wary of being perceived as environmental activists, and that writers, and their bosses, have long avoided touchy political and cultural issues out of fear of alienating audiences: "Historically, there were two things you couldn't talk about in a writers' room: abortion and climate." But resistance from the top might be softening. At COP26, the international climate meeting held in Glasgow, Scotland, in late 2021, 12 of the U.K. and Ireland's biggest entertainment CEOs signed a Climate Content Pledge, and representatives of major U.S. studios now meet regularly to discuss how to better represent sustainability on-screen.

This winter, as rain flooded the streets of Los Angeles and hillsides started to liquefy in Northern California, I logged on to a restricted website to watch a few episodes of a new experiment in climate storytelling: the drama series Extrapolations, which begins streaming tomorrow on Apple TV+. The show begins in the near future, in 2037, and follows a mix of characters into the 2040s and beyond.

In the world of the show, the science is familiar: Oceans are so acidified that biodiversity has dropped precipitously, wildfires rage everywhere, and companies race to bank the genomes of species for a future zoo before they go extinct. Yet the dramatic tension in Extrapolations is less about whether the characters will die from climate change than about how they can live through it, and with it: A rabbi in Miami tries to convince the city that his temple is worth saving from rising seas; a mother struggles to help her young son, who suffers from a genetic heat-related health condition, imagine his future in a warming world.

Scott Z. Burns, a writer and director of Extrapolations, also produced Al Gore's 2006 documentary An Inconvenient Truth. When the film opened, he was confident that its evidence would persuade people to drastically change their ways. "It was like, well, that'll solve the problem—certainly the world can't be the same again after this," Burns told me over Zoom, with a dry laugh. "But I think what we learned is that the problem is so large and so systemic, that obviously a documentary wasn't going to change the way people saw life on Earth or their own behavior."

Burns started to think about storytelling that, instead of threatening disaster, simply brought the event horizon closer, transforming climate change from an unimaginable eventuality into an immediate and pervasive problem. "I wanted to tell human stories set against a world that had a very different climate," he said. He found inspiration in World War II–era novels, movies, and shows, and points out that the war, while obviously a historic tragedy, was also a source of great creative energy for people in the middle of the last century. "Climate is sort of our World War II," he said. "This is the existential moment that this generation faces—and where are the great works of art that help us come to terms with this? I think we're beginning to see them."

As he finished pitching the show in 2020, the coronavirus pandemic began to lock down the world. Burns had also written Contagion, a movie that turned out to be an eerily accurate portrayal of a pandemic's spread. He wanted the scientific underpinnings of Extrapolations to be just as solid.
But while past pandemics informed his work on Contagion, the human-caused climate change we're experiencing today has no precedent. "It's a very reckless gamble," he told me. "But as a screenwriter, a reckless gamble is also a suspense movie. And that's what I tried to do, was look at the science and what it suggests may happen, and then look at human beings and think about how they behave."

Burns found the serialized nature of a television show more compelling than a two-hour movie—it allowed the narrative to unfold in chapters, with overlapping characters and story lines that extended over decades, and it let a viewer follow the slow-moving climate crisis as it intensified.

When Apple TV+ bought Burns's show, he called up Adam McKay, who at the time was working on Don't Look Up, a satire about climate-change denial that was released in 2021. McKay's generous response was instructive, Burns said. "It was like, great—there's more than one cop show. There's more than one hospital show. There needs to be more than one show about this."

Some people will see climate change as a social-justice story, he said, while others will see it as a parenting story. "Everybody has a different way in which this constellation of issues is going to encroach on your life," Burns told me. "My first hope is that we maybe crack open the door a little bit to get executives at streamers and platforms to think, Oh, there's cool work to be done in this area, and artists to think, What stories can we tell in this space that might make a difference?"

Burns has heard the old adage that audiences don't want to watch something that's not hopeful, but he disagrees. He likes to replace the word hope with courage: "What sort of courageous act can my characters do?" he asked. "What I'm interested in telling is the story that says, right up until the moment we're going to die, we get to live. What do we do with that life?"

This story is part of the Atlantic Planet series supported by the HHMI Department of Science Education.

Climate activists must target power structures, not the public | Letters

Dr Laura Thomas-Walters, Tim Williamson and Paul Chandler respond to Jack Shenker's article that asked if the disruptive tactics of groups such as Extinction Rebellion and Just Stop Oil are working

I am an environmental social scientist and climate activist. As Jack Shenker describes in his article (The existential question for climate activists: have disruption tactics stopped working?, 6 March), Extinction Rebellion's recent decision to stop disrupting the public caused quite a fuss. Some people applauded the move, as they thought it would favourably shift public opinion, while others insisted public disruption needs to remain a primary tactic to garner wider attention.

Unfortunately, both camps are missing the point – once you have enough dedicated activists, the public is largely irrelevant to achieving political change. It is not the opinion, or even attention, of the public that matters; it is whether or not you are disrupting structures of power. Historical social movements have shown this repeatedly.
