What Myths About the Anthropocene Get Wrong

Thursday, April 18, 2024


These ten misconceptions underplay how much we have altered the global environment and undermine the new perspective we need to deal with a drastically changed world

Jan A. Zalasiewicz, Scott L. Wing and the Anthropocene Working Group

The concept of the Anthropocene epoch was born in February 2000 out of a moment of spontaneity. Chemist and Nobel Prize winner Paul Crutzen had been listening to a narrative emerging at an international convening of scientists in Mexico.

All day, scientists had presented data that showed how the human-caused changes in climate, chemical cycles and biology of recent decades were jarringly different from the relative stability of the Holocene, the geological epoch that began 11,700 years prior. They kept referring to the remarkably rapid environmental changes of the late Holocene.

Exasperated, Crutzen finally broke into the discussion: “We aren’t in the Holocene anymore, we’re in … the Anthropocene!” The improvised term quickly caught fire as a foundational concept among earth scientists, and in the last decade the word has proliferated through other sciences, the arts, humanities and popular culture.

Along the way, “Anthropocene” gained many meanings and implications unrelated to—or even opposing—Crutzen’s original concept, blurring and sometimes wholly obscuring its original meaning. But what did Crutzen intend by the Anthropocene, a concept since enhanced and refined by years of scientific study?

It’s absurdly simple. The shift from the Holocene to the Anthropocene epoch hits like a brick wall in graphs that show changes in three major greenhouse gases and in global temperature over the last 30 millennia. All four of these critical planetary parameters shift from near-horizontal to near-vertical lines in the last 70 years or so. The graphs are simple, but they show changes in atmospheric chemistry and—lagging a little behind—temperature that affect the habitability of the planet for all its organisms, including humans. On a time scale of millennia, the shifts don’t resemble a hockey stick so much as a stair step. Furthermore, these changes affect the whole atmosphere and ocean, so they are essentially irreversible on any human time scale. Our distant descendants will still be living with the planetary changes that humans have wrought in a single lifetime.

Greenhouse Gases Graphic
The stunning effect of humans on the atmosphere can be seen in the concentration of three important greenhouse gases: nitrous oxide, methane and carbon dioxide. These gases have increased far more in the last 70 years than in the previous 30,000 years or more. Global temperature has begun to spike as a result, and it will continue to rise as the full effect of higher greenhouse gas concentration is felt. Martin Head

If we zoom in on the time axis to look at just the last 300 years, ten human generations, we see remarkably large and rapid change in a whole range of factors that mark the effect of humans at a global scale: not just carbon emissions, but also production of metals, plastics, fertilizers, concrete and farm animals, and even a giant increase in the ultimate geological currency: sediment. The amount of sediment moved every year by humans now exceeds the amount moved by non-human processes by a factor of 15.

Cropping the time frame tightly in this way, we see that the global shifts are most rapid beginning in the mid-20th century. The Anthropocene Working Group, a body of 34 scientists from 14 countries constituted in 2009 by the International Commission on Stratigraphy, proposed placing the beginning of a new Anthropocene Epoch in 1952, when sediments are marked globally by the first major increase in the element plutonium, derived from the earliest tests of thermonuclear weapons.

Anthropocene Graphic
Scientists proposed recognizing a new geological epoch, the Anthropocene, marked by rapid changes beginning in the mid-20th century. Sediments deposited in the last 70 years are marked by abundant artificial materials including concrete, metals, plastics and fertilizer. Ecosystems have also been transformed by the great increases in fertilizer production (ammonia) and raising livestock (meat production). Humans are also prodigious producers of sediment. Colin Waters

By proposing a formal, geologically defined Anthropocene epoch, the working group intended to provide a precise definition for this recent, large, permanent and rapid transition in Earth’s physical, chemical and biological systems.

The proposal was rejected by the international hierarchy of stratigraphy—of which the International Commission on Stratigraphy is a part—without citing substantive reasons. Public criticisms of the Anthropocene, meanwhile, have come from a range of sources, from within the heart of geology to well outside it, in the social sciences and humanities.

Hoover Dam
Tourists look down at the Hoover Dam. The amount of sediment settled behind the world’s thousands of big dams would cover all of California to a depth of five meters. Robert Nickelsberg / Getty Images

Across a spectrum of disciplines, the Anthropocene touched—and often jabbed—a nerve: sometimes as a gut response to a disturbing new idea and sometimes with discomfort at unfamiliar sociopolitical implications. For whatever reasons, the Anthropocene came under fire.

But the barrage of criticism has often focused on what the Anthropocene isn’t rather than what it is. Fundamental misconceptions have come to surround this concept and to cloud its meaning. Here we debunk ten common myths about the Anthropocene.

1. The Anthropocene fails to represent all human impacts.

This is true enough—but it misses the point entirely. Recognizing an Anthropocene epoch does not at all underplay the impacts that humans have caused for many millennia by hunting, by farming, and by building cities and trade networks. But those early impacts were not global, were not synchronous around the planet and did not shift the global environment permanently. The reason for naming a new geological epoch, both in Crutzen’s original formulation and in the highly detailed proposal of the working group, is to mark the departure of the Earth and its inhabitants from the stable planetary system of the Holocene. The Anthropocene epoch was never meant to encompass all anthropogenic impacts.

2. The Anthropocene is too short to be a geological epoch—just one human lifetime.

The Anthropocene’s duration is short, true—so far. But it’s the Holocene that shows the greatest change in duration from other epochs: more than two orders of magnitude (0.0117 million years versus 2.57 million years for the Pleistocene epoch that precedes it). The difference in duration between the Holocene and Anthropocene epochs is proportionately less, and the Anthropocene represents far more significant and enduring change to the planet than does the Holocene.
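Spelling out the arithmetic behind that comparison, using only the durations quoted above:

\[ \frac{T_{\text{Pleistocene}}}{T_{\text{Holocene}}} = \frac{2.57\ \text{Myr}}{0.0117\ \text{Myr}} \approx 220 \approx 10^{2.3} \]

So the Holocene is shorter than its predecessor by a factor of a couple hundred, while the Anthropocene (roughly 70 years so far, against the Holocene’s 11,700) trails the Holocene by a comparable factor of about 170.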

3. The Anthropocene is just a blip in Earth history.

Or, as the New York Times writes, a senior member of the geological time-scale hierarchy calls it “a blip of a blip of a blip.” What this point of view misunderstands is that these approximately 70 years have altered the planet fundamentally and set it on a new trajectory. Already, many geological signals are sharper than, and as pronounced as, the sudden carbon release and global warming that initiated the Eocene epoch 56 million years ago.

Take just the climate impacts from burning fossil fuels, of which 90 percent have been burned in the last 70 years. These impacts will roll across the planet for at least many thousands of years. We and many generations to come are locked into a climate unlike that of the Holocene. Carbon dioxide already in the atmosphere will make the Earth hotter than it has been for at least 3 million years. Many of the biological changes of the last 70 years are permanent, too: extinctions, of course, but also the spread of many species through the intended and unintended assistance of humans, making fauna and flora more homogeneous worldwide. The biosphere has been changed forever. This is no blip.

4. Anthropocene strata are “minimal” or “negligible.”

That’s a very geological objection—but it’s wrong. Humans have, since the mid-20th century, been prodigious reshapers of the landscape and movers of rock and sediment (now exceeding natural sediment movers such as glaciers and rivers by more than an order of magnitude). The amount of sediment settled behind the world’s thousands of big dams would cover all of California to a depth of five meters, and such sediments are full of distinctive markers, like pesticide residues, metals, microplastics and the fossils of invasive species. To define a time period formally, geologists must identify distinctive signals in sediments or rocks that can be correlated around the globe, and the presence of such markers is ubiquitous. The geology is real.
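As a rough check on that memorable image (assuming California’s land area of about 424,000 square kilometers, a figure not given in the article), the implied volume of dam-trapped sediment is:

\[ V \approx 4.24\times10^{5}\ \text{km}^2 \times 5\times10^{-3}\ \text{km} \approx 2.1\times10^{3}\ \text{km}^3 \]

on the order of two thousand cubic kilometers of material.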

Plastic Pollution in California
Plastic debris collects after a rainstorm near Culver City, California. Microplastics that result from such debris can often be found in sediment. Citizen of the Planet / UIG via Getty Images

5. The geological record is too complex and gradational to draw one single boundary for the Anthropocene.

All of history (of Earth and of humans) is complex, is gradational and varies through time and across space. Nevertheless, geologists define epochs because such time units are useful, indeed indispensable to their work. In geology, each time unit is precisely defined by a “golden spike”—a specified level in a sedimentary succession at a specified location that is chosen because it can be correlated to other sedimentary sequences around the globe. This golden spike identifies a global time plane, but the planetary transition that motivates the placement of a golden spike can be anything but simple.

The last ice age of the Pleistocene gave way to Holocene interglacial conditions over the course of about 13,000 years—and took a different course between Northern and Southern Hemispheres. Yet the defined Holocene boundary within that transition, at 11,700 years ago, is accepted and used without complaint. The Holocene-Anthropocene transition is much sharper and more globally synchronous, and so is easier to define and recognize.

6. Other animals have affected the environment and caused geological change, so there’s nothing special about the Anthropocene.

Other animals have indeed changed the environment, but that can help rather than hinder the recognition of geological time intervals. For instance, the rise of mobile, muscular animals that could burrow through sediment serves as the basis for defining the Cambrian Period. But none of those previous changes has swept across all environments on the planet so quickly—or been triggered by an animal conscious of the changes it was making. This consciousness, we note, is yet to be effectively translated into action to ward off the worst consequences of these changes. Too many still pursue economic and industrial development without considering the long-term cost to planetary health.

7. The Anthropocene blames all humans equally for the global environmental crises.

The Anthropocene assigns neither blame nor credit; it simply recognizes a great, abrupt and more or less permanent change to the course of Earth history. There is no doubt that some humans, societies, institutions and nation-states have driven far more change than others, and that the benefits and costs of change have been and are unevenly distributed. The societal value of the Anthropocene epoch is that it announces the unambiguous scientific evidence showing that humans have permanently changed the global environment. And it might encourage us to recognize that we all must deal with the rapid, permanent, global changes that are underway.

8. The Anthropocene signals defeat in our efforts to mitigate environmental change.

The first step in solving problems is to diagnose them. We cannot return the Earth to the conditions in which our grandparents or any other Holocene generation lived. But we can make wiser decisions about the future that will ameliorate and mitigate change. That’s realism, not defeatism.

9. Naming the Anthropocene after humans is hubristic.

The planetary transformation that ushered in the Anthropocene epoch was caused by humans. It could have been called a lot of things, but Anthropocene caught the imagination of many because its meaning is evident and accurate.

10. Officially, we are still living in the Holocene.

If only that were true. Accepting that we are no longer living in a Holocene world is a first step in addressing the issues facing humans and non-humans in the immediate future.

These myths have persisted in the scientific community despite being systematically refuted in scientific papers by the Anthropocene Working Group and others. This suggests that, like all myths, they are reactions based on ideology, conviction or personal philosophy rather than evidence. These misconceptions lie at the heart, too, of the recent formal rejection of the Anthropocene epoch by the hierarchy of international stratigraphy.

Why has the Anthropocene been misunderstood and mythologized in so many ways? Probably because it’s deeply uncomfortable to many. It’s very brief (so far). It includes smelly landfill sites as strata to “foul up” a geological time scale that is sacrosanct to many geologists. And it raises the specter that the calm abstractions of geological time have come up against the tough predicaments we face in the present and future.

Change is hard, and the Anthropocene is an uncomfortable concept. It is hard to accept that we as a society have gained so much power to change the Earth and have thought so little about how to use that power. Scientific knowledge can transform our perspectives (think of heliocentrism and evolution)—so it’s not surprising that the Anthropocene is hard to accept. But recognizing our role in suddenly, recently driving the Earth toward a new future is a necessary first step to engaging with the planetary changes we have set in train.


In Alaska’s Warming Arctic, Photos Show an Indigenous Elder Passing Down Hunting Traditions

An Inupiaq elder teaches his great-grandson to hunt in rapidly warming Northwest Alaska where thinning ice, shifting caribou migrations and severe storms are reshaping life

KOTZEBUE, Alaska (AP) — The low autumn light turned the tundra gold as James Schaeffer, 7, and his cousin Charles Gallahorn, 10, raced down a dirt path by the cemetery on the edge of town. Permafrost thaw had buckled the ground, tilting wooden cross grave markers sideways. The boys took turns smashing slabs of ice that had formed in puddles across the warped road.

Their great-grandfather, Roswell Schaeffer, 78, trailed behind. What was a playground to the kids was, for Schaeffer – an Inupiaq elder and prolific hunter – a reminder of what warming temperatures had undone: the stable ice he once hunted seals on, the permafrost cellars that kept food frozen all summer, the salmon runs and caribou migrations that once defined the seasons.

Now another pressure loomed. A 211-mile mining road that would cut through caribou and salmon habitat was approved by the Trump administration this fall, though the project still faces lawsuits and opposition from environmental and Native groups. Schaeffer and other critics worry it could open the region to outside hunters and further devastate already declining herds. “If we lose our caribou – both from climate change and overhunting – we’ll never be the same,” he said. “We’re going to lose our culture totally.”

Still, Schaeffer insists on taking the next generation out on the land, even when the animals don’t come. It was late September, and he and James would normally have been at their camp hunting caribou. But the herd has been migrating later each year and still hadn’t arrived – a pattern scientists link to climate change, mostly caused by the burning of oil, gas and coal. So instead of caribou, they scanned the tundra for swans, ptarmigan and ducks.

Caribou antlers are stacked outside Schaeffer’s home. Traditional seal hooks and whale harpoons hang in his hunting shed. Inside, a photograph of him with a hunted beluga is mounted on the wall beside the head of a Dall sheep and a traditional mask his daughter Aakatchaq made from caribou hide and lynx fur.

Schaeffer got his first caribou at 14 and began taking his own children out at 7. James made his first caribou kill this past spring with a .22 rifle. Schaeffer teaches James what his father taught him: that power comes from giving food and a hunter’s responsibility is to feed the elders. “When you’re raised an Inupiaq, your whole being is to make sure the elders have food,” he said.

But even as he passes down those lessons, Schaeffer worries there won’t be enough to sustain the next generation – or to sustain him. “The reason I’ve been a successful hunter is the firm belief that, when I become old, people will feed me,” he said. “My great-grandson and my grandson are my future for food.”

That future feels tenuous. These days, they’re eating less hunted food and relying more on farmed chicken and processed goods from the store. The caribou are fewer, the salmon scarcer, the storms more severe. Record rainfall battered Northwest Alaska this year, flooding Schaeffer’s backyard twice this fall alone. He worries about the toll on wildlife and whether his grandchildren will be able to live in Kotzebue as the changes accelerate. “It’s kind of scary to think about what’s going to happen,” he said.

That afternoon, James ducked into the bed of Schaeffer’s truck and aimed into the water. He shot two ducks. Schaeffer helped him into waders – waterproof overalls – so they could collect them and bring them home for dinner, but the tide was too high. They had to turn back without collecting the ducks.

The changes weigh on others, too.
Schaeffer’s friend, writer and commercial fisherman Seth Kantner, grew up along the Kobuk River, where caribou once reliably crossed by the hundreds of thousands. “I can hardly stand how lonely it feels without all the caribou that used to be here,” he said. “This road is the largest threat. But right beside it is climate change.”

Changes to polar bear DNA could help them adapt to global heating, study finds

Scientists say bears in southern Greenland differ genetically from those in the north, suggesting they could adjust.

Changes in polar bear DNA that could help the animals adapt to warmer climates have been detected by researchers, in a study thought to be the first time a statistically significant link has been found between rising temperatures and changing DNA in a wild mammal species.

Climate breakdown is threatening the survival of polar bears. Two-thirds of them are expected to have disappeared by 2050 as their icy habitat melts and the weather becomes hotter.

Now scientists at the University of East Anglia have found that some genes related to heat stress, ageing and metabolism are behaving differently in polar bears living in south-east Greenland, suggesting they may be adjusting to warmer conditions.

The researchers analysed blood samples taken from polar bears in two regions of Greenland and compared “jumping genes”: small, mobile pieces of the genome that can influence how other genes work. Scientists looked at the genes in relation to temperatures in the two regions and at the associated changes in gene expression.

“DNA is the instruction book inside every cell, guiding how an organism grows and develops,” said the lead researcher, Dr Alice Godden. “By comparing these bears’ active genes to local climate data, we found that rising temperatures appear to be driving a dramatic increase in the activity of jumping genes within the south-east Greenland bears’ DNA.”

As local climates and diets evolve as a result of changes in habitat and prey forced by global heating, the genetics of the bears appear to be adapting, with the group of bears in the warmest part of the country showing more changes than the communities farther north. The authors of the study have said these changes could help us understand how polar bears might survive in a warming world, inform understanding of which populations are most at risk and guide future conservation efforts. This is because the findings, published on Friday in the journal Mobile DNA, suggest the genes that are changing play a crucial role in how different polar bear populations are evolving.

Godden said: “This finding is important because it shows, for the first time, that a unique group of polar bears in the warmest part of Greenland are using ‘jumping genes’ to rapidly rewrite their own DNA, which might be a desperate survival mechanism against melting sea ice.”

Temperatures in north-east Greenland are colder and less variable, while in the south-east there is a much warmer and less icy environment, with steep temperature fluctuations. DNA sequences in animals change over time, but this process can be accelerated by environmental stress such as a rapidly heating climate.

There were some interesting DNA changes, such as in areas linked to fat processing, that could help polar bears survive when food is scarce. Bears in warmer regions had more rough, plant-based diets compared with the fatty, seal-based diets of northern bears, and the DNA of south-eastern bears seemed to be adapting to this.

Godden said: “We identified several genetic hotspots where these jumping genes were highly active, with some located in the protein-coding regions of the genome, suggesting that the bears are undergoing rapid, fundamental genetic changes as they adapt to their disappearing sea ice habitat.”

The next step will be to look at other polar bear populations, of which there are 20 around the world, to see if similar changes are happening to their DNA. This research could help protect the bears from extinction.
But the scientists said it was crucial to stop temperature rises accelerating by reducing the burning of fossil fuels. Godden said: “We cannot be complacent, this offers some hope but does not mean that polar bears are at any less risk of extinction. We still need to be doing everything we can to reduce global carbon emissions and slow temperature increases.”

A Deadly Pathogen Decimated Sunflower Sea Stars. Look Inside the Lab Working to Bring Them Back by Freezing and Thawing Their Larvae

For the first time, scientists have cryopreserved and revived the larvae of a sea star species. The breakthrough, made with the giant pink star, gives hope the technique could be repeated to save the imperiled predator

Juvenile sunflower sea stars at the Sunflower Star Laboratory in Moss Landing, California. At this phase, each is less than an inch wide, but they can grow to be more than three feet across as adults. Avery Schuyler Nunn

Key takeaways: Recovering sunflower sea stars by freezing them in time

Ravaged by infectious bacteria, sunflower sea stars literally wasted away across the Pacific coast of North America—and their resulting population crash destabilized kelp forest ecosystems. Scientists pioneered a cryopreservation technique on the closely related giant pink star, raising hopes that a bank of frozen sunflower star larvae could one day be thawed in the same way and released into the wild.

Along a working California harbor, where gulls wheel over weathered pilings and the old Western Flyer—the ship John Steinbeck once sailed to the Sea of Cortez—sits restored in its berth, researchers buzz about in a modest lab tucked between warehouses and boatyards. Inside, amid the hiss of pumps and the faint smell of brine from seawater tables, a scientist lifts a small vial from a plume of liquid nitrogen, its frosted casing holding the tiniest flicker of hope for a species on the brink.

Each of the 18 vials contains between 500 and 700 larval giant pink sea stars. At this stage, they are tiny specks suspended in seawater, invisible to the naked eye. These particular larvae have been cryopreserved and stored at roughly minus 180 degrees Celsius since March. At the Sunflower Star Laboratory (SSL) in Moss Landing, California, scientists thawed the larval pink sea stars and coaxed them to successfully develop into juveniles this summer—a first for any sea star species. In October, the scientists thawed another batch of larvae from the same cohort to test larval growth and survival under different freezing conditions and thawing protocols.

The breakthrough, however, isn’t really about the giant pink star, a species that’s common in the wild. Instead, these larvae serve as a crucial stand-in for the far more imperiled sunflower sea star (Pycnopodia helianthoides)—a vanishing species for which larvae are precious, limited and increasingly difficult to obtain. Perfecting cryopreservation methods on pink stars—ensuring they can survive freezing, resume feeding and grow into juveniles—lays the scientific groundwork for facilitating a return of Pycnopodia.

The contents of a thawed vial are placed under a microscope to assess viability of the larvae. Avery Schuyler Nunn

The discovery arrives at a precarious time, as sunflower stars have disappeared at a pace rarely seen in marine ecosystems. As a mysterious pathogen ravaged their population along the western shores of North America beginning in 2013, the creatures collapsed from an estimated six billion individuals to functional extinction in parts of their range—all within just a few years. Their loss left kelp forests with dramatically fewer predators, destabilizing ecosystems across the Pacific coast and allowing urchins to proliferate and graze formerly lush underwater canopies into barren rock. Now, scientists hope that “freezing” their larvae will offer a new avenue for bringing the species back.
“Cryopreservation is particularly important on the population level when thinking about recovery for this endangered species, because it had major population losses,” says Marissa Baskett, an environmental scientist at the University of California, Davis, who was not involved in the project. The process lets scientists preserve the sea stars’ existing genetic diversity for future reintroduction to the wild, she adds. “Especially given the uncertainty about different disease outbreaks, having that stock to return to is incredibly valuable.”

A mysterious and “complete collapse”

Sunflower sea stars have long lived in abundance up and down the rugged Pacific coast—from Alaskan archipelagoes to Baja California. The 24-limbed echinoderms sprawled across the seafloor in shades of ochre, crimson and violet. Among the fastest-moving and largest of all sea stars—capable of stretching nearly three feet across—these radiant predators coursed through kelp forests, voraciously hunting purple sea urchins and preventing them from over-grazing on the holdfasts that root towering golden canopies of kelp.

An adult sunflower sea star has 24 limbs and can be more than three feet wide. This one was photographed off Point Dume State Beach near Los Angeles. Brent Durand via Getty Images

“In Northern California and Oregon, there historically would have been multiple keystone predators within the kelp forest ecosystem who are punching on purple urchins and keeping their population in check,” says Reuven Bank, board chair of SSL. “But the southern sea otter was extirpated across its historic range, so we were left with sunflower stars being the last major keystone predator of purple urchins across over 100 miles of coastline.”

“And sunflower stars didn’t just eat urchins, they scared them,” Bank adds. “Urchins can smell a sunflower star approaching, and in healthy kelp forests they hide more and graze less. Even without consuming them, sunflower stars helped keep urchin behavior, and therefore kelp forests, in balance.”

Then, in June 2013, tidepool monitors along Washington’s Olympic Peninsula documented an unprecedented sight. The once-sturdy sea stars had turned soft, pale and contorted, their arms curling and detaching from their bodies. By late summer, the same mysterious affliction had surfaced in British Columbia, and it began sweeping both north and south with startling speed. The emerging epidemic, which caused the invertebrates to literally disintegrate, would soon be known as sea star wasting disease.

An infamous marine heatwave—nicknamed “The Blob”—had settled over the Pacific by 2014, thrusting the coast into a fever. Ocean temperatures spiked, likely speeding up the disease progression in already stressed sea stars and leading to higher mortality. In the warm, stagnant water, infected sunflower stars dissolved at an eerily rapid pace, leaving behind ghost-white films of bacterial mass where the vibrant predators had been just days before.

“You’d have apparently healthy stars basically melt away into puddles of goo within 48 hours,” says Andrew Kim, lab manager at SSL. “It happened so quickly, and I don’t think folks were prepared for the ensuing ecosystem shift. You don’t often expect diseases to come through and totally reshape ecosystem dynamics within such a short period. But that’s what we saw.”

Without sunflower sea stars to keep those spiny purple urchins in check, the balance began to falter, setting the stage for an unprecedented chain reaction.
Urchin populations skyrocketed, grazing on kelp without limits, and once-thriving underwater forests collapsed into barren rock.

A dense group of purple sea urchins, which exploded in population after the sunflower sea stars disappeared, photographed near Mendocino Headlands State Park, north of San Francisco. Brent Durand via Getty Images

In California, with 99 percent loss, sunflower sea stars are now considered functionally extinct. “Even though there may be a few remnant individuals left, they can no longer fulfill their historic role in the ecosystem,” Bank says.

As sunflower stars unraveled in the wild, another species—its thick-armed cousin, the giant pink star—offered an unexpected foothold for hope. The pink stars share a nearly identical geographic range and life history with sunflower stars, and crucially, their larvae can be raised in aquaria. If scientists could learn to freeze and revive the pink star in its early life stages, they wondered, could that knowledge become a lifeline for the sunflower star? That’s where the small team in Moss Landing stepped in.

Freezing sea stars for the future

What these scientists did was something no one had ever pulled off with a sea star. Working with giant pink stars, researchers spawned adults at the Aquarium of the Pacific in Long Beach, California, fertilized their gametes to produce thousands of larvae, and shipped those microscopic bodies to the Frozen Zoo—a cryopreserved archive of creatures operated by the San Diego Zoo Wildlife Alliance. There, reproductive scientists plunged the larvae into liquid nitrogen, cooling them to extremely low temperatures and pausing their cells’ biological activity. The larvae, essentially frozen in time, were shielded from ice crystal damage with special cryoprotectant mixtures.

Sunflower Star Laboratory researchers remove a vial of pink star larvae from an insulated cooler at around minus 180 degrees Celsius in preparation for thawing. Avery Schuyler Nunn

After months in this suspended state, the larvae were sent to the Sunflower Star Laboratory, where Carly Young, a San Diego Zoo Wildlife Alliance scientist who advances cryopreservation and reproductive-rescue tools, led the team in thawing the vials. She had fine-tuned the ideal way to keep the larvae alive as they returned to real-world temperatures, carefully testing more than 100 “recipes” with various warming rates, cryoprotectant dilutions and rehydration steps. The pink star larvae not only survived thawing, but have thus far lived all the way through metamorphosis into juveniles. Scientists watched the little stars settle spontaneously along the bottom of their beakers just 19 days after revival.

The success prompted the team to apply the same cryopreservation protocols to sunflower star larvae from the Alaska SeaLife Center. The larvae will be frozen in perpetuity, creating the first-ever cryopreserved archive of the species—like a seed bank, but for the baby sea stars.

“A famous quote from the ’70s, when the Frozen Zoo in San Diego was established, was, ‘You must collect things for reasons you don’t yet understand,’” says Ashley Kidd, conservation project manager at SSL. “We don’t know when the other shoe is going to drop and what populations are going to look like as the planet changes.
So, rather than chasing ghosts around the ocean floor, we really focused on what we can do with animals that are currently under human care somewhere.”

While cryopreservation itself isn’t a ready-made restoration tool, it opens the door to conserving genetic diversity of a species and banking rare lineages for potential reintroduction to the wild. In the 1970s and 1990s, researchers began testing cryopreservation of marine invertebrates with sperm and larvae, establishing the basic protocols that this team could apply to sea stars. The breakthrough doesn’t restore kelp forests by itself, but the SSL scientists note that cryopreservation creates something the conservation community has desperately needed: time. Time to hold onto genetic diversity, time to refine captive rearing and time to prepare for future reintroduction at scales big enough to matter. The ultimate test, the researchers say, will be translating the thawing process to sunflower sea stars.

Carly Young, at the Sunflower Star Laboratory, looks for movement in the young sea stars. Avery Schuyler Nunn

Just this summer, scientists uncovered a piece of the puzzle that had eluded them for more than a decade: the pathogen behind sea star wasting disease. In a four-year international effort, researchers traced the outbreak to a strain of the marine bacterium Vibrio pectenicida. When cultured and injected into healthy sea stars, it reproduced the telltale symptoms—softening arms, rapid disintegration and death within days. The finding, published in Nature Ecology and Evolution in August, gives recovery teams a way to test for the pathogen in labs and hatcheries, tighten quarantine measures and understand disease risks before returning captive-bred sea stars to the Pacific.

“It’s massively important to know what to look for, and the fact that we are now able to test for this disease is going to be critical in advancing our ability to move forward with reintroductions and continuing the research,” notes Kim. “We’ve already been able to take fluid samples from all of our stars and get them analyzed for the presence of Vibrio pectenicida, so we’ve mobilized very quickly on the heels of development.”

Paired with this new diagnostic clarity, advances in cryopreservation offer a second front in the effort to save the species. Frozen larvae can be stored for decades and offer flexibility for selective breeding of disease-tolerant traits, notes the team. Cryopreservation adds another tool to the scientists’ toolbox as they fight to prevent the species—and, in turn, its ecosystem—from wasting away.

“Bringing back sunflower stars,” Bank says, “is the single-most important step we can take toward restoring kelp forest balance.”

Archaeologists Are Unraveling the Mysteries Behind Deep Pits Found Near Stonehenge

Based on a comprehensive study, researchers are now convinced the shafts were human-made, likely dug during the Late Neolithic period roughly 4,000 years ago

Sarah Kuta, Daily Correspondent
December 10, 2025

The pits are evenly spaced around a large circle. University of Bradford

In 2020, archaeologists in the United Kingdom made a surprising discovery. At Durrington Walls, a large Neolithic henge not far from Stonehenge, they found more than a dozen large, deep pits buried under layers of loose clay.

The pits are mysterious. Each one measures roughly 30 feet wide by 15 feet deep, and together they form a mile-wide circle around Durrington Walls and neighboring Woodhenge. They also appear to be linked with the much older Larkhill causewayed enclosure, built more than 1,000 years before Durrington Walls. For the last few years, archaeologists have been puzzling over their origins: Were they dug intentionally by human hands? Were they naturally occurring structures, like sinkholes? Or is there some other possible explanation for the existence of these colossal shafts?

Quick fact: The purpose of Durrington Walls
While Stonehenge is thought to have been a sacred place for ceremonies, Durrington Walls was a place where people actually lived.

In a new paper published in the journal Internet Archaeology, archaeologists report that they have a much better understanding of the pits’ purpose, chronology and environmental setting. And, now, they are confident the shafts were made by humans. “They can’t be occurring naturally,” says lead author Vincent Gaffney, an archaeologist at the University of Bradford, to the Guardian’s Steven Morris. “It just can’t happen. We think we’ve nailed it.”

Chris Gaffney, an archaeologist at the University of Bradford, surveys the ground near Durrington Walls. University of Bradford

For the study, researchers returned to the site in southern England and used several different methods to further analyze the unusual structures. They used a technique known as electrical resistance tomography to calculate the pits’ depths, and radar and magnetometry to suss out their shapes. They also took core samples of the sediment, then ran the soil through a variety of tests. For instance, they used optically stimulated luminescence to determine the last time each layer of soil had been exposed to the sun. They also looked for traces of animal or plant DNA.

Together, the results of these analyses indicate humans must have been involved, which suggests the pits could be “one of the largest prehistoric structures in Britain, if not the largest,” Gaffney tells the BBC’s Sophie Parker. Researchers suspect the pits were created by people living at the site over a short period of time during the Late Neolithic period roughly 4,000 years ago. They were not “simply dug and abandoned” but, rather, appear to have been part of a “structured, monumental landscape that speaks to the complexity and sophistication of Neolithic society,” Gaffney says in a statement. For example, the pits are fairly evenly spaced around the circle, which suggests their Neolithic creators were measuring the distances between them somehow.
“The skill and effort that must have been required to not only dig the pits, but also to place them so precisely within the landscape is a marvel,” says study co-author Richard Bates, a geophysicist at the University of St Andrews, in a statement. “When you consider that the pits are spread over such a large distance, the fact they are located in a near perfect circular pattern is quite remarkable.”

Researchers used multiple methods to investigate the pits at Durrington Walls. University of Bradford

But who dug the pits? And, perhaps more importantly, why? Archaeologists are still trying to definitively answer those questions, but they suspect the shafts were created to serve as some sort of sacred boundary around Durrington Walls. Their creators may also have been trying to connect with the underworld, per the Guardian. “They’re inscribing something about their cosmology, their belief systems, into the earth itself in a very dramatic way,” Gaffney tells the BBC.

Is red meat bad for you? Limited research robs us of a clear answer.

We’d all appreciate more definitive guidance. Eating a varied diet is a wise move while we wait.

Over and over, we ask the question: Is Food X good or bad for you? And, over and over, belief in the answer — whether it’s yes or no — is held with conviction totally out of proportion with the strength of the evidence.

Today’s illustration: red meat. It has become one of the most-disputed issues in food. It’s so polarizing that some people decide to eat no meat at all, while others decide to eat only meat. It’s poison, or it’s the only true fuel.

The latest salvo in the Meat Wars was kicked off by a new report that outlines the optimal diet for both people and planet. The EAT-Lancet report comes down hard on red meat; its recommended daily intake is a mere 14 grams — that’s half an ounce.

Read on, and the news gets worse: “Because intake of red meat is not essential and appears to be linearly related to higher total mortality and risks of other health outcomes in populations that have consumed it for many years, the optimal intake may be zero.”

Note that word: “related.” It’s the source of the problem with the report and its recommendation.

The EAT-Lancet report, by researchers from 17 countries, bases its recommendation solely on observational data. When you do that, meat comes out looking pretty bad. In study after study, people who report eating a lot of meat have worse health outcomes than people who eat little. Meat-eating correlates with increased risk of heart disease, some cancers and all-cause mortality.

But, as always with observational research that attempts to connect the dots between diet and health, the key question is whether the meat itself, or something else associated with a meat-heavy lifestyle, is actually causing the bad outcomes.

That’s a hard question to answer, but there are clues that people who eat a lot of meat are very different from people who eat a little.

Let’s look at a study, published in JAMA Internal Medicine, cited by the EAT-Lancet report; it has a convenient demographic summary. According to it, people in the top one-fifth of meat eaters are different from people in the bottom fifth in a lot of important ways: They weigh more, they’re more likely to smoke, they’re not as well-educated, they get less exercise, and they report lower intakes of fruit, vegetables and fiber. On the plus side, they report drinking less alcohol. But other than that, we’re looking at a litany of markers for a lifestyle that’s not particularly health-conscious.

So, to suss out whether it’s the meat that’s raising disease risk, you have to somehow correct for any of the differences on that list — and most of that information also comes from observational research, so even the confounders are confounded.

Then there are the things you can’t correct for. Sleep quality, depression and screen time, for example, all correlate with some of the same diseases meat correlates with, but most studies have no information on those.

All this confounding explains one of my all-time favorite findings from observational research. It comes from the same study the demographics came from (analyzed in a 2015 paper). Sure enough, the people who ate the most meat were more likely to die of cancer and heart disease, but they were also more likely to die in accidents. And the biggest difference came from the catchall category “all others,” which invariably includes causes of death that have nothing to do with meat.

Basically, there’s a very simple problem with relying on observational research: People who eat a lot of meat are very different from people who eat less of it.
The meat definitely isn’t causing the accidental deaths (unless, perhaps, they’re tragic backyard grill mishaps), and it isn’t causing at least some of the “all others” deaths, so we know that heavy and light meat-eaters are different in all kinds of ways.

That’s where controlled trials come in.

In a perfect world, we could figure this out by keeping a large group of people captive for a lifetime, feeding half of them meat, and seeing what happens. Okay, maybe that’s not a perfect world, but it would be the best solution to this particular problem.

Instead, we have trials that are short-term (because of logistics and cost), and necessarily rely on markers for disease, rather than the disease itself. For that to be useful, you need a marker that’s a reliable indicator. For a lot of diseases — including cancer — those are hard to come by. For heart disease, we have a good one: low-density lipoprotein (LDL) cholesterol. So, most of the controlled trials of meat-eating focus on heart disease.

If you spend some time reading those trials (and I did, so you don’t have to), you find that most of them show some increase in LDL cholesterol, although it’s generally small.

A 2025 analysis of 44 controlled trials on meat found that the only ones showing positive cardiovascular outcomes had links to the meat industry, and even then, only about one in five came out positive. Of the independent studies, about three-quarters showed negative outcomes, and the remaining one-quarter was neutral.

This isn’t surprising. Red meat contains saturated fat, and we have countless trials that demonstrate sat fat’s ability to raise LDL. But if the meat you eat is relatively lean, that effect is going to be small.

The lesson here is that we don’t have a lot of good evidence on meat and health. The observational evidence is hopelessly confounded, and the evidence from clinical trials is woefully limited. There’s so much we simply don’t know. There may be other ways meat raises risk (leading to over-absorption of heme iron and stimulating the production of TMAO, or trimethylamine N-oxide), but there’s little definitive evidence for them. And, of course, there’s the question of what you eat instead. If you’re eating red meat instead of, say, instant ramen, that may be an improvement. If, instead, you’re cutting back on your lentils, not so much.

As always, the single-most important thing to remember about nutrition is that what we know is absolutely dwarfed by what we don’t know. Which means that, if you’re making decisions based on what we do know, you could very well be wrong.

So what’s an eater to do? Meat is a nutritious food. In fact, animal foods are the only natural sources of a vitamin we need — B12 — which is an indication that we evolved with meat and dairy as part of our diet. It’s very hard to know whether eating some lean meat leads to better outcomes than eating no meat, but I think some meat is a good hedge against all that uncertainty. (The ethical and environmental concerns are also important, but for today let’s focus on health.)

But plant foods are also nutritious. And eating a wide variety of them is also a good hedge against uncertainty. Which means the carnivore diet — all meat, all the time! — is a pretty bad bet.

Unfortunately, “uncertainty” is not a word that features prominently in the Meat Wars. Instead, we have an unappetizing combination of nastiness and sanctimony, with each camp convinced that the truth and the light are on their side.

Not that this is a metaphor for our times or anything.
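A minimal simulation makes the confounding problem described above concrete. This is a toy sketch with entirely hypothetical numbers, written in Python; the "health" variable stands in for everything (exercise, smoking, diet quality) that travels with meat intake. Meat has no effect on disease in this model, yet a naive comparison of heavy and light meat eaters still shows one, and the gap vanishes once you compare people at the same level of the confounder:

import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

# Hypothetical latent "health-consciousness" score: in this toy model it
# drives BOTH meat intake and disease risk, but meat itself does nothing.
health = rng.normal(size=n)

# Less health-conscious people eat more meat (note the minus sign).
meat = -health + rng.normal(size=n)

# Disease risk depends ONLY on health-consciousness, never on meat.
p_disease = 1.0 / (1.0 + np.exp(2.0 * health))
disease = rng.random(n) < p_disease

# Naive observational comparison: top fifth vs. bottom fifth of meat eaters.
heavy = meat > np.quantile(meat, 0.8)
light = meat < np.quantile(meat, 0.2)
print(f"naive: heavy eaters {disease[heavy].mean():.2f}, "
      f"light eaters {disease[light].mean():.2f}")  # large spurious gap

# "Adjusting" for the confounder: repeat the comparison within a narrow
# band of health-consciousness. The spurious meat effect all but vanishes.
band = np.abs(health) < 0.1
print(f"stratified: heavy {disease[heavy & band].mean():.2f}, "
      f"light {disease[light & band].mean():.2f}")

Real epidemiology adjusts with regression models rather than crude stratification, but the logic is the same, and so is the catch the column points out: you can only adjust for confounders you measured, and those measurements are themselves noisy.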
