
GoGreenNation News

Learn more about the issues presented in our films

As Amazon's 'Flying Rivers' Weaken With Tree Loss, Scientists Warn of Worsening Droughts

Scientists warn that “flying rivers” — invisible streams of moisture that carry rain from the Atlantic Ocean westward across the Amazon — are weakening as deforestation and climate change advance.

BOGOTA, Colombia (AP) — Droughts have withered crops in Peru, fires have scorched the Amazon and hydroelectric dams in Ecuador have struggled to keep the lights on as rivers dry up. Scientists say the cause may lie high above the rainforest, where invisible “flying rivers” carry rain from the Atlantic Ocean across South America.

New analysis warns that relentless deforestation is disrupting that water flow and suggests that continuing tree loss will worsen droughts in the southwestern Amazon and could eventually push those regions to shift from rainforest to drier savanna — grassland with far fewer trees.

“These are the forces that actually create and sustain the Amazon rainforest,” said Matt Finer, a senior researcher with Amazon Conservation’s Monitoring of the Andean Amazon Project (MAAP), which tracks deforestation and climate threats across the basin and carried out the analysis. “If you break that pump by cutting down too much forest, the rains stop reaching where they need to go.”

What are flying rivers and how do they work?

Most of the Amazon’s rainfall starts over the Atlantic Ocean. Moist air is pushed inland by steady winds that blow west along the equator, known as the trade winds. The forest then acts like a pump, effectively relaying the water thousands of miles westward as the trees absorb water, then release it back into the air.

Brazilian climate scientist Carlos Nobre was among the early researchers who calculated how much of the water vapor from the Atlantic would move through and eventually out of the Amazon basin. He and colleagues coined the “flying rivers” term at a 2006 scientific meeting, and interest grew as scientists warned that a weakening of the rivers could push the Amazon past a tipping point where rainforest would turn to savanna. That's important because the Amazon rainforest is a vast storehouse for the carbon dioxide that largely drives the world's warming.
Such a shift would devastate wildlife and Indigenous communities and threaten farming, water supplies and weather stability far beyond the region.

Warning signs in Peru and Bolivia

The analysis by Finer's group found that southern Peru and northern Bolivia are especially vulnerable. During the dry season, flying rivers sweep across southern Brazil before reaching the Andes — precisely where deforestation is most intense. The loss of trees means less water vapor is carried westward, raising the risk of drought in iconic protected areas such as Peru’s Manu National Park.

“Peru can do everything right to protect a place like Manu,” Finer said. “But if deforestation keeps cutting into the pump in Brazil, the rains that sustain it may never arrive.”

Nobre said as much as 50% of rainfall in the western Amazon near the Andes depends on the flying rivers.

Corine Vriesendorp, Amazon Conservation’s director of science based in Cusco, Peru, said the changes are already visible. “The last two years have brought the driest conditions the Amazon has ever seen,” Vriesendorp said. “Ecological calendars that Indigenous communities use — when to plant, when to fish, when animals reproduce — are increasingly out of sync. Having less and more unpredictable rain will have an even bigger impact on their lives than climate change is already having.”

Forest makes a fragile pump

MAAP researchers found that rainfall patterns depend on when and where the flying rivers cross the basin. In the wet season, their northern route flows mostly over intact forests in Guyana, Suriname and northern Brazil, keeping the system strong. But in the dry season — when forests are already stressed by heat — the aerial rivers cut across southern Brazil, where deforestation fronts spread along highways and farms and there are simply fewer trees to help move the moisture along.

“It’s during the dry months, when the forest most needs water, that the flying rivers are most disrupted,” Finer said.
Finer pointed to roads that can accelerate deforestation, noting that the controversial BR-319 highway in Brazil — a project to pave a road through one of the last intact parts of the southern Amazon — could create an entirely new deforestation front.

For years, scientists have warned about the Amazon tipping toward savanna. Finer said the new study complicates that picture. “It’s not a single, all-at-once collapse,” he said. “Certain areas, like the southwest Amazon, are more vulnerable and will feel the impacts first. And we’re already seeing early signs of rainfall reduction downwind of deforested areas.”

Nobre said the risks are stark. Amazon forests have already lost about 17% of their cover, mostly to cattle and soy. Those ecosystems recycle far less water. “The dry season is now five weeks longer than it was 45 years ago, with 20 to 30% less rainfall,” he said. “If deforestation exceeds 20 to 25% and warming reaches 2 degrees Celsius, there’s no way to prevent the Amazon from reaching the tipping point.”

Protecting intact forests, supporting Indigenous land rights and restoring deforested areas are the clearest paths forward, researchers say.

“To avoid collapse we need zero deforestation, degradation and fires — immediately,” Nobre said. “And we must begin large-scale forest restoration, not less than half a million square kilometers. If we do that, and keep global warming below 2 degrees, we can still save the Amazon.”

Finer said governments should consider new conservation categories specifically designed to protect flying rivers — safeguarding not just land but the atmospheric flows that make the rainforest possible.

For Vriesendorp, that means regional cooperation. She praised Peru for creating vast parks and Indigenous reserves in the southeast, including Manu National Park. But, she said, “this can’t be solved by one country alone. Peru depends on Brazil, and Brazil depends on its neighbors.
We need basin-wide solutions.”

The Associated Press’ climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content. Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org. Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

California needs biomass energy to meet its wildfire goals. Its projects keep going South

California needs to burn vegetation both for wildfire mitigation and to generate power. So why do biomass energy projects keep leaving the state?

Arbor Energy is, essentially, a poster child of the kind of biomass energy project California keeps saying it wants.

The state’s goal is to reduce wildfire risk on 1 million acres of wildlands every year, including by thinning overgrown forests, which is expected to generate roughly 10 million tons of wood waste annually. Arbor hopes to take that waste, blast it through a “vegetarian rocket engine” to produce energy, then sequester all of the carbon the process would generate underground.

California has billed Arbor — and the handful of other similarly aimed projects it’s financed — as a win-win-win: wildfire mitigation, clean energy and carbon sequestration all in one.

Yet, after Arbor initially won state financial backing for a pilot project in Placer County, the El Segundo-based company’s California ambitions fell through, like many biomass projects before it. Instead, it’s heading to Louisiana.

California, biomass energy advocates say, has struggled to get past its distrust of the technology, given traditional biomass’ checkered past of clear-cutting forests and polluting poorer communities. Further, the state’s strict permitting requirements have given residents tremendous power to veto projects and created regulatory headaches.

But many environmental groups argue it’s an example of California’s environmental and health protections actually working. If not done carefully, bioenergy projects run the risk of emitting carbon — not sequestering it — and polluting communities already grappling with some of the state’s dirtiest air.

“When you look at biomass facilities across California — and we’ve done Public Records Act requests to look at emissions, violations and exceedances ... the reality is that we’re not in some kind of idealized pen-and-paper drawing of what the equipment does,” said Shaye Wolf, climate science director at the Center for Biological Diversity.
“In the real world, there are just too many problems with failures and faults in the equipment.”

There are simpler and safer uses for this wood waste, these critics say: fertilizer for agriculture, wood chips and mulch. It may not provide carbon-negative energy but comes with none of the risks of bioenergy projects, they say. The Center for Biological Diversity and others advocate for a “hands-off” approach to California’s forests and urge management of the wildfire crisis through home hardening and evacuation planning alone. But fire and ecology experts say more than a century of fire suppression has made that unrealistic.

However, the sweeping forest-thinning projects these experts say are needed will cost billions, and so the state needs every source of funding it can get. “Our bottleneck right now is, how do we pay for treating a million acres a year?” said Deputy Chief John McCarthy of the California Department of Forestry and Fire Protection, who oversees the agency’s wood products and bioenergy program.

In theory, the class of next-generation biomass energy proposals popping up across California could help fund this work.

“California has an incredible opportunity,” said Arbor chief executive and co-founder Brad Hartwig. With the state’s leftover biomass from forest thinning, “we could make it basically the leader in carbon removal in the world.”

A lot of wood with nowhere to go

Biomass energy first took off in California in the 1980s after small pioneering plants at sawmills and food-processing facilities proved successful and the state’s utilities began offering favorable contracts for energy sources they deemed “renewable” — a category that included biomass. In the late ‘80s and early ‘90s, the state had more than 60 operating biomass plants, providing up to 9% of the state’s residential power. Researchers estimate the industry supported about 60,000 acres of forest treatment to reduce wildfire risk per year at the time.
But biomass energy’s heyday was short-lived. In 1994, the California Public Utilities Commission shifted the state’s emphasis away from creating a renewable and diverse energy mix and toward simply buying the cheapest possible power. Biomass — an inherently more expensive endeavor — struggled. Many plants took buyouts to shut down early. Despite California’s repeated attempts to revitalize the industry, the number of biomass plants continued to dwindle.

Today, only 23 biomass plants remain in operation, according to the industry advocate group California Biomass Energy Alliance. The state Energy Commission expects the number to continue declining because of aging infrastructure and a poor bioenergy market.

California’s forest and wildfire leadership are trying to change that. In 2021, Gov. Gavin Newsom created a task force to address California’s growing wildfire crisis. After convening the state’s top wildfire and forest scientists, the task force quickly came to a daunting conclusion: More than a century of fire suppression in California’s forests — especially in the Sierra Nevada — had dramatically increased their density, providing fires with ample fuel to explode into raging beasts.

To solve it, the state needed to rapidly remove that extra biomass on hundreds of thousands, if not millions, of acres of wildlands every year through a combination of prescribed burns, rehabilitation of burned areas and mechanical thinning of the forest.

McCarthy estimated treating a single acre of land could cost $2,000 to $3,000. At a million acres a year, that’s $2 billion to $3 billion annually. “Where is that going to come from?” McCarthy said. “Grants — maybe $200 million … 10% of the whole thing. So, we need markets. We need some sort of way to pay for this stuff and in a nontraditional way.”

McCarthy believes bioenergy is one of those ways — essentially, by selling the least valuable, borderline unusable vegetation from the forest floor.
You can’t build a house with pine cones, needles and twigs, but you can power a bioenergy plant.

However, while biomass energy has surged in Southern states such as Georgia, projects in California have struggled to get off the ground.

In 2022, a bid by Chevron, Microsoft and the oil-drilling technology company Schlumberger to revive a traditional biomass plant near Fresno and affix carbon capture to it fell through after the U.S. Environmental Protection Agency requested the project withdraw its permit application. Environmental groups including the Center for Biological Diversity and residents in nearby Mendota opposed the project.

This year, a sweeping effort supported by rural Northern California counties to process more than 1 million tons of biomass a year into wood pellets and ship them to European bioenergy plants (with no carbon capture involved) in effect died. The project, led by Golden State Natural Resources, faced pushback from watch groups that feared it would harm forests and from environmental justice groups that worried processing facilities at the Port of Stockton would worsen air quality in one of the state’s most polluted communities.

Arbor believed its fate would be different.

Bioenergy from the ground up

Before founding Arbor, Hartwig served in the California Air National Guard for six years and on a Marin County search and rescue team. He now recalls a common refrain on the job: “There is no rescue in fire. It’s all search,” Hartwig said. “It’s looking for bodies — not even bodies, it’s teeth and bones.”

In 2022, he started Arbor, with the idea of taking a different approach to bioenergy than the biomass plants shuttering across California.

To understand Arbor’s innovation, start with coal plants, which burn fossil fuels to heat up water and produce steam that turns a turbine to generate electricity. Traditional biomass plants work essentially the same way but replace coal with vegetation as the fuel.
Typically, the smoke from the burning vegetation is simply released into the air.

Small detail of the 16,000-pound proof-of-concept system being tested by Arbor that will burn biomass, capture carbon dioxide and generate electricity. (Myung J. Chun/Los Angeles Times)

Arbor’s solution is more like a tree-powered rocket engine. The company can use virtually any form of biomass, from wood to sticks to pine needles and brush. Arbor heats it to extreme temperatures while depriving it of the oxygen it would need to fully combust. The organic waste separates into a flammable gas — made of carbon monoxide, carbon dioxide, methane and hydrogen — and a small amount of solid waste.

The machine then combusts the gas at extreme temperatures and pressures, spinning a turbine at much higher rates than typical biomass plants achieve. The resulting carbon dioxide exhaust is then sequestered underground.

Arbor portrays its solution as a flexible, carbon-negative and clean device: It can operate anywhere with a hookup for carbon sequestration. Multiple units can work together for extra power. All of the carbon in the trees and twigs the machine ingests ends up in the ground — not back in the air.

But biomass watchdogs warn that previous attempts at technology like Arbor’s have fallen short. The process creates a dry, flaky ash mainly composed of minerals — essentially everything in the original biomass that wasn’t “bio” — that can include heavy metals the dead plants sucked up from the air or soil. If agricultural or construction waste is used, it can include nasty chemicals from wood treatments and pesticides. Arbor plans — at least initially — on using woody biomass directly from the forest, which typically contains fewer of these dangerous ash chemicals.

Turning wood waste into gas also generates a thick, black tar composed of volatile organic compounds — which are also common contaminants following wildfires.
The company says its gasification process uses temperatures high enough to break down the troublesome tar, but researchers say tar is an inevitable byproduct of this process.

Grant Niccum, left, Arbor lead systems engineer, and Kevin Saboda, systems engineer, at the company’s test site in San Bernardino. Biomass is fed into this component and then compressed to 100 times atmospheric pressure and burned to create a synthetic gas. (Myung J. Chun/Los Angeles Times)

Watchdogs also caution that the math to determine whether bioenergy projects sequester or release carbon is complicated and finicky.

“Biomass is tricky, and there’s a million exceptions to every rule that need to be accounted for,” said Zeke Hausfather, climate research lead with Frontier Climate, which vets carbon capture projects such as Arbor’s and connects them with companies interested in buying carbon credits. “There are examples where we have found a project that actually works on the carbon accounting math, but we didn’t want to do it because it was touching Canadian boreal forest that’s old-growth forest.”

Frontier Climate, along with the company Isometric, audits Arbor’s technology and operations. However, critics note that because both companies ultimately support the sale of carbon credits, their assessments may be biased.

At worst, biomass projects can decimate forests and release their stored carbon into the atmosphere. Arbor hopes, instead, to be a best-case scenario: improving — or at least maintaining — forest health and stuffing carbon underground.

When it all goes South

Arbor had initially planned to build a proof of concept in Placer County. To do it, Arbor won $2 million through McCarthy’s Cal Fire program and $500,000 through a state Department of Conservation program in 2023. But as California fell into a deficit in 2023, state funding dried up. So Arbor turned to private investors.
In September 2024, Arbor reached an agreement with Microsoft in which the technology company would buy carbon credits backed by Arbor’s sequestration. In July of this year, the company announced a $41-million deal (well over 15 times the funding it ever received from California) with Frontier Climate, whose carbon credit buyers include Google, the online payment company Stripe and Meta, which owns Instagram and Facebook. To fulfill the credits, it would build its first commercial facility near Lake Charles, La., in part powering nearby data centers.

“We were very excited about Arbor,” McCarthy said. “They pretty much walked away from their grant and said they’re not going to do this in California. … We were disappointed in that.”

But for Arbor, relying on the state was no longer feasible.

“We can’t rely on California for the money to develop the technology and deploy the initial systems,” said Hartwig, standing in Arbor’s plant-covered El Segundo office. “For a lot of reasons, it makes sense to go test the machine, improve the technology in the market elsewhere before we actually get to do deployments in California, which is a much more difficult permitting and regulatory environment.”

Rigger Arturo Hernandez, left, and systems engineer Kevin Saboda secure Arbor’s proof-of-concept system in the company’s San Bernardino test site after its journey from Arbor’s headquarters in El Segundo. The steel frame was welded in Texas while the valves, tubing and other hardware were installed in El Segundo. (Myung J. Chun/Los Angeles Times)

It’s not the first next-generation biomass company based in California to build elsewhere.
San Francisco-based Charm Industrial, whose technology doesn’t involve energy generation, began its sequestration efforts in the Midwest and plans to expand into Louisiana.

The American South has less stringent logging and environmental regulations, which has led biomass energy projects to flock to the area: In 2024, about 2.3% of the South’s energy came from woody biomass — up from 2% in 2010, according to the U.S. Energy Information Administration. Meanwhile, that figure on the West Coast was only 1.2%, continuing its slow decline.

And, unlike in the West, companies aiming to create wood pellets to ship abroad have proliferated in the South. In 2024, the U.S. produced more than 10.7 million tons of biomass pellets, 82% of which was exported. That’s up from virtually zero in 2000. The vast majority of the biomass pellets produced last year — 84% — came from the South.

Watchdogs warn that this lack of guardrails has allowed the biomass industry to harm the South’s forests, pollute poor communities living near biomass facilities and fall short of its climate claims. Over the last five years, Drax — a company that harvests and exports wood pellets and was working with Golden State Natural Resources — has had to pay Louisiana and Mississippi a combined $5 million for violating air pollution laws. Residents living next to biomass plants, like Drax’s, say the operations have worsened asthma and routinely leave a film of dust on their cars.

But operating a traditional biomass facility or shipping wood pellets to Europe was never Arbor’s founding goal — though powering data centers in the American South wasn’t exactly either. Hartwig, who grew up in the Golden State, hopes Arbor’s technology can someday return to California to help finance the solution for the wildfire crisis he spent so many years facing head-on.

“We’ve got an interest in Arkansas, in Texas, all the way up to Minnesota,” Hartwig said. “Eventually, we’d like to come back to California.”

3 Questions: Addressing the world’s most pressing challenges

Mihaela Papa discusses the BRICS Lab, her role at the Center for International Studies, and the center's ongoing ambition to tackle the world's most complex challenges in new and creative ways.

The Center for International Studies (CIS) empowers students, faculty, and scholars to bring MIT’s interdisciplinary style of research and scholarship to address complex global challenges. In this Q&A, Mihaela Papa, the center's director of research and a principal research scientist at MIT, describes her role as well as research within the BRICS Lab at MIT — a reference to the BRICS intergovernmental organization, which comprises the nations of Brazil, Russia, India, China, South Africa, Egypt, Ethiopia, Indonesia, Iran, and the United Arab Emirates. She also discusses the ongoing mission of CIS to tackle the world's most complex challenges in new and creative ways.

Q: What is your role at CIS, and what are some of your key accomplishments since joining the center just over a year ago?

A: I serve as director of research and principal research scientist at CIS, a role that bridges management and scholarship. I oversee grant and fellowship programs, spearhead new research initiatives, build research communities across our center's area programs and MIT schools, and mentor the next generation of scholars. My academic expertise is in international relations, and I publish on global governance and sustainable development, particularly through my new BRICS Lab.

This past year, I focused on building collaborative platforms that highlight CIS’ role as an interdisciplinary hub and expand its research reach. With Evan Lieberman, the director of CIS, I launched the CIS Global Research and Policy Seminar series to address current challenges in global development and governance, foster cross-disciplinary dialogue, and connect theoretical insights to policy solutions. We also convened a Climate Adaptation Workshop, which examined promising strategies for financing adaptation and advancing policy innovation.
We documented the outcomes in a workshop report that outlines a broader research agenda contributing to MIT’s larger climate mission.

In parallel, I have been reviewing CIS’ grant-making programs to improve how we serve our community, while also supporting regional initiatives such as research planning related to Ukraine. Together with the center's MIT-Brazil faculty director Brad Olsen, I secured a MITHIC [MIT Human Insight Collaboration] Connectivity grant to build an MIT Amazonia research community that connects MIT scholars with regional partners and strengthens collaboration across the Amazon. Finally, I launched the BRICS Lab to analyze transformations in global governance, with ongoing research on food security and on data centers in BRICS.

Q: Tell us more about the BRICS Lab.

A: The BRICS countries comprise the majority of the world’s population and an expanding share of the global economy. [Originally comprising Brazil, Russia, India, and China, BRICS currently includes 11 nations.] As a group, they carry the collective weight to shape international rules, influence global markets, and redefine norms — yet the question remains: Will they use this power effectively?

The BRICS Lab explores the implications of the bloc’s rise for international cooperation and its role in reshaping global politics. Our work focuses on three areas: the design and strategic use of informal groups like BRICS in world affairs; the coalition’s potential to address major challenges such as food security, climate change, and artificial intelligence; and the implications of U.S. policy toward BRICS for the future of multilateralism.

Q: What are the center’s biggest research priorities right now?

A: Our center was founded in response to rising geopolitical tensions and the urgent need for policy rooted in rigorous, evidence-based research. Since then, we have grown into a hub that combines interdisciplinary scholarship and actively engages with policymakers and the public.
Today, as in our early years, the center brings together exceptional researchers with the ambition to address the world’s most pressing challenges in new and creative ways.

Our core focus spans security, development, and human dignity. Security studies have been a priority for the center, and our new nuclear security programming advances this work while training the next generation of scholars in this critical field. On the development front, our work has explored how societies manage diverse populations, navigate international migration, and engage with human rights and the changing patterns of regime dynamics.

We are pursuing new research in three areas. First, on climate change, we seek to understand how societies confront environmental risks and harms, from insurance to water and food security in the international context. Second, we examine shifting patterns of global governance as rising powers set new agendas and take on greater responsibilities in the international system. Finally, we are initiating research on the impact of AI — how it reshapes governance across international relations, what role AI corporations play, and how AI-related risks can be managed.

As we approach our 75th anniversary in 2026, we are excited to bring researchers together to spark bold ideas that open new possibilities for the future.

Is there such a thing as a ‘problem shark’? Plan to catch repeat biters divides scientists

Some experts think a few sharks may be responsible for a disproportionate number of attacks. Should they be hunted down?

First was the French tourist, killed while swimming off Saint-Martin in December 2020. The manager of a nearby water sports club raced out in a dinghy to help, only to find her lifeless body floating face down, a gaping wound where part of her right thigh should have been. Then, a month later, another victim. Several Caribbean islands away, a woman snorkelling off St Kitts and Nevis was badly bitten on her left leg by a shark. Fortunately, she survived.

Soon after the fatal incident in December, Eric Clua, a marine biologist at the École Pratique des Hautes Études in Paris, got a phone call. Island nations often ask for his help after a shark bite, he says, “because I am actually presenting a new vision … I say, ‘You don’t have a problem with sharks, you have a problem with one shark.’”

Human-shark conflicts are not solely the result of accidents or happenstance, Clua says. Instead, he says there are such things as problem sharks: bold individuals that may have learned, perhaps while still young, that humans are prey. It’s a controversial stance, but Clua thinks that if it’s true – and if he can identify and remove these problem sharks – it might dissuade authorities from taking even more extreme forms of retribution, including culls.

A shark killed a man at Long Reef beach in Dee Why, Sydney, on 6 September 2025. Photograph: Dean Lewins/AAP

Though culls of sharks after human-shark conflict are becoming less common and are generally regarded by scientists as ineffective, they do still happen. One of the last big culls took place near Réunion, a French island in the Indian Ocean, between 2011 and 2013, resulting in the deaths of more than 500 sharks.
Even that was not enough for some – four years later, a professional surfer called for daily shark culls near the island.

And so, in the immediate aftermath of the French tourist’s death in Saint-Martin, when one of Clua’s contacts called to explain what had happened, he recalls telling them: “Just go there on the beach … I want swabbing of the wounds.”

After that bite and the one that occurred a month later, medical professionals collected samples of mucus that the shark had left behind to send off for analysis, though it took weeks for the results to come back. But as Clua and colleagues describe in a study published last year, the DNA analysis confirmed that the same tiger shark was responsible for both incidents.

Even before the DNA test was complete, however, analysis of the teeth marks left on the Saint-Martin victim, and of the tooth fragment collected from her leg, suggested the perpetrator was a tiger shark (Galeocerdo cuvier) roughly 3 metres (10ft) long. Armed with this knowledge, Clua and his colleagues set out to catch the killer.

During January and February 2021, Clua and his team hauled 24 tiger sharks from the water off Saint-Martin and analysed a further 25 sharks that they caught either around St Barts or St Kitts and Nevis.

Eric Clua and his colleagues took DNA samples from nearly 50 tiger sharks to try to find one that had bitten two women. Photograph: Courtesy of Eric Clua

Because both of the women who were bitten had lost a substantial amount of flesh, the scientists saw this as a chance to find the shark responsible. Each time they dragged a tiger shark out of the water they flipped it upside down, flooded its innards with water, and pressed firmly on its stomach to make it vomit. A shark is, generally, “a very easy puker”, Clua says. The team’s examinations turned up no evidence of human remains.

Clua and his colleagues also took DNA samples from each of the tiger sharks, as well as from dead sharks landed by fishers in St Kitts and Nevis.
None matched the DNA swabbed from the wounds suffered by the two women.

But the team has not given up. Clua is now waiting for DNA analysis of mucus samples recovered from a third shark bite that happened off Saint-Martin in May 2024. If that matches samples from the earlier bites, Clua says, that would suggest it “might be possible” to catch the culprit shark in the future.

For people who don’t want to risk interacting with sharks, I have great news – swimming pools exist
Catherine Macdonald, conservation biologist

That some specific sharks have developed a propensity for biting people is controversial among marine scientists, though Lucille Chapuis, a marine sensory ecologist at La Trobe University in Australia, is not entirely sure why. The concept of problem animals is well established on land, she says. Terrestrial land managers routinely contend with problem lions, tigers and bears. “Why not a fish?” asks Chapuis. “We know that fishes, including sharks, have amazing cognitive abilities.”

Yet among the marine scientists who shared opinions on Clua’s ideas, some rejected the concept of problem sharks outright.

A tiger shark. Some scientists fear that merely talking about problem sharks could perpetuate the preconception of human-eating monsters. Photograph: Jeff Milisen/Alamy

Clua is aware that his approach is divisive: “I have many colleagues – experts – that are against the work I’m doing.”

The biggest pushback is from scientists who say there is no concrete evidence for the idea that there are extra dangerous, human-biting sharks roaming the seas. Merely talking about problem sharks, they say, could perpetuate the idea that some sharks are hungry, human-eating monsters such as the beast from the wildly unscientific movie Jaws.

Clua says the monster from Jaws and his definition of a problem shark are completely different. A problem shark is not savage or extreme; it’s just a shark that learned at some point that humans are among the things it might prey on.
Environmental factors, as well as personality, might trigger or aggravate such behaviour.

Besides the tiger shark that struck off Saint-Martin and St Kitts and Nevis, Clua’s 2024 study detailed the case of another tiger shark involved in multiple bites in Costa Rica. A third case focused on an oceanic whitetip shark in Egypt that killed a female swimmer by biting off her right leg. The same shark later attempted to bite the shoulder of one of Clua’s colleagues during a dive.

Pilot fish follow an oceanic whitetip shark. A woman was killed when an oceanic whitetip bit off her right leg in Egypt. Photograph: Amar and Isabelle Guillen/Guillen Photo LLC/Alamy

Toby Daly-Engel, a shark expert at the Florida Institute of Technology, says the genetic analysis connecting the same tiger shark to two bite victims in the Caribbean is robust. However, she says such behaviour must be rare. “They’re just opportunistic. I mean, these things eat tyres.”

Diego Biston Vaz, curator of fishes at the Natural History Museum in London, also praises Clua’s work, calling it “really forensic”. He, too, emphasises it should not be taken as an excuse to demonise sharks. “They’re not villains; they’re just trying to survive,” he says.

Chapuis adds that the small number of animals involved in Clua’s recent studies means the research does not prove problem sharks are real. Plus, while some sharks might learn to bite humans, she questions whether they would continue to do so long term. People tend to defend themselves well and, given there are only a few dozen unprovoked shark bites recorded around the world each year, she says there is no data to support the idea that even the boldest sharks benefit from biting people.

Plus, Clua’s plan – to capture problem sharks and bring them to justice – is unrealistic, says David Shiffman, a marine conservation biologist based in Washington DC.
Even if scientists can prove beyond doubt that a few specific sharks are responsible for a string of incidents – “which I do not believe he has done”, Shiffman adds – he thinks finding those sharks is not viable.

Any resources used to track down problem sharks would be better spent on preventive measures such as lifeguards, who could spot sharks approaching a busy beach, says Catherine Macdonald, a conservation biologist at the University of Miami in Florida.

While identifying and removing a problem shark is better than culling large numbers, she urges people to answer harder questions about coexisting with predators. “For people who don’t want to risk interacting with sharks, I have great news,” she says. “Swimming pools exist.”

Identifying and removing a problem shark is often regarded as better than culling large numbers. Photograph: Humane Society International/AAP

Clua, for his part, intends to carry on. He’s working with colleagues on Saint-Martin to swab shark-bite injuries when they occur, and to track down potential problem sharks.

Asked whether he has ever experienced a dangerous encounter with a large shark himself, Clua says that in 58 years of diving it has happened only once, while spear fishing off New Caledonia. Poised underwater, waiting for a fish to appear, he turned his head. “There was a bull shark coming [toward] my back,” he says.

He got the feeling at that moment that he was about to become prey. But there was no violence. Clua looked at the bull shark as it turned and swam away.

This story was originally published in bioGraphic, an independent magazine about nature and regeneration from the California Academy of Sciences.

Biomethane not viable for widespread use in UK home heating, report finds

Gas derived from farm waste can meet only 18% of current gas demand by 2050, despite claims of fossil fuel lobbyists, study finds

Gas derived from farm waste will never be an alternative to the widespread adoption of heat pumps, research shows, despite the claims of fossil fuel lobbyists.

Biomethane, which comes mainly from “digesting” manure, sewage and other organic waste, has been touted as a low-carbon substitute for fossil fuel gas, for use in home heating. Proponents say it would be less disruptive than ripping out the UK’s current gas infrastructure and installing heat pumps.

But research seen by the Guardian shows that while there may be a role for biomethane in some industries and on farms, it will not make a viable alternative for the vast majority of homes.

A study by the analyst company Regen, commissioned by the MCS Foundation charity, found that biomethane could account for only up to 18% of the UK’s current gas demand by 2050. That is because the available sources – manure, farm waste and sewage – cannot be scaled up to the extent needed without distorting the UK’s economy, or using unsustainable sources.

Faced with the limitations of biomethane, ministers would do better to rule out its widespread use in home heating and concentrate on heat pumps, MCS concluded.

Garry Felgate, the chief executive of the MCS Foundation, said: “Biomethane has an important role to play in decarbonisation – but not in homes. If we are to meet our climate targets and ensure that every household has access to secure, affordable energy, there is simply no viable way that we can continue to heat homes using the gas grid, whether that is using fossil gas, hydrogen, or biomethane.”

Gas companies have a strong vested interest in the future of biomethane because its widespread use would allow them to keep the current gas infrastructure of pipelines, distribution technology and home boilers in operation.
If the UK shifts most homes to heat pumps, those networks will become redundant.

The same arguments are made by gas companies, and by some trade unions, in favour of hydrogen, which has also been touted as a low-carbon alternative to heat pumps, but which numerous studies have shown will not be economically viable at the scale required.

At the Labour party conference this week, delegates were bombarded by lobbyists claiming that biomethane could take the place of 6m gas boilers and delay the phase-out of gas boilers.

Felgate said ministers must require the decommissioning of the gas grid by 2050, and set a clear deadline for phasing out boilers.

“Failure to plan for the decommissioning of the gas grid will result in it becoming a stranded asset,” he said. “Consumers and industry need certainty: biomethane will not replace fossil fuel gas in homes. Electric heating such as heat pumps is the only viable way to decarbonise homes.”

Tamsyn Lonsdale-Smith, the energy analyst at Regen who wrote the report, said there were uses for biomethane in industry, but that it was not suitable for widespread consumer use. “Biomethane can be a green gas with minimal environmental and land use impacts – but only if produced from the right sources, in the right way and at an appropriate scale,” she said. “The government is right to be focusing on scaling up biomethane production, but as sustainable supplies are likely to be limited, it is critical that its use is prioritised for only the highest value uses where carbon reductions are greatest.”

A government spokesperson said: “Biomethane can play an important role in reducing our reliance on imported gas, increasing our country’s energy security and helping to deliver net zero. We are looking at how we can further support the sector and plan to publish a consultation on biomethane early next year.”

Responding to the climate impact of generative AI

Explosive growth of AI data centers is expected to increase greenhouse gas emissions. Researchers are now seeking solutions to reduce these environmental harms.

In part 2 of our two-part series on generative artificial intelligence’s environmental impacts, MIT News explores some of the ways experts are working to reduce the technology’s carbon footprint.

The energy demands of generative AI are expected to continue increasing dramatically over the next decade.

For instance, an April 2025 report from the International Energy Agency predicts that the global electricity demand from data centers, which house the computing infrastructure to train and deploy AI models, will more than double by 2030, to around 945 terawatt-hours. While not all operations performed in a data center are AI-related, this total amount is slightly more than the energy consumption of Japan.

Moreover, an August 2025 analysis from Goldman Sachs Research forecasts that about 60 percent of the increasing electricity demands from data centers will be met by burning fossil fuels, increasing global carbon emissions by about 220 million tons. In comparison, driving a gas-powered car for 5,000 miles produces about 1 ton of carbon dioxide.

These statistics are staggering, but at the same time, scientists and engineers at MIT and around the world are studying innovations and interventions to mitigate AI’s ballooning carbon footprint, from boosting the efficiency of algorithms to rethinking the design of data centers.

Considering carbon emissions

Talk of reducing generative AI’s carbon footprint is typically centered on “operational carbon” — the emissions produced by the powerful processors, known as GPUs, inside a data center. It often ignores “embodied carbon,” the emissions created by building the data center in the first place, says Vijay Gadepally, senior scientist at MIT Lincoln Laboratory, who leads research projects in the Lincoln Laboratory Supercomputing Center.

Constructing and retrofitting a data center, built from tons of steel and concrete and filled with air conditioning units, computing hardware, and miles of cable, generates a huge amount of carbon emissions.
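The emissions comparison above can be sanity-checked with a few lines of arithmetic. The figures (220 million tons of added CO2, and roughly 1 ton per 5,000 miles of gas-powered driving) are the ones cited in the article; the equator circumference used for scale is a standard approximation, not from the source:

```python
# Rough sanity check of the emissions comparison cited above.
# Article figures: ~220 million tons of added CO2, and ~1 ton of CO2
# per 5,000 miles of gas-powered driving.
added_co2_tons = 220e6
miles_per_ton = 5_000

equivalent_car_miles = added_co2_tons * miles_per_ton
print(f"{equivalent_car_miles:.2e} car-miles")  # 1.10e+12 car-miles

# For scale: expressed as trips around the equator (~24,901 miles).
trips = equivalent_car_miles / 24_901
print(f"about {trips:,.0f} trips around the equator")
```

In other words, the forecast increase is on the order of a trillion car-miles of driving, which is why even single-digit percentage efficiency gains in data centers matter.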
In fact, the environmental impact of building data centers is one reason companies like Meta and Google are exploring more sustainable building materials. (Cost is another factor.)

Plus, data centers are enormous buildings — the world’s largest, the China Telecom-Inner Mongolia Information Park, engulfs roughly 10 million square feet — with about 10 to 50 times the energy density of a normal office building, Gadepally adds.

“The operational side is only part of the story. Some things we are working on to reduce operational emissions may lend themselves to reducing embodied carbon, too, but we need to do more on that front in the future,” he says.

Reducing operational carbon emissions

When it comes to reducing operational carbon emissions of AI data centers, there are many parallels with home energy-saving measures. For one, we can simply turn down the lights.

“Even if you have the worst lightbulbs in your house from an efficiency standpoint, turning them off or dimming them will always use less energy than leaving them running at full blast,” Gadepally says.

In the same fashion, research from the Supercomputing Center has shown that “turning down” the GPUs in a data center so they consume about three-tenths the energy has minimal impacts on the performance of AI models, while also making the hardware easier to cool.

Another strategy is to use less energy-intensive computing hardware.

Demanding generative AI workloads, such as training new reasoning models like GPT-5, usually need many GPUs working simultaneously.
The Goldman Sachs analysis estimates that a state-of-the-art system could soon have as many as 576 connected GPUs operating at once.

But engineers can sometimes achieve similar results by reducing the precision of computing hardware, perhaps by switching to less powerful processors that have been tuned to handle a specific AI workload.

There are also measures that boost the efficiency of training power-hungry deep-learning models before they are deployed.

Gadepally’s group found that about half the electricity used for training an AI model is spent to get the last 2 or 3 percentage points in accuracy. Stopping the training process early can save a lot of that energy.

“There might be cases where 70 percent accuracy is good enough for one particular application, like a recommender system for e-commerce,” he says.

Researchers can also take advantage of efficiency-boosting measures.

For instance, a postdoc in the Supercomputing Center realized the group might run a thousand simulations during the training process to pick the two or three best AI models for their project.

By building a tool that allowed them to avoid about 80 percent of those wasted computing cycles, they dramatically reduced the energy demands of training with no reduction in model accuracy, Gadepally says.

Leveraging efficiency improvements

Constant innovation in computing hardware, such as denser arrays of transistors on semiconductor chips, is still enabling dramatic improvements in the energy efficiency of AI models.

Even though energy efficiency improvements have been slowing for most chips since about 2005, the amount of computation that GPUs can do per joule of energy has been improving by 50 to 60 percent each year, says Neil Thompson, director of the FutureTech Research Project at MIT’s Computer Science and Artificial Intelligence Laboratory and a principal investigator at MIT’s Initiative on the Digital Economy.

“The still-ongoing ‘Moore’s Law’ trend of getting more and more transistors on chip still
matters for a lot of these AI systems, since running operations in parallel is still very valuable for improving efficiency,” says Thompson.

Even more significant, his group’s research indicates that efficiency gains from new model architectures that can solve complex problems faster, consuming less energy to achieve the same or better results, are doubling every eight or nine months.

Thompson coined the term “negaflop” to describe this effect. The same way a “negawatt” represents electricity saved due to energy-saving measures, a “negaflop” is a computing operation that doesn’t need to be performed due to algorithmic improvements.

These could be things like “pruning” away unnecessary components of a neural network or employing compression techniques that enable users to do more with less computation.

“If you need to use a really powerful model today to complete your task, in just a few years, you might be able to use a significantly smaller model to do the same thing, which would carry much less environmental burden. Making these models more efficient is the single-most important thing you can do to reduce the environmental costs of AI,” Thompson says.

Maximizing energy savings

While reducing the overall energy use of AI algorithms and computing hardware will cut greenhouse gas emissions, not all energy is the same, Gadepally adds.

“The amount of carbon emissions in 1 kilowatt hour varies quite significantly, even just during the day, as well as over the month and year,” he says.

Engineers can take advantage of these variations by leveraging the flexibility of AI workloads and data center operations to maximize emissions reductions.
For instance, some generative AI workloads don’t need to be performed in their entirety at the same time.

Splitting computing operations so some are performed later, when more of the electricity fed into the grid is from renewable sources like solar and wind, can go a long way toward reducing a data center’s carbon footprint, says Deepjyoti Deka, a research scientist in the MIT Energy Initiative.

Deka and his team are also studying “smarter” data centers where the AI workloads of multiple companies using the same computing equipment are flexibly adjusted to improve energy efficiency.

“By looking at the system as a whole, our hope is to minimize energy use as well as dependence on fossil fuels, while still maintaining reliability standards for AI companies and users,” Deka says.

He and others at MITEI are building a flexibility model of a data center that considers the differing energy demands of training a deep-learning model versus deploying that model. Their hope is to uncover the best strategies for scheduling and streamlining computing operations to improve energy efficiency.

The researchers are also exploring the use of long-duration energy storage units at data centers, which store excess energy for times when it is needed.

With these systems in place, a data center could use stored energy that was generated by renewable sources during a high-demand period, or avoid the use of diesel backup generators if there are fluctuations in the grid.

“Long-duration energy storage could be a game-changer here because we can design operations that really change the emission mix of the system to rely more on renewable energy,” Deka says.

In addition, researchers at MIT and Princeton University are developing a software tool for investment planning in the power sector, called GenX, which could be used to help companies determine the ideal place to locate a data center to minimize environmental impacts and costs.

Location can have a big impact on reducing a data center’s carbon
footprint. For instance, Meta operates a data center in Lulea, a city on the coast of northern Sweden where cooler temperatures reduce the amount of electricity needed to cool computing hardware.

Thinking farther outside the box (way farther), some governments are even exploring the construction of data centers on the moon where they could potentially be operated with nearly all renewable energy.

AI-based solutions

Currently, the expansion of renewable energy generation here on Earth isn’t keeping pace with the rapid growth of AI, which is one major roadblock to reducing its carbon footprint, says Jennifer Turliuk MBA ’25, a short-term lecturer, former Sloan Fellow, and former practice leader of climate and energy AI at the Martin Trust Center for MIT Entrepreneurship.

The local, state, and federal review processes required for new renewable energy projects can take years.

Researchers at MIT and elsewhere are exploring the use of AI to speed up the process of connecting new renewable energy systems to the power grid.

For instance, a generative AI model could streamline interconnection studies that determine how a new project will impact the power grid, a step that often takes years to complete.

And when it comes to accelerating the development and implementation of clean energy technologies, AI could play a major role.

“Machine learning is great for tackling complex situations, and the electrical grid is said to be one of the largest and most complex machines in the world,” Turliuk adds.

For instance, AI could help optimize the prediction of solar and wind energy generation or identify ideal locations for new facilities.

It could also be used to perform predictive maintenance and fault detection for solar panels or other green energy infrastructure, or to monitor the capacity of transmission wires to maximize efficiency.

By helping researchers gather and analyze huge amounts of data, AI could also inform targeted policy interventions aimed at getting the biggest “bang for
the buck” from areas such as renewable energy, Turliuk says.

To help policymakers, scientists, and enterprises consider the multifaceted costs and benefits of AI systems, she and her collaborators developed the Net Climate Impact Score.

The score is a framework that can be used to help determine the net climate impact of AI projects, considering emissions and other environmental costs along with potential environmental benefits in the future.

At the end of the day, the most effective solutions will likely result from collaborations among companies, regulators, and researchers, with academia leading the way, Turliuk adds.

“Every day counts. We are on a path where the effects of climate change won’t be fully known until it is too late to do anything about it. This is a once-in-a-lifetime opportunity to innovate and make AI systems less carbon-intense,” she says.
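The carbon-aware scheduling idea Deka describes, deferring flexible workloads to hours when grid electricity is cleanest, reduces to a simple search over a carbon-intensity forecast. A toy sketch (the hourly intensity values below are invented for illustration and are not from any MIT system or real grid):

```python
# Toy sketch of carbon-aware scheduling: given an hourly forecast of
# grid carbon intensity (gCO2 per kWh; values invented), find the
# contiguous window in which a deferrable job would emit the least.

def best_start_hour(forecast_g_per_kwh, job_hours):
    """Return (start_hour, summed_intensity) minimizing emissions."""
    best = None
    for start in range(len(forecast_g_per_kwh) - job_hours + 1):
        total = sum(forecast_g_per_kwh[start:start + job_hours])
        if best is None or total < best[1]:
            best = (start, total)
    return best

# Invented 24-hour forecast: dirtier overnight, cleaner midday (solar).
forecast = [450, 460, 470, 465, 455, 430, 400, 350,
            280, 210, 160, 130, 120, 125, 150, 200,
            270, 340, 400, 440, 460, 470, 465, 455]

start, total = best_start_hour(forecast, job_hours=4)
print(f"Run the 4-hour job starting at hour {start} "
      f"(avg {total / 4:.0f} gCO2/kWh vs {sum(forecast) / 24:.0f} all-day avg)")
```

With this made-up forecast the job lands in the midday solar trough, emitting roughly a third of the carbon of an average-hour run; real deployments would pull the forecast from a grid-intensity data provider rather than a hard-coded list.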

A Revolution in Tracking Life on Earth

A suite of technologies is helping taxonomists speed up species identification.

Across a Swiss meadow and into its forested edges, the drone dragged a jumbo-size cotton swab from a 13-foot tether. Along its path, the moistened swab collected scraps of life: some combination of sloughed skin and hair; mucus, saliva, and blood splatters; pollen flecks and fungal spores.

Later, biologists used a sequencer about the size of a phone to stream the landscape’s DNA into code, revealing dozens upon dozens of species, some endangered, some invasive. The researchers never saw the wasps, stink bugs, or hawk moths whose genetic signatures they collected. But all of those, and many more, were out there.

The researchers, from the Swiss Federal Institute for Forest, Snow and Landscape Research, were field-testing a new approach to biodiversity monitoring, in this case to map insect life across different kinds of vegetation. They make up one of many teams now deploying a suite of technologies to track nature at a resolution and pace once unimaginable for taxonomists. “We know a lot more about what’s happening,” Camille Albouy, an environmental scientist at ETH Zurich and member of the team, told me, “even if a lot still escapes us.”

Today, autonomous robots collect DNA while state-of-the-art sequencers process genetic samples quickly and cheaply, and machine-learning algorithms detect life by sound or shape. These technologies are revolutionizing humanity’s ability to catalog Earth’s species, which are estimated to number 8 million—though perhaps far, far more—by illuminating the teeming life that so often eludes human observation. Only about 2.3 million species have been formally described. The rest are nameless and unstudied—part of what biologists call dark taxa.

Insects, for example, likely compose more than half of all animal species, yet most (an estimated four out of five) have never been recorded by science. From the tropics to the poles, on land and in water, they pollinate, prey, scavenge, burrow, and parasitize—an unobserved majority of life on Earth.
“It is difficult to relate to nonspecialists how vast our ignorance truly is,” an international consortium of insect scientists lamented in 2018. Valerio Caruso, an entomologist at the University of Padua, in Italy, studies scuttle flies, a skittering family containing an estimated 30,000 to 50,000 species. Only about 4,000 have been described, Caruso told me. “One lifetime is not enough to understand them all.”

The minute distinctions within even one family of flies matter more than they might seem to: Species that look identical can occupy entirely different ecological niches—evading different predators and hunting different prey, parasitizing different hosts, pollinating different plants, decomposing different materials, or carrying different diseases. Each is a unique evolutionary experiment that might give rise to compounds that unlock new medicines, behaviors that offer agricultural solutions, and other adaptations that could further our understanding of how life persists.

Only with today’s machines and technology do scientists stand a chance of keeping up with life’s abundance. For most of history, humans have relied primarily on their eyes to classify the natural world: Observations of shape, size, and color helped Carl Linnaeus catalog about 12,000 species in the 18th century—a monumental undertaking, but a laughable fraction of reality. Accounting for each creature demanded the meticulous labor of dehydrating, dissecting, mounting, pinning, labeling—essentially the main techniques available until the turn of the 21st century, when genetic sequencing allowed taxonomists to zoom in on DNA bar codes. Even then, those might not have identified specimens beyond genus or family.

Now technologies such as eDNA, high-throughput sequencing, autonomous robotics, and AI have broadened our vision of the natural world. They decode the genomes of fungi, bacteria, and yeasts that are difficult or impossible to culture in a lab.
Specialized AI isolates species’ calls from noisy recordings, translating air vibrations into an acoustic field guide. Others parse photo pixels to tease out variations in wing veins or bristles as fine as a dust mote to identify and classify closely related species. High-resolution 3-D scans allow researchers to visualize minuscule anatomies without lifting a scalpel. Other tools can map dynamic ecosystems as they transform in real time, tracking how wetlands contract and expand season by season or harnessing hundreds of millions of observations from citizen-science databases to identify species and map their shifting ranges.

One unassuming setup in a lush Panamanian rainforest involved a UV light luring moths to a white panel and a solar-powered camera that snapped a photo every 10 seconds, from dusk to dawn. In a single week, AI processed many thousands of images each night, in which experts detected 2,000 moth species—half of them unknown to science. “It breaks my heart to see people think science is about wrapping up the last details of understanding, and that all the big discoveries are done,” David Rolnick, a computer scientist at McGill University and Mila - Quebec AI Institute, who was part of the expedition, told me. In Colombia, one of the world’s most biodiverse countries, the combination of drone-collected data and machine learning has helped describe tens of thousands of species, 200 of which are new to science.

These tools’ field of view is still finite. AI algorithms see only as far as their training data, and taxonomical data overrepresent the global North and charismatic organisms. In a major open-access biodiversity database, for example, less than 5 percent of the entries in recent years pertained to insects, while more than 80 percent related to birds (which account for less than 1 percent of named species).
Because many dark taxa are absent from training data sets, even the most advanced image-recognition models work best as triage—rapidly sorting through familiar taxa and flagging likely new discoveries for human taxonomists to investigate.

AI systems “don’t have intuition; they don’t have creativity,” said Rolnick, whose team co-created Antenna, a ready-to-use AI platform for ecologists. Human taxonomists are still better at imagining how a rare feature arose evolutionarily, or exploring the slight differences that can mark an entirely new species. And ultimately, every identification—whether by algorithm or DNA or human expert—still depends on people.

That human labor is also a dwindling resource, especially in entomology. “The number of people who are paid to be taxonomists in the world is practically nil,” Rolnick said. And time is against them. The world’s largest natural-history museums hold a wealth of specimens and objects (more than 1 billion, according to one study), yet only a fraction of those have digitally accessible records, and genomic records are accessible for just 0.2 percent of biological specimens. Many historical collections—all those drawers packed with pinned, flattened, and stuffed specimens; all those jars of floating beings—are chronically underfunded, and their contents are vulnerable to the physical consequences of neglect. Preservation fluids evaporate, poor storage conditions invite pests and mold, and DNA degrades until it is unsequenceable.

Today’s tools are still far from fully capturing the extent and complexity of Earth’s biodiversity, and much of that could vanish before anyone catalogs it. “We are too few, studying too many things,” Caruso, the Padua entomologist, said. Many liken taxonomy to cataloging an already burning library.
As Mehrdad Hajibabaei, chief scientific officer for the Center for Biodiversity Genomics at the University of Guelph, in Canada, told me: “We’re not stamp-collecting here.” Taxonomists are instead working to preserve a planetary memory—an archive of life—and to decode which traits help creatures adapt, migrate, or otherwise survive in a rapidly changing climate.

The climate crisis is unraveling the life cycles of wildlife around the world—by one estimate, for about half of all species. Flowers now bloom weeks before pollinators stir; fruit withers before migrating birds can reach it. Butterflies attuned to rainfall falter in drought. Tropical birds and alpine plants climb toward cooler, though finite, mountaintops. Fish slip farther out to sea; disease-carrying mosquitoes ride the heat into new territories. Extreme weather at the poles stresses crucial moss and lichen, and shreds entire habitats in hours. Mass die-offs are now routine.

“Once you lose one species, you’ll probably lose more species,” Caruso said. “Over time, everything is going to collapse.” One in eight could vanish by century’s end—many of them dark taxa, lost before we ever meet them. Most countries—and global bodies such as the International Union for Conservation of Nature—cannot assess, and therefore cannot protect, unnamed organisms. As Edward O. Wilson told Time in 1986: “It’s like having astronomy without knowing where the stars are.”

Today’s machine-assisted taxonomy faces the same problem Linnaeus did: Nature’s complexity still far outstrips human insight, even with machines’ assistance. “We don’t perceive the world as it is in all its chaotic glory,” the biologist Carol Kaesuk Yoon wrote in her 2010 book, Naming Nature. “We sense a very particular subset of what surrounds us, and we see it in a particularly human way.” On the flip side, every new data point sharpens the predictive models guiding conservation, says Evgeny Zakharov, genomics director for the Center for Biodiversity Genomics.
“The more we know about the world, the more power we have to properly manage and protect it,” he told me. With these tools, taxonomists’ work is accelerating, but so is the countdown—they will take all the help they can get.

A beacon of light

A lantern created in the Design Intelligence Lab showcases sustainable alternatives for consumer electronics.

Placing a lit candle in a window to welcome friends and strangers is an old Irish tradition that took on greater significance when Mary Robinson was elected president of Ireland in 1990. At the time, Robinson placed a lamp in Áras an Uachtaráin, the official residence of Ireland's presidents, noting that the Irish diaspora and all others are always welcome in Ireland. Decades later, a lit lamp remains in a window of Áras an Uachtaráin.

The symbolism of Robinson's lamp was recounted by Hashim Sarkis, dean of the MIT School of Architecture and Planning (SA+P), at the school's graduation ceremony in May, where Robinson addressed the class of 2025. To replicate the generous intentions of Robinson's lamp and commemorate her visit to MIT, Sarkis commissioned a unique lantern as a gift for Robinson, along with an identical one for his office in the front portico of MIT at 77 Massachusetts Ave.

"The lamp will welcome all citizens of the world to MIT," says Sarkis.

No ordinary lantern

The bespoke lantern was created by Marcelo Coelho SM '08, PhD '12, director of the Design Intelligence Lab and associate professor of the practice in the Department of Architecture. One of several projects in the Geolectric research at the Design Intelligence Lab, the lantern showcases the use of geopolymers as a sustainable material alternative for embedded computers and consumer electronics.

"The materials that we use to make computers have a negative impact on climate, so we're rethinking how we make products with embedded electronics — such as a lamp or lantern — from a climate perspective," says Coelho.

Consumer electronics rely on materials that are high in carbon emissions and difficult to recycle.
As the demand for embedded computing increases, so too does the need for alternative materials that have a reduced environmental impact while still supporting electronic functionality.

The Geolectric lantern advances the formulation and application of geopolymers, a class of inorganic materials that form covalently bonded, non-crystalline networks. Unlike traditional ceramics, geopolymers do not require high-temperature firing, allowing electronic components to be embedded seamlessly during production. Geopolymers are similar to ceramics but have a lower carbon footprint, presenting a sustainable alternative for consumer electronics, product design, and architecture. The minerals Coelho uses to make the geopolymers, aluminum silicate and sodium silicate, are the same ones regularly used to make ceramics.

"Geopolymers aren't particularly new, but are becoming more popular," says Coelho. "They have high strength in both tension and compression, superior durability, fire resistance, and thermal insulation. Compared to concrete, geopolymers don't release carbon dioxide. Compared to ceramics, you don't have to worry about firing them. What's even more interesting is that they can be made from industrial byproducts and waste materials, contributing to a circular economy and reducing waste."

The lantern is embedded with custom electronics that serve as a proximity and touch sensor: when a hand is placed over the top, light shines down the glass tubes. The timeless design of the Geolectric lantern, minimalist and composed of natural materials, belies its future-forward function.

Coelho's academic background is in fine arts and computer science.
Much of his work, he says, "bridges these two worlds." Working with Coelho on the lanterns at the Design Intelligence Lab are Jacob Payne, a graduate architecture student, and Jean-Baptiste Labrune, a research affiliate.

A light for MIT

A few weeks before commencement, Sarkis saw the Geolectric lantern at Palazzo Diedo Berggruen Arts and Culture in Venice, Italy. The exhibition, a collateral event of the Venice Biennale's 19th International Architecture Exhibition, featured the work of 40 MIT architecture faculty.

The lantern's sustainability was the key reason Sarkis regarded it as the perfect gift for Robinson. After her career in politics, Robinson founded the Mary Robinson Foundation – Climate Justice, an international center addressing the impacts of climate change on marginalized communities.

A third iteration of Geolectric, for Sarkis' office, is currently underway. While the lantern was a technical prototype and an opportunity to showcase his lab's research, Coelho, an immigrant from Brazil, was profoundly touched by how Sarkis created the perfect symbolism to both embody the welcoming spirit of the school and honor President Robinson.

"When the world feels most fragile, we need to urgently find sustainable and resilient solutions for our built environment. It's in the darkest times when we need light the most," says Coelho.

74 countries have now ratified a landmark treaty to protect the high seas. Why hasn’t NZ?

The High Seas Treaty comes into force in January. New Zealand lags behind on several fronts, including marine protection and recognition of Māori customary rights.

The ratification by more than 60 states, the minimum required to turn the Agreement on Biodiversity Beyond National Jurisdiction (better known as the High Seas Treaty) into law, means it will enter into force on January 17. The treaty covers nearly two-thirds of the ocean: an area of sea and seabed outside the national jurisdiction of any country, which has come under growing pressure from mining, fishing and geoengineering interests, with climate change a compounding factor.

The High Seas Treaty sits under the United Nations Convention on the Law of the Sea, which New Zealand ratified in 1996. That convention established the international legal framework governing the marine environment within each country's jurisdiction, including the territorial sea, exclusive economic zone (EEZ) and continental shelf. New Zealand's EEZ is the fifth largest in the world and 15 times its landmass.

The objective of the High Seas Treaty is to ensure the conservation and sustainable use of marine biological diversity beyond national jurisdiction, where the seabed and its resources are the "common heritage of humankind". It addresses four main issues: marine genetic resources and benefit sharing, marine protection, environmental impact assessments, and technology transfer.

New Zealand is the last country reported to be bottom trawling in the South Pacific high seas for species such as the long-lived orange roughy. It also has ambitions to allow seabed mining in its own waters. The High Seas Treaty is drawing much-needed attention to New Zealand's approach to ocean governance, both at home and on the world stage.

What this means for NZ

New Zealand was an active participant in the drafting of the High Seas Treaty and an early signatory in September 2023. A total of 74 nations have now ratified it, but New Zealand is not one of them. The deep seafloor beneath much of the high seas includes varied habitats with rich biodiversity, much of it undescribed.
Bottom trawling uses large nets to scrape the seafloor. The bycatch can include deepwater corals and sponges, and the practice destroys the habitat of fish and other species.

While the High Seas Treaty doesn't directly regulate extractive activities such as fishing and mining in the high seas and deep seabed, it has implications for how they are carried out. International organisations such as the International Seabed Authority and regional fisheries management groups regulate mining and fisheries, respectively. But new international institutions will be established to enforce compliance with the High Seas Treaty, including by establishing marine protected areas in support of the Global Biodiversity Framework's goal of protecting 30% of the ocean by 2030. The treaty also requires new activities in the high seas and deep seabed, such as aquaculture, geoengineering or seabed mining, to undergo an evaluation of environmental impacts.

A beacon for best-practice ocean governance

The High Seas Treaty reflects contemporary international legal consensus on best-practice ocean governance. Its guiding principles include:

- those who pollute marine areas should bear the costs of managing the issue
- any benefits flowing from marine resources should be shared equitably (including with Indigenous peoples)
- states should take a precautionary approach to marine uses where their effects are not well understood
- states should take an ecosystem-based and integrated approach to ocean management
- states should use an ocean-governance approach that builds resilience to climate change and recognises the ocean's role in the global carbon cycle, and
- states should use the best available science and traditional knowledge in ocean governance and respect the rights of Indigenous peoples.
These principles align with broader ocean-focused initiatives under the UN Decade of Ocean Science for Sustainable Development and the sustainable development goals, signalling a growing awareness of the need to improve how ocean resources are managed.

In New Zealand, international law is not directly enforceable in the courts unless incorporated into domestic legislation. But the courts can refer to international treaties when interpreting domestic legislation. This happened when the Supreme Court used the Law of the Sea Convention to direct decision makers to take a precautionary and ecosystem-based approach to approving seabed mining within New Zealand's EEZ, based on science, tikanga and mātauranga Māori.

The High Seas Treaty also reflects the unequivocal international recognition that states, including New Zealand, have obligations under international law to reduce the impacts of climate change on marine areas, reduce pollution and support the restoration of the ocean.

However, New Zealand lags behind other countries in the protection of marine biodiversity. The government has delayed marine protection legislation in the Hauraki Gulf and proposed removing the requirement for cameras on fishing industry boats. It has increased catch limits for some commercial fish species, but reduced them for orange roughy after being taken to court by environmental advocates. It has opened up seabed mining to the fast-track consenting regime, despite a failure to meet basic standards for environmental impact assessment. And it is proposing to rework the coastal policy statement to enable the use and development of the coastal environment for "priority activities" such as aquaculture, resource extraction and energy generation.

Time for NZ to show ocean leadership

Ocean advocates and scientists have repeatedly called for reform of New Zealand's highly fragmented and outdated oceans governance frameworks.
The international call for states to uphold the rights of Indigenous peoples stands in stark contrast to the New Zealand government's recent track record on Māori marine and coastal rights and interests. The courts recently overturned government policies that failed to uphold Māori fishing rights protected by Treaty of Waitangi settlements. But the government nevertheless plans legal changes that would further undermine Māori customary rights in marine and coastal areas.

Upholding Māori rights in line with international law is not just an obligation but an opportunity. Iwi and hapū Māori have significant knowledge to contribute to the management of the ocean.

It is high time for New Zealand to show leadership on oceans policy on the global stage by ratifying the High Seas Treaty. But it is just as important to look after matters within domestic waters, aligning fragmented and outdated marine laws with global best practice in ocean governance.

Elizabeth Macpherson receives funding from Te Apārangi The Royal Society of New Zealand. Conrad Pilditch receives funding from the Department of Conservation, MBIE, regional councils and PROs. He is affiliated with the Mussel Reef Restoration Trust and the Whangateau Catchment Collective. Karen Fisher receives funding from MBIE and the Government of Canada's New Frontiers in Research Fund (NFRF). Simon Francis Thrush receives funding from MBIE, the Marsden Fund, the EU and philanthropic sources.

