
Geoengineering Wins Reluctant Interest from Scientists as Earth’s Climate Unravels

News Feed
Tuesday, September 24, 2024


More and more climate scientists are supporting experiments to cool Earth by altering the stratosphere or the ocean

As recently as 10 years ago most scientists I interviewed and heard speak at conferences did not support geoengineering as a way to counteract climate change. Whether the idea was to release large amounts of sulfur dioxide into the stratosphere to “block” the sun’s heating or to spread iron across the ocean to supercharge algae that breathe in carbon dioxide, researchers resisted on principle: don’t mess with natural systems because unintended consequences could ruin Earth. They also worried that trying the techniques even at a small scale could be a slippery slope to wider deployment, and that countries would use the promise of geoengineering as an excuse to keep burning carbon-emitting fossil fuels.

But today more and more climate scientists openly support experimenting with these and other proposed strategies, in part because entrepreneurs and organizations are going ahead with the methods anyway—often based on little data or field trials. Scientists want to run controlled experiments to see if the methods are productive, to test consequences and perhaps to show objectively that the approaches can cause serious problems.

“We do need to try the techniques to figure them out,” says Rob Jackson, a professor at Stanford University, chair of the international research partnership Global Carbon Project and author of a book on climate solutions called Into the Clear Blue Sky (Scribner, 2024). “But doing research does make them more likely to happen. That is the knotty part of all this.”


A tacit race may be starting among scientists and entrepreneurs. More funding is being offered to researchers, and investments are growing in companies that would pursue geoengineering. In 2023 a start-up called Make Sunsets launched balloons containing sulfur dioxide into the stratosphere, selling “cooling credits” to companies and individuals. In early September, 23 academics at the not-for-profit consortium Exploring Ocean Iron Solutions unveiled a plan to assess how much CO2 iron fertilization could sequester in the deep sea, and they hope to start trials across the northeastern Pacific Ocean in 2026. Big corporations, including oil companies, are already building large industrial facilities to pull CO2 from the air, and the U.S. government is offering them billions of dollars.

There is confusion, too. Some scientists say the term “geoengineering” should refer only to techniques that alter Earth systems. For example, sulfur dioxide in the stratosphere creates tiny droplets that reflect incoming sunlight back to space, an approach called solar radiation management (SRM). But this method could also affect weather patterns or weaken the ozone layer that protects us from ultraviolet radiation. Brightening clouds by spraying them with ocean mist from below can reduce sunlight but could interfere with rain patterns. Spreading iron across the ocean helps phytoplankton grow and consume CO2, yet these organisms would also consume other nutrients, which could starve sea life. Spreading certain kinds of pulverized rock across the sea surface can make the water more alkaline, allowing it to absorb more CO2 from the air, but it could affect ocean chemistry, too.

In contrast, using machines to pull CO2 from the air, a technique known as direct air capture (DAC), doesn’t directly interfere with natural systems and so shouldn’t be called geoengineering, some argue. Social concerns could still be high, however; for example, millions of these machines would be needed to reduce warming by a meaningful extent, and that would require huge amounts of energy. If they were powered by wind and solar, as they ideally would be to avoid more greenhouse gas emissions, the installations could compete with agriculture for land.
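The scale problem described above can be made concrete with back-of-envelope arithmetic. In this sketch every input is an assumption, not a figure from the article: a removal target of 1 Gt CO2 per year, an energy intensity of about 2,000 kWh per tonne (a commonly cited mid-range estimate counting both heat and electricity), and a year-averaged solar power density of about 10 W per square meter.

```python
# Back-of-envelope: energy and land demands of large-scale DAC.
# All three inputs below are illustrative assumptions, not article figures.

REMOVAL_T_PER_YR = 1e9   # assumed removal target: tonnes CO2 per year
KWH_PER_TONNE = 2_000    # assumed DAC energy use per tonne (heat + electricity)
SOLAR_W_PER_M2 = 10      # assumed year-averaged solar-farm power density

energy_kwh = REMOVAL_T_PER_YR * KWH_PER_TONNE
energy_twh = energy_kwh / 1e9                  # 1 TWh = 1e9 kWh
avg_power_w = energy_kwh * 1_000 / 8_760       # kWh per year -> average watts
land_km2 = avg_power_w / SOLAR_W_PER_M2 / 1e6  # m^2 -> km^2

print(f"{energy_twh:,.0f} TWh/yr, roughly {land_km2:,.0f} km^2 of solar")
```

Under these assumptions the answer is on the order of 2,000 TWh per year, comparable to a large fraction of current U.S. electricity generation, and tens of thousands of square kilometers of solar, which is the land-competition concern in a nutshell.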

For years Jackson refused to endorse DAC, but he now supports research. He has come around because warming is rising relentlessly, along with fossil fuel use, creating dangerous floods, droughts and heat waves that are killing people worldwide. “We are out of time and options,” he says, “an unfortunate outcome of our inaction on climate.”

He is not quite ready to endorse SRM, he says, “because I don’t trust our ability to do it equitably around the world.” Blocking the sun above certain countries could alter rainfall patterns elsewhere, which could be particularly problematic if it occurred in poorer regions. Research shows that the huge 1991 Mount Pinatubo volcanic eruption in the Philippines, which lofted about 20 million metric tons of sulfur dioxide into the stratosphere, ended up exacerbating drought in parts of the world as well as reducing ozone levels by a small percentage. If one country launches sulfur into the stratosphere, Jackson asks, who is going to pay for drought that happens in another country?

When Ken Caldeira, a longtime climate researcher at the Carnegie Institution for Science’s Department of Global Ecology at Stanford, started to look into geoengineering in 2000, he says, older researchers warned him that delving into the taboo topic would put his career at risk. Small studies did ensue, but public opinion against geoengineering mounted and most work stopped.

A big change came in 2018, when the Intergovernmental Panel on Climate Change released its Global Warming of 1.5°C report. It stated that without carbon removal or other techniques, the world stood little chance of holding warming to no more than 1.5 degrees Celsius (2.7 degrees Fahrenheit) above preindustrial levels, the goal agreed on in the 2015 Paris climate agreement. “That was an inflection point,” Jackson says. “It pointed out the difficulty, almost the futility, of reaching the 1.5-degree-C target, or even the two-degree-C target, without carbon removal.” Even if carbon removal is not considered geoengineering, the report bolstered some scientists’ willingness to experiment.

David Keith, formerly a professor of applied physics and of public policy at Harvard University, is among them; he prefers the term “climate engineering” for techniques such as SRM. Keith has been studying SRM for more than two decades. In 2021 he and others were about to carry out the world’s first SRM field trial in Sweden, but protests by Indigenous peoples and other groups persuaded the Swedish government to cancel it. Last year the University of Chicago hired Keith to oversee a new program, the Climate Systems Engineering initiative, perhaps the first program to hire people specifically to do SRM research. Keith also rejects the slippery-slope argument: “There is no ethical argument for not pursuing research,” he says.

Nevertheless, many scientists and environmental groups remain skeptical—and they have been voicing their wariness to Scientific American. Lili Fuhr, an analyst at the Center for International Environmental Law, interviewed for an article on DAC in SciAm’s September issue, said, “the reliance on these future speculative techno fixes delays real climate action right now.” Deep-sea expert Lisa Levin of the Scripps Institution of Oceanography, interviewed for a September 12 article about iron fertilization, said the technique is likely to “affect something that we don’t really understand yet.”

But to Caldeira, who is credited with coining the term “solar radiation management” in 2006, that is the reason to do “outdoor research”—not just computer modeling studies. “The key is identifying what could go wrong, and demonstrating how,” he says. “What experiment could you do to narrow the uncertainty about whether a technique is bad or okay?” If the outcomes raise threats to the environment or people, he says, “we should know that now.”

Although Caldeira is not pursuing it, one intriguing idea would be to equip a fast-mobilizing aircraft team that could fly a specially outfitted plane above a sudden volcanic eruption to test all sorts of stratospheric parameters. But it would be difficult to spend money on a special aircraft that, most of the time, would sit idle in a hangar.

Caldeira thinks support for SRM will continue to broaden, especially if drought and famine caused by climate change—which have already begun—happen year after year and disproportionately affect poorer countries. “SRM is the only way to start cooling the Earth within a few years,” he says. “There would be mounting pressure on political leaders [in poorer affected countries] to deploy it. Or at least, they could use the threat of SRM to get more aid from wealthy countries.”


What’s the best way to expand the US electricity grid?

A study by MIT researchers illuminates choices about reliability, cost, and emissions.

Growing energy demand means the U.S. will almost certainly have to expand its electricity grid in coming years. What’s the best way to do this? A new study by MIT researchers examines legislation introduced in Congress and identifies relative tradeoffs involving reliability, cost, and emissions, depending on the proposed approach.

The researchers evaluated two policy approaches to expanding the U.S. electricity grid: one would concentrate on regions with more renewable energy sources, and the other would create more interconnections across the country. For instance, some of the best untapped wind-power resources in the U.S. lie in the center of the country, so one type of grid expansion would situate relatively more grid infrastructure in those regions. Alternatively, the other scenario involves building more infrastructure everywhere in roughly equal measure, which the researchers call the “prescriptive” approach. How does each pencil out?

After extensive modeling, the researchers found that a grid expansion could make improvements on all fronts, with each approach offering different advantages. A more geographically unbalanced grid buildout would be 1.13 percent less expensive and would reduce carbon emissions by 3.65 percent compared with the prescriptive approach. And yet the prescriptive approach, with more national interconnection, would significantly reduce power outages due to extreme weather, among other things.

“There’s a tradeoff between the two things that are most on policymakers’ minds: cost and reliability,” says Christopher Knittel, an economist at the MIT Sloan School of Management, who helped direct the research. “This study makes it more clear that the more prescriptive approach ends up being better in the face of extreme weather and outages.”

The paper, “Implications of Policy-Driven Transmission Expansion on Costs, Emissions and Reliability in the United States,” is published today in Nature Energy. The authors are Juan Ramon L. Senga, a postdoc in the MIT Center for Energy and Environmental Policy Research; Audun Botterud, a principal research scientist in the MIT Laboratory for Information and Decision Systems; John E. Parsons, the deputy director for research at MIT’s Center for Energy and Environmental Policy Research; Drew Story, the managing director at MIT’s Policy Lab; and Knittel, who is the George P. Shultz Professor at MIT Sloan and associate dean for climate and sustainability at MIT.

The new study is a product of the MIT Climate Policy Center, housed within MIT Sloan and committed to bipartisan research on energy issues. The center is also part of the Climate Project at MIT, founded in 2024 as a high-level Institute effort to develop practical climate solutions.

In this case, the project grew out of work the researchers did with federal lawmakers who have introduced legislation aimed at bolstering and expanding the U.S. electric grid. One of these bills, the BIG WIRES Act, co-sponsored by Sen. John Hickenlooper of Colorado and Rep. Scott Peters of California, would require each transmission region in the U.S. to be able to send at least 30 percent of its peak load to other regions by 2035. That would represent a substantial change for a national transmission system in which grids have largely been developed regionally, without much national oversight.

“The U.S. grid is aging and it needs an upgrade,” Senga says. “Implementing these kinds of policies is an important step for us to get to that future where we improve the grid, lower costs, lower emissions, and improve reliability. Some progress is better than none, and in this case, it would be important.”

To conduct the study, the researchers looked at how policies like the BIG WIRES Act would affect energy distribution. The scholars used a model of energy generation developed at the MIT Energy Initiative, called GenX, and examined the changes proposed by the legislation.

With a 30 percent level of interregional connectivity, the study estimates, the number of outages due to extreme cold would drop by 39 percent, a substantial increase in reliability. That would help avoid scenarios such as the one Texas experienced in 2021, when winter storms damaged distribution capacity. “Reliability is what we find to be most salient to policymakers,” Senga says.

On the other hand, as the paper details, a future grid that is “optimized” with more transmission capacity near geographic spots of new energy generation would be less expensive. “On the cost side, this kind of optimized system looks better,” Senga says.

A more geographically imbalanced grid would also have a greater impact on reducing emissions. Globally, the levelized costs of solar and wind dropped by 89 percent and 69 percent, respectively, from 2010 to 2022, meaning that incorporating less-expensive renewables into the grid would help with both cost and emissions. “On the emissions side, a priori it’s not clear the optimized system would do better, but it does,” Knittel says. “That’s probably tied to cost, in the sense that it’s building more transmission links to where the good, cheap renewable resources are, because they’re cheap. Emissions fall when you let the optimizing action take place.”

To be sure, these two differing approaches to grid expansion are not the only paths forward. The study also examines a hybrid approach, which involves both national interconnectivity requirements and local buildouts based around new power sources on top of that. Still, the model does show that there are tradeoffs lawmakers will want to consider when developing future grid legislation. “You can find a balance between these factors, where you’re still going to have an increase in reliability while also getting the cost and emission reductions,” Senga observes.

For his part, Knittel emphasizes that working with legislation as the basis for academic studies, while not common, can be productive for everyone involved. Scholars get to apply their research tools and models to real-world scenarios, and policymakers get a sophisticated evaluation of how their proposals would work. “Compared to the typical academic path to publication, this is different, but at the Climate Policy Center, we’re already doing this kind of research,” Knittel says.

UK farmers lose £800m after heat and drought cause one of worst harvests on record

Many now concerned about ability to make living in fast-changing climate after one of worst grain harvests recorded

Record heat and drought cost Britain’s arable farmers more than £800m in lost production in 2025, in one of the worst harvests ever recorded, analysis has estimated.

Three of the five worst harvests on record have now occurred since 2020, leaving some farmers asking whether the growing impacts of the climate crisis are making it too financially risky to sow their crops. Farmers are already facing heavy financial pressure as the costs of fertilisers and other inputs have risen faster than prices.

This year Britain had the hottest and driest spring on record, and the hottest summer, with drought conditions widespread. As a result, production of the five staple arable crops – wheat, oats, spring and winter barley, and oilseed rape – fell by 20% compared with the 10-year average, according to the analysis by the Energy and Climate Intelligence Unit (ECIU). The harvest in England was the second worst in records going back to 1984.

Supercharged by global heating, extreme rainfall in the winters of 2019-20 and 2023-24 also led to very poor harvests, as farmers were unable to access waterlogged and flooded fields to drill their crops.

“This has been another torrid year for many farmers in the UK, with the pendulum swinging from too wet to too hot and dry,” said Tom Lancaster at the ECIU. “British farmers have once again been left counting the costs of climate change, with four-fifths now concerned about their ability to make a living due to the fast-changing climate.”

He added: “There is an urgent need to ensure farmers are better supported to adapt to these climate shocks and build their resilience as the bedrock of our food security. In this context, the delays [by ministers] to the relaunch of vital green farming schemes are the last thing the industry needs.” The sustainable farming incentive was closed in March.

Many farmers are struggling to break even, and some blame environmental policies, but Lancaster said: “The evidence suggests that climate impacts are what’s actually driving issues of profitability, certainly in the arable sector, as opposed to policy change. Without reaching net zero emissions, there is no way to limit the impacts making food production in the UK ever more difficult.”

David Lord, an arable farmer from Essex, said: “As a farmer, I’m used to taking the rough with the smooth, but recent years have seen near constant extreme rainfall, heat and drought. It’s getting to the point with climate change where I can’t take the risk of investing in a new crop of wheat or barley because the return on that investment is just so uncertain. Green farming schemes are a vital lifeline for me, helping build my resilience to these shocks whilst providing cashflow to help buffer me financially.”

Green farming approaches include planting winter cover crops. These increase resilience by boosting the organic content of soil, meaning it can retain water better during droughts. Cover crops can also help break up compacted soil, allowing it to drain better during wet periods.

The ECIU analysis used production data for England published in October and current grain prices, then extrapolated to the UK as a whole, a method shown to be reliable in previous years. Since 2020, which saw the worst harvest on record, lost revenue associated with extreme weather totals more than £2bn for UK arable farmers. Grain prices are set globally, so low harvests in the UK do not translate in the market to higher prices.

The link between worsening extreme weather and global heating is increasingly clear. The Met Office said the UK summer of 2025 was the hottest in more than a century of records and was made 70 times more probable by the climate crisis. Global heating also made the severe rainfall in the winter storms of 2023-24 about 20% heavier.

“This year’s harvest was extremely challenging,” said Jamie Burrows, the chair of the National Farmers’ Union combinable crops board. “Growing crops in the UK isn’t easy due to the unpredictable weather we are seeing more of. Funding is needed for climate adaptation and resilient crop varieties to safeguard our ability to feed the nation.”

The prices of some foods hit by extreme weather are rising more than four times faster than others in the average shop, the ECIU reported in October. It found the prices of butter, beef, milk, coffee and chocolate had risen by an average of 15.6% over the year, compared with 2.8% for other food and drink. Drought in the UK led to poor grass growth, hitting butter and beef production, while extreme heat and rain in west Africa pushed up cocoa prices and droughts in Brazil and Vietnam led to a surge in coffee prices.

A spokesperson for the Department for Environment, Food and Rural Affairs said farmers were stewards of the nation’s food security. “We know there are challenges in the sector and weather extremes have affected harvests,” she said. “We are backing our farmers in the face of a changing climate with the largest nature-friendly farming budget in history to grow their businesses and get more British food on our plates.”
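The ECIU method, as described above, is straightforward arithmetic: multiply the production shortfall against the 10-year average by current grain prices. In this sketch only the 20% shortfall comes from the article; the baseline tonnage and the blended price are rough assumptions chosen for illustration, so landing near the reported £800m is by construction, not a validation of the figures.

```python
# Sketch of an ECIU-style harvest-loss estimate.
# Only SHORTFALL is from the article; tonnage and price are assumed.

AVG_PRODUCTION_T = 22_000_000  # assumed 10-yr avg production, tonnes (5 crops)
SHORTFALL = 0.20               # 2025 production fell 20% vs the 10-yr average
PRICE_PER_T = 185              # assumed blended grain price, GBP per tonne

lost_tonnes = AVG_PRODUCTION_T * SHORTFALL
lost_revenue_gbp = lost_tonnes * PRICE_PER_T
print(f"~£{lost_revenue_gbp / 1e6:,.0f}m in lost production")
```

One design point worth noting: because grain prices are set globally, the shortfall can be priced at the prevailing market rate without assuming any offsetting price rise for UK farmers, which is why a simple volume-times-price estimate is defensible here.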

Realtors just forced Zillow to hide a key piece of information about buying a home. Here’s why

Until recently, when you looked at a house for sale on Zillow, you could see property-specific scores for the risk of flooding, wildfires, wind from storms and hurricanes, extreme heat, and air quality. The numbers came from First Street, a nonprofit that uses peer-reviewed methodologies to calculate “climate risk.” But Zillow recently removed those scores after pressure from CRMLS, one of the large real-estate listing services that supplies its data. “The reality is these models have been around for over five years,” says Matthew Eby, CEO of First Street, which also provides its data to sites like Realtor.com and Redfin. (Zillow started displaying the information in 2024, but Realtor.com incorporated First Street’s “Flood Scores” in 2020.) “And what’s happened is the market’s gotten very tight. And now they’re looking for ways to try and make it easier to sell homes at the expense of homebuyers.” The California Regional MLS, like others across the country, controls the database that feeds real estate listings to sites like Zillow. The organization said in a statement to the New York Times that it was “suspicious” after seeing predictions of high flood risk in areas that hadn’t flooded in the past. When Fast Company asked for an example of a location, they pointed to a neighborhood in Huntington Beach—but that area actually just flooded last week. In a statement, First Street said that it stands behind the accuracy of its scores. “Our models are built on transparent, peer-reviewed science and are continuously validated against real-world outcomes. In the CRMLS coverage area, during the Los Angeles wildfires, our maps identified over 90% of the homes that ultimately burned as being at severe or extreme risk—our highest risk rating—and 100% as having some level of risk, significantly outperforming CalFire’s official state hazard maps. So when claims are made that our models are inaccurate, we ask for evidence. 
To date, all the empirical validation shows our science is working as designed and providing better risk insight than the tools the industry has relied on historically.”

Zillow’s trust in the data has not changed, and that data is important to consumers: in one survey, the company found that more than 80% of buyers considered climate risk when shopping for a house. But Zillow said in a statement that it updated its “climate risk product experience to adhere to varying MLS requirements.” It’s not clear exactly what happened: in response to questions for this story, CRMLS now says it only asked Zillow to remove “predictive numbers” and flood map layers on listings, while Zillow says the MLS board voted to demand it block all of the data. It’s also not clear what would have happened if Zillow hadn’t made any changes, though in theory the MLS could have stopped giving the site access to its listings.

[Image: Zillow’s climate risk tools, from a 2024 press release]

Zillow still links to First Street’s website in each listing, so homebuyers can access the information, but it’s less easy to find. The site also still includes a map that consumers can use to view overall neighborhood risk, if they take the extra step to click on checkboxes for flooding, fire, or other hazards. But the main scores are gone.

Obviously, seeing that a particular house has a high flood or fire risk can hurt sales. Nevertheless, after First Street first launched, the National Association of Realtors put out guidance saying that the information was useful—and that since realtors aren’t experts in things like flood risk, they shouldn’t try to tell buyers themselves that a particular house is safe, even if it hasn’t flooded in the past.

First Street’s flood data goes further than that of the Federal Emergency Management Agency, which uses outdated flood maps.
It also incorporates more climate predictions, along with the risk of flooding from heavy rainfall and surface runoff, not just flooding from rivers or the coast. And it includes predictions of small amounts of flooding (for example, whether an inch of water is likely to reach the property). Buyers can dig deeper to figure out how much that amount of flooding might affect a particular house.

It’s not surprising that some high risk scores have upset home sellers who haven’t experienced flooding or other problems in the past. But as the climate changes, past experience doesn’t guarantee what a property will be like for the next 30 years. Take North Carolina, where some residents had never experienced flooding until Hurricane Helene dumped unprecedented rainfall on their neighborhoods.

Redfin, another site that uses the data, plans to continue providing it, though sellers have the option to ask for it to be removed from a particular home if they believe it’s inaccurate. (First Street also allows homeowners to ask for their data to be revised if there’s a problem, and then reviews the accuracy.) “Redfin will continue to provide the best possible estimates of the risks of fires, floods, and storms,” Redfin chief economist Daryl Fairweather said in a statement. “Homebuyers want to know, because losing a home in a catastrophe is heartbreaking, and insuring against these risks is getting more and more expensive.”

Realtor.com is working with CRMLS and data providers to look into the issues the MLS raised over the scores. “We aim to balance transparency about the evolving environmental risks to what is often a family’s biggest investment, with an understanding that the available data can sometimes be limited,” the company said in a statement. “For this reason we always encourage consumers to consult a local real estate professional for guidance or to learn more.
When issues are raised, we work with our data partners to review them and make updates when appropriate.”

If more real estate sites take down the scores, it’s likely that some buyers won’t see the information at all. First Street says that while it’s good that Zillow still includes a link to its site, the impact is real. “Whenever you add friction into something, it just is used less,” Eby says. “And so not having that information at the tip of your fingers is definitely going to have an impact on the millions of people that go to Zillow every day to see it.”

Researchers Slightly Lower Study's Estimate of Drop in Global Income Due to Climate Change

Researchers who examined climate change’s potential effect on the global economy say data errors led them to slightly overstate an expected drop in income over the next 25 years

The authors of a study that examined climate change's potential effect on the global economy said Wednesday that data errors led them to slightly overstate an expected drop in income over the next 25 years.

The researchers at Germany's Potsdam Institute for Climate Impact Research, writing in the journal Nature in 2024, had forecast a 19% drop in global income by 2050. Their revised analysis puts the figure at 17%.

The authors also said in their original work that there was a 99% chance that, by midcentury, it would cost more to fix damage from climate change than it would cost to build resilience. Their new analysis, not yet peer-reviewed, lowered that figure to 91%.

The Associated Press reported on the original study. Nature posted a retraction of it Wednesday.

The researchers cited data inaccuracies in the first paper, particularly with underlying economic data for Uzbekistan between 1995 and 1999 that had a large influence on the results, and said their analysis had underestimated statistical uncertainty.

Max Kotz, one of the study’s authors, told the AP that the heart of the study is unchanged: Climate change will be enormously damaging to the world economy if unchecked, and the impact will hit hardest in the lowest-income areas that contribute the fewest of the emissions driving the planet's warming.

Gernot Wagner, a climate economist at Columbia Business School who wasn't involved with the research, said the thrust of the Potsdam Institute's work remains the same “no matter which part of the range the true figure will be.”

“Climate change already hits home, quite literally. Home insurance premiums across the U.S. have already seen, in part, a doubling over the past decade alone,” Wagner said. “Rapidly accumulating climate risks will only make the numbers go up even more.”

The Associated Press’ climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content.
Find AP’s standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Climate Change Is Killing the Myth of Los Angeles

I once lived in an apartment in Los Angeles that flooded every time it rained. Not just a polite drip, either. The ceiling sagged and dripped into long wet ribbons, and the wall beside my desk would bleed water like I was playing out Barton Fink in color. I wonder how that space looks now, as Southern California comes out of a long rain event in which the hills above Altadena, at the site of January’s Eaton fire, saw nearly nine inches of rain between November 14 and November 21.

People love to talk about tanned and toned Dallas Raines, the veteran KABC meteorologist who can summon high drama from a passing low-pressure system. Or the obligatory SUV hydroplaning down the 5 Freeway. In L.A., weather banter is its own civic dialect. We rarely admit how fragile the physical city really is, and how the very places that frame our daily lives—the courtyard where you catch the first blue of morning, the balcony where you watch the hills smolder at golden hour—can start to fail the moment the skies decide to turn. Everything here is built for one type of weather. And most of the time it works. But when it doesn’t, it really doesn’t work.

L.A. has spent over a century advertising its perfect Mediterranean climate. Now increasingly frequent severe weather events are triggering citywide soul-searching about who deserves protection, which neighborhoods get resources, which elected officials are to blame, and whether the promise of this place still holds. Some parts of L.A. County picked up close to a foot of rain in 10 days in February 2023, leaving more than 80,000 Los Angeles Department of Water and Power customers without power, while unhoused residents faced flooded encampments, freezing nights, and packed shelters. Almost exactly a year later, emergency crews pulled a pregnant, unhoused woman from a storm drain above a raging river. The January 2025 fires in the Palisades and Altadena further exposed the gap between the city we imagine and the one we actually live in.
What happens when a city built on the mythology of sublime weather has to finally face how to live with a climate that refuses to stay in line?

The Los Angeles myth goes back more than a century: Between the 1880s and the 1920s, the Los Angeles Chamber of Commerce mailed millions of pamphlets eastward, selling Midwestern families on a kingdom of eternal spring. Sunkist built a national brand on winter oranges ripening while Chicago froze. Railroads sponsored booster fiction and postcards promising a life where weather was not an obstacle but an asset. In the dead of winter, “[you could] have a small, five-acre citrus farm and do really well and then hop on the streetcar and go to the beach for the day,” said professor Char Miller, a historian and environmental analysis scholar at Pomona College.

Miller has spent decades tracing how this mythology ossified. While the pitch obscured who paid the price—Indigenous communities pushed off their land, Chinese and Japanese residents marginalized or excluded—the promise endured in part because the landscape helped carry it. But for all the valleys, deserts, and coastlines, there were also floods, fires, earthquakes, and landslides: hazards mentioned only in the fine print.

There’s an old line Miller heard during his early days on the West Coast in the 1970s: “California is 90 percent paradise, 10 percent apocalypse.” It was something people once said with a kind of wry affection, the same sensibility baked into disaster films that love to see Los Angeles perpetually destroyed. It was the myth of a place that could always be rebuilt, where catastrophe was fleeting and bounty would always return. But that ratio, Miller says, is shifting, leaning more toward calamity.

It was nearly midnight in New York when my phone lit up. A friend in Los Angeles was calling to ask if I wanted him to move anything out of my apartment, which had just fallen under an evacuation order while I was back East.
Earlier that afternoon, on January 8, West Hollywood had been in the mid-70s—bone-dry, humidity in the 20s. The kind of day that feels ominous if you’ve lived here long enough to know what those numbers mean. By nightfall, another fire was creeping toward Runyon Canyon, the hiking trail so quintessentially L.A. it sometimes has a valet.

In the weeks that followed the January fires, the political blame game was relentless. Some went after Mayor Bass, others after Governor Newsom. But the fury felt like a way to avoid the harder truth of a city playing dumb about its own new climate reality.

Even while the January fires were still burning, city and state leaders promised to rebuild immediately, suspending regulations that might have slowed development in the very zones that were incinerated. “What that did was to take off the table any kind of transformation that might have slowed down the very things that that fire consumed, which is rapid growth up into fire zones,” Miller said. A recent CalMatters analysis found that nearly four million people in Southern California are living in such hazardous zones.

Climate scientist Daniel Swain told me that despite all the finger-pointing after the January fires, the forecast wasn’t the problem. Meteorologists had issued “crystal clear warnings” days ahead of time. The real issue, he suggested, is that Los Angeles still treats climate disasters as if they can be willed away, as if better heroics in the moment could out-muscle physics. “We can’t expect to have a firefighting force that can magically overcome hurricane-force winds amid record dry conditions producing a blizzard of embers in the suburbs,” Swain said. “You just can’t fight that in the moment.”

The deeper problem is structural. Southern California is one of the most fire-prone landscapes in the country, and millions now live in or immediately downwind of terrain primed to burn. Many neighborhoods haven’t seen major fire in decades, which feeds the illusion of safety.
But growth has pushed suburbs farther into the wildland-urban interface just as warming has lengthened fire season, increasing the chances that a Santa Ana wind event arrives when vegetation is crisp and unrecoverably dry. Most years won’t align as catastrophically as January did, Swain noted, but when they do, the math is unforgiving.

Work has to happen long before the flames arrive. Swain pointed to neighborhoods where community groups had already tackled vegetation management, replaced vulnerable vents, or cleared brush from wooden fences. Those blocks didn’t just fare slightly better; some avoided becoming ignition points entirely. Fire resilience, he emphasized, is cumulative: every house that doesn’t burn is one less launching pad for embers to race downwind.

The fixes aren’t always grand or expensive. Sometimes it’s a few hundred dollars for finer-mesh vents that stop embers from blowing into attics. Sometimes it’s ripping out head-high brush along a property line. Sometimes it’s insisting that new construction in fire zones meet tougher standards, or retrofitting homes that were built for a climate that no longer exists.

Swain sees the January fires as a preview of what strong Santa Ana events will look like going forward. Historically, many of the strongest Santa Ana events came after at least some winter rain. Now that rain is arriving later, meaning more wind events strike when the hills are still crisped from autumn, as was the case in January. But the problem in Los Angeles isn’t just meteorological: it is political, infrastructural, and deeply cultural.

Miller likes to point to other parts of the country that faced similar crossroads and chose differently. After catastrophic floods in 1998, San Antonio bought out homeowners in riparian zones rather than sending them back into danger. Houston did something similar after Hurricane Harvey. These weren’t mass seizures or punitive acts; they were voluntary, forward-looking buyouts at market rate.
“What if,” Miller wondered, “you went to people who were burned out in Altadena and the Palisades and said, ‘We’re going to pay you not to rebuild’?” It’s a planner’s maxim—build up, not out—but in Southern California, the political will rarely matches the topographic reality.

And yet, amid the devastation, there were signs of another kind of civic instinct. In Altadena, neighbors organized mutual aid networks at local businesses like Octavia’s Bookshelf and Bike Oven, and community leaders helped residents navigate insurance, microloans, and temporary housing. New nonprofits sprang up to support people psychologically and financially. Miller is skeptical of rebuilding policy, but he’s quick to note the human creativity that emerged in the fire’s wake—a kind of grassroots adaptation that government hasn’t yet matched.

In May, Miller remembers stepping off a plane at LAX behind someone wearing a leather jacket with two mottos curved across the back: “Never forget” on top, “Rebuild Altadena” on the bottom. “I think the bottom circle erases the top,” Miller said. “If you rebuild, you have already forgotten, because you are not paying attention to what happened and why it happened.”
