As data centers go up, North Carolina weighs how to handle energy demand

News Feed
Tuesday, September 16, 2025


In small communities across North Carolina, data centers are already sparking conflict over land use, water use, and quality of life. Now, the debate over the facilities’ voracious need for electricity — and whether it can be met with clean sources — is heating up in the state capital of Raleigh.

For months, North Carolina’s predominant utility, Duke Energy, has forecast ballooning demand from large customers like data centers: immense buildings that house the computing devices powering AI and other software that’s become part of everyday life.

Early last year, Duke projected these “large loads” would need an additional 3.9 gigawatts of capacity, equal to about four nuclear power plants and enough to serve millions of households. By May of this year, the company’s prediction had swelled to almost 6 gigawatts.
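
As a rough back-of-envelope check on those comparisons (a sketch only: the per-reactor output and average household demand below are common rule-of-thumb assumptions, not figures from Duke's filings):

```python
# Back-of-envelope check of the comparisons above. The per-reactor output and
# average household demand are rule-of-thumb assumptions, not Duke figures.
projected_large_load_gw = 3.9   # Duke's early-2024 projection

nuclear_unit_gw = 1.0     # assumed output of one large reactor, in GW
avg_household_kw = 1.2    # assumed average U.S. household demand, in kW (~10,500 kWh/yr)

plants = projected_large_load_gw / nuclear_unit_gw
households = projected_large_load_gw * 1e6 / avg_household_kw   # GW -> kW

print(f"~{plants:.0f} large nuclear units")
print(f"~{households / 1e6:.1f} million average households")
```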

The eye-popping estimates helped lead regulators to approve Duke’s current plan to build a massive new fleet of gas plants, alongside some clean energy investments, despite a state law requiring the utility to decarbonize. The projections are certain to factor into the next iteration of Duke’s long-term blueprint, a draft of which is due in the coming weeks.

The forecasts have “thrown everything out of whack,” said Nick Jimenez, senior attorney with the Southern Environmental Law Center.

That’s why his organization asked the state’s Utilities Commission to host a technical conference on large loads. Electricity-demand projections undergird virtually every Duke case before the panel. But at a technical conference, commissioners could grapple exclusively with the issues vexing energy experts across the country: How can data center demand be predicted with the most accuracy? Will the tech giants pay their fair share of grid upgrades and other costs? What will power the new facilities, and will it be carbon-free?

In June, the Utilities Commission granted the law center’s request and then some by opening an entire proceeding to debate these questions. Stakeholders had the summer to submit written comments, with responses due from Duke early this month. In-person presentations are scheduled for Oct. 14.

It’s not clear if the process will culminate in a discrete order from the commission, or simply inform the myriad other Duke cases before it. But Jimenez praised regulators for being proactive. “You need a proceeding to get your arms around some of these issues,” he said. “I think that’s really smart and forward-looking.”

The data center boom

In the race against other states to attract economic development, Duke and North Carolina officials keep confidential exactly which entities hope to draw power from the electric grid. And skeptics question whether all of the new facilities behind predictions of unprecedented demand growth will pan out.

But there’s little doubt that data centers are on the rise, propelled by the AI explosion. Researchers say they could account for 44% of U.S. load growth by 2028, and there’s ample evidence that North Carolina is following the national trend.

In June, Amazon Web Services announced a $10 billion, 800-acre computing campus in Richmond County, east of Charlotte, billed as the largest single capital project in North Carolina history. To the west of Charlotte, the development of a “data center corridor” is underway: Apple says its Catawba County site is included in its $500 billion U.S. expansion plans, and Microsoft envisions four new data centers nearby. Google is considering growing its facility in neighboring Caldwell County.

Not all communities are welcoming data centers with open arms. The town council of tiny Tarboro, an hour east of Raleigh, just voted to reject a $6.4 billion facility. In Apex, southwest of the city, opposition is mounting to a proposed “digital campus” that would displace 190 acres of farmland.

Still, early this month, Gov. Josh Stein, a Democrat and former attorney general, issued an executive order creating an “AI Accelerator” and a council designed to make the state “a national leader in AI literacy, governance, and deployment to the benefit of our residents, communities, and economy.”

Stein did note the technology’s downsides, including “the uncertainty around AI systems and their associated energy and water needs.” But his edict also reflects the seeming common wisdom of the moment: AI and its requisite facilities are multiplying and expanding, bringing economic opportunities that can outweigh their challenges.

“We can come to the table”

In the open docket before regulators, experts say that with the right policies in place, clean energy, efficiency, and related strategies can meet the moment. “We can come to the table,” said John Burns, general counsel for Carolinas Clean Energy Business Association, a trade group representing developers, manufacturers, and others in the clean energy industry.

In their comments, Burns and others particularly promoted “load flexibility,” a form of demand response in which data centers curtail their electricity use when the grid is strained by high overall demand.

Load flexibility is feasible because data centers don’t run at maximum capacity 24/7, said Tyler Norris, former special adviser at the U.S. Department of Energy and a doctoral fellow at Duke University, which has no connection to the utility.

“You never actually run the chips and the servers to 100% of their rated nameplate power,” he said. “You wouldn’t want to, because they overheat and they don’t perform as well when they’re running that hard.”

Norris is the lead author of a February paper showing that Duke’s two utilities in the Carolinas could accommodate 4.1 gigawatts of load if data centers shave just 0.5% off their peak usage annually. In a simple example, the facilities could operate at half their maximum capacity for 88 hours over the course of a year.
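
A minimal sketch of that arithmetic, assuming the 0.5% figure refers to the share of hours in a year during which load is curtailed (one way to read the result summarized above, not a statement of the paper's full methodology):

```python
# Minimal sketch of the curtailment arithmetic above. Assumes the 0.5% figure
# is the share of hours in a year during which data center load is curtailed.
HOURS_PER_YEAR = 8760
curtailment_share = 0.005                                     # 0.5% of all hours

full_curtailment_hours = curtailment_share * HOURS_PER_YEAR   # ~44 h at full curtailment
half_capacity_hours = full_curtailment_hours / 0.5            # ~88 h at 50% of max capacity

print(f"Equivalent to ~{full_curtailment_hours:.0f} hours per year of full curtailment,")
print(f"or ~{half_capacity_hours:.0f} hours per year running at half of maximum capacity.")
```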

A load-flexibility arrangement between Duke Energy and data centers could, in theory, avert the construction of several gigawatts of new gas plant capacity and expensive and time-consuming transmission upgrades.

Last month, Google announced demand-response agreements with the utilities Indiana Michigan Power and the Tennessee Valley Authority. In formal comments to the North Carolina Utilities Commission, Norris called the tech giant’s move the “first documented case where AI data center flexibility is explicitly integrated into U.S. utility planning.”

Read the full story here.

Working to eliminate barriers to adopting nuclear energy

Nuclear waste continues to be a bottleneck in the widespread use of nuclear energy, so doctoral student Dauren Sarsenbayev is developing models to address the problem.

What if there were a way to solve one of the most significant obstacles to the use of nuclear energy — the disposal of high-level nuclear waste (HLW)? Dauren Sarsenbayev, a third-year doctoral student at the MIT Department of Nuclear Science and Engineering (NSE), is addressing the challenge as part of his research.

Sarsenbayev focuses on one of the primary problems related to HLW: decay heat released by radioactive waste. The basic premise of his solution is to extract the heat from spent fuel, which simultaneously accomplishes two objectives: gaining more energy from an existing carbon-free resource while decreasing the challenges associated with the storage and handling of HLW. “The value of carbon-free energy continues to rise each year, and we want to extract as much of it as possible,” Sarsenbayev explains.

While the safe management and disposal of HLW has seen significant progress, there is room for more creative ways to manage or take advantage of the waste. Such a move would be especially important for the public’s acceptance of nuclear energy. “We’re reframing the problem of nuclear waste, transforming it from a liability to an energy source,” Sarsenbayev says.

The nuances of nuclear

Sarsenbayev had to do a bit of reframing himself in how he perceived nuclear energy. Growing up in Almaty, the largest city in Kazakhstan, he saw the collective trauma of Soviet nuclear testing loom large over the public consciousness. Not only does the country, once a part of the Soviet Union, carry the scars of nuclear weapon testing; Kazakhstan is also the world’s largest producer of uranium. It’s hard to escape the hold of such a legacy on the collective psyche.

At the same time, Sarsenbayev saw his native Almaty choking under heavy smog every winter due to the burning of fossil fuels for heat. Determined to do his part to accelerate decarbonization, Sarsenbayev gravitated to undergraduate studies in environmental engineering at Kazakh-German University. It was during this time that he realized practically every energy source, even the promising renewable ones, came with challenges, and decided nuclear was the way to go for its reliable, low-carbon power. “I was exposed to air pollution from childhood; the horizon would be just black. The biggest incentive for me with nuclear power was that as long as we did it properly, people could breathe cleaner air,” Sarsenbayev says.

Studying transport of radionuclides

Part of “doing nuclear properly” involves studying — and reliably predicting — the long-term behavior of radionuclides in geological repositories. Sarsenbayev discovered an interest in nuclear waste management during an internship at Lawrence Berkeley National Laboratory as a junior undergraduate student.

While at Berkeley, Sarsenbayev focused on modeling the transport of radionuclides from the nuclear waste repository’s barrier system to the surrounding host rock. He learned how to use the tools of the trade to predict long-term behavior. “As an undergrad, I was really fascinated by how far in the future something could be predicted. It’s kind of like foreseeing what future generations will encounter,” Sarsenbayev says.

The timing of the Berkeley internship was fortuitous. It was at the laboratory that he worked with Haruko Murakami Wainwright, who was herself getting started at MIT NSE. (Wainwright is the Mitsui Career Development Professor in Contemporary Technology, and an assistant professor of NSE and of civil and environmental engineering.)

Looking to pursue graduate studies in nuclear waste management, Sarsenbayev followed Wainwright to MIT, where he has further researched the modeling of radionuclide transport. He is the first author on a paper that details mechanisms to increase the robustness of models describing the transport of radionuclides. The work captures the complexity of interactions between engineered barrier components, including cement-based materials and clay barriers, the typical medium proposed for the storage and disposal of spent nuclear fuel.

Sarsenbayev is pleased with the results of the model’s predictions, which closely mirror experiments conducted at the Mont Terri research site in Switzerland, famous for studies of the interactions between cement and clay. “I was fortunate to work with Doctor Carl Steefel and Professor Christophe Tournassat, leading experts in computational geochemistry,” he says.

Real-life transport mechanisms involve many physical and chemical processes, the complexities of which increase the size of the computational model dramatically. Reactive transport modeling — which combines the simulation of fluid flow, chemical reactions, and the transport of substances through subsurface media — has evolved significantly over the past few decades. However, running accurate simulations comes with trade-offs: The software can require days to weeks of computing time on high-performance clusters running in parallel.

To arrive at results faster, Sarsenbayev is developing a framework that integrates AI-based “surrogate models,” which train on simulated data and approximate the physical systems. The AI algorithms make predictions of radionuclide behavior that are faster and less computationally intensive than the traditional equivalent.

Doctoral research focus

Sarsenbayev is using his modeling expertise in his primary doctoral work as well — in evaluating the potential of spent nuclear fuel as an anthropogenic geothermal energy source. “In fact, geothermal heat is largely due to the natural decay of radioisotopes in Earth’s crust, so using decay heat from spent fuel is conceptually similar,” he says. A canister of nuclear waste can generate, under conservative assumptions, the energy equivalent of 1,000 square meters (a little under a quarter of an acre) of solar panels.

Because the potential for heat from a canister is significant — a typical one, depending on how long it was cooled in the spent fuel pool, has a temperature of around 150 degrees Celsius — but not enormous, extracting heat from this source relies on a binary cycle system. In such a system, heat is extracted indirectly: the canister warms a closed water loop, which in turn transfers that heat to a secondary low-boiling-point fluid that powers the turbine.

Sarsenbayev’s work develops a conceptual model of a binary-cycle geothermal system powered by heat from high-level radioactive waste. Early modeling results have been published and look promising. While the potential for such energy extraction is at the proof-of-concept stage in modeling, Sarsenbayev is hopeful that it will find success when translated to practice.

“Converting a liability into an energy source is what we want, and this solution delivers,” he says.

Despite work being all-consuming — “I’m almost obsessed with and love my work” — Sarsenbayev finds time to write reflective poetry in both Kazakh, his native language, and Russian, which he learned growing up. He’s also enamored of astrophotography, taking pictures of celestial bodies. Finding the right night sky can be a challenge, but the canyons near his home in Almaty are an especially good fit. He goes on photography sessions whenever he visits home for the holidays, and his love for Almaty shines through. “Almaty means 'the place where apples originated.' This part of Central Asia is very beautiful; although we have environmental pollution, this is a place with a rich history,” Sarsenbayev says.

Sarsenbayev is especially keen on finding ways to communicate both the arts and sciences to future generations. “Obviously, you have to be technically rigorous and get the modeling right, but you also have to understand and convey the broader picture of why you’re doing the work, what the end goal is,” he says. Through that lens, the impact of Sarsenbayev’s doctoral work is significant. The end goal? Removing the bottleneck for nuclear energy adoption by producing carbon-free power and ensuring the safe disposal of radioactive waste.
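
To illustrate the surrogate-model idea described above in generic terms: a fast statistical model is trained on input/output pairs from an expensive physics simulation and then stands in for it. The sketch below uses a deliberately simplified, made-up "simulator" (a decaying Gaussian concentration profile) and assumed parameter ranges; it is not Sarsenbayev's reactive transport code or his actual workflow.

```python
# Toy sketch of an AI surrogate model: train a cheap regressor on runs of an
# "expensive" simulator, then use it to approximate new runs. The simulator
# here is a made-up analytical stand-in, not a real reactive transport code.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def toy_simulator(diffusivity, decay_rate, distance, time=1000.0):
    """Relative concentration at `distance` (m) after `time` years: a decaying
    Gaussian plume used as a placeholder for a full reactive transport run."""
    return np.exp(-decay_rate * time) * np.exp(-distance**2 / (4.0 * diffusivity * time))

# Sample assumed parameter ranges and run the "simulator" to build training data.
n = 2000
X = np.column_stack([
    rng.uniform(1e-3, 1e-1, n),   # effective diffusivity, m^2/yr (assumed range)
    rng.uniform(1e-5, 1e-3, n),   # decay constant, 1/yr (assumed range)
    rng.uniform(0.1, 10.0, n),    # distance from the waste package, m
])
y = toy_simulator(X[:, 0], X[:, 1], X[:, 2])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = GradientBoostingRegressor(random_state=0).fit(X_train, y_train)

# Once trained, the surrogate answers new queries almost instantly instead of
# requiring another full simulation.
print("Held-out R^2:", round(surrogate.score(X_test, y_test), 3))
```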

The EPA was considering a massive lead cleanup in Omaha. Then Trump shifted guidance.

Tens of thousands of Omahans have lead in their yards at levels that experts say are dangerous, especially for kids. Growing momentum to do more cleanup in what’s already the nation’s largest residential lead Superfund site now may stall.

The county health worker scanned the Omaha home with an X-ray gun, searching for the poison. It was 2022, and doctors had recently found high levels of lead in the blood of Crystalyn Prine’s 2-year-old son, prompting the Douglas County Health Department to investigate. The worker said it didn’t seem to come from the walls, where any lead would be buried under layers of smooth paint. The lead assessor swabbed the floors for dust but didn’t find answers as to how Prine’s son had been exposed.

A danger did lurk outside, the worker told her. For more than a century, a smelter and other factories had spewed lead-laced smoke across the city’s east side, leading the federal government to declare a huge swath of Omaha a Superfund site and to dig up and replace nearly 14,000 yards — including about a third of the east side’s residential properties — since 1999. Prine looked up the soil tests for her home online and discovered her yard contained potentially harmful levels of lead. But when she called the city, officials told her that her home didn’t qualify for government-funded cleanup under the standard in place from the U.S. Environmental Protection Agency.

Prine didn’t want to move out of the home that had been in her husband’s family for generations. So she followed the county’s advice to keep her five kids safe. They washed their hands frequently and took off their shoes when they came inside. Then, Prine heard some news at the clinic where she worked as a nurse that gave her hope: In January 2024, the EPA under President Joe Biden lowered the lead levels that could trigger cleanup. Her home was above the new threshold.

[Photo: On a recent Sunday morning, 5-year-old Jack Prine, left, plays with his 2-year-old brother at home. Tests showed lead in the blood of both children. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

That didn’t automatically mean her yard would be cleaned up, local officials told her, but last year, the EPA began to study the possibility of cleaning up tens of thousands of more yards in Omaha, according to emails and other records obtained by the Flatwater Free Press and ProPublica. The agency was also discussing with local officials whether to expand the cleanup area to other parts of Omaha and its surrounding suburbs.

Then, this October, the Trump administration rolled back the Biden administration’s guidance. In doing so, it tripled the amount of lead that had to be in the soil to warrant a potential cleanup, meaning that Prine and other families might again be out of luck.

Prine’s son Jack, now 5, struggles to speak. He talks less than his 2-year-old brother and stumbles over five-word sentences. “You would think that if lead is this impactful on a small child, that you would definitely want to be fixing it,” she said. “What do you do as a parent? I don’t want to keep my kid from playing outside. He loves playing outside, and I should be able to do that in my own yard.”

Scientists have long agreed about the dangers of lead. The toxic metal can get into kids’ brains and nervous systems, causing IQ loss and developmental delays. Experts say the Trump administration’s guidance runs counter to decades of research: In the 26 years since the government began to clean up east Omaha — the largest residential lead Superfund site in the country — scientists have found harm at ever lower levels of exposure. Yet what gets cleaned up is often not just a matter of science but also money and government priorities, according to experts who have studied the Superfund program.

[Photo: Crystalyn Prine holds hands with her 6-month-old daughter. Tests found lead in the blood of two of her other children. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

Prine’s block illustrates how widespread Omaha’s lead problem is and how many people who might have benefited from the Biden guidance may no longer get relief. Of the 11 homes on her block, four were cleaned up by the EPA. Six others tested below the original cleanup standard but above the levels in the Biden guidance and were never remediated.

The Flatwater Free Press and ProPublica are embarking on a yearlong project about Omaha’s lead legacy, including testing soil to find out how effective the cleanup has been. If you live in or near the affected area, you can sign up for free lead testing of your soil.

Despite the changing guidance, Omaha still follows a cleanup standard set in 2009: Properties qualify for cleanup if parts of the yard have more than 400 parts per million of lead in the soil — the equivalent of a marble in a 10-pound bucket of dirt. The Biden administration lowered the guidance for so-called removal management levels to 200 parts per million. The Trump administration has said its new guidance, which raised them to 600 parts per million, would speed cleanups by providing clearer direction and streamlining investigations of contaminated sites.

But environmental advocates said it only accelerates project completion by cleaning up fewer properties. The EPA disputed that. “Protecting communities from lead exposure at contaminated sites is EPA’s statutory responsibility and a top priority for the Trump EPA,” the agency said in a statement. “The criticism that our Residential Soil Lead Directive will result in EPA doing less is false.”

The new guidance doesn’t necessarily scrap the hopes of Omaha homeowners or the conversations that were happening around the Biden recommendations. That’s because the Trump administration continues to allow EPA managers to study properties with lower levels of lead, depending on how widespread the contamination is and how likely people are to be harmed. What actually gets cleaned up is decided by local EPA officials, who can set remediation levels higher or lower based on the circumstances of specific sites. Regional EPA spokesperson Kellen Ashford said the agency is continuing to assess the Omaha site and will meet with local and state leaders to “chart a path forward with how the updated residential lead directive may apply.”

[Photo: More than 25 years after the EPA declared Omaha’s east side a Superfund site, the city is still working to clean up lead-contaminated properties, including this vacant lot. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

Gabriel Filippelli, executive director of Indiana University’s Environmental Resilience Institute, has studied lead and Superfund sites for decades and said he is doubtful the EPA will spend the money to clean up more yards in Omaha. The EPA doesn’t act if “you don’t have local people raising alarm bells,” he said.

Yet in Omaha, many are unaware of the debate — or even the presence of lead in their yards. Most of the cleanup happened more than a decade ago. As years passed, new people moved in, and younger residents never learned about the site. Others who did know assumed the lead problem was solved. The dustup around lead has mostly settled even if much of the toxic metal in the city’s dirt never left.
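
For a sense of scale, the thresholds above translate into just a few grams of lead per bucket of soil. The conversion below is simple arithmetic using the ppm levels reported in the story; the comparison to a small glass marble, which weighs only a few grams, is an assumed point of reference.

```python
# Convert the soil-lead thresholds above into grams of lead in a 10-pound
# bucket of dirt. The ppm values are from the article; a small glass marble
# weighing a few grams is the assumed point of comparison.
LB_TO_G = 453.6
bucket_g = 10 * LB_TO_G   # about 4,536 g of soil

thresholds_ppm = {
    "2009 cleanup standard": 400,
    "2024 Biden guidance": 200,
    "2025 Trump guidance": 600,
}

for label, ppm in thresholds_ppm.items():
    grams_of_lead = bucket_g * ppm / 1e6
    print(f"{label}: {ppm} ppm -> ~{grams_of_lead:.1f} g of lead per 10-lb bucket")
```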
“Mass poison”

When Prine moved into Omaha’s Field Club neighborhood in 2018, she loved the Queen Anne and Victorian-style homes that lined shady boulevards and how her neighbors decorated heavily for Halloween and Christmas. While she had visited the home previously to see her husband’s family, Prine had no idea her neighborhood was in the middle of a massive environmental cleanup. “The first time I heard about it was when my son had an elevated blood-lead level,” she said.

From 1870 to 1997, the American Smelting and Refining Company sat on the Missouri River in downtown Omaha, melting and refining so much lead to make batteries, cover cables and enrich gasoline that it was once the largest operation in the country, according to a 1949 newspaper article. By the 1970s, researchers had proven lead was poisoning American children. Doctors in Omaha noticed kids with elevated blood-lead levels and published findings connecting the toxic metal in their bodies to the smoke pouring out of ASARCO and other polluters.

[Photo: The view of Omaha’s riverfront in 1968. Omaha factories, primarily a lead smelter, deposited 400 million pounds of the toxic metal across the city over more than a century. Courtesy of the Omaha World-Herald]

In the late 1990s, when city leaders wanted to demolish ASARCO and redevelop the site into a riverfront park, they had to figure out how to clean up Omaha’s lead legacy. They turned to the EPA, which declared a 27-square-mile swath of east Omaha a Superfund site, a federal designation that would allow the agency to clean up the contamination and try to hold the polluters responsible to pay for it. The agency estimated the smelter, along with other polluters, had spewed about 400 million pounds of lead dust over an area where 125,000 people, including 14,000 young children, lived. The EPA won $246 million in settlements from ASARCO and others to fund the cleanup.

By 2015, most of the yards that tested above 400 parts per million had their soil replaced, and the EPA handed the remaining work to the city. The old smelter site was redeveloped into a science museum with a playground outside. The project seemed like a success. The number of kids testing high for lead has dropped dramatically since the 1990s, though similar patterns exist nationwide and fewer than half the kids in the site are tested annually, according to data from the Douglas County Health Department.

But evidence had already been emerging that the cleanup levels the EPA had set in Omaha “may not protect children,” which the agency acknowledged in 2019, during the first Trump administration. Managers wrote in a site review that “increasing evidence supports a lower blood-lead level of concern” than the 1994 health guidance that informed the cleanup plan. Lead, even in incredibly small amounts, can build up in the brains, bones or organs of children as well as adults, said Bruce Lanphear, a professor at Simon Fraser University in Canada who has studied lead for decades. “Lead represents the largest mass poison in human history,” he said.

[Photo: The site of the former American Smelting and Refining Company, long known in Omaha as the ASARCO plant, is now home to the Kiewit Luminarium. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

After the Centers for Disease Control and Prevention lowered its blood-lead level standard, the EPA’s Office of Superfund Remediation and Technology Innovation began working on new lead cleanup guidance for the EPA regions in 2012, said James Woolford, director of the office from 2006 to 2020. The EPA took a “cautious, studied” approach to how much lead in dirt is acceptable. “Zero was obviously the preference. But what could you do given what’s in the environment?” he asked. “And so we were kind of stuck there.”

Then, in 2024, Biden stepped in. If regional EPA officials applied the administration’s guidance to the Omaha site, over 13,000 more properties in Omaha could have qualified, a Flatwater Free Press and ProPublica analysis of EPA and City of Omaha soil tests found. The number could have been even higher, records show. Nearly 27,000 properties, including those that never received cleanup and those that received partial cleanup, would have been eligible for further evaluation, EPA manager Preston Law wrote to a state environmental official in March 2024.

The EPA had also been discussing with city and state officials whether to expand the cleanup area: A map that an EPA contractor created with a computer model to simulate the smelter’s plume shows that it likely stretched 23 miles north to south across five counties in Nebraska and Iowa.

[Photo: A computer-simulated map shows the smelter’s plume stretching 23 miles north to south across five counties in Nebraska and Iowa. The model was created by an EPA contractor in 2024 as part of a new assessment of the site. Map obtained by Flatwater Free Press and ProPublica]

But cleaning up all the properties to the Biden levels could cost more than $800 million, the then-interim director of the Nebraska Department of Energy and Environment, Thaddeus Fineran, wrote to the EPA’s administrator in May 2024. If cleanup costs exceeded the funds set aside from Omaha’s settlements, the EPA would have to dip into the federal Superfund trust fund, which generally requires a 10% match from the state, said Ashford, the EPA spokesperson. That could mean a contribution of $80 million or more from Nebraska, which is already facing a $471 million budget deficit. In the letter, Fineran wrote that the state would “reserve the right to challenge the Updated Lead Soil Guidance and any actions taken in furtherance thereof.” The Nebraska Department of Water, Energy, and Environment, as the agency is now called, declined an interview, referring questions to the EPA.

Researchers and decision-makers are likely taking a cautious approach toward what they agree to clean up in Omaha, Woolford said. Given its size, it could carry weight elsewhere. “It will set the baseline for sites across the country,” he said.

“Hollow” claims

The Trump administration may upend any plans to expand the cleanup. In March, the EPA announced what it called the “biggest deregulatory action in U.S. history.” By July, about 1 in 5 employees who worked for the EPA when Trump took office were gone. The administration proposed slashing the EPA’s budget in half.

The administration promised to prioritize Superfund cleanups. But in October, it changed the lead guidance. As a result, more people will be at risk of absorbing damaging amounts of lead into their bodies, said Tom Neltner, national director for the advocacy organization Unleaded Kids. “It signals that the claims that lead is a priority for them are hollow,” he said.

The Trump administration said Biden’s approach had “inconsistencies and inefficiencies” that led to “analysis paralysis” and slowed projects down. “Children can’t wait years for us to put a shovel in the dirt to clean up the areas where they live and play,” EPA Administrator Lee Zeldin said in a statement.

[Photo: To avoid the lead-contaminated soil in their yard, the Prine children play only on the back patio and sidewalk. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

Under the guidance, the EPA could issue a lower standard for the Omaha site. But Robert Weinstock, director of Northwestern University’s Environmental Advocacy Center, said that’s unlikely unless the state sets a lower state standard than the EPA.

Trump’s guidance has some advantages in being more clear, said Filippelli of Indiana University. The Biden guidance seemed overly ambitious: Filippelli and other researchers estimated 1 in 4 American homes could have qualified for cleanup with an estimated cost of $290 billion to $1.2 trillion. While Omaha could be the litmus test for how low the Trump EPA is willing to set cleanup standards, the new guidelines don’t inspire confidence that the administration will do more to clean up old sites where work is nearly finished. “I imagine the inertia would be just to say, ‘Oh, we’re done with Omaha,’” he said.

[Photo: Steve Zivny, program manager of Omaha’s Lead Information Office. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

The city has received no timeline from the EPA, said Steve Zivny, program manager of Omaha’s Lead Information Office. He’s guessing money will play a big part in the decision over whether to clean up at a lower lead level, though. About $90 million of the Omaha Superfund settlement remains. “If the data is there and the science is there and the money’s there, I think we would expect it to be lowered,” Zivny said. “But there’s just so many factors that are not really in our control.”

If cleanup levels aren’t lowered in Omaha, advocates will have more work to do, said Kiley Petersmith, an assistant professor at Nebraska Methodist College who until recently oversaw a statewide blood-lead testing program. “I think we’re just gonna have to rally together to do more to prevent it from getting from our environment into our kids,” she said.

A buried issue

Despite the cleanup efforts, Omahans are still exposed at higher rates compared with the national average, said Dr. Egg Qin, an epidemiologist at the University of Nebraska Medical Center who has studied the Superfund site. Yet the city seems to be moving on, he said. “Somebody needs to take the responsibility,” Qin said, “to make sure the community knows lead poisoning still exists significantly in Omaha.” About 40% of the 398 people who have already signed up to have their soil tested by Flatwater Free Press and ProPublica said they did not feel knowledgeable about the history of lead contamination in Omaha.

[Photo: Like the Prines, Omaha resident Vanessa Ballard takes care to not wear shoes in her home to avoid high levels of lead-contaminated soil. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

That may in part be due to disclosure rules. When a person sells a home, state and federal law requires them to share any knowledge about lead hazards. The EPA’s original cleanup plan from 2009 says that should include providing buyers with soil test results. But in most cases, there can be very little disclosure, said Tim Reeder, a real estate agent who works in the Superfund site.

Omaha’s association of real estate agents provides a map of the Superfund site to give to buyers, along with some basic information, if the home is within the boundaries. City and local health officials spread the word about lead through neighborhood meetings, local TV interviews and billboards. But most people don’t take it seriously until someone they know tests high, Petersmith said. “Unfortunately, once it affects them personally, like if their child or grandchild or cousin has lead exposure, then it’s too late,” she said.

When Omaha pediatrician Katie MacKrell moved into a house in the Dundee neighborhood, she thought her kids were fine to play in the yard. Her son sucked his thumb. Her daughter dropped her pacifier and put it back in. When their kids both tested high for lead, MacKrell and her husband went to work fixing lead paint issues in the house. When it came to the yard, her property tested for lead levels above the Biden guidance but didn’t qualify under the original cleanup threshold. And without government help, it could cost the couple more than $10,000 to pay for the remediation themselves.

[Photo: Vanessa Ballard sits with her 19-month-old son, DiVine Cronin, as he plays with a new toy at home. Ballard covers the windows in her home with plastic to keep DiVine and her 5-year-old, MJ Collins, from touching the lead paint and to prevent lead-contaminated dust from blowing inside. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

The lead also caught Vanessa Ballard, a high school teacher and mom of two young boys, by surprise. She had imagined growing fruit trees in her backyard until she discovered lead levels high enough to potentially clean up under the Biden guidelines. Now, no one goes in the backyard. Her oldest son splashes in soapy water after making tracks for his Hot Wheels cars in the dirt, and she mixes droplets of iron with the kids’ juice every night to help their bodies repel lead. “I have no hand in the cause of this, but I have all the responsibility in the prevention of it harming me and my family,” she said.

Prine will never know whether lead stunted Jack’s speech development, but she worries about it every day. Starting kindergarten helped. But her son is still behind other kids. Prine said she tries to put on a brave face, to believe one day he’ll catch up. If he doesn’t, it’s hard not to suspect the culprit could be in her soil.

[Photo: MJ Collins, Vanessa Ballard’s 5-year-old son, at home. Ballard takes steps to protect her children from the lead present in the family’s yard. Rebecca S. Gratz for ProPublica and the Flatwater Free Press]

It seemed the government, at least for a short while, agreed. Now she, and so many others in Omaha, don’t know when, if ever, to expect a solution. “Why does it take so long, when they say it’s not safe, to then come in and say, ‘We’re gonna take this seriously?’” Prine asked. “‘That we’re gonna help these kids and protect them?’”

Flatwater Free Press is continuing to report on lead contamination in Omaha. If you live in or near the Superfund site in Omaha and want to know if you’ve been exposed to lead, sign up for Flatwater Free Press and ProPublica’s free soil testing. This reporting will help fuel investigative journalism about the largest residential lead Superfund site and the health risks it poses, especially to children. Reporting was contributed by Cassandra Garibay of ProPublica, Destiny Herbers of Flatwater Free Press and Leah Keinama of Nebraska Journalism Trust.
This story was originally published by Grist with the headline “The EPA was considering a massive lead cleanup in Omaha. Then Trump shifted guidance.” on Dec 14, 2025.

Making clean energy investments more successful

Tools for forecasting and modeling technological improvements and the impacts of policy decisions can result in more effective and impactful decision-making.

Governments and companies constantly face decisions about how to allocate finite amounts of money to clean energy technologies that can make a difference to the world’s climate, its economies, and society as a whole. The process is inherently uncertain, but research can help predict which technologies are most likely to succeed, and grounding these choices in data can lead to more informed decisions that produce the desired results.

The role of these predictive tools, and the areas where further research is needed, are addressed in a perspective article published Nov. 24 in Nature Energy by professor Jessika Trancik of MIT’s Sociotechnical Systems Research Center and Institute of Data, Systems, and Society and 13 co-authors from institutions around the world.

She and her co-authors span engineering and social science and share “a common interest in understanding how to best use data and models to inform decisions that influence how technology evolves,” Trancik says. They are interested in “analyzing many evolving technologies — rather than focusing on developing only one particular technology — to understand which ones can deliver.” Their paper is aimed at companies and governments, as well as researchers. “Increasingly, companies have as much agency as governments over these technology portfolio decisions,” she says, “although government policy can still do a lot because it can provide a sort of signal across the market.”

The study looked at three stages of the process: forecasting the technological changes that are likely to play important roles in coming years, assessing how those changes could affect economic, social, and environmental conditions, and finally, applying these insights to decision-making processes as they occur.

Forecasting usually falls into two categories, data-driven or expert-driven, or a combination of the two. That provides an estimate of how technologies may be improving, as well as an estimate of the uncertainties in those predictions. Then, in the next step, a variety of models are applied that are “very wide ranging,” Trancik says, “different models that cover energy systems, transportation systems, electricity, and also integrated assessment models that look at the impact of technology on the environment and on the economy.”

The third step is “finding structured ways to use the information from predictive models to interact with people that may be using that information to inform their decision-making process,” she says. “In all three of these steps, how you need to recognize the vast uncertainty and tease out the predictive aspects. How you deal with uncertainty is really important.”

In the implementation of these decisions, “people may have different objectives, or they may have the same objective but different beliefs about how to get there. And so, part of the research is bringing in this quantitative analysis, these research results, into that process,” Trancik says. And a very important aspect of that third step, she adds, is “recognizing that it’s not just about presenting the model results and saying, ‘here you go, this is the right answer.’ Rather, you have to bring people into the process of designing the studies and interacting with the modeling results.”

She adds that “the role of research is to provide information to, in this case, the decision-making processes. It’s not the role of the researchers to push for one outcome or another, in terms of balancing the trade-offs,” such as between economic, environmental, and social equity concerns. It’s about providing information, not just for the decision-makers themselves, but also for the public who may influence those decisions. “I do think it’s relevant for the public to think about this, and to think about the agency that actually they could have over how technology is evolving.”

In the study, the team highlighted priorities for further research. Those priorities, Trancik says, include “streamlining and validating models, and also streamlining data collection,” because these days “we often have more data than we need, just tons of data,” and yet “there’s often a scarcity of data in certain key areas like technology performance and evolution. How technologies evolve is just so important in influencing our daily lives, yet it’s hard sometimes to access good representative data on what’s actually happening with this technology.” But she sees opportunities for concerted efforts to assemble large, comprehensive data on technology from publicly available sources.

Trancik points out that many models are developed to represent some real-world process, and “it’s very important to test how well that model does against reality,” for example by using the model to “predict” some event whose outcome is already known and then “seeing how far off you are.” That’s easier to do with a more streamlined model, she says. “It’s tempting to develop a model that includes many, many parameters and lots of different detail. But often what you need to do is only include detail that’s relevant for the particular question you’re asking, and that allows you to make your model simpler.” Sometimes that means you can simplify the decision down to just solving an equation, and other times, “you need to simulate things, but you can still validate the model against real-world data that you have.”

“The scale of energy and climate problems mean there is much more to do,” says Gregory Nemet, faculty chair in business and regulation at the University of Wisconsin at Madison, who was a co-author of the paper. He adds, “while we can’t accurately forecast individual technologies on their own, a variety of methods have been developed that in conjunction can enable decision-makers to make public dollars go much further, and enhance the likelihood that future investments create strong public benefits.”

This work is perhaps particularly relevant now, Trancik says, in helping to address global challenges including climate change and meeting energy demand, which were in focus at the global climate conference COP 30 that just took place in Brazil. “I think with big societal challenges like climate change, always a key question is, ‘how do you make progress with limited time and limited financial resources?’” This research, she stresses, “is all about that. It’s about using data, using knowledge that’s out there, expertise that’s out there, drawing out the relevant parts of all of that, to allow people and society to be more deliberate and successful about how they’re making decisions about investing in technology.”

As with other areas such as epidemiology, where the power of analytical forecasting may be more widely appreciated, she says, “in other areas of technology as well, there’s a lot we can do to anticipate where things are going, how technology is evolving at the global or at the national scale … There are these macro-level trends that you can steer in certain directions, that we actually have more agency over as a society than we might recognize.”

The study included researchers in Massachusetts, Wisconsin, Colorado, Maryland, Maine, California, Austria, Norway, Mexico, Finland, Italy, the U.K., and the Netherlands.
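
The validation step Trancik describes, using a model to "predict" outcomes that are already known, can be illustrated with a small backtest. The sketch below fits a learning-curve (Wright's law) model, a common choice in the technology-forecasting literature but not necessarily the method used in the Nature Energy paper, to a synthetic cost series and checks it against held-out data points.

```python
# Backtesting sketch in the spirit of the validation step described above:
# fit a simple data-driven model on early observations, then "predict" later
# observations whose outcomes are already known. The cost series is synthetic,
# and Wright's law is just one common model choice, not the paper's method.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic history: cumulative deployment (GW) and noisy unit cost ($/W).
cum_deployment = np.array([1, 2, 4, 8, 16, 32, 64, 128, 256], dtype=float)
unit_cost = 10.0 * cum_deployment**-0.32 * np.exp(rng.normal(0.0, 0.05, cum_deployment.size))

# Hold out the last three observations and fit Wright's law (log-log linear).
train = slice(0, -3)
slope, intercept = np.polyfit(np.log(cum_deployment[train]), np.log(unit_cost[train]), 1)
predicted = np.exp(intercept) * cum_deployment[-3:]**slope

learning_rate = 1.0 - 2.0**slope                        # cost drop per doubling
pct_errors = 100.0 * (predicted - unit_cost[-3:]) / unit_cost[-3:]

print(f"Estimated learning rate: {learning_rate:.0%}")
print("Backtest errors on held-out points (%):", np.round(pct_errors, 1))
```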

German Coalition Agrees to Fast-Track Infrastructure, Scrap Unpopular Heating Law


BERLIN, Dec 11 (Reuters) - Germany's ruling coalition has agreed a new law to fast-track infrastructure projects and to scrap clean-heating legislation in favour of a broader law on modernising buildings, Chancellor Friedrich Merz said on Thursday.

Merz's government, which took power seven months ago, has pledged to revive Germany's sluggish economy, Europe's largest, by accelerating projects to improve infrastructure.

The conservative chancellor said a wide range of transport schemes would be classified as being of "overriding public interest" under the new law, giving them priority in planning and approval processes. All related administrative procedures will move to a "digital only" standard intended to shorten timelines, while electrifying rail lines of up to 60 kilometres (37 miles) will no longer require an environmental impact assessment, he said.

"Environmental protection remains important but it can no longer block urgently needed measures through endless procedures," Merz told a press conference following Wednesday evening's cabinet meeting.

Germany was long admired for the efficiency of its infrastructure but has been increasingly criticised for letting it decay due to successive governments' aversion to taking on new debt. Breaking with that fiscal tradition, Merz's government earlier this year pushed through debt reforms to borrow hundreds of billions of euros in a special fund, though critics say some of that fiscal firepower has been used to prop up day-to-day spending.

MORE FLEXIBILITY ON TECHNOLOGY CHOICES

On heating, Merz confirmed the coalition would scrap a contested law that requires most newly installed systems to run largely on renewable energy. The measure, pushed through by the previous centre-left government, triggered a backlash from homeowners and opposition parties and was widely seen as contributing to a sharp slump in support for the coalition that eventually collapsed.

The revamped Building Modernisation Act will keep the goal of cutting emissions from buildings but give households more flexibility over technology choices and timelines. The government plans to send it to parliament by next spring.

With five state elections looming next year, Merz's conservatives and their junior coalition partner, the centre-left Social Democrats, need some wins after a series of political blunders. Support for both parties has dropped since February's federal election, while the far-right Alternative for Germany has shot into pole position in nationwide surveys.

(Reporting by Sarah Marsh; editing by Matthias Williams and Gareth Jones)

Copyright 2025 Thomson Reuters.
