
Explosion of power-hungry data centers could derail California clean energy goals

News Feed
Monday, August 12, 2024


Experts warn that a frenzy of data center construction could delay California's transition away from fossil fuels and raise everyone's electric bills.

Near the Salton Sea, a company plans to build a data center to support artificial intelligence that would cover land the size of 15 football fields and require power that could support 425,000 homes.

In Santa Clara — the heart of Silicon Valley — electric rates are rising as the municipal utility spends heavily on transmission lines and other infrastructure to accommodate the voracious power demand from more than 50 data centers, which now consume 60% of the city’s electricity.

And earlier this year, Pacific Gas & Electric told investors that its customers have proposed more than two dozen data centers, requiring 3.5 gigawatts of power — the output of three new nuclear reactors.


Vantage Data Center in Santa Clara is equipped with its own electrical substations.

(Paul Kuroda / For The Times)

While the benefits and risks of AI continue to be debated, one thing is clear: The technology is rapacious for power. Experts warn that the frenzy of data center construction could delay California’s transition away from fossil fuels and raise electric bills for everyone else. The data centers’ insatiable appetite for electricity, they say, also increases the risk of blackouts.

Even now, California is on the verge of not having enough power. An analysis of public data by the nonprofit GridClue ranks California 49th of the 50 states in resilience — or the ability to avoid blackouts by having more electricity available than homes and businesses need at peak hours.

“California is working itself into a precarious position,” said Thomas Popik, president of the Foundation for Resilient Societies, which created GridClue to educate the public on threats posed by increasing power use.

The state has already extended the lives of Pacific Gas & Electric Co.’s Diablo Canyon nuclear plant as well as some natural gas-fueled plants in an attempt to avoid blackouts on sweltering days when power use surges.

Worried that California could no longer predict its need for power because of fast-rising use, an association of locally run electricity providers called on state officials in May to immediately analyze how quickly demand was increasing.

The California Community Choice Assn. sent its letter to the state energy commission after officials had to revise their annual forecast of power demand upward because of skyrocketing use by Santa Clara’s dozens of data centers.


A large NTT data center rises in a Santa Clara neighborhood.

(Paul Kuroda / For The Times)

The facilities, giant warehouses of computer servers, have long been big power users. They support all that Americans do on the internet — from online shopping to streaming Netflix to watching influencers on TikTok.

But the specialized chips required for generative AI use far more electricity — and water — than those that support the typical internet search because they are designed to read through vast amounts of data.

A ChatGPT-powered search, according to the International Energy Agency, consumes 10 times as much power as a Google search without AI.

And because those new chips generate so much heat, more power and water are required to keep them cool.

“I’m just surprised that the state isn’t tracking this, with so much attention on power and water use here in California,” said Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside.

Ren and his colleagues calculated that global AI use could require as much fresh water in 2027 as four to six countries the size of Denmark now use.

Driving the data center construction is money. Today’s stock market rewards companies that say they are investing in AI. Electric utilities profit as power use rises. And local governments benefit from the property taxes paid by data centers.


Transmission lines are reflected on the side of the NTT data center in Santa Clara.

(Paul Kuroda / For The Times)

Silicon Valley is the world’s epicenter of AI, with some of the biggest developers headquartered there, including Alphabet, Apple and Meta. OpenAI, the creator of ChatGPT, is based in San Francisco. Nvidia, the maker of chips needed for AI, operates from Santa Clara.

The big tech companies leading in AI, which also include Microsoft and Amazon, are spending billions to build new data centers around the world. They are also paying to rent space for their servers in so-called co-location data centers built by other companies.

In a Chicago suburb, a developer recently bought 55 homes so they could be razed to build a sprawling data center campus.

Energy officials in northern Virginia, which has more data centers than any other region in the world, have proposed a transmission line to shore up the grid that would depend on coal plants that had been expected to be shuttered.

In Oregon, Google and the city of The Dalles fought for 13 months to prevent The Oregonian from getting records of how much water the company’s data centers were consuming. The newspaper won the court case, learning the facilities drank up 29% of the city’s water.

By 2030, data centers could account for as much as 11% of U.S. power demand — up from 3% now, according to analysts at Goldman Sachs.

“We must demand more efficient data centers or else their continued growth will place an unsustainable strain on energy resources, impact new home building, and increase both carbon emissions and California residents’ cost of electricity,” wrote Charles Giancarlo, chief executive of the Santa Clara IT firm Pure Storage.

Santa Clara a top market for data centers

Boys ride their bikes on Main Street near a large data center in Santa Clara.

(Paul Kuroda / For The Times)

California has more than 270 data centers, with the biggest concentration in Santa Clara. The city is an attractive location because its electric rates are 40% lower than those charged by PG&E.

But the lower rates come with a higher cost to the climate. The city’s utility, Silicon Valley Power, emits more greenhouse gas than the average California electric utility because 23% of its power for commercial customers comes from gas-fired plants. Another 35% is purchased on the open market where the electricity’s origin can’t be traced.

The utility also gives data centers and other big industrial customers a discount on electric rates.

While Santa Clara households pay more for each kilowatt hour beyond a certain threshold, the rate for data centers declines as they use more power.
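
The contrast between those two rate designs can be sketched in a few lines of code. The rates and thresholds below are invented placeholders, not Silicon Valley Power’s actual tariff numbers; the sketch only illustrates the shapes: an increasing block rate for households versus a declining block rate for data centers.

```python
# Hypothetical tiered-rate sketch. All prices and thresholds are made up;
# only the shape of each schedule reflects the structures described above.

def household_bill(kwh, threshold=300.0, base_rate=0.15, surcharge_rate=0.22):
    """Increasing block rate: usage beyond the threshold costs more per kWh."""
    if kwh <= threshold:
        return kwh * base_rate
    return threshold * base_rate + (kwh - threshold) * surcharge_rate

def datacenter_bill(kwh, threshold=1_000_000.0, base_rate=0.12, bulk_rate=0.09):
    """Declining block rate: usage beyond the threshold costs less per kWh."""
    if kwh <= threshold:
        return kwh * base_rate
    return threshold * base_rate + (kwh - threshold) * bulk_rate

# Average price per kWh rises with household usage but falls for the data
# center as its usage grows.
for kwh in (200, 600):
    print(f"household {kwh} kWh -> ${household_bill(kwh) / kwh:.3f}/kWh average")
for kwh in (500_000, 5_000_000):
    print(f"data center {kwh:,} kWh -> ${datacenter_bill(kwh) / kwh:.3f}/kWh average")
```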

The city receives millions of dollars of property taxes from the data centers. And 5% of the utility’s revenue goes to the city’s general fund, where it pays for services such as road maintenance and police.

An analysis last year by the Silicon Valley Voice newspaper questioned the lower rates data centers pay compared with residents.

“What impetus do Santa Clarans have to foot the bill for these environmentally unfriendly behemoth buildings?” wrote managing editor Erika Towne.

In October, Manuel Pineda, the utility’s top official, told the City Council that his team was working to double power delivery over the next 10 years. “We prioritize growth as a strategic opportunity,” he said.

He said usage by data centers was continuing to escalate, but the utility was nearing its power limit. He said 13 new data centers were under construction and 12 more were moving forward with plans.

“We cannot currently serve all data centers that would like to be in Santa Clara,” he said.


Dozens of data centers have been built for artificial intelligence and the internet in Santa Clara.

(Paul Kuroda / For The Times)

To accommodate increasing power use, the city is now spending heavily on transmission lines, substations and other infrastructure. At the same time, electric rates are rising. Rates had been increasing by 2% to 3% a year, but they jumped by 8% in January 2023, another 5% in July 2023 and 10% last January.

Pineda told The Times that it wasn’t just the new infrastructure that pushed rates up. The biggest factor, he said, was a spike in natural gas prices in 2022, which increased power costs.

He said residential customers pay higher rates because the distribution system to homes requires more poles, wires and transformers than the system serving data centers, which increases maintenance costs.

Pineda said the city’s decisions to approve new data centers “are generally based on land use factors, not on revenue generation.”

Loretta Lynch, former chair of the state’s public utilities commission, noted that big commercial customers such as data centers pay lower rates for electricity across the state. That means when transmission lines and other infrastructure must be built to handle the increasing power needs, residential customers pick up more of the bill.

“Why aren’t data centers paying their fair share for infrastructure? That’s my question,” she said.

PG&E eyes profits from boom

The grid’s limited capacity has not stopped PG&E from wooing companies that want to build data centers.

“I think we will definitely be one of the big ancillary winners of the demand growth for data centers,” Patricia Poppe, PG&E’s chief executive, told Wall Street analysts on an April conference call.

Poppe said she recently invited the company’s tech customers to an event at a San José substation.

“When I got there, I was pleasantly surprised to see AWS, Microsoft, Apple, Google, Equinix, Cisco, Western Digital Semiconductors, Tesla, all in attendance. These are our customers that we serve who want us to serve more,” she said on the call. “They were very clear: they would build … if we can provide.”

In June, PG&E revealed it had received 26 applications for new data centers, including three that need at least 500 megawatts of power, 24 hours a day. In all, the proposed data centers would use 3.5 gigawatts. That amount of power could support nearly 5 million homes, based on the average usage of a California household of 6,174 kilowatt-hours a year.
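
That homes figure follows from simple arithmetic, assuming (as a simplification) that the proposed data centers draw their full 3.5 gigawatts around the clock:

```python
# Back-of-the-envelope check of the "nearly 5 million homes" figure,
# assuming a constant 3.5 GW draw all year.

load_gw = 3.5
hours_per_year = 8760
annual_kwh = load_gw * 1_000_000 * hours_per_year  # GW -> kW, times hours/year
homes = annual_kwh / 6_174                         # avg CA household, kWh/year
print(f"{homes / 1e6:.1f} million homes")          # ~5.0 million
```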

In the June presentation, PG&E said the new data centers would require it to spend billions of dollars on new infrastructure.

Already PG&E can’t keep up with connecting customers to the grid. It has fallen so far behind on connecting new housing developments that last year legislators passed a law to try to shorten the delays. At that time, the company told Politico that the delays stemmed from rising electricity demand, including from data centers.

In a statement to The Times, PG&E said its system was “ready for data centers.”

The company said its analysis showed that adding the data centers would not increase bills for other customers.

Most of the year, excluding extreme hot weather, its grid “is only 45% utilized on average,” the company said.

“Data centers’ baseload will enable us to utilize more of this percentage and deliver more per customer dollar,” the company said. “For every 1,000 MW load from data centers we anticipate our customers could expect 1-2% saving on their monthly electricity bill.”

The company added that it was “developing tools to ensure that every customer can cost-effectively connect new loads to the system with minimal delay.”
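
PG&E’s utilization argument can be sketched in miniature. The dollar figures below are invented, and the sketch deliberately ignores the billions in new infrastructure spending mentioned above; it shows only the mechanism the company is invoking: spreading largely fixed grid costs over more kilowatt-hours of sales lowers the per-kWh burden.

```python
# Hypothetical fixed-cost-spreading sketch; real savings depend on actual
# cost shares and on any new infrastructure built for the data centers.

fixed_costs = 10_000_000_000         # $/year of fixed grid costs (invented)
existing_sales_kwh = 80_000_000_000  # annual retail sales (invented)

# 1,000 MW of round-the-clock data center load adds 8.76 billion kWh/year.
new_sales_kwh = 1_000 * 1_000 * 8760

before = fixed_costs / existing_sales_kwh
after = fixed_costs / (existing_sales_kwh + new_sales_kwh)
print(f"fixed-cost share: {before * 100:.2f} -> {after * 100:.2f} cents/kWh "
      f"({(1 - after / before) * 100:.1f}% lower)")
```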

Lynch questioned the company’s analysis that adding data centers could reduce bills for other customers. She pointed out that utilities earn profits by investing in new infrastructure. That’s because they get to recover that cost — plus an annual rate of return — through rates billed to all customers.

“The more they spend, the more they make,” she said.
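
Lynch’s objection rests on how regulated utilities are compensated. A minimal sketch of the standard cost-of-service formula, with invented figures, shows why capital spending raises the revenue a utility is allowed to collect from ratepayers:

```python
# Illustrative cost-of-service math; all figures are invented, and real
# rate cases involve many more components.

def revenue_requirement(rate_base, allowed_return=0.10,
                        operating_costs=2e9, depreciation_rate=0.04):
    """Allowed revenue: operating costs + depreciation + return on rate base."""
    return operating_costs + rate_base * (depreciation_rate + allowed_return)

# A bigger rate base (more infrastructure built) means more allowed revenue.
for rate_base in (30e9, 40e9):  # e.g., before and after a major build-out
    print(f"rate base ${rate_base / 1e9:.0f}B -> "
          f"allowed revenue ${revenue_requirement(rate_base) / 1e9:.1f}B/yr")
```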

In the desert, cheap land and green energy


A geothermal plant viewed from across the Salton Sea in December 2022.

(Gina Ferazzi / Los Angeles Times)

The power and land constraints in Santa Clara and other cities have data center developers looking for new frontiers.

“On the edge of the Southern California desert in Imperial County sits an abundance of land,” begins the sales brochure for the data center that a company called CalEthos is building near the south shore of the Salton Sea.

Electricity for the data center’s servers would come from the geothermal and solar plants built near the site in an area that has become known as Lithium Valley.

The company is negotiating to purchase as much as 500 megawatts of power, the brochure said.

Water for the project would come from the state’s much-fought-over allotment from the Colorado River.

Imperial County is one of California’s poorest counties. More than 80% of its population are Latino. Many residents are farmworkers.

Executives from Tustin-based CalEthos told The Times that by using power from the nearby geothermal plants, the company would help the local community.

“By creating demand for local energy, CalEthos will help accelerate the development of Lithium Valley and its associated economic benefits,” Joel Stone, the company’s president, wrote in an email.

“We recognize the importance of responsible energy and water use in California,” Stone said. “Our data centers will be designed to be as efficient as possible.”

For example, Stone said that in order to minimize water use, CalEthos plans a cooling system where water is recirculated and “requires minimal replenishment due to evaporation.”

Already, a local community group, Comite Civico del Valle, has raised concerns about the environmental and health risks of one of the nearby geothermal plants that plans to produce lithium from the brine brought up in the energy production process.

One of the group’s concerns about the geothermal plant is that its water use will leave less to replenish the Salton Sea. The lake has been decreasing in size, creating a larger dry shoreline that is laden with bacteria and chemicals left from decades of agricultural runoff. Scientists have tied the high rate of childhood asthma in the area to dust from the shrinking lake’s shores.

James Blair, associate professor of geography and anthropology at Cal Poly Pomona, questioned whether the area was the right place for a mammoth data center.

“Data centers drain massive volumes of energy and water for chillers and cooling towers to prevent servers from overheating,” he said.

Blair said that while the company can tell customers its data center is supported by environmentally friendly solar and geothermal power, it will take that renewable energy away from the rest of California’s grid, making it harder for the state to meet its climate goals.


Latest Kotek climate order aims to speed up Oregon’s clean energy transition

The executive order seeks to accelerate wind and solar energy and energy storage, energy efficiency and the transition to clean fuels in Oregon.

Gov. Tina Kotek has issued another broad climate executive order directing state agencies to take specific actions to reduce greenhouse gas emissions and speed up Oregon’s move to carbon-free electricity.

Her order Wednesday seeks to accelerate wind and solar energy and energy storage by streamlining land use and environmental reviews, siting, permitting and grid connections. It sets an energy storage goal and directs agencies to prioritize public-private partnerships for clean energy projects and to find ways to support emerging technologies such as enhanced geothermal technology, offshore wind and advanced battery storage.

The order also calls for state agencies to increase energy efficiency in public and private buildings and extends Oregon’s Clean Fuels Program through 2040. The program requires suppliers to steadily cut fuel pollution.

“The rising cost of living is hitting Oregonians’ household budgets hard, so we must act effectively and prudently to protect ratepayers from increased energy costs, while also building a more resilient, clean energy future,” Kotek said at a press conference at the state Capitol, flanked by a group of clean energy and climate action supporters.

Kotek’s move comes amid growing doubts about Oregon’s ability to hit its ambitious 100% clean energy target. State law requires investor-owned utilities in Oregon to reduce emissions by 80% by 2030 and to transition to all clean electricity by 2040, something experts say utilities are unlikely to do given the lack of transmission lines and the extraordinary growth in electricity demand from data centers, buildings and cars.

The order also lands as the Trump administration has moved aggressively to roll back federal climate policies, reversing many emissions-reduction measures enacted under President Joe Biden – including halting wind and solar projects on federal lands and dismantling generous tax credits funded by the Biden-era Inflation Reduction Act.

It’s Kotek’s third climate-related executive order in less than a month. At the end of October, she directed state agencies to harness the potential of forests, farms, wetlands and waterways to reduce emissions, preserve wildlife habitat and help communities withstand the threat of climate change. And in early October, she pushed to streamline and accelerate the pace of wind and solar project development in the state before the clock runs out on federal clean energy tax credits.

Kotek said the latest executive order can help slow climate change, expand transmission grid capacity, attract new businesses and create economic opportunities across Oregon’s energy sector.

The order sets a goal of 8 gigawatts of energy storage in Oregon by 2045. Building more energy storage is key, the governor’s office said, because it provides backup electricity when wind or solar power production is low and during outages or peak demand periods. Energy storage projects also reduce the need to build additional electricity-generating resources such as wind or solar projects.

Eight gigawatts is achievable, the governor’s office said, because the state already has nearly 500 megawatts of energy storage and more than 7 gigawatts of storage projects are currently planned for development.

The order also directs the state Department of Energy to designate transmission corridors, including on public land, and to streamline siting and approval in those corridors or in existing rights of way.

The order requires a 50% reduction in the carbon intensity of Oregon fuels by 2040. The current rule requires a 10% reduction in average carbon intensity from 2015 levels by 2025, followed by a 20% reduction by 2030 and 37% by 2035. Most fuel producers blend cleaner fuels such as ethanol, biodiesel or renewable diesel into traditional gasoline and diesel, or buy credits from others who have gone beyond the state requirement. In 2024, the Clean Fuels Program cut approximately 3 million metric tons of greenhouse gases; over the lifetime of the program, since 2016, it has cut approximately 14.6 million metric tons.

Much of the order focuses on state agencies – including the Department of Energy, the Department of Land Conservation and Development, the Department of Environmental Quality and the Public Utility Commission – aligning their decisions, investments and activities, including the implementation of existing programs, to advance clean energy, clean fuels and energy efficiency. It doesn’t entail new programs or additional funding for the remainder of the 2025-2027 biennium but may lead to new funding demands in future years, said Kotek spokesperson Anca Matica.

The order directs agencies to tally the barriers to clean energy permitting, construction and connection to the transmission grid and to come up with solutions by next fall. The agencies are to focus on projects that benefit Oregon ratepayers and that involve upgrades to the existing grid and transmission expansion in existing rights of way.

By September 2026, agencies are to identify strategies to streamline and accelerate the construction of wind and solar projects. Agencies must provide quarterly updates on progress in advancing public-private partnerships.

The governor’s office said the order won’t raise rates. Rather, it said, the order directs agencies to prioritize energy efficiency and investments that deliver the greatest value to ratepayers.

Reporter Carlos Fuentes contributed to this story.

Groups Push Back on Montana’s ‘Data Center Boom’ in Petition Before Utility Commission

A group of nonprofit organizations is asking Montana’s utility board to tighten its oversight of NorthWestern Energy as it plans to provide large amounts of electricity to data centers.

A group of nonprofits is petitioning Montana’s utility board to tighten its oversight of NorthWestern Energy, arguing existing customers could foot the bill for the utility’s plan to provide data centers with electricity.

Nine groups working on energy, conservation, social justice and affordability issues on Tuesday asked the Public Service Commission to impose rules on NorthWestern so its 413,000-plus residential customers won’t be forced to shoulder the cost of new power plants and transmission lines to power data centers.

Here’s what we know about the data centers in question, how Montana law intersects with the debate and what the petitioners are asking the PSC to do in response.

How much power do these data centers want NorthWestern Energy to supply?

NorthWestern Energy has signed letters of intent to supply power to three data centers, according to the complaint. If demand grows as forecast, NorthWestern will be supplying these data centers with 1,400 megawatts of power by 2030. That’s roughly equivalent to the annual electricity needs of more than 1 million homes and more than double the 759 megawatts NorthWestern’s existing customers require on a typical day.

NorthWestern has signed agreements with Atlas Power, which seeks 75 megawatts of power for a facility in Butte starting in 2026 and another 75 megawatts by 2030; Sabey Data Center Properties, which would initially require 50 megawatts to power a 600-acre campus planned for Butte and eventually expand its use to 250 megawatts; and Quantica Infrastructure, which wants to secure 175 megawatts for a project in Yellowstone County by late 2027 and increase its electrical footprint to 1,000 megawatts by 2030.

According to the complaint, NorthWestern currently owns or has standing contracts for about 2,100 megawatts of power. It will acquire 592 additional megawatts from the Colstrip coal-fired power plant on Jan. 1, although it already has plans for some of that additional electricity.

Why are the petitioners worried about these data centers?

The petitioners argue that NorthWestern’s plan to sign electricity service agreements before garnering regulatory approval is “unreasonable, insufficient and contrary to Montana law.”

More specifically, they argue that NorthWestern has “short circuited” the public’s right to know what the company is doing. The petitioners also say NorthWestern is inappropriately blocking oversight by, for example, moving to shield the letters of intent from public review. The PSC has the authority to ensure NorthWestern won’t shift new costs to its ratepayers, who are unable to shop around for power from other utilities, the petitioners contend.

The petitioners are Big Sky 55+, Butte Watchdogs for Social and Environmental Justice, Climate Smart Missoula, Golden Triangle Resource Council, Helena Interfaith Climate Advocates, Honor the Earth, Montana Environmental Information Center, Montana Public Interest Research Group and NW Energy Coalition.

Shannon James, Montana Environmental Information Center’s climate and campaigns organizer, said in a press release Tuesday that Montana should learn from other states’ missteps and avoid a hands-off approach to data center regulation.

“Communities across the country have suffered when large, noisy data centers move into their neighborhoods, raising their power bills and taking their water,” James said. “Montana has a chance to get ahead of the curve and protect existing utility customers from having to pay for expensive new fossil fuel power plants so NorthWestern Energy can cater to wealthy tech companies.”

What do the petitioners want the PSC to do?

The petition asks the PSC to create a separate customer class for data centers, complete with a separate tariff, or rate structure, for the power they buy. In addition to establishing a unique formula for data centers’ power bills, a specialized tariff could stipulate that data centers give NorthWestern plenty of notice before changing their power usage. That could “provide more predictability” to the utility and shield its other customers from undue risk, the complaint reads.

If the PSC grants the request, the petitioners will have an opportunity to ask NorthWestern about its plans in a quasi-judicial public hearing. The groups will also have the opportunity to call experts to testify about potential impacts to NorthWestern’s customers if data centers tie into NorthWestern’s grid.

What kinds of state laws are in play?

The petition references a Montana law outlining the process for large new customers to secure electrical service from a regulated utility. That law says a new retail customer can’t purchase more than 5 megawatts of power from a public utility unless it first demonstrates to the PSC “that the provision of electricity supply service … will not adversely impact the public utility’s other customers over the long term.”

The petition also highlights sections of Montana law that establish the authority and duties of the PSC, which is made up of five elected officials. In keeping with a two-decade trend, the PSC is an all-Republican board.

The laws in question give the PSC the authority to “inquire into the management of the business of all public utilities” and to obtain “all necessary information to enable the commission to perform its duties.” They also authorize the PSC to “inspect the books, accounts, papers, records and memoranda of any public utility and examine, under oath, any officer, agent, or employee of the public utility in relation to its business and affairs.”

What does NorthWestern say about the data center agreements?

Jo Dee Black, a spokesperson for NorthWestern Energy, wrote in an email to MTFP on Tuesday that the company has committed to establishing a tariff specifically for large-load customers. She added that contracts for new data center customers will be submitted to the PSC “as they are executed.”

“New commercial customers with large energy loads, including data centers, will pay their fair share of integration and service costs,” Black wrote. “Infrastructure investments will ultimately mean a larger, more resilient energy system in Montana, however, new large load customers, such as data centers, will have to pay for their costs to integrate with the energy system.”

Black didn’t directly answer MTFP’s question regarding the number of agreements NorthWestern has signed with data centers, offering only that the company “has the three Letters of Intent” referenced in the petitioners’ complaint.

If the PSC grants the request, parties to the proceeding — the petitioners, NorthWestern Energy and other organizations or individuals that the PSC clears for participation — will start building a case for commissioners to review. The PSC could issue an order based on the case, with or without first scheduling a hearing.

This story was originally published by Montana Free Press and distributed through a partnership with The Associated Press.
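
The scale of the 1,400-megawatt figure in the petition above can be sanity-checked with quick arithmetic; the household consumption number below is an assumed U.S. average, not a figure from the article:

```python
# Rough check: 1,400 MW of continuous load vs. typical home consumption.

load_mw = 1_400
annual_kwh = load_mw * 1_000 * 8760  # MW -> kW, times hours per year
avg_home_kwh = 10_500                # assumed U.S. average annual usage
print(f"{annual_kwh / avg_home_kwh / 1e6:.2f} million homes")  # ~1.17 million
```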

Community Benefits

Across California, communities and developers are coming to the negotiating table in an effort to distribute prosperity. Community Benefits Agreements can help.

Construction of a new stadium or solar farm can spark both alarm and promise for local residents, and for good reasons. Often, communities are sidelined in decision-making about these projects, and the benefits of such large-scale developments are not always evenly distributed.

Historically, when these opportunities arrive, local officials have held public hearings where residents could voice concerns. However, this type of engagement has its drawbacks. It tends to favor vocal residents with the time and resources to attend. Moreover, research shows residents who attend these public hearings are disproportionately project opponents, rather than those pushing for more energy infrastructure or housing. And, ultimately, there is no guarantee that local elected officials will take community feedback into consideration.

Community Benefit Agreements (CBAs) have emerged as one way to increase local control over development decisions and ensure that economic and other gains from new infrastructure are more widely shared.

What is a CBA?

A Community Benefit Agreement is a legally binding contract between a developer and local governments or community groups such as labor unions, neighborhood associations, or environmental advocates.

In exchange for specific, tangible benefits, such as job training programs, affordable housing units, local hiring guarantees, parks, reduced electricity rates, or direct financial payments, local organizations agree to support a proposed project – or at least not oppose it. In this way, CBAs may help speed up approval processes and accelerate development by navigating potential community opposition.

CBAs to Support Clean Energy Development

As California moves toward its goal of 100% renewable energy by 2045, communities are beginning to see many more wind and solar infrastructure projects — particularly in the inland and rural counties of the state. As of November 2025, there are 282 planned utility-scale solar projects in California, with a total planned capacity of 59,721 megawatts (MW).

Historically, Community Benefits Agreements have resulted from extensive advocacy and organizing by local community members. However, instead of pushing communities to self-organize for these benefits, California has begun to require clean energy developers to enter into legally binding agreements with local community organizations in order to benefit from streamlined permitting at the state level.

CBAs for renewable energy are becoming increasingly prominent in policy, and some jurisdictions, both in California and in other states, have institutionalized community benefits:

- Riverside County’s Policy B-29 requires large solar projects to pay approximately $150 per acre.
- Imperial County’s Public Benefit Program collects fees from solar projects to issue grants for infrastructure improvements and job creation.
- California’s AB 205 now requires developers seeking state-level permits for large solar and wind facilities to execute a CBA.
- Michigan’s recent legislation mandates that developers enter Host Community Agreements with minimum payments of $2,000 per megawatt.
- New York established a Host Community Benefits program with annual fees per megawatt issued as electric bill credits to residents of municipalities hosting renewable energy projects.

Read the Report: Rethinking Community Benefits: Industry-Specific Insights for a Transforming California

To help community groups who want to negotiate benefits agreements with developers, our team at the Possibility Lab – in partnership with CA FWD – built an Energy Project Benefits Agreement Database to identify common characteristics of successful agreements.

Explore our Energy Project Benefits Agreement Database

The Promise and Challenges of CBAs

The promise of CBAs is that they give communities direct power to negotiate for their needs and preferences. However, it can be unclear who actually represents “the community.” Because CBAs are often negotiated by select community groups, they can lack democratic accountability. And just as the residents attending a public hearing may not be representative of a community’s demographics, with varying and unequal access to economic and political capital, the same could be true of the community groups who participate in negotiating CBAs.

As a result, some critics view CBAs as essentially allowing developers to “buy off” opposition in order to streamline approvals. The timing of these agreements doesn’t improve the optics: offered too early, benefits might feel like bribes; too late, they may seem like unjust compensation for negative impacts.

In the end, CBAs are private contracts, and the details of many agreements stay hidden. As a result, despite many examples of CBAs in and outside California, surprisingly little is known about their actual structure, benefits, and outcomes. Many important questions remain unanswered, including whether CBAs speed up or slow down development. Which communities successfully negotiate CBAs, and which don’t? What happens when negotiations are unsuccessful? Who follows through to ensure commitments are fulfilled?

CBAs are a promising vehicle for addressing the tension between the need to quickly build more infrastructure and the desire to engage communities in decision-making. Nonetheless, more research is needed to understand their effectiveness in delivering real benefits to communities while enabling progress on housing, energy, and other new development.

To learn more, visit the UC Berkeley Possibility Lab’s People-Centered Policymaking site.

Introducing the MIT-GE Vernova Energy and Climate Alliance

Five-year collaboration between MIT and GE Vernova aims to accelerate the energy transition and scale new innovations.

MIT and GE Vernova launched the MIT-GE Vernova Energy and Climate Alliance on Sept. 15, a collaboration to advance research and education focused on accelerating the global energy transition.

Through the alliance — an industry-academia initiative conceived by MIT Provost Anantha Chandrakasan and GE Vernova CEO Scott Strazik — GE Vernova has committed $50 million over five years in the form of sponsored research projects and philanthropic funding for research, graduate student fellowships, internships, and experiential learning, as well as professional development programs for GE Vernova leaders.

“MIT has a long history of impactful collaborations with industry, and the collaboration between MIT and GE Vernova is a shining example of that legacy,” said Chandrakasan in opening remarks at a launch event. “Together, we are working on energy and climate solutions through interdisciplinary research and diverse perspectives, while providing MIT students the benefit of real-world insights from an industry leader positioned to bring those ideas into the world at scale.”

The energy of change

An independent company since its spinoff from GE in April 2024, GE Vernova is focused on accelerating the global energy transition. The company generates approximately 25 percent of the world’s electricity — with the world’s largest installed base of over 7,000 gas turbines, about 57,000 wind turbines, and leading-edge electrification technology.

GE Vernova’s slogan, “The Energy of Change,” is reflected in decisions such as locating its headquarters in Cambridge, Massachusetts — in close proximity to MIT. In pursuing transformative approaches to the energy transition, the company has identified MIT as a key collaborator.

A key component of the mission to electrify and decarbonize the world is collaboration, according to Strazik. “We want to inspire, and be inspired by, students as we work together on our generation’s greatest challenge, climate change. We have great ambition for what we want the world to become, but we need collaborators. And we need folks that want to iterate with us on what the world should be from here.”

Representing the Healey-Driscoll administration at the launch event were Massachusetts Secretary of Energy and Environmental Affairs Rebecca Tepper and Secretary of the Executive Office of Economic Development Eric Paley. Tepper highlighted the Mass Leads Act, a $1 billion climate tech and life sciences initiative enacted by Governor Maura Healey last November to strengthen Massachusetts’ leadership in climate tech and AI.

“We’re harnessing every part of the state, from hydropower manufacturing facilities to the blue-to-blue economy in our south coast, and right here at the center of our colleges and universities. We want to invent and scale the solutions to climate change in our own backyard,” said Tepper. “That’s been the Massachusetts way for decades.”

Real-world problems, insights, and solutions

The launch celebration featured interactive science displays and student presenters introducing the first round of 13 research projects led by MIT faculty. These projects focus on generating scalable solutions to our most pressing challenges in the areas of electrification, decarbonization, renewables acceleration, and digital solutions.

Collaborating with industry offers the opportunity for researchers and students to address real-world problems informed by practical insights. The diverse, interdisciplinary perspectives from both industry and academia will significantly strengthen the research supported through the GE Vernova Fellowships announced at the launch event.

“I’m excited to talk to the industry experts at GE Vernova about the problems that they work on,” said GE Vernova Fellow Aaron Langham. “I’m looking forward to learning more about how real people and industries use electrical power.”

Fellow Julia Estrin echoed a similar sentiment: “I see this as a chance to connect fundamental research with practical applications — using insights from industry to shape innovative solutions in the lab that can have a meaningful impact at scale.”

GE Vernova’s commitment to research is also providing support and inspiration for fellows. “This level of substantive enthusiasm for new ideas and technology is what comes from a company that not only looks toward the future, but also has the resources and determination to innovate impactfully,” says Owen Mylotte, a GE Vernova Fellow.

The inaugural cohort of eight fellows will continue their research at MIT with tuition support from GE Vernova.

Pipeline of future energy leaders

Highlighting the alliance’s emphasis on cultivating student talent and leadership, Strazik introduced four MIT alumni who are now leaders at GE Vernova: Dhanush Mariappan SM ’03, PhD ’19, senior engineering manager in the GE Vernova Advanced Research Center; Brent Brunell SM ’00, technology director in the Advanced Research Center; Paolo Marone MBA ’21, CFO of wind; and Grace Caza MAP ’22, chief of staff in supply chain and operations.

The four shared their experiences of working with MIT as students and their hopes for the future of this alliance in the realm of “people development,” as Mariappan put it. “Energy transition means leaders. And every one of the innovative research and professional education programs that will come out of this alliance is going to produce the leaders of the energy transition industry.”

The alliance is underscoring its commitment to developing future energy leaders by supporting the New Engineering Education Transformation (NEET) program and expanding opportunities for student internships. With 100 new internships for MIT students announced in the days following the launch, GE Vernova is opening broad opportunities for MIT students at all levels to contribute to a sustainable future.

“GE Vernova has been a tremendous collaborator every step of the way, with a clear vision of the technical breakthroughs we need to affect change at scale and a deep respect for MIT’s strengths and culture, as well as a hunger to listen and learn from us as well,” said Betar Gallant, alliance director and the Kendall Rohsenow Associate Professor of Mechanical Engineering at MIT. “Students, take this opportunity to learn, connect, and appreciate how much you’re valued, and how bright your futures are in this area of decarbonizing our energy systems. Your ideas and insight are going to help us determine and drive what’s next.”

Daring to create the future we want

The launch event transformed MIT’s Lobby 13 with green lighting and animated conversation around the posters and hardware demos on display, reflecting the sense of optimism for the future and the type of change the alliance — and the Commonwealth of Massachusetts — seeks to advance.

“Because of this collaboration and the commitment to the work that needs doing, many things will be created,” said Paley. “People in this room will work together on all kinds of projects that will do incredible things for our economy, for our innovation, for our country, and for our climate.”

The alliance builds on MIT’s growing portfolio of initiatives around sustainable energy systems, including the Climate Project at MIT, a presidential initiative focused on developing solutions to some of the toughest barriers to an effective global climate response. “This new alliance is a significant opportunity to move the needle of energy and climate research as we dare to create the future that we want, with the promise of impactful solutions for the world,” said Evelyn Wang, MIT vice president for energy and climate, who attended the launch.

To that end, the alliance is supporting critical cross-institution efforts in energy and climate policy, including funding three master’s students in the MIT Technology and Policy Program and hosting an annual symposium in February 2026 to advance interdisciplinary research. GE Vernova is also providing philanthropic support to the MIT Human Insight Collaborative. For 2025-26, this support will contribute to addressing global energy poverty by supporting the MIT Abdul Latif Jameel Poverty Action Lab (J-PAL) in its work to expand access to affordable electricity in South Africa.

“Our hope to our fellows, our hope to our students is this: While the stakes are high and the urgency has never been higher, the impact that you are going to have over the decades to come has never been greater,” said Roger Martella, chief corporate and sustainability officer at GE Vernova. “You have so much opportunity to move the world in a better direction. We need you to succeed. And our mission is to serve you and enable your success.”

With the alliance’s launch — and GE Vernova’s new membership in several other MIT consortium programs related to sustainability, automation and robotics, and AI, including the Initiative for New Manufacturing, MIT Energy Initiative, MIT Climate and Sustainability Consortium, and Center for Transportation and Logistics — it’s evident why Gallant says the company is “all-in at MIT.”

The potential for tremendous impact on the energy industry is clear to those involved in the alliance. As GE Vernova Fellow Jack Morris said at the launch, “This is the beginning of something big.”

Bigger datasets aren’t always better

MIT researchers developed a way to identify the smallest dataset that guarantees optimal solutions to complex problems.

Determining the least expensive path for a new subway line underneath a metropolis like New York City is a colossal planning challenge — involving thousands of potential routes through hundreds of city blocks, each with uncertain construction costs. Conventional wisdom suggests that extensive field studies across many locations would be needed to determine the costs associated with digging below certain city blocks.

Because these studies are costly to conduct, a city planner would want to perform as few as possible while still gathering the most useful data for making an optimal decision. With almost countless possibilities, how would they know where to start?

A new algorithmic method developed by MIT researchers could help. Their mathematical framework provably identifies the smallest dataset that guarantees finding the optimal solution to a problem, often requiring fewer measurements than traditional approaches suggest.

In the case of the subway route, this method considers the structure of the problem (the network of city blocks, construction constraints, and budget limits) and the uncertainty surrounding costs. The algorithm then identifies the minimum set of locations where field studies would guarantee finding the least expensive route. The method also identifies how to use this strategically collected data to find the optimal decision.

This framework applies to a broad class of structured decision-making problems under uncertainty, such as supply chain management or electricity network optimization.

“Data are one of the most important aspects of the AI economy. Models are trained on more and more data, consuming enormous computational resources. But most real-world problems have structure that can be exploited. We’ve shown that with careful selection, you can guarantee optimal solutions with a small dataset, and we provide a method to identify exactly which data you need,” says Asu Ozdaglar, the MathWorks Professor and head of the MIT Department of Electrical Engineering and Computer Science (EECS), deputy dean of the MIT Schwarzman College of Computing, and a principal investigator in the Laboratory for Information and Decision Systems (LIDS).

Ozdaglar, co-senior author of a paper on this research, is joined by co-lead authors Omar Bennouna, an EECS graduate student, and his brother Amine Bennouna, a former MIT postdoc who is now an assistant professor at Northwestern University; and by co-senior author Saurabh Amin, co-director of the Operations Research Center, a professor in the MIT Department of Civil and Environmental Engineering, and a principal investigator in LIDS. The research will be presented at the Conference on Neural Information Processing Systems.

An optimality guarantee

Much of the recent work in operations research focuses on how to best use data to make decisions, but this work assumes the data already exist. The MIT researchers started by asking a different question: What are the minimum data needed to optimally solve a problem? With this knowledge, one could collect far fewer data to find the best solution, spending less time, money, and energy conducting experiments and training AI models.

The researchers first developed a precise geometric and mathematical characterization of what it means for a dataset to be sufficient. Every possible set of costs (travel times, construction expenses, energy prices) makes some particular decision optimal. These “optimality regions” partition the space of possible costs.
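To make the idea of optimality regions concrete, here is a toy sketch in Python. The three candidate routes, their per-block cost coefficients, and the uncertainty box are hypothetical, invented purely for illustration; the paper's characterization is geometric and far more general.

```python
# Toy illustration of "optimality regions" (a hypothetical example,
# not the paper's construction). Each of three candidate routes has a
# total cost that is linear in two uncertain per-block costs c1 and c2;
# for every scenario (c1, c2) exactly one route is cheapest, so the
# scenario space splits into contiguous regions, one per decision.
import numpy as np

# route -> (how heavily it depends on block costs c1 and c2, fixed cost)
routes = {
    "A": (np.array([3.0, 1.0]), 5.0),
    "B": (np.array([1.0, 3.0]), 4.0),
    "C": (np.array([0.5, 0.5]), 10.0),
}

def best_route(c):
    """Return the cheapest route under cost scenario c = (c1, c2)."""
    return min(routes, key=lambda r: routes[r][0] @ c + routes[r][1])

# Sample the uncertainty box [0, 10]^2 on a coarse grid; patches of
# identical letters below are the optimality regions.
grid = np.linspace(0.0, 10.0, 9)
for c2 in reversed(grid):
    print(" ".join(best_route(np.array([c1, c2])) for c1 in grid))
```

Each letter in the printed grid marks the cheapest route at that cost scenario, and the contiguous patches of identical letters are exactly the optimality regions described above.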
A dataset is sufficient if it can determine which region contains the true cost. This characterization is the foundation of the practical algorithm the researchers developed, which identifies datasets that guarantee finding the optimal solution. Their theoretical exploration revealed that a small, carefully selected dataset is often all one needs.

“When we say a dataset is sufficient, we mean that it contains exactly the information needed to solve the problem. You don’t need to estimate all the parameters accurately; you just need data that can discriminate between competing optimal solutions,” says Amine Bennouna.

Building on these mathematical foundations, the researchers developed an algorithm that finds the smallest sufficient dataset.

Capturing the right data

To use this tool, one inputs the structure of the task, such as the objective and constraints, along with whatever information is already known about the problem. For instance, in supply chain management, the task might be to reduce operational costs across a network of dozens of potential routes. The company may already know that some shipment routes are especially costly, but lack complete information on others.

The researchers’ iterative algorithm works by repeatedly asking, “Is there any scenario that would change the optimal decision in a way my current data can’t detect?” If yes, it adds a measurement that captures that difference. If no, the dataset is provably sufficient. In the subway example, this pinpoints the subset of locations that need to be explored to guarantee finding the minimum-cost route.

Then, after collecting those data, the user can feed them to another algorithm the researchers developed, which finds the optimal solution. In the supply chain case, that would be the shipment routes to include in a cost-optimal network.

“The algorithm guarantees that, for whatever scenario could occur within your uncertainty, you’ll identify the best decision,” Omar Bennouna says.

The researchers’ evaluations showed that, using this method, it is possible to guarantee an optimal decision with a much smaller dataset than would typically be collected.

“We challenge this misconception that small data means approximate solutions. These are exact sufficiency results with mathematical proofs. We’ve identified when you’re guaranteed to get the optimal solution with very little data — not probably, but with certainty,” Amin says.

In the future, the researchers want to extend their framework to other types of problems and more complex situations. They also want to study how noisy observations could affect dataset optimality.

“I was impressed by the work’s originality, clarity, and elegant geometric characterization. Their framework offers a fresh optimization perspective on data efficiency in decision-making,” says Yao Xie, the Coca-Cola Foundation Chair and Professor at Georgia Tech, who was not involved with this work.
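As a closing illustration, the iterative check described above (“is there any scenario that would change the optimal decision in a way my current data can’t detect?”) can be sketched in a few lines of Python. The route network, the cost intervals, the hidden “true” costs, and the naive rule for choosing the next field study are all hypothetical stand-ins; the authors’ algorithm selects measurements far more carefully.

```python
# A minimal sketch of the iterative sufficiency check, on a toy problem
# with uncertainty discretized to interval endpoints. The routes, cost
# bounds, "true" costs, and the naive choice of which block to measure
# next are illustrative assumptions, not the authors' algorithm.
import itertools

routes = {"A": [0, 1], "B": [1, 2], "C": [0, 3]}  # route -> blocks crossed
bounds = [(2, 9), (1, 4), (3, 8), (2, 7)]         # (low, high) cost per block
true_cost = [9, 1, 3, 7]                          # revealed only by a field study

def best(costs):
    """Cheapest route when block b costs costs[b]."""
    return min(routes, key=lambda r: sum(costs[b] for b in routes[r]))

measured = {}  # block index -> observed cost
while True:
    free = [b for b in range(len(bounds)) if b not in measured]
    # Optimal decisions across every cost scenario still consistent
    # with the data gathered so far.
    decisions = {
        best({**measured, **dict(zip(free, combo))})
        for combo in itertools.product(*(bounds[b] for b in free))
    }
    if len(decisions) == 1:  # no undetected scenario can flip the decision
        print("sufficient after measuring blocks", sorted(measured))
        print("guaranteed optimal route:", decisions.pop())
        break
    # Some consistent scenario changes the optimum: buy one more study.
    nxt = free[0]  # naive pick; the real algorithm chooses strategically
    measured[nxt] = true_cost[nxt]
```

In this toy run, measuring just two of the four blocks is enough to certify a single route as optimal under every remaining scenario, echoing the researchers’ point that a small, strategically chosen dataset can carry an exact optimality guarantee.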
