
Explosion of power-hungry data centers could derail California clean energy goals

News Feed
Monday, August 12, 2024


Experts warn that a frenzy of data center construction could delay California's transition away from fossil fuels and raise everyone's electric bills.

Near the Salton Sea, a company plans to build a data center to support artificial intelligence that would cover land the size of 15 football fields and require power that could support 425,000 homes.

In Santa Clara — the heart of Silicon Valley — electric rates are rising as the municipal utility spends heavily on transmission lines and other infrastructure to accommodate the voracious power demand from more than 50 data centers, which now consume 60% of the city’s electricity.

And earlier this year, Pacific Gas & Electric told investors that its customers have proposed more than two dozen data centers, requiring 3.5 gigawatts of power — the output of three new nuclear reactors.


Vantage Data Center in Santa Clara is equipped with its own electrical substations.

(Paul Kuroda / For The Times)

While the benefits and risks of AI continue to be debated, one thing is clear: The technology is rapacious for power. Experts warn that the frenzy of data center construction could delay California’s transition away from fossil fuels and raise electric bills for everyone else. The data centers’ insatiable appetite for electricity, they say, also increases the risk of blackouts.

Even now, California is on the verge of not having enough power. An analysis of public data by the nonprofit GridClue ranks California 49th of the 50 states in resilience — or the ability to avoid blackouts by having more electricity available than homes and businesses need at peak hours.

“California is working itself into a precarious position,” said Thomas Popik, president of the Foundation for Resilient Societies, which created GridClue to educate the public on threats posed by increasing power use.

The state has already extended the lives of Pacific Gas & Electric Co.’s Diablo Canyon nuclear plant as well as some natural gas-fueled plants in an attempt to avoid blackouts on sweltering days when power use surges.

Worried that California could no longer predict its need for power because of fast-rising use, an association of locally run electricity providers called on state officials in May to immediately analyze how quickly demand was increasing.

The California Community Choice Assn. sent its letter to the state energy commission after officials had to revise their annual forecast of power demand upward because of skyrocketing use by Santa Clara’s dozens of data centers.


A large NTT data center rises in a Santa Clara neighborhood.

(Paul Kuroda / For The Times)

The facilities, giant warehouses of computer servers, have long been big power users. They support all that Americans do on the internet — from online shopping to streaming Netflix to watching influencers on TikTok.

But the specialized chips required for generative AI use far more electricity — and water — than those that support the typical internet search because they are designed to read through vast amounts of data.

A ChatGPT-powered search, according to the International Energy Agency, consumes 10 times as much power as a search on Google without AI.

And because those new chips generate so much heat, more power and water are required to keep them cool.

“I’m just surprised that the state isn’t tracking this, with so much attention on power and water use here in California,” said Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside.

Ren and his colleagues calculated that the global use of AI could require as much fresh water in 2027 as that now used by four to six countries the size of Denmark.

Driving the data center construction is money. Today’s stock market rewards companies that say they are investing in AI. Electric utilities profit as power use rises. And local governments benefit from the property taxes paid by data centers.


Transmission lines are reflected on the side of the NTT data center in Santa Clara.

(Paul Kuroda / For The Times)

Silicon Valley is the world’s epicenter of AI, with some of the biggest developers headquartered there, including Alphabet, Apple and Meta. OpenAI, the creator of ChatGPT, is based in San Francisco. Nvidia, the maker of chips needed for AI, operates from Santa Clara.

The big tech companies leading in AI, which also include Microsoft and Amazon, are spending billions to build new data centers around the world. They are also paying to rent space for their servers in so-called co-location data centers built by other companies.

In a Chicago suburb, a developer recently bought 55 homes so they could be razed to build a sprawling data center campus.

Energy officials in northern Virginia, which has more data centers than any other region in the world, have proposed a transmission line to shore up the grid that would depend on coal plants that had been expected to be shuttered.

In Oregon, Google and the city of The Dalles fought for 13 months to prevent the Oregonian from getting records of how much water the company’s data centers were consuming. The newspaper won the court case, learning the facilities drank up 29% of the city’s water.

By 2030, data centers could account for as much as 11% of U.S. power demand — up from 3% now, according to analysts at Goldman Sachs.

“We must demand more efficient data centers or else their continued growth will place an unsustainable strain on energy resources, impact new home building, and increase both carbon emissions and California residents’ cost of electricity,” wrote Charles Giancarlo, chief executive of the Santa Clara IT firm Pure Storage.

Santa Clara a top market for data centers

Boys ride their bikes on Main Street near a large data center in Santa Clara.

(Paul Kuroda / For The Times)

California has more than 270 data centers, with the biggest concentration in Santa Clara. The city is an attractive location because its electric rates are 40% lower than those charged by PG&E.

But the lower rates come with a higher cost to the climate. The city’s utility, Silicon Valley Power, emits more greenhouse gas than the average California electric utility because 23% of its power for commercial customers comes from gas-fired plants. Another 35% is purchased on the open market where the electricity’s origin can’t be traced.

The utility also gives data centers and other big industrial customers a discount on electric rates.

While Santa Clara households pay more for each kilowatt hour beyond a certain threshold, the rate for data centers declines as they use more power.

The city receives millions of dollars of property taxes from the data centers. And 5% of the utility’s revenue goes to the city’s general fund, where it pays for services such as road maintenance and police.

An analysis last year by the Silicon Valley Voice newspaper questioned the lower rates data centers pay compared with residents.

“What impetus do Santa Clarans have to foot the bill for these environmentally unfriendly behemoth buildings?” wrote managing editor Erika Towne.

In October, Manuel Pineda, the utility’s top official, told the City Council that his team was working to double power delivery over the next 10 years. “We prioritize growth as a strategic opportunity,” he said.

He said usage by data centers was continuing to escalate, but the utility was nearing its power limit. He said 13 new data centers were under construction and 12 more were moving forward with plans.

“We cannot currently serve all data centers that would like to be in Santa Clara,” he said.


Dozens of data centers have been built for artificial intelligence and the internet in Santa Clara.

(Paul Kuroda / For The Times)

To accommodate increasing power use, the city is now spending heavily on transmission lines, substations and other infrastructure. At the same time, electric rates are rising. Rates had been increasing by 2% to 3% a year, but they jumped by 8% in January 2023, another 5% in July 2023 and 10% last January.

Pineda told The Times that it wasn’t just the new infrastructure that pushed rates up. The biggest factor, he said, was a spike in natural gas prices in 2022, which increased power costs.

He said residential customers pay higher rates because the distribution system to homes requires more poles, wires and transformers than the system serving data centers, which increases maintenance costs.

Pineda said the city’s decisions to approve new data centers “are generally based on land use factors, not on revenue generation.”

Loretta Lynch, former chair of the state’s public utilities commission, noted that big commercial customers such as data centers pay lower rates for electricity across the state. That means when transmission lines and other infrastructure must be built to handle the increasing power needs, residential customers pick up more of the bill.

“Why aren’t data centers paying their fair share for infrastructure? That’s my question,” she said.

PG&E eyes profits from boom

The grid’s limited capacity has not stopped PG&E from wooing companies that want to build data centers.

“I think we will definitely be one of the big ancillary winners of the demand growth for data centers,” Patricia Poppe, PG&E’s chief executive, told Wall Street analysts on an April conference call.

Poppe said she recently invited the company’s tech customers to an event at a San José substation.

“When I got there, I was pleasantly surprised to see AWS, Microsoft, Apple, Google, Equinix, Cisco, Western Digital Semiconductors, Tesla, all in attendance. These are our customers that we serve who want us to serve more,” she said on the call. “They were very clear: they would build … if we can provide.”

In June, PG&E revealed it had received 26 applications for new data centers, including three that need at least 500 megawatts of power, 24 hours a day. In all, the proposed data centers would use 3.5 gigawatts. That amount of power could support nearly 5 million homes, based on the average usage of a California household of 6,174 kilowatt-hours a year.
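The homes figure follows from simple unit arithmetic; here is a quick Python check (the 6,174 kilowatt-hours-per-household figure comes from the article, the rest is standard unit conversion):

```python
# 3.5 GW of continuous load, converted to annual energy, divided by the
# article's average California household usage (6,174 kWh per year).
HOURS_PER_YEAR = 8760

gigawatts = 3.5
annual_kwh = gigawatts * 1_000_000 * HOURS_PER_YEAR  # GW -> kW, times hours/year
homes = annual_kwh / 6_174

print(f"{homes:,.0f} homes")  # roughly 5 million, matching the article
```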

In the June presentation, PG&E said the new data centers would require it to spend billions of dollars on new infrastructure.

Already PG&E can’t keep up with connecting customers to the grid. It has fallen so far behind on connecting new housing developments that last year legislators passed a law to try to shorten the delays. At that time, the company told Politico that the delays stemmed from rising electricity demand, including from data centers.

In a statement to The Times, PG&E said its system was “ready for data centers.”

The company said its analysis showed that adding the data centers would not increase bills for other customers.

Most of the year, excluding extreme hot weather, its grid “is only 45% utilized on average,” the company said.

“Data centers’ baseload will enable us to utilize more of this percentage and deliver more per customer dollar,” the company said. “For every 1,000 MW load from data centers we anticipate our customers could expect 1-2% saving on their monthly electricity bill.”

The company added that it was “developing tools to ensure that every customer can cost-effectively connect new loads to the system with minimal delay.”

Lynch questioned the company’s analysis that adding data centers could reduce bills for other customers. She pointed out that utilities earn profits by investing in new infrastructure. That’s because they get to recover that cost — plus an annual rate of return — through rates billed to all customers.

“The more they spend, the more they make,” she said.

In the desert, cheap land and green energy


A geothermal plant viewed from across the Salton Sea in December 2022.

(Gina Ferazzi / Los Angeles Times)

The power and land constraints in Santa Clara and other cities have data center developers looking for new frontiers.

“On the edge of the Southern California desert in Imperial County sits an abundance of land,” begins the sales brochure for the data center that a company called CalEthos is building near the south shore of the Salton Sea.

Electricity for the data center’s servers would come from the geothermal and solar plants built near the site in an area that has become known as Lithium Valley.

The company is negotiating to purchase as much as 500 megawatts of power, the brochure said.

Water for the project would come from the state’s much fought over allotment from the Colorado River.

Imperial County is one of California’s poorest counties. More than 80% of its population are Latino. Many residents are farmworkers.

Executives from Tustin-based CalEthos told The Times that the company would help the local community by using power from the nearby geothermal plants.

“By creating demand for local energy, CalEthos will help accelerate the development of Lithium Valley and its associated economic benefits,” Joel Stone, the company’s president, wrote in an email.

“We recognize the importance of responsible energy and water use in California,” Stone said. “Our data centers will be designed to be as efficient as possible.”

For example, Stone said that in order to minimize water use, CalEthos plans a cooling system where water is recirculated and “requires minimal replenishment due to evaporation.”

Already, a local community group, Comite Civico del Valle, has raised concerns about the environmental and health risks of one of the nearby geothermal plants that plans to produce lithium from the brine brought up in the energy production process.

One of the group’s concerns about the geothermal plant is that its water use will leave less to replenish the Salton Sea. The lake has been decreasing in size, creating a larger dry shoreline that is laden with bacteria and chemicals left from decades of agricultural runoff. Scientists have tied the high rate of childhood asthma in the area to dust from the shrinking lake’s shores.

James Blair, associate professor of geography and anthropology at Cal Poly Pomona, questioned whether the area was the right place for a mammoth data center.

“Data centers drain massive volumes of energy and water for chillers and cooling towers to prevent servers from overheating,” he said.

Blair said that while the company can tell customers its data center is supported by environmentally friendly solar and geothermal power, it will take that renewable energy away from the rest of California’s grid, making it harder for the state to meet its climate goals.



Designing a new way to optimize complex coordinated systems

Using diagrams to represent interactions in multipart systems can provide a faster way to design software improvements.

Coordinating complicated interactive systems, whether it’s the different modes of transportation in a city or the various components that must work together to make an effective and efficient robot, is an increasingly important subject for software designers to tackle. Now, researchers at MIT have developed an entirely new way of approaching these complex problems, using simple diagrams as a tool to reveal better approaches to software optimization in deep-learning models.They say the new method makes addressing these complex tasks so simple that it can be reduced to a drawing that would fit on the back of a napkin.The new approach is described in the journal Transactions of Machine Learning Research, in a paper by incoming doctoral student Vincent Abbott and Professor Gioele Zardini of MIT’s Laboratory for Information and Decision Systems (LIDS).“We designed a new language to talk about these new systems,” Zardini says. This new diagram-based “language” is heavily based on something called category theory, he explains.It all has to do with designing the underlying architecture of computer algorithms — the programs that will actually end up sensing and controlling the various different parts of the system that’s being optimized. “The components are different pieces of an algorithm, and they have to talk to each other, exchange information, but also account for energy usage, memory consumption, and so on.” Such optimizations are notoriously difficult because each change in one part of the system can in turn cause changes in other parts, which can further affect other parts, and so on.The researchers decided to focus on the particular class of deep-learning algorithms, which are currently a hot topic of research. Deep learning is the basis of the large artificial intelligence models, including large language models such as ChatGPT and image-generation models such as Midjourney. 
These models manipulate data by a “deep” series of matrix multiplications interspersed with other operations. The numbers within matrices are parameters, and are updated during long training runs, allowing for complex patterns to be found. Models consist of billions of parameters, making computation expensive, and hence improved resource usage and optimization invaluable.Diagrams can represent details of the parallelized operations that deep-learning models consist of, revealing the relationships between algorithms and the parallelized graphics processing unit (GPU) hardware they run on, supplied by companies such as NVIDIA. “I’m very excited about this,” says Zardini, because “we seem to have found a language that very nicely describes deep learning algorithms, explicitly representing all the important things, which is the operators you use,” for example the energy consumption, the memory allocation, and any other parameter that you’re trying to optimize for.Much of the progress within deep learning has stemmed from resource efficiency optimizations. The latest DeepSeek model showed that a small team can compete with top models from OpenAI and other major labs by focusing on resource efficiency and the relationship between software and hardware. Typically, in deriving these optimizations, he says, “people need a lot of trial and error to discover new architectures.” For example, a widely used optimization program called FlashAttention took more than four years to develop, he says. But with the new framework they developed, “we can really approach this problem in a more formal way.” And all of this is represented visually in a precisely defined graphical language.But the methods that have been used to find these improvements “are very limited,” he says. 
"I think this shows that there's a major gap, in that we don't have a formal systematic method of relating an algorithm to either its optimal execution, or even really understanding how many resources it will take to run." But now, with the new diagram-based method they devised, such a system exists.

Category theory, which underlies this approach, is a way of mathematically describing the different components of a system and how they interact in a generalized, abstract manner. Different perspectives can be related: for example, mathematical formulas can be related to the algorithms that implement them and use resources, or descriptions of systems can be related to robust "monoidal string diagrams." These visualizations allow you to directly play around and experiment with how the different parts connect and interact. What they developed, he says, amounts to "string diagrams on steroids," which incorporate many more graphical conventions and many more properties.

"Category theory can be thought of as the mathematics of abstraction and composition," Abbott says. "Any compositional system can be described using category theory, and the relationship between compositional systems can then also be studied." Algebraic rules that are typically associated with functions can also be represented as diagrams, he says. "Then, a lot of the visual tricks we can do with diagrams, we can relate to algebraic tricks and functions. So, it creates this correspondence between these different systems."

As a result, he says, "this solves a very important problem, which is that we have these deep-learning algorithms, but they're not clearly understood as mathematical models." But by representing them as diagrams, it becomes possible to approach them formally and systematically, he says.

One thing this enables is a clear visual understanding of the way parallel real-world processes can be represented by parallel processing in multicore computer GPUs.
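Abbott's description of category theory as "the mathematics of abstraction and composition" can be made concrete with a small sketch: treat each component as a function annotated with a resource cost, and let sequential and parallel (monoidal) composition combine both the functions and their costs. This is our own illustrative analogy, not the paper's formalism.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Morphism:
    """A component: a function paired with a resource cost (arbitrary units)."""
    fn: Callable
    cost: float

    def then(self, other: "Morphism") -> "Morphism":
        # Sequential composition: run self, feed its output into other.
        return Morphism(lambda x: other.fn(self.fn(x)), self.cost + other.cost)

    def par(self, other: "Morphism") -> "Morphism":
        # Parallel (monoidal) composition: run both side by side.
        return Morphism(lambda xy: (self.fn(xy[0]), other.fn(xy[1])),
                        self.cost + other.cost)

double = Morphism(lambda x: 2 * x, cost=1.0)
inc = Morphism(lambda x: x + 1, cost=1.0)

pipeline = double.then(inc)   # x -> 2x + 1, total cost 2.0
branches = double.par(inc)    # (x, y) -> (2x, y + 1)
```

String diagrams draw exactly these two operations (boxes wired in sequence, or stacked in parallel), which is roughly what allows resource usage to be read off the picture.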
"In this way," Abbott says, "diagrams can both represent a function, and then reveal how to optimally execute it on a GPU."

The "attention" algorithm is used by deep-learning algorithms that require general, contextual information, and is a key phase of the serialized blocks that constitute large language models such as ChatGPT. FlashAttention is an optimization that took years to develop, but resulted in a sixfold improvement in the speed of attention algorithms.

Applying their method to the well-established FlashAttention algorithm, Zardini says that "here we are able to derive it, literally, on a napkin." He then adds, "OK, maybe it's a large napkin." But to drive home the point about how much their new approach can simplify dealing with these complex algorithms, they titled their formal research paper on the work "FlashAttention on a Napkin."

This method, Abbott says, "allows for optimization to be really quickly derived, in contrast to prevailing methods." While they initially applied this approach to the already existing FlashAttention algorithm, thus verifying its effectiveness, "we hope to now use this language to automate the detection of improvements," says Zardini, who in addition to being a principal investigator in LIDS is the Rudge and Nancy Allen Assistant Professor of Civil and Environmental Engineering and an affiliate faculty member of the Institute for Data, Systems, and Society.

The plan, he says, is to ultimately develop the software to the point that "the researcher uploads their code, and with the new algorithm you automatically detect what can be improved, what can be optimized, and you return an optimized version of the algorithm to the user."

In addition to automating algorithm optimization, Zardini notes that a robust analysis of how deep-learning algorithms relate to hardware resource usage allows for systematic co-design of hardware and software.
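For reference, the attention function that FlashAttention accelerates can be written naively as softmax(QK^T/sqrt(d))V. The textbook sketch below is only for orientation: FlashAttention's contribution is computing this same function in tiles so the full score matrix never has to be materialized, which this naive version does not attempt.

```python
import numpy as np

def attention(Q, K, V):
    """Naive scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)    # for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V                              # weighted mix of values

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((5, 4)) for _ in range(3))
out = attention(Q, K, V)                            # shape (5, 4)
```

The intermediate `scores` matrix, whose memory cost grows quadratically with sequence length, is precisely the kind of resource the researchers' diagrams make explicit.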
This line of work integrates with Zardini's focus on categorical co-design, which uses the tools of category theory to simultaneously optimize various components of engineered systems.

Abbott says that "this whole field of optimized deep learning models, I believe, is quite critically unaddressed, and that's why these diagrams are so exciting. They open the doors to a systematic approach to this problem."

"I'm very impressed by the quality of this research. ... The new approach to diagramming deep-learning algorithms used by this paper could be a very significant step," says Jeremy Howard, founder and CEO of Answers.ai, who was not associated with this work. "This paper is the first time I've seen such a notation used to deeply analyze the performance of a deep-learning algorithm on real-world hardware. ... The next step will be to see whether real-world performance gains can be achieved."

"This is a beautifully executed piece of theoretical research, which also aims for high accessibility to uninitiated readers — a trait rarely seen in papers of this kind," says Petar Velickovic, a senior research scientist at Google DeepMind and a lecturer at Cambridge University, who was not associated with this work. These researchers, he says, "are clearly excellent communicators, and I cannot wait to see what they come up with next!"

The new diagram-based language, having been posted online, has already attracted great attention and interest from software developers. A reviewer of Abbott's prior paper introducing the diagrams noted that "the proposed neural circuit diagrams look great from an artistic standpoint (as far as I am able to judge this)." "It's technical research, but it's also flashy!" Zardini says.

The UK Says at an Energy Summit That Green Power Will Boost Security, as the US Differs

Britain has announced a major investment in wind power as it hosts an international summit on energy security

LONDON (AP) — Britain announced a major investment in wind power Thursday as it hosted an international summit on energy security — with Europe and the United States at odds over whether to cut their reliance on fossil fuels.

U.K. Prime Minister Keir Starmer said the government will invest 300 million pounds ($400 million) in boosting Britain's capacity to manufacture components for the offshore wind industry, a move it hopes will encourage private investment in the U.K.'s renewable energy sector.

"As long as energy can be weaponized against us, our countries and our citizens are vulnerable and exposed," U.K. Energy Secretary Ed Miliband told delegates. He said "low-carbon power" was a route to energy security as well as a way to slow climate change.

Britain now gets more than half its electricity from renewable sources such as wind and solar power, and the rest from natural gas and nuclear energy. It aims to generate all of the U.K.'s energy from renewable sources by 2030.

Tommy Joyce, U.S. acting assistant secretary of energy for international affairs, told participants they should be "honest about the world's growing energy needs, not focused on net-zero politics." He called policies that push for clean power over fossil fuels "harmful and dangerous," and claimed building wind turbines requires "concessions to or coercion from China" because it supplies necessary rare minerals.

Hosted by the British government and the International Energy Agency, the two-day summit brings together government ministers from 60 countries, senior European Union officials, energy sector CEOs, and heads of international organizations and nonprofits to assess risks to the global energy system and figure out solutions.

Associated Press writer Jennifer McDermott contributed to this story.

___

The Associated Press' climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content.
Find AP's standards for working with philanthropies, a list of supporters and funded coverage areas at AP.org.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Steelhead trout rescued from Palisades fire spawn in their new Santa Barbara County home

After a stressful journey out of the burn zone in Malibu, the endangered trout have spawned in their adopted stream in Santa Barbara County.

Wildlife officials feared critically endangered steelhead trout rescued from the Palisades fire burn scar might not be up for spawning after all they'd been through over the last few months.

After their watershed in the Santa Monica Mountains was scorched in January, the fish were stunned with electricity, scooped up in buckets, trucked to a hatchery, fed unfamiliar food and then moved to a different creek. It was all part of a liberation effort pulled off in the nick of time.

"This whole thing is just a very stressful and traumatic event, and I'm happy that we didn't really kill many fish," said Kyle Evans, an environmental program manager for the California Department of Fish and Wildlife, which led the rescue. "But I was concerned that I might have just disrupted this whole months-long process of getting ready to spawn."

But this month, spawn they did. It's believed that there are now more than 100 baby trout swishing around their new digs in Arroyo Hondo Creek in Santa Barbara County. Their presence is a triumph — for the species and for their adopted home. However, more fish require more suitable habitat, which is lacking in Southern California — in part due to drought and the increased frequency of devastating wildfires.

Steelhead trout are the same species as rainbow trout, but they have different lifestyles: steelhead migrate to the ocean and return to their natal streams to spawn, while rainbows spend their lives in freshwater. Steelhead were once abundant in Southern California, but their numbers plummeted amid coastal development and overfishing.
A distinct Southern California population is listed as endangered at the state and federal level.

The young fish sighted this month mark the next generation of what was the last population of steelhead in the Santa Monica Mountains, a range that stretches from the Hollywood Hills to Point Mugu in Ventura County. They also represent the return of a species to a watershed that itself was devastated by a fire four years ago but has since recovered.

The Alisal blaze torched roughly 95% of the Arroyo Hondo Preserve, located west of Santa Barbara, and subsequent debris flows choked the creek of the same name that housed steelhead. All the fish perished, according to Meredith Hendricks, executive director of the Land Trust for Santa Barbara County, a nonprofit organization that owns and manages the preserve.

"To be able to … offer space for these fish to be transplanted to — when we ourselves had experienced a similar situation but lost our fish — it was just a really big deal," Hendricks said.

Arroyo Hondo Creek bears similarities to the trout's native Topanga Creek; both are coastal streams of roughly the same size.
And it has a bonus feature: a state-funded fish passage constructed under Highway 101 in 2008, which improved fish movement between the stream and the ocean.

Spawning is a biologically and energetically demanding endeavor for steelhead, and the process likely began in December or earlier, according to Evans. That means it was already underway when 271 steelhead were evacuated in January from Topanga Creek, a biodiversity hot spot in Malibu that was badly damaged by the Palisades fire. It continued when they were hauled about 50 miles north to a hatchery in Fillmore, where they hung out until 266 of them made it to Arroyo Hondo the following month.

State wildlife personnel regularly surveyed the fish in their new digs but didn't see the spawning nests, which can be missed.

Then, on April 7, Evans got a text message from the Land Trust's land programs director, Leslie Chan, with a video that appeared to show a freshly hatched young-of-the-year — the wonky name for fish born during the steelhead's sole annual spawn. The following day, Evans' team was dispatched to the creek and confirmed the discovery. They tallied about 100 of the newly hatched fish.

The young trout span roughly one inch and, as Evans put it, aren't too bright. They hang out in the shallows and don't bolt from predators. "They're kind of just happy to be alive, and they're not really trying to hide," he said.

By the end of summer, Evans estimates, two-thirds will die off. But the survivors are enough to keep the population charging onward. Evans hopes that in a few years there will be three to four times the number of fish that initially moved in. The plan is to eventually relocate at least some back to their native home of Topanga Creek.

Right now, Topanga "looks pretty bad," Evans said.
The Palisades fire stripped the surrounding hillsides of vegetation, paving the way for dirt, ash and other material to pour into the waterway. Another endangered fish, the northern tidewater goby, was rescued from the same watershed shortly before the steelhead were liberated. Within two days of the trout's removal, the first storm of the season arrived, likely burying the remaining fish in a muddy slurry.

Evans expects it will be about four years before Topanga Creek is ready to support steelhead again, based on his experience observing streams recover after the Thomas, Woolsey, Alisal and other fires.

There's also discussion about moving steelhead around to create backup populations should calamity befall one, as well as to boost the genetic diversity of the rare fish. For example, some of the steelhead saved from Topanga could be moved to Malibu Creek, another stream in the Santa Monica Mountains that empties into Santa Monica Bay. There are efforts underway to remove the 100-foot Rindge Dam in Malibu Creek to open up more habitat for the fish.

"As we saw, if you have one population in the Santa Monica Mountains and a fire happens, you could just lose it forever," Evans said. "So having fish in multiple areas is the kind of way to defend against that."

With the Topanga Creek steelhead biding their time up north, it's believed there are none currently inhabiting the Santa Monicas.
Habitat restoration is key for the species' survival, according to Evans, who advocates for directing funding to such efforts, including soon-to-come-online money from Proposition 4, a $10-billion bond measure to finance water, clean energy and other environmental projects.

"It doesn't matter how many fish you have, or if you're growing them in a hatchery, or what you're doing," he said. "If they can't be supported on the landscape, then there's no point."

Some trout will end up making their temporary lodging permanent, according to Hendricks, of the Land Trust. Arroyo Hondo is a long creek with plenty of nooks and crannies for trout to hide in. So when it comes time to bring the steelhead home, she said, "I'm sure some will get left behind."

Chicago Teachers Union secures clean energy wins in new contract

The Chicago Teachers Union expects its new, hard-fought contract to help drive clean energy investments and train the next generation of clean energy workers, even as the Trump administration attacks such priorities. The contract approved by 97% of union members this month represents the first time the union has…

The Chicago Teachers Union expects its new, hard-fought contract to help drive clean energy investments and train the next generation of clean energy workers, even as the Trump administration attacks such priorities.

The contract approved by 97% of union members this month represents the first time the union has bargained with school officials specifically around climate change and energy, said union Vice President Jackson Potter. The deal still needs to be approved by the Chicago Board of Education.

If approved, the contract will result in new programs that prepare students for clean energy jobs, developed in collaboration with local labor unions. It mandates that district officials work with the teachers union to seek funding for clean energy investments and update a climate action plan by 2026. And it calls for installing heat pumps and outfitting 30 schools with solar panels — if funding can be secured.

During almost a year of contentious negotiations, the more than 25,000-member union had also demanded paid climate-educator positions, an all-electric school bus fleet, and a requirement that all newly constructed schools be carbon-free. While those provisions did not end up in the final agreement, leaders say the four-year contract is a "transformative" victory that sets the stage for more ambitious demands next time.

"This contract is setting the floor of what we hope we can accomplish," said Lauren Bianchi, who taught social studies at George Washington High School on the city's South Side for six years before becoming green schools organizer for the union. "It shows we can win on climate, even despite Trump."

The climate-related provisions are part of what the Chicago Teachers Union and an increasing number of unions nationwide refer to as "common good" demands, meant to benefit not only their members in the workplace but the entire community.
In this and its 2019 contract, the Chicago union also won "common good" items such as protections for immigrant students and teachers, and affordable housing–related measures. The new contract also guarantees teachers academic freedom at a time when the federal government is trying to limit schools from teaching materials related to diversity, equity, and inclusion. "Black history, Indigenous history, climate science — that's protected instruction now," said Potter.

Chicago Public Schools did not respond to emailed questions for this story, except to forward a press release that did not mention clean energy provisions.

Training Chicago's students for clean energy jobs

The union crafted its proposals based on discussions with three environmental and community organizations, Bianchi said — the Southeast Environmental Task Force, People for Community Recovery, and ONE Northside. The Southeast Environmental Task Force led the successful fight to ban new petcoke storage in Chicago, and the group's co-executive director, Olga Bautista, is also vice president of the 21-member school board. People for Community Recovery was founded by Hazel Johnson, who is often known as "the mother of the environmental justice movement." And ONE Northside emphasizes the link between clean energy and affordable housing.

Clean energy job training was a priority for all three organizations, Potter said. Under the contract, the union and district officials will work with other labor unions to create pre-apprenticeship programs for students, which are crucial to entering the union-dominated building trades to install solar, do energy-efficiency overhauls, and electrify homes with heat pumps and other technology. The contract demands the district create one specific new clean energy jobs pathway program during each year of the four-year contract.
It also mandates renovating schools for energy efficiency and installing modern HVAC systems, and orders the school district to work with trade unions to create opportunities for Chicago Public Schools students and graduates to be hired for such work.

"The people in the community have identified jobs and economic justice as being essential for environmental justice," said Bianchi. "I've mostly taught juniors and seniors; a lot expressed frustration that college is not their plan. They wish they could learn job skills to enter a trade."

Chicago schools progress on solar, energy efficiency, and electrification

Installing solar could help the district meet its clean energy goals, which include sourcing 100% of its electricity from renewables by this year. The district has invested more than $6 million in energy efficiency and efficient lighting since 2018 and cut its carbon dioxide emissions by more than 27,000 metric tons, school district spokesperson Evan Moore told Canary Media last fall as contract negotiations were proceeding.

The schools are eligible for subsidized solar panels under the state's Illinois Shines program, and they can tap the federal 30% investment tax credit for solar arrays, with a new direct-pay option tailored to tax-exempt organizations like schools.

Costa Rica Proposes Strict Penalties for Illegal National Park Entries

Costa Rica is cracking down on illegal entries into its national parks and protected areas, citing dangers to visitors and environmental harm. Franz Tattenbach, Minister of Environment and Energy (MINAE), has called on lawmakers to approve a bill imposing fines of up to ¢2.3 million (approximately $4,400) on individuals and tour operators who access these […] The post Costa Rica Proposes Strict Penalties for Illegal National Park Entries appeared first on The Tico Times | Costa Rica News | Travel | Real Estate.

Costa Rica is cracking down on illegal entries into its national parks and protected areas, citing dangers to visitors and environmental harm. Franz Tattenbach, Minister of Environment and Energy (MINAE), has called on lawmakers to approve a bill imposing fines of up to ¢2.3 million (approximately $4,400) on individuals and tour operators who access these areas without authorization.

More than 500 unauthorized entries into Costa Rica's 30 national parks and reserves have been reported so far this year. High-risk areas like the Poás, Turrialba, Rincón de la Vieja, and Arenal volcanoes are frequent targets, where illegal tours bypass safety protocols. Unscrupulous operators promote these "exclusive" experiences on social media, often lacking insurance, safety equipment, or trained guides. "These operators abandon clients if intercepted by authorities, leaving them vulnerable in hazardous areas," Tattenbach said.

Poás Volcano National Park, closed since March 26 due to seismic activity and ash emissions, remains a hotspot for illegal tours.

The proposed bill, under discussion by MINAE and the National System of Conservation Areas (SINAC), would introduce fines ranging from ¢1.3 million to ¢2.3 million ($2,500 to $4,400) for unauthorized entry, targeting both operators and participants. If a rescue operation involving the Costa Rican Red Cross or MINAE personnel is required, an additional fine of ¢2.3 million ($4,400) could be imposed. Current law penalizes illegal entry under Article 58 of Forestry Law 7575, with three months to three years in prison, but enforcement is inconsistent. The new bill aims to strengthen deterrence.

"These hikes involve steep slopes, toxic gases, and the risk of volcanic eruptions, which can be fatal," Tattenbach warned, citing the 2017 Poás eruption that closed the park for over a year. Illegal entries also threaten Costa Rica's biodiversity, which includes 5% of the world's species.
Unauthorized trails disrupt ecosystems and increase the risks of poaching, according to Jorge Mario Rodríguez, Vice Minister of Environment. The Volcanological and Seismological Observatory of Costa Rica (OVSICORI) monitors volcanic activity to inform park closures, but illegal tours undermine these safety measures.

Increased surveillance: SINAC, the Costa Rican Fire Department, the Red Cross, and the Police Force will intensify surveillance going forward, targeting high-risk national parks and roadways to prevent unauthorized access, wildlife extraction, hunting, and trade in protected flora and fauna. "These operations safeguard our natural heritage and ensure visitor safety," Tattenbach said. SINAC's year-round efforts have intercepted numerous illegal tours in 2025.

Visiting parks safely: MINAE and SINAC urge visitors to use authorized operators and purchase tickets via the SINAC website or at park entrances. Guided tours, available through platforms like Viator or GetYourGuide, offer safe experiences in parks like Manuel Antonio or Corcovado. Tourists should check park statuses before planning visits, as closures due to volcanic activity or weather are common. "Respecting regulations protects both you and Costa Rica's natural treasures," Rodríguez said.

Preserving ecotourism: As the proposed bill awaits Legislative Assembly review, MINAE urges compliance to maintain Costa Rica's status as a global conservation leader. For updates on the bill or park regulations, visit MINAE's website.
