
Data center emissions likely 662% higher than big tech claims. Can it keep up the ruse?

News Feed
Sunday, September 15, 2024


Emissions from the in-house data centers of Google, Microsoft, Meta and Apple may be 7.62 times higher than the official tally

Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are likely about 662% – or 7.62 times – higher than officially reported.

Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

AI places far greater energy demands on data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Research from Morgan Stanley, a Goldman competitor, has reached similar conclusions, projecting that global data center emissions will accumulate to 2.5bn metric tons of CO2 equivalent by 2030.

In the meantime, all five tech companies have claimed carbon neutrality, though Google dropped the label last year as it stepped up its carbon accounting standards. Amazon is the most recent to make the claim, announcing in July that it had met its goal seven years early and achieved a gross emissions cut of 3%.

“It’s down to creative accounting,” explained a representative from Amazon Employees for Climate Justice, an advocacy group composed of current Amazon employees who are dissatisfied with their employer’s action on climate. “Amazon – despite all the PR and propaganda that you’re seeing about their solar farms, about their electric vans – is expanding its fossil fuel use, whether it’s in data centers or whether it’s in diesel trucks.”

A misguided metric

The most important tools in this “creative accounting” when it comes to data centers are renewable energy certificates, or Recs. These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

Recs are used to calculate “market-based” emissions, or the official emissions figures used by the firms. When Recs and offsets are left out of the equation, we get “location-based emissions” – the actual emissions generated from the area where the data is being processed.
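To make the difference concrete, here is a deliberately simplified sketch of the two accounting methods. All figures are hypothetical, and real market-based accounting applies contractual emission factors rather than simple netting, but the Rec-netting effect it illustrates is the core mechanic:

```python
# Simplified sketch of location-based vs market-based emissions accounting.
# All numbers are hypothetical illustrations, not any company's figures.

def location_based(consumption_mwh: float, grid_tco2e_per_mwh: float) -> float:
    """Emissions of the grid actually powering the facility."""
    return consumption_mwh * grid_tco2e_per_mwh

def market_based(consumption_mwh: float, grid_tco2e_per_mwh: float,
                 recs_mwh: float) -> float:
    """Official figure: each Rec zeroes out one MWh of consumption,
    wherever the renewable power was actually generated."""
    unmatched_mwh = max(consumption_mwh - recs_mwh, 0.0)
    return unmatched_mwh * grid_tco2e_per_mwh

consumption = 1_000_000   # MWh consumed by a hypothetical data center fleet
intensity = 0.4           # tCO2e per MWh on its local grid (hypothetical)
recs = 950_000            # MWh of Recs bought, possibly "an ocean away"

print(location_based(consumption, intensity))      # 400000.0 tCO2e emitted
print(market_based(consumption, intensity, recs))  #  20000.0 tCO2e reported
```

Under this toy model the facility causes 400,000 tCO2e of grid emissions but officially reports 20,000, without any change to the power it actually draws.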

The trend in those emissions is worrying. If these five companies were one country, the sum of their “location-based” emissions in 2022 would rank them as the 33rd highest-emitting country, behind the Philippines and above Algeria.

Many data center industry experts also recognize that location-based metrics are more honest than the official, market-based numbers reported.

“Location-based [accounting] gives an accurate picture of the emissions associated with the energy that’s actually being consumed to run the data center. And Uptime’s view is that it’s the right metric,” said Jay Dietrich, the research director of sustainability at Uptime Institute, a leading data center advisory and research organization.

Nevertheless, the Greenhouse Gas (GHG) Protocol, a carbon accounting oversight body, allows Recs to be used in official reporting. The extent to which they should be allowed remains controversial among tech companies, and has led to a lobbying battle between two factions over GHG Protocol’s rule-making process.

On one side there is the Emissions First Partnership, spearheaded by Amazon and Meta. It aims to keep Recs in the accounting process regardless of their geographic origins. In practice, this is only a slightly looser interpretation of what GHG Protocol already permits.

The opposing faction, headed by Google and Microsoft, argues that there needs to be time-based and location-based matching of renewable production and energy consumption for data centers. Google calls this its 24/7 goal, or its goal to have all of its facilities run on renewable energy 24 hours a day, seven days a week by 2030. Microsoft calls it its 100/100/0 goal, or its goal to have all its facilities running on 100% carbon-free energy 100% of the time, making zero carbon-based energy purchases by 2030.
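The practical difference between annual Rec netting and the 24/7 approach is easiest to see hour by hour. Below is a minimal sketch with made-up hourly figures: annual-style netting compares yearly totals, while hourly matching only credits carbon-free generation in the hours it actually overlaps with consumption.

```python
# Hypothetical hourly data for one facility (MWh per hour).
consumption = [10, 12, 15, 14]
carbon_free = [14, 11, 6, 18]   # contracted renewable generation, same grid

# Annual-style netting: compare totals, ignoring when power was produced.
annual_matched = min(sum(consumption), sum(carbon_free))

# 24/7-style matching: only generation in the same hour counts.
hourly_matched = sum(min(c, g) for c, g in zip(consumption, carbon_free))

total = sum(consumption)
print(f"annual netting: {annual_matched / total:.0%} matched")  # 96%
print(f"hourly (24/7):  {hourly_matched / total:.0%} matched")  # 80%
```

The same contracts look near-fully matched on an annual basis but leave a visible gap once timing is taken into account, which is why the two factions' positions diverge so sharply.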

Google has already phased out its Rec use, and Microsoft aims to do the same with low-quality “unbundled” (non-location-specific) Recs by 2030.

Academics and carbon management industry leaders alike are also against the GHG Protocol’s permissiveness on Recs. In an open letter from 2015, more than 50 such individuals argued that “it should be a bedrock principle of GHG accounting that no company be allowed to report a reduction in its GHG footprint for an action that results in no change in overall GHG emissions. Yet this is precisely what can happen under the guidance given the contractual/Rec-based reporting method.”

To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

In fact, location-based numbers are only directly reported (that is, not hidden in third-party assurance statements or in footnotes) by two companies – Google and Meta. And those two firms only include those figures for one subtype of emissions: scope 2, or the indirect emissions companies cause by purchasing energy from utilities and large-scale generators.

In-house data centers

Scope 2 is the category that includes the majority of the emissions that come from in-house data center operations, as it concerns the emissions associated with purchased energy – mainly, electricity.

Data centers should also make up a majority of overall scope 2 emissions for each company except Amazon, given that the other sources of scope 2 emissions for these companies stem from the electricity consumed by firms’ offices and retail spaces – operations that are relatively small and not carbon-intensive. Amazon has one other carbon-intensive business vertical to account for in its scope 2 emissions: its warehouses and e-commerce logistics.

For the firms that give data center-specific data – Meta and Microsoft – this holds true: data centers made up 100% of Meta’s market-based (official) scope 2 emissions and 97.4% of its location-based emissions. For Microsoft, those numbers were 97.4% and 95.6%, respectively.

The massive differences between location-based and official scope 2 numbers show just how carbon-intensive data centers really are, and how deceptive firms’ official emissions figures can be. Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent – all of it attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone – an increase of roughly 14,000 times.

A similar result can be seen with Microsoft. The firm reported its official data center-related emissions for 2022 as 280,782 metric tons CO2 equivalent. Under a location-based accounting method, that number jumps to 6.1m metric tons CO2 equivalent. That’s a nearly 22 times increase.

While Meta’s reporting gap is more egregious, both firms’ location-based totals are higher specifically because data center emissions are undercounted: unreported data center-related emissions made up 97.4% of the gap between Meta’s location-based and official scope 2 numbers in 2022, and 95.55% of Microsoft’s.

Specific data center-related emissions numbers aren’t available for the rest of the firms. However, given that Google and Apple have scope 2 profiles similar to Meta’s and Microsoft’s, their location-based data center emissions are likely understated by roughly the same multiple as their overall location-based scope 2 emissions.

In total, the sum of location-based emissions in this category between 2020 and 2022 was at least 275% higher (or 3.75 times) than the sum of their official figures. Amazon did not provide the Guardian with location-based scope 2 figures for 2020 and 2021, so its official (and likely much lower) numbers were used for this calculation for those years.
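As a quick sanity check on the arithmetic running through this analysis: a figure that is X% higher than a baseline is (1 + X/100) times the baseline, which is how 662% maps to 7.62 and 275% to 3.75. A minimal sketch, using only numbers quoted above:

```python
# "X% higher" converts to a multiple of (1 + X/100).
def multiple(pct_higher: float) -> float:
    return 1 + pct_higher / 100

print(multiple(662))              # 7.62  -> the headline in-house figure
print(multiple(275))              # 3.75  -> the scope 2 total above
print(round(6.1e6 / 280_782, 1))  # 21.7  -> Microsoft's "nearly 22 times"
```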

Third-party data centers

Big tech companies also rent a large portion of their data center capacity from third-party data center operators (or “colocation” data centers). According to the Synergy Research Group, large tech companies (or “hyperscalers”) represented 37% of worldwide data center capacity in 2022, with half of that capacity coming through third-party contracts. While this group includes companies other than Google, Amazon, Meta, Microsoft and Apple, it gives an idea of the extent of these firms’ activities with third-party data centers.

Those emissions should theoretically fall under scope 3: all emissions a firm is responsible for that can’t be attributed to the fuel or electricity it consumes.

When it comes to a big tech firm’s operations, this would encapsulate everything from the manufacturing processes of the hardware it sells (like the iPhone or Kindle) to the emissions from employees’ cars during their commutes to the office.

When it comes to data centers, scope 3 emissions include the carbon emitted during the construction of in-house data centers and during the manufacturing of the equipment used inside them. They may also include those same categories – construction and equipment manufacturing – plus electricity-related emissions for the third-party data centers a firm partners with.

However, whether or not these emissions are fully included in reports is almost impossible to prove. “Scope 3 emissions are hugely uncertain,” said Dietrich. “This area is a mess just in terms of accounting.”

According to Dietrich, some third-party data center operators put their energy-related emissions in their own scope 2 reporting, so those who rent from them can put those emissions into their scope 3. Other third-party data center operators put energy-related emissions into their scope 3 emissions, expecting their tenants to report those emissions in their own scope 2 reporting.

Additionally, all firms use market-based metrics for these scope 3 numbers, which means third-party data center emissions are also undercounted in official figures.

Of the firms that report their location-based scope 3 emissions in the footnotes, only Apple has a large gap between its official scope 3 figure and its location-based scope 3 figure, starting in 2022.

This gap can largely be attributed to data center emissions accounting. The only change to Apple’s scope 3 methodology in 2022 was to include “work from home, third-party cloud services, electricity transmission and distribution losses, and upstream impacts from scope 1 fuels”. Since the firm listed third-party cloud services as having zero emissions under its official scope 3 reporting, that means all emissions associated with those third-party services would only show up in location-based scope 3 emissions from 2022 onwards.

2025 and beyond

Even though big tech hides these emissions, they are due to keep rising. Data centers’ electricity demand is projected to double by 2030 due to the additional load that artificial intelligence poses, according to the Electric Power Research Institute.

Google and Microsoft both blamed AI for their recent upticks in market-based emissions.

“The relative contribution of AI computing loads to Google’s data centers, as I understood it when I left [in 2022], was relatively modest,” said Chris Taylor, current CEO of utility storage firm Gridstor and former site lead for Google’s data center energy strategy unit. “Two years ago, [AI] was not the main thing that we were worried about, at least on the energy team.”

Taylor explained that most of the growth that he saw in data centers while at Google was attributable to growth in Google Cloud, as most enterprises were moving their IT tasks to the firm’s cloud servers.

Whether today’s power grids can withstand the growing energy demands of AI is uncertain. One industry leader – Marc Ganzi, the CEO of DigitalBridge, a private equity firm that owns two of the world’s largest third-party data center operators – has gone as far as to say that the data center sector may run out of power within the next two years.

And as grid interconnection backlogs continue to pile up worldwide, it may be nearly impossible for even the most well-intentioned companies to bring new renewable energy production capacity online in time to meet that demand.


Designing a new way to optimize complex coordinated systems

Using diagrams to represent interactions in multipart systems can provide a faster way to design software improvements.

Coordinating complicated interactive systems, whether it’s the different modes of transportation in a city or the various components that must work together to make an effective and efficient robot, is an increasingly important subject for software designers to tackle. Now, researchers at MIT have developed an entirely new way of approaching these complex problems, using simple diagrams as a tool to reveal better approaches to software optimization in deep-learning models.

They say the new method makes addressing these complex tasks so simple that it can be reduced to a drawing that would fit on the back of a napkin.

The new approach is described in the journal Transactions on Machine Learning Research, in a paper by incoming doctoral student Vincent Abbott and Professor Gioele Zardini of MIT’s Laboratory for Information and Decision Systems (LIDS).

“We designed a new language to talk about these new systems,” Zardini says. This new diagram-based “language” is heavily based on something called category theory, he explains.

It all has to do with designing the underlying architecture of computer algorithms – the programs that will actually end up sensing and controlling the various parts of the system being optimized. “The components are different pieces of an algorithm, and they have to talk to each other, exchange information, but also account for energy usage, memory consumption, and so on.” Such optimizations are notoriously difficult because each change in one part of the system can in turn cause changes in other parts, which can further affect other parts, and so on.

The researchers decided to focus on deep-learning algorithms, currently a hot topic of research. Deep learning is the basis of large artificial intelligence models, including large language models such as ChatGPT and image-generation models such as Midjourney. These models manipulate data through a “deep” series of matrix multiplications interspersed with other operations. The numbers within the matrices are parameters that are updated during long training runs, allowing complex patterns to be found. Models consist of billions of parameters, which makes computation expensive – and improved resource usage and optimization invaluable.

Diagrams can represent the details of the parallelized operations that deep-learning models consist of, revealing the relationships between algorithms and the parallelized graphics processing unit (GPU) hardware they run on, supplied by companies such as NVIDIA. “I’m very excited about this,” says Zardini, because “we seem to have found a language that very nicely describes deep learning algorithms, explicitly representing all the important things, which is the operators you use” – for example, the energy consumption, the memory allocation, and any other parameter you’re trying to optimize for.

Much of the progress within deep learning has stemmed from resource-efficiency optimizations. The latest DeepSeek model showed that a small team can compete with top models from OpenAI and other major labs by focusing on resource efficiency and the relationship between software and hardware. Typically, in deriving these optimizations, he says, “people need a lot of trial and error to discover new architectures.” A widely used optimization called FlashAttention, for example, took more than four years to develop.

The methods that have been used to find such improvements “are very limited,” he says. “I think this shows that there’s a major gap, in that we don’t have a formal systematic method of relating an algorithm to either its optimal execution, or even really understanding how many resources it will take to run.” With the new diagram-based framework the researchers devised, such a system now exists: “We can really approach this problem in a more formal way,” and all of it is represented visually in a precisely defined graphical language.

Category theory, which underlies this approach, is a way of mathematically describing the different components of a system and how they interact, in a generalized, abstract manner. Different perspectives can be related: for example, mathematical formulas can be related to the algorithms that implement them and use resources, and descriptions of systems can be related to robust “monoidal string diagrams.” These visualizations let you directly play around and experiment with how the different parts connect and interact. What the researchers developed, he says, amounts to “string diagrams on steroids,” incorporating many more graphical conventions and many more properties.

“Category theory can be thought of as the mathematics of abstraction and composition,” Abbott says. “Any compositional system can be described using category theory, and the relationship between compositional systems can then also be studied.” Algebraic rules that are typically associated with functions can also be represented as diagrams, he says. “Then, a lot of the visual tricks we can do with diagrams, we can relate to algebraic tricks and functions. So, it creates this correspondence between these different systems.”

As a result, he says, “this solves a very important problem, which is that we have these deep-learning algorithms, but they’re not clearly understood as mathematical models.” Representing them as diagrams makes it possible to approach them formally and systematically.

One thing this enables is a clear visual understanding of the way parallel real-world processes can be represented by parallel processing in multicore computer GPUs. “In this way,” Abbott says, “diagrams can both represent a function, and then reveal how to optimally execute it on a GPU.”

The “attention” algorithm is used by deep-learning models that require general, contextual information, and is a key phase of the serialized blocks that constitute large language models such as ChatGPT.
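For readers unfamiliar with the operation being discussed, here is a minimal NumPy sketch of standard scaled dot-product attention, the computation that FlashAttention reorganizes; the shapes and variable names are illustrative and not drawn from the paper:

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention for one head.
    Q, K, V: (sequence_length, head_dim) arrays."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)                   # (seq, seq) similarities
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V                              # weighted mix of values

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8)
```

The naive version materializes the full sequence-by-sequence score matrix; FlashAttention computes the same result in tiles sized to fit fast on-chip GPU memory, which is where its speedup comes from.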
FlashAttention is an optimization that took years to develop, but resulted in a sixfold improvement in the speed of attention algorithms. Applying their method to it, Zardini says, “here we are able to derive it, literally, on a napkin.” He then adds, “OK, maybe it’s a large napkin.” To drive home the point about how much their new approach can simplify dealing with these complex algorithms, they titled their formal research paper on the work “FlashAttention on a Napkin.”

This method, Abbott says, “allows for optimization to be really quickly derived, in contrast to prevailing methods.” While they initially applied the approach to the existing FlashAttention algorithm to verify its effectiveness, “we hope to now use this language to automate the detection of improvements,” says Zardini, who in addition to being a principal investigator in LIDS is the Rudge and Nancy Allen Assistant Professor of Civil and Environmental Engineering and an affiliate faculty member of the Institute for Data, Systems, and Society.

Ultimately, he says, the plan is to develop the software to the point that “the researcher uploads their code, and with the new algorithm you automatically detect what can be improved, what can be optimized, and you return an optimized version of the algorithm to the user.”

In addition to automating algorithm optimization, Zardini notes that a robust analysis of how deep-learning algorithms relate to hardware resource usage allows for systematic co-design of hardware and software. This line of work integrates with Zardini’s focus on categorical co-design, which uses the tools of category theory to simultaneously optimize various components of engineered systems.

Abbott says that “this whole field of optimized deep learning models, I believe, is quite critically unaddressed, and that’s why these diagrams are so exciting. They open the doors to a systematic approach to this problem.”

“I’m very impressed by the quality of this research. ... The new approach to diagramming deep-learning algorithms used by this paper could be a very significant step,” says Jeremy Howard, founder and CEO of Answer.AI, who was not associated with this work. “This paper is the first time I’ve seen such a notation used to deeply analyze the performance of a deep-learning algorithm on real-world hardware. ... The next step will be to see whether real-world performance gains can be achieved.”

“This is a beautifully executed piece of theoretical research, which also aims for high accessibility to uninitiated readers – a trait rarely seen in papers of this kind,” says Petar Velickovic, a senior research scientist at Google DeepMind and a lecturer at Cambridge University, who was not associated with this work. These researchers, he says, “are clearly excellent communicators, and I cannot wait to see what they come up with next!”

The new diagram-based language, having been posted online, has already attracted attention and interest from software developers. A reviewer of Abbott’s prior paper introducing the diagrams noted that “the proposed neural circuit diagrams look great from an artistic standpoint (as far as I am able to judge this).” “It’s technical research, but it’s also flashy!” Zardini says.

The UK Says at an Energy Summit That Green Power Will Boost Security, as the US Differs

Britain has announced a major investment in wind power as it hosts an international summit on energy security

LONDON (AP) — Britain announced a major investment in wind power Thursday as it hosted an international summit on energy security – with Europe and the United States at odds over whether to cut their reliance on fossil fuels.

U.K. Prime Minister Keir Starmer said the government will invest 300 million pounds ($400 million) in boosting Britain’s capacity to manufacture components for the offshore wind industry, a move it hopes will encourage private investment in the U.K.’s renewable energy sector.

“As long as energy can be weaponized against us, our countries and our citizens are vulnerable and exposed,” U.K. Energy Secretary Ed Miliband told delegates. He said “low-carbon power” was a route to energy security as well as a way to slow climate change.

Britain now gets more than half its electricity from renewable sources such as wind and solar power, and the rest from natural gas and nuclear energy. It aims to generate all the U.K.’s energy from renewable sources by 2030.

Tommy Joyce, U.S. acting assistant secretary of energy for international affairs, told participants they should be “honest about the world’s growing energy needs, not focused on net-zero politics.” He called policies that push for clean power over fossil fuels “harmful and dangerous,” and claimed building wind turbines requires “concessions to or coercion from China” because it supplies necessary rare minerals.

Hosted by the British government and the International Energy Agency, the two-day summit brings together government ministers from 60 countries, senior European Union officials, energy sector CEOs, and heads of international organizations and nonprofits to assess risks to the global energy system and figure out solutions.

Associated Press writer Jennifer McDermott contributed to this story. The Associated Press’ climate and environmental coverage receives financial support from multiple private foundations. AP is solely responsible for all content.

Steelhead trout rescued from Palisades fire spawn in their new Santa Barbara County home

After a stressful journey out of the burn zone in Malibu, the endangered trout have spawned in their adopted stream in Santa Barbara County.

Wildlife officials feared critically endangered steelhead trout rescued from the Palisades fire burn scar might not be up for spawning after all they’d been through over the last few months. After their watershed in the Santa Monica Mountains was scorched in January, the fish were stunned with electricity, scooped up in buckets, trucked to a hatchery, fed unfamiliar food and then moved to a different creek. It was all part of a liberation effort pulled off in the nick of time.

“This whole thing is just a very stressful and traumatic event, and I’m happy that we didn’t really kill many fish,” said Kyle Evans, an environmental program manager for the California Department of Fish and Wildlife, which led the rescue. “But I was concerned that I might have just disrupted this whole months-long process of getting ready to spawn.”

But this month, spawn they did. It’s believed that there are now more than 100 baby trout swishing around their new digs in Arroyo Hondo Creek in Santa Barbara County. Their presence is a triumph – for the species and for their adopted home. However, more fish require more suitable habitat, which is lacking in Southern California, in part due to drought and the increased frequency of devastating wildfires.

Steelhead trout are the same species as rainbow trout, but they have different lifestyles: steelhead migrate to the ocean and return to their natal streams to spawn, while rainbows spend their lives in freshwater. Steelhead were once abundant in Southern California, but their numbers plummeted amid coastal development and overfishing. A distinct Southern California population is listed as endangered at the state and federal level.

The young fish sighted this month mark the next generation of what was the last population of steelhead in the Santa Monica Mountains, a range that stretches from the Hollywood Hills to Point Mugu in Ventura County. They also represent the return of a species to a watershed that itself was devastated by a fire four years ago, but has since recovered.

The Alisal blaze torched roughly 95% of the Arroyo Hondo Preserve west of Santa Barbara, and subsequent debris flows choked the creek of the same name, which housed steelhead. All the fish perished, according to Meredith Hendricks, executive director of the Land Trust for Santa Barbara County, a nonprofit organization that owns and manages the preserve.

“To be able to … offer space for these fish to be transplanted to – when we ourselves had experienced a similar situation but lost our fish – it was just a really big deal,” Hendricks said.

Arroyo Hondo Creek bears similarities to the trout’s native Topanga Creek; both are coastal streams of roughly the same size. And Arroyo Hondo has a bonus feature: a state-funded fish passage constructed under Highway 101 in 2008, which improved fish movement between the stream and the ocean.

Spawning is a biologically and energetically demanding endeavor for steelhead, and the process likely began in December or earlier, according to Evans. That means it was already underway when 271 steelhead were evacuated in January from Topanga Creek, a biodiversity hot spot in Malibu that was badly damaged by the Palisades fire. It continued when they were hauled about 50 miles north to a hatchery in Fillmore, where they hung out until 266 of them made it to Arroyo Hondo the following month.

State wildlife personnel regularly surveyed the fish in their new digs but didn’t see the spawning nests, which can be missed. Then, on April 7, Evans got a text message from the Land Trust’s land programs director, Leslie Chan, with a video that appeared to show a freshly hatched young-of-the-year – the wonky name for fish born during the steelheads’ sole annual spawn.

The following day, Evans’ team was dispatched to the creek and confirmed the discovery. They tallied about 100 of the newly hatched fish. The young trout span roughly one inch and, as Evans put it, aren’t too bright. They hang out in the shallows and don’t bolt from predators. “They’re kind of just happy to be alive, and they’re not really trying to hide,” he said.

By the end of summer, Evans estimates, two-thirds will die off. But the survivors are enough to keep the population charging onward. Evans hopes that in a few years there will be three to four times the number of fish that initially moved in. The plan is to eventually relocate at least some back to their native home of Topanga Creek.

Right now, Topanga “looks pretty bad,” Evans said. The Palisades fire stripped the surrounding hillsides of vegetation, paving the way for dirt, ash and other material to pour into the waterway. Another endangered fish, northern tidewater gobies, were rescued from the same watershed shortly before the steelhead were liberated. Within two days of the trout’s removal, the first storm of the season arrived, likely burying any remaining fish in a muddy slurry.

Evans expects it will be about four years before Topanga Creek is ready to support steelhead again, based on his experience observing streams recover after the Thomas, Woolsey, Alisal and other fires. There’s also discussion of moving steelhead around to create backup populations should calamity befall one, as well as to boost the genetic diversity of the rare fish. For example, some of the steelhead saved from Topanga could be moved to Malibu Creek, another stream in the Santa Monica Mountains that empties into Santa Monica Bay. There are efforts underway to remove the 100-foot Rindge Dam in Malibu Creek to open up more habitat for the fish.

“As we saw, if you have one population in the Santa Monica Mountains and a fire happens, you could just lose it forever,” Evans said. “So having fish in multiple areas is the kind of way to defend against that.”

With the Topanga Creek steelhead biding their time up north, it’s believed there are none currently inhabiting the Santa Monicas. Habitat restoration is key to the species’ survival, according to Evans, who advocates directing funding to such efforts, including soon-to-come-online money from Proposition 4, a $10-billion bond measure to finance water, clean energy and other environmental projects.

“It doesn’t matter how many fish you have, or if you’re growing them in a hatchery, or what you’re doing,” he said. “If they can’t be supported on the landscape, then there’s no point.”

Some trout will end up making their temporary lodging permanent, according to Hendricks of the Land Trust. Arroyo Hondo is a long creek with plenty of nooks and crannies for trout to hide in. So when it comes time to bring the steelhead home, she said, “I’m sure some will get left behind.”

Chicago Teachers Union secures clean energy wins in new contract

The Chicago Teachers Union expects its new, hard-fought contract to help drive clean energy investments and train the next generation of clean energy workers, even as the Trump administration attacks such priorities. The contract approved by 97% of union members this month represents the first time the union has…

The Chicago Teachers Union expects its new, hard-fought contract to help drive clean energy investments and train the next generation of clean energy workers, even as the Trump administration attacks such priorities.

The contract approved by 97% of union members this month represents the first time the union has bargained with school officials specifically around climate change and energy, said union Vice President Jackson Potter. The deal still needs to be approved by the Chicago Board of Education.

If approved, the contract will result in new programs that prepare students for clean energy jobs, developed in collaboration with local labor unions. It mandates that district officials work with the teachers union to seek funding for clean energy investments and update a climate action plan by 2026. And it calls for installing heat pumps and outfitting 30 schools with solar panels – if funding can be secured.

During almost a year of contentious negotiations, the more than 25,000-member union had also demanded paid climate-educator positions, an all-electric school bus fleet, and that all newly constructed schools be carbon-free. While those provisions did not end up in the final agreement, leaders say the four-year contract is a “transformative” victory that sets the stage for more ambitious demands next time.

“This contract is setting the floor of what we hope we can accomplish,” said Lauren Bianchi, who taught social studies at George Washington High School on the city’s South Side for six years before becoming green schools organizer for the union. “It shows we can win on climate, even despite Trump.”

The climate-related provisions are part of what the Chicago Teachers Union and an increasing number of unions nationwide refer to as “common good” demands, meant to benefit not only their members in the workplace but the entire community. In this and its 2019 contract, the Chicago union also won “common good” items such as protections for immigrant students and teachers, and affordable housing-related measures.

The new contract also guarantees teachers academic freedom at a time when the federal government is trying to limit schools from teaching materials related to diversity, equity, and inclusion. “Black history, Indigenous history, climate science – that’s protected instruction now,” said Potter.

Chicago Public Schools did not respond to emailed questions for this story, except to forward a press release that did not mention the clean energy provisions.

Training Chicago’s students for clean energy jobs

The union crafted its proposals based on discussions with three environmental and community organizations, Bianchi said: the Southeast Environmental Task Force, People for Community Recovery, and ONE Northside. The Southeast Environmental Task Force led the successful fight to ban new petcoke storage in Chicago, and the group’s co-executive director, Olga Bautista, is also vice president of the 21-member school board. People for Community Recovery was founded by Hazel Johnson, often known as “the mother of the environmental justice movement.” And ONE Northside emphasizes the link between clean energy and affordable housing. Clean energy job training was a priority for all three organizations, Potter said.

Under the contract, the union and district officials will work with other labor unions to create pre-apprenticeship programs for students, which are crucial for entering the union-dominated building trades to install solar, do energy-efficiency overhauls, and electrify homes with heat pumps and other technology. The contract demands the district create one new clean energy jobs pathway program during each year of the four-year contract. It also mandates renovating schools for energy efficiency and installing modern HVAC systems, and orders the school district to work with trade unions to create opportunities for Chicago Public Schools students and graduates to be hired for such work.

“The people in the community have identified jobs and economic justice as being essential for environmental justice,” said Bianchi. “I’ve mostly taught juniors and seniors; a lot expressed frustration that college is not their plan. They wish they could learn job skills to enter a trade.”

Chicago schools’ progress on solar, energy efficiency, and electrification

Installing solar could help the district meet its clean energy goals, which include sourcing 100% of its electricity from renewables by this year. The district has invested more than $6 million in energy efficiency and efficient lighting since 2018, and has cut its carbon dioxide emissions by more than 27,000 metric tons, school district spokesperson Evan Moore told Canary Media last fall as contract negotiations were proceeding. The schools are eligible for subsidized solar panels under the state’s Illinois Shines program, and they can tap the federal 30% investment tax credit for solar arrays, with a new direct-pay option tailored to tax-exempt organizations like schools.

Costa Rica Proposes Strict Penalties for Illegal National Park Entries

Costa Rica is cracking down on illegal entries into its national parks and protected areas, citing dangers to visitors and environmental harm. Franz Tattenbach, Minister of Environment and Energy (MINAE), has called on lawmakers to approve a bill imposing fines of up to ¢2.3 million (approximately $4,400) on individuals and tour operators who access these areas without authorization.

Costa Rica is cracking down on illegal entries into its national parks and protected areas, citing dangers to visitors and environmental harm. Franz Tattenbach, Minister of Environment and Energy (MINAE), has called on lawmakers to approve a bill imposing fines of up to ¢2.3 million (approximately $4,400) on individuals and tour operators who access these areas without authorization.

Over 500 unauthorized entries into Costa Rica’s 30 national parks and reserves have been reported so far this year. High-risk areas like the Poás, Turrialba, Rincón de la Vieja, and Arenal volcanoes are frequent targets, where illegal tours bypass safety protocols. Unscrupulous operators promote these “exclusive” experiences on social media, often lacking insurance, safety equipment, or trained guides. “These operators abandon clients if intercepted by authorities, leaving them vulnerable in hazardous areas,” Tattenbach said. Poás Volcano National Park, closed since March 26 due to seismic activity and ash emissions, remains a hotspot for illegal tours.

The proposed bill, under discussion by MINAE and the National System of Conservation Areas (SINAC), would introduce fines ranging from ¢1.3 million to ¢2.3 million ($2,500 to $4,400) for unauthorized entry, targeting both operators and participants. If a rescue operation involving the Costa Rican Red Cross or MINAE personnel is required, an additional fine of ¢2.3 million ($4,400) could be imposed.

Current law penalizes illegal entry under Article 58 of Forestry Law 7575 with three months to three years in prison, but enforcement is inconsistent. The new bill aims to strengthen deterrence. “These hikes involve steep slopes, toxic gases, and the risk of volcanic eruptions, which can be fatal,” Tattenbach warned, citing the 2017 Poás eruption that closed the park for over a year.

Illegal entries also threaten Costa Rica’s biodiversity, which includes 5% of the world’s species. Unauthorized trails disrupt ecosystems and increase the risk of poaching, according to Jorge Mario Rodríguez, Vice Minister of Environment. The Volcanological and Seismological Observatory of Costa Rica (OVSICORI) monitors volcanic activity to inform park closures, but illegal tours undermine these safety measures.

Increased Surveillance

SINAC, the Costa Rican Fire Department, the Red Cross, and the Police Force will intensify surveillance going forward, targeting high-risk national parks and roadways to prevent unauthorized access, wildlife extraction, hunting, and trade in protected flora and fauna. “These operations safeguard our natural heritage and ensure visitor safety,” Tattenbach said. SINAC’s year-round efforts have intercepted numerous illegal tours in 2025.

Visiting Parks Safely

MINAE and SINAC urge visitors to use authorized operators and purchase tickets via the SINAC website or at park entrances. Guided tours, available through platforms like Viator or Get Your Guide, offer safe experiences in parks like Manuel Antonio or Corcovado. Tourists should check park statuses before planning visits, as closures due to volcanic activity or weather are common. “Respecting regulations protects both you and Costa Rica’s natural treasures,” Rodríguez said.

Preserving Ecotourism

As the proposed bill awaits Legislative Assembly review, MINAE urges compliance to maintain Costa Rica’s status as a global conservation leader. For updates on the bill or park regulations, visit MINAE’s website.
