
Data center emissions likely 662% higher than big tech claims. Can it keep up the ruse?

News Feed
Sunday, September 15, 2024

Emissions from in-house data centers of Google, Microsoft, Meta and Apple may be 7.62 times higher than official tally

Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are likely about 662% – or 7.62 times – higher than officially reported.
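A quick aside on the arithmetic: a figure that is X% higher than another equals (1 + X/100) times it, which is how 662% higher becomes 7.62 times. The snippet below is a minimal illustration of that conversion, not the Guardian’s methodology, using only percentages quoted in this article.

```python
# Minimal sketch (not the Guardian's methodology): "X% higher" corresponds to
# a multiple of (1 + X/100), which is how 662% higher becomes 7.62 times.
def percent_higher_to_multiple(pct_higher: float) -> float:
    return 1 + pct_higher / 100

print(percent_higher_to_multiple(662))  # 7.62 (the in-house data center gap)
print(percent_higher_to_multiple(275))  # 3.75 (the scope 2 gap discussed below)
```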

Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

AI workloads are far more energy-intensive for data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Research from Goldman competitor Morgan Stanley has reached similar conclusions, projecting that global data center emissions will accumulate to 2.5bn metric tons of CO2 equivalent by 2030.

In the meantime, all five tech companies have claimed carbon neutrality, though Google dropped the label last year as it stepped up its carbon accounting standards. Amazon is the most recent company to make the claim, saying in July that it met its goal seven years early and that it had cut gross emissions by 3%.

“It’s down to creative accounting,” explained a representative from Amazon Employees for Climate Justice, an advocacy group composed of current Amazon employees who are dissatisfied with their employer’s action on climate. “Amazon – despite all the PR and propaganda that you’re seeing about their solar farms, about their electric vans – is expanding its fossil fuel use, whether it’s in data centers or whether it’s in diesel trucks.”

A misguided metric

The most important tools in this “creative accounting” when it comes to data centers are renewable energy certificates, or Recs. These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

Recs are used to calculate “market-based” emissions, or the official emissions figures used by the firms. When Recs and offsets are left out of the equation, we get “location-based emissions” – the actual emissions generated from the area where the data is being processed.
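To make the difference between the two methods concrete, here is a simplified sketch. All of the numbers, and the zero-emissions treatment of Rec-covered electricity, are illustrative assumptions; real GHG Protocol market-based accounting also involves residual-mix factors and other contractual instruments.

```python
# Simplified sketch of the two scope 2 accounting approaches described above.
# All figures are illustrative assumptions, not any company's actual data.

grid_emission_factor = 0.4        # tCO2e per MWh of local grid electricity (assumed)
electricity_used_mwh = 1_000_000  # a data center's annual consumption (assumed)
recs_purchased_mwh = 950_000      # renewable certificates bought, possibly far away (assumed)

# Location-based: emissions of the grid actually powering the facility.
location_based_tco2e = electricity_used_mwh * grid_emission_factor

# Market-based: Rec-covered consumption is treated as zero-emission on paper.
uncovered_mwh = max(electricity_used_mwh - recs_purchased_mwh, 0)
market_based_tco2e = uncovered_mwh * grid_emission_factor

print(f"location-based: {location_based_tco2e:,.0f} tCO2e")  # 400,000
print(f"market-based:   {market_based_tco2e:,.0f} tCO2e")    # 20,000 reported officially
```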

The trend in those emissions is worrying. If these five companies were one country, the sum of their “location-based” emissions in 2022 would rank them as the 33rd highest-emitting country, behind the Philippines and above Algeria.

Many data center industry experts also recognize that location-based metrics are more honest than the official, market-based numbers reported.

“Location-based [accounting] gives an accurate picture of the emissions associated with the energy that’s actually being consumed to run the data center. And Uptime’s view is that it’s the right metric,” said Jay Dietrich, the research director of sustainability at Uptime Institute, a leading data center advisory and research organization.

Nevertheless, Greenhouse Gas (GHG) Protocol, a carbon accounting oversight body, allows Recs to be used in official reporting. The extent to which they should be allowed remains contentious among tech companies, and has led to a lobbying battle between two factions over GHG Protocol’s rule-making process.

On one side there is the Emissions First Partnership, spearheaded by Amazon and Meta. It aims to keep Recs in the accounting process regardless of their geographic origins. In practice, this is only a slightly looser interpretation of what GHG Protocol already permits.

The opposing faction, headed by Google and Microsoft, argues that there needs to be time-based and location-based matching of renewable production and energy consumption for data centers. Google calls this its 24/7 goal, or its goal to have all of its facilities run on renewable energy 24 hours a day, seven days a week by 2030. Microsoft calls it its 100/100/0 goal, or its goal to have all its facilities running on 100% carbon-free energy 100% of the time, making zero carbon-based energy purchases by 2030.

Google has already phased out its Rec use, and Microsoft aims to do the same with low-quality “unbundled” (non-location-specific) Recs by 2030.

Academics and carbon management industry leaders alike are also against the GHG Protocol’s permissiveness on Recs. In an open letter from 2015, more than 50 such individuals argued that “it should be a bedrock principle of GHG accounting that no company be allowed to report a reduction in its GHG footprint for an action that results in no change in overall GHG emissions. Yet this is precisely what can happen under the guidance given the contractual/Rec-based reporting method.”

To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

In fact, location-based numbers are only directly reported (that is, not hidden in third-party assurance statements or in footnotes) by two companies – Google and Meta. And those two firms only include those figures for one subtype of emissions: scope 2, or the indirect emissions companies cause by purchasing energy from utilities and large-scale generators.

In-house data centers

Scope 2 is the category that includes the majority of the emissions that come from in-house data center operations, as it concerns the emissions associated with purchased energy – mainly, electricity.

Data centers should also make up a majority of overall scope 2 emissions for each company except Amazon, given that the other sources of scope 2 emissions for these companies stem from the electricity consumed by firms’ offices and retail spaces – operations that are relatively small and not carbon-intensive. Amazon has one other carbon-intensive business vertical to account for in its scope 2 emissions: its warehouses and e-commerce logistics.

For the firms that give data center-specific data – Meta and Microsoft – this holds true: data centers made up 100% of Meta’s market-based (official) scope 2 emissions and 97.4% of its location-based emissions. For Microsoft, those numbers were 97.4% and 95.6%, respectively.

The massive differences in location-based and official scope 2 emissions numbers showcase just how carbon intensive data centers really are, and how deceptive firms’ official emissions numbers can be. Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent – all of that attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone – a more than 19,000 times increase.

A similar result can be seen with Microsoft. The firm reported its official data center-related emissions for 2022 as 280,782 metric tons CO2 equivalent. Under a location-based accounting method, that number jumps to 6.1m metric tons CO2 equivalent. That’s a nearly 22 times increase.
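As a quick check, dividing the location-based figure by the official one reproduces the “nearly 22 times” increase; the snippet below simply restates that division using the numbers reported above.

```python
# Ratio check using Microsoft's 2022 figures as reported above.
official_tco2e = 280_782      # market-based (official) data center scope 2
location_based_tco2e = 6.1e6  # location-based data center scope 2
print(location_based_tco2e / official_tco2e)  # ~21.7, i.e. "nearly 22 times"
```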

While Meta’s reporting gap is more egregious, both firms’ location-based totals exceed their official figures primarily because data center emissions are undercounted: unreported data center-related emissions made up 97.4% of the gap between Meta’s location-based and official scope 2 numbers in 2022, and 95.55% of Microsoft’s.

Specific data center-related emissions numbers aren’t available for the rest of the firms. However, given that Google and Apple have similar scope 2 business models to Meta and Microsoft, it is likely that the multiple on how much higher their location-based data center emissions are would be similar to the multiple on how much higher their overall location-based scope 2 emissions are.

In total, the sum of location-based emissions in this category between 2020 and 2022 was at least 275% higher (or 3.75 times) than the sum of their official figures. Amazon did not provide the Guardian with location-based scope 2 figures for 2020 and 2021, so its official (and likely much lower) numbers were used for this calculation for those years.

Third-party data centers

Big tech companies also rent a large portion of their data center capacity from third-party data center operators (or “colocation” data centers). According to the Synergy Research Group, large tech companies (or “hyperscalers”) represented 37% of worldwide data center capacity in 2022, with half of that capacity coming through third-party contracts. While this group includes companies other than Google, Amazon, Meta, Microsoft and Apple, it gives an idea of the extent of these firms’ activities with third-party data centers.

Those emissions should theoretically fall under scope 3, all emissions a firm is responsible for that can’t be attributed to the fuel or electricity it consumes.

When it comes to a big tech firm’s operations, this would encapsulate everything from the manufacturing processes of the hardware it sells (like the iPhone or Kindle) to the emissions from employees’ cars during their commutes to the office.

When it comes to data centers, scope 3 emissions include the carbon emitted during the construction of in-house data centers, as well as during the manufacturing of the equipment used inside them. For the third-party data centers a company partners with, scope 3 may also cover those same construction and manufacturing emissions, along with those facilities’ electricity-related emissions.

However, whether or not these emissions are fully included in reports is almost impossible to prove. “Scope 3 emissions are hugely uncertain,” said Dietrich. “This area is a mess just in terms of accounting.”

According to Dietrich, some third-party data center operators put their energy-related emissions in their own scope 2 reporting, so those who rent from them can put those emissions into their scope 3. Other third-party data center operators put energy-related emissions into their scope 3 emissions, expecting their tenants to report those emissions in their own scope 2 reporting.
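The ambiguity Dietrich describes can be sketched in a few lines: the same block of colocation electricity emissions lands in a tenant’s scope 2 or scope 3 depending on which convention the operator follows. The values and the simple rule below are illustrative assumptions, not a formal reading of the GHG Protocol.

```python
# Illustrative sketch of the colocation reporting ambiguity (assumed values).
COLO_ENERGY_EMISSIONS_T = 50_000  # tCO2e from electricity a colo facility buys for one tenant

def tenant_allocation(operator_convention: str) -> dict:
    """Where the tenant reports the same emissions under the two conventions."""
    if operator_convention == "operator books energy in its scope 2":
        # The tenant picks the emissions up indirectly, in its scope 3.
        return {"tenant scope 2": 0, "tenant scope 3": COLO_ENERGY_EMISSIONS_T}
    # Otherwise the operator books the energy in its own scope 3 and expects
    # the tenant to report the purchased electricity in its scope 2.
    return {"tenant scope 2": COLO_ENERGY_EMISSIONS_T, "tenant scope 3": 0}

print(tenant_allocation("operator books energy in its scope 2"))
print(tenant_allocation("operator books energy in its scope 3"))
```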

Additionally, all firms use market-based metrics for these scope 3 numbers, which means third-party data center emissions are also undercounted in official figures.

Of the firms that report their location-based scope 3 emissions in the footnotes, only Apple has a large gap between its official scope 3 figure and its location-based scope 3 figure, starting in 2022.

This gap can largely be attributed to data center emissions accounting. The only change to Apple’s scope 3 methodology in 2022 was to include “work from home, third-party cloud services, electricity transmission and distribution losses, and upstream impacts from scope 1 fuels”. Since the firm listed third-party cloud services as having zero emissions under its official scope 3 reporting, that means all emissions associated with those third-party services would only show up in location-based scope 3 emissions from 2022 onwards.

2025 and beyond

Even though big tech hides these emissions, they are due to keep rising. Data centers’ electricity demand is projected to double by 2030 due to the additional load that artificial intelligence poses, according to the Electric Power Research Institute.

Google and Microsoft both blamed AI for their recent upticks in market-based emissions.

“The relative contribution of AI computing loads to Google’s data centers, as I understood it when I left [in 2022], was relatively modest,” said Chris Taylor, current CEO of utility storage firm Gridstor and former site lead for Google’s data center energy strategy unit. “Two years ago, [AI] was not the main thing that we were worried about, at least on the energy team.”

Taylor explained that most of the growth that he saw in data centers while at Google was attributable to growth in Google Cloud, as most enterprises were moving their IT tasks to the firm’s cloud servers.

Whether today’s power grids can withstand the growing energy demands of AI is uncertain. One industry leader – Marc Ganzi, the CEO of DigitalBridge, a private equity firm that owns two of the world’s largest third-party data center operators – has gone as far as to say that the data center sector may run out of power within the next two years.

And as grid interconnection backlogs continue to pile up worldwide, it may be nearly impossible for even the most well-intentioned companies to get new renewable energy production capacity online in time to meet that demand.


How Mississippians Can Intervene in Natural Gas Pipeline Proposal

Mississippi residents can comment on a proposal for a natural gas pipeline that would span nearly the full width of the state

Mississippians have until Tuesday to intervene in a proposal for a natural gas pipeline that would span nearly the full width of the state.

The pipeline, called the “Mississippi Crossing Project,” would start in Greenville, cross through Humphreys, Holmes, Attala, Leake, Neshoba, Newton, Lauderdale and Clarke counties and end near Butler, Alabama, stretching nearly 208 miles.

Tennessee Gas Pipeline Co., a subsidiary of Kinder Morgan, sent an application for the project to the Federal Energy Regulatory Commission on June 30. The company hopes the pipeline, which would transfer up to 12 billion cubic feet of natural gas per day, will address rising energy demand by increasing its transportation capacity.

Kinder Morgan says on its website that, should it receive approval, construction would begin at the end of 2027 and the pipeline would begin service in November 2028. The company says the project would cost $1.7 billion and create 750 temporary jobs as well as 15 permanent positions.

The project would also include new compressor stations in Humphreys, Attala and Lauderdale counties, although exact locations haven’t been set.

Singleton Schreiber, a national law firm that focuses on environmental justice, is looking to spread awareness of the public’s ability to participate in the approval process, whether or not they support the proposal.

“We’re just trying to raise awareness to make sure that people know this is happening,” said Laura Singleton, an attorney with the firm. “They’re going to have to dig and construct new pipelines, so it’s going to pass through sensitive ecosystems like wetlands, private property, farmland, things like that. So you can have issues that come up like soil degradation, water contamination, and then after the pipeline is built you could potentially have leaks, spills.”

Singleton added that while such issues with pipelines are rare, when “things go bad, they go pretty bad.”

To comment, protest, or file a motion to intervene, the public can go to FERC’s website (new users have to create an account, and then use the docket number “CP25-514-000”). The exact deadline is 4 p.m. on Aug. 5. More instructions can also be found here.

In addition to FERC, the proposal will also face review from the U.S. Army Corps of Engineers, U.S. Fish and Wildlife Service, National Park Service and the state environmental agencies in Mississippi and Alabama.

Mississippians have seen multiple incidents related to gas leaks in recent years. In March, three workers were injured after accidentally rupturing an Atmos Energy pipeline during routine maintenance in Lee County, leaving thousands without service. Then last year, the National Transportation Safety Board found that Atmos had discovered gas leaks more than a month before two explosions in Jackson, one of which claimed the life of an 82-year-old woman.

This story was originally published by Mississippi Today and distributed through a partnership with The Associated Press.

BPA faces suit over energy market decision that opponents say would raise rates

The lawsuit comes after governors, lawmakers, utility regulators and renewable energy proponents in the region unsuccessfully pressed the BPA to reconsider its plans.

Five energy and conservation nonprofits are suing the Bonneville Power Administration over its decision to join a new energy trading market, claiming it will raise electricity and transmission costs in Oregon and across the region.

The lawsuit, filed Thursday in the 9th U.S. Circuit Court of Appeals, alleges that BPA’s move violates the Northwest Power Act and the National Environmental Policy Act and will also weaken energy grid reliability and reduce access to clean energy.

BPA, the Northwest’s largest transmission grid operator, announced in May that it would join the Arkansas-based Southwest Power Pool day-ahead market, known as “Markets Plus,” instead of joining California’s day-ahead market. The Southwest market is smaller, with fewer electrical generation resources, experts say. Prior to that decision, Pacific Northwest governors, lawmakers, utility regulators and renewable energy proponents had pressed BPA for months to reconsider its plans, which the agency initially announced in March.

The nonprofits involved in the legal challenge are the Oregon Citizens’ Utility Board, a watchdog organization that advocates for utility customers; the national environmental group the Sierra Club; the Montana Environmental Information Center, which promotes clean energy; the Idaho Conservation League, a natural landscape conservation group; and the NW Energy Coalition, which promotes affordable energy policies. The groups, represented by the San Francisco-based environmental law nonprofit Earthjustice, want the court to vacate BPA’s decision, require the agency to prepare an environmental impact statement and rescind the financial commitments already made to the Southwest energy market.

BPA spokesperson Nick Quinata declined to comment on the pending litigation. Previously, the agency said the Southwest day-ahead market is superior to the California one because its market design and governance structure would allow BPA to remain more independent.

BPA, part of the U.S. Department of Energy, markets hydropower from 31 federal dams in the Columbia River Basin and supplies a third of the Northwest’s electricity, most of it to publicly owned rural utilities and electric cooperatives. It also owns and operates 15,000 miles – 75% – of the Northwest’s high-voltage transmission lines. Nearly every electric utility in Oregon benefits from either the clean hydroelectricity or the transmission lines controlled by BPA.

BPA’s decision sets the stage for two energy markets across the West. The lawsuit says that will likely lead to rising prices and blackouts during periods of high electricity demand because of the complexity of transmitting power across boundaries between different utilities and the agreements required for such transfers.

Oregon’s two largest utilities, investor-owned Portland General Electric and Pacific Power, have both signed agreements to join California’s day-ahead market instead. They, too, have argued that once BPA leaves the Western market, the available energy they can purchase would diminish and become more expensive, leading to higher prices for customers across the region.

Regional electricity providers also may have to construct additional power generation facilities, increase operation of existing facilities, or both, to make up for BPA’s participation in a smaller and less efficient energy market, the suit contends. It could also increase reliance on generation resources powered by fossil fuels such as coal or natural gas plants, because clean energy isn’t as widely available in the smaller Southwest market, the suit says.

The Northwest Power Act, passed by Congress in the 1980s, requires BPA to provide low-cost power to the region while encouraging renewable energy, conservation and protection of fish and wildlife. BPA violated those duties when it chose the Southwest market option, according to the lawsuit.

The groups also allege BPA’s market choice could harm fish and wildlife in the Columbia basin because it could alter the operation of the federal hydroelectric dams from which Bonneville markets power. The lawsuit claims BPA failed to comply with federal environmental law by not conducting any environmental impact analysis of the effects on fish and wildlife before making its decision.

The Citizens’ Utility Board, a party to the lawsuit, said it hopes BPA reverses course – otherwise its decision will splinter the West’s electricity markets, costing utility customers billions of dollars at a time when many are already dealing with skyrocketing bills. The board, as well as other critics of BPA’s decision, has pointed to an initiative developing an independent governance structure for California’s day-ahead market.

“Oregon is facing overlapping energy challenges: rising utility bills, rising electricity demand from data centers, and stalling progress on meeting clean energy requirements. The last thing we need is for one of our region’s largest clean energy suppliers to reduce ties with the Pacific Northwest,” said the group’s spokesperson Charlotte Shuff.

— Gosia Wozniacka covers environmental justice, climate change, the clean energy transition and other environmental issues. Reach her at gwozniacka@oregonian.com or 971-421-3154.

States, enviro groups fight Trump plan to keep dirty power plants going

In late spring, the Department of Energy ordered two aging and costly fossil-fueled power plants that were on the verge of shutting down to stay open. The agency claimed that the moves were necessary to prevent the power grid from collapsing — and that it has the power to force the plants to stay open even if the…

In late spring, the Department of Energy ordered two aging and costly fossil-fueled power plants that were on the verge of shutting down to stay open. The agency claimed that the moves were necessary to prevent the power grid from collapsing — and that it has the power to force the plants to stay open even if the utilities, state regulators, and grid operators managing them say that no such emergency exists.

But state regulators, regional grid operators, environmental groups, and consumer groups are pushing back on the notion that the grids in question even need these interventions — and are challenging the legality of the DOE’s stay-open orders.

The DOE claimed that the threat of large-scale grid blackouts forced its hand. But state utility regulators, environmental groups, consumer advocates, and energy experts say that careful analysis from the plants’ owners, state regulators, regional grid operators, and grid reliability experts had determined both plants could be safely closed. These groups argue that clean energy, not fossil fuels, is the true solution to the country’s grid challenges — even if the “big, beautiful” bill signed by Trump last week will make those resources more expensive to build. Some of the environmental organizations challenging DOE’s orders have pledged to take their case to federal court if necessary.

“We need to get more electrons on the grid. We need those to be clean, reliable, and affordable,” said Robert Routh, Pennsylvania climate and energy policy director for the Natural Resources Defense Council, one of the groups demanding that DOE reconsider its orders. Keeping J.H. Campbell and Eddystone open “results in the exact opposite. It’s costly, harmful, unnecessary, and unlawful.”

Taking on the DOE’s grid emergency claims

The groups challenging the DOE’s J.H. Campbell and Eddystone stay-open orders point out that the agency is using a power originally designed to protect the grid against unanticipated emergencies, including during wartime, but without proving that such an emergency is underway.

“This authority that the Department of Energy is acting under — Section 202(c) of the Federal Power Act — is a very tailored emergency authority,” said Caroline Reiser, NRDC senior attorney for climate and energy. “Congress intentionally wrote it only to be usable in specific, narrow, short-term emergencies. This is not that.”

For decades, the DOE has used its Section 202(c) power sparingly, and only in response to requests from utilities or grid operators to waive federal air pollution regulations or other requirements in moments when the grid faces imminent threats like widespread power outages, Reiser said.

But DOE’s orders for Eddystone and J.H. Campbell were not spurred by requests from state regulators or regional grid operators. In fact, the orders caught those parties by surprise. They also came mere days before the plants were set to close down and after years of effort to ensure their closure wouldn’t threaten grid reliability.

J.H. Campbell was scheduled to close in May under a plan that has been in the works since 2021 as part of a broader agreement between utility Consumers Energy and state regulators, and which was approved by the Midcontinent Independent System Operator (MISO), the entity that manages grid reliability across Michigan and 14 other states.

“The plant is really old, unreliable, extremely polluting, and extremely expensive,” Reiser said. “Nobody is saying that this plant is needed or is going to be beneficial for any reliability purposes.”

To justify its stay-open order, DOE cited reports from the North American Electric Reliability Corp. (NERC), a nonprofit regulatory authority that includes utilities and grid operators in the U.S. and Canada. NERC found MISO is at higher risk of summertime reliability problems than other U.S. grid regions, but environmental groups argue in their rehearing request that DOE has “misrepresented the reports on which it relies,” and that Consumers Energy, Michigan regulators, and MISO have collectively shown closing the plant won’t endanger grid reliability.

Eddystone, which had operated only infrequently over the past few years, also went through a rigorous process with mid-Atlantic grid operator PJM Interconnection to ensure its closure wouldn’t harm grid reliability. The DOE’s reason for keeping that plant open is based on a report from PJM that states the grid operator might need to ask utility customers to use less power if it faces extreme conditions this summer — an even scantier justification than what the agency cited in its J.H. Campbell order, Reiser said.

As long as the DOE continues to take the position that it can issue emergency stay-open orders to any power plant it decides to, these established methods for managing plant closures and fairly allocating costs will be thrown into disarray, she said.

“We have a system of competitive energy markets in the United States that is successful in keeping the lights on and maintaining reliability the vast, vast majority of the time,” Reiser said. “The Department of Energy stepping in and using a command-and-control system interferes with those markets.”

Designing a new way to optimize complex coordinated systems

Using diagrams to represent interactions in multipart systems can provide a faster way to design software improvements.

Coordinating complicated interactive systems, whether it’s the different modes of transportation in a city or the various components that must work together to make an effective and efficient robot, is an increasingly important subject for software designers to tackle. Now, researchers at MIT have developed an entirely new way of approaching these complex problems, using simple diagrams as a tool to reveal better approaches to software optimization in deep-learning models.

They say the new method makes addressing these complex tasks so simple that it can be reduced to a drawing that would fit on the back of a napkin.

The new approach is described in the journal Transactions of Machine Learning Research, in a paper by incoming doctoral student Vincent Abbott and Professor Gioele Zardini of MIT’s Laboratory for Information and Decision Systems (LIDS).

“We designed a new language to talk about these new systems,” Zardini says. This new diagram-based “language” is heavily based on something called category theory, he explains.

It all has to do with designing the underlying architecture of computer algorithms — the programs that will actually end up sensing and controlling the various different parts of the system that’s being optimized. “The components are different pieces of an algorithm, and they have to talk to each other, exchange information, but also account for energy usage, memory consumption, and so on.” Such optimizations are notoriously difficult because each change in one part of the system can in turn cause changes in other parts, which can further affect other parts, and so on.

The researchers decided to focus on the particular class of deep-learning algorithms, which are currently a hot topic of research. Deep learning is the basis of the large artificial intelligence models, including large language models such as ChatGPT and image-generation models such as Midjourney. These models manipulate data by a “deep” series of matrix multiplications interspersed with other operations. The numbers within matrices are parameters, and are updated during long training runs, allowing for complex patterns to be found. Models consist of billions of parameters, making computation expensive, and hence improved resource usage and optimization invaluable.

Diagrams can represent details of the parallelized operations that deep-learning models consist of, revealing the relationships between algorithms and the parallelized graphics processing unit (GPU) hardware they run on, supplied by companies such as NVIDIA. “I’m very excited about this,” says Zardini, because “we seem to have found a language that very nicely describes deep learning algorithms, explicitly representing all the important things, which is the operators you use,” for example the energy consumption, the memory allocation, and any other parameter that you’re trying to optimize for.

Much of the progress within deep learning has stemmed from resource efficiency optimizations. The latest DeepSeek model showed that a small team can compete with top models from OpenAI and other major labs by focusing on resource efficiency and the relationship between software and hardware. Typically, in deriving these optimizations, he says, “people need a lot of trial and error to discover new architectures.” For example, a widely used optimization program called FlashAttention took more than four years to develop, he says. But with the new framework they developed, “we can really approach this problem in a more formal way.” And all of this is represented visually in a precisely defined graphical language.

But the methods that have been used to find these improvements “are very limited,” he says. “I think this shows that there’s a major gap, in that we don’t have a formal systematic method of relating an algorithm to either its optimal execution, or even really understanding how many resources it will take to run.” But now, with the new diagram-based method they devised, such a system exists.

Category theory, which underlies this approach, is a way of mathematically describing the different components of a system and how they interact in a generalized, abstract manner. Different perspectives can be related. For example, mathematical formulas can be related to algorithms that implement them and use resources, or descriptions of systems can be related to robust “monoidal string diagrams.” These visualizations allow you to directly play around and experiment with how the different parts connect and interact. What they developed, he says, amounts to “string diagrams on steroids,” which incorporates many more graphical conventions and many more properties.

“Category theory can be thought of as the mathematics of abstraction and composition,” Abbott says. “Any compositional system can be described using category theory, and the relationship between compositional systems can then also be studied.” Algebraic rules that are typically associated with functions can also be represented as diagrams, he says. “Then, a lot of the visual tricks we can do with diagrams, we can relate to algebraic tricks and functions. So, it creates this correspondence between these different systems.”

As a result, he says, “this solves a very important problem, which is that we have these deep-learning algorithms, but they’re not clearly understood as mathematical models.” But by representing them as diagrams, it becomes possible to approach them formally and systematically, he says.

One thing this enables is a clear visual understanding of the way parallel real-world processes can be represented by parallel processing in multicore computer GPUs. “In this way,” Abbott says, “diagrams can both represent a function, and then reveal how to optimally execute it on a GPU.”

The “attention” algorithm is used by deep-learning algorithms that require general, contextual information, and is a key phase of the serialized blocks that constitute large language models such as ChatGPT. FlashAttention is an optimization that took years to develop, but resulted in a sixfold improvement in the speed of attention algorithms.

Applying their method to the well-established FlashAttention algorithm, Zardini says that “here we are able to derive it, literally, on a napkin.” He then adds, “OK, maybe it’s a large napkin.” But to drive home the point about how much their new approach can simplify dealing with these complex algorithms, they titled their formal research paper on the work “FlashAttention on a Napkin.”

This method, Abbott says, “allows for optimization to be really quickly derived, in contrast to prevailing methods.” While they initially applied this approach to the already existing FlashAttention algorithm, thus verifying its effectiveness, “we hope to now use this language to automate the detection of improvements,” says Zardini, who in addition to being a principal investigator in LIDS, is the Rudge and Nancy Allen Assistant Professor of Civil and Environmental Engineering, and an affiliate faculty with the Institute for Data, Systems, and Society.

The plan is that ultimately, he says, they will develop the software to the point that “the researcher uploads their code, and with the new algorithm you automatically detect what can be improved, what can be optimized, and you return an optimized version of the algorithm to the user.”

In addition to automating algorithm optimization, Zardini notes that a robust analysis of how deep-learning algorithms relate to hardware resource usage allows for systematic co-design of hardware and software. This line of work integrates with Zardini’s focus on categorical co-design, which uses the tools of category theory to simultaneously optimize various components of engineered systems.

Abbott says that “this whole field of optimized deep learning models, I believe, is quite critically unaddressed, and that’s why these diagrams are so exciting. They open the doors to a systematic approach to this problem.”

“I’m very impressed by the quality of this research. ... The new approach to diagramming deep-learning algorithms used by this paper could be a very significant step,” says Jeremy Howard, founder and CEO of Answers.ai, who was not associated with this work. “This paper is the first time I’ve seen such a notation used to deeply analyze the performance of a deep-learning algorithm on real-world hardware. ... The next step will be to see whether real-world performance gains can be achieved.”

“This is a beautifully executed piece of theoretical research, which also aims for high accessibility to uninitiated readers — a trait rarely seen in papers of this kind,” says Petar Velickovic, a senior research scientist at Google DeepMind and a lecturer at Cambridge University, who was not associated with this work. These researchers, he says, “are clearly excellent communicators, and I cannot wait to see what they come up with next!”

The new diagram-based language, having been posted online, has already attracted great attention and interest from software developers. A reviewer from Abbott’s prior paper introducing the diagrams noted that “The proposed neural circuit diagrams look great from an artistic standpoint (as far as I am able to judge this).” “It’s technical research, but it’s also flashy!” Zardini says.
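For readers unfamiliar with the “attention” computation the researchers diagram, the sketch below shows standard scaled dot-product attention in NumPy: two large matrix multiplications with a normalization in between. It is a textbook illustration only, not the MIT team’s diagram method and not the optimized FlashAttention kernel.

```python
# Textbook scaled dot-product attention in NumPy: an illustration of the kind of
# matrix-multiplication-heavy operation the diagrams describe, not the
# researchers' method or an optimized GPU kernel.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    scores = Q @ K.T / np.sqrt(Q.shape[-1])  # first matrix multiplication
    return softmax(scores) @ V               # second matrix multiplication

rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
print(attention(Q, K, V).shape)  # (4, 8)
```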

The UK Says at an Energy Summit That Green Power Will Boost Security, as the US Differs

Britain has announced a major investment in wind power as it hosts an international summit on energy security

LONDON (AP) — Britain announced a major investment in wind power Thursday as it hosted an international summit on energy security — with Europe and the United States at odds over whether to cut their reliance on fossil fuels.

U.K. Prime Minister Keir Starmer said the government will invest 300 million pounds ($400 million) in boosting Britain’s capacity to manufacture components for the offshore wind industry, a move it hopes will encourage private investment in the U.K.’s renewable energy sector.

“As long as energy can be weaponized against us, our countries and our citizens are vulnerable and exposed,” U.K. Energy Secretary Ed Miliband told delegates. He said “low-carbon power” was a route to energy security as well as a way to slow climate change.

Britain now gets more than half its electricity from renewable sources such as wind and solar power, and the rest from natural gas and nuclear energy. It aims to generate all the U.K.’s energy from renewable sources by 2030.

Tommy Joyce, U.S. acting assistant secretary of energy for international affairs, told participants they should be “honest about the world’s growing energy needs, not focused on net-zero politics.” He called policies that push for clean power over fossil fuels “harmful and dangerous,” and claimed building wind turbines requires “concessions to or coercion from China” because it supplies necessary rare minerals.

Hosted by the British government and the International Energy Agency, the two-day summit brings together government ministers from 60 countries, senior European Union officials, energy sector CEOs, heads of international organizations and nonprofits to assess risks to the global energy system and figure out solutions.

Associated Press writer Jennifer McDermott contributed to this story.

