
Data center emissions likely 662% higher than big tech claims. Can it keep up the ruse?

News Feed
Sunday, September 15, 2024


Emissions from in-house data centers of Google, Microsoft, Meta and Apple may be 7.62 times higher than official tally

Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are likely about 662% – or 7.62 times – higher than officially reported.
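As a note on the arithmetic, "662% higher" and "7.62 times" are the same ratio expressed two ways: a value that is p% higher than a baseline is (1 + p/100) times the baseline.

```python
# "662% higher" converted to a multiple of the official figure.
pct_higher = 662
multiple = 1 + pct_higher / 100
print(multiple)  # 7.62
```

The same conversion applies to the scope 2 totals later in the piece, where "275% higher" corresponds to 3.75 times.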

Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

AI places far greater energy demands on data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Goldman competitor Morgan Stanley’s research has made similar findings, projecting that global data center emissions will accumulate to 2.5bn metric tons of CO2 equivalent by 2030.

In the meantime, all five tech companies have claimed carbon neutrality, though Google dropped the label last year as it stepped up its carbon accounting standards. Amazon is the most recent company to do so, claiming in July that it met its goal seven years early, and that it had implemented a gross emissions cut of 3%.

“It’s down to creative accounting,” explained a representative from Amazon Employees for Climate Justice, an advocacy group composed of current Amazon employees who are dissatisfied with their employer’s action on climate. “Amazon – despite all the PR and propaganda that you’re seeing about their solar farms, about their electric vans – is expanding its fossil fuel use, whether it’s in data centers or whether it’s in diesel trucks.”

A misguided metric

The most important tools in this “creative accounting” when it comes to data centers are renewable energy certificates, or Recs. These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

Recs are used to calculate “market-based” emissions, or the official emissions figures used by the firms. When Recs and offsets are left out of the equation, we get “location-based emissions” – the actual emissions generated from the area where the data is being processed.
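The gap between the two methods can be sketched with made-up numbers; the grid intensity, consumption, and Rec volume below are purely illustrative, not figures from any company.

```python
# Sketch of the two accounting methods described above, with invented numbers.
# Location-based: emissions implied by the grid actually supplying the facility.
# Market-based: the same consumption, but Rec-covered megawatt-hours are
# counted as zero-carbon, wherever the renewable power was generated.

def location_based(mwh_consumed, grid_intensity_t_per_mwh):
    """Emissions from the local grid where the power is consumed."""
    return mwh_consumed * grid_intensity_t_per_mwh

def market_based(mwh_consumed, grid_intensity_t_per_mwh, rec_mwh):
    """Rec-covered MWh are reported at zero, regardless of location."""
    uncovered = max(mwh_consumed - rec_mwh, 0)
    return uncovered * grid_intensity_t_per_mwh

# Hypothetical data center: 1,000,000 MWh/yr on a 0.4 tCO2e/MWh grid,
# with Recs purchased to match 90% of consumption.
print(location_based(1_000_000, 0.4))         # 400000.0 tCO2e actually emitted
print(market_based(1_000_000, 0.4, 900_000))  # 40000.0 tCO2e officially reported
```

Nothing physical changes between the two lines of output; only the paperwork does, which is the crux of the critique quoted above.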

The trend in those emissions is worrying. If these five companies were one country, the sum of their “location-based” emissions in 2022 would rank them as the 33rd highest-emitting country, behind the Philippines and above Algeria.

Many data center industry experts also recognize that location-based metrics are more honest than the official, market-based numbers reported.

“Location-based [accounting] gives an accurate picture of the emissions associated with the energy that’s actually being consumed to run the data center. And Uptime’s view is that it’s the right metric,” said Jay Dietrich, the research director of sustainability at Uptime Institute, a leading data center advisory and research organization.

Nevertheless, Greenhouse Gas (GHG) Protocol, a carbon accounting oversight body, allows Recs to be used in official reporting, though the extent to which they should be allowed remains controversial between tech companies and has led to a lobbying battle over GHG Protocol’s rule-making process between two factions.

On one side there is the Emissions First Partnership, spearheaded by Amazon and Meta. It aims to keep Recs in the accounting process regardless of their geographic origins. In practice, this is only a slightly looser interpretation of what GHG Protocol already permits.

The opposing faction, headed by Google and Microsoft, argues that there needs to be time-based and location-based matching of renewable production and energy consumption for data centers. Google calls this its 24/7 goal, or its goal to have all of its facilities run on renewable energy 24 hours a day, seven days a week by 2030. Microsoft calls it its 100/100/0 goal, or its goal to have all its facilities running on 100% carbon-free energy 100% of the time, making zero carbon-based energy purchases by 2030.
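The difference between annual Rec matching and the 24/7 approach can be illustrated with a toy two-hour example (all numbers invented): annual matching only requires total renewable generation to equal total consumption, while hourly matching forbids surplus in one hour from covering a deficit in another.

```python
# Toy contrast between annual Rec matching and 24/7 hourly matching.
consumption = [100, 100]   # MWh consumed in each hour
renewable   = [200, 0]     # MWh of contracted renewables in each hour

# Annual matching: totals balance, so consumption counts as fully matched.
annual_matched = sum(renewable) >= sum(consumption)

# 24/7 matching: only overlap within each hour counts.
hourly_covered = sum(min(c, r) for c, r in zip(consumption, renewable))
hourly_match_pct = 100 * hourly_covered / sum(consumption)

print(annual_matched)    # True
print(hourly_match_pct)  # 50.0 -- only half of consumption is matched 24/7
```

The second hour here runs entirely on whatever the local grid supplies, which is exactly the shortfall that time- and location-based matching is meant to expose.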

Google has already phased out its Rec use, and Microsoft aims to do the same with low-quality “unbundled” (non-location-specific) Recs by 2030.

Academics and carbon management industry leaders alike are also against the GHG Protocol’s permissiveness on Recs. In an open letter from 2015, more than 50 such individuals argued that “it should be a bedrock principle of GHG accounting that no company be allowed to report a reduction in its GHG footprint for an action that results in no change in overall GHG emissions. Yet this is precisely what can happen under the guidance given the contractual/Rec-based reporting method.”

To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

In fact, location-based numbers are only directly reported (that is, not hidden in third-party assurance statements or in footnotes) by two companies – Google and Meta. And those two firms only include those figures for one subtype of emissions: scope 2, or the indirect emissions companies cause by purchasing energy from utilities and large-scale generators.

In-house data centers

Scope 2 is the category that includes the majority of the emissions that come from in-house data center operations, as it concerns the emissions associated with purchased energy – mainly, electricity.

Data centers should also make up a majority of overall scope 2 emissions for each company except Amazon, given that the other sources of scope 2 emissions for these companies stem from the electricity consumed by firms’ offices and retail spaces – operations that are relatively small and not carbon-intensive. Amazon has one other carbon-intensive business vertical to account for in its scope 2 emissions: its warehouses and e-commerce logistics.

For the firms that give data center-specific data – Meta and Microsoft – this holds true: data centers made up 100% of Meta’s market-based (official) scope 2 emissions and 97.4% of its location-based emissions. For Microsoft, those numbers were 97.4% and 95.6%, respectively.

The massive differences in location-based and official scope 2 emissions numbers showcase just how carbon intensive data centers really are, and how deceptive firms’ official emissions numbers can be. Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent – all of that attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone – a more than 19,000 times increase.

A similar result can be seen with Microsoft. The firm reported its official data center-related emissions for 2022 as 280,782 metric tons CO2 equivalent. Under a location-based accounting method, that number jumps to 6.1m metric tons CO2 equivalent. That’s a nearly 22 times increase.
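The "nearly 22 times" multiple follows directly from the two Microsoft figures quoted:

```python
# Verifying the Microsoft multiple quoted above.
official = 280_782      # tCO2e, market-based data center emissions, 2022
location = 6_100_000    # tCO2e, location-based equivalent
print(round(location / official, 1))  # 21.7 -- "nearly 22 times"
```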

While Meta’s reporting gap is more egregious, both firms’ location-based emissions are higher because they undercount their data center emissions specifically, with 97.4% of the gap between Meta’s location-based and official scope 2 number in 2022 being unreported data center-related emissions, and 95.55% of Microsoft’s.

Specific data center-related emissions numbers aren’t available for the rest of the firms. However, given that Google and Apple have similar scope 2 business models to Meta and Microsoft, it is likely that the multiple on how much higher their location-based data center emissions are would be similar to the multiple on how much higher their overall location-based scope 2 emissions are.

In total, the sum of location-based emissions in this category between 2020 and 2022 was at least 275% higher (or 3.75 times) than the sum of their official figures. Amazon did not provide the Guardian with location-based scope 2 figures for 2020 and 2021, so its official (and likely much lower) numbers were used for this calculation for those years.

Third-party data centers

Big tech companies also rent a large portion of their data center capacity from third-party data center operators (or “colocation” data centers). According to the Synergy Research Group, large tech companies (or “hyperscalers”) represented 37% of worldwide data center capacity in 2022, with half of that capacity coming through third-party contracts. While this group includes companies other than Google, Amazon, Meta, Microsoft and Apple, it gives an idea of the extent of these firms’ activities with third-party data centers.

Those emissions should theoretically fall under scope 3, all emissions a firm is responsible for that can’t be attributed to the fuel or electricity it consumes.

When it comes to a big tech firm’s operations, this would encapsulate everything from the manufacturing processes of the hardware it sells (like the iPhone or Kindle) to the emissions from employees’ cars during their commutes to the office.

When it comes to data centers, scope 3 emissions include the carbon emitted during the construction of in-house data centers, as well as during the manufacturing of the equipment used inside them. For the third-party data centers a firm partners with, scope 3 may also cover those same construction and manufacturing emissions, along with the centers’ electricity-related emissions.

However, whether or not these emissions are fully included in reports is almost impossible to prove. “Scope 3 emissions are hugely uncertain,” said Dietrich. “This area is a mess just in terms of accounting.”

According to Dietrich, some third-party data center operators put their energy-related emissions in their own scope 2 reporting, so those who rent from them can put those emissions into their scope 3. Other third-party data center operators put energy-related emissions into their scope 3 emissions, expecting their tenants to report those emissions in their own scope 2 reporting.

Additionally, all firms use market-based metrics for these scope 3 numbers, which means third-party data center emissions are also undercounted in official figures.

Of the firms that report their location-based scope 3 emissions in the footnotes, only Apple has a large gap between its official scope 3 figure and its location-based scope 3 figure, starting in 2022.

This gap can largely be attributed to data center emissions accounting. The only change to Apple’s scope 3 methodology in 2022 was to include “work from home, third-party cloud services, electricity transmission and distribution losses, and upstream impacts from scope 1 fuels”. Since the firm listed third-party cloud services as having zero emissions under its official scope 3 reporting, that means all emissions associated with those third-party services would only show up in location-based scope 3 emissions from 2022 onwards.

2025 and beyond

Even though big tech hides these emissions, they are due to keep rising. Data centers’ electricity demand is projected to double by 2030 due to the additional load that artificial intelligence poses, according to the Electric Power Research Institute.

Google and Microsoft both blamed AI for their recent upticks in market-based emissions.

“The relative contribution of AI computing loads to Google’s data centers, as I understood it when I left [in 2022], was relatively modest,” said Chris Taylor, current CEO of utility storage firm Gridstor and former site lead for Google’s data center energy strategy unit. “Two years ago, [AI] was not the main thing that we were worried about, at least on the energy team.”

Taylor explained that most of the growth that he saw in data centers while at Google was attributable to growth in Google Cloud, as most enterprises were moving their IT tasks to the firm’s cloud servers.

Whether today’s power grids can withstand the growing energy demands of AI is uncertain. One industry leader – Marc Ganzi, the CEO of DigitalBridge, a private equity firm that owns two of the world’s largest third-party data center operators – has gone as far as to say that the data center sector may run out of power within the next two years.

And as grid interconnection backlogs continue to pile up worldwide, it may be nearly impossible for even the most well-intentioned of companies to get new renewable energy production capacity online in time to meet that demand.


The world’s carbon emissions continue to rise. But 35 countries show progress in cutting carbon

In 2025 the world has fallen short, again, of peaking and reducing its fossil fuel use. But there are many countries on a path to greener energy.

Global fossil fuel emissions are projected to rise in 2025 to a new all-time high, with all sources – coal, gas, and oil – contributing to the increase. At the same time, our new global snapshot of carbon dioxide emissions and carbon sinks shows at least 35 countries have a plan to decarbonise. Australia, Germany, New Zealand and many others have shown statistically significant declines in fossil carbon emissions during the past decade, while their economies have continued to grow. China’s emissions have also been growing at a much slower pace than recent trends and might even be flat by year’s end.

As world leaders and delegates meet in Brazil for the United Nations’ global climate summit, COP30, many countries that have submitted new emissions commitments to 2035 have shown increased ambition. But unless these efforts are scaled up substantially, current global temperature trends are projected to significantly exceed the Paris Agreement target that aims to keep warming well below 2°C.

[Figure: These 35 countries are now emitting less carbon dioxide even as their economies grow. Global Carbon Project 2025, CC BY-NC-ND]

Fossil fuel emissions up again in 2025

Together with colleagues from 102 research institutions worldwide, the Global Carbon Project today releases the Global Carbon Budget 2025. This is an annual stocktake of the sources and sinks of carbon dioxide worldwide. We also publish the major scientific advances enabling us to pinpoint the global human and natural sources and sinks of carbon dioxide with higher confidence. Carbon sinks are natural or artificial systems, such as forests, which absorb more carbon dioxide from the atmosphere than they release.

Global CO₂ emissions from the use of fossil fuels continue to increase. They are set to rise by 1.1% in 2025, on top of a similar rise in 2024. All fossil fuels are contributing to the rise. Emissions from natural gas grew 1.3%, followed by oil (up 1.0%) and coal (up 0.8%).
Altogether, fossil fuels produced 38.1 billion tonnes of CO₂ in 2025.

Not all the news is bad. Our research finds emissions from the top emitter, China (32% of global CO₂ emissions), will increase significantly more slowly than its growth over the past decade, with a modest 0.4% rise. Emissions from India (8% of global) are projected to increase by 1.4%, also below recent trends.

However, emissions from the United States (13% of global) and the European Union (6% of global) are expected to grow above recent trends. For the US, a projected growth of 1.9% is driven by a colder start to the year, increased liquefied natural gas (LNG) exports, increased coal use, and higher demand for electricity. EU emissions are expected to grow 0.4%, linked to lower hydropower and wind output due to weather, which led to increased electricity generation from LNG. Uncertainties in currently available data also include the possibility of no growth or a small decline.

[Figure: Fossil fuel emissions hit a new high in 2025, but the growth rate is slowing and there are encouraging signs from countries cutting emissions. Global Carbon Project 2025, CC BY-NC-ND]

Drop in land use emissions

In positive news, net carbon emissions from changes to land use such as deforestation, degradation and reforestation have declined over the past decade. They are expected to produce 4.1 billion tonnes of carbon dioxide in 2025, down from the annual average of 5 billion tonnes over the past decade. Permanent deforestation remains the largest source of emissions. This figure also takes into account the 2.2 billion tonnes of carbon soaked up by human-driven reforestation annually.

Three countries – Brazil, Indonesia and the Democratic Republic of the Congo – contribute 57% of global net land-use change CO₂ emissions. When we combine the net emissions from land-use change and fossil fuels, we find total global human-caused emissions will reach 42.2 billion tonnes of carbon dioxide in 2025.
This total has grown 0.3% annually over the past decade, compared with 1.9% in the previous one (2005–14).

Carbon sinks largely stagnant

Natural carbon sinks in the ocean and terrestrial ecosystems remove about half of all human-caused carbon emissions. But our new data suggests these sinks are not growing as we would expect. The ocean carbon sink has been relatively stagnant since 2016, largely because of climate variability and impacts from ocean heatwaves. The land CO₂ sink has been relatively stagnant since 2000, with a significant decline in 2024 due to warmer El Niño conditions on top of record global warming. Preliminary estimates for 2025 show a recovery of this sink to pre-El Niño levels.

Since 1960, the negative effects of climate change on the natural carbon sinks, particularly on the land sink, have suppressed a fraction of their full sink potential. This has left more CO₂ in the atmosphere, raising the CO₂ concentration by an additional 8 parts per million. This year, atmospheric CO₂ levels are expected to reach just above 425 ppm.

Tracking global progress

Despite the continued global rise of carbon emissions, there are clear signs of progress towards lower-carbon energy and land use in our data. There are now 35 countries that have reduced their fossil carbon emissions over the past decade, while still growing their economies. Many more, including China, are shifting to cleaner energy production. This has led to a significant slowdown of emissions growth.

Existing policies supporting national emissions cuts under the Paris Agreement are projected to lead to global warming of 2.8°C above preindustrial levels by the end of this century. This is an improvement over the previous assessment of 3.1°C, although methodological changes also contributed to the lower warming projection. New emissions cut commitments to 2035, for those countries that have submitted them, show increased mitigation ambition.
This level of expected mitigation still falls far short of what is needed to meet the Paris Agreement goal of keeping warming well below 2°C.

At current levels of emissions, we calculate that the remaining global carbon budget – the carbon dioxide that can still be emitted before reaching specific global temperatures (averaged over multiple years) – will be used up in four years for 1.5°C (170 gigatonnes remaining), 12 years for 1.7°C (525 Gt) and 25 years for 2°C (1,055 Gt).

Falling short

Our improved and updated global carbon budget shows the relentless global increase of fossil fuel CO₂ emissions. But it also shows detectable and measurable progress towards decarbonisation in many countries. The recovery of the natural CO₂ sinks is a positive finding, but large year-to-year variability shows the high sensitivity of these sinks to heat and drought.

Overall, this year’s carbon report card shows we have fallen short, again, of reaching a global peak in fossil fuel use. We are yet to begin the rapid decline in carbon emissions needed to stabilise the climate.

Pep Canadell receives funding from the Australian National Environmental Science Program – Climate Systems Hub. Clemens Schwingshackl receives funding from the European Union’s Horizon Europe research and innovation programme and Schmidt Sciences. Corinne Le Quéré receives funding from the UK Natural Environment Research Council, the UK Royal Society, and the UK Advanced Research + Invention Agency. She was granted a research donation by Schmidt Futures (project CALIPSO – Carbon Loss In Plants, Soils and Oceans). Corinne Le Quéré is a member of the UK Climate Change Committee; her position here is her own and does not necessarily reflect that of the Committee. She is also a member of the Scientific Advisory Council of Societe Generale.
Glen Peters receives funding from the European Union’s Horizon Europe research and innovation programme. Judith Hauck receives funding from the European Union’s Horizon Europe research and innovation programme, the European Research Council and Germany’s Federal Ministry of Research, Technology and Space. Julia Pongratz receives funding from the European Union’s Horizon Europe research and innovation programme and Germany’s Federal Ministry of Research, Technology and Space. Mike O’Sullivan receives funding from the European Union’s Horizon Europe research and innovation programme, and the European Space Agency. Pierre Friedlingstein receives funding from the European Union’s Horizon Europe research and innovation programme. Robbie Andrew receives funding from the European Union’s Horizon Europe research and innovation programme and the Norwegian Environment Agency.
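As a back-of-the-envelope check on the carbon report card, the years-remaining figures quoted for each warming level follow from dividing the remaining budget by current annual emissions. A minimal sketch, assuming (as a simplification, not the report's own method) that emissions stay flat at the 2025 total of 42.2 billion tonnes of CO₂:

```python
# Remaining carbon budget divided by annual emissions gives a rough
# "years remaining" at each warming level. Budget figures are those
# quoted from the Global Carbon Budget 2025; holding emissions flat
# at the 2025 level is a simplifying assumption.
ANNUAL_EMISSIONS_GT = 42.2  # total human-caused CO2 emissions in 2025, GtCO2

REMAINING_BUDGET_GT = {  # remaining budget per warming level, GtCO2
    "1.5C": 170,
    "1.7C": 525,
    "2.0C": 1055,
}

years_left = {
    level: budget / ANNUAL_EMISSIONS_GT
    for level, budget in REMAINING_BUDGET_GT.items()
}

for level, years in years_left.items():
    print(f"{level}: ~{years:.0f} years at current emissions")
# -> roughly 4, 12 and 25 years, matching the figures above
```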

AI power use forecast finds the industry far off track to net zero

Several large tech firms that are active in AI have set goals to hit net zero by 2030, but a new forecast of the energy and water required to run large data centres shows they’re unlikely to meet those targets

A data centre in Ashburn, Virginia. JIM LO SCALZO/EPA/Shutterstock

As the AI industry rapidly expands, questions about the environmental impact of data centres are coming to the forefront – and a new forecast warns the industry is unlikely to meet net zero targets by 2030. Fengqi You at Cornell University in New York and his colleagues modelled how much energy, water and carbon today’s leading AI servers could use by 2030, taking into account different growth scenarios and possible data centre locations within the United States. They combined projected chip supply, server power usage and cooling efficiency with state-by-state electrical grid data to conduct their analysis.

While not every AI company has set a net zero target, some larger tech firms that are active in AI, such as Google, Microsoft and Meta, have set goals with a deadline of 2030. “The rapid growth of AI computing is basically reshaping everything,” says You. “We’re trying to understand how, as a sector grows, what’s going to be the impact?”

Their estimates suggest US AI server buildout will require between 731 million and 1.125 billion additional cubic metres of water by 2030, while emitting the equivalent of between 24 and 44 million tonnes of carbon dioxide a year. The forecast depends on how fast AI demand grows, how many high-end servers can actually be built and where new US data centres are located. The researchers modelled five scenarios based on the speed of growth, and identified various ways to reduce the impact.

“Number one is location, location, location,” says You. Placing data centres in Midwestern states, where water is more available and the energy grid is powered by a higher proportion of renewables, can reduce the impact. The team also pinpoints decarbonising energy supplies and improving the efficiency of data centre computing and cooling processes as major ways to limit the impact.
Collectively, those three approaches could cut the industry’s emissions by 73 per cent and its water footprint by 86 per cent.

But the group’s projections could also be scuppered by public opposition to data centre installations because of their potentially extractive impact on the environment. In Virginia, which hosts about one-eighth of global data centre capacity, residents have begun lodging opposition to further planned construction, citing the impact on their water reserves and the wider environment. Similar petitions against data centres have been lodged in Pennsylvania, Texas, Arizona, California and Oregon. Figures from Data Center Watch, a research firm tracking data centre development, suggest local opposition has stymied $64 billion worth of projects. However, it is unclear, even in places that have successfully rejected data centres, just how much power and water those facilities would have used.

That is why the new findings have been welcomed – albeit cautiously – by those who have attempted to study and quantify AI’s environmental impact. “AI is such a fast-moving field that it’s really hard to make any kind of meaningful future projections,” says Sasha Luccioni at AI company Hugging Face. “As the authors themselves say, the breakthroughs in the industry could fundamentally alter computing and energy requirements, like what we’ve seen with DeepSeek”, which used different techniques to reduce brute-force computation.

Chris Preist at the University of Bristol in the UK says “the authors are right to point out the need to invest in additional renewable energy capacity”, and adds that data centre location matters. “I think their assumptions regarding water use to directly cool AI data centres are pretty pessimistic,” he says, suggesting the model’s “best case” scenario is more like “business as usual” for data centres these days. Luccioni believes the paper highlights what is missing in the AI world: “more transparency”.
She explains that could be fixed by “requiring model developers to track and report their compute and energy use, and to provide this information to users and policymakers and to make firm commitments to reduce their overall environmental impacts, including emissions”.
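To make the scale of those combined mitigation measures concrete, the reported cuts can be applied directly to the forecast ranges. This is an illustrative scaling of the article's figures, not the researchers' own scenario model:

```python
# Apply the reported combined cuts (73% of emissions, 86% of water use)
# to the forecast ranges quoted above. Scaling both endpoints of a range
# by the same fraction is an illustrative simplification.
emissions_mt_co2 = (24.0, 44.0)      # Mt CO2-equivalent per year by 2030
water_million_m3 = (731.0, 1125.0)   # additional million cubic metres by 2030

def apply_reduction(low: float, high: float, cut: float) -> tuple[float, float]:
    """Scale a (low, high) range down by a fractional reduction."""
    keep = 1.0 - cut
    return (low * keep, high * keep)

print(apply_reduction(*emissions_mt_co2, 0.73))  # emissions after a 73% cut
print(apply_reduction(*water_million_m3, 0.86))  # water use after an 86% cut
```

Even under these best-case cuts, the remaining footprint is measured in millions of tonnes of CO₂ and over a hundred million cubic metres of water a year.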

Having children plays a complicated role in the rate we age

The effort of reproducing may divert energy away from repairing DNA or fighting illness, which could drive ageing, but a new study suggests that is only the case when environmental conditions are tough

Some say children keep you young, but it’s complicated. Javier Zayas/Getty Images

For millennia, we have tried to understand why we age, with the ancient Greek philosopher Aristotle proposing it occurs alongside the gradual drying up of the internal moisture necessary for life. In modern times, a leading idea known as the disposable soma hypothesis suggests that ageing is the price we pay for reproduction, with evolution prioritising the passing on of genes above all else. This creates a fundamental trade-off: the immense energy devoted to having and raising offspring comes at the cost of repairing DNA, fighting off illness and keeping organs in good shape. This may particularly apply to women, who invest more in reproduction than men via pregnancy and breastfeeding.

However, when scientists have tested this hypothesis by checking if women with more children live shorter lives, the results have been mixed: some studies support the idea, while others have found no effect. “It is very difficult to disentangle what is just correlation [between having more children and a shorter life] and what is the underlying causation, unless you have a good, big dataset that covers several generations,” says Elisabeth Bolund at the Swedish University of Agricultural Sciences, who wasn’t involved in the study.

Euan Young at the University of Groningen in the Netherlands and his colleagues hypothesised that the inconsistency between studies exists because the cost of reproduction isn’t fixed – it depends on a mother’s environment. “In good times, this trade-off isn’t really visible. The trade-off only becomes apparent when times are tough,” says Young. To investigate this idea, the researchers analysed the parish records of more than 4500 Finnish women, spanning 250 years. These included the period of the Great Finnish Famine from 1866 to 1868, providing a means to gauge how hard times affect reproduction and longevity, says Young.
They found that among the women who lived before or after the famine, or who didn’t have children during it, there was no significant association between the number of children they had and their lifespan. However, for the women who did have children during the famine, life expectancy decreased by six months for every child they had.

The study builds on research published last year that used a dataset from a pre-industrial population in Quebec, Canada, monitored over two centuries, which showed this trade-off in mothers who were probably in poor health or under great stress, but didn’t explore how this was affected by specific environmental conditions. In contrast, Young’s team points to a specific, catastrophic event as the driver that exposes the trade-off for mothers. “This very large dataset makes it feasible to account for confounding factors [such as genetics and lifestyle factors],” says Bolund. “The study gets us as close as we can to identifying causation without running a controlled experiment in the lab.”

The study also confirms the energetic demands of pregnancy and breastfeeding, which require hundreds of extra calories per day. During a famine, women can’t get this energy from food, so their bodies pay the price, “lowering basal metabolism [the minimum number of calories your body needs to function at a basic level] and thus slowing or shutting down other important functions, resulting in a decline in health and shorter lifespans”, says Young. It also explains why previous studies sometimes found the trade-off only in lower socioeconomic groups, which were effectively always living in relatively resource-scarce environments, he says.
According to Bolund, the fact that this trade-off seems to occur in particularly tough circumstances, and when women typically had many children, may partly explain why women generally live longer than men today, with girls born between 2021 and 2023 in the UK expected to live four years longer than their male counterparts. The costs of reproduction are now fairly low in Western societies, where the average number of children women give birth to has fallen considerably over the centuries, says Bolund. As a result, few women today will probably reach the threshold where the cost to their lifespan becomes obvious. Bolund and her colleagues’ research on a historical population in Utah, for instance, found this only appeared when women had more than five children – far more than the 1.6 births that the average woman in the US is expected to have in her lifetime.

Other environmental factors may therefore become more significant in explaining the lifespan gap between men and women. Men tend to be more likely to smoke than women and also drink more alcohol, both of which affect lifespan, says Bolund. The current longevity gap between men and women is probably a combination of women’s reduced reproductive costs compared with other times in history and lifestyle differences between the sexes. Research also suggests that sex chromosomal differences are involved. “Sexes differ in a multitude of ways, beyond reproductive costs, so we need to conduct more research into how different factors contribute to sex-specific ageing,” says Young.

Michigan OKs Landmark Regulations That Push Up-Front Costs to Data Centers

Michigan regulators have adopted landmark standards for the booming data center industry with a plan they say tries to protect residents from subsidizing the industry’s hefty energy use

Michigan regulators on Thursday adopted landmark standards for the booming data center industry with a plan they say tries to protect residents from subsidizing the industry’s hefty energy use.

In a 3-0 vote, the Michigan Public Service Commission adopted a rate structure that requires data centers and other energy-intensive industries in Consumers Energy’s territory to sign long-term power contracts with steep penalties for exiting early. The order also requires Consumers to show that data centers will shoulder all costs to build transmission lines, substations and other infrastructure before adding them to the grid.

Commission Chair Dan Scripps called it a “balanced approach” that shows Michigan is “open for business from data centers and other large load customers, while also leveraging those potential benefits of the growth … in a way that’s good for all customers.”

The deal disappointed some environmentalists, who had pushed for explicit requirements that data center power come from renewable sources. Michigan utilities are legally required to achieve 100% clean energy by 2040, and must detail how they plan to meet that requirement in filings next year. “While the order includes important consumer protection terms, the commission missed an opportunity to emphasize the importance of the state’s climate goals,” said Daniel Abrams, an attorney with the Environmental Law and Policy Center.

The rate structure applies to customers whose energy use exceeds 100 megawatts. Data centers are among the very few industries that demand that much power; often, they demand an order of magnitude more. Consumers serves 1.9 million customers across much of the Lower Peninsula.
Company spokesperson Matt Johnson said officials are still reviewing Thursday’s order and “its impact on all stakeholders.” “Consumers Energy intends to work hard to continue to attract new businesses, including data centers, to Michigan, in a way that benefits everyone and fuels the state’s economic development,” he added.

The deal comes amid an uncertain time for the data center industry, which is growing fast because of artificial intelligence. Much more energy is needed to power the transformation, but many industry analysts fear rising AI stocks are a bubble and demand for the technology won’t materialize, leaving utilities and ratepayers to pick up the infrastructure tab for failed projects. Hoping to avoid such an outcome, Consumers in February proposed special regulations that would lock data centers into 15-year contracts that guarantee consistent electricity use and require payments even if a facility ceases or downsizes operations mid-contract. The commission’s decision Thursday approves much of that request, with some significant modifications.

DTE takes a different approach

The other big utility in Michigan, DTE Energy, is taking a different approach. Rather than establishing a blanket rate structure like Consumers, DTE wants to negotiate its first data center contract individually while aiming to avoid public vetting of the deal. Michigan law allows such expedited reviews in cases that would bring no added costs to utility consumers. DTE officials argue adding the Stargate data center to its system will help keep rates down for everyone by spreading fixed costs among more paying customers.
“Given the sizable affordability benefits for our customers, as well as the economic impact the project will have, we think moving forward in this fashion makes the most sense,” spokesperson Jill Wilmot said. But DTE officials also stated in the company’s filing that it expects to spend some $500 million upgrading its transmission system and building a substation to serve the data center. Critics argue the utility is being so intentionally vague that it is impossible to vet DTE’s claims about affordability. “It’s just highly concerning that they are trying to keep this somewhat private, because there’s so much at stake,” said Bryan Smigielski, a Michigan organizer with the Sierra Club.

Michigan Attorney General Dana Nessel also opposes DTE’s quest for expedited review, and has requested a thorough vetting of the proposed contract. Members of the Public Service Commission have not decided whether to grant DTE’s request for quick approval, Scripps said.

Michigan’s data center electricity rate deliberations come amid a surge of interest from developers looking to take advantage of new tax breaks that could save the industry tens of millions of dollars. Lawmakers last year voted to exempt large data centers from Michigan’s 6% sales and use tax in an effort to lure the industry to the state. Beyond the Stargate campus, DTE is in late-stage negotiations for another 3 gigawatts’ worth of data center capacity, while Consumers Energy is nearing deals for three large data centers amounting to a collective 2 gigawatts of power. Developers are also scoping out rural land throughout the southern Lower Peninsula, from the Grand Rapids area to the outskirts of Monroe.

The wave of interest could have big implications for water and land use in Michigan. Hyperscale data centers occupy hundreds of acres apiece.
Those that use water vapor to cool the servers inside the facilities — the industry’s most common cooling technique — also use large amounts of water.

This story was originally published by Bridge Michigan and distributed through a partnership with The Associated Press. Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Why some quantum materials stall while others scale

In a new study, MIT researchers evaluated quantum materials’ potential for scalable commercial success — and identified promising candidates.

People tend to think of quantum materials — whose properties arise from quantum mechanical effects — as exotic curiosities. But some quantum materials have become a ubiquitous part of our computer hard drives, TV screens, and medical devices. Still, the vast majority of quantum materials never accomplish much outside of the lab.

What makes certain quantum materials commercial successes and others commercially irrelevant? If researchers knew, they could direct their efforts toward more promising materials — a big deal, since they may spend years studying a single material.

Now, MIT researchers have developed a system for evaluating the scale-up potential of quantum materials. Their framework combines a material’s quantum behavior with its cost, supply chain resilience, environmental footprint, and other factors. The researchers used their framework to evaluate over 16,000 materials, finding that the materials with the highest quantum fluctuation in the centers of their electrons also tend to be more expensive and environmentally damaging. The researchers also identified a set of materials that achieve a balance between quantum functionality and sustainability for further study.

The team hopes their approach will help guide the development of more commercially viable quantum materials that could be used for next-generation microelectronics, energy harvesting applications, medical diagnostics, and more.

“People studying quantum materials are very focused on their properties and quantum mechanics,” says Mingda Li, associate professor of nuclear science and engineering and the senior author of the work. “For some reason, they have a natural resistance during fundamental materials research to thinking about the costs and other factors. Some told me they think those factors are too ‘soft’ or not related to science.
But I think within 10 years, people will routinely be thinking about cost and environmental impact at every stage of development.”

The paper appears in Materials Today. Joining Li on the paper are co-first authors and PhD students Artittaya Boonkird, Mouyang Cheng, and Abhijatmedhi Chotrattanapituk, along with PhD students Denisse Cordova Carrizales and Ryotaro Okabe; former graduate research assistants Thanh Nguyen and Nathan Drucker; postdoc Manasi Mandal; Instructor Ellan Spero and Professor Christine Ortiz of the Department of Materials Science and Engineering (DMSE); Professor Liang Fu of the Department of Physics; Professor Tomas Palacios of the Department of Electrical Engineering and Computer Science (EECS); Associate Professor Farnaz Niroui of EECS; Assistant Professor Jingjie Yeo of Cornell University; and PhD student Vsevolod Belosevich and Assistant Professor Qiong Ma of Boston College.

Materials with impact

Cheng and Boonkird say that materials science researchers often gravitate toward quantum materials with the most exotic quantum properties rather than the ones most likely to be used in products that change the world. “Researchers don’t always think about the costs or environmental impacts of the materials they study,” Cheng says. “But those factors can make them impossible to do anything with.”

Li and his collaborators wanted to help researchers focus on quantum materials with more potential to be adopted by industry. For this study, they developed methods for evaluating factors like the materials’ price and environmental impact using their elements and common practices for mining and processing those elements. At the same time, they quantified the materials’ level of “quantumness” using an AI model created by the same group last year, based on a concept proposed by MIT professor of physics Liang Fu, termed quantum weight.

“For a long time, it’s been unclear how to quantify the quantumness of a material,” Fu says.
“Quantum weight is very useful for this purpose. Basically, the higher the quantum weight of a material, the more quantum it is.”

The researchers focused on a class of quantum materials with exotic electronic properties known as topological materials, eventually assigning over 16,000 materials scores on environmental impact, price, import resilience, and more. For the first time, the researchers found a strong correlation between a material’s quantum weight and how expensive and environmentally damaging it is.

“That’s useful information because the industry really wants something very low-cost,” Spero says. “We know what we should be looking for: high quantum weight, low-cost materials. Very few materials being developed meet that criteria, and that likely explains why they don’t scale to industry.”

The researchers identified 200 environmentally sustainable materials and further refined the list down to 31 candidates that achieved an optimal balance of quantum functionality and high-potential impact. The researchers also found that several widely studied materials exhibit high environmental impact scores, indicating they will be hard to scale sustainably. “Considering the scalability of manufacturing and environmental availability and impact is critical to ensuring practical adoption of these materials in emerging technologies,” says Niroui.

Guiding research

Many of the topological materials evaluated in the paper have never been synthesized, which limited the accuracy of the study’s environmental and cost predictions. But the authors say the researchers are already working with companies to study some of the promising materials identified in the paper. “We talked with people at semiconductor companies that said some of these materials were really interesting to them, and our chemist collaborators also identified some materials they find really interesting through this work,” Palacios says.
“Now we want to experimentally study these cheaper topological materials to understand their performance better.”

“Solar cells have an efficiency limit of 34 percent, but many topological materials have a theoretical limit of 89 percent. Plus, you can harvest energy across all electromagnetic bands, including our body heat,” Fu says. “If we could reach those limits, you could easily charge your cell phone using body heat. These are performances that have been demonstrated in labs, but could never scale up. That’s the kind of thing we’re trying to push forward.”

This work was supported, in part, by the National Science Foundation and the U.S. Department of Energy.
