
Data center emissions likely 662% higher than big tech claims. Can it keep up the ruse?

News Feed
Sunday, September 15, 2024


Emissions from in-house data centers of Google, Microsoft, Meta and Apple may be 7.62 times higher than official tally

Big tech has made some big claims about greenhouse gas emissions in recent years. But as the rise of artificial intelligence creates ever bigger energy demands, it’s getting hard for the industry to hide the true costs of the data centers powering the tech revolution.

According to a Guardian analysis, from 2020 to 2022 the real emissions from the “in-house” or company-owned data centers of Google, Microsoft, Meta and Apple are likely about 662% – or 7.62 times – higher than officially reported.
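(To relate the two figures: an increase of 662% means the true total is 1 + 662/100 = 7.62 times the reported one, so for every tonne of CO2 equivalent these companies officially report from their data centers, roughly 7.62 tonnes are actually emitted.)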

Amazon is the largest emitter of the big five tech companies by a mile – the emissions of the second-largest emitter, Apple, were less than half of Amazon’s in 2022. However, Amazon has been kept out of the calculation above because its differing business model makes it difficult to isolate data center-specific emissions figures for the company.

As energy demands for these data centers grow, many are worried that carbon emissions will, too. The International Energy Agency stated that data centers already accounted for 1% to 1.5% of global electricity consumption in 2022 – and that was before the AI boom began with ChatGPT’s launch at the end of that year.

AI workloads are far more energy-intensive for data centers than typical cloud-based applications. According to Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity to process as a Google search, and data center power demand will grow 160% by 2030. Goldman competitor Morgan Stanley’s research has made similar findings, projecting that data center emissions globally will accumulate to 2.5bn metric tons of CO2 equivalent by 2030.

In the meantime, all five tech companies have claimed carbon neutrality, though Google dropped the label last year as it stepped up its carbon accounting standards. Amazon is the most recent company to make such a claim, announcing in July that it had met its goal seven years early and that it had implemented a gross emissions cut of 3%.

“It’s down to creative accounting,” explained a representative from Amazon Employees for Climate Justice, an advocacy group composed of current Amazon employees who are dissatisfied with their employer’s action on climate. “Amazon – despite all the PR and propaganda that you’re seeing about their solar farms, about their electric vans – is expanding its fossil fuel use, whether it’s in data centers or whether it’s in diesel trucks.”

A misguided metric

The most important tools in this “creative accounting” when it comes to data centers are renewable energy certificates, or Recs. These are certificates that a company purchases to show it is buying renewable energy-generated electricity to match a portion of its electricity consumption – the catch, though, is that the renewable energy in question doesn’t need to be consumed by a company’s facilities. Rather, the site of production can be anywhere from one town over to an ocean away.

Recs are used to calculate “market-based” emissions, or the official emissions figures used by the firms. When Recs and offsets are left out of the equation, we get “location-based emissions” – the actual emissions generated from the area where the data is being processed.
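To make the mechanics concrete, here is a minimal sketch of the two accounting methods in Python. All figures and variable names are hypothetical, invented for illustration, and the market-based calculation is simplified (real reporting uses residual-mix and supplier-specific emission factors rather than a flat grid factor):

    # Hypothetical illustration of market-based vs location-based scope 2 accounting.
    # All figures are invented; they are not any company's reported data.

    consumption_mwh = 1_000_000        # electricity a data center consumed in a year
    grid_intensity = 0.4               # tCO2e per MWh on the local grid
    recs_purchased_mwh = 950_000       # renewable certificates bought, possibly far away

    # Location-based: emissions of the grid actually supplying the facility.
    location_based = consumption_mwh * grid_intensity

    # Market-based (simplified): consumption matched by Recs is reported as zero-emission,
    # regardless of where or when the renewable power was generated.
    unmatched_mwh = max(consumption_mwh - recs_purchased_mwh, 0)
    market_based = unmatched_mwh * grid_intensity

    print(f"location-based: {location_based:,.0f} tCO2e")  # 400,000
    print(f"market-based:   {market_based:,.0f} tCO2e")    # 20,000

In this toy case the facility’s reported footprint shrinks twenty-fold even though nothing about the local grid, or the emissions it produces, has changed.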

The trend in those emissions is worrying. If these five companies were one country, the sum of their “location-based” emissions in 2022 would rank them as the 33rd highest-emitting country, behind the Philippines and above Algeria.

Many data center industry experts also recognize that location-based metrics are more honest than the official, market-based numbers reported.

“Location-based [accounting] gives an accurate picture of the emissions associated with the energy that’s actually being consumed to run the data center. And Uptime’s view is that it’s the right metric,” said Jay Dietrich, the research director of sustainability at Uptime Institute, a leading data center advisory and research organization.

Nevertheless, Greenhouse Gas (GHG) Protocol, a carbon accounting oversight body, allows Recs to be used in official reporting, though the extent to which they should be allowed is contested among tech companies and has led to a lobbying battle between two factions over GHG Protocol’s rule-making process.

On one side there is the Emissions First Partnership, spearheaded by Amazon and Meta. It aims to keep Recs in the accounting process regardless of their geographic origins. In practice, this is only a slightly looser interpretation of what GHG Protocol already permits.

The opposing faction, headed by Google and Microsoft, argues that there needs to be time-based and location-based matching of renewable production and energy consumption for data centers. Google calls this its 24/7 goal, or its goal to have all of its facilities run on renewable energy 24 hours a day, seven days a week by 2030. Microsoft calls it its 100/100/0 goal, or its goal to have all its facilities running on 100% carbon-free energy 100% of the time, making zero carbon-based energy purchases by 2030.
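The practical difference between annual Rec matching and 24/7 matching can be shown with a toy example (all numbers invented): annual matching only asks that renewable purchases cover total yearly consumption, while hourly matching requires coverage in every single hour.

    # Toy comparison of annual Rec matching vs hourly ("24/7") matching.
    # Hypothetical two-hour window; all figures invented for illustration.

    consumption = [100, 100]   # MWh consumed in each hour
    renewables  = [200, 0]     # MWh of contracted renewable generation in each hour

    annual_matched = sum(renewables) >= sum(consumption)                   # True: 200 >= 200
    hourly_matched = all(r >= c for r, c in zip(renewables, consumption))  # False: hour 2 uncovered

    print(f"annually matched: {annual_matched}, matched 24/7: {hourly_matched}")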

Google has already phased out its Rec use, and Microsoft aims to do the same with low-quality “unbundled” (non-location-specific) Recs by 2030.

Academics and carbon management industry leaders alike are also against the GHG Protocol’s permissiveness on Recs. In an open letter from 2015, more than 50 such individuals argued that “it should be a bedrock principle of GHG accounting that no company be allowed to report a reduction in its GHG footprint for an action that results in no change in overall GHG emissions. Yet this is precisely what can happen under the guidance given the contractual/Rec-based reporting method.”

To GHG Protocol’s credit, the organization does ask companies to report location-based figures alongside their Rec-based figures. Despite that, no company includes both location-based and market-based metrics for all three subcategories of emissions in the bodies of their annual environmental reports.

In fact, location-based numbers are only directly reported (that is, not hidden in third-party assurance statements or in footnotes) by two companies – Google and Meta. And those two firms only include those figures for one subtype of emissions: scope 2, or the indirect emissions companies cause by purchasing energy from utilities and large-scale generators.

In-house data centers

Scope 2 is the category that includes the majority of the emissions that come from in-house data center operations, as it concerns the emissions associated with purchased energy – mainly, electricity.

Data centers should also make up a majority of overall scope 2 emissions for each company except Amazon, given that the other sources of scope 2 emissions for these companies stem from the electricity consumed by firms’ offices and retail spaces – operations that are relatively small and not carbon-intensive. Amazon has one other carbon-intensive business vertical to account for in its scope 2 emissions: its warehouses and e-commerce logistics.

For the firms that give data center-specific data – Meta and Microsoft – this holds true: data centers made up 100% of Meta’s market-based (official) scope 2 emissions and 97.4% of its location-based emissions. For Microsoft, those numbers were 97.4% and 95.6%, respectively.

The massive differences in location-based and official scope 2 emissions numbers showcase just how carbon intensive data centers really are, and how deceptive firms’ official emissions numbers can be. Meta, for example, reports its official scope 2 emissions for 2022 as 273 metric tons CO2 equivalent – all of that attributable to data centers. Under the location-based accounting system, that number jumps to more than 3.8m metric tons of CO2 equivalent for data centers alone – a more than 19,000 times increase.

A similar result can be seen with Microsoft. The firm reported its official data center-related emissions for 2022 as 280,782 metric tons CO2 equivalent. Under a location-based accounting method, that number jumps to 6.1m metric tons CO2 equivalent. That’s a nearly 22 times increase.
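(The multiple follows directly from the two reported figures: 6,100,000 divided by 280,782 is roughly 21.7, hence “nearly 22 times”.)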

While Meta’s reporting gap is more egregious, both firms’ location-based totals are higher largely because their official figures undercount data center emissions specifically: unreported data center-related emissions account for 97.4% of the gap between Meta’s location-based and official scope 2 numbers in 2022, and 95.55% of Microsoft’s.

Specific data center-related emissions numbers aren’t available for the rest of the firms. However, given that Google and Apple have scope 2 business profiles similar to Meta’s and Microsoft’s, it is likely that the gap between their location-based and official data center emissions is of a similar scale to the gap in their overall location-based scope 2 figures.

In total, the sum of location-based emissions in this category between 2020 and 2022 was at least 275% higher (or 3.75 times) than the sum of their official figures. Amazon did not provide the Guardian with location-based scope 2 figures for 2020 and 2021, so its official (and likely much lower) numbers were used for this calculation for those years.

Third-party data centers

Big tech companies also rent a large portion of their data center capacity from third-party data center operators (or “colocation” data centers). According to the Synergy Research Group, large tech companies (or “hyperscalers”) represented 37% of worldwide data center capacity in 2022, with half of that capacity coming through third-party contracts. While this group includes companies other than Google, Amazon, Meta, Microsoft and Apple, it gives an idea of the extent of these firms’ activities with third-party data centers.

Those emissions should theoretically fall under scope 3 – the category covering all emissions a firm is responsible for that can’t be attributed to the fuel or electricity it consumes directly.

When it comes to a big tech firm’s operations, this would encapsulate everything from the manufacturing processes of the hardware it sells (like the iPhone or Kindle) to the emissions from employees’ cars during their commutes to the office.

When it comes to data centers, scope 3 emissions include the carbon emitted from the construction of in-house data centers, as well as the carbon emitted during the manufacturing of the equipment used inside them. For third-party data centers a company partners with, scope 3 may also cover those same construction and manufacturing emissions, along with the facilities’ electricity-related emissions.

However, whether or not these emissions are fully included in reports is almost impossible to prove. “Scope 3 emissions are hugely uncertain,” said Dietrich. “This area is a mess just in terms of accounting.”

According to Dietrich, some third-party data center operators put their energy-related emissions in their own scope 2 reporting, so those who rent from them can put those emissions into their scope 3. Other third-party data center operators put energy-related emissions into their scope 3 emissions, expecting their tenants to report those emissions in their own scope 2 reporting.

Additionally, all firms use market-based metrics for these scope 3 numbers, which means third-party data center emissions are also undercounted in official figures.

Of the firms that report their location-based scope 3 emissions in the footnotes, only Apple has a large gap between its official scope 3 figure and its location-based scope 3 figure, starting in 2022.

This gap can largely be attributed to data center emissions accounting. The only change to Apple’s scope 3 methodology in 2022 was to include “work from home, third-party cloud services, electricity transmission and distribution losses, and upstream impacts from scope 1 fuels”. Since the firm listed third-party cloud services as having zero emissions under its official scope 3 reporting, that means all emissions associated with those third-party services would only show up in location-based scope 3 emissions from 2022 onwards.

2025 and beyond

Even though big tech hides these emissions, they are due to keep rising. Data centers’ electricity demand is projected to double by 2030 due to the additional load that artificial intelligence poses, according to the Electric Power Research Institute.

Google and Microsoft both blamed AI for their recent upticks in market-based emissions.

“The relative contribution of AI computing loads to Google’s data centers, as I understood it when I left [in 2022], was relatively modest,” said Chris Taylor, current CEO of utility storage firm Gridstor and former site lead for Google’s data center energy strategy unit. “Two years ago, [AI] was not the main thing that we were worried about, at least on the energy team.”

Taylor explained that most of the growth that he saw in data centers while at Google was attributable to growth in Google Cloud, as most enterprises were moving their IT tasks to the firm’s cloud servers.

Whether today’s power grids can withstand the growing energy demands of AI is uncertain. One industry leader – Marc Ganzi, the CEO of DigitalBridge, a private equity firm that owns two of the world’s largest third-party data center operators – has gone as far as to say that the data center sector may run out of power within the next two years.

And as grid interconnection backlogs continue to pile up worldwide, it may be nearly impossible for even the most well-intentioned of companies to get new renewable energy production capacity online in time to meet that demand.


Technique makes complex 3D printed parts more reliable

New research enables computer designs to incorporate the limitations of 3D printers, to better control materials’ performance in aerospace, medical, and other applications.

People are increasingly turning to software to design complex material structures like airplane wings and medical implants. But as design models become more capable, our fabrication techniques haven’t kept up. Even 3D printers struggle to reliably produce the precise designs created by algorithms. The problem has led to a disconnect between the ways a material is expected to perform and how it actually works.

Now, MIT researchers have created a way for models to account for 3D printing’s limitations during the design process. In experiments, they showed their approach could be used to make materials that perform much more closely to the way they’re intended to.

“If you don’t account for these limitations, printers can either over- or under-deposit material by quite a lot, so your part becomes heavier or lighter than intended. It can also over- or underestimate the material performance significantly,” says Josephine Carstensen, the Gilbert W. Winslow Associate Professor of Civil and Environmental Engineering. “With our technique, you know what you’re getting in terms of performance because the numerical model and experimental results align very well.”

The approach is described in the journal Materials and Design, in an open-access paper co-authored by Carstensen and PhD student Hajin Kim-Tackowiak.

Matching theory with reality

Over the last decade, new design and fabrication technologies have transformed the way things are made, especially in industries like aerospace, automotive, and biomedical engineering, where materials must reach precise weight-to-strength ratios and other performance thresholds. In particular, 3D printing allows materials to be made with more complex internal structures.

“3D printing processes generally give us more flexibility because we don’t have to come up with forms or molds for things that would be made through more traditional means like injection molding,” Kim-Tackowiak explains.

As 3D printing has made production more precise, methods for designing complex material structures have also advanced. One of the most advanced computational design techniques is known as topology optimization, which has been used to generate new and often surprising material structures that can outperform conventional designs, in some cases approaching the theoretical limits of certain performance thresholds. It is currently being used to design materials with optimized stiffness and strength, maximized energy absorption, fluid permeability, and more.

But topology optimization often creates designs at extremely fine scales that 3D printers have struggled to reliably reproduce. One problem is the size of the print head that extrudes the material. If the design specifies a layer to be 0.5 millimeters thick, for instance, and the print head is only capable of extruding 1-millimeter-thick layers, the final design will be warped and imprecise.

Another problem has to do with the way 3D printers create parts: a print head extrudes a thin bead of material as it glides across the printing area, gradually building parts layer by layer. That can cause weak bonding between layers, making the part more prone to separation or failure.

The researchers sought to address the disconnect between expected and actual material properties that arises from those limitations. “We thought, ‘We know these limitations in the beginning, and the field has gotten better at quantifying these limitations, so we might as well design from the get-go with that in mind,’” Kim-Tackowiak says.

In previous work, Carstensen developed an algorithm that embedded information about the print nozzle size into design algorithms for beam structures. For this paper, the researchers built on that approach to incorporate the direction of the print head and the corresponding impact of weak bonding between layers. They also made it work with more complex, porous structures that can have extremely elastic properties.

The approach allows users to add variables to the design algorithms that account for the center of the bead being extruded from the print head and the exact location of the weaker bonding region between layers. The approach also automatically dictates the path the print head should take during production.

The researchers used their technique to create a series of repeating 2D designs with various sizes of hollow pores, or densities. They compared those creations to materials made using traditional topology optimization designs of the same densities.

In tests, the traditionally designed materials deviated from their intended mechanical performance more than materials designed using the researchers’ new technique at material densities under 70 percent. The researchers also found that conventional designs consistently over-deposited material during fabrication. Overall, the researchers’ approach led to parts with more reliable performance at most densities.

“One of the challenges of topology optimization has been that you need a lot of expertise to get good results, so that once you take the designs off the computer, the materials behave the way you thought they would,” Carstensen says. “We’re trying to make it easy to get these high-fidelity products.”

Scaling a new design approach

The researchers believe this is the first time a design technique has accounted for both the print head size and weak bonding between layers.

“When you design something, you should use as much context as possible,” Kim-Tackowiak says. “It was rewarding to see that putting more context into the design process makes your final materials more accurate. It means there are fewer surprises. Especially when we’re putting so much more computational resources into these designs, it’s nice to see we can correlate what comes out of the computer with what comes out of the production process.”

In future work, the researchers hope to improve their method for higher material densities and for different kinds of materials like cement and ceramics. Still, they said their approach offered an improvement over existing techniques, which often require experienced 3D printing specialists to help account for the limitations of the machines and materials.

“It was cool to see that just by putting in the size of your deposition and the bonding property values, you get designs that would have required the consultation of somebody who’s worked in the space for years,” Kim-Tackowiak says.

The researchers say the work paves the way to design with more materials. “We’d like to see this enable the use of materials that people have disregarded because printing with them has led to issues,” Kim-Tackowiak says. “Now we can leverage those properties or work with those quirks as opposed to just not using all the material options we have at our disposal.”
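The article’s own 0.5-millimeter example hints at why deposition constraints matter. As a purely illustrative sketch (the function name, values, and the 1-millimeter bead width below are assumptions for this example, not the MIT team’s algorithm), here is how a designed feature width gets distorted when a printer can only deposit whole beads:

    # Toy illustration of the deposition constraint described above.
    # Assumes the printer lays down whole beads of a fixed width; this is not
    # the researchers' design algorithm, just a sketch of the limitation it encodes.

    def printed_width(designed_mm, bead_mm=1.0):
        """Approximate the width actually deposited for a designed feature width."""
        beads = max(1, round(designed_mm / bead_mm))  # printer can't deposit a partial bead
        return beads * bead_mm

    for designed in (0.5, 1.2, 2.0):
        actual = printed_width(designed)
        print(f"designed {designed} mm -> printed ~{actual} mm (deviation {actual - designed:+.1f} mm)")

In Carstensen and Kim-Tackowiak’s approach, constraints like this are built into the optimization itself rather than checked after the fact.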

Energy Department plans to claw back $13B in green funds

The Energy Department is planning to claw back $13 billion in unspent climate funds, it announced Wednesday. In a press release, the department said that it plans to "return more than $13 billion in unobligated funds initially appropriated to advance the previous Administration’s wasteful Green New Scam agenda." The press release did not specify exactly where the...

The Energy Department is planning to claw back $13 billion in unspent climate funds, it announced Wednesday. In a press release, the department said that it plans to "return more than $13 billion in unobligated funds initially appropriated to advance the previous Administration’s wasteful Green New Scam agenda." The press release did not specify exactly where the money would have otherwise gone or what it will be used for now, if anything. Spokespeople for the Energy Department did not immediately respond to The Hill's request for additional information. Asked about the money during the New York Times's Climate Forward event on Wednesday, Energy Secretary Chris Wright said the funds "hadn't been assigned to projects yet" but that they were aimed at subsidizing more wind and solar energy, as well as electric vehicles. The Trump administration has repeatedly sought to curtail spending on renewable energy — and set up barriers that hamper its deployment — while trying to expedite fossil fuels and nuclear power. The Energy Department has made several attempts to cut climate spending, including previous funding rescissions. The Environmental Protection Agency has separately sought to rescind billions of its own climate spending that was issued under the Biden administration.

States get a blueprint to speed up heat-pump adoption

States are ramping up efforts to get residents to switch from fossil-fuel-fired heating systems to all-electric heat pumps. Now, they’ve got a big new tool kit to pull from. Last week, the interagency nonprofit Northeast States for Coordinated Air Use Management, or NESCAUM, released an 80-page action plan laying…

Heat pumps are slowly catching on. In the U.S., the units outsold gas furnaces by their biggest-ever margin last year, but their share of the market is still modest. Citing data from the Air-Conditioning, Heating, and Refrigeration Institute, a trade association, Levin said that in 2021, heat pumps accounted for about 25% of the combined shipments of gas furnaces, heat pumps, and air conditioners, the three largest reported HVAC categories. In 2024, they’d risen to about 32%. “No matter how you look at it, there are still a lot of gas furnaces being sold, there are still a lot of one-way central air-conditioners being sold — all of which could really become heat pumps,” Levin said.

Produced in consultation with state agencies, environmental justice organizations, and technical and policy experts, the NESCAUM report lays out a diverse set of more than 50 strategies — both carrots and sticks — covering equity and workforce investments, obligations to reduce carbon, building standards, and utility regulation. A wide range of decision-makers, often in collaboration, can pull these levers — from utility regulators to governor’s offices, state legislatures, and energy, environment, labor, and economic development agencies. Here are six recommendations from the report that stand out.

Make heat pumps more accessible to lower-income and renter households. A number of barriers need to be overcome to make heat pumps available to these groups, who often struggle to afford the appliances or lack the autonomy to install them. For example, contractors can’t put heat pumps in homes with hazards like mold, lead, asbestos, and rotten beams, but the process to address these problems can itself cost tens of thousands of dollars. Philadelphia’s Built to Last program coordinates aid to carry out these necessary pre-electrification repairs. On the other side of the country, California is launching a program this fall to install heat pumps in qualifying low- and moderate-income homes — for free. Notably, owners of low-income multifamily buildings can also use the program to upgrade their tenants’ heating systems, but they must agree to keep rent from increasing more than 3% per year for up to 10 years after the project.

Set an all-electric standard for new buildings. States have the ability to establish the minimum health, safety, and energy standards that developers must adhere to. New York recently became the first state to require that most new buildings be electric only, making heat pumps the default heating appliances. The rules withstood a legal challenge in July and take effect on Dec. 31.

Use building performance standards to encourage heat pumps in existing structures. Such standards require building owners to meet specific annual limits on energy use or carbon emissions and bring them down over time, or face penalties. Several states and cities have already developed these rules. Maryland, for one, stipulates that owners of most buildings 35,000 square feet or greater must report their CO2 emissions starting this year, hit standards by 2030, and fully ditch fossil-fueled appliances by 2040.

Leverage emissions rules that improve air quality and protect public health. For example, in 2023, the San Francisco Bay Area air district, home to more than 7 million people, set landmark rules requiring that new residential water and space heaters don’t spew health-harming nitrogen oxides, starting in 2027 and 2029, respectively. Heat pumps fit the bill. Switching to the tech nationwide could avert more than 2,600 premature deaths annually, according to electrification advocacy nonprofit Rewiring America.

Push utilities to deliver clean heat. States can require utilities to slash emissions and electrify buildings. For example, in 2021, Colorado adopted a first-in-the-nation clean-heat law doing just that. Lawmakers also mandated that utilities file their implementation plans for approval. In 2024, regulators greenlit a $440 million proposal from Xcel Energy, the state’s largest utility, which included electrifying 200,000 homes with heat pumps by 2030. Maryland is developing a similar standard.

Reform electricity rates so that they incentivize zero-emissions heating. Households with heat pumps tend to use more electricity than other customers, which means they pay disproportionately for fixed costs to maintain the grid on their energy bills. Utilities can correct that imbalance with adjusted rates. For example, Massachusetts has required its three major electric utilities to offer discounted winter electricity rates to households with heat pumps. Elizabeth Mahony, commissioner of the state’s Department of Energy Resources, said she expects the new rates to save heat-pump owners on average $540 per year.

NESCAUM’s Levin stressed that the report is “a menu — not a recipe.” Each state will need to consider its own goals and constraints to pick the approaches that fit it best, she added. Still, “I see [heat-pump electricity] rates as one of the areas that’s most promising,” Levin said. Massachusetts’ reforms “are really going to change their customer economics to make it more attractive to switch to a heat pump.” When done right, rate design also avoids the need for states to find new funding. “You’re not raising costs on anybody, you’re only reducing costs,” Levin said. At a time when households are seeing energy prices rise faster than inflation, the tactic could have widespread political appeal, she noted. NESCAUM plans to check back in with states and report out on their progress each year, Levin said. “The cool thing about our work is that we bring states together to learn from one another,” she added. “Part of making this transition happen more rapidly is lifting up the things that are really working well.”

New California law could expand energy trading across the West

After years of failed attempts, California lawmakers have cleared the way to create an electricity-trading market that would stretch across the U.S. West. Advocates say that could cut the region’s power costs by billions of dollars and support the growth of renewable energy. But opponents say it may make the state’s…

After years of failed attempts, California lawmakers have cleared the way to create an electricity-trading market that would stretch across the U.S. West. Advocates say that could cut the region’s power costs by billions of dollars and support the growth of renewable energy. But opponents say it may make the state’s climate and clean-energy policies vulnerable to the Trump administration.

Those are the fault lines over AB 825, also known as the “Pathways Initiative” bill, which was signed into law by Democratic Gov. Gavin Newsom on Sept. 19 as part of a major climate-and-energy legislative package. The law will grant the California Independent System Operator (CAISO), which runs the transmission grid and energy markets in most of the state, the authority to collaborate with other states and utilities across the West to create a shared day-ahead energy-trading regime.

Passage of this bill won’t create that market overnight — that will take years of negotiations. CAISO’s board wouldn’t even be allowed to vote on creating the market until 2028. But for advocates who’ve been working for more than a decade on plans for a West-wide regional energy market, it’s a momentous advance. “We’ve shot the starting gun,” said Brian Turner, a director at clean-energy trade group Advanced Energy United, which was outspoken in support of the legislation.

Today, utilities across the Western U.S. trade energy via bilateral arrangements — a clunky and inefficient way to take advantage of cheaper or cleaner power available across an interconnected transmission grid. An integrated day-ahead trading regime could drive major savings for all participants — nearly $1.2 billion per year, according to a 2022 study commissioned by CAISO.

That integrated market could create opportunities for solar power from California and the Southwest and wind power from the Rocky Mountains and Pacific Northwest to be shared more efficiently, driving down energy costs and increasing reliability during extreme weather. Lower-cost power more readily deliverable to where it’s needed could also reduce consumers’ monthly utility bills — a welcome prospect at a time of soaring electricity rates.

The regional energy market plan is backed by a coalition that includes clean-energy trade groups such as Advanced Energy United and the American Clean Power Association; environmental groups including the Sierra Club, Union of Concerned Scientists, and the Natural Resources Defense Council; business groups including the California Chamber of Commerce and the Clean Energy Buyers Association; and the state’s major utilities. It also has the backing of U.S. senators representing California, Oregon, and Washington, all states with strong clean-energy goals.

Assemblymember Cottie Petrie-Norris, a Democrat who authored AB 825, said in a statement following its passage that it “will protect California’s energy independence while opening the door to new opportunities to build and share renewable power across the West.”

But consumer advocates, including The Utility Reform Network, Consumer Watchdog, and Public Citizen, say the bill as passed fails to protect that energy independence. The Center for Biological Diversity and the Environmental Working Group share their concerns. They fear a new trading market will allow fossil fuel–friendly states like Idaho, Utah, and Wyoming to push costly, dirty coal power into California — and give an opening to the Trump administration to use the federal government’s power over regional energy markets to undermine the state’s clean-energy agenda.

What a Western energy market could achieve

The arguments for a day-ahead energy-trading market can be boiled down to a simple concept, Turner said — bigger is better. Being able to obtain power from across the region could reduce the amount of generation capacity that individual utilities have to build. And tapping into energy supplies spanning from the Pacific Ocean to the Rocky Mountains would allow states undergoing heat waves and winter storms to draw on power from parts of the region that aren’t under the same grid stress, improving resiliency against extreme weather.

A Western trading market could also serve as a starting point for even more integrated activity between the dozens of utilities in the region that now plan and build power plants and transmission grids in an uncoordinated way. A 2022 study commissioned by Advanced Energy United found that a regional energy organization could yield $2 billion in annual energy savings, enable up to 4.4 gigawatts of additional clean power, and create hundreds of thousands of permanent jobs.

For advocates of a Western market, the chief challenge has been to design a structure that doesn’t give up California’s control over its own energy and climate policies, but allows other states and their utilities a share of decision-making authority over how the market works. Taking a lead on that design work has been the West-Wide Governance Pathways Initiative, a group of utilities, state regulators, and environmental and consumer advocates.
