
These 4 companies’ excellence in innovation spans multiple categories

Thursday, November 30, 2023

Fast Company’s Next Big Things in Tech awards for 2023 honor 119 innovations that are paying dividends right now—and hold the potential to drive further progress over the next five years. We decided to give four organizations an additional Excellence in Innovation award to acknowledge the breadth of their ingenuity. Among well-known companies, Adobe and the Walt Disney Co. have been busy imbuing multiple areas of their businesses with new technologies. Meanwhile, Phasecraft has taken on the big, essential challenge of figuring out how quantum-computing software should work. And Wiliot’s fresh approach to the internet of things can help companies wrangle everything from groceries to gadgetry.

Adobe: AI everywhere—but responsibly

Years before the rise of ChatGPT and Midjourney, Adobe knew that AI would be both a solution and a problem. In late 2019, the company introduced the Content Authenticity Initiative, in collaboration with The New York Times and Twitter, as a “nutrition label” for digital content, allowing users to see, for instance, if an image had been doctored in Photoshop. That’s led to the launch of open-source tools for developers and technical standards for the industry.

[Illustration: Ard Su]

Adobe envisions a system in which users can click on an online image to reveal details about its origins. It has debuted a “content credentials” symbol that can be inserted through apps such as Adobe Photoshop and Bing Image Creator. The company has also demoed a website for comparing an image to its original version. To boost adoption, it’s partnering with camera makers; media organizations, including The New York Times; and other tech companies, such as Microsoft. Discussions with social networks are ongoing.

Dana Rao, Adobe’s general counsel and chief trust officer, says the idea emerged when the company was merely teasing new AI features but has gained traction over the past year with the rise of viral deepfakes. “It will be imperative,” he says, “for people to have a way to know what they can believe.”

At the same time, Adobe is pushing ahead with its own AI advancements, including image generation in Photoshop, object removal for videos in Premiere Pro, and a text-based video editor that lets users move text around by copying and pasting a computer-generated transcript. The Content Authenticity Initiative is Adobe’s way of shining a light on these capabilities—both from Adobe and others—instead of snuffing them out.

“When we think about the consequences of AI,” Rao says, “the answer isn’t going to be, ‘Stop using it.’ ” —Jared Newman

Phasecraft: The software side of quantum

Quantum computers promise to upend whole industries and our understanding of the world, but despite billions in investment and some scientific feats, they still can’t do much. By carefully manipulating tiny physical building blocks known as qubits, the machines aim to exploit the weird behavior of the subatomic world and process vast amounts of data more quickly than classical computers can. To date, the effectiveness of quantum computers has been limited by the challenges involved in increasing the number of qubits from a few hundred to many thousands.

[Illustration: Ard Su]

U.K.-based Phasecraft is building algorithms to help nudge these noisy, finicky machines to quantum advantage sooner. “The more we can do on the algorithm side, the less we have to wait for the hardware to improve,” says cofounder Toby Cubitt (yes, his name sounds the same as qubit).

By redesigning an algorithm for simulating electrons, Phasecraft researchers cut the number of qubits required by a factor of a million, approaching a point that will enable existing quantum computers to work. Experimenting on industry-leading machines at IBM, Google, and Rigetti (and with $21 million in venture funding), the 20-person startup is building an AI-enhanced software pipeline to tackle the physics-modeling problems that could unlock breakthroughs in batteries and solar energy.

Cubitt, a professor of quantum information at University College London, expects the first useful computations within three years. Getting there will require Phasecraft to keep developing “a really deep understanding of how the hardware works on a physics level,” he says. “And sometimes even the hardware companies don’t know this.” —Alex Pasternack

The Walt Disney Co.: From ads to virtual stages

People think of the House of Mouse as an entertainment behemoth, but over the past year, Disney has been seriously flexing its tech chops on a number of fronts: A new internal ad server gives the company the programmatic capability to process more than 5 billion advertising impression queries per day across Hulu and Disney+, and new special-effects technology is transforming its studios.

When The Mandalorian debuted in 2019, it reinvigorated interest in the Star Wars universe while simultaneously revolutionizing green-screen technology and virtual filmmaking. Earlier this year, Disney debuted the latest evolution of that StageCraft technology for ESPN: Catalyst Stage deploys massive LED displays with real-time rendering to create 3D studio environments. Using such cutting-edge technology systems as GhostFrame, Unreal Engine, and Disguise XR, Catalyst Stage allows producers to put talent anywhere—football locker rooms, outdoor basketball courts, hockey rinks—and create multiple designs and backdrops for SportsCenter sets, all without leaving the studio.

[Illustration: Oscar Duarte]

To achieve that, the company created a breakthrough user-interface technology called GRACE (Graphic Real-time Automation and Control Environment) that integrates all of those tech systems within the existing ESPN production ecosystem, allowing Disney technologists to manipulate and move seamlessly between 3D virtual scenes on the stage.

“For us, the technology that became available and could work synchronously was something we all recognized as a big leap in possibility,” says Christiaan Cokas, Disney Entertainment and ESPN director for creative automation and studio technology. “That’s what the Catalyst Stage is—a dance of so much technology working together seamlessly in order to make the magic happen.” —Jeff Beer

Wiliot: The internet of things, only better and cheaper

The “internet of things” may no longer be the term du jour, in part because the idea failed to meet its original promise of connecting billions of dumb objects to the cloud in service of mitigating all manner of potential supply-chain disruptions. But IoT technology remains a key to confronting some of our biggest business and environmental challenges. Wiliot makes an inexpensive Bluetooth beacon tag that can be attached to everything from food to drugs to apparel, enabling real-time tracking of location and condition. The stamp-size tag—powered by radio waves in the environment—contains an ARM processor and a radio, and it continually sends signal data collected from its various sensors to the Wiliot cloud for analysis. The results can be viewed via an app. In one use case, a COVID vaccine maker affixed a Wiliot tag to its vials to make sure the vaccine remained cool enough to retain its potency. Wiliot recently introduced sensors that can monitor humidity levels, which could help protect a range of products, from produce to electronics.
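To make the cold-chain use case above concrete, here is a minimal sketch of the kind of check such a system might run on tag readings. Everything here is hypothetical and illustrative: the function name, threshold, and data are invented for this example and are not Wiliot's actual API.

```python
# Hypothetical sketch of cold-chain monitoring, as in the vaccine use case
# described above. Names, thresholds, and readings are illustrative only.

def flag_cold_chain_breaches(readings, max_temp_c=8.0):
    """Return the (timestamp, temperature) pairs exceeding the allowed maximum."""
    return [(ts, t) for ts, t in readings if t > max_temp_c]

# Simulated sensor readings: (timestamp, temperature in degrees Celsius)
readings = [("09:00", 4.2), ("09:05", 4.8), ("09:10", 9.1), ("09:15", 5.0)]
breaches = flag_cold_chain_breaches(readings)
print(breaches)  # the 09:10 reading exceeds the 8.0 C threshold
```

In a real deployment the analysis would run in the vendor's cloud against streamed sensor data rather than an in-memory list; the point is only that condition tracking reduces to comparing readings against a product-specific threshold.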

[Illustration: Simo Liu]

“It’s hard to appreciate how dark and offline the physical world is,” says Wiliot chief marketing officer Steve Statler. “As we talk to our customers, we realize that it’s really kind of a random form of chaos out there in terms of how we handle our supply chains.” The Tel Aviv, Israel–based company has raised more than $250 million in funding and now counts 30 of the world’s 500 largest businesses as customers. It’s primarily focused on providing its tags to traditional grocery retailers, who have relatively little visibility into their supply chains: “We’re enabling brick-and-mortar grocery stores to survive and thrive against larger, digital-first competitors,” Statler says. —Mark Sullivan

The companies behind these technologies are among the honorees in Fast Company’s Next Big Things in Tech awards for 2023. See a full list of all the winners across all categories and read more about the methodology behind the selection process.


Tech Still Isn’t Doing Enough to Care for the Environment

Priscilla Chomba-Kinywa, CTO of Greenpeace, says technology firms must shape up—and consumers and business clients should walk away if they don’t.

We are in a climate crisis, and technology can be either a part of the problem or a force for good, says Greenpeace CTO Priscilla Chomba-Kinywa. According to the Intergovernmental Panel on Climate Change, she explains, we have “less than seven years before Earth becomes really difficult to live on.” Last year alone, the world witnessed wildfires in North America, floods in Southern Africa, and even the double tragedy of floods and fires in places like Greece, she says.

Social media allows people from across the world to communicate, but “we’re seeing misinformation, disinformation, and a wanton disregard for sustainability by some of these platforms—and unfortunately, people don’t have many other options.”

Chomba-Kinywa says that VCs, startups, investors, and technologists should invest in alternative platforms “that are green, that are ethical, that are value-based, and that give us an alternative to what we have right now, being built by people so passionate about the environment that they will not sell out in the name of profits.” Even though conventional investment is supposed to maximize shareholder value, she argues, investing in these platforms is a price worth paying, as customers will soon be demanding action.

Chomba-Kinywa salutes companies already taking action—such as Hyundai, which recently committed to stop supplying the heavy machinery used for illegal mining in the Amazon. This was possible, she says, through the use of satellite imagery and pressure from leaders in Indigenous communities, which led to a report that Hyundai couldn’t ignore.

Good data, she explains, is vital—Greenpeace has been using it since 2009 to persuade some tech giants to switch to 100 percent renewable energy. For those that refused, the campaigning NGO just walked away. Other organizations should do the same, she says.

“What if you could use your influence to apply pressure on these organizations to change?” she asks. “Say, ‘We’ve looked at the data, we’ve looked at your plans. You’re not doing enough, and we won’t give you our money.’ Then maybe we can make a little bit more of a change.”

Finally, she says businesses need to work with communities from places like Senegal, Zambia, Nigeria, Bangladesh, and Mexico to understand and support their movements. “Sit with the elders in their communities, listen to the Indigenous knowledge that allowed them to coexist with nature, and start to reapply some of those principles,” she suggests. “They are scrambling for their lives.”

Chomba-Kinywa also says that conversations on AI need to focus on the planet. “We’re talking about values, ethics, and putting guardrails in place—but we can’t do that without talking about the environment,” she argues. “We need to think through the environmental cost of AI. It has the potential to help us solve some of humanity’s grand challenges, but that’s only useful if humanity has a livable planet.”

This article appears in the March/April 2024 issue of WIRED UK magazine.

The smog case before the Supreme Court puts America's air quality at risk

On Feb. 21, the Supreme Court will hear arguments in yet another challenge to environmental regulations

In recent years, the U.S. Supreme Court has become an abattoir for environmental laws. The casualties so far include a comprehensive plan to control greenhouse gas (GHG) emissions from power plants, Clean Water Act protections for a hundred million acres of wetlands, and — soon — the Chevron deference principle that has kept judges from second-guessing many of EPA’s regulations.

On February 21, the Supreme Court will hear arguments in yet another challenge by industries and conservative states to a federal regulation. And yet again, the Court isn’t just threatening our health. It’s challenging the legal principles that used to limit the power of unelected judges in our society.

As with many of the Supreme Court’s recent forays into environmental law, the Court is hearing Ohio v. EPA in an unprecedented context. No lower court has said that the regulation at issue violates the law. Instead, industry and their allies are asking the Court to temporarily block the regulation even though a lower court refused to do so before ruling on its validity — and they aren’t asking the high court to rule on the regulation’s validity itself.

To understand the threat, it helps to understand how the Clean Air Act works. Like many environmental laws, the Act gives states a major regulatory role. EPA scientists decide how polluted our air can be, but it’s up to state officials to develop plans for achieving that minimum air quality. This system gives states considerable flexibility, but there’s a problem: downwind states face a disadvantage. Their air drifts in from other states, pre-polluted by sources they can’t control. To address that problem, Congress put a backstop provision in the Act that prohibits one state from running air quality programs that create air quality problems for another. Hence the name: the Good Neighbor Provision.
Last year, EPA concluded that 21 states are not doing enough to help their downwind neighbors meet air quality standards for ground-level ozone, better known as smog. EPA “disapproved” inadequate Clean Air Act implementation plans in those states, and — as the Clean Air Act requires — set out a federal plan to replace them. EPA’s “Good Neighbor Plan” requires upwind states to control major pollution sources inside their borders (think cement kilns, power plants, incinerators) to help their neighbors meet smog standards.

The smog standards we’re talking about are neither new nor ambitious, by the way. While they were drafted during the Obama administration, the Trump administration kept them, and endorsed them again in 2020 with the support of GOP politicians. Even the industry-aligned National Association of Manufacturers agrees that they are “based on sound science” and would support “sustainable domestic growth.”

Because smog is so harmful, particularly to children, actually achieving these standards would deliver enormous benefits. That’s what the Good Neighbor Plan aims to do, and that’s why it’s worth implementing now. The EPA’s analysis shows that for an annual cost of $910 million, the plan would deliver health benefits worth $4 billion to over $15 billion. For a 1 percent increase in electricity costs, the rule will prevent over 2,000 hospital visits, avoid 25,000 lost days of work, and avoid 430,000 missed school days for kids whose asthma would otherwise keep them at home. All that in 2026 alone.

Polluters in upwind states, of course, focus only on one side of that ledger. They warn of impossible standards and crippling costs. Companies driven by short-term returns are practically required to say these things before regulations come into effect. But history shows that they then innovate to cut pollution in new and cheaper ways.
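The cost-benefit figures quoted above imply a simple ratio. A quick arithmetic check, using only the numbers cited in the article ($910 million in annual cost against $4 billion to $15 billion in health benefits):

```python
# Benefit-cost ratio implied by the EPA figures quoted above.
# All dollar amounts are in billions, as cited in the article.
annual_cost_billion = 0.91
benefits_billion = (4.0, 15.0)

low_ratio = benefits_billion[0] / annual_cost_billion   # roughly 4.4
high_ratio = benefits_billion[1] / annual_cost_billion  # roughly 16.5
print(f"Benefits exceed costs by roughly {low_ratio:.1f}x to {high_ratio:.1f}x")
```

Even at the low end of EPA’s estimate, projected benefits exceed the plan’s annual cost severalfold, which is the ledger the article says polluters ignore.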
As a result, studies suggest that EPA is likely overestimating the costs of the Good Neighbor Plan.

That doesn’t mean that we should deny polluters their day in court. Congress recognized that polluters will always question new air regulations. But it also knew that having this play out in multiple courts would create chaos. So the Clean Air Act requires anyone who wants to challenge regulations like the Good Neighbor Rule to do so in a single court that can hear them all together: the D.C. Circuit.

Cue the gamesmanship. Polluting industries and some conservative states skirted this requirement by challenging the EPA decision that disapproved their state plans — not the federal Good Neighbor Plan. By taking a national matter to regional courts that didn’t consider the full consequences of their rulings, they got decisions that effectively blocked the Plan, at least temporarily, in 12 states.

The challengers then sought to leverage this patchwork of regional stays into a national stay of the entire Good Neighbor Plan. They told the D.C. Circuit that if the Plan couldn’t go into effect everywhere, it shouldn’t go into effect anywhere — that the court should stay, that is, pause, the entire project until they could finish all their legal challenges. The D.C. Circuit declined to do so before ruling on the plan’s validity, but the challengers knew where to go next.

The Supreme Court used to reserve its own stay authority for true emergencies. There’s none here. By the time the Court hears arguments, it’ll have been over four months since the stay applications were filed. If the Court really thought that the D.C. Circuit had mishandled an emergency situation, surely it would have moved more quickly. What’s worse, the industry petitioners haven’t identified any obvious legal flaw of the type normally required to justify a stay.
They’re just asking the Court to second-guess EPA’s decision on a legally complex issue involving exhaustive scientific analysis. Judged by the normal standards of Supreme Court practice, it’s surprising — shocking, even — that the Court hasn’t already denied relief.

The Supreme Court could easily have let these challenges, like thousands of other challenges brought every year, proceed through the lower courts in the normal course of business. It is hard to avoid the conclusion that it is treating environmental and other public health regulations as guilty until proven innocent. That is not the way our legal system is supposed to work.

The proper response would have been for the Court to wave away any request to stay EPA’s Good Neighbor Plan before the D.C. Circuit rules on its validity. It still can. Our right-wing Justices should follow the rules the Court created for these situations, rather than their ambition to serve as our regulators-in-chief.

Long Live the Street Grid

After falling out of favor, one of city planning’s greatest innovations is making a comeback.

Growing up in Lexington, Kentucky, my best friend and I lived only a quarter mile apart as the crow flies. We had nearly identical houses, both clad in a blend of brick and vinyl that allowed our newly minted middle-class parents to signal status without breaking the bank. Getting to each other’s homes should have been simple.

The trouble was, we lived on opposite ends of two cul-de-sac neighborhoods, each fronting a busy corridor that had once been a farm road. A strictly legal trip from his house to mine involved a 25-minute, mile-long trek along aimless streets, largely without a sidewalk. So we cheated, cutting through backyards to the howls of homeowners. This was the early 2000s; privacy fences have since been installed that probably would have ended our friendship.

Ours was a problem that city planning was supposed to prevent: Cities were meant to grow along a coordinated pattern of easily navigable streets and public spaces. Until the 20th century, they did. The street grid—an innovation as useful today as in antiquity—reigned. But about a century ago, when the modern era of American city planning began, the grid fell out of favor. Arterial roads and winding cul-de-sacs, far friendlier to cars than to pedestrians, were ascendant.

In one sense, my friend and I lived in the most planned environment in history. Every building around us was subject to a set of rigid regulations. If our neighbor turned her garage into an apartment or adjusted the pitch of her roof, zoning enforcers would be out in 24 hours. But when it came to the public realm—the space between buildings that ties a city together—there was no plan, except to move cars through a landscape of lawns.

We were the victims of an American approach to city planning that had lost its way. But the next generation of kids may not be so unlucky.
After a long demise, the grid is showing signs of a comeback.

Humans have been doing something like city planning for millennia, though it hasn’t always been called that.

In the West, our city-planning tradition traces its roots to Hippodamus of Miletus, who laid out the ancient-Greek port of Piraeus in a rectangular grid expanding outward from an agora. The Romans replicated this model in colonies across the Mediterranean, adding sewers and stormwater infrastructure. They so loved their sewers, in fact, that they designated a goddess to preside over them.

The grid carried over to the New World. The Spanish empire instructed colonists to plan cities around a town square with a church and a royal council, violently imposed on top of existing Mesoamerican cities. From these twin centers of Spanish power, grids shot outward.

The American colonial period marked a renaissance in city planning, as settlers inspired by Enlightenment ideas platted hundreds of street grids on dispossessed land, each testing new designs. In 1682, the surveyor Thomas Holme laid out a rectangular grid connecting the Schuylkill and Delaware Rivers, with four parks of equal size orbiting a central plaza—the basis for Philadelphia. In 1811, surveyors divided a largely uninhabited Manhattan into 155 streets and a dozen grand avenues, laying the groundwork for New York City.

On occasion, American city planners even infused grids with spiritual significance. Joseph Smith’s “plat of Zion”—a grid design that eventually served as the basis for Salt Lake City—was said to be divinely inspired.

Divinity aside, there was good reason to love grids. As Alain Bertaud—a former city planner for the World Bank—points out, planning a grid in advance of growth allows surveyors to demarcate the public and private realms, reserving space for necessary infrastructure and ensuring that future expansion follows a coherent pattern.
That might sound restrictive, but the result is a blank canvas that empowers cities to grow and adapt.

In the 19th century, the landscape architect Frederick Law Olmsted spiced up America’s grids with citywide parks systems in places such as Louisville, Kentucky, and Buffalo, New York. A young City Beautiful movement carved grand boulevards and civic plazas out of capitals such as Denver and Washington, D.C. Transit companies blanketed metropolises such as Los Angeles and Chicago with networks of streetcars and commuter trains. Without ever using the phrase city planning, Americans had, by the dawn of the 20th century, perfected a formula for planning cities.

And then we invented city planning.

Beginning in the 1910s, the first local city-planning departments were established, not so much to plot out the physical growth of cities but to implement a novel policy: zoning. Zoning shifted the focus of city planning from stewarding the public realm to managing private development. The forebears of professional planners were unconcerned with land uses and densities, allowing mixed-use neighborhoods to emerge. But zoning remade cities into a fragmented landscape of malls, office parks, and residential subdivisions.

Developers lost the power to decide what to build on any given lot. But they gained unprecedented powers to shape the public realm; as long as the streets were sufficiently wide and the corners sufficiently rounded, they could plot out new streets untethered from any broader plan. With federal backing, new neighborhoods became a collection of winding streets and cul-de-sacs, connected to the broader city by only one entrance. That these neighborhoods discouraged walking was seen by contemporary planners as a feature, not a bug.

This novel approach to planning was premised on a particular vision of the ideal city that still holds sway today.
Taken on its own terms, the appeal is easy to see: The archetypal American would live on a quiet street in a single-family house surrounded by a lawn. He would work in a central business district or perhaps a new industrial park on the edge of town, and spend his earnings in a dedicated shopping district. Traveling from home to work to errands would be fast, solitary, and—most important—in a car.

One hundred years later, this grand experiment has resulted in cities that are unaffordable, stagnant, segregated, and sprawling. Walk into any planning office today, and you’ll be hard-pressed to find the street and park plans of yore. Instead, you will find reams of zoning rules listing permitted and prohibited uses, maximum building heights, minimum yard depths, required lot dimensions, limits on unit numbers, and required parking.

Unlike their predecessors who spent their time sketching grids, modern American city planners now dedicate themselves to either enforcing the fading dream of zoning or working around it. Indeed, the principal function of the planning office in most major cities is to help developers navigate incoherent rules adopted decades ago, in pursuit of an urban-design vision that few still believe in—reviewing stacks of paperwork and organizing endless hearings just to get something built.

I know, because it used to be my job. After a childhood of navigating the sprawl, I decided to go to planning school and do something about it. I knew that working as a planner in New York wasn’t going to be a game of SimCity. But after a few years of managing rezoning applications—generating mountains of paperwork, mostly to legalize existing buildings and businesses—I wasn’t sure whether I had done even a day of planning. Like so many idealistic young city planners before me, I walked away.

Nearly a decade ago, the Cornell historian Thomas J. Campanella described the malaise haunting American city planning.
A century after it had become a profession, what did we have to show for ourselves? Streets careen toward dead ends, home prices keep spiraling, public spaces metastasize into monster regional parks, and schools sequester themselves on prisonlike campuses—all inaccessible to anyone without a car.

The quest to govern cities as idealized suburbs has had another unintended consequence: Nearly all U.S. housing growth has been driven to the unplanned periphery, where Americans are especially vulnerable to worsening environmental risks. In California, restrictive rules in big blue cities have forced hundreds of thousands of families into the path of wildfires. In red states such as Florida and Texas, development in floodplains—nudged along by generous federal subsidies—continues largely unabated.

The ironic result: Many American cities are both overplanned and under-planned.

The breakdown of the American street grid was quantified in a 2020 study by Geoff Boeing, a University of Southern California planning professor. Boeing used geospatial methods to show how robust street grids—compact, dense, interconnected—devolved into a mess of cul-de-sacs over the course of the 20th century, a transformation that hooked Americans on cars and increased greenhouse-gas emissions.

But he also found reason for hope: From a nadir in the 1990s, grids seem to be making a slow recovery. Thanks in part to advocacy by groups such as the new urbanists, a rising generation of planners is ushering in a partial return to traditional neighborhood design and commonsense city planning.

Across Texas, this alternative vision has started to take root. In 2018, the rapidly growing Austin suburb of Bastrop adopted a city plan that dispensed with rigid zoning and established a street grid that will accommodate future development, carefully directing it away from flood-prone areas.
One year earlier, and more than 200 miles south, Laredo adopted a similar plan, calling for growth to follow a pattern of grids and plazas.

Let’s hope they stick to their plans. If we revive the street grid and start to undo decades of arbitrary zoning rules, American city planning might finally get back on track. If nothing else, the next generation of suburban teens might find a new friend.
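The connectivity measures behind studies like Boeing’s can be illustrated with a toy example. The sketch below is not Boeing’s actual methodology or code; it is a minimal, self-contained illustration of the kind of statistics such work computes from a street network, treating intersections as nodes and street segments as edges: the share of dead ends (degree-1 nodes), the share of four-way intersections, and the mean node degree. A traditional grid scores high on connectivity; a cul-de-sac layout scores low. All names here are invented for the example.

```python
def street_metrics(edges):
    """Compute simple connectivity stats from an undirected edge list."""
    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    nodes = len(degree)
    dead_ends = sum(1 for d in degree.values() if d == 1)   # cul-de-sac heads
    four_way = sum(1 for d in degree.values() if d >= 4)    # grid-style crossings
    return {
        "nodes": nodes,
        "dead_end_share": dead_ends / nodes,
        "four_way_share": four_way / nodes,
        "mean_degree": sum(degree.values()) / nodes,
    }

# A 4x4 orthogonal grid: every interior node is a four-way intersection.
grid_edges = []
for r in range(4):
    for c in range(4):
        if c < 3:
            grid_edges.append(((r, c), (r, c + 1)))
        if r < 3:
            grid_edges.append(((r, c), (r + 1, c)))

# A cul-de-sac subdivision: one arterial spine with branching dead-end courts.
culdesac_edges = [
    ("arterial0", "arterial1"), ("arterial1", "arterial2"),
    ("arterial1", "court_a"), ("court_a", "cds_a1"), ("court_a", "cds_a2"),
    ("arterial2", "court_b"), ("court_b", "cds_b1"), ("court_b", "cds_b2"),
]

print(street_metrics(grid_edges))      # no dead ends, 25% four-way intersections
print(street_metrics(culdesac_edges))  # majority dead ends, no four-way intersections
```

Run over real street networks (real studies extract them from OpenStreetMap data rather than hand-built edge lists), metrics like these are what make the grid-to-cul-de-sac transformation measurable decade by decade.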

Power grab: the hidden costs of Ireland’s datacentre boom

Datacentres are part of Ireland’s vision of itself as a tech hub. There are now more than 80, using vast amounts of electricity. Have we entrusted our memories to a system that might destroy them?

In the doldrum days between Christmas and New Year, we take a family trip to see a datacentre. Over the past two decades, datacentres have become a common sight on the outskirts of Dublin and many other Irish cities and towns. Situated in industrial business parks, they are easy to miss. But these buildings are critical to the maintenance of contemporary life: inside their walls stand rows and rows of networked servers; inside the servers, terabytes of data flow.

It’s a seven-minute drive from where we live now in Artane, Dublin, to the Clonshaugh datacentre, situated in a business park behind Northside shopping centre. Although we live close by, we haven’t driven this way before, and our route takes us through a number of the local authority estates that my husband lived in as a boy. These estates are set on either side of a long, straight road pocked with chicanes to deter joyriders. Even though the housing development sprawls for miles on either side – with large wind-blasted green spaces in between – the houses huddle, squashed together. It looks as if someone has transplanted a warren of inner-city Victorian terraces to this desolate terrain.

My eldest daughter, who is six, sits in her car seat behind us and draws her impression of what a datacentre might look like. She shows it to me. It’s a large square, subdivided into many smaller squares. In the middle of each of the smaller squares swims a small tadpole-like dot. The effect is unsettling. “No windows?” I ask.

She considers this for a moment. “Mummy, this is the back of the building. The back bits don’t have windows.”

When Google Maps tells us we have arrived at our destination, we swing off the main road and into a newer cul-de-sac and park the car. To our right, small houses, their Christmas decor forlorn in the brownish-grey light of an Irish winter’s afternoon.
To our left, the industrial park’s security-spiked fence, lining Clonshaugh Road as far as the eye can see.

In 2023, the consulting company Bitpower put the number of datacentres in Ireland at 82. Ireland’s Central Statistics Office reported in 2021 that these centres were using up to 18% of the country’s metered electricity, the same amount as every urban household in Ireland combined. The datacentre we’re visiting, situated in the midst of some of Dublin’s most impoverished council estates, was only the third to be built in Ireland. At 11,500 sq metres (about 124,000 sq ft), the Clonshaugh datacentre is small compared with the one Facebook opened in 2018 in Clonee, County Meath, which is about 150,000 sq metres (about 1.6m sq ft). A 2008 Irish Times article on the building of the Clonshaugh datacentre is optimistic in tone, quoting Cathal Magee, Eircom’s managing director of retail: “Customers get the ideal environment for their critical systems, as well as access to high-value technical specialists who are skilled at managing the hardware and software that businesses require.”

The Clonshaugh datacentre was developed by the US-based company Digital Realty Trust and is operated by Eir – the company that evolved from Ireland’s state-run Department of Posts and Telegraphs to first become Telecom Éireann, then the privately owned Eircom, via a disastrous flotation and shares scandal in the late 1990s. In January 2008, when Eir invested €100m in the Clonshaugh datacentre, Ireland was only months away from becoming the first country in the eurozone to enter a recession.

Yet the datacentres survived the downturn, heralds of a new economy that promised to one day move the nation away from the banking and housing bubble that had left it bankrupt.
Datacentres were one part of a longstanding vision of Ireland as a tech hub, a place where multinationals such as Google, Facebook and Amazon would base their European headquarters, attracted by the country’s well-educated workforce and – most importantly – the low corporate tax rate, which was 12.5% until 2023. (The average corporate tax rate globally is 23%; on 1 January 2024, Ireland increased its tax rate for large businesses to 15%, in line with guidance from the Organisation for Economic Cooperation and Development.)

Since the 1960s, Ireland’s Industrial Development Agency has had a policy of aiming to attract international investment through low corporate tax rates, starting with an initial rate of 0%. Ireland has long been home to tech companies: IBM and Ericsson offices opened in the 1950s, and factories owned by Dell, Intel, HP and Microsoft followed in the 1970s and 1980s. The focus of these operations was hardware. The pivot to software development coincided with the boom years of the early 2000s, when Ireland became known as the “Celtic Tiger”. Google’s European headquarters opened in Dublin in 2004, and since then, the country has become home to 16 of the 20 largest global tech companies.

[Photograph: Google’s European headquarters in Dublin in 2006. John Cogill/AP]

In the nine years between the 1999 Eircom shares scandal and the 2008 Irish banking scandal, which exposed the country’s citizens to massive debt, Ireland enjoyed a period of rapid economic growth. Even as it struggled to exit recession in the 2010s, Ireland’s continued policy of low corporate taxation encouraged the growth of big tech in the country.
The result is that Ireland’s economy is heavily dependent on tech companies, with low corporate taxes meaning that these companies contribute little to the Irish exchequer – and, by extension, to the Irish citizen left heavily indebted by the recession.

At Clonshaugh, we cross the winding road that skirts the industrial estate and follow a man walking a dog. He cuts through a door in the fence. It’s got a magnetic lock, but it’s resting open, and there’s a sign warning against leaving dog mess. To one side, a scrubby rise of overgrown grass and browning dock leaves, and on the other, the slick grey face of the datacentre. We stand and consider it. A low industrial hum fills the air: the sound of heavy machinery being operated some distance away. But the datacentre itself is silent.

My daughter begins to sketch what she sees, and as she does so, I move off and wander alongside the fence for a bit. Apart from a few cars parked in the car park, there is nothing to look at; it feels like the building itself is looking away. As with my daughter’s initial sketch, it’s difficult to identify the building’s front, although some panels of dark glass and a central door give a subtle hint toward ingress. The windowless grey facade is broken up by a number of grids, which look like part of the building’s cooling system. Ireland’s climate has been a major factor in attracting datacentres to the country – servers need to be kept cool, and Ireland’s temperate climate makes this easier. A 2023 Irish Times article notes that Iceland, too, is now trying to attract datacentre investment – the new 1,000-mile Iris cable, which runs along the seabed between Ireland and Iceland to create a direct cable link between the two countries, could make this plan more viable.
Paired with Iceland’s cold climate, low population density and commitment to sustainability – all but 15% of Iceland’s energy consumption is sourced from renewables – this means that Ireland could offload some of its data processing to Iceland to help offset the catastrophic impact of datacentres on Ireland’s energy consumption. According to Ireland’s Environmental Protection Agency, Ireland will miss its 2030 carbon reduction targets by “a significant margin”.

As we walk around the datacentre’s fence, I notice a multitude of cameras around the building. A security guard in hi-vis clothing appears, talking into a walkie-talkie, perhaps wondering why this small family is hanging around the fence. I use my phone to take a picture of a planning permission sign.

There is very little of tangible value that could be taken from this building (although there have been incidents of thieves breaking into data facilities in the US and stealing computer equipment), but the data on the servers is precious, and any disruption to the building’s power supply could cost the companies that pay for storage here millions. As the number of datacentres in Ireland grows – planning has been approved for another 40 – so, too, will their energy footprint. The prospect of rolling blackouts has become more and more likely.

[Photograph: A Google datacentre in Grangecastle, outside Dublin. Patrick Bolger/The Guardian]

My daughter shows me a new drawing of the datacentre. Instead of a subdivided square, the building is now a subdivided rectangle. “Do you think it looks less scary now?” I ask.

“Yeah. But I still didn’t do any windows.”

If there were a window to peer through, what would I see? The internet shows me images of floors housing large servers, multiples of the kind we might be familiar with from our workplaces. About 30 people usually work in a datacentre, including security guards, cleaners and technicians, but this is a global estimate; our small Clonshaugh centre likely has far fewer.
There are four or five cars parked outside the day we visit, but most people are probably still on holiday for the Christmas break.

Up to 88% of what is stored in the cloud is considered junk data that will not be accessed again by users. But the value of data lies in its scale: apps, websites and cookies track our day-to-day activities, and businesses can put this information to lucrative use in order to sell us things. Most of us consider our data safe when we save it to the cloud. As a writer, any time I complain about losing work or accidentally deleting a file, I’m asked the same question: “Didn’t you back it up in the cloud?” My email account recently threatened to stop working unless I bought more storage for the hundreds of photos and videos I’ve saved of my kids. After an afternoon spent deleting, I succumbed, and now my personal history is safely tucked away in the cloud for future use – isn’t it?

We drive back from Clonshaugh through Priorswood and Darndale, estates built during the 1980s, a time when Ireland suffered successive recessions, mass emigration and a heroin plague. The estates seem to have changed little since those days, even though the country as a whole has seen massive economic and social shifts, and I start thinking about the fragility of social and national memory. I wonder if datacentres such as the one in Clonshaugh will contribute to the kind of record-keeping Ireland has not always excelled at as a nation. Ireland is a country with a long memory, but a patchy one. We’ve just completed our celebrations of the Decade of Centenaries – a 10-year-long project to explore and reflect upon the decade in which the independent Irish state came into being.

The commemorations started with the 1913 lockout, a general strike that strengthened the labour movement that would ultimately support the 1916 Easter Rising.
In the early 1920s, further violence erupted with the war of independence and then the civil war; the latter was a bitter internecine conflict that shifted the politics of Ireland away from the revolutionary ideals of the Easter Rising and toward the conservative Christian values that defined its 20th century. One of the conflict’s pivotal moments occurred on 30 June 1922, when the Public Record Office, a repository of more than 700 years of local records, was burned during a battle between the Irish Republican Army, which had rejected the terms of the 1921 Anglo-Irish treaty that brought the Irish Free State into being, and the Free State government. “It was an act of cultural vandalism,” said Catriona Crowe, the former head of special projects at the National Archives of Ireland. “For a long time people didn’t realise what we had lost.”

[Photograph: The Four Courts, the headquarters of the anti-treaty Republicans, shelled by Free State forces during the Irish civil war. Central Press/Getty Images]

During the civil war, the Public Record Office – adjacent to the Four Courts building, where the anti-treaty IRA had stationed itself – was used as an armoury. When the pro-treaty side enlisted the help of British artillery – which it eventually did, four months after the initial occupation of the Four Courts – the Public Record Office suffered the worst damage. I think of the cameras hanging off the corners of the Clonshaugh datacentre, their domed glass lenses allowing them a 360-degree view of those who might come to threaten the flow of information.

The burning of public records “was a massive own goal”, Crowe told me. “It wiped out the history of the occupants of this island, most of whom kept no records.” She mentioned Slievemore, a settlement of 80 to 100 abandoned cottages on the slopes of Slievemore mountain on Achill Island, off the coast of County Mayo: “It was thriving before the famine.
If we had the 1841 census, we would know the names, religions, occupations of the people.” The area was settled for more than 5,000 years. With the birth of the new state, we lost all record of that famine generation.

I asked Crowe what she thinks of digitisation in general, of the replacement of tangible records with a digital copy of each of our earthly transactions, stored among various servers. “Well, the first thing to say is that the most secure form of knowledge preservation is stone, and the oldest,” Crowe said. “After that, parchment – it survives all kinds of difficulties and remains robust. Then we had rag-based paper, from the 15th century onward, and after that acid paper cut from forests in the early 19th century. This deteriorates very quickly and needs to be kept stable. But by far the most unstable form is digital. We have a black hole opening in history. When Irish government departments started using computers in the 1970s, there was no network, and many of those files can’t be read any longer. There’s no real policy for digital preservation of state records. A nightmare is facing us. Emails, Excel, Word, PowerPoint – they’ll all vanish – unless there’s a decision made by the government.”

I have a little experience with the instability of digital files myself. In a past life, I worked on a project to digitise the extensive archive of Dublin’s Abbey Theatre, the English-speaking world’s first state-subsidised theatre. I was surprised by the projected speed of deterioration of the images we were creating for our database; and even Tiff files, which don’t degrade, may become inaccessible if the software used to read them becomes obsolete. And so all that material floating around in the cloud – which is in reality being bounced from server to server, degrading each time this happens – is not really being preserved in the way we might imagine.
Its continued existence is dependent on a steady flow of electricity, the continued provision of which is contingent upon governments reaching renewables targets they can’t agree upon. And even then, these files will degrade, deteriorate and become obsolete. Rather than creating something permanent and inviolable, we’ve made our memories more contingent than ever upon a fantasy of technological stability that, given the constant churn of history, seems inevitably fleeting.

As world temperatures increase, datacentres have migrated to places where the climate is temperate. There, they consume vast amounts of energy, increasing carbon emissions. It’s a frightening and seemingly unsustainable pattern; we’ve entrusted our memories to a system that might destroy them, and us. Because of this fraught reality, datacentres in Ireland have become controversial in the past five years, and the tone of the newspaper articles discussing them has changed. It’s suggested that if all the datacentres currently proposed in Ireland are built, they could be using up to 70% of the country’s electricity by 2030.

[Photograph: Facebook’s datacentre in Luleå, Sweden. David Levene/The Guardian]

Reliance on fossil fuels and on-site generation has remained a concern for environmentalists in the intervening years, with close scrutiny being paid to developers’ commitments to contributing to Ireland’s renewable power grid. The journalist Aoife Barry, in her research for her recent book Social Capital, has identified the ways in which multinationals are greenwashing their contribution toward renewables, including the case of a high court review of a planned Apple datacentre in County Galway in 2018:

“The board sought more information on the plans, saying there was a lack of clarity around ‘direct sustainable energy sources’, including how Apple would live up to its promise of running on 100% renewable energy.
When Apple submitted a revised environmental impact assessment to the council, it indicated that it wouldn’t be generating renewable energy itself. Instead, it would buy renewable generated power from an energy supplier ‘equal to the total power consumption of the datacentre building in any particular year’.”

This equation works out only if energy demands don’t continue to rise in the coming years, meaning any renewable investment is going to continue to lag behind the needs of the expanding datacentre sector.

In an effort to entice further foreign direct investment, the government has implemented measures aimed at streamlining the planning process for datacentres, which would allow concerned citizens less visibility into the estimated environmental impacts of these centres. Decreasing transparency in this instance seems unnerving, and symptomatic of the Irish state’s strange relationship with multinationals. In 2016, the Irish government rejected the European Commission’s ruling that Apple should pay Ireland €13bn in unpaid taxes, arguing that the lower corporate taxation rate the country offered at the time made Ireland more attractive for investors. The logic behind that decision might have been confusing to everyday Irish citizens, given that at that time they were each saddled with €42,000 in debt accrued after the International Monetary Fund’s bailout of Ireland’s banks.

Datacentres have contributed €7.3bn to the Irish economy, but provide only about 16,000 jobs to a country of 5.28 million people. The lack of employment these centres provide leads to questions over who benefits from their existence.
In August 2022, after two consecutive amber alerts for electricity outages in Ireland, then-finance minister Paschal Donohoe appeared on national broadcaster RTÉ’s radio show Morning Ireland, where he was quizzed about the low employment figures at datacentres and the fact that their profit margins were soaring while electricity bills had reached new highs for Irish citizens. He dismissed the lack of employment, emphasising instead the “huge importance of them to really large employers within our country, whose taxes and jobs are playing an invaluable part in our economic performance at the moment”. The benefits of the datacentre economy are diffuse, intangible.

In 2022, due to concerns about pressure on the national grid and the potential for rolling blackouts, EirGrid, Ireland’s grid operator, placed a moratorium on the development of new datacentres in Dublin until 2028. But applications for centres outside the capital are still being granted. Other European countries, such as the Netherlands, are halting their development of datacentres. Singapore imposed a three-year moratorium from 2019 to 2022, and is now seeking applications within new parameters to ensure sustainability. Unless Ireland finds a way to accelerate its slow development of renewables, these datacentres seem impossible to sustain. One potential solution is to look more carefully at what data we retain, and why. We must weigh the short-term financial benefits of seemingly infinite data retention against the long-term threat of climate crisis.

Ireland is no exception to the rule that what we remember and what we forget are always contingent upon the power structures and hierarchies that shape our contemporary moment. At the birth of the state, we burned our history in an act of carelessness; we also freed ourselves to create a new national history.
We entrusted the church with our moral guidance and guardianship, and then allowed it to commit unspeakable cruelties against our citizens, including the abuses recounted in the Report of the Commission to Inquire Into Child Abuse (2009) and the Commission of Investigation Into Mother and Baby Homes (2021). Toward the end of the century, and in the wake of joining the EU, we moved away from our old bad memories and toward a prosperous new era, placing our faith in international investment, almost at any cost. But in a small country like Ireland, the old names – whether they be companies or state organisations or political dynasties – crop up again and again. Sometimes our faulty memories flash up a warning. But often that history is stored in the cloud: intangible, vulnerable to exploitation – and degrading over time.

This article first appeared in issue 13 of The Dial

Private Equity Kept Aging Pennsylvania Coal Plant Open, Then Closed It With No Plan for Workers and Community

“Polluting behemoth” Homer City Generating Station was the state's largest coal-fired power plant.

When the owners of the Homer City Generating Station announced that it would finally be shuttering by July of 2023 due to competition from cheap natural gas and the costs of adhering to environmental regulations, it signaled a long-anticipated shift in Pennsylvania’s energy mix. The coal-fired power plant was the last in the state — and the largest — to make decommissioning plans, so the announcement gave some in the fossil-fuel-heavy Keystone State a sense of relief, both in terms of environmental impact and health effects. “This will prevent premature deaths, illness & slash CO2 emissions,” former Pennsylvania Department of Environmental Protection Secretary John Hanger tweeted.

But many across the political spectrum feared what would happen next to the small community of Homer City, in southwestern Pennsylvania, home to just over 1,500 people. After years of instability, the plant’s closure could have been better prepared for; instead, its owners gave just 90 days’ notice as more than 120 workers prepared to pack their bags and face an uncertain future. “Instead of recognizing the market shift and preparing for the transition to clean, renewable generation,” Leigh Martinez, communications director at nonprofit advocacy group PennFuture, wrote in a statement, “fossil fuel-friendly legislators have spent years in denial.” Homer City residents and former plant employees say the community has been left in dire straits as a result.

That denial campaign was waged in part by the plant’s private equity owners, who slashed headcount and cut back on maintenance to keep the plant alive and squeeze whatever profit they could out of the aging facility.
After the plant was acquired by private equity in 2017, Homer City Generation President and CEO William Wexler cut right to the chase about his intentions: “I’ve spoken to all the employees, and I explained who we are and what we’re here to do… to get this plant to be a far more profitable member of the community and to sell it.”

“Changes in ownership, operators, and gaps in maintenance have led to operational issues and instability” at the plant, energy firm NRG, which once managed the plant, wrote in a one-pager, which has since been taken down.

For some industry observers and analysts, the Homer City station stands as a cautionary tale regarding the growing role of private equity, typically backed by large institutional investors, like pension funds, in the financing of coal production and the operation of coal-fired power plants, marked by a desire for quick profits. While banks have increasingly pulled back from financing fossil fuels, notorious for their damage to the environment, private equity has helped keep the coal industry alive. In the case of Homer City, the plant’s private equity owners kept a failing company operating for another six years — as it further polluted the environment and extended the power sector’s reliance on coal.

A Polluting Behemoth

Homer City — a 2,000 megawatt power plant an hour east of Pittsburgh in Indiana County, Pennsylvania — was once known as one of the biggest polluters in the state. (“A polluting behemoth,” Hanger called the site in his tweet.)

In 2011, the U.S. Environmental Protection Agency sued Homer City’s owners for operating new equipment without permits; the following year, the plant came under fire for exceeding federal limits on sulfur dioxide, a pollutant that wreaks havoc on the respiratory system. A few years prior, the state had fined the plant for improperly discharging selenium, toxic to aquatic life, and wastewater into nearby creeks.
But its reputation as a “terrible neighbor” was eventually eclipsed by its apparent financial instability. Over the 2010s, Homer City endured two bankruptcies and two subsequent changes of ownership, at least one default on a debt payment, a lawsuit against its coal supplier and staff cutbacks — the last of which it eventually came to blame on Pennsylvania’s looming entry into the Regional Greenhouse Gas Initiative (RGGI), a regional cap-and-trade program the state has yet to join. Yet, amid the tumult, its smokestacks — known as the tallest in America, towering over farmland and the modest town of Homer City — continued to run.

By April 2023, the plant was managed by a lean team; the remaining 129 employees, down from 240 in 2017, were to be laid off in the wake of the closure. The plant hadn’t run at full capacity for years — in the six years before it closed, it operated at around 20%, on the hook to supply power to PJM Interconnection, the regional grid operator delivering power from Delaware to Michigan, on an as-needed basis.

The parties responsible for managing this contract were a group of private equity firms that included Knighthead Management, which “specializes in event driven, distressed credit and special situation opportunities,” and, at one point, the Carlyle Group, a global behemoth that was given a failing grade by the Private Equity Stakeholder Project in 2022 for the millions of tons of carbon dioxide its assets emit, despite the firm’s net zero by 2050 commitment. Homer City was acquired by private equity in 2017 after its previous owner filed for bankruptcy, aiming to erase $600 million in debt from its balance sheet. Before that, it was owned by General Electric, and by the utility Edison International. The new owners formed a limited liability company (LLC), Homer City Generation LLC, shielding themselves from responsibility for the company’s debts.
Very little is known about what happened within the plant after this, because private companies face far fewer disclosure requirements than public ones and operate largely out of view of the public. Sometimes, they buy assets from publicly traded companies responding to increased environmental scrutiny from shareholders. It’s widely understood that these firms aim to drive profitability in the short term, typically cutting overhead along the way.

“The big concern with private equity swooping in,” says Nichole Heil, research and campaign coordinator at the Private Equity Stakeholder Project, is that “they come into a coal plant … and they are looking to squeeze as much profit out of it as they can.”

“They’re looking to get in and get out,” she said.

Money First

Private equity firms have come under fire for operating with a profit-first strategy in a range of industries — slashing nursing home staff, eliminating services at prisons and raising patient costs at healthcare facilities. In the power sector, that business model can look like forgoing energy efficiency upgrades in pursuit of short-term profits before shutting an asset down. In Homer City’s case, it meant keeping alive a plant that would have been well suited for a staged decommissioning years ago — only to close it and leave the community picking up the pieces.

“PE firms push risks onto the communities,” Dennis Wamsted, energy analyst at the Institute for Energy Economics and Financial Analysis (IEEFA), writes in an August 2023 report on private equity in PJM. “When their plants are no longer economic, PE generators can simply decide to close up shop and get out, leaving unprepared localities facing significant economic dislocations from job and tax losses.”

“This exact scenario played out … at the Homer City power plant,” he continued.

Though long-anticipated, the closure felt disconcertingly abrupt, says a former employee who spoke with Capital & Main on the condition of anonymity.
The worker says they never heard from the union representing Homer City staff, IBEW Local 459, nor from politicians who had publicly advocated for the plant. The energy firm contracted to manage the plant, NRG Energy, offered job relocation, but many of those roles were out of state, and thus impractical for those with families, or who didn’t want to move, says the former employee. Workers were also referred to a state-run career portal, but many of those jobs didn’t pay close to what they’d earned at Homer City, they said.

“Everybody just kind of walked away from us,” said the worker, for whom the reality of life outside the plant felt particularly harsh after years at a steady job that routinely required 18-hour shifts and holiday work.

“We always heard from all our politicians that there will be jobs to transition into,” the worker continued. “No politicians came and talked to us.”

Multiple sources Capital & Main spoke with emphasized the economic ripple effects of Homer City’s closure on the broader community. Rob Nymick, Homer City borough manager, doesn’t mince words: “We’re fighting for our survival right now.”

Nymick says the plant’s closure, and the shrinking coal economy in the years leading up to it, have hit the town’s tax base hard. That’s affected the local school district above all else. A 60-year resident of this part of the state, Nymick remembers when Homer City’s economy was built not just on the power plant, but on the coal mines that fueled it. He watched those mines shutter, sat through several waves of layoffs at the plant and says its closure was only a matter of time. “We knew this day was coming,” he said. “Maybe we weren’t as prepared for it, because it always seemed to get bailed out.”

Wamsted urges other towns to look to Homer City as a cautionary tale. “That community is not likely going to be the last unless local leaders begin planning now,” he warned in his report.
Private capital — a mix of private companies and private equity firms — owns some 60% of fossil-fuel generation in PJM, according to the August report, and is responsible for more than 50% of the region’s annual CO2 emissions. Seven years ago, the biggest players in PJM were publicly traded energy companies, per the Institute for Energy Economics and Financial Analysis’ December report. Today, the biggest players in the regional market are private equity firms. “Difficult-to-track private equity (PE) investment has reshaped the PJM power market in the past decade,” the report reads.

Experts see private equity ownership slowly spreading through the economy, gaining a steady grip on sectors like hospitals, mobile home communities and prisons. Some notoriously bad actors in the energy sector are owned at least in part by private equity firms: Diversified, an oil and gas operator known for buying more “low-decline, low-cost” aging oil wells than any other company in the U.S., inked a $1 billion partnership with private equity firm Oaktree in 2021 to expand its operations into Southern states. Private equity giant KKR has a majority stake in the Coastal GasLink Pipeline, which has faced years of resistance for cutting through Wet’suwet’en lands in northern British Columbia without consent from local hereditary chiefs. The list goes on.

Unlike investor-owned utilities, which must file routinely with regulators, private equity firms can operate in relative secrecy — about their ownership, financials, expenses, risks and the like. Soon, publicly traded companies will be required to disclose their emissions and climate-related risks, per a pending federal rule. Private companies will be left out of this standard.

“When a utility owns a power plant, they would have to file their expenditures on maintenance with the Public Service Commission,” Wamsted says. “When a private equity company does that, they don’t have to file anything.
“So you have no idea, really, what they’re repairing, when they’re repairing and how much their power costs,” he said.

These companies may end up selling their assets when they’re no longer profitable — and operate them with the aim of increasing their profitability above all else. After the plant was acquired by Knighthead in 2017, Homer City CEO William Wexler told the Pittsburgh Post-Gazette that his aim was to make the plant lucrative ahead of a future sale. First steps toward that aim included cutting fuel and maintenance costs, he told the Post-Gazette. They also included layoffs — “a pretty standard PE trick,” Wamsted told Capital & Main. “Cut expenses to the bone.”

Capital & Main reached out to Wexler and the union representing Homer City workers, IBEW Local 459, neither of which responded by publication time. IBEW has posted a handful of updates to its website in recent months addressed to “those affected by the closure of Homer City Generating Station.” As of February, those updates are password-protected. “Local Union 459 has received the unfortunate news that Homer City Generating Station will be decommissioned July 1, 2023,” the union posted several days after the site’s decommissioning announcement. “The Local has reached out to the Company for further information regarding this process. At this time the Company has not provided any further details.”

A Harbinger

Private equity firms find a grab bag of aging, low-cost assets in what’s called a deregulated power market, wherein power generation and transmission are handled by different entities, and grid operators manage auctions to oversee the sale of energy between them. In markets run by vertically integrated utilities, by contrast, as in the Southeast and Northwest, monopoly investor-owned utilities are responsible for the production, transmission and distribution of energy — that is, how it’s made, how it reaches the grid and how it reaches the customer.
PJM, specifically, offers aging assets profits via the capacity market, designed to ensure that the grid stays reliable under strain, such as during cold snaps and heat waves. While the real-time and day-ahead markets offer generators a place to bid their energy for use in the short term, the capacity market contracts with generators three years in advance to be available at times of peak demand on the grid. Operators agree to come online when they are needed; if they don’t, they’re subject to a steep fine.

Capacity markets play a “vital role” in “propping up older, less competitive generators,” Wamsted and co-authors write in a 2022 IEEFA report. They offer stable payments to power plants without asking for much in return, and are attractive for energy assets that aren’t competitive on the real-time and day-ahead markets — those not equipped to run efficiently enough, such as older power plants, or plants being run on a shoestring budget slashed by a private equity firm. Plants like Homer City.

Not all grid operators use capacity markets — and whether they’re the right tool to manage energy demand through a green transition remains the subject of rather esoteric theoretical debate. Sylwia Bialek-Gregory, head of the unit for microeconomics at the German Council of Economic Experts and a former economist at New York University’s Institute for Policy Integrity, blames poor regulations for the proliferation of high-polluting, low-capacity plants. She sees private equity firms as “bad guys” — but bad guys who are just doing their jobs within a weak regulatory landscape. Were carbon pricing or adequate emissions caps in place, aging power plants wouldn’t be nearly as profitable as they are. “If you were to punish emitting [plants] for the damage they create to society, they wouldn’t be attractive for the private equity investors,” she said. “They would just not make a profit.”

Ironically, Homer City’s owners blamed regulations for the plant’s death.
But many argue it was coal’s dwindling value compared to renewables that rendered Homer City uneconomical. A solar facility is sitting in PJM’s interconnection queue, ready to take up the plant’s valuable connection to the grid.

Nymick, the Homer City borough manager, says he’s open to any energy form that could offer his community an economic lifeline. Until that happens, he’s putting his faith in a new sector: tourism. He’s applied for a handful of state and federal grants to clean up acid mine drainage in two local creeks that coal mining polluted long ago. He’s not sure whether this plan will yield the jobs coal once did, but Nymick nonetheless wants to see Homer City “become more of a destination.”

“I think it’s very important to the growth of our community, looking beyond coal, and the power plant,” he said. “Our elected officials have relied on coal as the main driver for their reelection. And this is a whole different approach.”

With all of Pennsylvania’s coal-fired power plants in the process of decommissioning, Wamsted fears a similar future for a new wave of natural gas plants being kept alive on PJM’s capacity market, brought online by the high capacity prices of the 2010s, which have since halved. In June 2022, Wamsted called these capacity payments a “losing bet” for gas plants. In December of the following year, he called Homer City a “harbinger” of future plant closures. (Two other coal-fired plants, both less than 20 miles away in Indiana County, are slated to be decommissioned by 2028. The county has four years to prepare.)

Some 80% of new fracked gas energy capacity in PJM is owned by private equity or some other private ownership model, according to Wamsted’s August 2023 IEEFA report. Some 45,000 megawatts of generation capacity were built between 2011 and 2022, predicated in part on steady, high prices in the capacity market that ensured financiers could cover their debt.
Those capacity prices have since fallen — new projects are being canceled, existing ones are seeing their credit ratings downgraded, and ratings agencies are expressing concern about power generators’ profitability. PJM is facing a yearslong queue backlog, and the new projects being permitted are likely to be renewable, Wamsted’s report argues. Gas plants that have come online in recent years could soon face a fate similar to that of coal plants like Homer City.

“We’ve been arguing it should have closed five years ago, because it wasn’t economic,” Wamsted said. “Unfortunately, we were right.

“But nobody planned for it to be closed,” he continued. “And so, now, people are going to be out of work. If this planning process had started five years ago, there might have been a possibility for a transition.”

Copyright 2024 Capital & Main
