
Predicting Chaos With AI: The New Frontier in Autonomous Control

Saturday, May 18, 2024




Advanced machine learning algorithms have shown potential in efficiently controlling complex systems, promising significant improvements in autonomous technology and digital infrastructure.

Recent research highlights the development of advanced machine learning algorithms capable of controlling complex systems efficiently. These new algorithms, tested on digital twins of chaotic electronic circuits, not only predict and control these systems effectively but also offer significant improvements in power consumption and computational demands.

Systems controlled by next-generation computing algorithms could give rise to better and more efficient machine learning products, a new study suggests.

Using machine learning tools to create a digital twin, or virtual copy, of an electronic circuit that exhibits chaotic behavior, researchers found they could predict how the circuit would behave and use that information to control it.

The Limitations of Linear Controllers

Many everyday devices, like thermostats and cruise control, utilize linear controllers – which use simple rules to direct a system to a desired value. Thermostats, for example, employ such rules to determine how much to heat or cool a space based on the difference between the current and desired temperatures.
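As an illustration of how simple such a rule can be, here is a minimal, hypothetical Python sketch of a proportional (linear) controller of the thermostat kind described above; the gain and temperature values are invented for the example and are not drawn from the study.

```python
# Minimal sketch of a linear (proportional) controller, as in a thermostat.
# The gain and temperatures below are illustrative values, not from the study.

def proportional_control(current_temp: float, target_temp: float, gain: float = 0.8) -> float:
    """Return a heating/cooling command proportional to the temperature error."""
    error = target_temp - current_temp   # positive -> heat, negative -> cool
    return gain * error                  # linear rule: output scales with the error

# Example: room at 18 C, target 21 C -> a modest heating command
print(proportional_control(18.0, 21.0))  # approx. 2.4 (arbitrary command units)
```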

Yet because of how straightforward these algorithms are, they struggle to control systems that display complex behavior, like chaos.

As a result, advanced devices like self-driving cars and aircraft often rely on machine learning-based controllers, which use intricate networks to learn the optimal control algorithm needed to operate best. However, these algorithms have significant drawbacks, the most significant of which is that they can be extremely challenging and computationally expensive to implement.

The Impact of Efficient Digital Twins

Now, having access to an efficient digital twin is likely to have a sweeping impact on how scientists develop future autonomous technologies, said Robert Kent, lead author of the study and a graduate student in physics at The Ohio State University.

“The problem with most machine learning-based controllers is that they use a lot of energy or power and they take a long time to evaluate,” said Kent. “Developing traditional controllers for them has also been difficult because chaotic systems are extremely sensitive to small changes.”

These issues, he said, are critical in situations where milliseconds can make a difference between life and death, such as when self-driving vehicles must decide to brake to prevent an accident.

The study was published recently in Nature Communications.

Advancements in Machine Learning Architecture

Compact enough to fit on an inexpensive computer chip that can balance on your fingertip, and able to run without an internet connection, the team’s digital twin was built to optimize a controller’s efficiency and performance, which the researchers found also reduced power consumption. It achieves this largely because it was trained using a machine learning approach called reservoir computing.

“The great thing about the machine learning architecture we used is that it’s very good at learning the behavior of systems that evolve in time,” Kent said. “It’s inspired by how connections spark in the human brain.”
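For readers curious what reservoir computing looks like in practice, the following is a minimal, hypothetical Python sketch of an echo state network (a common form of reservoir computer) trained to predict the next step of a time series. The network size, parameters, and toy signal are illustrative assumptions; the study itself used a digital twin of a chaotic electronic circuit, not this code.

```python
# Toy echo state network (reservoir computing) sketch: learn to predict the next
# step of a time series. Sizes, spectral radius, and the signal are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_res, leak, ridge = 200, 0.3, 1e-6

# Random, fixed input and reservoir weights (only the linear readout is trained).
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))     # scale spectral radius below 1

def run_reservoir(u):
    """Collect reservoir states for a 1-D input sequence u."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        pre = W_in @ [u_t] + W @ x
        x = (1 - leak) * x + leak * np.tanh(pre)   # leaky-integrator update
        states.append(x.copy())
    return np.array(states)

# Toy oscillatory signal; the paper used a chaotic electronic circuit instead.
t = np.linspace(0, 60, 3000)
u = np.sin(t) * np.sin(0.7 * t**1.1)

X = run_reservoir(u[:-1])                      # states driven by u_0 .. u_{T-2}
y = u[1:]                                      # targets: the next value of the signal
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)  # ridge readout

pred = X @ W_out
print("train RMSE:", float(np.sqrt(np.mean((pred - y) ** 2))))
```

Only the readout weights are fit, which is why training is cheap enough to run on a small chip; the random reservoir itself is never modified.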

Practical Applications and Future Directions

Although similarly sized computer chips have been used in devices like smart fridges, according to the study, this novel computing ability makes the new model especially well-equipped to handle dynamic systems such as self-driving vehicles as well as heart monitors, which must be able to quickly adapt to a patient’s heartbeat.

“Big machine learning models have to consume lots of power to crunch data and come out with the right parameters, whereas our model and training is so extremely simple that you could have systems learning on the fly,” he said.

To test this theory, researchers directed their model to complete complex control tasks and compared its results to those from previous control techniques. The study revealed that their approach achieved a higher accuracy at the tasks than its linear counterpart and is significantly less computationally complex than a previous machine learning-based controller.

“The increase in accuracy was pretty significant in some cases,” said Kent. Though the outcome showed that their algorithm does require more energy than a linear controller to operate, this tradeoff means that when it is powered up, the team’s model lasts longer and is considerably more efficient than current machine learning-based controllers on the market.

“People will find good use out of it just based on how efficient it is,” Kent said. “You can implement it on pretty much any platform and it’s very simple to understand.” The algorithm was recently made available to scientists.

Economic and Environmental Considerations

Outside of inspiring potential advances in engineering, there’s also an equally important economic and environmental incentive for creating more power-friendly algorithms, said Kent.

As society becomes more dependent on computers and AI for nearly all aspects of daily life, demand for data centers is soaring, leading many experts to worry over digital systems’ enormous power appetite and what future industries will need to do to keep up with it.

And because building these data centers as well as large-scale computing experiments can generate a large carbon footprint, scientists are looking for ways to curb carbon emissions from this technology.

To advance their results, future work will likely be steered toward training the model to explore other applications like quantum information processing, Kent said. In the meantime, he expects that these new elements will reach far into the scientific community.

“Not enough people know about these types of algorithms in the industry and engineering, and one of the big goals of this project is to get more people to learn about them,” said Kent. “This work is a great first step toward reaching that potential.”

Reference: “Controlling chaos using edge computing hardware” by Robert M. Kent, Wendson A. S. Barbosa and Daniel J. Gauthier, 8 May 2024, Nature Communications.
DOI: 10.1038/s41467-024-48133-3

This study was supported by the U.S. Air Force’s Office of Scientific Research. Other Ohio State co-authors include Wendson A.S. Barbosa and Daniel J. Gauthier.


The world’s carbon emissions continue to rise. But 35 countries show progress in cutting carbon

In 2025 the world has fallen short, again, of peaking and reducing its fossil fuel use. But there are many countries on a path to greener energy.

Global fossil fuel emissions are projected to rise in 2025 to a new all-time high, with all sources – coal, gas, and oil – contributing to the increase. At the same time, our new global snapshot of carbon dioxide emissions and carbon sinks shows at least 35 countries have a plan to decarbonise. Australia, Germany, New Zealand and many others have shown statistically significant declines in fossil carbon emissions during the past decade, while their economies have continued to grow. China’s emissions have also been growing at a much slower pace than recent trends and might even be flat by year’s end.

As world leaders and delegates meet in Brazil for the United Nations’ global climate summit, COP30, many countries that have submitted new emissions commitments to 2035 have shown increased ambition. But unless these efforts are scaled up substantially, current global temperature trends are projected to significantly exceed the Paris Agreement target that aims to keep warming well below 2°C.

These 35 countries are now emitting less carbon dioxide even as their economies grow. Global Carbon Project 2025, CC BY-NC-ND

Fossil fuel emissions up again in 2025

Together with colleagues from 102 research institutions worldwide, the Global Carbon Project today releases the Global Carbon Budget 2025. This is an annual stocktake of the sources and sinks of carbon dioxide worldwide. We also publish the major scientific advances enabling us to pinpoint the global human and natural sources and sinks of carbon dioxide with higher confidence. Carbon sinks are natural or artificial systems, such as forests, which absorb more carbon dioxide from the atmosphere than they release.

Global CO₂ emissions from the use of fossil fuels continue to increase. They are set to rise by 1.1% in 2025, on top of a similar rise in 2024. All fossil fuels are contributing to the rise. Emissions from natural gas grew 1.3%, followed by oil (up 1.0%) and coal (up 0.8%). Altogether, fossil fuels produced 38.1 billion tonnes of CO₂ in 2025.

Not all the news is bad. Our research finds emissions from the top emitter, China (32% of global CO₂ emissions), will increase significantly more slowly than its growth over the past decade, with a modest 0.4% increase. Emissions from India (8% of global) are projected to increase by 1.4%, also below recent trends.

However, emissions from the United States (13% of global) and the European Union (6% of global) are expected to grow above recent trends. For the US, a projected growth of 1.9% is driven by a colder start to the year, increased liquefied natural gas (LNG) exports, increased coal use, and higher demand for electricity. EU emissions are expected to grow 0.4%, linked to lower hydropower and wind output due to weather, which led to increased electricity generation from LNG. Uncertainties in currently available data also include the possibility of no growth or a small decline.

Fossil fuel emissions hit a new high in 2025, but the growth rate is slowing and there are encouraging signs from countries cutting emissions. Global Carbon Project 2025, CC BY-NC-ND

Drop in land use emissions

In positive news, net carbon emissions from changes to land use such as deforestation, degradation and reforestation have declined over the past decade. They are expected to produce 4.1 billion tonnes of carbon dioxide in 2025, down from the annual average of 5 billion tonnes over the past decade. Permanent deforestation remains the largest source of emissions.
This figure also takes into account the 2.2 billion tonnes of carbon soaked up by human-driven reforestation annually. Three countries – Brazil, Indonesia and the Democratic Republic of the Congo – contribute 57% of global net land-use change CO₂ emissions.

When we combine the net emissions from land-use change and fossil fuels, we find total global human-caused emissions will reach 42.2 billion tonnes of carbon dioxide in 2025. This total has grown 0.3% annually over the past decade, compared with 1.9% in the previous one (2005–14).

Carbon sinks largely stagnant

Natural carbon sinks in the ocean and terrestrial ecosystems remove about half of all human-caused carbon emissions. But our new data suggests these sinks are not growing as we would expect. The ocean carbon sink has been relatively stagnant since 2016, largely because of climate variability and impacts from ocean heatwaves. The land CO₂ sink has been relatively stagnant since 2000, with a significant decline in 2024 due to warmer El Niño conditions on top of record global warming. Preliminary estimates for 2025 show a recovery of this sink to pre-El Niño levels.

Since 1960, the negative effects of climate change on the natural carbon sinks, particularly the land sink, have suppressed a fraction of their full potential. This has left more CO₂ in the atmosphere, increasing its concentration by an additional 8 parts per million. This year, atmospheric CO₂ levels are expected to reach just above 425 ppm.

Tracking global progress

Despite the continued global rise of carbon emissions, there are clear signs of progress towards lower-carbon energy and land use in our data. There are now 35 countries that have reduced their fossil carbon emissions over the past decade while still growing their economies. Many more, including China, are shifting to cleaner energy production. This has led to a significant slowdown in emissions growth.

Existing policies supporting national emissions cuts under the Paris Agreement are projected to lead to global warming of 2.8°C above preindustrial levels by the end of this century. This is an improvement over the previous assessment of 3.1°C, although methodological changes also contributed to the lower warming projection. New emissions commitments to 2035, for those countries that have submitted them, show increased mitigation ambition. Still, this level of expected mitigation falls far short of what is needed to meet the Paris Agreement goal of keeping warming well below 2°C.

At current levels of emissions, we calculate that the remaining global carbon budget – the carbon dioxide that can still be emitted before reaching specific global temperatures (averaged over multiple years) – will be used up in four years for 1.5°C (170 gigatonnes remaining), 12 years for 1.7°C (525 Gt) and 25 years for 2°C (1,055 Gt); a short arithmetic check of these figures appears below.

Falling short

Our improved and updated global carbon budget shows the relentless global increase of fossil fuel CO₂ emissions. But it also shows detectable and measurable progress towards decarbonisation in many countries. The recovery of the natural CO₂ sinks is a positive finding, but large year-to-year variability shows the high sensitivity of these sinks to heat and drought.

Overall, this year’s carbon report card shows we have fallen short, again, of reaching a global peak in fossil fuel use. We are yet to begin the rapid decline in carbon emissions needed to stabilise the climate.
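As a rough check on the remaining-budget figures quoted above, the short Python sketch below divides each budget by the 2025 emissions total of 42.2 billion tonnes of CO₂, assuming (for illustration only) that emissions stay flat at that level; the Global Carbon Budget's own method is more detailed.

```python
# Rough check of the remaining-carbon-budget figures quoted in the article,
# assuming emissions stay flat at the 2025 total of 42.2 GtCO2 per year.
annual_emissions_gt = 42.2

budgets_gt = {"1.5 degC": 170, "1.7 degC": 525, "2.0 degC": 1055}

for target, budget in budgets_gt.items():
    years = budget / annual_emissions_gt
    print(f"{target}: ~{years:.0f} years of current emissions left")
# Prints roughly 4, 12 and 25 years, matching the article's figures.
```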
Pep Canadell receives funding from the Australian National Environmental Science Program – Climate Systems Hub.

Clemens Schwingshackl receives funding from the European Union's Horizon Europe research and innovation programme and Schmidt Sciences.

Corinne Le Quéré receives funding from the UK Natural Environment Research Council, the UK Royal Society, and the UK Advanced Research + Invention Agency. She was granted a research donation by Schmidt Futures (project CALIPSO – Carbon Loss In Plants, Soils and Oceans). Corinne Le Quéré is a member of the UK Climate Change Committee. Her position here is her own and does not necessarily reflect that of the Committee. Corinne Le Quéré is a member of the Scientific Advisory Council of Societe Generale.

Glen Peters receives funding from the European Union's Horizon Europe research and innovation programme.

Judith Hauck receives funding from the European Union's Horizon Europe research and innovation programme, the European Research Council and Germany's Federal Ministry of Research, Technology and Space.

Julia Pongratz receives funding from the European Horizon Europe research and innovation programme and Germany's Federal Ministry of Research, Technology and Space.

Mike O'Sullivan receives funding from the European Union's Horizon Europe research and innovation programme, and the European Space Agency.

Pierre Friedlingstein receives funding from the European Union's Horizon Europe research and innovation programme.

Robbie Andrew receives funding from the European Union's Horizon Europe research and innovation programme and the Norwegian Environment Agency.

AI power use forecast finds the industry far off track to net zero

Several large tech firms that are active in AI have set goals to hit net zero by 2030, but a new forecast of the energy and water required to run large data centres shows they’re unlikely to meet those targets

A data centre in Ashburn, Virginia. JIM LO SCALZO/EPA/Shutterstock

As the AI industry rapidly expands, questions about the environmental impact of data centres are coming to the forefront – and a new forecast warns the industry is unlikely to meet net zero targets by 2030.

Fengqi You at Cornell University in New York and his colleagues modelled how much energy, water and carbon today’s leading AI servers could use by 2030, taking into account different growth scenarios and possible data centre locations within the United States. They combined projected chip supply, server power usage and cooling efficiency with state-by-state electrical grid data to conduct their analysis.

While not every AI company has set a net zero target, some larger tech firms that are active in AI, such as Google, Microsoft and Meta, have set goals with a deadline of 2030. “The rapid growth of AI computing is basically reshaping everything,” says You. “We’re trying to understand how, as a sector grows, what’s going to be the impact?”

Their estimates suggest US AI server buildout will require between 731 million and 1.125 billion additional cubic metres of water by 2030, while emitting the equivalent of between 24 and 44 million tonnes of carbon dioxide a year. The forecast depends on how fast AI demand grows, how many high-end servers can actually be built and where new US data centres are located.

The researchers modelled five scenarios based on the speed of growth, and identified various ways to reduce the impact. “Number one is location, location, location,” says You. Placing data centres in Midwestern states, where water is more available and the energy grid is powered by a higher proportion of renewables, can reduce the impact. The team also pinpoints decarbonising energy supplies and improving the efficiency of data centre computing and cooling processes as major ways to limit the impact. Collectively, those three approaches could cut the industry’s emissions by 73 per cent and its water footprint by 86 per cent.

But the group’s projections could also be scuppered by public opposition to data centre installations because of their potentially extractive impact on the environment. In Virginia, which hosts about one-eighth of global data centre capacity, residents have begun lodging opposition to further planned construction, citing the impact on their water reserves and the wider environment. Similar petitions against data centres have been lodged in Pennsylvania, Texas, Arizona, California and Oregon. Figures from Data Center Watch, a research firm tracking data centre development, suggest local opposition has stymied $64 billion worth of projects. However, it is unclear, even in places that have successfully rejected data centres, just how much power and water they may use.

That is why the new findings have been welcomed – albeit cautiously – by those who have attempted to study and quantify AI’s environmental impact. “AI is such a fast-moving field that it’s really hard to make any kind of meaningful future projections,” says Sasha Luccioni at AI company Hugging Face. “As the authors themselves say, the breakthroughs in the industry could fundamentally alter computing and energy requirements, like what we’ve seen with DeepSeek”, which used different techniques to reduce brute-force computation.

Chris Preist at the University of Bristol in the UK says “the authors are right to point out the need to invest in additional renewable energy capacity”, and adds that data centre location matters. “I think their assumptions regarding water use to directly cool AI data centres are pretty pessimistic,” he says, suggesting the model’s “best case” scenario is more like “business as usual” for data centres these days.

Luccioni believes the paper highlights what is missing in the AI world: “more transparency”. She explains that this could be fixed by “requiring model developers to track and report their compute and energy use, and to provide this information to users and policymakers and to make firm commitments to reduce their overall environmental impacts, including emissions”.
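The forecast works by multiplying projected server counts, per-server power, cooling overheads and grid characteristics into energy, water and carbon totals. The sketch below is not the Cornell team's model; it is a generic back-of-envelope calculation with invented placeholder numbers, included only to show how such factors combine.

```python
# Back-of-envelope data-centre footprint estimate. All numbers are invented
# placeholders for illustration, not values from the Cornell study.

servers = 100_000            # number of AI servers (assumed)
kw_per_server = 10.0         # average draw per server, kW (assumed)
pue = 1.3                    # power usage effectiveness: cooling/overhead factor (assumed)
grid_kgco2_per_kwh = 0.35    # grid carbon intensity, varies by state (assumed)
wue_l_per_kwh = 1.8          # water usage effectiveness, litres per kWh of IT energy (assumed)

hours = 24 * 365
it_energy_kwh = servers * kw_per_server * hours        # energy used by the servers
total_energy_kwh = it_energy_kwh * pue                 # plus cooling and overheads

co2_tonnes = total_energy_kwh * grid_kgco2_per_kwh / 1000
water_m3 = it_energy_kwh * wue_l_per_kwh / 1000

print(f"energy: {total_energy_kwh / 1e9:.1f} TWh/yr")
print(f"CO2:    {co2_tonnes / 1e6:.1f} Mt/yr")
print(f"water:  {water_m3 / 1e6:.0f} million m3/yr")
```

Changing the grid intensity or the cooling figures in a sketch like this is exactly the lever the study highlights: siting data centres where the grid is cleaner and water more plentiful shrinks the carbon and water totals without touching the compute.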

Having children plays a complicated role in the rate we age

The effort of reproducing may divert energy away from repairing DNA or fighting illness, which could drive ageing, but a new study suggests that is only the case when environmental conditions are tough

Some say children keep you young, but it’s complicated. Javier Zayas/Getty Images

For millennia, we have tried to understand why we age, with the ancient Greek philosopher Aristotle proposing it occurs alongside the gradual drying up of the internal moisture necessary for life. In modern times, a leading idea known as the disposable soma hypothesis suggests that ageing is the price we pay for reproduction, with evolution prioritising the passing on of genes above all else.

This creates a fundamental trade-off: the immense energy devoted to having and raising offspring comes at the cost of repairing DNA, fighting off illness and keeping organs in good shape. This may particularly apply to women, who invest more in reproduction than men via pregnancy and breastfeeding. However, when scientists have tested this hypothesis by checking whether women with more children live shorter lives, the results have been mixed: some studies support the idea, while others have found no effect.

“It is very difficult to disentangle what is just correlation [between having more children and a shorter life] and what is the underlying causation, unless you have a good, big dataset that covers several generations,” says Elisabeth Bolund at the Swedish University of Agricultural Sciences, who wasn’t involved in the study.

Euan Young at the University of Groningen in the Netherlands and his colleagues hypothesised that the inconsistency between studies exists because the cost of reproduction isn’t fixed – it depends on a mother’s environment. “In good times, this trade-off isn’t really visible. The trade-off only becomes apparent when times are tough,” says Young.

To investigate this idea, the researchers analysed the parish records of more than 4500 Finnish women, spanning 250 years. These included the period of the Great Finnish Famine from 1866 to 1868, providing a means to gauge how hard times affect reproduction and longevity, says Young. They found that among the women who lived before or after the famine, or who didn’t have children during it, there was no significant association between the number of children they had and their lifespan. However, for the women who did have children during the famine, life expectancy decreased by six months for every child they had.

The study builds on research published last year that used a dataset from a pre-industrial population in Quebec, Canada, monitored over two centuries, which showed this trade-off in mothers who were probably in poor health or under great stress, but didn’t explore how this was affected by specific environmental conditions. In contrast, Young’s team points to a specific, catastrophic event as the driver that exposes the trade-off for mothers.

“This very large dataset makes it feasible to account for confounding factors [such as genetics and lifestyle factors],” says Bolund. “The study gets us as close as we can to identifying causation without running a controlled experiment in the lab.”

The study also confirms the energetic demands of pregnancy and breastfeeding, which require hundreds of extra calories per day. During a famine, women can’t get this energy from food, so their bodies pay the price, “lowering basal metabolism [the minimum number of calories your body needs to function at a basic level] and thus slowing or shutting down other important functions, resulting in a decline in health and shorter lifespans”, says Young.

It also explains why previous studies sometimes found the trade-off only in lower socioeconomic groups, which were effectively always living in relatively resource-scarce environments, he says.

According to Bolund, the fact that this trade-off seems to occur in particularly tough circumstances, and when women typically had many children, may partly explain why women generally live longer than men today, with girls born between 2021 and 2023 in the UK expected to live four years longer than their male counterparts. The costs of reproduction are now fairly low in Western societies, where the average number of children women give birth to has fallen considerably over the centuries, says Bolund. As a result, few women today will reach the threshold where the cost to their lifespan becomes obvious. Bolund and her colleagues’ research on a historical population in Utah, for instance, found this only appeared when women had more than five children – well above the 1.6 births that the average woman in the US is expected to have in her lifetime.

Other environmental factors may therefore become more significant in explaining the lifespan gap between men and women. Men tend to be more likely to smoke than women and also drink more alcohol, both of which affect lifespan, says Bolund. The current longevity gap between men and women is probably a combination of women’s reduced reproductive costs compared with other times in history and lifestyle differences between the sexes. Research also suggests that sex chromosomal differences are involved.

“Sexes differ in a multitude of ways, beyond reproductive costs, so we need to conduct more research into how different factors contribute to sex-specific ageing,” says Young.

Michigan OKs Landmark Regulations That Push Up-Front Costs to Data Centers

Michigan regulators have adopted landmark standards for the booming data center industry with a plan they say tries to protect residents from subsidizing the industry’s hefty energy use

Michigan regulators on Thursday adopted landmark standards for the booming data center industry with a plan they say tries to protect residents from subsidizing the industry’s hefty energy use.

In a 3-0 vote, the Michigan Public Service Commission adopted a rate structure that requires data centers and other energy-intensive industries in Consumers Energy’s territory to sign long-term power contracts with steep penalties for exiting early. The order also requires Consumers to show that data centers will shoulder all costs to build transmission lines, substations and other infrastructure before adding them to the grid.

Commission Chair Dan Scripps called it a “balanced approach” that shows Michigan is “open for business from data centers and other large load customers, while also leveraging those potential benefits of the growth … in a way that’s good for all customers.”

The deal disappointed some environmentalists, who had pushed for explicit requirements that data center power come from renewable sources. Michigan utilities are legally required to achieve 100% clean energy by 2040. They must detail how they plan to meet that requirement in filings next year.

“While the order includes important consumer protection terms, the commission missed an opportunity to emphasize the importance of the state’s climate goals,” said Daniel Abrams, an attorney with the Environmental Law and Policy Center.

The rate structure applies to customers whose energy use exceeds 100 megawatts. Data centers are among very few industries that demand that much power. Often, they demand an order of magnitude more.

Consumers serves 1.9 million customers across much of the Lower Peninsula. Company spokesperson Matt Johnson said officials are still reviewing Thursday’s order and “its impact on all stakeholders.”

“Consumers Energy intends to work hard to continue to attract new businesses, including data centers, to Michigan, in a way that benefits everyone and fuels the state’s economic development,” he added.

The deal comes amid an uncertain time for the data industry, which is growing fast because of artificial intelligence. Much more energy is needed to power the transformation, but many industry analysts fear rising AI stocks are a bubble and demand for the technology won’t materialize, leaving utilities and ratepayers to pick up the infrastructure tab for failed projects.

Hoping to avoid such an outcome, Consumers in February proposed special regulations that would lock data centers into 15-year contracts that guarantee consistent electricity use and require payments even if a facility ceases or downsizes operations mid-contract. The commission’s decision Thursday approves much of that request, with some significant modifications.

DTE takes a different approach

The other big utility in Michigan, DTE Energy, is taking a different approach. Rather than establishing a blanket rate structure like Consumers, DTE wants to negotiate its first data center contract individually while aiming to avoid public vetting of the deal. Michigan law allows such expedited reviews in cases that would bring no added costs to utility consumers. DTE officials argue adding the Stargate data center to its system will help keep rates down for everyone by spreading fixed costs among more paying customers.
“Given the sizable affordability benefits for our customers, as well as the economic impact the project will have, we think moving forward in this fashion makes the most sense,” spokesperson Jill Wilmot said.

But DTE officials also stated in the company’s filing that it expects to spend some $500 million upgrading its transmission system and building a substation to serve the data center. Critics argue the utility is being so intentionally vague that it is impossible to vet DTE’s claims about affordability.

“It’s just highly concerning that they are trying to keep this somewhat private, because there’s so much at stake,” said Bryan Smigielski, a Michigan organizer with the Sierra Club.

Michigan Attorney General Dana Nessel also opposes DTE’s quest for expedited review, and has requested a thorough vetting of the proposed contract. Members of the Public Service Commission have not decided whether to grant DTE’s request for quick approval, Scripps said.

Michigan’s data center electricity rate deliberations come amid a surge of interest from developers looking to take advantage of new tax breaks that could save the industry tens of millions of dollars. Lawmakers last year voted to exempt large data centers from Michigan’s 6% sales and use tax in an effort to lure the industry to the state.

Beyond the Stargate campus, DTE is in late-stage negotiations for another 3 gigawatts’ worth of data center capacity, while Consumers Energy is nearing deals for three large data centers amounting to a collective 2 gigawatts of power. Developers are also scoping out rural land throughout the southern Lower Peninsula, from the Grand Rapids area to the outskirts of Monroe.

The wave of interest could have big implications for water and land use in Michigan. Hyperscale data centers occupy hundreds of acres apiece. Those that use water vapor to cool the servers inside the facilities — the industry’s most common cooling technique — also use large amounts of water.

This story was originally published by Bridge Michigan and distributed through a partnership with The Associated Press. Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Why some quantum materials stall while others scale

In a new study, MIT researchers evaluated quantum materials’ potential for scalable commercial success — and identified promising candidates.

People tend to think of quantum materials — whose properties arise from quantum mechanical effects — as exotic curiosities. But some quantum materials have become a ubiquitous part of our computer hard drives, TV screens, and medical devices. Still, the vast majority of quantum materials never accomplish much outside of the lab.

What makes certain quantum materials commercial successes and others commercially irrelevant? If researchers knew, they could direct their efforts toward more promising materials — a big deal, since they may spend years studying a single material.

Now, MIT researchers have developed a system for evaluating the scale-up potential of quantum materials. Their framework combines a material’s quantum behavior with its cost, supply chain resilience, environmental footprint, and other factors. The researchers used their framework to evaluate over 16,000 materials, finding that the materials with the highest quantum fluctuation in the centers of their electrons also tend to be more expensive and environmentally damaging. The researchers also identified a set of materials that achieve a balance between quantum functionality and sustainability for further study.

The team hopes their approach will help guide the development of more commercially viable quantum materials that could be used for next-generation microelectronics, energy harvesting applications, medical diagnostics, and more.

“People studying quantum materials are very focused on their properties and quantum mechanics,” says Mingda Li, associate professor of nuclear science and engineering and the senior author of the work. “For some reason, they have a natural resistance during fundamental materials research to thinking about the costs and other factors. Some told me they think those factors are too ‘soft’ or not related to science. But I think within 10 years, people will routinely be thinking about cost and environmental impact at every stage of development.”

The paper appears in Materials Today. Joining Li on the paper are co-first authors and PhD students Artittaya Boonkird, Mouyang Cheng, and Abhijatmedhi Chotrattanapituk, along with PhD students Denisse Cordova Carrizales and Ryotaro Okabe; former graduate research assistants Thanh Nguyen and Nathan Drucker; postdoc Manasi Mandal; Instructor Ellan Spero of the Department of Materials Science and Engineering (DMSE); Professor Christine Ortiz of DMSE; Professor Liang Fu of the Department of Physics; Professor Tomas Palacios of the Department of Electrical Engineering and Computer Science (EECS); Associate Professor Farnaz Niroui of EECS; Assistant Professor Jingjie Yeo of Cornell University; and PhD student Vsevolod Belosevich and Assistant Professor Qiong Ma of Boston College.

Materials with impact

Cheng and Boonkird say that materials science researchers often gravitate toward quantum materials with the most exotic quantum properties rather than the ones most likely to be used in products that change the world. “Researchers don’t always think about the costs or environmental impacts of the materials they study,” Cheng says. “But those factors can make them impossible to do anything with.”

Li and his collaborators wanted to help researchers focus on quantum materials with more potential to be adopted by industry. For this study, they developed methods for evaluating factors like the materials’ price and environmental impact using their elements and common practices for mining and processing those elements.
At the same time, they quantified the materials’ level of “quantumness” using an AI model created by the same group last year, based on a concept proposed by MIT professor of physics Liang Fu, termed quantum weight.

“For a long time, it’s been unclear how to quantify the quantumness of a material,” Fu says. “Quantum weight is very useful for this purpose. Basically, the higher the quantum weight of a material, the more quantum it is.”

The researchers focused on a class of quantum materials with exotic electronic properties known as topological materials, eventually assigning over 16,000 materials scores on environmental impact, price, import resilience, and more. For the first time, the researchers found a strong correlation between the material’s quantum weight and how expensive and environmentally damaging it is.

“That’s useful information because the industry really wants something very low-cost,” Spero says. “We know what we should be looking for: high quantum weight, low-cost materials. Very few materials being developed meet that criteria, and that likely explains why they don’t scale to industry.”

The researchers identified 200 environmentally sustainable materials and further refined the list down to 31 material candidates that achieved an optimal balance of quantum functionality and high-potential impact. The researchers also found that several widely studied materials exhibit high environmental impact scores, indicating they will be hard to scale sustainably. “Considering the scalability of manufacturing and environmental availability and impact is critical to ensuring practical adoption of these materials in emerging technologies,” says Niroui.

Guiding research

Many of the topological materials evaluated in the paper have never been synthesized, which limited the accuracy of the study’s environmental and cost predictions. But the authors say the researchers are already working with companies to study some of the promising materials identified in the paper.

“We talked with people at semiconductor companies that said some of these materials were really interesting to them, and our chemist collaborators also identified some materials they find really interesting through this work,” Palacios says. “Now we want to experimentally study these cheaper topological materials to understand their performance better.”

“Solar cells have an efficiency limit of 34 percent, but many topological materials have a theoretical limit of 89 percent. Plus, you can harvest energy across all electromagnetic bands, including our body heat,” Fu says. “If we could reach those limits, you could easily charge your cell phone using body heat. These are performances that have been demonstrated in labs, but could never scale up. That’s the kind of thing we’re trying to push forward.”

This work was supported, in part, by the National Science Foundation and the U.S. Department of Energy.
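The framework combines a “quantumness” score with cost and environmental metrics and then keeps the materials that balance all three. The sketch below shows one generic way such a multi-criteria screen could work, via a Pareto-style filter; the field names, numbers, and candidate materials are illustrative assumptions, not the MIT team's actual data or scoring method.

```python
# Generic multi-criteria screen over candidate materials: keep entries that are
# not dominated on (high) quantum weight, (low) cost and (low) environmental
# impact. The example data and field names are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    quantum_weight: float   # higher is better
    cost_score: float       # lower is better
    env_impact: float       # lower is better

def dominates(a: Material, b: Material) -> bool:
    """True if a is at least as good as b on every axis and strictly better on one."""
    at_least_as_good = (a.quantum_weight >= b.quantum_weight and
                        a.cost_score <= b.cost_score and
                        a.env_impact <= b.env_impact)
    strictly_better = (a.quantum_weight > b.quantum_weight or
                       a.cost_score < b.cost_score or
                       a.env_impact < b.env_impact)
    return at_least_as_good and strictly_better

def pareto_front(materials):
    """Keep only candidates that no other candidate dominates."""
    return [m for m in materials
            if not any(dominates(other, m) for other in materials)]

candidates = [
    Material("A", quantum_weight=8.5, cost_score=0.9, env_impact=0.8),
    Material("B", quantum_weight=6.0, cost_score=0.3, env_impact=0.2),
    Material("C", quantum_weight=5.5, cost_score=0.4, env_impact=0.3),
    Material("D", quantum_weight=2.0, cost_score=0.35, env_impact=0.9),
]

for m in pareto_front(candidates):
    print(m.name)   # A and B survive; C and D are dominated by B
```

A screen like this leaves a shortlist that trades quantum performance against practicality, which is the spirit of narrowing 16,000 materials down to a few dozen balanced candidates.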
