
As Starlink and Other Satellites Proliferate, Astronomers Learn to Manage Interference

News Feed
Friday, March 28, 2025


Swarms of satellites launched by SpaceX and other companies are disrupting astronomical observations. Here's how scientists are coping

In the next few months, from its perch atop a mountain in Chile, the Vera C. Rubin Observatory will begin surveying the cosmos with the largest camera ever built. Every three nights, it will produce a map of the entire southern sky filled with stars, galaxies, asteroids and supernovae — and swarms of bright satellites ruining some of the view.

Astronomers didn’t worry much about satellites photobombing Rubin’s images when they started drawing up plans for the observatory more than two decades ago. But as the space around Earth becomes increasingly congested, researchers are having to find fresh ways to cope — or else lose precious data from Rubin and hundreds of other observatories.

The number of working satellites has soared in the past five years to around 11,000, mostly because of constellations of orbiters that provide Internet connectivity around the globe (see ‘Satellite surge’). Just one company, SpaceX in Hawthorne, California, has more than 7,000 operational Starlink satellites, all launched since 2019; OneWeb, a space communications company in London, has more than 630 satellites in its constellation. On paper, tens to hundreds of thousands more are planned from a variety of companies and nations, although probably not all of these will be launched.


Satellites play a crucial part in connecting people, including bringing Internet to remote communities and emergency responders. But their rising numbers can be a problem for scientists, because the satellites interfere with ground-based astronomical observations: they create bright streaks in optical images and cause electromagnetic interference at radio telescopes. The satellite boom also poses other threats, including adding pollution to the atmosphere.

When the first Starlinks launched, some astronomers warned of existential threats to their discipline. Now, researchers in astronomy and other fields are working with satellite companies to help quantify and mitigate the impacts on science — and society. “There is growing interest in collaborating and finding solutions together,” says Giuliana Rotola, a space-policy researcher at the Sant’Anna School of Advanced Studies in Pisa, Italy.

Timing things right

The first step to reduce satellite interference is knowing when and where a satellite will pass above an observatory. “The aim is to minimize the surprise,” says Mike Peel, an astronomer at Imperial College London.

Before the launch of Starlinks, astronomers had no centralized reference for tracking satellites. Now, the International Astronomical Union (IAU) has a virtual Centre for the Protection of the Dark and Quiet Sky from Satellite Constellation Interference (CPS), which serves as an information hub and to which researchers, including Peel and Rotola, volunteer their time.

One of the centre’s tools, called SatChecker, draws on a public database of satellite orbits, fed by information from observers and companies that track objects in space. Astronomers can use SatChecker to confirm which satellite is passing overhead during their observations. The tool isn’t perfect; atmospheric drag and intentional manoeuvring can affect a satellite’s position, and the public database doesn’t always reflect the latest information. For instance, the BlueWalker 3 satellite from telecommunications firm AST SpaceMobile in Midland, Texas, launched in 2022 and was sometimes brighter than most stars; yet the uncertainty in its position was at times so great that astronomers had difficulty predicting whether it would be in their field of view during night-time observations.
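SatChecker itself is a web service, and its interface is not described in this article. The sketch below only illustrates the general idea of such a pass check, assuming the skyfield Python library and public two-line-element (TLE) orbit data (here from CelesTrak, one common source, rather than the Space Force feed SatChecker uses); the observatory coordinates, exposure window and altitude threshold are illustrative placeholders.

```python
# A minimal sketch (not SatChecker's actual interface) of predicting which
# satellites rise above an observatory during an exposure, using skyfield
# and public TLE orbital elements. All specific values are illustrative.
from skyfield.api import load, wgs84

ts = load.timescale()

# Current Starlink elements published by CelesTrak.
satellites = load.tle_file(
    "https://celestrak.org/NORAD/elements/gp.php?GROUP=starlink&FORMAT=tle")

# Hypothetical observatory location (roughly Cerro Pachon, Chile).
observatory = wgs84.latlon(-30.24, -70.74, elevation_m=2700)

# Sample a 30-second exposure window once per second.
t = ts.utc(2025, 3, 28, 1, 0, range(0, 30))

for sat in satellites:
    alt, az, distance = (sat - observatory).at(t).altaz()
    # Flag any satellite that climbs above 30 degrees altitude in the window;
    # a real check would compare against the telescope's actual pointing.
    if alt.degrees.max() > 30.0:
        print(f"{sat.name}: peaks at {alt.degrees.max():.1f} deg altitude")
```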

Starlink satellites leave streaks in a 2019 image taken by a 4-metre telescope at the Cerro Tololo Inter-American Observatory in Chile.


Tools such as SatChecker help telescope operators to avoid problems by allowing them to target a different part of the sky when a satellite passes overhead or by simply pausing observations as it flies by. It would aid astronomers if SatChecker had even more accurate information about satellite positions, but there are constraints on improving the system. SatChecker data come from the US Space Force, which draws on a global network of sensors that tracks objects in orbit and issues updates on satellite locations as often as several times a day. The frequency of these updates is limited by factors such as how often a sensor can observe an object and whether the sensor can distinguish what it’s looking at.

Currently, satellite streaks are a relatively minor issue for telescope operators. But the problem will grow as satellite numbers continue to increase drastically, meaning more observation time will be lost, and this issue will be magnified for Rubin.

Fixing the streaks

Rubin, which cost US$810 million to build, is a unique case because it scans large swathes of the sky frequently — meaning it can detect rapidly changing phenomena such as incoming asteroids or cosmic explosions. Astronomers don’t want to be fooled by passing satellites, as happened in 2017 when researchers spotted what they thought was a γ-ray burst — a high-energy flash of light — from a distant galaxy, but the signal turned out to be sunlight reflecting off a piece of space junk.

Rubin’s powerful camera, coupled with its 8.4-metre telescope, will take about 1,000 nightly exposures of the sky, each about 45 times the area of the full Moon. That’s more wide-field pictures of the sky than any optical observatory has ever taken. Simulations suggest that if satellite numbers in low Earth orbit rise to around 40,000 over the 10 years of Rubin’s survey — a not-impossible forecast — then at least 10% of its images, and the majority of those taken during twilight, will contain a satellite trail [3].

SpaceX took early steps to try to mitigate the problem. Working with Rubin astronomers, the company tested changes to the design and positions of Starlinks to try to keep their brightness beneath a target threshold. Amazon, the retail and technology giant based in Seattle, Washington, is also testing mitigations on prototype satellites for its planned Kuiper constellation. Such changes reduce, but don’t eliminate, the problem.

To limit satellite interference, Rubin astronomers are creating observation schedules to help researchers avoid certain parts of the sky (for example, near the horizon) and at certain times (such as around twilight) [4]. For when they can’t avoid the satellites, Rubin researchers have incorporated steps into their data-processing pipeline to detect and remove satellite streaks. All these changes mean less time doing science and more time processing data, but they need to be done, astronomers say. “We are really looking forward to getting data from Rubin and seeing how it turns out,” Peel says.
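The article does not describe Rubin’s actual streak-handling code. One common generic approach, though, is to detect bright, nearly straight tracks with a Hough transform and mask the affected pixels. The sketch below is only that generic idea, assuming the NumPy and scikit-image libraries, a synthetic image, and illustrative thresholds.

```python
# A generic sketch of streak masking (not Rubin's actual pipeline): detect a
# bright, nearly straight track with a probabilistic Hough transform and mask
# the pixels it crosses. The synthetic image and thresholds are illustrative.
import numpy as np
from skimage.draw import line
from skimage.transform import probabilistic_hough_line

rng = np.random.default_rng(0)
image = rng.normal(100.0, 5.0, size=(512, 512))   # sky background plus noise
rr, cc = line(40, 10, 470, 500)                   # synthetic satellite trail
image[rr, cc] += 300.0

# Binary map of pixels well above the background level.
bright = image > image.mean() + 10 * image.std()

segments = probabilistic_hough_line(bright, threshold=10,
                                     line_length=100, line_gap=5)

mask = np.zeros_like(image, dtype=bool)
for (c0, r0), (c1, r1) in segments:               # skimage returns (col, row) endpoints
    rr, cc = line(r0, c0, r1, c1)
    mask[rr, cc] = True                           # a real pipeline would also dilate this

# Flag masked pixels so downstream photometry ignores them.
cleaned = np.where(mask, np.nan, image)
print(f"masked {mask.sum()} pixels along {len(segments)} detected segments")
```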

For other observatories, the IAU CPS is working on tools to help astronomers identify and correct satellite streaks in their data. One is a new database of crowdsourced observations of satellite brightnesses called SCORE, which is currently being beta tested and is planned for wider release in the coming months. This will help scientists to work backwards — they might see something puzzling in their past observations and be able to work it out, Peel says.

The database “is definitely a very valuable tool” because it’s one of few that have data freely available, says Marco Langbroek, a space-tracking specialist at Delft University of Technology in the Netherlands. As a beta tester, Langbroek has added a number of entries to SCORE, including measurements of a NASA solar sail that changes in brightness as it tumbles through space. Going forwards, he says, SCORE will be most useful if a lot of astronomers contribute high-quality observations to the database, thereby building up a resource over time.

Tuning things out

Astronomers who work in the radio portion of the electromagnetic spectrum face extra challenges when it comes to satellites.

Big radio telescopes are typically located in remote regions, to be as far as possible from mobile-phone masts and other technological infrastructure that leak radio emissions. But satellites can’t be avoided. “If signals are coming from the sky, they’re always there,” says Federico Di Vruno, an astronomer at the Square Kilometre Array Observatory in Jodrell Bank, UK, and co-director of the IAU CPS.

When satellites transmit signals, the electromagnetic interference can overwhelm faint radio signals coming from the cosmos. One solution is to redirect or temporarily turn off satellite transmissions. The US National Radio Astronomy Observatory and SpaceX have been working on ways to accomplish this, and the company now momentarily redirects or disables transmissions when Starlinks pass above sensitive telescopes, including the Green Bank Telescope in West Virginia [5]. The method requires voluntary buy-in from all partners, plus a lot of data sharing and intensive programming by the companies and the astronomers, but it does reduce interference. It has been successful enough that a small group of radio astronomers visited China last month to discuss the strategy with satellite operators and scientists there.
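The article does not spell out how these avoidance decisions are made, but at their core they rest on a simple geometric test: is the satellite inside an exclusion cone around the telescope’s pointing? The sketch below is not SpaceX’s or the observatory’s actual system; it assumes NumPy, a hypothetical 10-degree protection cone, and made-up satellite coordinates.

```python
# A minimal geometric sketch of boresight avoidance (illustrative only): check
# whether a satellite falls within an exclusion cone around the telescope's
# pointing direction. The cone size and coordinates are assumptions.
import numpy as np

def angular_separation_deg(alt1, az1, alt2, az2):
    """Great-circle separation between two (altitude, azimuth) directions, in degrees."""
    a1, z1, a2, z2 = map(np.radians, (alt1, az1, alt2, az2))
    cos_sep = (np.sin(a1) * np.sin(a2) +
               np.cos(a1) * np.cos(a2) * np.cos(z1 - z2))
    return np.degrees(np.arccos(np.clip(cos_sep, -1.0, 1.0)))

EXCLUSION_CONE_DEG = 10.0            # hypothetical protection radius around the beam

telescope_pointing = (65.0, 140.0)   # altitude, azimuth of the observation (degrees)
satellite_track = [(60.0, 120.0), (64.0, 135.0), (66.0, 150.0)]  # predicted alt/az samples

for alt, az in satellite_track:
    sep = angular_separation_deg(*telescope_pointing, alt, az)
    action = "redirect or mute transmission" if sep < EXCLUSION_CONE_DEG else "transmit normally"
    print(f"satellite at alt={alt:.0f}, az={az:.0f}: separation {sep:.1f} deg -> {action}")
```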

An image made from multiple exposures shows streaks from Starlink satellites, the International Space Station and other satellites over a site in Wales.


But as soon as one solution is found, fresh challenges appear. One is the rise of ‘direct-to-cell’ satellites, which function like mobile-phone towers in space and can transmit to areas on the ground that otherwise don’t have coverage. Optical astronomers worry about these because they are physically large and therefore bright [6], and they are a big problem for radio astronomers because direct-to-cell transmissions are extremely powerful. If one of those hits a radio observatory, “the telescope might be blind for a little bit”, Di Vruno says. So astronomers and satellite operators are discussing how they can share information about these as well, to avoid each other when a satellite passes over an observatory.

Another emerging challenge is ‘unintended’ emissions — which happen when satellites ‘leak’ radiation in wavelengths far outside the bands typically used for transmissions and other tasks. Early tests for the Square Kilometre Array radio telescopes, which are under construction in Australia and South Africa, discovered such leakage coming from Starlinks and other satellites [7].

Many of these unintended emissions are at the low frequencies used in some studies, including those of the early Universe. So far, astronomers haven’t come up with a good solution other than scheduling telescopes not to record data when a satellite passes through the part of the sky being observed. In the future, it is possible that authorities such as the International Telecommunication Union might issue regulations on this, as the union already does for other shared uses of the electromagnetic spectrum.

Cleaning up the atmosphere

Astronomers aren’t the only researchers concerned about the impacts of satellite constellations. In the past few years, a growing number of atmospheric scientists have been warning that these fleets will pollute Earth’s upper atmosphere during launches and then when their orbits decline and they burn up. Researchers are just starting to get to grips with the scope of this pollution, says Connor Barker, an atmospheric chemist at University College London (UCL).

The point of satellite constellations is to have lots of satellites in orbit, but refreshing them when new technology comes along means that the pace of launches and re-entries will accelerate. In February alone, an average of four Starlink satellites a day re-entered the atmosphere and burned up.

Each re-entry adds chemicals to the upper atmosphere. In a 2023 study, researchers reported that measurements made during high-altitude aeroplane flights detected more than 20 chemical elements in Earth’s upper atmosphere that probably came from satellite re-entries, including aluminium, copper and lead [8]. Other work has found that satellite constellations contributed around 40% of many types of carbon emission from the space industry in 2022, including black carbon particles and carbon dioxide [9] that could contribute to warming the atmosphere. It’s not yet clear how much this warms the planet or contributes to other environmental problems. Some early analyses suggest that satellite launches could contribute a small but measurable amount of ozone destruction.

There are no regulations on satellite atmospheric pollution. Barker and his colleagues at UCL say a good first step towards a solution is to get better estimates of the scope of the problem. They have been building an emissions inventory for rocket launches and satellite re-entries, carefully tallying up the contaminants involved and estimating the altitudes at which they enter the atmosphere. “Even though this is currently a relatively small industry that’s having a relatively small impact on the atmosphere, we should still be aware of it,” says Eloise Marais, an atmospheric chemist at UCL.
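The UCL team’s inventory is not described in detail in the article; the sketch below only illustrates the kind of bookkeeping the text describes — tallying emitted species per launch or re-entry event and binning the totals by altitude. The species names, masses and altitude bands are illustrative placeholders, not the study’s data.

```python
# An illustrative sketch of an emissions inventory (not the UCL team's actual
# schema): record emitted mass per species for each launch or re-entry event,
# then aggregate by altitude band. All values below are placeholders.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class EmissionEvent:
    kind: str                 # "launch" or "reentry"
    altitude_km: float        # altitude at which the emission is deposited
    species_kg: dict          # emitted mass per chemical species, in kg

def altitude_band(altitude_km: float) -> str:
    if altitude_km < 12:
        return "troposphere"
    if altitude_km < 50:
        return "stratosphere"
    return "mesosphere and above"

def build_inventory(events):
    """Aggregate emitted mass per (altitude band, species) pair."""
    inventory = defaultdict(float)
    for event in events:
        band = altitude_band(event.altitude_km)
        for species, mass in event.species_kg.items():
            inventory[(band, species)] += mass
    return dict(inventory)

# Placeholder events, purely for illustration.
events = [
    EmissionEvent("launch", 30.0, {"black carbon": 1.5, "CO2": 250.0}),
    EmissionEvent("reentry", 60.0, {"aluminium oxide": 30.0, "copper": 0.5}),
]
print(build_inventory(events))
```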

Researchers are trying to raise the profile of these and other concerns linked to satellite fleets. Some of these issues were discussed in February in Vienna, at a meeting of the United Nations Committee on the Peaceful Uses of Outer Space. It was the first time that the committee formally discussed the impacts of satellite constellations on astronomy.

No major actions were taken, as expected for these early discussions. But “now all of the member states know of dark and quiet skies”, Di Vruno says. That in itself, he says, is a success.

This article is reproduced with permission and was first published on March 18, 2025.

Like Many Holiday Traditions, Lighting Candles and Fireplaces Is Best Done in Moderation

The warm scents of gingerbread and pine are holiday favorites, but experts warn they can affect indoor air quality

The warm spices in gingerbread, the woodsy aroma of pine and fir trees, and the fruity tang of mulled wine are smells synonymous with the holiday season. Many people enjoy lighting candles, incense and fireplaces in their homes to evoke the moods associated with these festive fragrances.

Burning scented products may create a cozy ambiance, and in the case of fireplaces, provide light and heat, but some experts want people to consider how doing so contributes to the quality of the air indoors. All flames release chemicals that may cause allergy-like symptoms or contribute to long-term respiratory problems if they are inhaled in sufficient quantities.

However, people don't have to stop sitting by the hearth or get rid of products like perfumed candles and essential oil diffusers, said Dr. Meredith McCormack, director of the pulmonary and critical care medicine division at Johns Hopkins University’s medical school. Instead, she recommends taking precautions to control the pollutants in their homes.

“Clean air is fragrance free,” said McCormack, who has studied air quality and lung health for more than 20 years. “If having seasonal scents is part of your tradition or evokes feelings of nostalgia, maybe think about it in moderation.”

What to know about indoor air quality

People in the Northern Hemisphere tend to spend more time indoors during the end-of-year holidays, when temperatures are colder. Indoor air can be significantly more polluted than outdoor air because pollutants get trapped inside and concentrated without proper ventilation or filtration, according to the American Lung Association.

For example, active fireplaces and gas appliances release tiny airborne particles that can get into the lungs, as well as chemicals like nitrogen dioxide, a major component of smog, according to the U.S. Environmental Protection Agency. Cleaning products, air fresheners and candles also emit air pollutants at varying concentrations.

The risk fragrances and other air pollutants may pose to respiratory health depends on the source, the length and intensity of a person’s exposure, and individual health, McCormack said.

It is also important to note that some pollutants have no smell, so unscented products still can affect indoor air quality, experts say.

Some people are more vulnerable

Polluted air affects everyone, but not equally. Children, older adults, minority populations and people of low socioeconomic status are more likely to be affected by poor air quality because of either physiological vulnerabilities or higher exposure, according to the environmental agency.

Children are more susceptible to air pollution because of their lung size, which means they get a greater dose of exposure relative to their body size, McCormack said. Pollutants inside the home also pose a greater hazard to people with heart or lung conditions, including asthma, she said.

Signs of respiratory irritation include coughing, shortness of breath, headaches, a runny nose and sneezing. Experts advise stopping use of pollutant-releasing products or immediately ventilating rooms if symptoms occur.

“The more risk factors you have, the more harmful air pollution or poor air quality indoors can be,” McCormack said.

Practical precautions to take

Ellen Wilkowe burns candles with scents like vanilla and cinnamon when she does yoga, writes or showers at her home in New Jersey. Her teenage daughter, on the other hand, likes more seasonally scented candles like gingerbread.

“The candle has a calming presence. They are also very symbolic and used in rituals and many religions,” she said.

Wilkowe said she leans toward candles made with soy-based waxes instead of petroleum-based paraffin. Experts note that all lit candles give off air pollutants regardless of what they are made of.

Buying products with fewer ingredients, opening windows if the temperatures allow, and using air purifiers with HEPA filters are ways to reduce exposure to pollutants from indoor fireplaces, appliances and candle displays, McCormack said. She also recommends switching on kitchen exhaust fans before starting a gas-powered stovetop and using the back burners so the vent can more easily suck up pollutants.

Setting polite boundaries with guests who smoke cigarettes or other tobacco products is also a good idea, she said.

“Small improvements in air quality can have measurable health benefits,” McCormack said. “Similarly to if we exercise and eat a little better, we can be healthier.”

Rachael Lewis-Abbott, a member of the Indoor Air Quality Association, an organization for professionals who identify and address air quality problems, said people don't usually notice what they are breathing in until problems like gas leaks or mold develop.

“It is out of sight, out of mind,” she said.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

This moss survived in space for 9 months

In an experiment on the outside of the International Space Station, a species of moss survived in space for 9 months. And it could have lasted much longer.

Meet a spreading earthmoss known as Physcomitrella patens. It’s frequently used as a model organism for studies on plant evolution, development, and physiology. In this image, a reddish-brown sporophyte sits at the top center of a leafy gametophore. This capsule contains numerous spores inside. Scientists tested samples like these on the outside of the International Space Station (ISS) to see if they could tolerate the extreme airless environment. And they did. The moss survived in space for 9 months and could have lasted even longer. Image via Tomomichi Fujita/ EurekAlert! (CC BY-SA).

Space is a deadly environment, with no air, extreme temperature swings and harsh radiation. Could any life survive there? Researchers in Japan tested a type of moss called spreading earthmoss on the exterior of the International Space Station. The moss survived for nine months, and the spores were still able to reproduce when brought back to Earth.

Moss survived in space for 9 months

Can life exist in space? Not simply on other planets or moons, but in the cold, dark, airless void of space itself? Most organisms would perish almost immediately, to be sure. But researchers in Japan recently experimented with moss, with surprising results. They said on November 20, 2025, that more than 80% of their moss spores survived nine months on the outside of the International Space Station. Not only that, but when brought back to Earth, they were still capable of reproducing. Nature, it seems, is even tougher than we thought!

Amazingly, the results show that some primitive plants – not even just microorganisms – can survive long-term exposure to the extreme space environment. The researchers published their peer-reviewed findings in the journal iScience on November 20, 2025.

A deadly environment for life

Space is a horrible place for life. The lack of air, radiation and extreme cold make it pretty much unsurvivable for life as we know it. As lead author Tomomichi Fujita at Hokkaido University in Japan stated:

“Most living organisms, including humans, cannot survive even briefly in the vacuum of space. However, the moss spores retained their vitality after nine months of direct exposure. This provides striking evidence that the life that has evolved on Earth possesses, at the cellular level, intrinsic mechanisms to endure the conditions of space.”

What about moss?

Researchers wanted to see if any Earthly life could survive in space’s deadly environment for the long term. To find out, they decided to do some experiments with a type of moss called spreading earthmoss, or Physcomitrium patens. The researchers sent hundreds of sporophytes – encapsulated moss spores – to the International Space Station in March 2022, aboard the Cygnus NG-17 spacecraft.

They attached the sporophyte samples to the outside of the ISS, where they were exposed to the vacuum of space for 283 days. By doing so, the samples were subjected to high levels of UV (ultraviolet) radiation and extreme swings of temperature. The samples later returned to Earth in January 2023.

The researchers tested three parts of the moss. These were the protonemata, or juvenile moss; brood cells, or specialized stem cells that emerge under stress conditions; and the sporophytes.
Fujita said: “We anticipated that the combined stresses of space, including vacuum, cosmic radiation, extreme temperature fluctuations and microgravity, would cause far greater damage than any single stress alone.”

Astronauts placed the moss samples on the outside of the International Space Station for the 9-month-long experiment. Incredibly, more than 80% of the encapsulated spores survived the trip to space and back to Earth. Image via NASA/ Roscosmos.

The moss survived!

So, how did the moss do? The results were mixed, but overall showed that the moss could survive in space. The radiation was the most difficult aspect of the space environment to withstand.

The sporophytes were the most resilient. Incredibly, they were able to survive and germinate after being exposed to -196 degrees Celsius (-320 degrees Fahrenheit) for more than a week. At the other extreme, they also survived 55 degrees C (131 degrees F) heat for a month. Some brood cells survived as well, but the encased spores were about 1,000 times more tolerant to the UV radiation. On the other hand, none of the juvenile moss survived the high UV levels or the extreme temperatures.

Samples of moss spores that germinated after their 9-month exposure to space. Image via Dr. Chang-hyun Maeng/ Maika Kobayashi/ EurekAlert! (CC BY-SA).

How did the spores survive?

So why did the encapsulated spores do so well? The researchers said the natural structure surrounding the spore itself helps to protect the spore. Essentially, it absorbs the UV radiation and surrounds the inner spore both physically and chemically to prevent damage. As it turns out, this might be associated with the evolution of mosses. This is an adaptation that helped bryophytes – the group of plants to which mosses belong – to make the transition from aquatic to terrestrial plants 500 million years ago.

Overall, more than 80% of the spores survived the journey to space and then back to Earth. And only 11% were unable to germinate after being brought back to the lab on Earth. That’s impressive!

In addition, the researchers also tested the levels of chlorophyll in the spores. After the exposure to space, the spores still had normal amounts of chlorophyll, except for chlorophyll a specifically. In that case, there was a 20% reduction. Chlorophyll a is used in oxygenic photosynthesis. It absorbs the most energy from wavelengths of violet-blue and orange-red light.

Tomomichi Fujita at Hokkaido University in Japan is the lead author of the new study about moss in space. Image via Hokkaido University.

Spores could have survived for 15 years

The time available for the experiment was limited to several months. However, the researchers wondered if the moss spores could have survived even longer. Using mathematical models, they determined the spores would likely have continued to live in space for about 15 years, or 5,600 days, altogether. The researchers note this prediction is a rough estimate. More data would still be needed to make that assessment even more accurate.

So the results show just how resilient moss is, and perhaps some other kinds of life, too. Fujita said: “This study demonstrates the astonishing resilience of life that originated on Earth. Ultimately, we hope this work opens a new frontier toward constructing ecosystems in extraterrestrial environments such as the moon and Mars. I hope that our moss research will serve as a starting point.”
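The study’s actual survival model is not described in the article. As a toy back-of-envelope only: if one assumes spore viability decays exponentially, anchors the decay rate to the roughly 80% survival reported after 283 days, and extrapolates to an arbitrary 1% viability threshold, the result lands in the same ballpark as the published figure. Every element of this sketch — the functional form, the anchor value and the threshold — is an assumption made for illustration.

```python
# A toy extrapolation (not the study's model): exponential decay of viability,
# anchored to ~80% survival after 283 days, extrapolated to a 1% threshold.
import math

days_exposed = 283          # duration of the ISS exposure
survival_fraction = 0.80    # roughly the reported spore survival

decay_rate = -math.log(survival_fraction) / days_exposed   # per day
days_to_threshold = math.log(1 / 0.01) / decay_rate        # time until 1% viability

print(f"decay rate: {decay_rate:.2e} per day")
print(f"~{days_to_threshold:.0f} days (~{days_to_threshold / 365:.0f} years) to 1% viability")
# Prints roughly 5,800 days (~16 years), the same order of magnitude as the
# published estimate of about 5,600 days; the real model is surely more detailed.
```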
Bottom line: In an experiment on the outside of the International Space Station, a species of moss survived in space for nine months. And it could have lasted much longer.

Source: Extreme environmental tolerance and space survivability of the moss, Physcomitrium patens

Via EurekAlert!

Read more: This desert moss could grow on Mars, no greenhouse needed

Read more: Colorful life on exoplanets might be lurking in clouds

Medical Imaging Contributing To Water Pollution, Experts Say

Contrast chemicals injected into people for medical imaging scans are likely contributing to water pollution, a new study says.

By Dennis Thompson, HealthDay Reporter

THURSDAY, Dec. 11, 2025 (HealthDay News) — Contrast chemicals injected into people for medical imaging scans are likely contributing to water pollution, a new study says.

Medicare patients alone received 13.5 billion milliliters of contrast media between 2011 and 2024, and those chemicals wound up in waterways after people excreted them, researchers recently reported in JAMA Network Open.

“Contrast agents are necessary for effective imaging, but they don’t disappear after use,” said lead researcher Dr. Florence Doo, an assistant professor at the University of Maryland Medical Intelligent Imaging Center in Baltimore.

“Iodine and gadolinium are non-renewable resources that can enter wastewater and accumulate in rivers, oceans and even drinking water,” Doo said in a news release.

People undergoing X-ray or CT scans are sometimes given iodine or barium-sulfate compounds that cause certain tissues, blood vessels or organs to light up, allowing radiologists a better look at potential health problems. For MRI scans, radiologists use gadolinium, a substance that alters the magnetic properties of water molecules in the human body.

These agents are critical for diagnosing disease, but they are also persistent pollutants, researchers said in background notes. They aren’t biodegradable, and conventional wastewater treatment doesn’t fully remove them.

For the new study, researchers analyzed 169 million contrast-enhanced imaging procedures that Medicare covered over 13 years. Iodine-based contrast agents accounted for more than 95% of the total volume, or nearly 12.9 billion milliliters. Of those, agents used in CT scans of the abdomen and pelvis alone contributed 4.4 billion milliliters.

Gadolinium agents were less frequently used, but still contributed nearly 600 million milliliters, researchers said. Brain MRIs were the most common scan using these contrast materials. Overall, just a handful of procedures accounted for 80% of all contrast use, researchers concluded.

“Our study shows that a small number of imaging procedures drive the majority of contrast use. Focusing on those highest-use imaging types make meaningful changes tractable and could significantly reduce health care’s environmental footprint,” researcher Elizabeth Rula, executive director of the Harvey L. Neiman Health Policy Institute in Reston, Va., said in a news release.

Doctors can help by making sure their imaging orders are necessary, while radiologists can lower the doses of contrast agents by basing them on a patient’s weight, researchers said. Biodegradable contrast media are under development, researchers noted. Another solution could involve AI, which might be able to accurately analyze medical imaging scans even if less contrast media is used.

“We can’t ignore the environmental consequences of medical imaging,” Doo said. “Stewardship of contrast agents is a measurable and impactful way to align patient care with planetary health and should be an important part of broader health care sustainability efforts.”

SOURCES: Harvey L. Neiman Health Policy Institute, news release, Dec. 4, 2025; JAMA Network Open, Dec. 5, 2025

Copyright © 2025 HealthDay. All rights reserved.

Cars to AI: How new tech drives demand for specialized materials

Generative artificial intelligence has become widely accepted as a tool that increases productivity. Yet the technology is far from mature. Large language models advance rapidly from one generation to the next, and experts can only speculate how AI will affect the workforce and people’s daily lives.

As a materials scientist, I am interested in how materials and the technologies that derive from them affect society. AI is one example of a technology driving global change—particularly through its demand for materials and rare minerals. But before AI evolved to its current level, two other technologies exemplified the process created by the demand for specialized materials: cars and smartphones.

Often, the mass adoption of a new invention changes human behavior, which leads to new technologies and infrastructures reliant upon the invention. In turn, these new technologies and infrastructures require new or improved materials—and these often contain critical minerals: those minerals that are both essential to the technology and strain the supply chain. The unequal distribution of these minerals gives leverage to the nations that produce them. The resulting power shifts strain geopolitical relations and drive the search for new mineral sources. New technology nurtures the mining industry.

The car and the development of suburbs

At the beginning of the 20th century, only 5 out of 1,000 people owned a car, with annual production around a few thousand. Workers commuted on foot or by tram. Within a 2-mile radius, many people had all they needed: from groceries to hardware, from school to church, and from shoemakers to doctors.

Then, in 1913, Henry Ford transformed the industry by inventing the assembly line. Now, a middle class family could afford a car: Mass production cut the price of the Model T from US$850 in 1908 to $360 in 1916. While the Great Depression dampened the broad adoption of the car, sales began to increase again after the end of World War II.

With cars came more mobility, and many people moved farther away from work. In the 1940s and 1950s, a powerful highway lobby that included oil, automobile, and construction interests promoted federal highway and transportation policies, which increased automobile dependence. These policies helped change the landscape: Houses were spaced farther apart, and located farther away from the urban centers where many people worked. By the 1960s, two-thirds of American workers commuted by car, and the average commute had increased to 10 miles.

Public policy and investment favored suburbs, which meant less investment in city centers. The resulting decay made living in downtown areas of many cities undesirable and triggered urban renewal projects. Long commutes added to pollution and expenses, which created a demand for lighter, more fuel-efficient cars. But building these required better materials.

In 1970, the entire frame and body of a car was made from one steel type, but by 2017, 10 different, highly specialized steels constituted a vehicle’s lightweight form. Each steel contains different chemical elements, such as molybdenum and vanadium, which are mined only in a few countries. While the car supply chain was mostly domestic until the 1970s, the car industry today relies heavily on imports. This dependence has created tension with international trade partners, as reflected by higher tariffs on steel.

The cellphone and American life

The cellphone presents another example of a technology creating a demand for minerals and affecting foreign policy. In 1983, Motorola released the DynaTAC, the first commercial cellular phone. It was heavy, expensive, and its battery lasted for only half an hour, so few people had one. Then in 1996, Motorola introduced the flip phone, which was cheaper, lighter, and more convenient to use. The flip phone initiated the mass adoption of cellphones. However, it was still just a phone: Unlike today’s smartphones, all it did was send and receive calls and texts.

In 2007, Apple redefined communication with the iPhone, inventing the touchscreen and integrating an internet navigator. The phone became a digital hub for navigating, finding information, and building an online social identity. Before smartphones, mobile phones supplemented daily life. Now, they structure it.

In 2000, fewer than half of American adults owned a cellphone, and nearly all who did used it only sporadically. In 2024, 98% of Americans over the age of 18 reported owning a cellphone, and over 90% owned a smartphone. Without the smartphone, most people cannot fulfill their daily tasks. Many individuals now experience nomophobia: They feel anxious without a cellphone.

Around three-quarters of all stable elements are represented in the components of each smartphone. These elements are necessary for highly specialized materials that enable touchscreens, displays, batteries, speakers, microphones, and cameras. Many of these elements are essential for at least one function and have an unreliable supply chain, which makes them critical.

Critical materials and AI

Critical materials give leverage to countries that have a monopoly in mining and processing them. For example, China has gained increased power through its monopoly on rare earth elements. In April 2025, in response to U.S. tariffs, China stopped exporting rare earth magnets, which are used in cellphones. The geopolitical tensions that resulted demonstrate the power embodied in the control over critical minerals.

The mass adoption of AI technology will likely change human behavior and bring forth new technologies, industries, and infrastructure on which the U.S. economy will depend. All of these technologies will require more optimized and specialized materials and create new material dependencies. By exacerbating material dependencies, AI could affect geopolitical relations and reorganize global power.

America has rich deposits of many important minerals, but extraction of these minerals comes with challenges. Factors including slow and costly permitting, public opposition, environmental concerns, high investment costs, and an inadequate workforce all can prevent mining companies from accessing these resources. The mass adoption of AI is already adding pressure to overcome these factors and to increase responsible domestic mining.

While the path from innovation to material dependence spanned a century for cars and a couple of decades for cellphones, the rapid advancement of large language models suggests that the scale will be measured in years for AI. The heat is already on.

Peter Müllner is a distinguished professor in materials science and engineering at Boise State University. This article is republished from The Conversation under a Creative Commons license. Read the original article.

