
Why the government won’t let you see its best tool for forecasting hurricanes

News Feed
Thursday, September 26, 2024

The lack of access to this model is spurring concerns that NOAA is holding back information that could help people prepare for deadly storms.

The National Oceanic and Atmospheric Administration for four years has used a hurricane forecasting tool that often surpasses all others in its accuracy, but it won’t release its predictions to the public, spurring concerns that it is holding back information that could help people prepare for deadly storms.

The tool, known as the HCCA model, was developed by NOAA as part of a program to reduce errors in hurricane forecasts. Statistics published by NOAA’s National Hurricane Center show that from 2020 to 2023, HCCA was one of the two best models for forecasting a storm’s track and intensity. In 2022, HCCA provided the most accurate track forecasts for all lead times out to four days, even beating the Hurricane Center’s official forecast.

During Ian, the devastating Category 4 hurricane that struck Florida in late September 2022, the HCCA model produced two-day and three-day track forecasts that were more accurate than the Hurricane Center’s official forecasts. That hurricane was particularly hard to predict, and better track forecasts could have improved evacuation decisions and saved lives.

But because of agreements with a vendor, NOAA has refused to release the model’s results to the public. With a massive storm headed toward a U.S. landfall this week, critics of the agreement argue taxpayer-funded forecasts should be freely and openly available. They say the model’s forecast could be highlighted in television and online graphics as one of the more reliable scenarios given its track record.

“The HCCA is the gold standard in modern consensus modeling, and if it were available, we would show it,” Bryan Norcross, Fox Weather hurricane specialist, said in an email.

The HCCA, or Hurricane Forecast Improvement Program (HFIP) Corrected Consensus Approach model, is one of more than 25 models used by the National Hurricane Center and is often referenced in its forecast discussions. It uses a proprietary technique, obtained from the private weather risk firm now known as RenaissanceRe Risk Sciences, to blend forecasts from other hurricane forecast models.

“HCCA combines input from a number of models in a way that is weighted by their past performance,” Mark DeMaria, senior research scientist at Colorado State University and co-author of a research article describing the model, said in an email. “That allows biases from individual models to cancel each other and provide a more accurate forecast.”
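
To illustrate the general idea DeMaria describes, here is a simplified, hypothetical Python sketch of a performance-weighted consensus forecast. It is not NOAA’s proprietary HCCA technique; the model names, forecast positions, past-error figures and the inverse-error weighting scheme are all assumptions for illustration only.

```python
# Hypothetical sketch of a performance-weighted consensus track forecast.
# Not the proprietary HCCA technique -- just the general idea of weighting
# model guidance by past performance. All numbers below are made up.

# 72-hour track forecasts (latitude, longitude) from three hypothetical models
forecasts = {
    "ModelA": (26.5, -82.1),
    "ModelB": (27.0, -82.6),
    "ModelC": (26.2, -81.8),
}

# Assumed mean 72-hour track errors (nautical miles) from recent seasons
past_errors = {"ModelA": 95.0, "ModelB": 110.0, "ModelC": 130.0}

# Weight each model by the inverse of its historical error, then normalize
inverse = {model: 1.0 / err for model, err in past_errors.items()}
total = sum(inverse.values())
weights = {model: w / total for model, w in inverse.items()}

# Consensus position: weighted average of the individual model positions
lat = sum(weights[m] * forecasts[m][0] for m in forecasts)
lon = sum(weights[m] * forecasts[m][1] for m in forecasts)

print(f"Consensus 72-hour position: {lat:.2f}N, {abs(lon):.2f}W")
```

Because better-performing models receive larger weights, no single model’s systematic bias dominates the blend, which is the bias-cancelling effect DeMaria describes.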

The agreement signed in 2020 by NOAA and the company enabled the agency to collaborate with the firm but does not allow the government to provide compensation. It states HCCA forecasts are “trade secrets and confidential information” that “shall not be publicly disclosed or disseminated” for a period of five years from the effective date of the agreement. The terms of the agreement were released to The Washington Post in response to a Freedom of Information Act request.

Some worry the model’s inaccessibility sets a bad precedent for future partnerships between the government and private industry if it keeps potentially lifesaving information from the public.

Maureen O’Leary, a National Weather Service spokesperson, said the agency strives for unrestricted public access to data and models but that “we must honor legal agreements made.”

She added that NOAA is “constantly evaluating new opportunities to improve our products and services and seeks to find the appropriate balance to share that information publicly.”

A company spokesperson for RenaissanceRe said in an email that its collaboration with NOAA is “one of [its] many public-private partnerships … which encourages risk knowledge sharing so communities around the world can better protect themselves.”

Some private weather providers, however, have voiced concerns about the lack of access to the model’s forecasts.

“This HCCA model … was developed at NOAA obviously using taxpayer resources,” Jonathan Porter, senior vice president at the forecasting services company AccuWeather, said in an interview. “This is an urgent public safety issue. It’s about ensuring … that we all have access to the same critical data as the Hurricane Center to effectively understand and communicate risks to people in harm’s way.”

Baron Weather, a longtime provider of weather content to broadcast media, also supports wider access to the model.

“It would certainly be a welcome addition for all broadcast meteorologists and assist them in communicating tropical forecast information and hazards to their viewers,” Bob Dreisewerd, the company’s chief executive, said in an email to The Post.

Open data policies challenged by commercial business models

NOAA plans to start making HCCA forecasts publicly available after its five-year agreement with RenaissanceRe, previously known as WeatherPredict Consulting, expires in March. “It is our intent to publicly release real-time HCCA model output and the source code before the start of the 2025 hurricane season,” O’Leary said in an email.

Porter said AccuWeather is “delighted that NOAA … will make HCCA forecast guidance available to meteorologists across the country so that they can better understand the rationale behind the National Hurricane Center’s forecast and warnings.”

But, he argues, restricted access to the model during this and previous hurricane seasons has been “a major setback” that “goes against the basic principles of … free and open distribution of government-based data.”

“It’s setting a very precarious precedent … threatening to unravel and reverse over 50 years of progress that’s been achieved through the cooperation of the government, academic and private sectors,” Porter said. It “raises the question of what won’t be distributed next.”

U.S. weather forecasting has long been a collaborative endeavor. Historically, NOAA and its international government partners have provided the foundational sensors and systems for making forecasts while the private sector helps to widely disseminate predictions and creates specialized products and services. The 2003 National Academies’ “Fair Weather” report helped define the roles of the U.S. government, private sector and academia at a time of growing friction between the sectors due to their increasingly overlapping roles.

The report noted “the government’s obligation to make its information as widely available as possible to those who paid for it — the taxpayers,” but also recognized the challenges of government-industry partnerships and the desire for policy “that permits commercial objectives to be achieved.”

The lines between the U.S. weather sectors have become even more blurred in recent years as the private sector has built up capabilities that were once exclusively undertaken by governments. NOAA now buys commercial satellite, aircraft and ground data and is collaborating with private companies that have recently built powerful AI weather models.

“The weather enterprise has become a lot more complicated in the past decade, with more observations and modeling being done by private-sector entities,” Keith Seitter, executive director emeritus at the American Meteorological Society, said in an email. “This has challenged the historical approach having all the data being openly and freely distributed … because the private-sector producers often need to protect their intellectual property as part of their business model.”

Seitter and Mary Glackin, former deputy undersecretary for operations at NOAA, are among those leading an American Meteorological Society study looking at the state of the weather enterprise two decades after the “Fair Weather” report. Glackin, now chair of the National Academies’ Board on Atmospheric Sciences and Climate, said policy around commercial weather data and technology acquisition presents a growing challenge.

“Plans must be a balance of public good and costs while also considering maintaining a vibrant U.S. private sector,” Glackin said in an email. “I suspect each opportunity will need to be weighed independently — at least until we have more experience.”

New policy guidance published in July by NOAA addresses the challenge of balancing public and commercial interests, stating that its “programs and offices should seek to maximize the public benefit derived from environmental data and data products obtained through commercial solutions by negotiating the least restrictive terms of use possible.”

Andrew Rosenberg, a former NOAA official and a senior fellow at the University of New Hampshire’s Carsey School of Public Policy, said the confidentiality requirements that come along with NOAA’s commercial partnerships can sometimes be too broad, at the expense of transparency that is designed to instill trust in its work.

That is especially concerning when it comes to weather forecasts that are meant to serve public health and safety, Rosenberg added. “I do think it’s problematic,” he said. “That isn’t really the way you want to serve the public interest.”


Only three people prosecuted for covering up illegal sewage spills

Employees of water firms who obstruct investigations into spills could face jail, as new rules come into force on Friday.

Water company bosses have entirely escaped punishment for covering up illegal sewage spills, government figures show, as ministers prepare to bring in a new law threatening them with up to two years in prison for doing so.

Only three people have ever been prosecuted for obstructing the Environment Agency in its investigations into sewage spills, officials said, with none of them receiving even a fine.

Officials said the data shows why the water regulator has found it so difficult to stop illegal spills, which happen when companies dump raw sewage during dry weather. The Environment Agency has identified hundreds of such cases since 2020.

Steve Reed, the environment secretary, said: “Bosses must face consequences if they commit crimes – there must be accountability. From today, there will be no more hiding places.

“Water companies must now focus on cleaning up our rivers, lakes and seas for good.”

Water companies dumped a record amount of sewage into rivers and coastal waters last year, mostly because wet weather threatened to wash sewage back into people’s homes.

Data released last month by the Environment Agency revealed companies had discharged untreated effluent for nearly 4m hours during 2024, a slight increase on the previous year.

But companies have also illegally dumped sewage during dry weather. Data released to the Telegraph last year under freedom of information rules shows regulators had identified 465 illegal sewage spills since 2020, with a further 154 under investigation as potentially illegal spills.

Britain’s polluted waterways became a major issue at last year’s election, with Labour promising to end what it called the “Tory sewage scandal”.

Government sources say one reason illegal spills have been allowed to continue is that regulators have faced obstruction when investigating them.

In 2019, three employees at Southern Water were convicted of hampering the Environment Agency when it was trying to collect data as part of an investigation into raw sewage spilled into rivers and on beaches in south-east England.

The maximum punishment available in that case was a fine, but none of the individuals were fined. Several of the employees said at the time they were told by the company solicitor not to give data to the regulator.

Two years later, Southern was given a £90m fine after pleading guilty to thousands of illegal discharges of sewage over a five-year period.

New rules coming into force on Friday will give legal agencies the power to bring prosecutions in the crown court against employees for obstructing regulatory investigations, with a maximum sanction of imprisonment.

Directors and executives can be prosecuted if they have consented to or connived with that obstruction, or allowed it to happen through neglect.

The rules were included in the Water (Special Measures) Act, which came into law in February. The act also gives the regulator new powers to ban bonuses if environmental standards are not met and requires companies to install real-time monitors at every emergency sewage outlet.

Philip Duffy, the chief executive of the Environment Agency, said: “The act was a crucial step in making sure water companies take full responsibility for their impact on the environment.

“The tougher powers we have gained through this legislation will allow us, as the regulator, to close the justice gap, deliver swifter enforcement action and ultimately deter illegal activity.

“Alongside this, we’re modernising and expanding our approach to water company inspections – and it’s working. More people, powers, better data and inspections are yielding vital evidence so that we can reduce sewage pollution, hold water companies to account and protect the environment.”

Indians Battle Respiratory Issues, Skin Rashes in World's Most Polluted Town

By Tora Agarwala

BYRNIHAT, India (Reuters) - Two-year-old Sumaiya Ansari, a resident of India's Byrnihat town, which is ranked the world's most polluted metropolitan area by Swiss group IQAir, was battling breathing problems for several days before she was hospitalised in March and given oxygen support.

She is among many residents of the industrial town on the border of the northeastern Assam and Meghalaya states - otherwise known for their lush, natural beauty - afflicted by illnesses that doctors say are likely linked to high exposure to pollution.

Byrnihat's annual average PM2.5 concentration in 2024 was 128.2 micrograms per cubic meter, according to IQAir, more than 25 times the WHO's recommended annual guideline of 5 micrograms per cubic meter.

PM2.5 refers to particulate matter measuring 2.5 microns or less in diameter that can be carried into the lungs, causing deadly diseases and cardiac problems.

"It was very scary, she was breathing like a fish," said Abdul Halim, Ansari's father, who brought her home from hospital after two days.

According to government data, the number of respiratory infection cases in the region rose to 3,681 in 2024 from 2,082 in 2022.

"Ninety percent of the patients we see daily come either with a cough or other respiratory issues," said Dr. J Marak of Byrnihat Primary Healthcare Centre.

Residents say the toxic air also causes skin rashes and eye irritation, damages crops, and restricts routine tasks like drying laundry outdoors.

"Everything is covered with dust or soot," said farmer Dildar Hussain.

Critics say Byrnihat's situation reflects a broader trend of pollution plaguing not just India's cities, including the capital Delhi, but also its smaller towns as breakneck industrialisation erodes environmental safeguards.

Unlike other parts of the country that face pollution every winter, however, Byrnihat's air quality remains poor through the year, government data indicates.

The town is home to about 80 industries, many of them highly polluting, and experts say the problem is exacerbated by other factors such as emissions from heavy vehicles and Byrnihat's "bowl-shaped topography".

"Sandwiched between the hilly terrain of Meghalaya and the plains of Assam, there is no room for pollutants to disperse," said Arup Kumar Misra, chairman of Assam's pollution control board.

The town's location has also made a solution tougher, with the states shifting blame to each other, said a Meghalaya government official who did not want to be named.

Since the release of IQAir's report in March, however, Assam and Meghalaya have agreed to form a joint committee and work together to combat Byrnihat's pollution.

(Reporting by Tora Agarwala; Writing by Sakshi Dayal; Editing by Raju Gopalakrishnan)

Copyright 2025 Thomson Reuters.

UK government report calls for taskforce to save England’s historic trees

Exclusive: Ancient oaks ‘as precious as stately homes’ could receive stronger legal safeguards under new proposals.

Ancient and culturally important trees in England could be given legal protections under plans in a UK government-commissioned report.

Sentencing guidelines would be changed under the plans so those who destroy important trees would face tougher criminal penalties. Additionally, a database of such trees would be drawn up, and they could be given automatic protections, with the current system of tree preservation orders strengthened to accommodate this.

There was an outpouring of anger this week after it was revealed that a 500-year-old oak tree in Enfield, north London, was sliced almost down to the stumps. It later emerged it had no specific legal protections, as most ancient and culturally important trees do not.

After the Sycamore Gap tree was felled in 2023, the Department for Environment, Food and Rural Affairs asked the Tree Council and Forest Research to examine current protections for important trees and to see if they needed to be strengthened. The trial of two men accused of felling the Sycamore Gap tree is due to take place later this month at Newcastle crown court.

The report, seen by the Guardian, found there is no current definition of important trees, and that some of the UK’s most culturally important trees have no protection whatsoever. The researchers have called on ministers to create a taskforce within the next 12 months to clearly define “important trees” and swiftly prepare an action plan to save them.

Defra sources said ministers were evaluating the findings of the report.

Jon Stokes, the director of trees, science and research at the Tree Council, said: “Ancient oaks can live up to 1,000 years old and are as precious as our stately homes and castles. Our nation’s green heritage should be valued and protected and we will do everything we can to achieve this.”

Currently, the main protection for trees is a tree preservation order (TPO), which is granted by local councils. Failing to obtain the necessary consent and carrying out unauthorised works on a tree with a TPO can lead to a fine of up to £20,000.

The Woodland Trust has called for similar protections, proposing the introduction of a list of nationally important heritage trees and a heritage TPO that could be used to promote the protection and conservation of the country’s oldest and most important trees. The charity is using citizen science to create a database of ancient trees.

The report’s authors defined “important trees” as shorthand for “trees of high social, cultural, and environmental value”. This includes ancient trees, which have reached a great age in comparison with others of the same species, and notable trees connected with specific historic events or people or that serve as well-known landmarks. It could also include “champion trees”, which are the largest individuals of their species in a specific geographical area, and notable trees that are significant at a local scale for their size or that have other special features.

Richard Benwell, the CEO of the environmental group Wildlife and Countryside Link, said: “Ancient trees are living monuments. They are bastions for nature in an increasingly hostile world and home to a spectacular richness of wildlife. We cannot afford to keep losing these living legends if we want to see nature thrive for future generations. The government should use the planning and infrastructure bill to deliver strict protection for ancient woodlands, veteran trees, and other irreplaceable habitats.”

Felled ancient trees

In 2020, the 300-year-old Hunningham Oak near Leamington was felled to make way for infrastructure projects.

In 2021, the Happy Man tree in Hackney, which the previous year had won the Woodland Trust’s tree of the year contest, was felled to make way for housing development.

In 2022, a 600-year-old oak was felled in Bretton, Peterborough, which reportedly caused structural damage to nearby property.

In 2023, 16 ancient lime trees on The Walks in Wellingborough, Northamptonshire, were felled to make way for a dual carriageway.

L.A. will set aside $3 million to help owners of fire-damaged homes test their soil for lead

The L.A. County Board of Supervisors approved a proposal to allocate $3 million to help owners of fire-damaged homes test their soil for lead.

The Los Angeles County Board of Supervisors will allocate $3 million to help homeowners near the Eaton burn area test for lead contamination, after preliminary tests found elevated levels of the heavy metal at homes left standing after the fire.

Supervisors Kathryn Barger and Lindsey Horvath proposed the motion after preliminary test results released last week by the Los Angeles County Department of Public Health showed lead levels above state health standards in as many as 80% of soil samples collected downwind of the Eaton burn scar.

On Tuesday, the board voted 4-0 to direct $3 million from the $134-million settlement the county reached with lead-paint manufacturers in 2018 to test residential properties that are both downwind and within one mile of the Eaton burn scar boundary.

Lead is a heavy metal linked to serious health problems including damage to the brain and nervous system, as well as digestive, reproductive and cardiovascular issues, according to the Environmental Protection Agency.

Roux Associates, a private testing firm hired by the county, collected samples from 780 properties in both burn zones over four weeks from mid-February to mid-March. It tested for 14 toxic substances commonly found after wildfires: heavy metals such as arsenic and lead; polyaromatic hydrocarbons such as anthracene and naphthalene; and dioxins.

More than one-third of samples collected within the Eaton burn scar exceeded California’s health standard of 80 milligrams of lead per kilogram of soil, Roux found. Nearly half of samples just outside the burn scar’s boundary had lead levels above the state limit. And downwind of the fire’s boundary, to the southwest, between 70% and 80% of samples surpassed that limit.

In the Palisades burn area, tests found little contamination beyond some isolated “hot spots” of heavy metals and polyaromatic hydrocarbons, Roux’s vice president and principal scientist Adam Love said last week.

Nichole Quick, chief medical advisor with the L.A. County Department of Public Health, said at the time that officials would be requesting federal and state help to further assess the Palisades hot spots, and working with the county on targeted lead testing in affected areas downwind of the Eaton fire.

The county is for now shouldering the responsibility of contaminant testing because, as The Times has reported, the federal government has opted to break from a nearly two-decade tradition of testing soil on destroyed properties cleaned by the U.S. Army Corps of Engineers after fires.

After previous wildfires, the Army Corps would first scrape 6 inches of topsoil from cleared properties and then test the ground underneath. If those tests revealed toxic substances still on the property, it would scrape further.

After the devastating Camp fire in Paradise in 2018, soil testing of 12,500 properties revealed that nearly one-third still contained dangerous levels of contaminants even after the first 6 inches of topsoil were scraped by federal crews.

L.A. County ordered testing from Roux in lieu of that federal testing. So far, the county has announced results only from standing homes, which are not eligible for cleanup by the Army Corps of Engineers; results from land parcels with damaged or destroyed structures are still pending.

FEMA’s decision to skip testing after L.A.’s firestorms has frustrated many residents and officials, with some calling for the federal agency to reconsider.

“Without adequate soil testing, contaminants caused by the fire can remain undetected, posing risks to returning residents, construction workers, and the environment,” the state’s Office of Emergency Services director Nancy Ward wrote in a February letter to FEMA. “Failing to identify and remediate these fire-related contaminants may expose individuals to residual substances during rebuilding efforts and potentially jeopardize groundwater and surface water quality.”
