
GoGreenNation News

Learn more about the issues presented in our films

The Dismantling of the US Forest Service Is Imminent

This story was originally published by High Country News and is reproduced here as part of the Climate Desk collaboration.

In the 1880s, giant cattle companies turned thousands of cattle out to graze on the “public domain”—i.e., the Western lands that had been stolen from Indigenous people and then opened up for white settlement. In remote southeastern Utah, this coincided with a wave of settlement by members of the Church of Jesus Christ of Latter-day Saints. The region’s once-abundant grasslands and lush mountain slopes were soon reduced to denuded wastelands etched with deep flash-flood-prone gullies. Cattlemen fought, sometimes violently, over water and range. The local citizenry grew sick and tired of it, sometimes literally: At one point, sheep feces contaminated the water supply of the town of Monticello and led to a typhoid outbreak that killed 11 people. Yet there was little they could do, since there were few rules on the public domain and fewer folks with the power to enforce them.

That changed in 1891, when Congress passed the Forest Reserve Act, which authorized the president to place some unregulated tracts under “judicious control,” thereby mildly restraining extractive activities in the name of conservation. In 1905, the Forest Service was created as a branch of the US Agriculture Department to oversee these reserves, and Gifford Pinchot was chosen to lead it. A year later, the citizens of southeastern Utah successfully petitioned the Theodore Roosevelt administration to establish forest reserves in the La Sal and Abajo Mountains.

Since then, the Forest Service has gone through various metamorphoses, shifting from stewarding and conserving forests for the future to supplying the growing nation with lumber to managing forests for multiple uses and then to the ecosystem management era, which began in the 1990s.
Throughout all these shifts, however, it has largely stayed true to Pinchot and his desire to conserve forests and their resources for future generations. But now, the Trump administration is eager to begin a new era for the agency and its public lands, with a distinctively un-Pinchot-esque structure and a mission that maximizes resource production and extraction while dismantling the administrative state and its role as environmental protector.

Over the last nine months, the administration has issued executive orders calling for expanded timber production and rescinding the 2001 Roadless Rule, declared “emergency” situations that enable it to bypass regulations on nearly 60 percent of the public’s forests, and proposed slashing the agency’s operations budget by 34 percent.

The most recent move, which is currently open to public comment, involves a proposal by Agriculture Secretary Brooke Rollins to radically overhaul the entire US Department of Agriculture. Its stated purposes are to ensure that the agency’s “workforce aligns with financial resources and priorities,” and to consolidate functions and eliminate redundancy. This will include moving at least 2,600 of the department’s 4,600 Washington, DC, employees to five hub locations, with only two in the West: Salt Lake City, Utah, and Fort Collins, Colorado. (The others will be in North Carolina, Missouri, and Indiana.) The goal, according to Rollins’ memorandum, is to “bring the USDA closer to its customers.”

The plan is reminiscent of Trump’s first-term relocation of the Bureau of Land Management’s headquarters to Grand Junction, Colorado, in 2019. That relocation resulted in a de facto agency housecleaning; many senior staffers chose to resign or move to other agencies, and only a handful of workers ended up in the Colorado office, which shared a building with oil and gas companies.
Using an emergency declaration, Trump’s timber production executive order would ease environmental protections so as to greatly expand logging in the national forests. Though Rollins’ proposal is aimed at decentralizing the department, it would effectively re-centralize the Forest Service by eliminating its nine regional offices, six of which are located in the West. Each regional forester oversees dozens of national forests within their region, providing budget oversight, guiding place-specific implementation of national policies, and facilitating coordination among the various forests.

Rollins’ memo does not explain why the regional offices are being axed, or what will happen to the regional foresters’ positions and their functions, or how the change will affect the agency’s chain of command. When several US senators asked Deputy Secretary Stephen Vaden for more specifics, he responded that “decisions pertaining to the agency’s structure and the location of specialized personnel will be made after” the public comment period ends on September 30. Curiously, the administration’s forest management strategy, published in May, relies on regional offices to “work with the Washington Office to develop tailored strategies to meet their specific timber goals.” Now it’s unclear that either the regional or Washington offices will remain in existence long enough to carry this out.

The administration has been far more transparent about its desire to return the Forest Service to its timber plantation era, which ran from the 1950s through the ’80s. During that time, logging companies harvested 10 billion to 12 billion board-feet per year from federal forests, while for the last 25 years, the annual number has hovered below 3 billion board-feet. Now, Trump, via his Immediate Expansion of American Timber Production order, plans to crank up the annual cut to 4 billion board-feet by 2028.
This will be accomplished—in classic Trumpian fashion—by declaring an “emergency” on national forest lands that will allow environmental protections and regulations, including the National Environmental Policy Act, Endangered Species Act and Clean Water Act, to be eased or bypassed. In April, Rollins issued a memorandum doing just that, declaring that the threat of wildfires, insects and disease, invasive species, overgrown forests, the growing number of homes in the wildland-urban interface, and more than a century of rigorous fire suppression have contributed to what is now “a full-blown wildfire and forest health crisis.”

Emergency determinations aren’t limited to Trump and friends; in 2023, the Biden administration identified almost 67 million acres of national forest lands as being under a high or very high fire risk, thus qualifying as an “emergency situation” under the Infrastructure Investment and Jobs Act. Rollins, however, vastly expanded the “emergency situation” acreage to almost 113 million acres, or 59 percent of all Forest Service lands. This allows the agency to use streamlined environmental reviews and “expedited” tribal consultation time frames to “carry out authorized emergency actions,” ranging from commercial harvesting of damaged trees to removing “hazardous fuels” to reconstructing existing utility lines.

Meanwhile, the administration has announced plans to consolidate all federal wildfire fighting duties under the Interior Department. This would completely zero out the Forest Service’s $2.4 billion wildland fire management budget, sowing even more confusion and chaos. The administration also plans to slash staff and budgets in other parts of the agency, further compromising its ability to carry out its mission. The so-called Department of Government Efficiency fired about 3,400 Forest Service employees, or more than 10 percent of the agency’s total workforce, earlier this year.
And the administration has proposed cutting the agency’s operations budget, which includes salaries, by 34 percent in fiscal 2026, which will most likely necessitate further reductions in force. It would also cut the national forest system and capital improvement and maintenance budgets by 21 percent and 48 percent respectively. The goal, it seems, is to cripple the agency with both direct and indirect blows. The result, if the administration succeeds, will be a diminished Forest Service that would be unrecognizable to Gifford Pinchot.

4 vaccines that are linked to a lower risk of dementia

Some vaccine-preventable diseases are linked to accelerated brain atrophy and increased dementia risk years down the line.

Vaccines don’t just protect us from infectious diseases or lessen their effects. Some are also associated with a reduced risk for dementia, research shows.

“They’ll protect against these really potentially severe infections, especially in older adults, and preventing that alone is huge,” said Avram Bukhbinder, a resident physician at Massachusetts General Hospital in Boston who has conducted research on vaccines and dementia risk. “There seems to also be some kind of added benefit and ultimately it just adds a more compelling reason” to get routine vaccines, he said.

Studies have found that many vaccines may be associated with a reduced risk of dementia — here are four of the most common ones with the strongest links.

The flu shot

An estimated 47 million to 82 million people in the United States — about 13 to 24 percent of all people — caught influenza, or the flu, during the 2024-2025 season, with 27,000 to 130,000 Americans dying as a result, according to preliminary data from the Centers for Disease Control and Prevention. (Flu season generally runs from October to May in North America.)

Influenza and pneumonia — a potential complication of flu — are associated with five neurodegenerative diseases, including dementia and Parkinson’s disease, according to a 2023 study analyzing biobank data from over 400,000 people.

“I don’t know how many times in the adult world we hear, ‘My loved one got flu, was in the hospital for a week or two, and it just was never the same.’ Like quickly went downhill from there,” Bukhbinder said.

Many studies have found that flu vaccination is associated with a lower risk of dementia years later. In a 2022 study, Bukhbinder and his colleagues at the University of Texas Health Science Center at Houston examined a large health database of over 1.8 million adults ages 65 and over.
They found that those who received at least one flu vaccine were 40 percent less likely to develop Alzheimer’s — the most common form of dementia — during the next four years. Getting the flu vaccine was also associated with a 17 percent reduction in dementia risk in a 2024 study of over 70,000 participants.

The CDC recommends all people over 6 months old get annual flu shots, typically in September or October. Fewer than half of Americans typically get their flu vaccine each season.

The shingles vaccine

The shingles vaccine has the strongest evidence for reducing the risk of dementia, with multiple large-scale studies in the past two years corroborating the results of older studies. In one 2025 study, researchers tracked more than 280,000 adults in Wales and found that the shingles vaccine was linked with reducing dementia risk by 20 percent over a seven-year period.

“There may be potential additional benefits beyond the protection that the vaccine provides for a particular condition,” said Pascal Geldsetzer, an assistant professor of medicine at the Stanford University School of Medicine and the senior author of the study. “So, that’s only an additional reason to get vaccinated.”

A subsequent study examining over 100,000 patients in Australia similarly found that getting vaccinated for shingles was associated with reduced dementia risk.

If you are eligible, you should probably get a shingles vaccine regardless of its chances of reducing your dementia risk. The vaccine reduces the reactivation of the varicella-zoster virus, which causes chicken pox in childhood and remains dormant in nerve cells afterward.
When reactivated in adulthood, the virus manifests as shingles, which is characterized by a burning, painful rash and can sometimes cause lifelong chronic pain conditions or serious complications in a subset of people who get it. The CDC recommends two doses of a shingles vaccine for adults 50 and older or those 19 and older with a weakened immune system; 36 percent of eligible Americans got vaccinated in 2022.

The RSV vaccine

Respiratory syncytial virus, or RSV, is a common respiratory virus that can cause mild, cold-like symptoms in most people, but may cause severe infections in children as well as adults ages 65 and older. (The virus is the leading cause of hospitalization among American infants and causes an estimated 100 to 300 deaths in children under 5, and 6,000 to 10,000 deaths in people 65 or older, every year in the U.S.)

A recent study tracking over 430,000 people found that the RSV vaccine (as well as the shingles vaccine) was associated with a reduced risk of dementia over 18 months compared with those who received the flu vaccine. The CDC recommends all adults ages 75 and older, as well as adults older than 50 at higher risk of RSV, get the vaccine.

The Tdap vaccine

Several studies have reported that the vaccine against tetanus, diphtheria and pertussis (or whooping cough), or Tdap, is associated with a reduced risk of dementia. One 2021 study with over 200,000 patients reported that older adults who received both the shingles and Tdap vaccines had further reduced risk of dementia compared with those who only received one of the vaccines. The CDC recommends routine Tdap vaccination for all adolescents and a booster for adults every 10 years.
In 2022, about 30 percent of adults ages 19-64 who could be assessed had received a Tdap vaccine.

How vaccines may reduce dementia risk

Research has shown that severe infections, including flu, herpes and respiratory tract infections, are linked to accelerated brain atrophy and increased risk of dementia years down the line.

“We think it’s the uncontrolled kind of systemic inflammation that’s probably contributing to that,” Bukhbinder said. “And it’s very likely that they had the underlying Alzheimer’s or other dementia pathology already, but the inflammation is what pushed them over the edge.”

Geldsetzer said that the varicella-zoster virus, which causes shingles, has the clearest biological links because it hibernates in our nervous system and can more directly affect the brain. (Getting a chicken pox vaccine in childhood can prevent this virus from taking hold in the first place.)

Though different vaccines are linked to reduced dementia risk, there are inherent limitations to how the research was conducted. The link is associational, not causal, because the people who get vaccines may be different from those who don’t. For example, it could be that “those who are on average more health-motivated, have better health behaviors, are the ones who decide to get vaccinated,” Geldsetzer said. Even though researchers try to account for these confounding variables, it is not possible to fully filter out differences in health behaviors associated with dementia risk.

But recent studies hint at a stronger link between the shingles vaccine and dementia-risk reduction. This research takes advantage of “natural experiments” created by the arbitrary dates that the governments of Wales and Australia set for shingles vaccine eligibility; those born immediately before and after the eligibility date are probably not different and can be more directly compared.
And when they are, those who got the shingles vaccine had a lower risk of dementia, said Geldsetzer, who was an author on the Wales and Australia studies and is raising money to fund a randomized controlled trial.

There are two broad biological hypotheses for how vaccines are linked to reduced dementia risk. Vaccines could reduce the risk of getting sick and the severity of infections, both of which have been linked to increased dementia risk. “I feel confident that that’s part of the story, but it’s not the whole story,” Bukhbinder said.

Another, not mutually exclusive possibility is that the vaccine itself may activate the immune system in a beneficial way. Vaccination “may be honing or refining the immune system’s response,” Bukhbinder said. There’s “good evidence that what happens outside of the brain … seems to actually affect the inside pretty robustly,” he said.

How to keep up to date on vaccines and reduce dementia risk

Vaccinations, like all medical treatments, can have some risks and side effects, so it is important to speak with your doctor about your particular health needs. However, “I would say by and far the benefits of getting these vaccinations almost incomparably outweigh the risks,” Bukhbinder said.

In addition, 45 percent of dementia cases could be delayed or prevented with lifestyle and environmental changes, according to the 2024 Lancet Commission report on dementia.

Do you have a question about human behavior or neuroscience? Email BrainMatters@washpost.com and we may answer it in a future column.

Two climate scientists on how to use emotion in the climate crisis

From anger to hope, Kate Marvel and Tim Lenton explain how to tackle the tricky feelings aroused by climate change and harness them to take action

With emissions still rising, how do we feel hope for the future? (Image: Qilai Shen/Bloomberg via Getty Images)

With dire environmental warnings and extreme weather events in the news almost every day, it can be tempting to simply avoid thinking about the climate crisis. But how do climate scientists, who must grapple with the harsh reality of our changing planet every day, cope? What can they teach us about processing the powerful emotions provoked by escalating climate change? And are there ways we can use these feelings to our advantage?

New Scientist recently sat down with New York-based climate scientist Kate Marvel and Tim Lenton, a climate scientist at the University of Exeter, UK. Both have spent years modelling how our planet may react to increasing greenhouse gas concentrations in the atmosphere, and both have recently published books that distil their perspectives on how best to engage with, and tackle, the climate emergency.

At first glance, these are two quite different books. Human Nature, by Marvel, is a series of essays exploring the science of climate change, each centred on a different emotional response to the crisis. By contrast, Lenton’s book, Positive Tipping Points, prioritises taking action over introspection. It makes a persuasive case that a radical, systemic shift to a cleaner world is possible with the right social, economic and technological interventions. At their heart, though, both books are about how to embrace our emotions around climate change so we can reframe our thinking and actions. In this conversation, Lenton and Marvel reveal why we should feel angry, fearful, proud and hopeful all at once about our future on Earth.

Rowan Hooper: Kate, your book is about nine ways to feel about our changing planet. Can we start with anger?

Kate Marvel: The anger chapter was one of the easiest ones to write. What I wanted to talk about was the history of how we discovered climate change was happening.
The thing that makes me really angry is that the history of scientists finding stuff out is intertwined with the history of people lying about it. I tell this story of a research group. They’re trying to establish that most of the excess carbon dioxide in the atmosphere comes from fossil fuels, and they design these really creative experiments to prove that. They have a large ship that’s going around, taking measurements of the ocean. And eventually they develop a climate model that has made extremely accurate projections in retrospect. You know who did all of that? It was Exxon. That does make me very angry. The fact that they knew.

RH: Can anger be motivating?

KM: I hope so. It can be really easy to go down a bad path where all you are is angry. Social media definitely incentivises this, where you’re fed more and more outrage, but it’s not productive outrage.

RH: Your book also covers wonder, guilt, fear, grief, surprise, pride, hope and love. Can you talk us through how you processed these emotions?

KM: What I wanted to do is embrace the fact there is no one way to feel about climate change. I was getting really frustrated when I was reading things that were designed to elicit a single emotion. Either, just be afraid, or just be angry, or just be hopeful. That didn’t feel very useful to me. I wanted to acknowledge that if you live on planet Earth, you have a conflict of interest. You care about what happens to this place. Because everybody that you know lives here.

Tim Lenton studies “tipping points” in ecosystems that could affect the wider climate (Image: University of Exeter)

RH: Tim, how do you find dealing with the emotions that come with studying climate change?

Tim Lenton: I’ve been studying climate tipping points that could be really bad, really nasty. And arguably some of them are starting to unfold. I mean, we’re losing tropical coral reefs that up to half a billion people in the world depend on for their livelihoods.
I’ve been staring this stuff down for nearly 20 years. So, I just found I had to use the mental toolkit I had of understanding complex systems to try to see if I could find plausible grounds for hope. Could we build a credible case that we could accelerate the change we need to get out of trouble? It took doing the research on the book to see that there was evidence that this is possible, and I wasn’t just going to delude myself with naive hope.

RH: So it’s rational, usable hope?

TL: It’s conditional optimism. I’m optimistic on the basis that some people are going to read the book, and some fraction of them will join me on the same journey. History teaches us that it only needs a fraction of people to change to ultimately tip everyone to change.

Madeleine Cuff: Tim, much of your career has focused on this idea of tipping points. For those who are new to the concept, what are they?

TL: Tipping points are those moments where a small change makes a big difference to the state or the fate of some system. For the bad ones in the climate, we know that there are large parts of the Earth system – major ice sheets, aspects of the ocean circulation, big bits of the biosphere – that have what we call alternative stable states. And they can be tipped from one state into another. We could potentially tip the Amazon rainforest into a different degraded forest or savannah state, for example.

MC: What is a positive tipping point?

TL: I’m drawing on over half a century of scholarship in different fields that shows you can have tipping points in social change. We’re all familiar with the idea of political revolutions popping up and protests popping up seemingly out of nowhere and exploding in size. But history also teaches us that sometimes you get abrupt and hard-to-reverse changes in technology. There are tipping points where one new technology will take over from an existing one.

RH: The obvious climate example I’m thinking of is electric vehicles.
And, of course, solar is so cheap now that it’s really taking off. How do we bring about positive tipping points?

TL: We have to think about what actions can bring forward the positive tipping points, accepting that we need to be going more than five times faster than we are at decarbonising the economy. Luckily, each of us has agency to do something about this. At the most basic level, maybe we can be an adopter of new behaviour, such as eating less meat, or adopting a new technology like EVs or solar panels. We’ve probably also got a pension fund, and we should be asking hard questions about where that’s invested. The story of positive tipping points that have already happened starts with social activists or innovators. The people who have a passion to develop the core new technology, or activists who want to create change and see that possibility before everybody else.

In her research, Kate Marvel tries to better model our planet’s changing climate (Image: Roy Rochlin/Getty Images)

MC: Kate, we’ve talked a little bit about the negative emotions that come with thinking about climate change. But what about the impact of positive emotions? What role can they play in inspiring positive action?

KM: I started the book with the emotion wonder because, when you take a step back, just thinking about this planet that we live on and the fact that we understand it at all, that’s incredible. It’s a really useful tool for making connections and starting conversations. A lot of times, when I tell people I’m a climate scientist, they assume I’m immediately going to start scolding them. But if you start out with wonder, if you start out a conversation with: “Did you know the Earth’s water is probably older than the Earth itself?” people are going to say: “Oh wait, that’s amazing.” And they are going to be more likely to talk to you. Embracing a wide spectrum of emotions is useful as a communications strategy.
There is support for feeling these emotions in the scientific and social scientific literature. There is a sense of pride we can feel in doing the hard work. There is deep satisfaction in making change. The social science literature also says that love is probably the most powerful motivating factor in climate action. People are motivated to act because they love their communities, their families, their children. We know how powerful that emotion is.

I have a whole chapter on hope, even though I have a very complicated relationship to hope. I feel like when people always ask me: “Do you hope we can solve climate change?” that, for me, is like asking, do you hope you can clean your bathroom? That’s a silly question. You know what to do, just go clean your bathroom. As Tim says, we have so many of the solutions we need. We are on these trajectories already. We just need to push them over the precipice. We need to get past that social tipping point.

RH: We have to face up to these emotions, don’t we? Maybe that’s one reason why we haven’t really got to grips with the problem – it’s too big for us to face.

KM: Totally. I think about this stuff all day every day, and I still don’t really understand it. I can’t fit it into my head. This is a problem that is caused by basically every industrial human activity. And because CO2 and other greenhouse gases are well mixed in the atmosphere, it is affecting literally every aspect of life on this planet. Trying to boil that down to something very glib and manageable is just not possible. It is the work of a lifetime, or many lifetimes, to really come to terms with what this is and what this means, and what we do about it.

Most Americans are concerned about climate change and want the US government to do something. But when you look at the polls, most Americans think other Americans do not think that. So that, I think, is why one of the most powerful things that an individual can do regarding climate change is to talk about it.
Because when you talk about it, you realise, maybe I’m not so much of an individual after all. Maybe I’m not alone.

RH: What do you want people to do after reading your books?

KM: I would like people to think about how to tell climate stories that resonate with themselves, with their own community, with the people who will listen to them because of who they are and what they bring to the table.

TL: I’m hoping the readers are feeling empowered to act, in what might beforehand have felt like a very scary, disempowering situation. I’d like them instead to feel a sense of agency.

This is an edited version of an interview that originally took place on New Scientist’s The World, the Universe and Us podcast.

Louisiana's $3B Power Upgrade for Meta Project Raises Questions About Who Should Foot the Bill

Meta is racing to construct its largest data center yet: a $10 billion facility in northeast Louisiana, as big as 70 football fields and requiring more than twice the electricity of New Orleans.

HOLLY RIDGE, La. (AP) — In a rural corner of Louisiana, Meta is building one of the world's largest data centers, a $10 billion behemoth as big as 70 football fields that will consume more power in a day than the entire city of New Orleans at the peak of summer.

While the colossal project is impossible to miss in Richland Parish, a farming community of 20,000 residents, not everything is visible, including how much the social media giant will pay toward the more than $3 billion in new electricity infrastructure needed to power the facility. Watchdogs have warned that in the rush to capitalize on the AI-driven data center boom, some states are allowing massive tech companies to direct expensive infrastructure projects with limited oversight.

Mississippi lawmakers allowed Amazon to bypass regulatory approval for energy infrastructure to serve two data centers it is spending $10 billion to build. In Indiana, a utility is proposing a data center-focused subsidiary that operates outside normal state regulations. And while Louisiana says it has added consumer safeguards, it lags behind other states in its efforts to insulate regular power consumers from data center-related costs.

Mandy DeRoche, an attorney for the environmental advocacy group Earthjustice, says there is less transparency due to confidentiality agreements and rushed approvals. “You can’t follow the facts, you can’t follow the benefits or the negative impacts that could come to the service area or to the community,” DeRoche said.

Private deals for public power supply

Under contract with Meta, power company Entergy agreed to build three gas-powered plants that would produce 2,262 megawatts — equivalent to a fifth of Entergy's current power supply in Louisiana.
The Public Service Commission approved Meta’s infrastructure plan in August after Entergy agreed to bolster protections to prevent a spike in residential rates. Nonetheless, nondisclosure agreements conceal how much Meta will pay.

Consumer advocates tried but failed to compel Meta to provide sworn testimony, submit to discovery and face cross-examination during a regulatory review. Regulators reviewed Meta’s contract with Entergy, but were barred from revealing details. Meta did not address AP’s questions about transparency, while Louisiana's economic development agency and Entergy say nondisclosure agreements are standard to protect sensitive commercial data.

Davante Lewis — the only one of five public service commissioners to vote against the plan — said he's still unclear how much electricity the center will use, whether gas-powered plants are the most economical option, or whether the project will create the promised 500 jobs. “There’s certain information we should know and need to know but don’t have,” Lewis said.

Additionally, Meta is exempt from paying sales tax under a 2024 Louisiana law that the state acknowledges could lead to “tens of millions of dollars or more each year” in lost revenue. Meta has agreed to fund about half the cost of building the power plants over 15 years, including cost overruns, but not maintenance and operation, said Logan Burke, executive director of the Alliance for Affordable Energy, a consumer advocacy group.
Public Service Commissioner Jean-Paul Coussan insists there will be “very little” impact on ratepayers. But watchdogs warn Meta could pull out of or not renew its contract, leaving the public to pay for the power plants over the rest of their 30-year life span, and all grid users are expected to help pay for the $550 million transmission line serving Meta’s facility. Ari Peskoe, director of Harvard University’s Electricity Law Initiative, said tech companies should be required to pay “every penny so the public is not left holding the bag.”

How is this tackled in other states?

Elsewhere, tech companies are not being given such leeway. More than a dozen states have taken steps to protect household and business ratepayers from paying for rising electricity costs tied to energy-hungry data centers. Pennsylvania’s utilities commission is drafting a model rate structure to insulate customers from rising costs related to data centers. New Jersey’s utilities regulators are studying whether data centers cause “unreasonable” cost increases for other users. Oregon passed legislation this year ordering utilities regulators to develop new, and likely higher, power rates for data centers.

Locals have mixed feelings

Some Richland Parish residents fear a boom-and-bust cycle once construction ends. Others expect a boost in school and health care funding. Meta said it plans to invest in 1,500 megawatts of renewable energy in Louisiana and $200 million in water and road infrastructure in Richland Parish.

“We don’t come from a wealthy parish and the money is much needed,” said Trae Banks, who runs a drywall business that has tripled in size since Meta arrived.

In the nearby town of Delhi, Mayor Jesse Washington believes the data center will eventually have a positive impact on his community of 2,600. But for now, the construction traffic frustrates residents and property prices are skyrocketing as developers try to house thousands of construction workers.
More than a dozen low-income families were evicted from a trailer park whose owners are building housing for incoming Meta workers, Washington says. “We have a lot of concerned people — they’ve put hardship on a lot of people in certain areas here," the mayor said. “I just want to see people from Delhi benefit from this.”

Brook reported from New Orleans. Brook is a corps member for The Associated Press/Report for America Statehouse News Initiative. Report for America is a nonprofit national service program that places journalists in local newsrooms to report on undercovered issues.

Copyright 2025 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Technique makes complex 3D printed parts more reliable

New research enables computer designs to incorporate the limitations of 3D printers, to better control materials’ performance in aerospace, medical, and other applications.

People are increasingly turning to software to design complex material structures like airplane wings and medical implants. But as design models become more capable, our fabrication techniques haven’t kept up. Even 3D printers struggle to reliably produce the precise designs created by algorithms. The problem has led to a disconnect between the way a material is expected to perform and how it actually works.

Now, MIT researchers have created a way for models to account for 3D printing’s limitations during the design process. In experiments, they showed their approach could be used to make materials that perform much more closely to the way they’re intended to.

“If you don’t account for these limitations, printers can either over- or under-deposit material by quite a lot, so your part becomes heavier or lighter than intended. It can also over- or underestimate the material performance significantly,” says Josephine Carstensen, the Gilbert W. Winslow Associate Professor of Civil and Environmental Engineering. “With our technique, you know what you’re getting in terms of performance because the numerical model and experimental results align very well.”

The approach is described in the journal Materials and Design, in an open-access paper co-authored by Carstensen and PhD student Hajin Kim-Tackowiak.

Matching theory with reality

Over the last decade, new design and fabrication technologies have transformed the way things are made, especially in industries like aerospace, automotive, and biomedical engineering, where materials must reach precise weight-to-strength ratios and other performance thresholds.
In particular, 3D printing allows materials to be made with more complex internal structures.

“3D printing processes generally give us more flexibility because we don’t have to come up with forms or molds for things that would be made through more traditional means like injection molding,” Kim-Tackowiak explains.

As 3D printing has made production more precise, so have methods for designing complex material structures. One of the most advanced computational design techniques is known as topology optimization. Topology optimization has been used to generate new and often surprising material structures that can outperform conventional designs, in some cases approaching the theoretical limits of certain performance thresholds. It is currently being used to design materials with optimized stiffness and strength, maximized energy absorption, fluid permeability, and more.

But topology optimization often creates designs at extremely fine scales that 3D printers have struggled to reliably reproduce. The problem is the size of the print head that extrudes the material. If the design specifies a layer to be 0.5 millimeters thick, for instance, and the print head is only capable of extruding 1-millimeter-thick layers, the final design will be warped and imprecise.

Another problem has to do with the way 3D printers create parts, with a print head extruding a thin bead of material as it glides across the printing area, gradually building parts layer by layer.
That can cause weak bonding between layers, making the part more prone to separation or failure. The researchers sought to address the disconnect between the expected and actual properties of materials that arises from those limitations.

“We thought, ‘We know these limitations in the beginning, and the field has gotten better at quantifying these limitations, so we might as well design from the get-go with that in mind,’” Kim-Tackowiak says.

In previous work, Carstensen developed an algorithm that embedded information about the print nozzle size into design algorithms for beam structures. For this paper, the researchers built on that approach to incorporate the direction of the print head and the corresponding impact of weak bonding between layers. They also made it work with more complex, porous structures that can have extremely elastic properties.

The approach allows users to add variables to the design algorithms that account for the center of the bead being extruded from a print head and the exact location of the weaker bonding region between layers. The approach also automatically dictates the path the print head should take during production.

The researchers used their technique to create a series of repeating 2D designs with various sizes of hollow pores, or densities. They compared those creations to materials made using traditional topology optimization designs of the same densities.

In tests, the traditionally designed materials deviated from their intended mechanical performance more than materials designed with the researchers’ new technique at material densities under 70 percent. The researchers also found that conventional designs consistently over-deposited material during fabrication.
Overall, the researchers’ approach led to parts with more reliable performance at most densities.

“One of the challenges of topology optimization has been that you need a lot of expertise to get good results, so that once you take the designs off the computer, the materials behave the way you thought they would,” Carstensen says. “We’re trying to make it easy to get these high-fidelity products.”

Scaling a new design approach

The researchers believe this is the first time a design technique has accounted for both the print head size and weak bonding between layers.

“When you design something, you should use as much context as possible,” Kim-Tackowiak says. “It was rewarding to see that putting more context into the design process makes your final materials more accurate. It means there are fewer surprises. Especially when we’re putting so much more computational resources into these designs, it’s nice to see we can correlate what comes out of the computer with what comes out of the production process.”

In future work, the researchers hope to improve their method for higher material densities and for different kinds of materials like cement and ceramics. Still, they said their approach offered an improvement over existing techniques, which often require experienced 3D printing specialists to help account for the limitations of the machines and materials.

“It was cool to see that just by putting in the size of your deposition and the bonding property values, you get designs that would have required the consultation of somebody who’s worked in the space for years,” Kim-Tackowiak says.

The researchers say the work paves the way to design with more materials.

“We’d like to see this enable the use of materials that people have disregarded because printing with them has led to issues,” Kim-Tackowiak says. “Now we can leverage those properties or work with those quirks as opposed to just not using all the material options we have at our disposal.”
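For readers who want a concrete feel for what "designing with the printer in mind" can mean, here is a minimal, hypothetical Python sketch — not the MIT team's actual method. It shows the simplest version of the idea: smoothing a topology-optimization density field with a filter whose radius matches the nozzle's bead width, so the design cannot contain features thinner than one extruded bead. The function name and the simple conic filter are illustrative assumptions only; the published approach additionally models the bead center, print direction, and weak interlayer bonds.

```python
import numpy as np

def nozzle_aware_filter(density, nozzle_radius):
    """Smooth a 2D design-density field with a linear (conic) filter whose
    radius matches the print nozzle, so features thinner than one bead are
    blurred away before fabrication. Purely illustrative."""
    n = int(np.ceil(nozzle_radius))
    y, x = np.mgrid[-n:n + 1, -n:n + 1]
    # conic weights: largest at the bead center, zero beyond the bead radius
    w = np.maximum(0.0, nozzle_radius - np.hypot(x, y))
    w /= w.sum()
    padded = np.pad(density, n, mode="edge")  # extend edges so borders keep their value
    out = np.empty_like(density)
    rows, cols = density.shape
    for i in range(rows):
        for j in range(cols):
            out[i, j] = np.sum(w * padded[i:i + 2 * n + 1, j:j + 2 * n + 1])
    return out
```

A uniform field passes through unchanged, while an isolated one-cell "spike" thinner than the bead is spread out over its neighbors — the filtered design only contains features the print head could actually deposit.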

Study shows mucus contains molecules that block Salmonella infection

MIT researchers now hope to develop synthetic versions of these molecules, which could be used to treat or prevent foodborne illnesses.

Mucus is more than just a sticky substance: It contains a wealth of powerful molecules called mucins that help to tame microbes and prevent infection. In a new study, MIT researchers have identified mucins that defend against Salmonella and other bacteria that cause diarrhea.

The researchers now hope to mimic this defense system to create synthetic mucins that could help prevent or treat illness in soldiers or other people at risk of exposure to Salmonella. It could also help prevent “traveler’s diarrhea,” a gastrointestinal infection caused by consuming contaminated food or water.

Mucins are bottlebrush-shaped polymers made of complex sugar molecules known as glycans, which are tethered to a peptide backbone. In this study, the researchers discovered that a mucin called MUC2 turns off genes that Salmonella uses to enter and infect host cells.

“By using and reformatting this motif from the natural innate immune system, we hope to develop strategies to prevent diarrhea before it even starts. This approach could provide a low-cost solution to a major global health challenge that costs billions annually in lost productivity, health care expenses, and human suffering,” says Katharina Ribbeck, the Andrew and Erna Viterbi Professor of Biological Engineering at MIT and the senior author of the study.

MIT Research Scientist Kelsey Wheeler PhD ’21 and Michaela Gold PhD ’22 are the lead authors of the paper, which appeared Tuesday in the journal Cell Reports.

Blocking infection

Mucus lines much of the body, providing a physical barrier to infection, but that’s not all it does.
Over the past decade, Ribbeck has identified mucins that can help to disarm Vibrio cholerae, as well as Pseudomonas aeruginosa, which can infect the lungs and other organs, and the yeast Candida albicans.

In the new study, the researchers wanted to explore how mucins from the digestive tract might interact with Salmonella enterica, a foodborne pathogen that can cause illness after consuming raw or undercooked food, or contaminated water.

To infect host cells, Salmonella must produce proteins that are part of the type 3 secretion system (T3SS), which helps bacteria form needle-like complexes that transfer bacterial proteins directly into host cells. These proteins are all encoded on a segment of DNA called Salmonella pathogenicity island 1 (SPI-1).

The researchers found that when they exposed Salmonella to a mucin called MUC2, which is found in the intestines, the bacteria stopped producing the proteins encoded by SPI-1, and they were no longer able to infect cells.

Further studies revealed that MUC2 achieves this by turning off a regulatory bacterial protein known as HilD. When this protein is blocked by mucins, it can no longer activate the T3SS genes.

Using computational simulations, the researchers showed that certain monosaccharides found in glycans, including GlcNAc and GalNAc, can attach to a specific binding site of the HilD protein. However, their studies showed that these monosaccharides can’t turn off HilD on their own — the shutoff only occurs when the glycans are tethered to the peptide backbone of the mucin.

The researchers also discovered that a similar mucin called MUC5AC, which is found in the stomach, can block HilD.
And both MUC2 and MUC5AC can turn off virulence genes in other foodborne pathogens that also use HilD as a gene regulator.

Mucins as medicine

Ribbeck and her students now plan to explore ways to use synthetic versions of these mucins to help boost the body’s natural defenses and protect the GI tract from Salmonella and other infections.

Studies from other labs have shown that in mice, Salmonella tends to infect portions of the GI tract that have a thin mucus barrier, or no barrier at all.

“Part of Salmonella’s evasion strategy for this host defense is to find locations where mucus is absent and then infect there. So, one could imagine a strategy where we try to bolster mucus barriers to protect those areas with limited mucin,” Wheeler says.

One way to deploy synthetic mucins could be to add them to oral rehydration salts — mixtures of electrolytes that are dissolved in water and used to treat dehydration caused by diarrhea and other gastrointestinal illnesses.

Another potential application for synthetic mucins would be to incorporate them into a chewable tablet that could be consumed before traveling to areas where Salmonella and other diarrheal illnesses are common. This kind of “pre-exposure prophylaxis” could help prevent a great deal of suffering and lost productivity due to illness, the researchers say.

“Mucin mimics would particularly shine as preventatives, because that’s how the body evolved mucus — as part of this innate immune system to prevent infection,” Wheeler says.

The research was funded by the U.S. Army Research Office, the U.S. Army Institute for Collaborative Biotechnologies, the U.S. National Science Foundation, the U.S. National Institute of Environmental Health Sciences, the U.S. National Institutes of Health, and the German Research Foundation.

America's blame game over Canada's wildfire smoke misses the point, experts say

US officials have blamed Canada for not doing enough to stop its wildfire smoke from wafting south. Climate experts say it’s not so simple.

Nadine Yousif, Senior Canada reporter

Smoke from Canada's wildfires has drifted south to the US several times this summer, clouding the sky with an orange haze.

As deadly wildfires raged in the Canadian province of Manitoba this summer, Republican lawmakers in nearby US states penned letters asking that Canada be held accountable for the smoke drifting south.

"Our skies are being choked by wildfire smoke we didn't start and can't control," wrote Calvin Callahan, a Republican state representative from Wisconsin, in a letter dated early August.

Callahan, along with lawmakers from Iowa, Minnesota and North Dakota, filed a formal complaint with the US Environmental Protection Agency (EPA) urging an investigation into Canada's wildfire management.

Manitoba premier Wab Kinew quickly condemned the move, accusing the lawmakers of throwing a "timber tantrum" and playing "political games".

By August, the wildfires had scorched more than two million acres in Manitoba, forced thousands to evacuate, and killed two people – a married couple who authorities said were trapped by fast-moving flames around their family home. As September draws to a close, data shows that 2025 is on track to be Canada's second-worst wildfire season on record.

A study published in the journal Nature in September revealed that smoke from Canada's wildfires has also had far-reaching, fatal consequences. It estimates that the 2023 wildfires - the country's worst on record by area burned - caused more than 87,500 acute and premature deaths worldwide, including 4,100 acute, smoke-related deaths in the US and over 22,000 premature deaths in Europe.

Wildfire smoke contains PM2.5 - a type of air pollution - that is known to trigger inflammation in the body.
It can exacerbate conditions like asthma and heart disease and, in some cases, can damage neural connections in the brain.

"These are big numbers," said Michael Brauer, a professor at the University of British Columbia who co-authored the study. He added that the findings show wildfire smoke should be treated as a serious health issue, akin to breast cancer or prostate cancer.

For some American lawmakers, the blame falls squarely on Canada. "Canada's failure to contain massive wildfires," Callahan wrote in August, "has harmed the health and quality of life of more than 20 million Americans in the Midwest."

Their complaints raise the question: Could Canada be doing more to curb its wildfires – and by extension, their smoke?

Climate and fire experts in both countries told the BBC that the answer is largely no. "Until we as a global society deal with human-caused climate change, we're going to have this problem," said Mike Flannigan, an emergency management and fire science expert at Thompson Rivers University in British Columbia.

Wildfire smoke can often travel thousands of miles. A satellite image from August shows smoke from a fire in Newfoundland drifting over the Atlantic Ocean.

Metrics show Canada's wildfires, a natural part of its vast boreal forest, have worsened in recent years. Fire season now starts earlier, ends later, and burns more land on average. The 2023 fires razed 15 million hectares (37 million acres) – an area larger than England – while the 2025 blazes have so far burned 8.7 million hectares (21.5 million acres).

As of mid-September, there are still more than 500 fires burning, mostly in British Columbia and Manitoba, according to the Canadian Interagency Forest Fire Centre.

Roughly half of Canada's wildfires are sparked by lightning, while the rest stem from human activity, data from the National Forestry Database shows.
Experts warn that hotter temperatures are making the land drier and more prone to ignition.

Wildfires are not only worsening in Canada. The US has recently seen some of its most damaging blazes, including the 2023 Hawaii wildfires that killed at least 102 people, and the Palisades fire in January, the most destructive in Los Angeles history.

Both countries have struggled to keep pace, often sharing firefighting resources. Canadian water bombers were deployed in California this year, while more than 600 US firefighters travelled north to assist Canada, according to the US Forest Service.

In Canada, strained resources – and worsening fires – have fuelled calls for a national firefighting service. Wildfire emergency response is currently handled separately by each of the provinces and territories.

"The system we have right now worked 40 years ago. Today? Not so much," argued Mr Flannigan.

Others propose controlled burns, a practice used in Australia and by Indigenous communities, as a solution, though these fires would still generate smoke. Some argue for better clearing of flammable material in forests and near towns, or investing in new technology that can help detect wildfires faster.

Some of that work is already underway. In August, Canada pledged more than $47m for research projects to help communities better prepare for and mitigate wildfires.

Major Canadian cities, like Vancouver, have also been dealing with wildfire smoke.

Still, experts like Jen Beverly, a wildland fire professor at the University of Alberta, warn there is little Canada can do to prevent wildfires altogether.

"These are high intensity fire ecosystems" in Canada, she said, that are different from fires in Australia or the US. "We have very difficult fires to manage under extreme conditions, and we're seeing more of those because of climate change."

With a warmer climate, Prof Beverly said attention should be paid to pollution.
She noted that the US is the second-worst carbon emitter in the world behind China. "I mean, we should be blaming them for the problem," she argued.

In recent months, the Trump administration has also rolled back environmental policies designed to reduce emissions, and has withdrawn the US from the Paris climate accords.

Sheila Olmstead, an environmental policy professor at Cornell University, noted that Canada and the US have a history of cooperation on pollution and climate, including an air quality agreement signed by the two in 1991 to address acid rain.

"It was a very clear framework for addressing the problem, and that's what seems to be missing here," Olmstead told the BBC. Both countries, she said, would benefit from working together on wildfires instead of trading blame.

As for the EPA complaint, it is unclear what the agency could do to address the US lawmakers' concerns. In a statement to the BBC, the EPA said it is reviewing it "and will respond through appropriate channels".

Prof Brauer said the data in his study shows that even though the fires are burning in Canada - often in remote areas - their impact can reach far beyond.

The findings, he told the BBC, call for a re-framing of how the consequences of climate change are understood. "The effects of a warmer climate are localised, and there are winners and losers," Prof Brauer said. "But this is an illustration that some of these impacts are becoming global."

He argued that the US lawmakers' complaints are an "unfortunate distraction", and that the focus should instead be on collaboration and learning how to "live with smoke".

"This stuff isn't going away," Prof Brauer said, adding that there are ways to prevent future deaths if there is a will to adapt.

Environment Agency failed to visit serious pollution incidents, files show

Data from inside England's environment watchdog show an agency struggling to monitor serious pollution.

Jonah Fisher, Environment correspondent

One reservoir's fish were all killed by pollution in a single incident.

Documents and data shared with BBC News from inside England's much-criticised environment watchdog show an agency struggling to monitor incidents of serious pollution.

The information shows the Environment Agency (EA) only sent investigators to a small fraction of reported incidents last year and often relied on water companies - who may be responsible for the pollution - for updates.

An internal EA document from this year states that all potentially serious incidents should be attended by staff. But in 2024, the EA didn't go to almost a third of nearly 100 water industry incidents that were eventually ruled to have posed a serious threat to nature or human health. The agency also downgraded the environmental impact of more than 1,000 incidents that it initially decided were potentially serious without sending anyone to take a look.

The EA says it does "respond" to all incidents but has ways to assess pollution that don't involve going in person. It says when reports come in it is "careful not to underestimate the seriousness of an incident report".

But the EA insider who provided the BBC with the data was critical of the agency. "What not attending means is that you are basically only dealing with water company evidence. And it's very rare that their own evidence is very damning," the insider said.

Among the incident reports shared with the BBC was an occasion when a chemical spilled into a reservoir, killing all its fish, which the EA did not attend. Another time, sewage bubbled up into a garden for more than 24 hours with no deployment from the EA.

The BBC is not printing specific details from the reports to protect the identity of the whistleblower.
But they show an agency often slow to respond and frequently copying water company updates into EA documents verbatim before downgrading incidents.

Other documents show pollution incidents that were reported to the EA by water companies hours after the problem had already been solved, making the impact much harder to assess as the evidence may have washed away.

The data show that overall the agency went to just 13% of all the pollution incidents, serious and more limited, that were reported to it in 2024.

"It's virtually impossible to get them to come out," Ashley Smith, a veteran water quality campaigner from the Oxfordshire-based campaign group Windrush Against Sewage Pollution (WASP), told the BBC.

"(When you call the EA) they go through a scenario where they'll say 'are there any dead fish'. And typically there are not dead fish because often the fish are able to escape.

"The EA then says – we'll report that to Thames Water – and it will be Thames Water, if anyone, who gets in touch with you."

Matt Staniek, a water quality campaigner leading an effort to get Windermere in the Lake District cleaned up, cited several incidents where he says the EA took explanations from the local water company about sewage spills at face value, explanations his own data requests later proved wrong.

"The Environment Agency has not been holding United Utilities accountable," he says.
"And the only way that we get them to properly turn up to pollution incidents and now actually try and do a proper investigation is by going to the media with it, and that should not be the case."

A United Utilities spokesperson responded saying "we are industry leading at self-reporting incidents to the Environment Agency".

As part of the government's landmark review of water industry regulation, it has promised to end "self-reporting" of incidents by water companies.

There is widespread agreement that the current system is not working, and plans are being drawn up to merge the regulators – including the EA – which oversee different parts of the water industry into just one.

"The Environment Agency is so hollowed out that it cannot investigate pollution crimes, effectively telling polluters they can act with impunity," James Wallace, the chief executive of campaign group River Action, told the BBC.

In July the BBC revealed that staff shortages had led to the EA cancelling thousands of water quality tests at its main laboratory in Devon.

"We respond to every water pollution incident report we receive," an Environment Agency spokesperson said.

"To make sure we protect people and the environment, we are careful not to underestimate the seriousness of an incident report when it comes in. Final incident categorisations may change when further information comes to light. This is all part of our standard working practice."

Energy Department plans to claw back $13B in green funds

The Energy Department is planning to claw back $13 billion in unspent climate funds, it announced Wednesday. In a press release, the department said that it plans to "return more than $13 billion in unobligated funds initially appropriated to advance the previous Administration’s wasteful Green New Scam agenda." The press release did not specify exactly where the...

The Energy Department is planning to claw back $13 billion in unspent climate funds, it announced Wednesday.

In a press release, the department said that it plans to "return more than $13 billion in unobligated funds initially appropriated to advance the previous Administration’s wasteful Green New Scam agenda." The press release did not specify exactly where the money would have otherwise gone or what it will be used for now, if anything. Spokespeople for the Energy Department did not immediately respond to The Hill's request for additional information.

Asked about the money during the New York Times's Climate Forward event on Wednesday, Energy Secretary Chris Wright said the funds "hadn't been assigned to projects yet" but that they were aimed at subsidizing more wind and solar energy, as well as electric vehicles.

The Trump administration has repeatedly sought to curtail spending on renewable energy — and set up barriers that hamper its deployment — while trying to expedite fossil fuels and nuclear power.

The Energy Department has made several attempts to cut climate spending, including previous funding rescissions. The Environmental Protection Agency has separately sought to rescind billions of its own climate spending that was issued under the Biden administration.

Will Portland weaken its policy to phase out diesel, replace it with biofuels?

Portland’s Renewable Fuels Standards Advisory Committee is poised to recommend delaying the phase-out -- but the decision on how to move ahead will be made by city leaders.

Portland leaders may soon weigh whether to roll back parts of the city’s signature climate policy on replacing diesel with renewable fuels, a first-in-the-nation standard critical to reducing emissions and harmful particulate matter pollution.

The policy, adopted by the City Council in 2022 and aimed at medium and heavy trucks, phases out the sale of petroleum diesel by 2030, gradually replacing it with diesel blended with renewable fuels at increasingly higher increments.

Council members had hailed the diesel phase-out as a tool to reduce pollution in low-income neighborhoods often located near freeways with high concentrations of diesel emissions. As part of the policy, a 15% blend requirement began in 2024, a 50% blend will be required by 2026 and a 99% blend by 2030. Medium and heavy trucks affected by the policy include delivery trucks, school and transit buses, dump trucks, tractor trailers and cement mixers.

But Portland’s Renewable Fuels Standards Advisory Committee is poised to recommend weakening the phase-out. The committee was established in July 2023 to advise the city Bureau of Planning and Sustainability director on technical and economic issues related to the renewable fuel supply as well as meeting the policy’s fuel requirements.

A draft memo, made public in advance of the committee’s meeting this week, shows the committee is planning to ask the city to reduce the 2026 biofuel percentage requirement from 50% to 20% and delay implementation until 2028 or 2030. The memo was obtained by the Braided River Campaign, a Portland nonprofit that advocates for a green working waterfront, and shared with The Oregonian/OregonLive.
The proposed rollback essentially would allow trucks to continue to emit black carbon, or “soot,” at a higher level and for longer than under the original plan.

The draft also recommends pausing, for at least two years, strict restrictions on the type of feedstock used to make renewable fuels – a standard that three years ago was hailed as the most innovative, emission-reducing part of Portland’s diesel phase-out. The pause would allow retailers to fall back on biofuels made from feedstocks such as soybean, canola and palm oils, which have been linked to much higher carbon emissions, displacing food production and causing deforestation.

The draft memo, addressed to Planning and Sustainability Director Eric Engstrom, says the changes would respond to unfavorable biodiesel and renewable diesel market conditions in Oregon and Portland, including the scarcity of low-carbon-intensity feedstocks such as used cooking oil and animal tallow.

It’s unclear who will decide on the future of the diesel phase-out. While Engstrom has sole discretion to make changes to the program’s rules, the City Council holds the authority to amend city code. Engstrom did not immediately comment on whether the recommended changes would require rule or code changes.

Portland officials have said they are fully committed to electrification of trucks but that the transition will take many years. Moving from diesel to biofuels is an interim step, they said, allowing for faster emission and particulate matter reductions.

The committee’s draft recommendation comes as Portland leaders are debating the future of the Critical Energy Infrastructure Hub, a 6-mile stretch on the northwest bank of the Willamette River where most of Oregon’s fuel supply is stored. Zenith Energy, which operates a terminal at the hub that has drawn environmental opposition, has promised the city to convert from fossil fuel loading and storage to renewable fuels.
Other companies at the hub are also eyeing renewable fuels as a new income stream. Earlier this week, the city unveiled four alternatives for the hub, one of which allows for unlimited renewable fuel expansion.

Environmental advocates said the committee’s recommendations are unacceptable and would gut the renewable fuel policy’s environmental credibility.

“This is a complete walk-back of a promise made to Portlanders,” said Marnie Glickman, Braided River Campaign’s executive director. “The city sold this policy on the promise of a rapid decline in carbon pollution. Now, before the strongest rules even take effect, the industry-dominated advisory board is asking for a hall pass to continue using the cheapest, dirtiest biofuels.”

The committee is set to refine the memo at its meeting on Thursday and may vote on the recommendation. It must submit the final recommendation to Engstrom by mid-October.

Biofuel cost is one of the major reasons the committee cites for the recommended changes. “If the RFS (renewable fuel standard) is left unchanged, the cost of diesel fuel in the City of Portland could get significantly higher compared to the rest of the state of Oregon due to the combined higher requirement of renewable content and lower carbon intensity,” the memo said.

The draft memo also says Portland’s program has trouble competing with other regional markets, such as California, for scarce low-carbon-intensity biofuels. It also mentions Trump’s One Big Beautiful Bill excluding feedstocks supplied from countries outside North America from tax incentives – which is likely to further reduce the supply of low-carbon feedstocks.

Glickman said she’s also concerned about the committee’s potential conflict of interest when making recommendations to the sustainability director – a fact the draft memo acknowledges.
Six of the seven members of the advisory committee are representatives of fuel producers and suppliers – including bp America, Phillips 66 and the Western States Petroleum Association. The committee’s only non-industry member – Andrew Dyke, a senior economist at ECOnorthwest – declined to comment on the draft memo.

In 2006, Portland became the first city in the U.S. to adopt a renewable fuel standard, which required the city’s fuel retailers to sell a minimum blend of 5% biodiesel. The city updated the policy in 2022 to a full diesel phase-out. The current policy far exceeds the federal and state renewable fuel standards.
