
UCLA Unveils Breakthrough 3D Imaging Technology to Peer Inside Objects

Thursday, August 1, 2024


Wavelength-Multiplexed Diffractive Optical Processor for 3D Quantitative Phase Imaging

Artistic depiction of a wavelength-multiplexed diffractive optical processor for 3D quantitative phase imaging. Credit: Ozcan Lab @ UCLA

All-optical multiplane quantitative phase imaging design eliminates the need for digital phase recovery algorithms.

UCLA researchers have introduced a breakthrough in 3D quantitative phase imaging that utilizes a wavelength-multiplexed diffractive optical processor to enhance imaging efficiency and speed. This method enables label-free, high-resolution imaging across multiple planes and has significant potential applications in biomedical diagnostics, material characterization, and environmental analysis.

Introduction to Quantitative Phase Imaging

As light waves propagate through a medium, they experience a temporal delay, and this delay carries crucial information about the medium's structure and composition. Quantitative phase imaging (QPI) is an optical technique that reveals variations in optical path length as light moves through biological samples, materials, and other transparent structures. Unlike traditional imaging methods that rely on staining or labeling, QPI lets researchers visualize and quantify these phase variations directly, generating high-contrast images that enable noninvasive investigations in fields such as biology, materials science, and engineering.
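As a textbook-level illustration (the relation below is standard QPI background, not a formula from the new paper), the measured phase delay at each point of the image relates directly to the sample's refractive index and thickness:

$$
\Delta\phi(x, y) = \frac{2\pi}{\lambda}\,\big[\,n_s(x, y) - n_m\,\big]\; t(x, y)
$$

where $\lambda$ is the illumination wavelength, $n_s$ the sample's refractive index, $n_m$ the refractive index of the surrounding medium, and $t$ the local thickness. Recovering $\Delta\phi$ quantitatively is what lets QPI report structure without stains or labels.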

A recent study, published July 25 in Advanced Photonics, introduces a new approach to 3D QPI using a wavelength-multiplexed diffractive optical processor. The approach, developed by researchers at the University of California, Los Angeles (UCLA), addresses a bottleneck of traditional 3D QPI methods, which can be time-consuming and computationally intensive.

Quantitative Phase Imaging of a 3D Phase-Only Object Using a Wavelength-Multiplexed Diffractive Optical Processor

UCLA researchers report a new method for quantitative phase imaging of a 3D phase-only object using a wavelength-multiplexed diffractive optical processor. Utilizing multiple spatially engineered diffractive layers trained through deep learning, this diffractive processor can optically transform the phase distributions of multiple 2D objects at various axial positions into intensity patterns, each encoded at a unique wavelength channel. These wavelength-multiplexed patterns are projected onto a single field-of-view (FOV) at the output plane of the diffractive processor, enabling the capture of quantitative phase distributions of input objects located at different axial planes using an intensity-only image sensor – eliminating the need for digital phase recovery algorithms. Credit: C. Shen et al., doi 10.1117/1.AP.6.5.056003.

The UCLA Innovation in Optical Processing

The UCLA team developed a wavelength-multiplexed diffractive optical processor capable of all-optically transforming phase distributions of multiple 2D objects at various axial positions into intensity patterns, each encoded at a unique wavelength channel. The design allows for the capture of quantitative phase images of input objects located at different axial planes using an intensity-only image sensor, eliminating the need for digital phase recovery algorithms.
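The authors' diffractive surfaces are learned with deep learning and fabricated as passive layers. The following is only a hedged, toy NumPy sketch of the underlying forward optics, not the paper's code: the grid size, wavelengths, layer count, refractive index, and spacings are invented for illustration, random thickness maps stand in for trained ones, and both phase objects are placed at the input plane rather than at distinct axial positions.

```python
# Toy forward model of a multi-layer diffractive processor (illustrative only).
import numpy as np

N, dx = 128, 1e-3               # grid size and pixel pitch in meters (invented)
wavelengths = [0.8e-3, 1.0e-3]  # two terahertz-range channels in meters (invented)
z = 20e-3                       # spacing between layers in meters (invented)

def angular_spectrum(field, wl, z, dx):
    """Propagate a complex field a distance z using the angular spectrum method."""
    fx = np.fft.fftfreq(field.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = 1.0 / wl**2 - FX**2 - FY**2
    H = np.exp(1j * 2 * np.pi * z * np.sqrt(np.maximum(arg, 0.0)))
    H[arg < 0] = 0.0            # drop evanescent components
    return np.fft.ifft2(np.fft.fft2(field) * H)

rng = np.random.default_rng(0)
thickness = rng.uniform(0.0, 1e-3, size=(3, N, N))  # stand-ins for trained layers

def forward(input_phase, wl):
    """Map a phase-only input to the intensity seen by an intensity-only sensor."""
    field = np.exp(1j * input_phase)                # unit-amplitude, phase-only object
    for t_map in thickness:
        field = angular_spectrum(field, wl, z, dx)
        field *= np.exp(1j * 2 * np.pi * 0.5 * t_map / wl)  # thin layer, n = 1.5
    field = angular_spectrum(field, wl, z, dx)
    return np.abs(field) ** 2

# One toy phase object per wavelength channel, read out at the shared output FOV.
objects = [rng.uniform(0, np.pi, (N, N)) for _ in wavelengths]
outputs = [forward(obj, wl) for obj, wl in zip(objects, wavelengths)]
print([out.shape for out in outputs])
```

In the actual design, the layer profiles are jointly optimized across wavelength channels and input planes so that each channel's output intensity quantitatively encodes the phase of its assigned plane; with the random layers above, the outputs are of course meaningless speckle.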

“We are excited about the potential of this new approach for biomedical imaging and sensing,” said Aydogan Ozcan, lead researcher and Chancellor’s Professor at UCLA. “Our wavelength-multiplexed diffractive optical processor offers a novel solution for high-resolution, label-free imaging of transparent specimens, which could greatly benefit biomedical microscopy, sensing, and diagnostics applications.”

Multiplane Imaging and Its Applications

The innovative multiplane QPI design incorporates wavelength multiplexing and passive diffractive optical elements that are collectively optimized using deep learning. By performing phase-to-intensity transformations that are spectrally multiplexed, this design enables rapid quantitative phase imaging of specimens across multiple axial planes. This system’s compactness and all-optical phase recovery capability make it a competitive analog alternative to traditional digital QPI methods.

A proof-of-concept experiment validated the approach, demonstrating successful imaging of distinct phase objects at different axial positions in the terahertz spectrum. The design is also scalable: with appropriate nano-fabrication methods it can be adapted to other parts of the electromagnetic spectrum, including the visible and IR bands, paving the way for phase imaging solutions integrated with focal plane arrays or image sensor arrays for efficient on-chip imaging and sensing devices.

Implications for Science and Technology

This research has significant implications for various fields, including biomedical imaging, sensing, materials science, and environmental analysis. By providing a faster, more efficient method for 3D QPI, this technology can enhance the diagnosis and study of diseases, the characterization of materials, and the monitoring of environmental samples, among other applications.

Reference: “Multiplane quantitative phase imaging using a wavelength-multiplexed diffractive optical processor” by Che-Yung Shen, Jingxi Li, Yuhang Li, Tianyi Gan, Langxing Bai, Mona Jarrahi and Aydogan Ozcan, 25 July 2024, Advanced Photonics.
DOI: 10.1117/1.AP.6.5.056003


EPA grants air permit, clears way for new deep-water oil port off Southeast Texas coast

The Texas GulfLink would be about 30 miles off the coast of Freeport. It's touted for first-of-its-kind technology to reduce emissions. Environmentalists and Brazoria County residents still have concerns.

This August 2014 photo shows the Gulf shoreline on Texas' Bolivar Peninsula.

The U.S. Environmental Protection Agency (EPA) has issued an air-quality permit for a proposed deep-water crude oil port about 30 miles off the shore of Freeport, a Gulf Coast town south of Houston. Its supporters say it takes an extra step toward reducing emissions, while environmental advocacy groups and some nearby residents worry it will still exacerbate pollution.

The Texas GulfLink deep-water port would implement a "first-of-its-kind use of vapor capture and control technology mounted on an offshore support vessel," according to a news release issued Monday by the EPA. The agency notes that such technology has been used on shuttle tankers for decades with 96% emission-control efficiency.

"Sentinel Midstream is proud to unveil a groundbreaking vapor control application that will revolutionize the loading of Very Large Crude Carriers in the Gulf of America," said Jeff Ballard, the CEO of Sentinel Midstream, of which Texas GulfLink is a subsidiary, in the EPA news release. "Developed by our Texas GulfLink team in close collaboration with the EPA, this innovative approach significantly reduces volatile organic compounds, setting a new industry standard for environmental performance and advances the implementation of Best Available Control Technology."

Air pollutants that are emitted during the process of obtaining crude oil "will be captured at the tanker and routed via flexible hose to a control system located on an adjacent, dynamically positioned offshore support vessel," according to Brad Toups, an EPA official who wrote the permit and presented it during a public hearing in June. Those emissions, referred to as volatile organic compounds, are either stored and sold, or they're used as fuel.

Sentinel Midstream did not immediately respond to a request for comment Tuesday.

The permit, under the Clean Air Act, is one piece of the puzzle toward the port's development. The other is approval from the U.S. Department of Transportation's Maritime Administration, or MARAD. In February, MARAD issued a Record of Decision, indicating its approval of the project.

Though the project takes steps toward reducing emissions, clean energy advocacy groups have criticized the Texas GulfLink deep-water port.

"Approving yet another massive offshore oil terminal like this will only worsen a global climate crisis that is already slamming Texans with flooding, heat waves, and drought," Jen Duggan, executive director of the Environmental Integrity Project, told Houston Public Media. "This terminal is expected to release more than 21,000 tons of greenhouse gases per year, as much as 4,321 cars and trucks driven for a year. It is good that the Trump Administration says the terminal will be using some pollution controls. But we should remember that 'unleashing' more dirty fossil fuels like this also means more air and water pollution released upstream during the fracking, drilling, and processing of the oil before it even arrives at the oil export terminal. And then more pollution again when it is burned — all to the detriment of the climate and local communities."

During a public EPA hearing in June, members of the Brazoria County community also shared concerns about the initiative.
"This project doesn't benefit people in Brazoria County, it only benefits rich executives who continue to squeeze profits at the expense of communities like Freeport," said Riley Bennington, a Brazoria County resident, according to an EPA transcript of the hearing. "As a kid growing up in Texas, I really thought we'd be past this by now. We've had renewable energy figured out. Why is this even being considered?" Though most of the testimony during the June 25 public hearing opposed Texas GulfLink, the initiative wasn't completely without praise. Amy Dinn, an attorney from Lone Star Legal Aid representing Better Brazoria, said GulfLink's permits are "much better and more protective of the environment" than other such projects, though she still expressed concerns that not enough research was done on the ozone emissions and impacts of severe weather.

‘They dictate the rules’: BBC tells PM’s Evan Davis to stop hosting heat pump podcast

Presenter believes decision was taken due to the technology's link with net zero after he was told he risked accusations of political bias.

The BBC presenter Evan Davis has been told he can no longer host a podcast about heat pumps due to the corporation's concerns that discussing the technology risks "treading on areas of public controversy".

The presenter of BBC Radio 4's PM programme had hosted 20 episodes of the Happy Heat Pump Podcast, which launched in 2024. It has covered issues around installing the technology, the cost, noise levels and the alternatives for people replacing their gas boilers.

However, although Davis had initially been given approval to go ahead with the non-BBC project, bosses told him the podcast risked exposing him to accusations of political bias. "As the series has gone on – in fact as the world has progressed over the last few months – they have become concerned that anything like this trying to inform people about heat pumps can be interpreted, rightly or wrongly, as somehow treading on areas of public controversy," he told followers of the podcast's YouTube channel.

"I take their shilling, they dictate the rules. They have to try and keep their presenters out of areas of public controversy, and they have decided heat pumps can be controversial, so they've asked me not to be involved."

The widespread installation of heat pumps is seen as necessary to achieve the government's target of hitting net zero carbon emissions by 2050. Last month Kemi Badenoch, the Conservative leader, dropped her party's support for the target. Davis said he believed the decision to stop him appearing on the podcast had been taken because of a link between heat pumps and the net zero target.

Bean Beanland, a director at the Heat Pump Federation and Davis's co-presenter on the podcast, described the decision as "quite extraordinary". Douglas Parr, Greenpeace UK's policy director, said: "As an impartial broadcaster, the BBC should not be pandering to attempts from the right to turn the world's most efficient home heating system into a culture war issue. What's next – cancelling Gardeners' World because of Monty Don's support for peat-free compost?"

Davis told the Guardian he received "no remuneration at all" for the podcast and had personally paid its small costs for music, dissemination and microphone equipment. He said there was no link with the HPF, other than the fact it employed his co-host.

However, he defended the broadcaster. "While it's easy to be infuriated by the BBC and its caution on things like this – and of course, I do disagree with it in this case – I've never had the burden of actually having to run the BBC and make a hundred decisions a day, while people from all sides shout incessantly at me," he said.

"I'm obviously free to leave if I don't like the restrictions that come with working here, but I choose not to because it is a great institution, the PM programme is in excellent shape, and they pay me handsomely."

The BBC has received criticism over its handling of environmental issues. In 2018, the broadcaster said it would stop "both-sidesing" the climate crisis, admitting that it got some of its coverage "wrong" by setting up debates with those who deny climate science.

However, more recently, the broadcaster has given a platform to some who call for reduced action on the climate breakdown. Producers also accused the BBC of shelving a 2023 political programme by Sir David Attenborough that linked the UK's biodiversity loss to the climate crisis. Insiders said this was because of fears its themes of the destruction of nature would risk a backlash from Tory politicians and the rightwing press.

BBC guidelines state employees should not compromise the impartiality of the corporation in their outside work. A source said while the BBC is clear that climate change is happening, responses to it are a matter of public policy. They added that Davis's podcast only explored and promoted one possible solution.

The BBC has previously come under pressure over the external projects of its presenters. Last year, the broadcaster Clive Myrie apologised for failing to declare at least £145,000 earned from external events and said he would stop doing them for the "foreseeable future".

Workshop explores new advanced materials for a growing world

Speakers described challenges and potential solutions for producing materials to meet demands associated with data centers, infrastructure, and other technology.

It is clear that humankind needs ever more resources, from computing power to steel and concrete, to meet the growing demands associated with data centers, infrastructure, and other mainstays of society. New, cost-effective approaches for producing the advanced materials key to that growth were the focus of a two-day workshop at MIT on March 11 and 12.

A theme throughout the event was the importance of collaboration between and within universities and industries. The goal is to "develop concepts that everybody can use together, instead of everybody doing something different and then trying to sort it out later at great cost," said Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering at MIT.

The workshop was produced by MIT's Materials Research Laboratory (MRL), which has an industry collegium, and MIT's Industrial Liaison Program. The program included an address by Javier Sanfelix, lead of the Advanced Materials Team for the European Union. Sanfelix gave an overview of the EU's strategy for developing advanced materials, which he said are "key enablers of the green and digital transition for European industry."

That strategy has already led to several initiatives. These include a material commons, or shared digital infrastructure for the design and development of advanced materials, and an advanced materials academy for educating new innovators and designers. Sanfelix also described an Advanced Materials Act for 2026 that aims to put in place a legislative framework that supports the entire innovation cycle.

Sanfelix was visiting MIT to learn more about how the Institute is approaching the future of advanced materials. "We see MIT as a leader worldwide in technology, especially on materials, and there is a lot to learn about [your] industry collaborations and technology transfer with industry," he said.

Innovations in steel and concrete

The workshop began with talks about innovations involving two of the most common human-made materials in the world: steel and cement. We'll need more of both but must reckon with the huge amounts of energy required to produce them and their impact on the environment due to greenhouse-gas emissions during that production.

One way to address our need for more steel is to reuse what we have, said C. Cem Tasan, the POSCO Associate Professor of Metallurgy in the Department of Materials Science and Engineering (DMSE) and director of the Materials Research Laboratory.

But most of the existing approaches to recycling scrap steel involve melting the metal. "And whenever you are dealing with molten metal, everything goes up, from energy use to carbon-dioxide emissions. Life is more difficult," Tasan said.

The question he and his team asked is whether they could reuse scrap steel without melting it. Could they consolidate solid scraps, then roll them together using existing equipment to create new sheet metal? From the materials-science perspective, Tasan said, that shouldn't work, for several reasons.

But it does. "We've demonstrated the potential in two papers and two patent applications already," he said. Tasan noted that the approach focuses on high-quality manufacturing scrap. "This is not junkyard scrap," he said.

Tasan went on to explain how and why the new process works from a materials-science perspective, then gave examples of how the recycled steel could be used. "My favorite example is the stainless-steel countertops in restaurants. Do you really need the mechanical performance of stainless steel there?" You could use the recycled steel instead.

Hessam Azarijafari addressed another common, indispensable material: concrete. This year marks the 16th anniversary of the MIT Concrete Sustainability Hub (CSHub), which began when a set of industry leaders and politicians reached out to MIT to learn more about the benefits and environmental impacts of concrete.

The hub's work now centers around three main themes: working toward a carbon-neutral concrete industry; the development of a sustainable infrastructure, with a focus on pavement; and how to make our cities more resilient to natural hazards through investment in stronger, cooler construction.

Azarijafari, the deputy director of the CSHub, went on to give several examples of research results that have come out of the CSHub. These include many models to identify different pathways to decarbonize the cement and concrete sector. Other work involves pavements, which the general public thinks of as inert, Azarijafari said. "But we have [created] a state-of-the-art model that can assess interactions between pavement and vehicles." It turns out that pavement surface characteristics and structural performance "can influence excess fuel consumption by inducing an additional rolling resistance."

Azarijafari emphasized the importance of working closely with policymakers and industry. That engagement is key "to sharing the lessons that we have learned so far."

Toward a resource-efficient microchip industry

Consider the following: In 2020 the number of cell phones, GPS units, and other devices connected to the "cloud," or large data centers, exceeded 50 billion. And data-center traffic in turn is scaling by 1,000 times every 10 years.

But all of that computation takes energy. And "all of it has to happen at a constant cost of energy, because the gross domestic product isn't changing at that rate," said Kimerling. The solution is to either produce much more energy, or make information technology much more energy-efficient. Several speakers at the workshop focused on the materials and components behind the latter.

Key to everything they discussed: adding photonics, or using light to carry information, to the well-established electronics behind today's microchips. "The bottom line is that integrating photonics with electronics in the same package is the transistor for the 21st century. If we can't figure out how to do that, then we're not going to be able to scale forward," said Kimerling, who is director of the MIT Microphotonics Center.

MIT has long been a leader in the integration of photonics with electronics. For example, Kimerling described the Integrated Photonics System Roadmap – International (IPSR-I), a global network of more than 400 industrial and R&D partners working together to define and create photonic integrated circuit technology. IPSR-I is led by the MIT Microphotonics Center and PhotonDelta. Kimerling began the organization in 1997.

Last year IPSR-I released its latest roadmap for photonics-electronics integration, "which outlines a clear way forward and specifies an innovative learning curve for scaling performance and applications for the next 15 years," Kimerling said.

Another major MIT program focused on the future of the microchip industry is FUTUR-IC, a new global alliance for sustainable microchip manufacturing. Begun last year, FUTUR-IC is funded by the National Science Foundation.

"Our goal is to build a resource-efficient microchip industry value chain," said Anuradha Murthy Agarwal, a principal research scientist at the MRL and leader of FUTUR-IC. That includes all of the elements that go into manufacturing future microchips, including workforce education and techniques to mitigate potential environmental effects.

FUTUR-IC is also focused on electronic-photonic integration. "My mantra is to use electronics for computation, [and] shift to photonics for communication to bring this energy crisis in control," Agarwal said.

But integrating electronic chips with photonic chips is not easy. To that end, Agarwal described some of the challenges involved. For example, currently it is difficult to connect the optical fibers carrying communications to a microchip. That's because the alignment between the two must be almost perfect or the light will disperse. And the dimensions involved are minuscule. An optical fiber has a diameter of only millionths of a meter. As a result, today each connection must be actively tested with a laser to ensure that the light will come through.

That said, Agarwal went on to describe a new coupler between the fiber and chip that could solve the problem and allow robots to passively assemble the chips (no laser needed). The work, which was conducted by researchers including MIT graduate student Drew Wenninger, Agarwal, and Kimerling, has been patented, and is reported in two papers. A second recent breakthrough in this area involving a printed micro-reflector was described by Juejun "JJ" Hu, John F. Elliott Professor of Materials Science and Engineering.

FUTUR-IC is also leading educational efforts for training a future workforce, as well as techniques for detecting — and potentially destroying — the perfluoroalkyls (PFAS, or "forever chemicals") released during microchip manufacturing. FUTUR-IC educational efforts, including virtual reality and game-based learning, were described by Sajan Saini, education director for FUTUR-IC. PFAS detection and remediation were discussed by Aristide Gumyusenge, an assistant professor in DMSE, and Jesus Castro Esteban, a postdoc in the Department of Chemistry.

Other presenters at the workshop included Antoine Allanore, the Heather N. Lechtman Professor of Materials Science and Engineering; Katrin Daehn, a postdoc in the Allanore lab; Xuanhe Zhao, the Uncas (1923) and Helen Whitaker Professor in the Department of Mechanical Engineering; Richard Otte, CEO of Promex; and Carl Thompson, the Stavros V. Salapatas Professor in Materials Science and Engineering.
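The alignment tolerance Agarwal describes can be made concrete with a standard back-of-envelope model that is not from the MIT work itself: for two identical Gaussian modes, a lateral offset d between fiber and chip reduces the coupled power by roughly exp(-(d/w0)^2), where w0 is the mode-field radius. A short Python sketch, assuming a standard single-mode fiber at 1550 nm:

```python
# Hedged illustration, not from the MIT work: why fiber-to-chip alignment is so
# unforgiving. Assumes a standard single-mode fiber (~10.4 um mode-field
# diameter at 1550 nm); on-chip modes are usually smaller, which is worse.
import math

w0_um = 5.2  # mode-field radius in micrometers (assumed)
for offset_um in (0.5, 1.0, 2.0, 3.0):
    eta = math.exp(-(offset_um / w0_um) ** 2)
    print(f"offset {offset_um:3.1f} um -> coupling {eta:.2f} ({10 * math.log10(eta):+.2f} dB)")
```

Even a 2 um offset costs more than half a decibel on these assumptions, which is consistent with the article's point that each connection currently has to be actively aligned and tested.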

California launches first-in-nation satellite tech to curb methane leaks


California air quality regulators on Friday announced the launch of a first-in-nation satellite data project, with the aim of monitoring and minimizing methane emissions. The technology involves the use of satellite-mounted methane sensors that transmit data regarding the location of methane leaks that could otherwise go undetected, according to the California Air Resources Board (CARB).

The project, funded by a $100 million state budget investment, serves to bolster collaboration between industry and state and local leaders, in order to curb emissions and protect public health, per the agency.

In advancing this new initiative, state officials touted the effort as critical climate action amid the Trump administration's many rollbacks in the U.S. Environmental Protection Agency (EPA).

"Decades of progress to protect public health is on the line as the Trump Administration works to roll back critical environmental protections," Gov. Gavin Newsom (D) said in a statement. "California isn't having it."

Of specific concern to Californians has been the EPA's decision to reconsider what's called the "endangerment finding" — the basis for federal actions to curb planet-warming emissions.

"We're using satellite technology to detect methane leaks as they happen," Newsom said. "With this new data, we'll be able to move faster to cut harmful methane pollution – protecting Californians and the clean air we've fought so hard for."

Methane — a clear, odorless gas released from landfills, livestock facilities and fossil fuel operations — is more than 80 times as potent as carbon dioxide when it comes to near-term warming.

The satellites, one of which has already been deployed, will be able to show specific regions for observation, leading to targeted mitigation efforts.

"The effort provides information that is much closer to real time than the data now available," Liane Randolph, chair of CARB, said in a statement. "It allows us to get ahead of one of the major contributors to what has become an immediate threat to public health and the environment."

The governor on Friday also announced that he was joining the "America Is All In" bipartisan climate coalition as its newest co-chair. The coalition of state and local leaders intends to halve emissions by 2030 and achieve net-zero by 2050, while boosting resilience amid climate challenges.

"With the all-out assault we're now facing on low-carbon, green growth from the federal level, it's the subnational leaders — those of us leading our states and cities — who have to step up," Newsom said.
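The 80x potency figure is what makes leak detection pay off quickly. A hedged back-of-envelope (the leak size is an invented example, not CARB data):

```python
# Convert a hypothetical methane leak into CO2-equivalent terms using the ~80x
# near-term (20-year) global warming potential cited in the article.
leak_tons_ch4 = 100      # invented example: annual leak, metric tons of methane
gwp_20 = 80              # near-term potency vs. CO2, per the article
print(leak_tons_ch4 * gwp_20)  # 8,000 metric tons CO2-equivalent per year
```

On these assumptions, plugging even a modest leak quickly removes thousands of tons of CO2-equivalent warming, which is why near-real-time location data matters.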

New desalination technology being tested in California could lower costs of tapping seawater

A new desalination technology is undergoing testing in Southern California. Water managers hope it will offer an environmentally friendly way of tapping the Pacific Ocean.

Californians could be drinking water tapped from the Pacific Ocean off Malibu several years from now — that is, if a company's new desalination technology proves viable.

OceanWell Co. plans to anchor about two dozen 40-foot-long devices, called pods, to the seafloor several miles offshore and use them to take in saltwater and pump purified fresh water to shore in a pipeline. The company calls the concept a water farm and is testing a prototype of its pod at a reservoir in the foothills of the Santa Monica Mountains.

The pilot study, supported by Las Virgenes Municipal Water District, is being closely watched by managers of several large water agencies in Southern California. They hope that if the new technology proves economical, it could supply more water for cities and suburbs that are vulnerable to shortages during droughts, while avoiding the environmental drawbacks of large coastal desalination plants.

"It can potentially provide us Californians with a reliable water supply that doesn't create toxic brine that impacts marine life, nor does it have intakes that suck the life out of the ocean," said Mark Gold, director of water scarcity solutions for the Natural Resources Defense Council. "If this technology is proven to be viable, scalable and cost-effective, it would greatly enhance our climate resilience."

OceanWell's Mark Golay, left, and Ian Prichard, deputy general manager of Calleguas Municipal Water District, walk toward a prototype of the desalination pod being tested in Las Virgenes Reservoir. (Allen J. Schaben / Los Angeles Times)

During a recent demonstration at Las Virgenes Reservoir, Tim Quinn, the company's water policy strategist, watched as the 12-foot-long cylindrical prototype was lowered underwater on a cable. "We pull fresh water only up out of the ocean, and the salt stays down there in low concentrations, where it's not an environmental problem," Quinn said.

The testing at Las Virgenes Reservoir will help the company's engineers check how the system works in filtering out plankton and discharging it back into the water. When the pod was nearly 50 feet underwater, Mark Golay, the company's director of engineering projects, turned on the pumps and water flowed from a spigot.

The next step, expected later this year, will involve conducting trials in the ocean by lowering a pod from an anchored boat into the depths about 5 miles offshore. "We hope to be building water farms under the ocean in 2028," Quinn said.

Quinn previously worked for California water agencies for four decades, and he joined Menlo Park-based OceanWell two years ago believing the new technology holds promise to ease the state's conflicts over water. "Ocean desal has never played a prominent role in California's water future," he said, "and this technology allows us to look to the ocean as a place where we can get significant sources of supply with minimal, if any, environmental conflict."

Managers of seven Southern California water agencies are holding monthly meetings on the project and studying what investments in new infrastructure — such as pipelines and pump stations — would be needed to transport the water the company plans to sell from the shore to their systems. Leaders of Las Virgenes Municipal Water District, who are spearheading the effort, are holding an event at the reservoir Friday to showcase how the technology is being tested. The pilot study is being supported by more than $700,000 in grants from the Metropolitan Water District of Southern California and the U.S. Bureau of Reclamation.
The company still will need to secure additional permits from the federal government and the state. And it has yet to estimate how much energy the process will require, which will be a major factor in determining the cost.

But water managers and other experts agree that the concept offers several advantages over building a traditional desalination plant on the coast.

Significantly less electricity is likely to be needed to run the system's onshore pumps because the pods will be placed at a depth of about 1,300 feet, where the undersea pressure will help drive seawater through reverse-osmosis membranes to produce fresh water.

While the intakes of coastal desalination plants typically suck in and kill plankton and fish larvae, the pods have a patented intake system that the company says returns tiny sea creatures to the surrounding water unharmed. And while a plant on the coast typically discharges ultra salty brine waste that can harm the ecosystem, the undersea pods release brine that is less concentrated and allow it to dissipate without taking such an environmental toll.

Golay lowers a prototype into Las Virgenes Reservoir for testing. (Allen J. Schaben / Los Angeles Times)

If the technology proves viable on a large scale, Gold said, it would help make Southern California less reliant on diminishing imported supplies from the Sacramento-San Joaquin River Delta and the Colorado River.

Research has shown that human-caused climate change is driving worsening droughts in the western United States. Gov. Gavin Newsom's administration has projected that as rising temperatures diminish the snowpack and intensify droughts, the average amount of water available from the reservoirs and aqueducts of the State Water Project could shrink between 13% and 23% over the next 20 years.

Southern California's water agencies are moving ahead with plans to build new facilities that will transform wastewater into clean drinking water, and have also been investing in projects to capture more stormwater.

In addition to the economic viability, other questions need to be answered through research, Gold said, including how well the system will hold up filtering tiny sea life, how much maintenance will be needed, and whether the pods and hoses could present any risk of entangling whales.

OceanWell's executives and engineers say their system is designed to protect marine life and eliminate the environmental negatives of other technologies.

A conceptual illustration shows a so-called water farm that OceanWell plans to install off the California coast, with 40-foot-long pods anchored to the seafloor about 1,300 feet deep. (OceanWell)

Robert Bergstrom, OceanWell's chief executive, has been working on desalination projects since 1996, and previously built and operated plants in the U.S. Virgin Islands, the Bahamas and other Caribbean islands for the company Seven Seas Water, which he founded.

When Bergstrom retired, he moved to California and eventually decided to go back to work to develop technology to help solve California's water problems. "I had a big idea," Bergstrom said. "I knew this was going to be just a huge lift to get this done, a moonshot."

OceanWell, founded in 2019, now has 10 employees. Its lead investor is Charlie McGarraugh, a former partner of the investment banking company Goldman Sachs. One of its major investors is Japan-based Kubota Corp.

Building on Bergstrom's concept, Chief Technology Officer Michael Porter and the engineering team have worked on the design. They built the first prototype in Porter's kitchen in San Diego County, and did initial tests in a lab. "It was inspired by the environmental community in California pointing out problems that needed to be solved," Bergstrom said.

Desalination plants are operating in parts of California, including the nation's largest facility, in Carlsbad, and a small-scale plant on Santa Catalina Island. But proposals for new coastal desalination plants have generated strong opposition. In 2022, the California Coastal Commission rejected a plan for a large desalination plant in Huntington Beach. Opponents argued the water wasn't needed in the area and raised concerns about high costs and harm to the environment.

The problem of traditional shallow intakes drawing in large amounts of algae, fish larvae and plankton goes away in the deep sea, Bergstrom said, because the perpetual darkness 1,300 feet underwater supports vastly less sea life. "We have much cleaner water to deal with," Bergstrom said. "It's pretty much a barren desert where we've chosen to locate, and as a result, we just don't have that much stuff to filter out."

A specific site for the first water farm has not yet been selected, but the company plans to install it nearly 5 miles offshore, with a pipeline and a copper power cable connecting it to land.

Putting the system deep underwater will probably reduce energy costs by about 40%, Bergstrom said, because unlike a coastal plant that must pump larger quantities of seawater, it will pressurize and pump a smaller quantity of fresh water to shore.

Bergstrom and his colleagues tout their invention as a totally different approach. They say it's not really desalinating seawater in the traditional sense, but rather harvesting fresh water from devices that function like wells in the ocean. After their first water farm, they envision building more along the coast. Bergstrom believes they will help solve water scarcity challenges in California and beyond.

Various sites off California would be well-suited to develop water farms, from San Diego to Monterey, Bergstrom said, as would many water-scarce countries with deep offshore waters, such as Chile, Spain and North African nations. "I believe it'll reshape the world more than just California water," Quinn said, "because I think the globe is looking for something that is this environmentally friendly."

Under the company's plans, the first water farm would initially have 20 to 25 pods, and would be expanded with additional pods to deliver about 60 million gallons of water per day, enough for about 250,000 households.

Las Virgenes and six other water agencies — including L.A. Department of Water and Power, the city of Burbank and Calleguas Municipal Water District — are working together on a study of how water could be delivered directly from the project, and at what cost, as well as how inland agencies could benefit indirectly by exchanging supplies with those on the coast.

"We're very heavily dependent on imported water, and we need to diversify," said David Pedersen, Las Virgenes' general manager. "We need to develop new local water that's drought resilient, and that can help us as we adapt to climate change." His district, which depends almost entirely on imported supplies from the State Water Project, serves more than 75,000 people in Agoura Hills, Calabasas, Hidden Hills, Westlake Village and surrounding areas.
Mike McNutt, public affairs and communications manager for Las Virgenes Municipal Water District, tastes water that flows from a spigot after passing through a prototype desalination system at Las Virgenes Reservoir. (Allen J. Schaben / Los Angeles Times)

During the drought from 2020 to 2022, the district was under severe water restrictions and customers reduced usage nearly 40%. Pedersen hopes the district will be able to tap the ocean for water by around 2030.

At Calleguas Municipal Water District, which delivers water for about 650,000 people in Ventura County, deputy general manager Ian Prichard said one of the big questions is how much energy the system will use. "If the technology works and they can bring it to market, and we can afford to bring the water into our service area, then that would be great," Prichard said. "The big test is, can they produce water at a rate that we want to pay?"
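The physics behind the energy claim can be sketched with textbook numbers rather than OceanWell engineering data: seawater reverse osmosis must overcome an osmotic pressure of roughly 27 bar, and the hydrostatic pressure at the pods' roughly 1,300-foot depth already exceeds that, so much of the driving force comes from the ocean itself rather than from pumps.

```python
# Hedged back-of-envelope with textbook values, not OceanWell data: compare the
# hydrostatic pressure at the pod depth with the ~27 bar osmotic pressure that
# seawater reverse osmosis must overcome.
rho = 1025.0                  # kg/m^3, typical seawater density
g = 9.81                      # m/s^2
depth_m = 1300 * 0.3048       # 1,300 ft is about 396 m

hydrostatic_bar = rho * g * depth_m / 1e5
osmotic_bar = 27.0            # approximate osmotic pressure of seawater
print(f"hydrostatic: {hydrostatic_bar:.0f} bar, "
      f"margin over osmotic: {hydrostatic_bar - osmotic_bar:.0f} bar")  # ~40 and ~13 bar
```

On these assumptions the depth supplies around 40 bar for free, which is why the onshore pumps mainly have to move the already-fresh water; actual flux, membrane behavior, and net energy use are exactly the open questions the water agencies say remain to be tested.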
