
How Forensic Scientists Continue to Identify 9/11 Victims 23 Years after the Attacks

News Feed
Wednesday, September 11, 2024


Forensic scientists are still working to identify victims of the 9/11 attacks using advancements in technology and techniques developed over the past two decades.

Rachel Feltman: Twenty-three years ago, a series of coordinated terrorist attacks killed nearly 3,000 people and turned Manhattan’s iconic World Trade Center into Ground Zero. Most of you probably remember seeing footage and photos of the long, complicated process of looking for victims in the smoldering debris. But you might not realize that for forensic scientists, that work is far from finished even today.

For Scientific American’s Science Quickly, I’m Rachel Feltman. I’m joined today by Kathleen Corrado, the forensics executive director at Syracuse University College of Arts & Sciences. She’s here to tell us how the staggering scale of 9/11’s mass casualty event presented forensic scientists with new challenges—and how the lessons they learned are helping them identify wildfire victims, suspected criminals and the many remaining casualties of 9/11 itself.

Thank you so much for joining us today.


Kathleen Corrado: My pleasure.

Feltman: So broadly speaking, what kind of impact did 9/11 have on the forensic science community?

Corrado: Well, the event that happened in 9/11 in the World Trade Center was basically the first time that DNA analysis was used to identify victims on such a large scale. So while there were about 2,700 victims or so, due to the fire, the explosion, the building collapse, there were a lot of very small samples. A lot of the bodies were degraded.... Really that’s the first time that we really had to think about: How do we deal with this many samples, this many people?

Feltman: Mm.

Corrado: We had to, you know, look at how we store the samples, how we track the samples. We had to think about software in terms of inventorying the samples, in terms of analyzing the DNA. We had to automate. And then again, with the samples being so degraded, it really affected the way that we process the samples.

Feltman: Tell me more about some of those, you know, unique forensic challenges.

Corrado: Right, so when we have natural disasters—whether it’s a fire, a flood—or something more accidental, like a plane crash, or a terrorist event, like a bombing, typically the way that bodies are identified are by different methods such as fingerprints and dental records and physical attributes, like tattoos, or if there’s some kind of a medical device, like if someone has a pacemaker or an artificial knee or hip, they have serial numbers on ’em.

So that’s the typical way that bodies are identified. But in this instance, in 9/11, because of the jet fuel, there was a really large amount of fire, the building collapsed, a lot of the bodies were really, really degraded and compromised. And so that left us with a lot of really small fragments of bone and other items that you really couldn’t use any other identification method other than DNA.

So one of the challenges, first off, was to basically determine what was bone, what wasn’t bone. And if it was bone, was it human bone? And the second challenge is: How do we get the DNA out of such a compromised sample?

Feltman: Well, and I think that’s a great segue into talking about the new technologies that emerged. What are we doing differently now because of what forensic scientists learned after 9/11?

Corrado: Right, so a lot of the samples were degraded, and so we had to come up with new ways of extracting the DNA: so basically taking the DNA out of the cell and then processing it. There also was—just due to the large volume of samples, everything was done manually, and it took quite a long time. It could take weeks or months to get through the process. And so we basically had automatic robotics that we could put in to process the samples. So those are some of the innovations that came out of that.

In addition, one of the other things that we had to think about was the reference samples. So when you have bodies that we’re trying to identify, there’s two different ways we can identify them. One would be with what we call antemortem samples, which is when we’re taking a direct sample from the victim and comparing it: something like a toothbrush or a razor or earbuds—something like that that might have the victim’s DNA on it that we can do a direct comparison.

And then a second type of comparison that we would do is where we compare the victim’s DNA to relatives. And so that would be first-degree relatives—we’re looking for parents, children, sometimes siblings. So basically there were a lot of challenges in 9/11 with just, you know, determining: How do you get the message out to these families that we need these samples? How do we tell them which family members we need to collect and what samples we need to collect?
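As a rough illustration of the kinship comparison Corrado mentions, here is a simplified sketch, not the statistical software labs actually use (casework relies on likelihood ratios rather than a yes/no check): at each STR locus typed in both profiles, a person should share at least one allele with a biological parent or child, barring rare mutations. The locus names below are real CODIS markers, but the allele values and the helper function are hypothetical.

```python
# Simplified screening check for a claimed parent-child relationship: at every
# STR locus typed in both profiles, the two people should share at least one
# allele (real casework uses likelihood ratios, not this pass/fail test).

def consistent_with_parent_child(profile_a: dict, profile_b: dict) -> bool:
    for locus, alleles_a in profile_a.items():
        alleles_b = profile_b.get(locus)
        if alleles_b is None:
            continue  # locus not typed in the reference sample
        if not set(alleles_a) & set(alleles_b):
            return False  # no shared allele at this locus
    return True

# Hypothetical three-locus profiles for a victim and a possible parent.
victim = {"D3S1358": (15, 17), "vWA": (16, 18), "FGA": (21, 24)}
reference = {"D3S1358": (15, 16), "vWA": (17, 18), "FGA": (20, 21)}
print(consistent_with_parent_child(victim, reference))  # True
```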

You know, when 9/11 happened, after 9/11 happened, it really was a wake-up call, saying: We need to have policies and procedures for this type of mass disaster. You know, we need to know who’s in charge, who’s collecting the samples, who’s gonna be the voice speaking to the families.

There’s a lot of new policies and procedures in place that we have now so we know how to do this: we know how to put the message out and how to make sure that we’re getting the right samples.

Feltman: Yeah. Can we talk a little bit more about the technological leaps that have happened? You know, I think some of our, our listeners might not know what the process of DNA extraction looked like in 2000 and what it looks like now, so I would love to get a little bit of an overview.

Corrado: Yeah, so—absolutely. One of the biggest changes that’s happened is what we call the rapid DNA instruments—basically [they’re] a game changer. So [a] rapid DNA instrument, how it’s different is: previously what would happen is the samples would have to be collected at the site, they’d have to be shipped to the laboratory, and then the laboratory would manually process the samples—so they’d have to extract the DNA, and then take that DNA and generate a DNA profile, and then do the interpretation. And that could take weeks or months.

Feltman: Mm.

Corrado: With rapid DNA instruments now, all of those processes are done inside the instrument, so it’s one step. So you take the sample, whether that’s a swab of blood or perhaps a sample from bone that we can extract, we put it into the instrument, it does all of those processes within the instrument, and it does it in about 90 minutes ...

Feltman: Wow.

Corrado: Which truly is a game changer. So something that would take weeks or months before, we now can do quite quickly.

Other benefits are, [two], that these instruments can be placed directly at the site. So we don’t have to send the samples to a lab; we can set up a makeshift lab, put these instruments right in the area where the disaster occurred and process the samples right there.

And then the third reason why they’re very helpful is that we don’t need a DNA analyst—we don’t need an expert to run these samples. So as before, every sample had to be run by a DNA expert in the lab and interpreted by a DNA expert, these results are spit out in 90 minutes, and you don’t need to be a DNA expert to run it to get the results.

And ... these types of instruments were used in the 2018 Camp Fire in California. So I think there were about 100 victims of that fire, and I think something close to, like, 80 percent of those samples were ID’d through DNA, which is really high. So prior to that it was—usually it was about 20 percent of samples were—we would use DNA to identify.

Now we can use it not only just for the samples of the victims but also the family reference samples. So even before, all those family reference samples had to go to a lab. Now they can all be processed on-site in these instruments.

And I believe it was also used in the Maui wildfires, and also it’s used in things like the war in Ukraine—I mean, these instruments have a lot of other uses besides mass disaster victim identification.

Feltman: Yeah, well, and tell me more about the policies that emerged and changed because of 9/11. You mentioned that it was really a wake-up call in terms of needing systems in place. What are some of those systems?

Corrado: Well, we have to make sure that we have a good policy in terms of what samples to collect, how those samples are stored, what will happen to those samples after they’re used and the data after they’re used. We also have to make sure that we have a single point person that can go ahead and give the information out to the public as well as to the families. We have to have safety. You know, we have to worry about hazards—biohazards. So all of those policies are in place.

Additionally, with the reference samples, something that’s really important now is the informed consent. So we wanna make sure that the relatives that are giving their samples know what it is that they’re giving, know why they’re giving it and also they know what’s gonna happen to that sample and to that data afterwards—you know, is it gonna go into a database, or is it gonna be destroyed? So there’s informed consent now, which is really important in terms of protecting people’s privacy.

Feltman: So are there any new technologies that actually emerged from the 9/11 investigation specifically?

Corrado: Well, specifically from the 9/11 investigation there were new technologies in terms of how to analyze degraded samples. And particularly when we have these samples, they’re very small fragments of DNA, and previous to 9/11 we really weren’t able to get data from such small samples. And so after 9/11 and continuously we’ve been able to improve the extraction technologies for small samples.

There’s also a new technology called next-generation sequencing that’s at the forefront right now. That technology will allow us to analyze samples that are even smaller. So when the DNA is broken up into small, small pieces, this technology will allow us to analyze even smaller samples, and then it allows us to build them together into a bigger, contiguous DNA profile or sequence, and that will allow us to have more sensitivity, so we’ll be able to analyze samples that are even smaller. And that technology is starting to be used even to identify more of the remains from 9/11 because only about 60 percent of the victims have been identified from the 9/11 event.
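To make the fragment-assembly idea Corrado describes a little more concrete, here is a deliberately tiny sketch (not any laboratory's actual pipeline) of overlap-based assembly in Python: short reads are merged wherever the end of one matches the start of another. Real next-generation sequencing assemblers rely on far more sophisticated statistics and error handling; the sequences, function names and greedy strategy below are purely illustrative.

```python
# A minimal, hypothetical sketch of the idea behind assembling short DNA reads
# into one longer contiguous sequence. Real NGS assemblers are far more
# sophisticated; the reads and the greedy strategy here are illustrative only.

def merge_pair(a: str, b: str, min_overlap: int = 3):
    """Append b to a if the end of a matches the start of b by >= min_overlap bases."""
    for k in range(min(len(a), len(b)), min_overlap - 1, -1):
        if a.endswith(b[:k]):
            return a + b[k:]
    return None

def assemble(reads):
    """Greedily grow one contig by chaining overlapping reads."""
    contig, remaining = reads[0], list(reads[1:])
    while remaining:
        for read in remaining:
            merged = merge_pair(contig, read) or merge_pair(read, contig)
            if merged:
                contig = merged
                remaining.remove(read)
                break
        else:
            break  # no remaining read overlaps the growing contig
    return contig

# Toy example: three overlapping reads reconstruct one longer sequence.
print(assemble(["GATTACAGG", "ACAGGTTCA", "GTTCAGCAT"]))  # GATTACAGGTTCAGCAT
```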

Feltman: Wow. And outside of the 9/11 investigations, you know, how is that technology changing forensic science?

Corrado: In the criminal justice system, similar to things like mass disasters, where we have degradation of samples, we have a lot of samples in crime scenes that are exposed to environmental conditions. There’s old samples, cold cases where there’s not a lot of DNA left. So all of these technologies that allow us to generate a DNA profile from a very small sample or a very degraded sample have really made leaps and bounds in terms of us being able to identify perpetrators of crimes.

Another technology that's out there that I think is being used in criminal and in identification is SNPs, single nucleotide polymorphisms, and, in particular, that’s using externally visible characteristics, or EVCs. So, say we have a victim of a mass disaster that no one’s really looking for them—they don’t have family members that are looking for ’em, or there are no family reference samples ...

Feltman: Mm-hmm.

Corrado: What we can do with externally visible characteristics is: it can give us clues about the person’s eye color, their hair color, if they had freckles, their skin tone and their biogeographical ancestry. So if we don’t have something to compare to, we might be able to get information as to how this person looked—you know, what their external characteristics were—that might help us identify them.

Feltman: And I assume that’s quite useful in forensic science for lots of other kinds of investigations, too.

Corrado: It can be. It’s relatively new. And quite honestly it’s a little controversial because it’s not clear that we should be using externally visible characteristics to identify suspects, but there are companies out there that offer that service.

Feltman: Sure, yeah, no, I can, I can see the potential issues in, in using it for suspect identification specifically.

Well, are there any challenges related to mass casualty events that forensic scientists are still figuring out how to tackle?

Corrado: Yeah, absolutely. So, you know, when it comes down to the mass disasters, certainly the environment still plays a huge effect. So, you know, like I said, if we have a fire, that can cause degradation. But also if we think about something like a flood—like think about the tsunami in 2004 in South Asia.

First there was this flood, so all of the bodies were submerged underwater, and then they were scattered in such a large area, and it was really hot there; the sun’s beating down on these bodies. And so all of that causes the remains to degrade, and unfortunately there were so many victims in that mass disaster that they couldn’t collect everything quickly enough. And so in that instance the temperature and the heat really affected the ability to use DNA. So in those instances they really had to rely more on other types of mechanisms to identify a body, such as odontology or dental records or fingerprints. So in, in that instance I think DNA was used in very small numbers of the identifications.

And secondly, I think another challenge that we faced in 9/11 that still happens to this day is getting the message out to the family members and collecting reference samples.

So you can imagine—let’s use Maui as an example—it’s a little bit difficult when people are faced with all of these really traumatic experiences to say, “Hey, by the way, we need to collect a DNA sample from you.”

In addition to that, there are sometimes a lot of reluctance for families to give a reference sample. There’s somewhat of a distrust of the government, and particularly in Maui, again, there’s some cultural issues to that. A lot of Indigenous people there had some concerns. They had issues in the past where the government was collecting their DNA to determine, you know, who had rights to land and things like that. So they had a lot of distrust. And so it’s hard to think about: How are we going to explain to these families why it’s so important for them to give their DNA samples if they want to identify their loved one?

Another challenge that we still have, again, that we had in 9/11, we have in all of these situations is: no matter how good our identification methods are, whether it’s DNA or dental records or fingerprints, we still have to identify the remains. So it’s still gonna take the very first person, the anthropologist coming in, sifting through all the debris, saying, “Yeah, that’s a bone. No, it’s not a bone. Yeah, that’s human. No, it’s not human.” It’s a time-comprehensive process. So that’s sort of a limiting factor. So we still have to think about: Are there ways that we could perhaps move that part of the process a little bit faster?

Feltman: Hmm, well, and just going back to something you mentioned earlier: you know, the fact that so many of the victims of 9/11 have not yet been identified. Could you tell me more about how that process is going?

Corrado: Yes, so that project is still being worked [on] by the Office of the Chief Medical Examiner of New York City. So they have staff that are dedicated to that project. They have committed to identifying every one of those last remains if they can. And so they continue to analyze those, and basically they’re doing work in terms of: What are the new technologies out there?

So we’ve talked about the new extraction technologies. They also use different types of DNA: they don’t just use nuclear DNA; they use mitochondrial DNA, which is another type of DNA that’s found in cells in higher copy numbers. So oftentimes in very degraded samples or in bone, there’s more mitochondrial DNA left than there is nuclear DNA. So that’s another process that they can use.

And again they’re looking at this new technology called next-generation sequencing, which is a very different process than we currently use. And this is where we’re sequencing the base pairs of DNA, and next-generation sequencing has a promise of—it’s a lot more sensitive because it—we’re able to sequence a lot smaller fragments, and we can sequence the smaller fragments and then put them together into one larger fragment to read the sample and generate information. And so as this technology progresses, the labs are picking up this technology, validating it and using it in the hopes of identifying more of those remains.

Feltman: Thank you so much for coming on. This was really fascinating.

Corrado: Well, thank you so much for having me. I really appreciate it. And it was my pleasure.

Feltman: That’s all for today’s episode. Tune in on Friday for something very special: a chat with an astronaut—from actual space—about how his time on the ISS is helping him take his photography hobby to new heights.

In the meantime, do us a favor and leave us a quick rating or review or comment or whatever your podcast platform of choice lets you do to tell them that you like us. You can also send us any questions or comments you have at ScienceQuickly@sciam.com.

Science Quickly is produced by me, Rachel Feltman, along with Fonda Mwangi, Kelso Harper, Madison Goldberg and Jeff DelViscio. Shayna Posses and Aaron Shattuck fact-check our show. Our theme music was composed by Dominic Smith. Subscribe to Scientific American for more up-to-date and in-depth science news.

For Scientific American, this is Rachel Feltman. See you next time!


EPA grants air permit, clears way for new deep-water oil port off Southeast Texas coast

The Texas GulfLink would be about 30 miles off the coast of Freeport. It's touted for first-of-its-kind technology to reduce emissions. Environmentalists and Brazoria County residents still have concerns.

This August 2014 photo shows the Gulf shoreline in Texas’ Bolivar Peninsula.

The U.S. Environmental Protection Agency (EPA) has issued an air-quality permit for a proposed deep-water crude oil port about 30 miles off the shore of Freeport, a Gulf Coast town south of Houston. Its supporters say it takes an extra step toward reducing emissions, while environmental advocacy groups and some nearby residents worry it will still exacerbate pollution.

The Texas GulfLink deep-water port would implement a "first-of-its-kind use of vapor capture and control technology mounted on an offshore support vessel," according to a news release issued Monday by the EPA. The agency notes that such technology has been used on shuttle tankers for decades with 96% emission-control efficiency.

"Sentinel Midstream is proud to unveil a groundbreaking vapor control application that will revolutionize the loading of Very Large Crude Carriers in the Gulf of America," said Jeff Ballard, the CEO of Sentinel Midstream, of which Texas GulfLink is a subsidiary, in the EPA news release. "Developed by our Texas GulfLink team in close collaboration with the EPA, this innovative approach significantly reduces volatile organic compounds, setting a new industry standard for environmental performance and advances the implementation of Best Available Control Technology."

Air pollutants that are emitted during the process of obtaining crude oil "will be captured at the tanker and routed via flexible hose to a control system located on an adjacent, dynamically positioned offshore support vessel," according to Brad Toups, an EPA official who wrote the permit and presented it during a public hearing in June. Those emissions, referred to as volatile organic compounds, are either stored and sold or used as fuel.

Sentinel Midstream did not immediately respond to a request for comment Tuesday.

The permit, issued under the Clean Air Act, is one piece of the puzzle toward the port's development. The other is approval from the U.S. Department of Transportation's Maritime Administration, or MARAD. In February, MARAD issued a Record of Decision indicating its approval of the project.

Though the project takes steps toward reducing emissions, clean energy advocacy groups have criticized the Texas GulfLink deep-water port.

"Approving yet another massive offshore oil terminal like this will only worsen a global climate crisis that is already slamming Texans with flooding, heat waves, and drought," Jen Duggan, executive director of the Environmental Integrity Project, told Houston Public Media. "This terminal is expected to release more than 21,000 tons of greenhouse gases per year, as much as 4,321 cars and trucks driven for a year. It is good that the Trump Administration says the terminal will be using some pollution controls. But we should remember that ‘unleashing’ more dirty fossil fuels like this also means more air and water pollution released upstream during the fracking, drilling, and processing of the oil before it even arrives at the oil export terminal. And then more pollution again when it is burned — all to the detriment of the climate and local communities."

During a public EPA hearing in June, members of the Brazoria County community also shared concerns about the initiative.

"This project doesn't benefit people in Brazoria County, it only benefits rich executives who continue to squeeze profits at the expense of communities like Freeport," said Riley Bennington, a Brazoria County resident, according to an EPA transcript of the hearing. "As a kid growing up in Texas, I really thought we'd be past this by now. We've had renewable energy figured out. Why is this even being considered?"

Though most of the testimony during the June 25 public hearing opposed Texas GulfLink, the initiative wasn't completely without praise. Amy Dinn, an attorney from Lone Star Legal Aid representing Better Brazoria, said GulfLink's permits are "much better and more protective of the environment" than other such projects, though she still expressed concerns that not enough research was done on ozone emissions and the impacts of severe weather.

‘They dictate the rules’: BBC tells PM’s Evan Davis to stop hosting heat pump podcast

Presenter believes decision was taken due to the technology’s link with net zero after he was told he risked accusations of political bias

The BBC presenter Evan Davis has been told he can no longer host a podcast about heat pumps due to the corporation’s concerns that discussing the technology risks “treading on areas of public controversy”.

The presenter of BBC Radio 4’s PM programme had hosted 20 episodes of the Happy Heat Pump Podcast, which launched in 2024. It has covered issues around installing the technology, the cost, noise levels and the alternatives for people replacing their gas boilers.

However, despite initially being given approval to go ahead with the non-BBC project, bosses told Davis the podcast risked exposing him to accusations of political bias. “As the series has gone on – in fact as the world has progressed over the last few months – they have become concerned that anything like this trying to inform people about heat pumps can be interpreted, rightly or wrongly, as somehow treading on areas of public controversy,” he told followers of the podcast’s YouTube channel.

“I take their shilling, they dictate the rules. They have to try and keep their presenters out of areas of public controversy, and they have decided heat pumps can be controversial, so they’ve asked me not to be involved.”

The widespread installation of heat pumps is seen as necessary to achieve the government’s target of hitting net zero carbon emissions by 2050. Last month Kemi Badenoch, the Conservative leader, dropped her party’s support for the target. Davis said he believed the decision to stop him appearing on the podcast had been taken because of a link between heat pumps and the net zero target.

Bean Beanland, a director at the Heat Pump Federation and Davis’s co-presenter on the podcast, described the decision as “quite extraordinary”. Douglas Parr, Greenpeace UK’s policy director, said: “As an impartial broadcaster, the BBC should not be pandering to attempts from the right to turn the world’s most efficient home heating system into a culture war issue. What’s next – cancelling Gardeners’ World because of Monty Don’s support for peat-free compost?”

Davis told the Guardian he received “no remuneration at all” for the podcast and had personally paid its small costs for music, dissemination and microphone equipment. He said there was no link with the HPF, other than the fact it employed his co-host.

However, he defended the broadcaster. “While it’s easy to be infuriated by the BBC and its caution on things like this – and of course, I do disagree with it in this case – I’ve never had the burden of actually having to run the BBC and make a hundred decisions a day, while people from all sides shout incessantly at me,” he said.

“I’m obviously free to leave if I don’t like the restrictions that come with working here, but I choose not to because it is a great institution, the PM programme is in excellent shape, and they pay me handsomely.”

The BBC has received criticism over its handling of environmental issues. In 2018, the broadcaster said it would stop “both-sidesing” the climate crisis, admitting that it got some of its coverage “wrong” by setting up debates with those who deny climate science.

However, more recently, the broadcaster has given a platform to some who call for reduced action on the climate breakdown. Producers also accused the BBC of shelving a 2023 political programme by Sir David Attenborough that linked the UK’s biodiversity loss to the climate crisis. Insiders said this was because of fears its themes of the destruction of nature would risk a backlash from Tory politicians and the rightwing press.

BBC guidelines state employees should not compromise the impartiality of the corporation in their outside work. A source said while the BBC is clear that climate change is happening, responses to it are a matter of public policy. They added that Davis’s podcast only explored and promoted one possible solution.

The BBC has previously come under pressure over the external projects of its presenters. Last year, the broadcaster Clive Myrie apologised for failing to declare at least £145,000 earned from external events and said he would stop doing them for the “foreseeable future”.

Workshop explores new advanced materials for a growing world

Speakers described challenges and potential solutions for producing materials to meet demands associated with data centers, infrastructure, and other technology.

It is clear that humankind needs increasingly more resources, from computing power to steel and concrete, to meet the growing demands associated with data centers, infrastructure, and other mainstays of society. New, cost-effective approaches for producing the advanced materials key to that growth were the focus of a two-day workshop at MIT on March 11 and 12.

A theme throughout the event was the importance of collaboration between and within universities and industries. The goal is to “develop concepts that everybody can use together, instead of everybody doing something different and then trying to sort it out later at great cost,” said Lionel Kimerling, the Thomas Lord Professor of Materials Science and Engineering at MIT.

The workshop was produced by MIT’s Materials Research Laboratory (MRL), which has an industry collegium, and MIT’s Industrial Liaison Program. The program included an address by Javier Sanfelix, lead of the Advanced Materials Team for the European Union. Sanfelix gave an overview of the EU’s strategy for developing advanced materials, which he said are “key enablers of the green and digital transition for European industry.”

That strategy has already led to several initiatives. These include a material commons, or shared digital infrastructure for the design and development of advanced materials, and an advanced materials academy for educating new innovators and designers. Sanfelix also described an Advanced Materials Act for 2026 that aims to put in place a legislative framework that supports the entire innovation cycle.

Sanfelix was visiting MIT to learn more about how the Institute is approaching the future of advanced materials. “We see MIT as a leader worldwide in technology, especially on materials, and there is a lot to learn about [your] industry collaborations and technology transfer with industry,” he said.

Innovations in steel and concrete

The workshop began with talks about innovations involving two of the most common human-made materials in the world: steel and cement. We’ll need more of both but must reckon with the huge amounts of energy required to produce them and their impact on the environment due to greenhouse-gas emissions during that production.

One way to address our need for more steel is to reuse what we have, said C. Cem Tasan, the POSCO Associate Professor of Metallurgy in the Department of Materials Science and Engineering (DMSE) and director of the Materials Research Laboratory.

But most of the existing approaches to recycling scrap steel involve melting the metal. “And whenever you are dealing with molten metal, everything goes up, from energy use to carbon-dioxide emissions. Life is more difficult,” Tasan said.

The question he and his team asked is whether they could reuse scrap steel without melting it. Could they consolidate solid scraps, then roll them together using existing equipment to create new sheet metal? From the materials-science perspective, Tasan said, that shouldn’t work, for several reasons.

But it does. “We’ve demonstrated the potential in two papers and two patent applications already,” he said. Tasan noted that the approach focuses on high-quality manufacturing scrap. “This is not junkyard scrap,” he said.

Tasan went on to explain how and why the new process works from a materials-science perspective, then gave examples of how the recycled steel could be used. “My favorite example is the stainless-steel countertops in restaurants. Do you really need the mechanical performance of stainless steel there?” You could use the recycled steel instead.

Hessam Azarijafari addressed another common, indispensable material: concrete. This year marks the 16th anniversary of the MIT Concrete Sustainability Hub (CSHub), which began when a set of industry leaders and politicians reached out to MIT to learn more about the benefits and environmental impacts of concrete.

The hub’s work now centers around three main themes: working toward a carbon-neutral concrete industry; the development of a sustainable infrastructure, with a focus on pavement; and how to make our cities more resilient to natural hazards through investment in stronger, cooler construction.

Azarijafari, the deputy director of the CSHub, went on to give several examples of research results that have come out of the CSHub. These include many models to identify different pathways to decarbonize the cement and concrete sector. Other work involves pavements, which the general public thinks of as inert, Azarijafari said. “But we have [created] a state-of-the-art model that can assess interactions between pavement and vehicles.” It turns out that pavement surface characteristics and structural performance “can influence excess fuel consumption by inducing an additional rolling resistance.”

Azarijafari emphasized the importance of working closely with policymakers and industry. That engagement is key “to sharing the lessons that we have learned so far.”

Toward a resource-efficient microchip industry

Consider the following: In 2020 the number of cell phones, GPS units, and other devices connected to the “cloud,” or large data centers, exceeded 50 billion. And data-center traffic in turn is scaling by 1,000 times every 10 years.

But all of that computation takes energy. And “all of it has to happen at a constant cost of energy, because the gross domestic product isn’t changing at that rate,” said Kimerling. The solution is to either produce much more energy, or make information technology much more energy-efficient. Several speakers at the workshop focused on the materials and components behind the latter.

Key to everything they discussed: adding photonics, or using light to carry information, to the well-established electronics behind today’s microchips. “The bottom line is that integrating photonics with electronics in the same package is the transistor for the 21st century. If we can’t figure out how to do that, then we’re not going to be able to scale forward,” said Kimerling, who is director of the MIT Microphotonics Center.

MIT has long been a leader in the integration of photonics with electronics. For example, Kimerling described the Integrated Photonics System Roadmap – International (IPSR-I), a global network of more than 400 industrial and R&D partners working together to define and create photonic integrated circuit technology. IPSR-I is led by the MIT Microphotonics Center and PhotonDelta. Kimerling began the organization in 1997.

Last year IPSR-I released its latest roadmap for photonics-electronics integration, “which outlines a clear way forward and specifies an innovative learning curve for scaling performance and applications for the next 15 years,” Kimerling said.

Another major MIT program focused on the future of the microchip industry is FUTUR-IC, a new global alliance for sustainable microchip manufacturing. Begun last year, FUTUR-IC is funded by the National Science Foundation.

“Our goal is to build a resource-efficient microchip industry value chain,” said Anuradha Murthy Agarwal, a principal research scientist at the MRL and leader of FUTUR-IC. That includes all of the elements that go into manufacturing future microchips, including workforce education and techniques to mitigate potential environmental effects.

FUTUR-IC is also focused on electronic-photonic integration. “My mantra is to use electronics for computation, [and] shift to photonics for communication to bring this energy crisis in control,” Agarwal said.

But integrating electronic chips with photonic chips is not easy. To that end, Agarwal described some of the challenges involved. For example, currently it is difficult to connect the optical fibers carrying communications to a microchip. That’s because the alignment between the two must be almost perfect or the light will disperse. And the dimensions involved are minuscule. An optical fiber has a diameter of only millionths of a meter. As a result, today each connection must be actively tested with a laser to ensure that the light will come through.

That said, Agarwal went on to describe a new coupler between the fiber and chip that could solve the problem and allow robots to passively assemble the chips (no laser needed). The work, which was conducted by researchers including MIT graduate student Drew Wenninger, Agarwal, and Kimerling, has been patented, and is reported in two papers. A second recent breakthrough in this area involving a printed micro-reflector was described by Juejun “JJ” Hu, John F. Elliott Professor of Materials Science and Engineering.

FUTUR-IC is also leading educational efforts for training a future workforce, as well as techniques for detecting — and potentially destroying — the perfluoroalkyls (PFAS, or “forever chemicals”) released during microchip manufacturing. FUTUR-IC educational efforts, including virtual reality and game-based learning, were described by Sajan Saini, education director for FUTUR-IC. PFAS detection and remediation were discussed by Aristide Gumyusenge, an assistant professor in DMSE, and Jesus Castro Esteban, a postdoc in the Department of Chemistry.

Other presenters at the workshop included Antoine Allanore, the Heather N. Lechtman Professor of Materials Science and Engineering; Katrin Daehn, a postdoc in the Allanore lab; Xuanhe Zhao, the Uncas (1923) and Helen Whitaker Professor in the Department of Mechanical Engineering; Richard Otte, CEO of Promex; and Carl Thompson, the Stavros V. Salapatas Professor in Materials Science and Engineering.

California launches first-in-nation satellite tech to curb methane leaks


California air quality regulators on Friday announced the launch of a first-in-nation satellite data project, with the aim of monitoring and minimizing methane emissions. The technology involves the use of satellite-mounted methane sensors that transmit data regarding the location of methane leaks that could otherwise go undetected, according to the California Air Resources Board (CARB).

The project, funded by a $100 million state budget investment, serves to bolster collaboration between industry and state and local leaders, in order to curb emissions and protect public health, per the agency. In advancing this new initiative, state officials touted the effort as critical climate action amid the Trump administration’s many rollbacks at the U.S. Environmental Protection Agency (EPA).

“Decades of progress to protect public health is on the line as the Trump Administration works to roll back critical environmental protections,” Gov. Gavin Newsom (D) said in a statement. “California isn’t having it.”

Of specific concern to Californians has been the EPA’s decision to reconsider what’s called the “endangerment finding” — the basis for federal actions to curb planet-warming emissions.

“We’re using satellite technology to detect methane leaks as they happen,” Newsom said. “With this new data, we’ll be able to move faster to cut harmful methane pollution – protecting Californians and the clean air we’ve fought so hard for.”

Methane — a clear, odorless gas released from landfills, livestock facilities and fossil fuel operations — is more than 80 times as potent as carbon dioxide when it comes to near-term warming.

The satellites, one of which has already been deployed, will be able to show specific regions for observation, leading to targeted mitigation efforts. “The effort provides information that is much closer to real time than the data now available,” Liane Randolph, chair of CARB, said in a statement. “It allows us to get ahead of one of the major contributors to what has become an immediate threat to public health and the environment.”

The governor on Friday also announced that he was joining the “America Is All In” bipartisan climate coalition as its newest co-chair. The coalition of state and local leaders intends to halve emissions by 2030 and achieve net-zero by 2050, while boosting resilience amid climate challenges.

“With the all-out assault we’re now facing on low-carbon, green growth from the federal level, it’s the subnational leaders — those of us leading our states and cities — who have to step up,” Newsom said.

New desalination technology being tested in California could lower costs of tapping seawater

A new desalination technology is undergoing testing in Southern California. Water managers hope it will offer an environmentally friendly way of tapping the Pacific Ocean.

Californians could be drinking water tapped from the Pacific Ocean off Malibu several years from now — that is, if a company’s new desalination technology proves viable. OceanWell Co. plans to anchor about two dozen 40-foot-long devices, called pods, to the seafloor several miles offshore and use them to take in saltwater and pump purified fresh water to shore in a pipeline.

The company calls the concept a water farm and is testing a prototype of its pod at a reservoir in the foothills of the Santa Monica Mountains. The pilot study, supported by Las Virgenes Municipal Water District, is being closely watched by managers of several large water agencies in Southern California. They hope that if the new technology proves economical, it could supply more water for cities and suburbs that are vulnerable to shortages during droughts, while avoiding the environmental drawbacks of large coastal desalination plants.

“It can potentially provide us Californians with a reliable water supply that doesn’t create toxic brine that impacts marine life, nor does it have intakes that suck the life out of the ocean,” said Mark Gold, director of water scarcity solutions for the Natural Resources Defense Council. “If this technology is proven to be viable, scalable and cost-effective, it would greatly enhance our climate resilience.”

OceanWell’s Mark Golay, left, and Ian Prichard, deputy general manager of Calleguas Municipal Water District, walk toward a prototype of the desalination pod being tested in Las Virgenes Reservoir. (Allen J. Schaben / Los Angeles Times)

During a recent demonstration at Las Virgenes Reservoir, Tim Quinn, the company’s water policy strategist, watched as the 12-foot-long cylindrical prototype was lowered underwater on a cable. “We pull fresh water only up out of the ocean, and the salt stays down there in low concentrations, where it’s not an environmental problem,” Quinn said.

The testing at Las Virgenes Reservoir will help the company’s engineers check how the system works in filtering out plankton and discharging it back into the water. When the pod was nearly 50 feet underwater, Mark Golay, the company’s director of engineering projects, turned on the pumps and water flowed from a spigot.

The next step, expected later this year, will involve conducting trials in the ocean by lowering a pod from an anchored boat into the depths about 5 miles offshore. “We hope to be building water farms under the ocean in 2028,” Quinn said.

Quinn previously worked for California water agencies for four decades, and he joined Menlo Park-based OceanWell two years ago believing the new technology holds promise to ease the state’s conflicts over water. “Ocean desal has never played a prominent role in California’s water future,” he said, “and this technology allows us to look to the ocean as a place where we can get significant sources of supply with minimal, if any, environmental conflict.”

Managers of seven Southern California water agencies are holding monthly meetings on the project and studying what investments in new infrastructure — such as pipelines and pump stations — would be needed to transport the water the company plans to sell from the shore to their systems. Leaders of Las Virgenes Municipal Water District, who are spearheading the effort, are holding an event at the reservoir Friday to showcase how the technology is being tested. The pilot study is being supported by more than $700,000 in grants from the Metropolitan Water District of Southern California and the U.S. Bureau of Reclamation.
The company still will need to secure additional permits from the federal government and the state. And it has yet to estimate how much energy the process will require, which will be a major factor in determining the cost. But water managers and other experts agree that the concept offers several advantages over building a traditional desalination plant on the coast.

Significantly less electricity is likely to be needed to run the system’s onshore pumps because the pods will be placed at a depth of about 1,300 feet, where the undersea pressure will help drive seawater through reverse-osmosis membranes to produce fresh water.

While the intakes of coastal desalination plants typically suck in and kill plankton and fish larvae, the pods have a patented intake system that the company says returns tiny sea creatures to the surrounding water unharmed. And while a plant on the coast typically discharges ultra salty brine waste that can harm the ecosystem, the undersea pods release brine that is less concentrated and allow it to dissipate without taking such an environmental toll.

Golay lowers a prototype into Las Virgenes Reservoir for testing. (Allen J. Schaben / Los Angeles Times)

If the technology proves viable on a large scale, Gold said, it would help make Southern California less reliant on diminishing imported supplies from the Sacramento-San Joaquin River Delta and the Colorado River.

Research has shown that human-caused climate change is driving worsening droughts in the western United States. Gov. Gavin Newsom’s administration has projected that as rising temperatures diminish the snowpack and intensify droughts, the average amount of water available from the reservoirs and aqueducts of the State Water Project could shrink between 13% and 23% over the next 20 years.

Southern California’s water agencies are moving ahead with plans to build new facilities that will transform wastewater into clean drinking water, and have also been investing in projects to capture more stormwater.

In addition to the economic viability, other questions need to be answered through research, Gold said, including how well the system will hold up filtering tiny sea life, how much maintenance will be needed, and whether the pods and hoses could present any risk of entangling whales.

OceanWell’s executives and engineers say their system is designed to protect marine life and eliminate the environmental negatives of other technologies.

A conceptual illustration shows a so-called water farm that OceanWell plans to install off the California coast, with 40-foot-long pods anchored to the seafloor about 1,300 feet deep. (OceanWell)

Robert Bergstrom, OceanWell’s chief executive, has been working on desalination projects since 1996, and previously built and operated plants in the U.S. Virgin Islands, the Bahamas and other Caribbean islands for the company Seven Seas Water, which he founded.

When Bergstrom retired, he moved to California and eventually decided to go back to work to develop technology to help solve California’s water problems. “I had a big idea,” Bergstrom said. “I knew this was going to be just a huge lift to get this done, a moonshot.”

OceanWell, founded in 2019, now has 10 employees. Its lead investor is Charlie McGarraugh, a former partner of the investment banking company Goldman Sachs. One of its major investors is Japan-based Kubota Corp. Building on Bergstrom’s concept, Chief Technology Officer Michael Porter and the engineering team have worked on the design.
They built the first prototype in Porter’s kitchen in San Diego County, and did initial tests in a lab. “It was inspired by the environmental community in California pointing out problems that needed to be solved,” Bergstrom said.

Desalination plants are operating in parts of California, including the nation’s largest facility, in Carlsbad, and a small-scale plant on Santa Catalina Island. But proposals for new coastal desalination plants have generated strong opposition. In 2022, the California Coastal Commission rejected a plan for a large desalination plant in Huntington Beach. Opponents argued the water wasn’t needed in the area and raised concerns about high costs and harm to the environment.

The problem of traditional shallow intakes drawing in large amounts of algae, fish larvae and plankton goes away in the deep sea, Bergstrom said, because the perpetual darkness 1,300 feet underwater supports vastly less sea life. “We have much cleaner water to deal with,” Bergstrom said. “It’s pretty much a barren desert where we’ve chosen to locate, and as a result, we just don’t have that much stuff to filter out.”

A specific site for the first water farm has not yet been selected, but the company plans to install it nearly 5 miles offshore, with a pipeline and a copper power cable connecting it to land.

Putting the system deep underwater will probably reduce energy costs by about 40%, Bergstrom said, because unlike a coastal plant that must pump larger quantities of seawater, it will pressurize and pump a smaller quantity of fresh water to shore.

Bergstrom and his colleagues tout their invention as a totally different approach. They say it’s not really desalinating seawater in the traditional sense, but rather harvesting fresh water from devices that function like wells in the ocean.

After their first water farm, they envision building more along the coast. Bergstrom believes they will help solve water scarcity challenges in California and beyond. Various sites off California would be well-suited to develop water farms, from San Diego to Monterey, Bergstrom said, as would many water-scarce countries with deep offshore waters, such as Chile, Spain and North African nations.

“I believe it’ll reshape the world more than just California water,” Quinn said, “because I think the globe is looking for something that is this environmentally friendly.”

Under the company’s plans, the first water farm would initially have 20 to 25 pods, and would be expanded with additional pods to deliver about 60 million gallons of water per day, enough for about 250,000 households.

Las Virgenes and six other water agencies — including L.A. Department of Water and Power, the city of Burbank and Calleguas Municipal Water District — are working together on a study of how water could be delivered directly from the project, and at what cost, as well as how inland agencies could benefit indirectly by exchanging supplies with those on the coast.

“We’re very heavily dependent on imported water, and we need to diversify,” said David Pedersen, Las Virgenes’ general manager. “We need to develop new local water that’s drought resilient, and that can help us as we adapt to climate change.” His district, which depends almost entirely on imported supplies from the State Water Project, serves more than 75,000 people in Agoura Hills, Calabasas, Hidden Hills, Westlake Village and surrounding areas.
Mike McNutt, public affairs and communications manager for Las Virgenes Municipal Water District, tastes water that flows from a spigot after passing through a prototype desalination system at Las Virgenes Reservoir. (Allen J. Schaben / Los Angeles Times)

During the drought from 2020 to 2022, the district was under severe water restrictions and customers reduced usage nearly 40%. Pedersen hopes the district will be able to tap the ocean for water by around 2030.

At Calleguas Municipal Water District, which delivers water for about 650,000 people in Ventura County, deputy general manager Ian Prichard said one of the big questions is how much energy the system will use. “If the technology works and they can bring it to market, and we can afford to bring the water into our service area, then that would be great,” Prichard said. “The big test is, can they produce water at a rate that we want to pay?”
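A back-of-envelope comparison helps explain the depth-pressure advantage described above and why the energy question may be tractable: the hydrostatic pressure roughly 1,300 feet down exceeds the osmotic pressure of typical seawater, so the ocean itself supplies much of the driving pressure that a coastal plant generates with high-pressure pumps. The values in the sketch below are generic seawater figures, not OceanWell's specifications.

```python
# Rough comparison of hydrostatic pressure at the planned pod depth with
# the osmotic pressure of seawater. Generic textbook values, not
# OceanWell specifications.
RHO_SEAWATER = 1025.0        # kg/m^3, typical seawater density
G = 9.81                     # m/s^2
DEPTH_M = 1300 * 0.3048      # ~1,300 ft expressed in meters (about 396 m)
OSMOTIC_PRESSURE_BAR = 27.0  # typical for seawater at ~35 g/kg salinity

hydrostatic_bar = RHO_SEAWATER * G * DEPTH_M / 1e5  # Pa -> bar
print(f"hydrostatic pressure at {DEPTH_M:.0f} m depth: {hydrostatic_bar:.0f} bar")
print(f"typical seawater osmotic pressure:             {OSMOTIC_PRESSURE_BAR:.0f} bar")
print(f"excess available to drive reverse osmosis:     "
      f"{hydrostatic_bar - OSMOTIC_PRESSURE_BAR:.0f} bar")
```

Under those assumptions, much of the remaining onshore energy would go into pumping permeate to the surface rather than pressurizing the full seawater feed, which is consistent with Bergstrom's estimate of roughly 40% lower energy costs, though, as noted above, the company has yet to estimate the total energy requirement.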
