
Outrage at plans to develop Turkey’s cultural heritage sites

News Feed
Monday, August 21, 2023


Archaeologists fear dangerous precedent if court approves new beach facilities at site of Phaselis on the Mediterranean coast

The construction of tourist facilities on two beaches that were part of the ancient city of Phaselis – a tentative nominee for Unesco world heritage status – has caused outrage at what is claimed to be the latest example of the Turkish culture ministry sacrificing heritage for tourism.

The Alacasu and Bostanlık beaches, on Turkey’s southern Mediterranean coast in the province of Antalya, were part of Phaselis, a Greek and Roman settlement thought to be the birthplace of Plato’s student Theodectes. Despite having ruins dating back to the second century BC, the beaches have never been subject to an archaeological dig.


Storyteller Ed Edmo brings Native traditions to life at Celilo Falls and other cultural sites

Edmo, Shoshone-Bannock, grew up listening to his father’s stories. “At the time, I didn’t realize it wasn’t just stories,” he says. “It was my dad giving me my culture night by night.”

This story originally appeared on Underscore.news.

Many who grew up in the Pacific Northwest recognize Ed Edmo by his expressive hands and face as he tells stories, breathing life into characters like Nasho, a monster woman, and cultural tricksters like the Coyote and Spider Woman. He changes his voice, making it high for the women’s parts and low for the men’s. Sometimes he incorporates puppets or asks the audience to help him, urging them to repeat after him.

Ed Edmo, Shoshone-Bannock, is a traditional storyteller, poet, playwright, published author, actor, performer, instructor and tour guide who lectures on cultural issues at cultural sites in the Pacific Northwest, such as the flooding of Celilo Falls, as well as drug and alcohol abuse and mental health for Native peoples. He has also served as a consultant to the Smithsonian Museum of the American Indian in Washington, D.C.

He tells a story of a monster with “long hair, claws for fingers, snaggly teeth, snot down to here, and bad breath,” asking the audience to repeat those descriptions with him every time he mentions the monster, using hand motions and facial expressions to fully immerse his audience in the story. As his hands wave through the air, the light catches on his silver and turquoise rings, one on almost every finger. Edmo says this is an audience favorite.

“Stories were told to teach children how to act,” shares Edmo. “…how to treat their elders, how to interact with nature and the world.”

Once Edmo is on a roll, it’s hard for him to stop. He loses time, as does the audience, immersed in the moment. He tells story after story, as long as the audience is engaged. His stories explain how things work or why they are the way they are. He brings the stories and legends of his culture alive. It feels like you’re a part of the story. You can smell the wood burning, feel the bee sting, taste the salmon eyeball soup, and smell the meat cooking because of the way he tells it.
He uses props like a cane and puppets of a bear, goose, coyote and eagle to help illustrate his stories.

Edmo grew up listening to his father’s stories. He remembers the crackling fire and his mom, grandmother, and aunties quilting around it as they sat and listened.

“At the time, I didn’t realize it wasn’t just stories,” Edmo said, stroking his salt-and-pepper braids. “It was my dad giving me my culture night by night.”

Native resiliency

With a birth story like Edmo’s, it’s no wonder he grew to become the traditional storyteller he is today. When Edmo was born on the Duck Valley Indian Reservation in Nevada in 1947, he was so small and weak that the delivery doctor gave him five minutes to live. Seventy-six years later, he is not only alive but thriving. While still small in stature, Edmo’s gravitas and impact on contemporary Native culture in the Pacific Northwest are anything but.

Edmo is a living testament to Native resiliency. Shortly after he was born, his family moved back to his father’s ancestral home at Celilo Village near The Dalles along the Columbia River. His paternal family’s homelands are in the Columbia Gorge. The Celilo Wy-am people are best known for what is currently known as Celilo Falls, where the river once plunged in a torrential drop of over 20 feet between basalt bluffs along the shorelines. These were the ancestral fishing grounds of the Celilo Wy-am people, who speared salmon while standing on rocks surrounding the falls. Later, they created platforms across the rocks from which they caught fish in nets.

When Edmo was just a young boy, The Dalles Dam flooded Celilo Falls. Before the flooding, the falls provided everything that the Celilo Wy-am people needed: from salmon, sturgeon, smelt, and eel to traditional roots and berries along the riverbanks.
After the falls were flooded, their land base went from over 108 acres to what is now just over seven acres.

[Photo: While eating lunch at one of his favorite local cafes, Ed Edmo shares a photo taken of him as a young man (center) singing in the drum circle while occupying the Portland Area Bureau of Indian Affairs office on Nov. 11, 1973. Also pictured: Al Smith (right) and Bear Cub (back right). Credit: Ed Edmo/Underscore News/Report for America]

[Photo: Ed Edmo’s hands are adorned with jewelry; many pieces are repurposed silver spoon handles that he now wears as rings. Credit: Ed Edmo/Underscore News/Report for America]

To keep that history alive, Edmo teaches students and community groups in the Pacific Northwest about Celilo Falls and the Celilo Wy-am people. During one of his presentations, Edmo shared slides on Celilo Falls.

“I couldn’t believe how something so big and beautiful could disappear,” he said.

Edmo was raised by both his parents and attended elementary school at Petersburg School in The Dalles, which was a one-room schoolhouse.

“I didn’t realize it was so small,” Edmo said. “When I was so little, it seemed big!”

He graduated from Wishram High School, where he first started getting involved in performance art as a drama student. In 1974, Edmo met his wife, Carol, in Portland. He said they heard about each other through various friends and relatives. They got married in 1976.

Edmo is now in recovery from alcohol and lives a clean and sober life. He goes to Alcoholics Anonymous meetings to stay sober.

“I’m following the path,” he shares.

Grandfather Storyteller

In addition to storytelling, Edmo is a renowned poet and writer. In 1974, he received a grant from the Potlatch Fund for his poem “These Few Words of Mine.” At the time, he was attending Portland Community College.

“I write about contemporary moments in time for contemporary Indians,” said Edmo.
“…poems about drinking coffee and raking leaves, about my parents when I was a kid.” Poems with titles like “Grandfather Storyteller” and “Burnside Cowboy.”

Edmo’s most famous poem is “Indian Education Blues.” He wrote it in 1968 in Bend, Oregon. Now it’s part of the cultural exhibit “Poetry in Motion,” which features poetry on TriMet transit buses and light rail.

He keeps a beaded pen on him, calling it his “weapon.” He explains that, as a writer, the inspiration to write can hit at any time or any place. One time, he left his journal in the car when he stopped at a Dairy Queen. There was a family in the restaurant, and he was compelled to write a story about them based on their expressions and interactions with each other and the staff, creating a whole backstory from his imagination. He wrote it on a napkin.

Sharing culture

[Photo: Ed Edmo, Shoshone-Bannock, is a traditional storyteller, poet, playwright, published author, actor, performer, instructor and tour guide who lectures on cultural issues at cultural sites in the Pacific Northwest. Credit: Ed Edmo/Underscore News/Report for America]

His writing and storytelling have recently been memorialized in a collection of murals of Portland icons, entitled “A Place Called Home,” in the Portland International Airport. The mural of Edmo depicts him dressed in a blue button-down shirt with his iconic suspenders, traditional beadwork, and his long hair in two braids. Above his dapper black hat, Edmo holds one of his favorite storytelling props: a stuffed eagle.

When the mural was first unveiled, Edmo attended and sang a song for the airport employees. Edmo has spent his life sharing his culture and supporting his community, in this and many other ways. He currently serves on the board of Red Lodge Transition Services, a program for women reentering Clackamas County after prison. He teaches cultural classes at Coffee Creek Women’s Penitentiary, and storytelling classes at Gresham High School.
He says he is also looking forward to performing at the Fisher Poets Gathering in Astoria at the end of February.

He finds time to dance and run sweat lodges at various men’s prisons in the Pacific Northwest, where he serves as a traditional dancer. He says he doesn’t believe in competition dancing, where dancers get paid for their performances. While he admits he used to charge people coins or candy for pictures of him in his regalia when he was younger, he says he believes in the tradition of dance as ceremony and as a way to promote healing in the community.

“At Celilo, we danced for the fun of it, not for competition,” he says.

— Leah Altman, Oglala Lakota, is a Native American adoptee and was raised in the Portland area. She has written for local and national publications, including Portland Monthly, Oregon Humanities, Portland State University’s Metroscape magazine, and Indian Country Today. For over a decade, Leah has also worked for Native and BIPOC-led environmental and community organizations, including the NAYA Family Center, Verde, Native Arts & Cultures Foundation, and the Intertribal Agriculture Council.

The Farm Bill Hall of Shame

The farm bill is one of the most important but least understood pieces of US legislation, and it’s overdue for renewal. But Congress couldn’t pass a new version in the fall, reflecting partisan dysfunction and also a contentious debate about what the bill ought to be, a debate that has become ensnared in the nation’s culture wars. Racial equity, food sovereignty, protections for workers, and meaningful action on climate change have broadened the bill’s traditional mandate of growing food and feeding hungry people. In this special series, a partnership with the Food and Environment Reporting Network, we’ll be exploring some of the urgent issues a new farm bill must address. Read the other stories in the series here.

The farm bill is among the most important pieces of legislation that Congress is more or less obliged to pass. Yet to all but a handful of people whose job it is to parse its every incremental gain or loss, it is largely inscrutable. Every five years we’re treated to bitter fights over things like the use and abuse of agricultural subsidies; attempts to defund SNAP; the notion that environmental stewardship should guide farm policy as much as increasing production; and how (and sadly whether) to build equity into an agriculture system with a racist history.

But the backstories to these fights, some ill-fated and others shameful, can provide important context and help to clarify exactly what’s at stake. Over the last 90 years there have been several key farm bill moments, the consequences of which shape the debates ongoing today.

The farm bill’s original sin

The story of the farm bill is one of Black land dispossession and persistent racial inequality in American agriculture. It was baked into the first farm bill, and it has never been made right. Faced with the twin crises of the Dust Bowl and the Great Depression, the Roosevelt administration was determined to help ailing farmers.
But in the rush to pass the 1933 Agricultural Adjustment Act (AAA), the era’s most dramatic piece of farm legislation, FDR sided with Southern plantation owners, who wanted to raise crop prices, and farm income, by paying farmers to plant fewer acres.

This was disastrous for Black farmers. As Jonathan Coppess writes in The Fault Lines of Farm Policy, historians argue that the AAA was “designed and used intentionally to help southern cotton planters push poor black sharecroppers off the land and consolidate their holdings.” Only landowners could sign government contracts that paid farmers to plow up their fields, and the vast majority of Black farmers were tenants or sharecroppers. White landowners were supposed to share the payments they got with their renters and sharecroppers, but few did. Others purposefully signed crop subsidy contracts only when they didn’t have tenant or sharecropper agreements, to avoid sharing the wealth.

“Whether the share-cropper received any payment or any credit allowance for the cotton plowed up on his patch depended upon the tender mercies of his landlord,” wrote Webster Powell and Addison T. Cutler in Harper’s in 1934. “We encountered a number of cases where the landlord arranged with the government to plow up all of the patch operated by an individual cropper and followed this up by closing the books with the cropper and sending him ‘down the road.’”

All told, the largest farms maintained or even increased their harvests while farming less land, and farmers on rented land went out of business. Between 1930 and 1935, land owned by white farmers increased by 35 million acres while land owned by all other farmers decreased by more than 2 million acres. Over the next twenty years, the number of Black farmers dropped 37 percent.

The AAA was just the beginning. Decades of racist policies followed, driving further dispossession, including discriminatory USDA lending and the denial of title and other property rights.
Between 1910 and 1997, Black farmers lost 90 percent of their farmland. Today, Black farmers own less than 1 percent of total US farmland.

This year, a group of Senate Democrats, led by Cory Booker of New Jersey, is pushing for the Justice for Black Farmers Act, first introduced in 2020, to be included in the next farm bill. The legislation’s aim is ambitious: “[E]nd discrimination within the USDA, protect remaining Black farmers from losing their land, [and] provide land grants to create a new generation of Black farmers.” It would be fitting if the farm bill, once used to solidify racial inequality, becomes the vehicle to right that historical wrong.

Related: The “Machine That Eats Up Black Farmland”

The birth of huge

The number of American farms has dropped from 6.8 million in 1935 to 2 million today. The remaining farms have gotten a lot bigger, with the median operation managing 445 acres and the largest 11 percent producing 82 percent of all farm goods. Large farms deal with similarly massive agribusiness companies that control seeds, fertilizer, and other things farmers need. Together, this so-called “Big Ag” system dominates our food supply, prioritizing yields and efficiency over high-quality food, a healthy environment, and rural development.

How did this happen? The conventional wisdom blames Earl Butz, Richard Nixon’s secretary of agriculture, who reportedly urged farmers to “get big or get out” (though there is scant evidence that he actually said it). Regardless, most farm consolidation had already happened by the time Butz took the reins at the Department of Agriculture in 1971.

Some of the consolidation was made possible by technological advancements that exploded yields in the postwar era, like rural electrification and better seed breeding. But the farm bill, and ag policy broadly, also favored large farms over smaller ones.
Even before the New Deal, agricultural officials envisioned a system of large, mechanized farms, the kinds of operations that needed fewer people to do the work. So when the federal money started flowing in the 1930s, most of it went to the big players, who used it to invest in tractors and other technology that allowed them to get even bigger. For instance, FDR ended the 1931 “feed and seed” loans that many small farmers and tenants relied on in favor of federal lending programs with steep collateral requirements that small farmers couldn’t meet. As small cotton farmers told Harper’s in 1934, “The only thing they haven’t got a mortgage on is my wife and kids.”

Even though Republicans and Democrats bemoaned the “farm problem,” the rapid loss of farms that continued through the 1950s and into the 1970s, neither party summoned the political will to try the more aggressive interventions some states were experimenting with, like limiting corporate farmland ownership. And anti-communist Cold War sentiment shut down any consideration of collective farming or land redistribution.

Late New Deal policy did aim to create a fair price floor for all farmers and discourage the overproduction that drove prices down. But policymakers never quite struck the right balance between preventing overproduction and ensuring that farms of all sizes were able to access federal support and stay afloat. When Eisenhower and his agriculture secretary, Ezra Taft Benson, lowered New Deal price floors in 1954 and ended corn production controls in 1959, surpluses grew, prices fell, and larger farms were able to survive on those lower prices while smaller ones could not. The Kennedy administration made one last effort to support crop prices by limiting production, but wheat farmers killed it in the famous 1963 wheat referendum.
The president of the American Farm Bureau Federation, the nation’s biggest agricultural lobby, which had led the charge against the referendum, called the defeat “a major turning point in the continual battle…between those who believe in an agriculture producing for the competitive market and those who favor Government supply management.” So farm policy inched closer and closer to unfettered production for low market prices, and the “farm problem” got worse.

The debate simmers on today. The clamor to do something about consolidation in the next farm bill includes a proposal for an “Office of Small Farms” that would “ensure that all USDA programs are designed to meet the needs of small farmers.”

Show me the money

The farm bill has always allowed big farms to take more than their fair share of federal subsidies and use the extra cash to outcompete smaller farms. This inequity has gotten worse since the 1970s, when the farm bill stopped trying to raise market prices as a way to help farmers and instead started making direct payments to them.

The problem is that direct payments don’t make it to the people who need them the most. Between 1995 and 2021, the top 10 percent of subsidy recipients, the largest and wealthiest farms, got more than 78 percent of commodity program payments, and the top 1 percent received a whopping 27 percent. To date, every effort to set a reasonable cap on subsidy payments for large farms has failed, stymied by the power of the ag lobby and its allies in Congress.

Congress nearly passed subsidy limits in the 2014 farm bill cycle. Legislation included a $50,000 limit, stricter requirements that recipients be “actively engaged in farming” to prevent non-farmers from receiving payments, and a modest limit on crop insurance subsidies for the richest farms. Sen.
Thad Cochran from Mississippi, the ranking GOP member of the Senate Agriculture Committee, who was facing a primary challenge, reportedly managed to nix the limit before the final bill was passed, as part of an effort to shore up his support back home.

In May 2023, the Des Moines Register published an op-ed that called for a farm bill ensuring that subsidies go to “actual farmers instead of the current practice of sending endless subsidies to millionaires, billionaires, nonfarmers, and absentee owners who often live in big cities or even foreign countries.” It could have been written fifty years ago.

Feeding the crop insurance beast

In February 2023, the Government Accountability Office reiterated its longstanding plea for Congress to rein in the soaring costs of federal crop insurance, which have more than quadrupled since 2005. Since 1938 the federal government has subsidized insurance for farmers to protect against losses from things like poor harvests, a drop in market prices, and extreme weather. But until the 1990s, crop insurance subsidies were a marginal part of the farm safety net; today they’re the second most expensive program in the farm bill (after SNAP).

The GAO didn’t say it, but it’s reasonable to ask: Why, as the impact of climate change becomes impossible to ignore, are taxpayers paying farmers billions to grow thirsty, failure-prone crops such as cotton and alfalfa in the desert Southwest? And why are private insurance companies raking in taxpayer money to cover them?

Here’s why. In 1996, Sen. Pat Roberts’ “Freedom to Farm” bill turned most crop subsidies into fixed payments, unrelated to market prices. It also increased federal subsidies for private crop insurance, so that if farmers wanted extra income guarantees they could pay to insure their profit margin or their crops.
When crop prices collapsed one year into the Freedom to Farm plan, the new fixed payments proved paltry, driving more farmers to buy crop insurance and to demand ad hoc emergency government payouts, which they got, to the tune of $30 billion between 1998 and 2002. After shelling out for emergency payments, Congress doubled down on pushing farmers to buy crop insurance and gave them even more money to cover premiums. At the end of 1994, there were 90 million acres covered by crop insurance; today nearly 500 million acres are covered. Taxpayers pay roughly 60 percent of farmers’ insurance premiums, even as plans become more expensive because of crop failures driven by climate change.

This system has one big winner: private crop insurance companies. GOP leadership, including Pat Roberts, snuck a provision into the 2014 farm bill that prevents the government from ever spending less money to subsidize crop insurance. The law effectively locks in high profits for insurers.

And how about those cotton growers in the desert? Since 2001, heat linked to climate change has driven $1.33 billion in insurance payouts to farmers across the Southwest for crops that failed amid high temperatures. And that number is only going to grow.

Risky, environmentally sensitive, and less-productive farmland can get the same crop insurance subsidy as highly productive land, even though it’s more likely to underperform and require an insurance payout. With these incentives, farmers converted millions of acres of less-productive, drought-prone grasslands and sensitive wetlands into cropland, land that likely would not otherwise have been farmed. An analysis by the Environmental Working Group estimates that over 23 million acres of grassland, shrubland, and wetlands were turned into cropland between 2008 and 2011.

What climate crisis?
Beyond the perennial call for deep cuts to SNAP, one of the biggest obstacles to a new farm bill this year is an effort by House Republicans to hijack $20 billion from Biden’s big climate bill, the Inflation Reduction Act, earmarked for agricultural climate-mitigation projects, and use it to fatten crop subsidies.

You may think this is a singular bit of crazy generated by the chaos that engulfed the House GOP last fall. But in fact it is merely the latest step in the ongoing degradation, via the farm bill, of farmers’ ability to mitigate the climate crisis.

Farm bill conservation policy is split into “land retirement” programs, like the Conservation Reserve Program (CRP), which pay farmers to take erosion-prone or environmentally sensitive land out of production, and “working lands” programs, such as the Environmental Quality Incentives Program (EQIP) and the Conservation Stewardship Program (CSP), which pay farmers to adopt more sustainable practices.

In the run-up to the 2014 farm bill, the conservation programs came under fire as Tea Party Republicans refused to increase the debt ceiling and demanded significant spending cuts. In the face of this pressure, a small group of congressional leaders struck a deal, sometimes called the “Secret Farm Bill,” which agreed to $23 billion in farm program cuts over ten years. When the 2014 law finally emerged, conservation spending had been cut by $6 billion, or nearly 10 percent, over ten years.

These changes sent US agriculture backward at a time when farms needed and wanted support to conserve biodiversity and reduce air and water pollution. An analysis by the Environmental Working Group, for instance, found that between 2017 and 2020, only 23 percent of EQIP funds supported practices deemed “climate-smart” by the USDA. And even without the cuts, EQIP and CSP do not prioritize practices that reduce greenhouse gas emissions.
To make matters worse, every conservation program except the Conservation Reserve Program (CRP) is subject to further cuts through the annual federal budget-setting process. The year after the 2014 farm bill passed, Congress gave all conservation programs (other than CRP) 7 percent less money than originally stipulated. All told, the National Sustainable Agriculture Coalition estimates that these budgeting processes will have cut another $2 billion from conservation spending between 2014 and 2024.

So even if Democrats manage to hang on to the billions earmarked for conservation in the Inflation Reduction Act, it will just barely make up for cuts made a decade ago. Such is the perverse reality of the farm bill.

The hidden racial bias in U.S. lung cancer screening policy

A 2019 study found that screening eligibility rules disproportionately excluded Black smokers. Little has changed

About five years ago, Vanderbilt University epidemiologist Melinda Aldrich discovered something about the U.S. cancer care system that has eaten at her ever since.

Aldrich had been studying the public health benefits of preventive lung cancer screening, widely seen as a critical first line of defense against a disease that kills more than 120,000 Americans a year, more than the combined total for breast, colon, and prostate cancers. Lung cancer is extraordinarily difficult to treat in its advanced stages, and the earlier it is detected, the better a person’s odds for survival, many experts say. But hospitals and insurers impose strict criteria for deciding who can receive preventive lung cancer screening, and when. At the time of Aldrich’s discovery, a person had to be between 55 and 80 years old and have smoked the equivalent of 30 pack-years (a pack a day for 30 years, say, or two packs a day for 15) to be screened with the standard method for early detection, called low-dose spiral computed tomography.

Aldrich’s analysis of more than 84,000 adult smokers revealed that under those criteria, Black patients at relatively high risk of lung cancer were being disproportionately excluded from the screening eligibility pool. More than two-thirds of Black smokers who were diagnosed with lung cancer did not meet the age and smoking history criteria at the time of their diagnosis.

“These people are getting diagnosed with lung cancer, and yet they can’t even get in the door to be screened,” Aldrich later recalled.

Aldrich knew that Black Americans tended to develop lung cancer at higher rates and at younger ages than their White counterparts, despite on average smoking fewer cigarettes. And by her calculations, high-risk Black smokers were being excluded from screening at more than one and a half times the rate of White smokers. As Aldrich saw it, the screening inequity had the potential to worsen the already staggering toll that lung cancer exacts in Black communities.
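The eligibility rule at the heart of the story reduces to simple arithmetic. As a minimal illustrative sketch (the function names are ours, not drawn from any clinical screening tool):

```python
def pack_years(packs_per_day: float, years_smoked: float) -> float:
    """Pack-years = average packs smoked per day x years smoked."""
    return packs_per_day * years_smoked

def eligible_2013(age: int, packs_per_day: float, years_smoked: float) -> bool:
    """The 2013-era criteria described above: age 55-80 and at least 30 pack-years."""
    return 55 <= age <= 80 and pack_years(packs_per_day, years_smoked) >= 30

# A pack a day for 30 years and two packs a day for 15 years
# both work out to the same 30 pack-years.
print(pack_years(1, 30))         # 30
print(pack_years(2, 15))         # 30
print(eligible_2013(50, 2, 20))  # False: 40 pack-years, but under the age floor
```

The last line illustrates the gap Aldrich identified: a 50-year-old with a heavy smoking history is excluded by the age threshold alone, however high their cumulative exposure.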
According to the American Lung Association, the disease claimed the lives of more than 14,000 Black Americans in 2021, the most recent year for which there was data. Black lung cancer patients were 15 percent less likely than White patients to be diagnosed early, and had the lowest five-year survival rate of any racial group.

Based on their findings, Aldrich and her collaborators recommended lowering the age threshold to 50 and the smoking history requirement to 20 pack-years for Black smokers, to make rates of screening eligibility more equitable. Putting race at the center of cancer screening was in some ways a radical proposal, and Aldrich was prepared for pushback from her peers. But at the 2018 World Conference on Lung Cancer, where she first presented the findings, the response from the audience was very positive. Aldrich and her colleagues realized the work was going to be taken seriously.

Two years after the Vanderbilt study was published, the nation’s standard-bearer for preventive health policy, an independent panel of experts known as the U.S. Preventive Services Task Force, revised its lung cancer screening guidelines, citing the Vanderbilt work as a factor alongside other studies, including a large study called the NELSON trial. But rather than lower the age and pack-year requirements specifically for Black smokers, as Aldrich and her colleagues had advised, the task force loosened those criteria for all smokers, regardless of race. The new guidelines caught a larger share of Black smokers, but also larger shares of White smokers and smokers of other backgrounds. As a result, the change narrowed, but did not close, the racial screening gap that Aldrich and company had identified.
The task force’s decision has thrust it into the center of a simmering debate about the best way to address the factors underlying racial health disparities in lung cancer and other diseases. And it has led some experts to call for a more complete overhaul of screening policy, one that accounts not just for race but for genetics, environment, occupation, and other risk factors in screening decisions.

“Lung cancer [screening] is not going to be a one-size-fits-all solution,” said Aldrich, who advocates for more personalized approaches to assessing risk. That those approaches have yet to gain traction in health policy circles, however, underscores a seemingly inescapable truth of the American health care system: with public health interventions like preventive screening, it’s nearly impossible to disentangle scientific findings from the politics and social forces that drive them.

Aldrich’s interest in racial disparities in lung cancer goes back two decades, to her days as a Ph.D. student at the University of California, Berkeley. There, she studied the genetic factors that influence lung cancer risk, and how those factors vary between Black American and Hispanic populations. When she arrived at Vanderbilt in 2010, she began deepening her understanding of the fraught topic, probing genetics, social determinants of health, and environmental risk factors.

For the better part of Aldrich’s career, there were no official national standards for screening asymptomatic patients for lung cancer, not even for heavy smokers, who are known to be at elevated risk. In part, that was because the most common tool for examining a patient’s lungs, the chest X-ray, wasn’t very good at identifying small tumors.
In 2011, however, a study known as the National Lung Screening Trial, which followed more than 53,000 people aged 55 to 74 with a history of heavy smoking, showed that low-dose spiral computed tomography, or low-dose CT, could identify early-stage tumors while they were still treatable. Study participants screened with low-dose CT had a 20 percent lower risk of lung cancer death than those screened with standard chest X-rays. Two years later, citing that study, the U.S. Preventive Services Task Force issued its first-ever lung cancer screening guidelines: Anyone 55 or older with at least a 30 pack-year smoking history — a pack-year being one pack a day for a year — should be screened annually with low-dose CT. The recommendation, like all of the task force’s guidelines, was the product of a complex calculus that weighed the benefits of early detection against potential harms like overdiagnosis, unnecessary tests, and invasive follow-up procedures. And coming from the U.S. Preventive Services Task Force, perhaps the most authoritative voice in preventive care and evidence-based medicine, the recommendation carried substantial weight. The task force’s 16 volunteer members, appointed by the secretary of the U.S. Department of Health and Human Services, draw on peer-reviewed evidence, as well as expert and public opinion, to arrive at health care policy recommendations. When the task force sets screening guidelines, leading medical associations, insurance companies, and primary care physicians often adopt those guidelines as their own. After the task force released its lung cancer screening criteria, professional organizations such as the American Academy of Family Physicians and the American College of Chest Physicians soon issued guidelines that were essentially identical. But there were reasons, even then, to believe that lung cancer screening guidelines based strictly on age and smoking history might disproportionately exclude Black patients from receiving potentially life-saving care.
It would not take long for that policy pitfall to come to light.

It is well documented that Black smokers tend to develop lung cancer younger and more often than White smokers, despite generally smoking less: A 2006 study published in the New England Journal of Medicine found that Black smokers with a habit of up to 20 cigarettes a day were around twice as likely as White smokers to develop lung cancer; a 2015 analysis found that Black Americans were diagnosed at younger ages for nearly every cancer type, including lung, prostate, breast, and brain cancer; and an analysis of data from the National Lung Screening Trial showed that African Americans were twice as likely as White Americans to die of lung cancer, and were generally younger at the time of diagnosis. Although the reasons for these disparities are unclear, the 2006 New England Journal of Medicine study suggested that they can’t be explained by racial differences in socioeconomic status alone. Whatever the cause of the lung cancer disparity, Black smokers’ heightened risks likely weren’t reflected in the U.S. Preventive Services Task Force’s 2013 screening recommendations. That’s because Black patients were vastly underrepresented in the National Lung Screening Trial, the study that formed the basis of the guidelines. Only 4 percent of the trial’s participants were Black, though Black people made up roughly 14 percent of the U.S. population around the time the study was conducted. If lung cancer screening guidelines were failing Black smokers, it may have been because the clinical trial system had failed them first. Experts say this is a problem that extends beyond lung cancer. Breast cancer mortality rates, in particular, are high among Black women, who are twice as likely as White women to die of the disease before age 50.
Yet for a decade, the task force recommended that women at average risk should start mammograms, the main screening tool for breast cancer, at 50, regardless of race, based largely on findings from clinical trials that lacked proportionate representation of Black women. (The task force recently updated its guidelines to recommend screening starting at 40 for women of all racial and ethnic backgrounds.) Similarly, Black men die of prostate cancer at twice the rate of White men, yet until recently, the task force advised against prostate cancer screening altogether. Overall, White people account for 60 percent of the U.S. population but more than 70 percent of clinical trial participants, according to a study published in the journal Therapeutic Innovation & Regulatory Science. This lopsidedness has dire consequences for public health, said Laurie Fenton Ambrose, president and CEO of the patient advocacy group GO2 for Lung Cancer. “We evolve clinical trials in a way that never captures real world implication.” The reasons for Black Americans’ underrepresentation in clinical trials are myriad and complicated. Experts point out that socioeconomic factors can render the basic logistics of trial participation — such as securing transportation and free time — difficult if not impossible for some people to manage. And Fenton Ambrose notes that around 80 percent of the general population is treated for diseases like cancer in hospitals in their local communities, not the major academic centers where most clinical trials for cancer are conducted. Those local facilities may not have the resources to offer clinical trial participation to patients, she says, and those patients might not be able to access the trials even if they knew they existed. The racial imbalances in clinical trials may be partially attributable to study design.
Many trials exclude patients with high body-mass indices or chronic illnesses like high blood pressure, kidney disease, and asthma — all of which are more common among Black Americans. “There’s just so many of these barriers,” Fenton Ambrose said. “I don’t think there’s an intention to set parameters that exclude [people of color], it’s that the end result isn’t a sufficiently represented population.” But there’s also evidence that Black patients’ poor representation in clinical trials stems partly from racism and implicit bias. A 2020 study published in the journal Cancer found that some clinical trial recruiters withheld opportunities from racial and ethnic minorities because they viewed them as “less promising.” One asserted that African Americans have less knowledge than patients of other backgrounds and require more time to understand study instructions. Another said they felt African Americans were less likely to comply with the protocols and rules required by a trial. “Sometimes you’ll hear researchers use the excuse that those are hard-to-reach populations,” said Carol Mangione, a primary care doctor and former chair of the U.S. Preventive Services Task Force, in a 2021 interview. “I think those are hardly reached.” The gap in clinical trial participation has led to a chicken-and-egg problem: While statistical data show that Black people are diagnosed with certain cancers, and die of them, at higher rates, there often is not sufficient clinical trial evidence to support race-specific screening recommendations. “We have to be much more purposeful about how we design studies so that it’s easy for people to participate,” Mangione said.
“We need the federal government to put out requests for clinical trials where you’re only allowed to compete to do that research if you can guarantee that you’re going to bring in a certain percentage of people of different groups.”

Five years after Melinda Aldrich arrived at Vanderbilt, she gained a collaborator who would be instrumental in helping her unearth the racial disparities stemming from the U.S. Preventive Services Task Force’s lung cancer screening guidelines. Kim Sandler joined the Vanderbilt faculty in 2016, fresh off a residency in radiology and fellowship in cardiothoracic imaging. She and Aldrich began working together, and among their first projects was to examine the relative impacts of the 2013 screening guidelines on patients of different racial backgrounds. That study called for a dataset that was much more representative of the Black population than the National Lung Screening Trial that had informed the 2013 guidelines. Aldrich and Sandler decided to use the Southern Community Cohort Study, funded by the National Cancer Institute, which follows approximately 85,000 people — mostly patients at community health centers throughout the southeast — recruited between 2002 and 2009. Of the roughly 48,000 participants in the study who reported a history of smoking, more than 32,000 were Black. Using statistical models, Aldrich and Sandler analyzed the data to see whether those patients were being screened at appropriate rates. The results were jaw-dropping. Their models suggested that only 17 percent of Black smokers met the task force’s age criteria and 30-pack-year smoking history requirement, compared with 31 percent of White smokers. Worse, during the 12-year period that they followed the cohort for the study, just 32 percent of Black smokers who were diagnosed with lung cancer met the screening eligibility criteria at the time of their diagnosis, compared with 56 percent of White smokers. The data was compelling, but the remedy was hardly clear-cut.
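The eligibility rules at issue are simple thresholds on age and pack-years, which a short sketch can make concrete. The records below are synthetic and purely illustrative — they are not cohort data, and the function is hypothetical — but the two rules are the task force's 2013 criteria (age 55 or older, 30 or more pack-years) and the looser criteria Aldrich and Sandler proposed (50 and 20):

```python
# Illustrative sketch only: synthetic smoker records, not real cohort data.
# A pack-year is one pack of cigarettes a day for one year.

def eligible(age, pack_years, min_age, min_pack_years):
    """Return True if a smoker meets both screening thresholds."""
    return age >= min_age and pack_years >= min_pack_years

# (age, pack-years) for a handful of hypothetical smokers
smokers = [(52, 25), (57, 35), (60, 15), (54, 40), (66, 22)]

old_rule = sum(eligible(a, p, 55, 30) for a, p in smokers)  # 2013 criteria
new_rule = sum(eligible(a, p, 50, 20) for a, p in smokers)  # proposed criteria

print(old_rule, new_rule)  # → 1 4
```

Lowering both thresholds can only expand the eligible pool; the question the researchers modeled was how differently that expansion lands on groups whose members, on average, develop lung cancer younger and after fewer pack-years.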
“I actually remember having that discussion in the library of the Oxford House,” Sandler recalls. “There was a group of us sitting around a table with all the data from tens of thousands of patients, but it was like, ‘how do we package this?’” she said. “It was that balance between what’s best for the patients, what’s best for the population moving forward in the screening space, and what is most easily implementable.” Ultimately, the researchers recommended lowering the smoking history criterion for African Americans to 20 pack-years, and the age requirement to 50 — steps their modeling indicated would result in more equitable screening outcomes. Aldrich and Sandler are not the only researchers to have proposed a race-conscious approach to cancer screening. A recent study published in JAMA Network Open considered racial differences in breast cancer rates and recommended that mammography screening start at age 42 for Black women, 51 for White women, 57 for American Indian or Alaska Native and Hispanic women, and 61 for Asian or Pacific Islander women. Last October in JAMA Oncology, a team led by Stanford researchers proposed that decisions to screen for lung cancer should be made using a risk-based model that takes race and ethnicity, among other factors, into account. But efforts to incorporate race into the cancer screening equation have faced pushback, in part on political and sociological grounds. Some critics worry that such medical prescriptions will only perpetuate the notion of race as biological instead of a social construct. Among them is Otis Brawley, a professor of oncology and epidemiology at Johns Hopkins University and former chief medical and scientific officer for the American Cancer Society.
For more than three decades, Brawley’s stated mission has been “to close racial, economic, and social disparities in cancer prevention, detection, and treatment of cancer in the United States and worldwide.” Yet he adamantly opposes making race a part of screening guidelines. For decades, Brawley says, medicine has attributed racial disparities in cancer outcomes to a myth that Black and White people — and the diseases that affect them — differ biologically. Quoting the Nobel Prize-winning scientist Harold Varmus, Brawley said that “if you want to divide the American population into more races than one, that’s fine, but in terms of biology, that’s like trying to slice soup.” Brawley also believes that focusing on race obscures the true driver of cancer disparities: quality of care. He notes that most Black cancer patients don’t have access to the MD Andersons and Memorial Sloan Ketterings of the world — hospitals where he says cancer outcomes are equal regardless of race. As he sees it, the examples set by those hospitals point to a very obvious solution to health disparities: “Equal treatment yields equal outcomes.” But what looks like disparity driven by biological differences is actually just a proxy for socioeconomic issues, he says, and Black patients often don’t have access to equitable treatment.

When the U.S. Preventive Services Task Force was reassessing its previous guidelines, it commissioned a modeling study that concluded that the suggested updates would confer more health benefits to more people compared with the existing guidelines. But that study’s modeling did not consider racial or ethnic disparities in lung cancer risk.
In a statement emailed to Undark, Michael Barry, current chair of the task force, wrote that while the group always looks for evidence that would facilitate “specific recommendations for different racial and ethnic populations, especially those disproportionately burdened by a given disease or condition, that data often does not exist.” Barry added that “when the Task Force considered whether it was possible to include race as a variable in the model it used for its lung cancer screening recommendation, it found that there was not sufficient underlying data. The evidence was simply not available to show how race — together with the other variables in the model — affects the way people’s health progresses over time due to both lung cancer and other medical conditions a person could develop.” That does not mean race won’t be included in future models: Barry acknowledged that research is constantly progressing and that new studies will be incorporated into future revisions of the screening guidelines. Barry added that, in terms of screening eligibility, dropping the pack-year threshold from 30 to 20 benefited the Black high-risk population more than the corresponding non-Hispanic White population. Indeed, a 2022 study in JNCI Cancer Spectrum found that while the new recommendations did not eliminate the racial gap in screening eligibility, they narrowed it. According to that analysis, the new guidelines reduced a 16-percentage-point gap between the proportions of Black and White smokers with lung cancer who were eligible for screening at the time of their diagnosis to around 11 percentage points.
(Under the new recommendations, Latino smokers, who had also been disproportionately excluded from lung cancer screening under the old guidelines, became the group most likely to be excluded from screening.) At the same time, some experts, like the authors of a 2023 JAMA Oncology study that found a persistent screening eligibility gap under the new guidelines, argued that the new recommendations didn’t make screening access equitable. “Based on racially and ethnically diverse population-based cohort data, the 2021 USPSTF guidelines for lung cancer screening still induce racial and ethnic disparities,” the study authors wrote. In a Stanford Medicine press release, Summer Han, the study’s lead author, noted: “Our study shows that these changes to the guidelines are not sufficient to address race-based differences in lung cancer incidence and age at diagnosis.”

As some public health experts see it, stringent screening eligibility guidelines aren’t the biggest obstacle to closing lung cancer disparities. Kathy Levy, a former project manager for GO2 for Lung Cancer’s Alabama Lung Cancer Awareness, Screening, and Education project, known as ALCASE, said it matters little whether screening guidelines incorporate race because even eligible people face significant barriers to getting screened. Levy spent three years working to get more underserved people of color screened, diagnosed, and treated for lung cancer throughout Alabama’s Black Belt, a rural region home to large African American populations, and she remembers it as a slow process to get started. There, she says, practically any cancer diagnosis comes with a social stigma. “We’re rural and a cancer diagnosis is so secret,” said Levy. “They’ve just gotten to where they would start saying the word ‘cancer.’ They would call it ‘the Big C’.” Some Black people also feared getting a diagnosis because they viewed lung cancer as an automatic death sentence, she said, and didn’t want to know if they had it.
“We had to do a lot of education, and a lot of hand holding, just showing the love and concern to these people that were out there in the neighborhood. You might have to talk with them six or seven times.” Nationally, just 5.8 percent of people who are eligible for lung cancer screening are actually screened; in some states, the number is only around 1 percent. According to the 2022 Lung Health Barometer from the American Lung Association, nearly 70 percent of people don’t even know that lung cancer screening exists. Black adults who are eligible for lung cancer screening are even less likely than people from other backgrounds to undergo the procedure, Levy said, due to distrust of the health care system and lack of access to care. According to Levy, the challenges don’t stop with the patients; many health care providers remain uninformed about proper lung cancer screening. She recalls sending patients for screening only to have their doctors pronounce them clear after listening to their lungs and ordering a chest X-ray, procedures that are known to be ill-equipped to detect early tumors. She later created a “Dear Doctor” letter that her patients could give to their physicians — a missive that explained the screening process and included a list of facilities that offered low-dose CT scans. The persistent frustrations notwithstanding, Levy believes her work has made a difference, particularly in her home county of Choctaw, where she is known as “the Cancer Lady” who pesters everybody about getting regular check-ups and screenings. Now, they call her asking if it’s time to get screened again, she said with a soft chuckle. Although Levy is doubtful that any screening policy change would fix the lung cancer woes that plague the Black Belt, she takes issue with the use of age restrictions on screening, a barrier that could disproportionately affect Black smokers. 
“I wish it was that we didn’t have to worry about age on the screening because we have people that are actually younger now, [that are] getting cancer at such an early age,” she said. “Why do we have to put stipulations about an age on screening for our health?”

The debate over race-conscious screening guidelines is taking place against a backdrop of rapid changes in the treatment and scientific understanding of lung cancer. Experts acknowledge that cancer screening is imperfect, although many emphasize that it is worth the associated risks. For instance, early detection can skew screening statistics even when it does nothing to extend a person’s life. One educational resource from the National Cancer Institute shows how a common metric, the five-year survival rate, can sometimes mislead: If a group of patients are diagnosed with cancer at 67 and then die of the disease at 70, they would effectively have a zero percent five-year survival rate. But if they are diagnosed early, at 60, and also die at 70, the group would have a 100 percent five-year survival rate — even though the early detection may not have helped them live any longer. The National Cancer Institute cites this so-called “lead-time bias” as one of the causes of confusion and misunderstanding about screening efficacy. “Doctors frequently talk about survival rates,” says Brawley, the Johns Hopkins epidemiologist. He says he frequently tells his graduate students that if doctors are just talking about survival outside the context of a randomized trial, “you have someone who doesn’t know what the hell they’re talking about.” Brawley points out that the interventions and follow-up tests that come after a positive screening result can themselves be risky.
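The lead-time arithmetic in the National Cancer Institute's example can be checked in a few lines. This is an illustrative sketch under the article's stated assumptions (everyone dies at 70; only the diagnosis age differs), and the function is hypothetical, not an NCI tool:

```python
# Lead-time bias: earlier diagnosis can inflate five-year survival
# statistics even when the age at death does not change at all.

def five_year_survival(age_at_diagnosis, age_at_death):
    """Fraction of a uniform patient group alive 5 years after diagnosis."""
    return 1.0 if age_at_death - age_at_diagnosis >= 5 else 0.0

late_diagnosis = five_year_survival(67, 70)   # diagnosed at 67, dies at 70
early_diagnosis = five_year_survival(60, 70)  # diagnosed at 60, dies at 70

# Survival "improves" from 0% to 100%, yet every patient still dies at 70.
print(late_diagnosis, early_diagnosis)  # → 0.0 1.0
```

This is why, as Brawley argues, survival rates quoted outside a randomized trial can be misleading: the metric moves simply because the diagnosis clock started earlier.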
He calculated that the follow-up procedures for the group in the National Lung Screening Trial that got a low-dose CT scan killed one person for every 5.4 people the new screening method saved. Modeling work used by the U.S. Preventive Services Task Force in determining the new guidelines predicted less dire — but nonetheless significant — screening harms, including roughly one radiation-related death occurring due to annual screening for every 13 cancer deaths averted. The same model predicted around two false positives per person over a lifetime of screening, with some of those false positives resulting in follow-up biopsies. One study found that the mortality rate for patients who elect to have a surgical lung biopsy in hospitals is just under 2 percent; for patients who undergo the procedure nonelectively, the rate is almost 10 times higher. These sobering statistics reinforce the need to target screening at people at highest risk in order to reduce the number of false positives and risky follow-up procedures. But the stigma of smoking leads many people to lie about or legitimately underestimate the extent of their smoking history, making smoking history too crude an indicator of risk for some. Jennifer King, GO2’s former chief scientific officer and now the chief science officer at the International Association for the Study of Lung Cancer, would like to see clinical trials and screening recommendations consider additional risk factors, including genetic mutations known to be associated with heightened cancer risk, family history, and prior history of cancer. As she puts it, “Smoking history is not going to find everybody.” When King says “everybody,” she may be referring to people like Diane Juitt, a 59-year-old resident of Columbia, South Carolina, who is among the 10 to 20 percent of American lung cancer patients who have never smoked or smoked fewer than 100 cigarettes in their lifetime.
When Juitt developed an unshakeable, bone-rattling cough in the summer of 2021, she never imagined she might have lung cancer. She attributed her symptoms to the residual effects of a Covid-19 infection she’d battled earlier in the year. “It kept getting worse,” she recalls. The first time she saw a doctor, he gave her something for the cough and “it went away, but then it just came back.” Juitt, who was preparing for retirement from her job with the Army National Guard, decided to undergo a complete physical, which included a chest X-ray that revealed a spot on her lung. A follow-up low-dose CT scan led to a biopsy. By the time doctors at the Hollings Cancer Center at the Medical University of South Carolina in Charleston performed surgery, her cancer was at stage 3: The tumor had not invaded adjacent organs, but cancerous cells had spread to nearby lymph nodes in the chest. Doctors removed part of her lung and about 30 lymph nodes. Juitt is Black, but genetics may have been a more critical risk factor for her lung cancer. Subsequent tests revealed that she had a genetic mutation that increased her risk for the disease. Her two daughters were screened for the same mutation based on her experience.

The U.S. Preventive Services Task Force does not recommend lung cancer screening for people who have never smoked. It says that current evidence does not support adding risk factors other than smoking to the screening guidelines. But cases like Juitt’s exert pressure for the screening guidelines to evolve. The American Cancer Society recently updated its guidelines to eliminate the number of years since quitting smoking as an eligibility factor. Many researchers favor expanding the eligibility pool for lung cancer screening and considering a broader array of risk factors in screening decisions. Aldrich would like to see the guidelines take into account not only smoking history and race, but also other demographic factors and genetic sequencing.
“Race is complicated,” she said. It’s difficult to disentangle race from social factors, diet, culture, and genetic variation across populations. Fenton Ambrose, of GO2 for Lung Cancer, suggested that certain classes of workers, such as first responders and military servicemembers, should also be prioritized for screening. Lung cancer incidence and mortality rates are much higher for military personnel than the general population, due to occupational exposures to an assortment of toxins. While Barry, the U.S. Preventive Services Task Force chair, agrees that it’s a good thing to look for additional risk factors and focus screening on people who would most benefit, he also warns that more complicated implementation procedures might push people away. “Our recommendation currently using just age and pack years is pretty straightforward,” he said. In trying to fine-tune those criteria, he added, “we want to be careful not to create an implementation barrier when we already know, even with the simple criteria, that lung cancer screening is underutilized.” Meanwhile, Aldrich’s collaborator, Sandler, believes that many screening eligibility guidelines are likely to change in the not-too-distant future. She points to ongoing studies that are evaluating the benefits of screening other high-risk populations. And she’s hopeful that new diagnostic tests will emerge to augment low-dose CT in the screening process. Researchers in the U.S. and Taiwan, for example, have developed a new AI tool that can predict lung cancer risk over the next six years from a single low-dose CT scan; other tests can identify minute genetic markers circulating in the blood that signal residual or recurrent cancer.
In terms of the future evolution of the task force screening guidelines, “my hope is that they will continue to address disparities in race and gender,” Sandler said, “and that we also continue to look at access.”

Melba Newsome is a freelance science, health, and environmental journalist in Charlotte, North Carolina. This reporting was supported in part by a grant from the National Institute for Health Care Management Foundation. This article was originally published on Undark.

“Astounding” Findings – Scientists Uncover Startling Origins of Neurodegenerative Diseases


Researchers established the world’s largest gene bank of ancient human DNA, analyzing remains from up to 34,000 years ago to trace the spread of genes and diseases. Key findings include the historical introduction of multiple sclerosis risk genes into Europe and the genetic origins of neurodegenerative diseases. This study, offering new insights into disease evolution and treatment, marks a significant advance in understanding the interplay between ancient genetics and modern health.

Researchers have created the world’s largest ancient human gene bank by analyzing the bones and teeth of almost 5,000 humans who lived across western Europe and Asia up to 34,000 years ago.

By sequencing ancient human DNA and comparing it to modern-day samples, the international team of experts mapped the historical spread of genes – and diseases – over time as populations migrated.

The ‘astounding’ results have been revealed in four trailblazing research papers recently published in the same issue of Nature and provide a new biological understanding of debilitating disorders.
The extraordinary study involved a large international team led by Professor Eske Willerslev at the Universities of Cambridge and Copenhagen, Professor Thomas Werge at the University of Copenhagen, and Professor Rasmus Nielsen at the University of California, Berkeley, and involved contributions from 175 researchers from around the globe.

The scientists found:

- The startling origins of neurodegenerative diseases including multiple sclerosis
- Why northern Europeans today are taller than people from southern Europe
- How major migration around 5,000 years ago introduced risk genes into the population in north-western Europe – leaving a legacy of higher rates of MS today
- Carrying the MS gene was an advantage at the time as it protected ancient farmers from catching infectious diseases from their sheep and cattle
- Genes known to increase the risk of diseases such as Alzheimer’s and type 2 diabetes were traced back to hunter-gatherers
- Future analysis is hoped to reveal more about the genetic markers of autism, ADHD, schizophrenia, bipolar disorder, and depression

Northern Europe has the highest prevalence of multiple sclerosis in the world.
A new study has found the genes that significantly increase a person’s risk of developing multiple sclerosis (MS) were introduced into north-western Europe around 5,000 years ago by sheep and cattle herders migrating from the east.

Genetic History and Disease

By analyzing the DNA of ancient human bones and teeth, found at documented locations across Eurasia, researchers traced the geographical spread of MS from its origins on the Pontic Steppe (a region spanning parts of what are now Ukraine, South-West Russia, and the West Kazakhstan Region).

They found that the genetic variants associated with a risk of developing MS ‘traveled’ with the Yamnaya people — livestock herders who migrated over the Pontic Steppe into North-Western Europe. These genetic variants provided a survival advantage to the Yamnaya people, most likely by protecting them from catching infections from their sheep and cattle. But they also increased the risk of developing MS.

“It must have been a distinct advantage for the Yamnaya people to carry the MS risk genes, even after arriving in Europe, despite the fact that these genes undeniably increased their risk of developing MS,” said Professor Eske Willerslev, jointly at the Universities of Cambridge and Copenhagen and a Fellow of St John’s College, an expert in analysis of ancient DNA and Director of the project. He added: “These results change our view of the causes of multiple sclerosis and have implications for the way it is treated.”

The age of specimens ranges from the Mesolithic and Neolithic through the Bronze Age, Iron Age, and Viking period into the Middle Ages.
The oldest genome in the data set is from an individual who lived approximately 34,000 years ago.

The findings provide an explanation for the ‘North-South Gradient’, in which there are around twice as many modern-day cases of MS in northern Europe as in southern Europe, a pattern that has long been a mystery to researchers. From a genetic perspective, the Yamnaya people are thought to be the ancestors of the present-day inhabitants of much of North-Western Europe. Their genetic influence on today’s population of southern Europe is much weaker.

Previous studies have identified 233 genetic variants that increase the risk of developing MS. These variants, also affected by environmental and lifestyle factors, increase disease risk by around 30 percent. The new research found that this modern-day genetic risk profile for MS is also present in bones and teeth that are thousands of years old.

“These results astounded us all. They provide a huge leap forward in our understanding of the evolution of MS and other autoimmune diseases. Showing how the lifestyles of our ancestors impacted modern disease risk just highlights how much we are the recipients of ancient immune systems in a modern world,” said Dr William Barrie, a postdoc in the University of Cambridge’s Department of Zoology and co-author of the paper.

Multiple sclerosis is a neurodegenerative disease in which the body’s immune system mistakenly attacks the ‘insulation’ surrounding the nerve fibers of the brain and spinal cord.
This causes symptom flares known as relapses, as well as longer-term degeneration, known as progression.

Professor Lars Fugger, a co-author of the MS study and a professor and consultant physician at the John Radcliffe Hospital, University of Oxford, said: “This means we can now understand and seek to treat MS for what it actually is: the result of a genetic adaptation to certain environmental conditions that occurred back in our prehistory.”

Professor Astrid Iversen, another co-author based at the University of Oxford, said: “We now lead very different lives to those of our ancestors in terms of hygiene, diet, and medical treatment options, and this, combined with our evolutionary history, means we may be more susceptible to certain diseases than our ancestors were, including autoimmune diseases such as MS.”

The Lundbeck Foundation GeoGenetics Centre – the resource underpinning the discoveries

The new findings were made possible by the analysis of data held in a unique gene bank of ancient DNA, created by the researchers over the past five years with funding from the Lundbeck Foundation.

This is the first gene bank of its kind in the world, and it has already enabled fascinating new insights in areas ranging from ancient human migrations to genetically determined risk profiles for the development of brain disorders.

By analyzing the bones and teeth of almost 5,000 ancient humans, held in museum collections across Europe and Western Asia, the researchers generated DNA profiles ranging across the Mesolithic and Neolithic through the Bronze Age, Iron Age, and Viking period into the Middle Ages.
They compared the ancient DNA data to modern DNA from 400,000 people living in Britain, held in the UK Biobank.

“Creating a gene bank of ancient DNA from Eurasia’s past human inhabitants was a colossal project, involving collaboration with museums across the region,” said Willerslev.

He added: “We’ve demonstrated that our gene bank works as a precision tool that can give us new insights into human diseases when combined with analyses of present-day human DNA data and inputs from several other research fields. That in itself is amazing, and there’s no doubt it has many applications beyond MS research.”

The team now plans to investigate other neurological conditions, including Parkinson’s and Alzheimer’s diseases, and psychiatric disorders, including ADHD and schizophrenia.

They have received requests from disease researchers across the world for access to the ancient DNA profiles, and they eventually aim to make the gene bank open access.

The research was funded by an €8M grant from the Lundbeck Foundation and conducted at the Lundbeck Foundation GeoGenetics Centre at the University of Copenhagen.

Jan Egebjerg, Director of Research at the Lundbeck Foundation, said: “The rationale for awarding such a large research grant to this project, as the Lundbeck Foundation did back in 2018, was that if it all worked out, it would represent a trail-blazing means of gaining a deeper understanding of how the genetic architecture underlying brain disorders evolved over time. And brain disorders are our specific focus area.”

Physical traits and disease risks of modern-day Europeans

“It’s striking that the lifestyles of the people in the Eurasian region over the last 10,000 years have resulted in a genetic legacy that impacts their present-day descendants, in terms of both their physical appearance and their risk of developing a number of diseases,” said Dr.
Evan Irving-Pease of the Globe Institute, University of Copenhagen, first author of a separate Nature paper (“The Selection Landscape and Genetic Legacy of Ancient Eurasians”).

By comparing DNA from 1,664 archaeological skeletons of the prehistoric inhabitants of Eurasia — ranging in age from the Mesolithic (Middle Stone Age) to around 1,000 BC — to over 400,000 DNA profiles from present-day Europeans, the team has gained a host of new insights into our genetic histories, including:

Body height: North-western Europeans today are typically taller than southern Europeans; a genetic predisposition to being tall is likely to have come from the Yamnaya.

Disease risk: Influenced by how much DNA a person has from the ancient populations that migrated across Eurasia after the last Ice Age. For example, southern Europeans typically have a lot of ancient farmer DNA and are genetically predisposed to developing bipolar disorder. North-western Europeans carry more genetic risk for multiple sclerosis, while eastern Europeans have an increased genetic risk of developing Alzheimer’s and type 2 diabetes.

Lactose tolerance: DNA analysis of the prehistoric inhabitants of Eurasia has revealed that lactose tolerance – the ability to digest the sugar in milk and other dairy products – emerged in Europe around 6,000 years ago.

Vegetable tolerance: The ability to better survive on a vegetable-rich diet was written into Europeans’ genes by the dawn of the Neolithic Age, around 5,900 years ago.

Past human gene pools of western Eurasia

In a third Nature paper (“Population Genomics of Postglacial Western Eurasia”), the researchers — whose first authors include Morten Allentoft, professor at Curtin University, Australia, and the Lundbeck Foundation GeoGenetics Centre at UCPH; Martin Sikora, associate professor at the Lundbeck Foundation GeoGenetics Centre at UCPH; and Kristian Kristiansen, professor of archaeology at the University of Gothenburg, Sweden — show that genetic differences between ancient
populations in western Eurasia were substantially higher than previously estimated, and also much higher than observed in present-day populations.

Origins of modern-day Danes

In a fourth Nature paper (“100 Ancient Genomes Show Two Rapid Population Turnovers in Stone Age Denmark”), the team reports findings that overturn the commonly held view that the ancestors of present-day Danes were Stone Age hunter-gatherers.

The team, which includes Prof. Kristian Kristiansen and Dr. Anders Fischer, both of whom are affiliated with the Lundbeck Foundation GeoGenetics Centre at the University of Copenhagen (UCPH), analyzed DNA from 100 skeletons of the prehistoric inhabitants of the region now known as Denmark, who lived between 10,000 and 2,700 years ago.

They found that since the last Ice Age, around 12,000 years ago, Denmark has experienced two near-total population turnover events, the second of which is still evident in the gene pool of present-day Scandinavia.

Around 5,900 years ago, at the dawn of the Neolithic Age, a group of farmers with genetic roots in Anatolia – the Asian part of present-day Turkey – brought a new farming culture to Denmark. The resulting changes in diet were clear in the analysis of the ancient bones, and show that these farmers completely replaced the hunter-gatherers living in the region.

Then, around 5,000 years ago, the Yamnaya arrived and eliminated the Anatolian farmers. The Yamnaya people are the closest ancestors of present-day ethnic Danes.

Reference: “Elevated genetic risk for multiple sclerosis emerged in steppe pastoralist populations” by William Barrie, Yaoling Yang, Evan K. Irving-Pease, Kathrine E. Attfield, Gabriele Scorrano, Lise Torp Jensen, Angelos P. Armen, Evangelos Antonios Dimopoulos, Aaron Stern, Alba Refoyo-Martinez, Alice Pearson, Abigail Ramsøe, Charleen Gaunitz, Fabrice Demeter, Marie Louise S.
Jørkov, Stig Bermann Møller, Bente Springborg, Lutz Klassen, Inger Marie Hyldgård, Niels Wickmann, Lasse Vinner, Thorfinn Sand Korneliussen, Morten E. Allentoft, Martin Sikora, Kristian Kristiansen, Santiago Rodriguez, Rasmus Nielsen, Astrid K. N. Iversen, Daniel J. Lawson, Lars Fugger and Eske Willerslev, 10 January 2024, Nature. DOI: 10.1038/s41586-023-06618-z

“The selection landscape and genetic legacy of ancient Eurasians” by Evan K. Irving-Pease, Alba Refoyo-Martínez, William Barrie, Andrés Ingason, Alice Pearson, Anders Fischer, Karl-Göran Sjögren, Alma S. Halgren, Ruairidh Macleod, Fabrice Demeter, Rasmus A. Henriksen, Tharsika Vimala, Hugh McColl, Andrew H. Vaughn, Leo Speidel, Aaron J. Stern, Gabriele Scorrano, Abigail Ramsøe, Andrew J. Schork, Anders Rosengren, Lei Zhao, Kristian Kristiansen, Astrid K. N. Iversen, Lars Fugger, Peter H. Sudmant, Daniel J. Lawson, Richard Durbin, Thorfinn Korneliussen, Thomas Werge, Morten E. Allentoft, Martin Sikora, Rasmus Nielsen, Fernando Racimo and Eske Willerslev, 10 January 2024, Nature. DOI: 10.1038/s41586-023-06705-1

“Population genomics of post-glacial western Eurasia” by Morten E. Allentoft, Martin Sikora, Alba Refoyo-Martínez, Evan K. Irving-Pease, Anders Fischer, William Barrie, Andrés Ingason, Jesper Stenderup, Karl-Göran Sjögren, Alice Pearson, Bárbara Sousa da Mota, Bettina Schulz Paulsson, Alma Halgren, Ruairidh Macleod, Marie Louise Schjellerup Jørkov, Fabrice Demeter, Lasse Sørensen, Poul Otto Nielsen, Rasmus A. Henriksen, Tharsika Vimala, Hugh McColl, Ashot Margaryan, Melissa Ilardo, Andrew Vaughn, Morten Fischer Mortensen, Anne Birgitte Nielsen, Mikkel Ulfeldt Hede, Niels Nørkjær Johannsen, Peter Rasmussen, Lasse Vinner, Gabriel Renaud, Aaron Stern, Theis Zetner Trolle Jensen, Gabriele Scorrano, Hannes Schroeder, Per Lysdahl, Abigail Daisy Ramsøe, Andrei Skorobogatov, Andrew Joseph Schork, Anders Rosengren, Anthony Ruter, Alan Outram, Aleksey A.
Timoshenko, Alexandra Buzhilova, Alfredo Coppa, Alisa Zubova, Ana Maria Silva, Anders J. Hansen, Andrey Gromov, Andrey Logvin, Anne Birgitte Gotfredsen, Bjarne Henning Nielsen, Borja González-Rabanal, Carles Lalueza-Fox, Catriona J. McKenzie, Charleen Gaunitz, Concepción Blasco, Corina Liesau, Cristina Martinez-Labarga, Dmitri V. Pozdnyakov, David Cuenca-Solana, David O. Lordkipanidze, Dmitri En’shin, Domingo C. Salazar-García, T. Douglas Price, Dušan Borić, Elena Kostyleva, Elizaveta V. Veselovskaya, Emma R. Usmanova, Enrico Cappellini, Erik Brinch Petersen, Esben Kannegaard, Francesca Radina, Fulya Eylem Yediay, Henri Duday, Igor Gutiérrez-Zugasti, Ilya Merts, Inna Potekhina, Irina Shevnina, Isin Altinkaya, Jean Guilaine, Jesper Hansen, Joan Emili Aura Tortosa, João Zilhão, Jorge Vega, Kristoffer Buck Pedersen, Krzysztof Tunia, Lei Zhao, Liudmila N. Mylnikova, Lars Larsson, Laure Metz, Levon Yepiskoposyan, Lisbeth Pedersen, Lucia Sarti, Ludovic Orlando, Ludovic Slimak, Lutz Klassen, Malou Blank, Manuel González-Morales, Mara Silvestrini, Maria Vretemark, Marina S. Nesterova, Marina Rykun, Mario Federico Rolfo, Marzena Szmyt, Marcin Przybyła, Mauro Calattini, Mikhail Sablin, Miluše Dobisíková, Morten Meldgaard, Morten Johansen, Natalia Berezina, Nick Card, Nikolai A. Saveliev, Olga Poshekhonova, Olga Rickards, Olga V. Lozovskaya, Olivér Gábor, Otto Christian Uldum, Paola Aurino, Pavel Kosintsev, Patrice Courtaud, Patricia Ríos, Peder Mortensen, Per Lotz, Per Persson, Pernille Bangsgaard, Peter de Barros Damgaard, Peter Vang Petersen, Pilar Prieto Martinez, Piotr Włodarczak, Roman V. Smolyaninov, Rikke Maring, Roberto Menduiña, Ruben Badalyan, Rune Iversen, Ruslan Turin, Sergey Vasilyev, Sidsel Wåhlin, Svetlana Borutskaya, Svetlana Skochina, Søren Anker Sørensen, Søren H. Andersen, Thomas Jørgensen, Yuri B. Serikov, Vyacheslav I. Molodin, Vaclav Smrcka, Victor Merts, Vivek Appadurai, Vyacheslav Moiseyev, Yvonne Magnusson, Kurt H. Kjær, Niels Lynnerup, Daniel J. 
Lawson, Peter H. Sudmant, Simon Rasmussen, Thorfinn Sand Korneliussen, Richard Durbin, Rasmus Nielsen, Olivier Delaneau, Thomas Werge, Fernando Racimo, Kristian Kristiansen and Eske Willerslev, 10 January 2024, Nature. DOI: 10.1038/s41586-023-06865-0

“100 ancient genomes show repeated population turnovers in Neolithic Denmark” by Morten E. Allentoft, Martin Sikora, Anders Fischer, Karl-Göran Sjögren, Andrés Ingason, Ruairidh Macleod, Anders Rosengren, Bettina Schulz Paulsson, Marie Louise Schjellerup Jørkov, Maria Novosolov, Jesper Stenderup, T. Douglas Price, Morten Fischer Mortensen, Anne Birgitte Nielsen, Mikkel Ulfeldt Hede, Lasse Sørensen, Poul Otto Nielsen, Peter Rasmussen, Theis Zetner Trolle Jensen, Alba Refoyo-Martínez, Evan K. Irving-Pease, William Barrie, Alice Pearson, Bárbara Sousa da Mota, Fabrice Demeter, Rasmus A. Henriksen, Tharsika Vimala, Hugh McColl, Andrew Vaughn, Lasse Vinner, Gabriel Renaud, Aaron Stern, Niels Nørkjær Johannsen, Abigail Daisy Ramsøe, Andrew Joseph Schork, Anthony Ruter, Anne Birgitte Gotfredsen, Bjarne Henning Nielsen, Erik Brinch Petersen, Esben Kannegaard, Jesper Hansen, Kristoffer Buck Pedersen, Lisbeth Pedersen, Lutz Klassen, Morten Meldgaard, Morten Johansen, Otto Christian Uldum, Per Lotz, Per Lysdahl, Pernille Bangsgaard, Peter Vang Petersen, Rikke Maring, Rune Iversen, Sidsel Wåhlin, Søren Anker Sørensen, Søren H. Andersen, Thomas Jørgensen, Niels Lynnerup, Daniel J. Lawson, Simon Rasmussen, Thorfinn Sand Korneliussen, Kurt H. Kjær, Richard Durbin, Rasmus Nielsen, Olivier Delaneau, Thomas Werge, Kristian Kristiansen and Eske Willerslev, 10 January 2024, Nature. DOI: 10.1038/s41586-023-06862-3

Ancient Jewelry Shows Ice Age Europe Had 9 Distinct Cultures

Prehistoric artifacts used in jewelry, such as beads made from shells, amber and ivory, have shed light on the cultural groups that were present in Europe tens of thousands of years ago

Bling isn’t a modern invention; humans have been wearing what anthropologists call personal ornamentation for tens of thousands of years. And the distinct ways prehistoric people adorned themselves can illuminate long-vanished cultures. A new study has used more than 100 types of beads, made of shells, ivory and other materials, to determine that there were at least nine distinct cultural groups living in the then-frozen landscapes of Europe between 34,000 and 24,000 years ago.

These cultures were so distinct that you would have been able to tell them apart just by the embellishments on the bodies of their members, even if those people had similar genetics. In fact, in some cases the new study pointed to culture being the stronger factor. “We've shown that you can have two [distinct] genetic groups of people who actually share a culture,” says the study’s lead author Jack Baker, a doctoral candidate in prehistory at the University of Bordeaux in France.

The study, published on Monday in Nature Human Behaviour, analyzed 134 types of beads from 112 sites across Europe—from Paviland in Wales to Kostenki in Russia—that dated back to a prehistoric ice age between 34,000 and 24,000 years ago. A few of the trinkets were found in burials, but most were from ancient dwelling sites. These personal ornaments were surprisingly diverse: ivory fashioned into owl-like shapes, beads carved to look like human breasts, amber pendants, shells with holes in them and a wide variety of animal teeth. Using these and other types of adornments, the researchers identified nine distinct cultural groups of hunter-gatherers that were present during this period.

“In the East, for example, they were very, very much more focused on ivory, on teeth, on stone,” Baker says.
But on the other side of the Alps, people would have adorned themselves with “really relatively flamboyant colors: reds, pinks, blues, really vibrant colors.” If you were to see one person from each group, he adds, “you could say, ‘He's from the East, and he's from the West,’” at a quick glance.

But one of the study’s main findings was that distance and isolation accounted for only a surprisingly small part of the differences between the ornaments that the various groups wore in necklaces, bracelets and other trinkets, Baker says. This suggests other factors were at play, possibly including the availability of materials, cultural sharing among groups and an individual’s social status.

The study found that differences were more pronounced at burial sites than at places where people lived. “Cultural differences crystallize better around things like funerary rites,” Baker says, adding that this highlights the importance of taking site usage into account when investigating ancient human behavior.

Marjolein Bosch, a paleolithic zooarchaeologist at the Austrian Archaeological Institute, who wasn’t involved in the new study, says it “clearly highlights differences in the range of ornamental diversity between these two archaeological contexts and points toward different narratives in cultural expression in the realms of life and death.”

The finding of nine distinct cultures broadly matched paleogenomic data that identified various groups present in Europe during that time—but there were exceptions. Based on the artifacts, the researchers also identified one apparently distinct culture for which there are currently no genetic data. “This study has shown really nicely that genetics does not equal culture,” Baker says.

The new study “tells us that there is a right and a wrong way to study and report about identity in the past....
One of the dangerous problems with ancient genomics is that genes aren’t proxies for group or individual identity; our identities are shaped by our cultural milieu,” says Sheela Athreya, a professor in the department of anthropology at Texas A&M University, who also was not involved in the new research. Building both individual and group identity is an “enormously complex human process.”

For Baker, the research also highlights that—even during an ice age, when environmental conditions were “horrendous”—“we still flourish, and we still create things that are beautiful to adorn ourselves.”
