
Our Food System Could Have Been So Different

News Feed
Saturday, October 1, 2022

The old, epic story of agriculture in North America had two heroes, long sung and much venerated. One was human ingenuity. The other was corn.

That story went something like this. On this continent, agriculture—and therefore civilization—was born in Mesoamerica, where corn happened to be abundant. The more advanced people there began cultivating this knobbly little plant and passed their knowledge north, to people in more temperate climes. When Europeans arrived, corn ruled the fields, a staple crop, just like wheat across the ocean. If the Middle East’s Fertile Crescent was agriculture’s origin point for Europe, Mexico was agriculture’s origin point here. This very human innovation had unspooled in the same rare way in these two places. Superior men tamed nature and taught other superior men to follow.

Part of this story is true. The first ear of corn—although calling it corn might be a stretch—likely grew somewhere in the highlands of Central Mexico, as far back as 10,000 or so years ago. The oldest known bits of recognizable corn, a set of four cobs each smaller than a pinky finger, are some thousands of years younger than that. They were uncovered in Oaxaca, in 1966, and that site, cuna del maíz, the “cradle of corn,” is in concept a landmark of human advancement on Earth. In appearance, like many archaeological sites, it is unimpressive, a cave so shallow that even the designation “cave” is questionable. But sometimes a whole history is preserved by chance on a dry cave floor. Sometimes a handful of seeds can help confirm a theory about the dawn of agriculture, or help unravel it.

Humans have been living in the valley of Oaxaca for ages; now the main road passes a boomlet of mezcalerías, flat fields of corn, and an antique cliffside etching of a cactus.
Some nearby caves, too, have traces of ancient wall paintings—a jaguar, two stick figures, and la paloma, “the dove.” When, starting in 1964, the archaeologist Kent Flannery came to this valley looking for a place to dig, he examined more than 60 of these caves, tested 10 or so, and eventually focused his work on just two. And in one of those, he found some notably old corn cobs. Today, that cave is contained in a biological preserve where council members of the nearest town patrol the grounds and, from time to time, guide visitors up the ridge. Mostly they show off the ancient paintings, in vaulted caves with views that stretch for miles.

The corn cave, which is no taller or roomier than a modest corner office, likely served as a storeroom or shelter for nomadic peoples, who left behind bones and plant detritus as far back as 10,000 years ago. Amid the remains of deer, rabbit, mud turtle, mesquite, pine nuts, squash, and prickly pear, Flannery and his crew found those four scant specimens of corn. These days, the cobs are usually stored in Mexico City’s fabulous Museo Nacional de Antropología, but the winter I visited they happened to be on display in Oaxaca’s cultural museum. They, too, are not much to look at—skinny nubbins of plant, black and cragged with empty spaces where kernels once grew. Really, they’re hardly corn. And that gap, the distance between these hardly-corns and the flush, fleshy ears that sustain nations, is where the old story of agriculture’s origins starts to break down.

The development of agriculture, the Marxist archaeologist V. Gordon Childe declared in 1935, was an event akin to the Industrial Revolution—a discovery so disruptive that it spread like the shocks of an earthquake, transforming everything in its path.
Childe’s work on what he termed “the Neolithic Revolution” focused on just one site of innovation in the Near East, the famous Fertile Crescent, but over time archaeologists posited similar epicenters in the Yangtze River valley of East Asia and in Mesoamerica. From that third point of origin, corn is supposed to have converted naive, nomadic hunter-gatherers into rooted, enlightened farmers throughout the continent, all the way up into the northern plains.

This long-held narrative now seems to be incomplete, at best. After all, corn took its sweet time fomenting that revolution—thousands of years to transform from scraggly specimens like the ones found in Oaxaca to full-on corn, thousands more to migrate up from Mesoamerica, and still more to adapt to the growing season at higher latitudes. In the rolling fields of the Midwest, the breadbasket of the United States, maize-based agriculture took over only with Mississippian culture, which began just one short millennium ago.

Over the past few decades, a small group of archaeologists have turned up evidence that supports a different timeline, which begins much, much earlier. Plant domestication in North America has no single center, they have discovered. In the land that’s now the U.S., domestication was not an import from farther south; it emerged all on its own. Before Mexico’s corn ever reached this far north, Indigenous people had already domesticated squash, sunflowers, and a suite of plants now known, dismissively, as knotweed, sumpweed, little barley, maygrass, and pitseed goosefoot. Together, these spindly grasses formed a food system unique to the American landscape. They are North America’s lost crops.

The lost crops tell a new story of the origins of cultivation, one that echoes discoveries all around the world. Archaeologists have now identified a dozen or more places where cultivation began independently, including Central America, Western and Eastern Africa, South India, and New Guinea.
Even in the Fertile Crescent, the old story of a single agricultural revolution does not hold. People there domesticated more than one kind of wheat, and they did it multiple times, in disparate places. The agricultural revolution was both global and fragmented, less an earthquake than an evolutionary shift. If correct, this new reading would debunk what is effectively a “Great Yeoman Theory of History.” No isolated bolts of human inspiration caused a wholesale shift in how humans live and eat; instead, one of civilization’s most important turns would be better understood as the natural outcome, more or less, of biology and botany, a marvel that could (and did) occur almost everywhere that people lived. The global food system that we have now is based on just a tiny fraction of all the plants on Earth. But other paths were always open.

It used to be that few people believed in America’s lost crops. The evidence was too limited, their seeds too small. Think of how tiny quinoa seeds are; pitseed goosefoot is closely related, but its seeds are even smaller—too small to register with Americans as food. A prominent lost-crops scholar, Gayle Fritz, once called this the “real men don’t eat pigweed” problem. At an archaeological symposium in the 1980s, a giant in the field dismissed these plants as little more than food for birds: Fritz recalls him saying something like, “All of the crops that have been recovered from the entire Eastern United States would not feed a canary for a week.”

The evidence that he was wrong has been sitting in archaeological archives for decades. Back in the ’30s, just as the idea of the Neolithic Revolution was taking hold, an archaeologist named Volney Jones was studying seeds found in a rock shelter in eastern Kentucky, similar to Flannery’s cave in Oaxaca. The Kentucky cave was littered with the remains of corn, gourds, and squash, along with the ancient seeds of sumpweed and goosefoot—“local prairie plants,” Jones called them.
These plants did register as food to people back then: Some of their seeds were found preserved in human fecal matter. And the seeds were unusually large for plants of the kind, a sign of domestication.

Determining the age of archaeological specimens is an inexact art, and before radiocarbon dating was invented, in the ’40s, it was still less exact. Jones couldn’t say for sure how old the prairie seeds were, but if they were older than the corn and squash, he wrote, “we could hardly escape the startling conclusion that agriculture had a separate origin in the bluff shelter area.” He passed over this idea quickly, perhaps because it seemed so impossible. Even in American archaeology, a relatively quiet corner of human prehistory, a Kentucky cliff was considered a nothing place, where nothing important could have happened. If agriculture had a separate origin here, Western narratives of global human development would have to be rewritten.

“The Ozarks were supposed to be a backwater,” Fritz, who is a paleoethnobotanist and professor emerita at Washington University in St. Louis, told me. “We called it the ‘hillbilly hypothesis of Ozark nondevelopment.’ You know, they were probably mostly hunter-gatherers, throwbacks to the Archaic.” Deep into the first millennium A.D., these people were supposed to have been stuck in subsistence-level living. “Well, it turns out that’s just not true,” Fritz said.

Early in her career, Fritz came across a collection of ancient seeds from the Ozarks, beautiful specimens, many of which were unusually large and some of which had never been examined closely for subtle signs of domestication. Domesticated seeds develop traits that make them more appealing to humans: They are larger than wild ones, offering more nutrition, and sometimes their seed coats are thinner, granting easier access to the succulent bits.
When Fritz examined the Ozarks goosefoot seeds, which had been excavated from yet another unassuming cave, she found that by the standards of wild seeds, their seed coats were notably thin. She spent some of her scant funding on accelerator-mass-spectrometry analysis, a new type of radiocarbon dating, to show that the seeds were older than anyone had imagined. “We thought the Ozark rock-shelter assemblages didn’t have much in the way of time depth, maybe 1,000 to 500 years,” she told me. “My dates went back 3,000 years.”

This was in the ’80s. “That was what the game was at that time,” Bruce D. Smith, an archaeologist who dedicated much of his career to plant domestication, told me. “You wanted to get a date and demonstrate the specimen was different from all the wild specimens of the same species.” Smith is now retired (he lives in New Mexico and writes mystery novels), but for decades he was a curator at the Smithsonian Institution’s National Museum of Natural History, in Washington, D.C. He began to look at seed collections held at the museum and found the same results: People in eastern North America had cultivated prairie plants as food. His and Fritz’s analyses, along with similar work from a small group of like-minded scholars, made a convincing archaeological case: People had grown these spindly grasses deliberately, saved their seeds, and then eaten them. Sumpweed, little barley, and goosefoot, these birdseed plants that couldn’t possibly be of interest to humans—they weren’t wild things anymore, but crops.

The seeds Smith studied are still in the collection at the National Museum of Natural History; Logan Kistler, who’s now the museum’s curator of archaeobotany and archaeogenomics, showed them to me. Many are kept these days in one-dram vials, each containing 100 seeds, but Smith originally found 50,000 seeds stored in a single cigar box in the museum’s attic.
Under a microscope, a domesticated goosefoot seed looks like a golden disc; some of the seeds in the Smithsonian’s collection are early enough in the process of domestication that they still resemble lumps of coal, black and uneven. It is not entirely clear what about them would have attracted human attention, or led someone to taste one.

Go back far enough, and this is true of so many plants we now eat: Their ancestors were unpalatable, possibly inedible, or even toxic to the human body. Corn itself is descended from a grass called teosinte, the obvious appeal of which is so limited that some researchers once hypothesized that ancient humans were first drawn to the plant for its stalk, as a base for an alcoholic brew. Smith had a theory to explain the draw of the lost crops, though: They were easily available. Ancient people would have encountered them in the flood plains of the Missouri and Mississippi River basins, where water would have cleared ground as a farmer tills a field, creating bountiful spreads of plant-based food.

Or perhaps, as a pair of younger paleoethnobotanists have proposed, it was not only the landscape, but animals—large animals—that led people to these plants. Robert Spengler, who studied with Fritz and now directs the paleoethnobotany labs at the Max Planck Institute for the Science of Human History, thinks that all over the world, people have been attracted to plants that evolved to appeal to grazing animals. In the Mississippi basin, those animals would have been bison. When Spengler first told Natalie Mueller, once his grad-school colleague, now a professor at their alma mater, Washington University in St. Louis, that he thought bison could have led people to the lost crops, she was skeptical. “I was like, ‘Rob, what the hell are you talking about?’” she told me.

But she started to find hints that he might be onto something.
Most of the lost crops are rarities these days: Throughout her career, Mueller had painstakingly sought them out on the disturbed land at the edge of human development—the strip between a farmed field and the road, or by a path leading to an old mine. Bison, too, are scarce, but where they have been reintroduced to the prairie, she has had little trouble finding the lost crops. They were growing in the places the animals had cleared.

In other words, before anyone thought to save sumpweed seeds, or plant little barley, perhaps those plants, having come to depend on bison for their survival, were changing to fit the tastes of humans who wandered along the bisons’ trails, gathering food from the stands of grass growing there. In 2019, Mueller started visiting a prairie preserve in Oklahoma more regularly, to see what she might find, and she invited me along. Once I saw the prairie, she told me, I would see what she meant—that the bison and these plants, thriving together, make their own case. Being there had made her imagine the past anew, and it could do the same for anyone willing to carefully consider how a few overlooked plants now behaved in a landscape that more closely resembled the one where humans would have first met them.

Illustration by Kirsten Stolle

The early morning fog erased the rolling hills of the Joseph H. Williams Tallgrass Prairie Preserve. It erased most of the road ahead, and any sign of the bison—“our big boys,” as Mueller and Ashley Glenn, her friend and go-to botanist, liked to call them. It muted the sun into a smear of yellow; it washed color from the grass, graying the prairie into a dense muddle that hid birds, spiders, and the coyote (or was it a wolf?) that called somewhere in the near distance. But even on a clear morning, I could not have picked out the plant we were seeking—sumpweed, or Iva, as Mueller called it, from its scientific name, Iva annua.
Perhaps it should have stuck out: Fall had purpled its leaves and seeds, and it grew tall enough. But mixed among the other grasses, the plant was easy to miss.

Every time Mueller saw it, she perked up. The first specimen we found was puny, but its fruit was chonky—“really big,” she noted with satisfaction—and as we drove through the preserve, she pointed out the Iva lining the road to me and Fritz, who had come on the trip as well: “Oh, there’s Iva … It’s all Iva over here … Look at this stand; it’s a beautiful one.” At one point, she stopped the car suddenly by the roadside, having spotted, she thought, a sunflower (domesticated, too, on this continent, around the same time as Iva), the first she had seen on the preserve, growing right next to Iva, a coincidence that was going to make her head explode, she was saying, when Glenn, who had wandered deeper afield, cupped her hands around her mouth and yelled—“Iva!”

She was standing in a pool of purple that in the late-day light stood out like a bruise against the fading green of the prairie. Even I could pick it out, easily. So much bushy sumpweed surrounded her that she could have stayed in that one spot and harvested for hours.

And to Mueller, that made perfect sense. “Usually the bison are all over this spot,” she told me.

Like humans, bison are landscapers, and their influence on their environs could have been what led people to the lost crops to begin with. Out on the prairie, where the grass and sky swallowed our gangly bipedal figures, the bison were scaled to fit. From a distance, their dark, curved backs dotted hillsides. One morning we found a herd of them gathered near the fence.
Spread out in a column 100-some strong, they began to run, harrumphing through the grass, hurtling up and down the dips and ditches beside the road, muscling forward half tons of flesh and clearing paths through the tall grass.

Many of the bison traces we walked were just about wide enough for a single person, and it’s easy to imagine that people traveling the prairies millennia ago would have chosen to follow these paths. Without the bison, the tall grasses grow so thick together that moving anywhere requires tramping down thickets of ornery stalks almost guaranteed to be hiding snakes or other dangers. Whenever we left the road, we sought out these bison traces.

Just like a flood on the banks of a river, bison create the fresh-turned earth that an annual grass needs to sow its seeds. When they’re not galloping across the prairie, bison graze patches into the grass, or wallow in it, clearing plots of land with their massive bulk as effectively as any farmer might and opening ground for small fields of Iva and other lost crops. During one of her first spring visits, Mueller stood in a green pool of growth and marveled at three of them—little barley, maygrass, and tiny Iva seedlings—mingled together, as if someone had planted them for an archaeologist to find. Based on their observations at the preserve, Mueller and Glenn have argued, along with Spengler, that ancient foragers might have first thought of the lost crops as a potential food when they encountered these dense stands along bison trails.

So many domesticated plants started out this way, as what we now derisively refer to as weeds. They showed up and showed up and showed up at the edges of human experience, until someone started interacting with them. Wild grasses would not have been so different from the wolves that hung around the edges of human campgrounds and over time evolved into dogs.
Though we rarely give plants credit for such improvisation, some of the more flexible species could have found opportunity, too, in the disturbed ground of those campsite edges.

Seeing the Iva in such abundance on the prairie only reinforces the notion that humans might have begun to gather its seeds, so that selection pressure eventually shaped the plant into a form ever more appealing. In a way, this story is simpler than one that casts humans as heroic inventors who discover agriculture with their big human minds. And this less deliberate version could have happened over and over again, in many places across the planet.

Wheat, barley, and lentils; corn, squash, and beans; rice, peas, potatoes—humans didn’t necessarily choose them as domesticates, and we’re a rebound relationship for some. Like any species, plants can be opportunistic, and many that we now eat had other partners in a previous era, when megafauna dominated North and South America. Squash, for example, started as compact fruit packed with bitter compounds that only mastodons and their ilk could handle. Avocados, too, evolved to feed these giant creatures, with big shiny pits that slid down megafaunal gullets as easily as raspberry seeds pass through ours. But we turned out to be excellent seed distributors too.

We also have our own predilections. Agriculture has slowly rid fruits of bitterness, but the seeds that Mueller and her colleagues harvest from fields, or from the experimental gardens where they’ve grown lost crops, have not undergone that long negotiation with human taste. Boiled or sautéed, goosefoot greens still have a bitter bite. Mueller and the archaeologist Elizabeth T. Horton, another lost-crops scholar, have both tried cooking Iva, with similar outcomes. “It smelled really, really bad,” Horton said. One student had more success grinding it up and making a simple bread.
It had “a light herbal flavor,” Mueller reported.

She has in the past dropped off seeds for Rob Connoley, the chef of the St. Louis restaurant Bulrush, whose tasting menus feature locally foraged foods. When I asked him how he handled the lost crops, he described air-popping goosefoot seeds into garnishes, or working them into chocolate, as a sort of “foraged Nestle’s Crunch Bar.” Raw, the seeds have an unappealing flavor—“dusty, earthy, but oily,” in his experience. Iva is even harder to cook with. Connoley and his crew tried shelling, popping, and toasting the seeds, and only that last strategy worked, kind of. Ground into a paste, the toasted seeds were edible, technically, but “imagine tasting house paint,” Connoley said. “It’s not the best thing by itself.”

Confronted with teosinte, corn’s wild ancestor, a chef might have the same trouble. Like the lost crops, teosinte so little resembles what we think of as food that for decades archaeologists argued over whether it could possibly have given rise to corn, or if they were missing some link, an ancient form of maize. Now that debate is settled: Teosinte is it. At first glance, its long, green leaves do seem like corn’s—I saw a small stand in Oaxaca, grown in the city’s ethnobotanical garden. But it’s wider than corn, less organized in its makeup, and only thin, dried tendrils keep its seeds connected. When the seeds fall to the ground, they look like lost human teeth, gnarled and off-white.

Genetic evidence suggests that domestication makes more sense when you think of it as a long, drawn-out process, rather than an event. At the beginning of a human-plant relationship, humans would have unconsciously exerted selection pressure on plants, which would respond by, say, producing larger seeds or clustering their seeds near the top. Eventually, humans started choosing plants with certain qualities on purpose. Thinking about agriculture’s origins in this way fills some of the gaping holes in the traditional narrative.
For instance: How does a person envision a domesticated plant if they’ve never seen a domesticated plant? (They don’t have to.) And how does a society keep after that vision, generation after generation, for the thousands of years that domestication can take? The slow, evolutionary story, as opposed to the fast, revolutionary one, “doesn’t rely on a few clever people in every society making the decision,” Kistler said. “It just happens. It emerges.”

In this evolutionary process, the domestication of any particular plant need not be a one-off. Again, genetic evidence bears this out: Rice was domesticated at least three separate times, in Asia, South America, and Africa. In the Fertile Crescent, domestication took about 2,000 years, and early versions of wheat and other important crops were spread across the region.

Kistler is an archaeologist by training, and he might, on any given day, have ancient plant samples—pale-orange squash, when I visited—sitting out in his cavernous office in the museum’s back halls. Although he sometimes travels far afield in search of new plant material, much of his actual work takes place on a computer, as he searches the genetic code of ancient seeds for secrets about plants’ pasts. His work has helped show, for example, that teosinte’s journey to become fully domesticated corn took thousands of years and spanned continents. And that hardy bottle gourds likely reached the Americas by floating across the Atlantic, to be independently domesticated on this side of the ocean. Looking at domestication at this level of detail has teased out how each emerging partnership between human and plant has its own story: Cassava, a perennial vine whose roots are packed with enough cyanide compounds to cause paralysis or death, necessarily took a different route to domestication than teosinte.
A plant that evolved fruits to attract some animal or bird as a seed disperser might have a different meet-cute with humans than one that serves us its seeds or roots.

Some of these stories have ended. In the Middle East, a different type of wheat was domesticated in parallel with the one we eat now, grown for hundreds of years, and then, for some reason, slowly abandoned. It is now extinct. In South India, a staple crop called browntop millet largely disappeared. Almost certainly, archaeologists have yet to unearth evidence of other lost crops; some we’ll never rediscover. The era of agriculture still accounts for only a fraction of human history’s 200,000 years, and even in this short time we have narrowed down our options, discarding whole crop systems. We think of ourselves as omnivorous foodies, but we are picky eaters, dedicated to a small group of select foods.

Illustration by Kirsten Stolle

North America’s lost crops were already disappearing from the archaeological record by A.D. 1200, though here and there people were still cultivating them, sometimes for hundreds of years more. An archaeological site in Arkansas, for instance, contained a trove of fat Iva seeds that date to the 15th century A.D., and a couple of glancing references in the journals of early European arrivals hint that some people might still have been eating goosefoot in the 16th century. Perhaps the upheaval of European colonization ended this agricultural heritage altogether. But by then it was already disappearing.

Why did these plants fall out of use? And, in turn, why did corn succeed? On a genetic level, changes in certain parts of the plant genome are associated with domesticated traits, but no one knows exactly which genetic traits might predispose a plant to flip from wild to domesticated, or which might act as barriers to domestication. If we understood that, it would be possible to say more definitively why so few plants have made it into the human diet and stuck there.
“There are 300,000 plant species, and humans have a known use for, like, 10 percent of them,” Kistler said. “We get half our calories from three of them. And we owe our history to a lot more than the ones we think about right now.”

According to its partisans, maize was simply a better crop. But scholars of the lost crops have gone to great pains to show that goosefoot, Iva, and the others are nutritionally competitive with corn. They also know that corn did not supplant the lost crops for hundreds of years. At one moment, corn and those crops thrived as compatible, complementary foods. In a spot not far from where St. Louis sits today, the ancient city of Cahokia, the largest ever discovered dating to the Mississippian period in what’s now the U.S., used to host feasts. Often, Cahokia is considered a corn city, built on maize-centric agriculture, but in the remains of those feasts, squash, sunflower seeds, and all five of the lost crops—maygrass, goosefoot, knotweed, little barley, and sumpweed—are preserved alongside corn cobs.

Those cobs are still only a few inches long, neither the catalyst for domestication in this part of the world nor a panacea that transformed human life here immediately. Corn now rules American fields, but is that a historical contingency, one of those realities that swung a particular way by chance, or the necessary end to the story of American agriculture? In the Andes, goosefoot’s cousin, quinoa, stayed a staple; why didn’t goosefoot settle in America’s midwestern plains? In some parts of the world, crops we think of as winners—crops such as rice—started domestication, then disappeared, nudged into obscurity by biology, history, or both. “I don’t think we’re ready to answer why we have the few dominant crops we have,” Kistler told me.

With the right care and attention, the lost crops might still reveal their allure. They are, Mueller and her colleagues have found, eager to please.
In plots scattered across the country, she and a small group of other archaeologists had started cultivating these plants, the first time in hundreds of years that humans have treated them as food. Mueller originally planted her garden with seeds sourced from across the Midwest, including Iva seeds from Arkansas, where Horton had started growing Iva and other lost crops too. For a while, she and Mueller competed over how tall they could get their Iva, Mueller told me. And Horton kept winning.

The plants started with a population of Iva that Horton found right outside her old office, at the Arkansas Archaeological Survey. (She now has her own macrobotanical consulting company, Rattlesnake Master.) That original stand of sumpweed grows “big and healthy and lush and gorgeous,” she told me, but never more than about five feet in height, typical for wild Iva. In the Arkansas garden, the first year, the Iva grew six feet. The next year, seven. Then eight, and sometimes nearly nine feet tall. Already, she’s finding unusually large seeds too. Mueller and Horton think these plants might have descended, distantly, from domesticated Iva, which could explain their quick changes. Or perhaps Iva’s plasticity makes it respond easily to environmental influences. Transforming the plant’s genes such that it becomes a true domesticate might take ages, but perhaps Iva has a natural flexibility in how it expresses those genes. A plant like that, which responds to human influence so readily, might have been attractive, too, even to someone with no conception of domestication.

Ultimately, Mueller hopes that the lost crops might help reveal the fundamental mechanisms of domestication. When I visited her experimental garden plot, she was growing goosefoot, Iva, and erect knotweed, in configurations that might tell her a little more about the secrets their seeds hold. “What I want to do is redomesticate them,” she told me.

Who knows?
A generation from now goosefoot could be rebranded as North American quinoa, and eaten across the world; Iva could become an acquired taste. By rediscovering the crops that we’ve lost, we could revitalize our idea of what counts as food. We tend to think that we, in our globalized world, eat a variety of goodies greater than any available to humanity in eras past, but like the professor who couldn’t abide pigweed, we have a narrow vision of what passes muster. Historically, domesticating a particular species might have taken thousands of years, but archaeological experiments have shown that the same work can be done in just a few dozen. If we took our cues from ancient diets, we could quickly expand our pantries again. By sampling some of the first foods humans ever grew themselves, we might think again about the possibilities of the world and its growing things, or of rekindling old relationships for millennia to come. We might notice other plants that are growing on the edge of our experience, and wonder what they have to offer.

The story of America’s “lost crops” shows the reign of corn was not inevitable.


People who grow their own fruit and veg waste less food and eat more healthily, says research

Those who grow their own food in gardens and allotments waste less and eat more healthily – but not everyone has the chance to do so.

The rising cost of living is making it harder for people, especially those on lower incomes (who often have poorer diets), to afford to eat healthily. Despite this, households in the UK continue to waste a shocking amount of food – including around 68kg of fruit and vegetables each year.

Food waste is not only damaging to your pocket, it’s also bad for the environment. Globally, 1.3 billion tonnes of food are wasted every year, generating about 8% of the world’s greenhouse gas emissions. These emissions arise from unused food at all stages of the food supply chain, from production to decomposition.

However, our recent study revealed that those who grow their own food in gardens and allotments waste an average of just 3.4kg of fruit and vegetables – 95% less than the UK average. These households adopted various practices to minimise food waste, including preserving or giving away their excess produce.

There has been renewed interest in growing fresh produce in gardens, community gardens and allotments in the UK and elsewhere in recent years. But the available supply of allotments is not enough to meet increasing demand. Allocating more land for household fruit and vegetable production could make a significant contribution to the availability of fresh produce for urban residents.

Research has shown that using a mere 10% of the available space in the English city of Sheffield for food cultivation could supply enough fruit and vegetables to meet the needs of 15% of the city’s population. And more people growing their own food could also reduce waste.

Food diaries

Our study involved 197 households in the UK that grow their own food. We asked them to maintain a food diary, where they recorded the amounts of fruit and vegetables they acquired each week.
We received complete records from 85 separate households. They specified whether each item was cultivated in their garden or allotment, bought from shops or markets, sourced from other growers, or foraged in the wild. The households also recorded the quantity of the produce they gave away to family and friends and the amounts they had to throw out.

Our findings suggest that individuals who grow their own food may be more inclined to avoid food waste than the average person in the UK. This is possibly because they place a higher value on the produce they have grown themselves. The results align with earlier research conducted in Germany and Italy, which found that the amount of discarded food was greatest among people who shopped exclusively in large supermarkets. People who purchased items from various small stores tended to waste less food, while those who grew their own food wasted the least.

Our findings also suggest that the households we studied can produce roughly half of all the vegetables, and 20% of the fruit, they consume annually. These households consumed 70% more fruit and vegetables (slightly more than six portions per day) than the national average.

Eating plenty of fruit and vegetables as part of a balanced and nutritious diet is key to maintaining good health. This kind of diet can help prevent diseases such as type 2 diabetes, certain cancers and heart disease. Yet, in the UK, less than one-third of adults and only about 8% of teenagers eat their “five-a-day”. This target, which is based on advice from the World Health Organization, recommends eating at least five 80g portions of fruit and veg every day.

Grow your own food security

Growing your own food can improve access to fresh fruit and vegetables, promote good health and reduce food waste. However, several obstacles hinder involvement in household food production.
These obstacles include limited access to the land, skills and time needed to grow your own fruit and veg. Approximately one in eight UK households lack access to a garden. And, since the 1950s, the availability of allotments throughout the UK has declined by 60%. This decline has been particularly evident in more deprived areas of the country, where people could benefit most from better availability of nutritious foods.

We also found that those who grew their own food dedicated approximately four hours each week to working on their allotment or garden. Unfortunately, not everyone has the luxury of the time to do so.

Nonetheless, raising awareness that the benefits of home food production go beyond food security and reduced waste – to include positive impacts on social cohesion, overall well-being and biodiversity – could encourage more people to participate. Increasing demand for growing space may also encourage local authorities to allocate more land for this purpose.

Whether you grow your own food or not, everyone can adopt mindful practices when purchasing or growing food. Planning ahead and freezing or sharing excess food with others to prevent it from going to waste are good options. But some food waste is inevitable. Composting it instead of sending it to landfill will substantially lower its impact on the planet.

Boglarka Zilla Gulyas, Postdoctoral Research Associate in SCHARR, University of Sheffield and Jill Edmondson, Research Fellow in Environmental Change, University of Sheffield

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Lead poisoning could be killing more people than HIV, malaria, and car accidents combined

A mother and child at a Médecins Sans Frontières clinic in Anka, Nigeria, the site of a major lead poisoning outbreak in 2019. | Kola Sulaimon/AFP via Getty Images

However bad you think lead poisoning is for the world, it’s worse.

Everyone knows lead is bad for you. We’ve known this for a very long time: in the first century BCE, the Roman architect Vitruvius warned against using lead in pipes, observing the “pallid color” of plumbers forced to work with it. We know leaded gasoline leads to premature death in the elderly, that high lead exposure can substantially reduce IQ, and that there is likely a relationship between lead exposure in children and high rates of crime later on.

Yet lead is still everywhere — especially in poorer countries. Pure Earth, the largest nonprofit working on lead contamination internationally, recently conducted a massive survey of products in 25 low- and middle-income countries, from Peru to Nigeria to India to the Philippines, to test for lead levels in household goods. In their sample they found high levels of lead in 52 percent of metal and 45 percent of ceramic foodware (a category including dishes, utensils, pots and pans), as well as 41 percent of house paints and 13 percent of toys.

This has major consequences. A new paper in Lancet Planetary Health, authored by economist Bjorn Larsen and Ernesto Sánchez-Triana, the World Bank’s global lead for pollution management, tries to quantify the scale of the lead problem globally. The authors estimate that some 5.5 million people die prematurely due to lead exposure every year, and that the problem as a whole imposes a social cost of $6 trillion a year. That equals 6.9 percent of total world GDP.

These are massive numbers, and it’s worth putting them into context: 5.5 million deaths from lead in 2019 exceeds the number of people who died that year from car accidents (1.2 million), tuberculosis (1.18 million), HIV/AIDS (863,837), suicide (759,028), and malaria (643,381) combined.
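The “combined” claim is easy to verify from the figures quoted above; a quick sanity-check sketch, using only the numbers as the article reports them:

```python
# 2019 deaths from the causes the article lists, in millions (figures as quoted).
other_causes = {
    "car accidents": 1.2,
    "tuberculosis": 1.18,
    "HIV/AIDS": 0.863837,
    "suicide": 0.759028,
    "malaria": 0.643381,
}
lead_deaths = 5.5  # estimated annual premature deaths attributed to lead exposure

combined = sum(other_causes.values())
print(f"combined: {combined:.2f} million vs lead: {lead_deaths} million")
# combined comes to roughly 4.65 million, so the lead estimate does
# exceed all five of these causes of death put together
```

The five causes sum to about 4.65 million deaths, comfortably below the 5.5 million attributed to lead.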
If accurate, the figure means that a little under one in 10 deaths globally can be traced to lead. Meanwhile, a social cost of 6.9 percent of global GDP exceeds a recent World Bank estimate of the social cost of air pollution, which added up to 6.1 percent of GDP.

Digging into the cost of lead

These massive numbers may seem more plausible when you consider just how prevalent serious lead exposure is in developing countries. A 2021 evidence review led by environmental scientist Bret Ericson examined blood lead surveys in 34 nations, which together account for over two-thirds of the world’s population. Overall, those studies estimated that 48.5 percent of children had high lead levels (defined as above 5 micrograms per deciliter, or µg/dL). Levels of exposure varied greatly, with surveys in a few countries (like Tanzania) not finding any children with blood lead levels above 5 µg/dL, and other countries (like Pakistan) showing huge majorities with levels that high. (Of course, it’s possible that limitations in these surveys underestimate lead exposure in some countries.)

We can identify a number of possible sources of these high lead levels. Historically, the major driver was lead in gasoline, but in 2021 the last country on Earth still using lead for that purpose (Algeria) phased it out. Lead is widely used in car batteries, plane fuel, and the consumer goods that Pure Earth surveyed in its report, but which source is most important in contributing to poisoning in children is still unclear.

For one thing, we know surprisingly little about how lead in, say, a plate translates into lead in the system of a human eating off that plate. The Pure Earth study included a test of some aluminum cookware, wherein it boiled acetic acid (the main ingredient in vinegar) in the pots for two hours and then tested the liquid for lead. Fifty-two percent of the pots had leached an amount of lead above the World Health Organization guideline level for drinking water.
That suggests that food cooked in such pots would contain lead, which would then poison the children who eat it. But much more research is needed.

Indeed, “more research is needed” is a decent summary of the whole state of lead research. The Lancet Planetary Health study finding that lead kills 5.5 million people a year relied on lead poisoning estimates from the Global Burden of Disease study, which sometimes produces its numbers not based on surveys of actual people, but on other data (like the share of the population in urban areas, and the year that leaded gasoline was phased out) that is in turn predictive of lead exposure.

Further, the Lancet study estimates deaths caused by lead-induced cardiovascular disease based on US studies of the effect of lead on cardiovascular disease rates. Air pollution expert Roy Harrison told the news agency AFP that applying such findings to the whole world is “a huge jump of faith.”

That said, any errors could go in both directions. The actual surveys of blood lead levels that Ericson and coauthors compiled showed that the problem was worse than the Global Burden of Disease data suggested. And lead might be a greater risk factor for cardiovascular disease in poor countries than in the US, because the US has more medical resources to counter the negative effects of lead. All of that could mean the new study actually underestimates the damage lead is doing.

The only way to get a stronger sense of the scale of the problem is to invest more in understanding it. A 2021 report found that nonprofits spend, at most, $10 million a year addressing lead exposure in developing countries, with much of that money coming from governments. For comparison, global efforts to fight HIV/AIDS, which, if this new report is to be believed, kills about one-fifth as many people globally as lead does, got $8.2 billion in government funding in 2022 alone.
The point here is not that we’re spending too much on HIV/AIDS — we may well still be spending too little there. But we’re spending far too little on understanding and tackling lead exposure, which could be a problem of similar or greater magnitude. It’s among the most neglected problems in global health, and one where a substantial investment could go a long way.
