The Problem With Lab-Grown Meat

Lab-grown meat could end up replicating some of the existing problems with our industrial food system.

Lab-grown meat from the U.S. is presented in the Disgusting Food Museum on Dec. 6 in Los Angeles.
Robyn Beck/AFP/Getty Images

On May 8, Future Tense and the New America Fellows Program will host the event “Will Slaughter-Free Meat Change the American Way of Eating?” in New York at 6:30 p.m. For more information and to RSVP, visit the New America website.

At the 2019 Milken Global Conference, which brings together business and public policy leaders to explore how market-based solutions could help address social problems, a cohort of speakers suggested that revamping our food system could be key to intervening in climate change. They argued that food innovations such as lab-grown meat were critical to creating a better and more just food future. Their sentiments echo political scientist Jan Dutkiewicz’s claim that lab-grown meat is the solution that would make the Green New Deal possible.

Advocates for lab-grown meat say that beyond helping fight climate change, it will also improve animal welfare and shake up our food production system. But there is a problem with cellular agriculture—another name for lab-grown meat—that the cheerleaders don’t seem to be talking about. In key ways, lab-grown meat is built on the same foundational logics of our current industrial food system. As a result, it’s firmly on the road to replicating many of the challenges that it claims it will address, and in the process risks making a food future that is worse, rather than better, for eaters.

To think through this, we need to look not at the food system itself, but rather at what it does: Provide energy to power working human bodies. Thinking of food this way came to the forefront of U.S. nutritional thought in the late 1890s, when Wilbur Atwater and E.B. Rosa figured out how to measure, using a calorimeter, the energetic potential stored up in food. Researchers quickly adopted this approach, publishing pamphlets and guides to maximizing the energetic efficiency of foods consumed and minimizing waste. Over the ensuing century, the calorimeter would be joined by other instruments including the clinical test, gas chromatograph, mass spectrometer, and PCR in the effort to identify and understand how the molecules found in food functioned physically and biologically. Reducing both bodies and the food they eat to their molecular components resulted in some fantastic discoveries. Researchers figured out how to chemically reproduce compounds found in nature such as vanilla and learned that many diseases (such as rickets and scurvy) could be easily addressed through vitamin supplementation. Food could be quantified, measured, and formulated specifically to improve human well-being. Enter fortification.

This molecular understanding resulted in food becoming something to be taken apart and put back together. I think of this as the Lego building block model of food: Rather than imagining bread as flour, water, yeast, and salt, if I instead think of it as a combination of sugars, starches, proteins, fats, and minerals in specific ratios, I can then identify potential points for carefully tinkering with bread. What happens if I change out the ratio of saturated to unsaturated fat? What functions are gained if I increase protein content?

As a trained molecular biologist and food chemist, I love the way this approach allows me to imagine and re-imagine food as a series of molecular interactions available for tinkering with to improve flavor, texture, shelf life, even human health. As a scholar of how food science and technology act in society, however, I am less enchanted with the on-the-ground impact of reducing everything to the molecular. Molecular reductionism is what facilitated the growth in the early 1990s of fat-free foods packed with moisture-holding sugars as a replacement for fat (remember SnackWells?), and the subsequent boom in the late 1990s and early 2000s of functional foods such as pastas fortified with omega-3 fatty acids.

Let’s say that through scientific research and technological innovation, industrial food producers can make foods that act like preventative medicine, prolonging health and staving off impending ills. If so, this is great for shareholders: These functional foods come with a higher price tag. It’s great for eaters with the money necessary to bring these foods into their daily diet. But it’s detrimental to those on limited budgets, those who live in rural or urban food deserts, or those unaware that such edible health interventions exist. Indeed, the presence of these foods on the market—with their carefully engineered extraction and concentration of ingredients understood as having significant impact on health—reinforces the idea that access to healthful eating requires going through the technological and scientific expertise found in the industrial food laboratory. That logic continues to undergird foods that appear “whole” at face value, such as the Kuli Kuli bar or the more tongue-in-cheek Nonbar. Unable to access the techniques, ingredients, and technologies that allow the creation of such foods, a large swath of humanity is excluded from the ability to produce the types of foods that are advertised as allowing them and their children to be healthier and live longer. This move not only perpetuates an already unequal access to health care and services, it promotes the erasure of traditional food ways in favor of an industrialized diet high in processed foods.

Cellular agriculture doesn’t make the same health-related claims as functional foods. But it operates on the same assumptions: that you have to go through the industrial research and production assemblage to access this “clean,” environmentally friendly meat. There, in the lab, we find molecular reductionism hard at work as researchers seek to understand how to get animal protein cells to grow in spaces foreign to their being. We find molecular reductionism driving investigation into how to replace the animal-based serums critical to cellular growth so the product can live up to its “animal-free” label. We find the same far-flung supply chains and base materials that rely on petrochemical extraction. And we see once again a divide between who can even produce these foods, let alone get access to them.

I like to contrast the widely circulating hype about lab-grown meat with a story my cousin told about my uncle Merlen at his funeral. His favorite food, she noted, was fried chicken. But fried chicken was not readily available in his Idaho hometown. It was a luxury for a family with five boys. So he saved up money from a side job, and one spring bought 100 live chicks (plus 20 free ones) from a mail-order catalog. Over the next months he cared for the chickens, and then when they were old enough, he and his mother slaughtered them, cleaned them, and took them to the freezer his family rented for storage. He ate a lot of fried chicken that summer.

I recognize how foreign my uncle’s experience is to my own life as well as the lives of a large swath of the U.S. population. But I also see in his experience a third logic that both lab-grown and industrially produced meat rely upon: that of disrupting (or obfuscating) what geographer John Law and anthropologist Annemarie Mol refer to as the metabolic intimacy between humans and the animals we eat.

Historically, that intimacy took the form of humans and animals living in close proximity to each other, with food scraps from the table or cover crops on the farm going to feed the animal, before the animal in turn fed the human, either through contributing its waste to improving soil quality or through its own flesh. More recently, that intimacy has taken on other forms as the distance grows between where animals are raised and eaten. It now shows up in antibiotic resistance due to low-level use of antibiotics in confined animal feeding operations and in toxic concentrations of un-composted animal waste that contaminate groundwater supplies. Although these interactions are now far-flung, and much less visible than walking out into the backyard to feed your chicken table scraps, they nonetheless link eaters in one place with production methods in another. It’s too early in the cellular agriculture game to see where all the pieces will land, but one thing is clear. A massive move to lab-grown meat will continue disrupting the metabolic intimacy located in a first-hand experiential understanding of where food comes from, what resources are required, and what hands are asked to (or even able to) do the labor of producing it.

Rather than take a careful look at our own tastes and desires and how they tie us into larger systems of inclusion and exclusion, lab-grown meat as currently constituted offers a cotton-candy bedtime story about making a different world through food. Unfortunately, unless the logics underlying its production change, cellular agriculture is going to give us the same fluff we already have.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.