About 10 years ago, it seemed like everywhere you looked, someone was talking about the importance of the human microbiome. In 2003, the Human Genome Project finished mapping and sequencing all human DNA, and the National Institutes of Health turned attention to studying the nonhuman genes that inhabit us, those of the trillions—trillions!—of bacteria, viruses, fungi, and other microorganisms that live on and inside our bodies. Between 2007 and 2016, the NIH awarded $170 million to the Human Microbiome Project to investigate our relationship with our microbiota.
This project and other research showed that our reliance on microbiota cannot be overstated. Several early seminal studies showed that the microbiome is integral for regulating both our immune system and internal organs. Taken together, microbiota account for only about 2 to 6 pounds of the average human body’s weight, yet roughly half of its total cell count. These microbiota, collectively dubbed the “virtual organ,” express 2 million to 20 million genes, outnumbering human genes more than 100-fold. These organisms digest our food; maintain the pH of microenvironments such as saliva, bile, gastric acid, and tear ducts; and remove dead cells so live ones can take their place. They colonize our skin, hair, and armpits, and coat every nook and cranny of our bodies inside and out. The presence of a healthy array of microbiota is imperative for maintaining nearly every physiologic process and for thwarting growth of the pathogens that make us sick.
But it’s not just our microbiome that is important. Even potentially harmful pathogens may matter to our health. Indeed, many evolutionary biologists believe the single most important driving force in evolution is the race to outwit our constantly evolving pathogens. According to this idea, known as the Red Queen hypothesis, without this competition we might still be eight-celled organisms swimming in a pond.
As research on the importance of the microbiome and of exposure to external microbes proliferated, it seemed that journalists and the public were open to this message. In 2013, New York Times stalwart Michael Pollan was extolling the glories of living life so as to maintain one’s microbial diversity. A 2015 Bloomberg article pointed out that trying to avoid bacteria was both pointless and wrongheaded. The popular 2016 book Let Them Eat Dirt was blurbed by none other than parenting guru and bestselling author William Sears as “A must-read for parents, teachers and any healthcare provider for children.”
By 2019, public acceptance was growing that exposure to microbes–and, yes, even to infectious agents–was inevitable and, in fact, mostly beneficial.
And then COVID-19 hit.
Now all of that progress has been shoved under the wheels of cleaning protocols and abandoned for fantasies of a sterile environment. We have confused dirt and grime, and even “germs” in the traditional sense—that is, microbes spread from one human to another—with deadly pathogens, and we are now desperately trying to avoid all of it. The result is that some people walk their dogs outside all alone in an N95 mask, apply hand sanitizer often and liberally, and talk about how they “don’t want to ever get a cold again.” The New York City transit authority has an entire webpage devoted to its cleaning protocols, including applying “antimicrobial biostats” after completing a full disinfection regimen. Children are excluded from school because they have runny noses, when it is precisely children’s exposure to other kids with runny noses that helps them build robust immune systems by adulthood.
Regular exposure to microbes and allergens at a critically young age helps refine the immune system and trains it to discriminate between self and non-self—a concept known as the hygiene hypothesis. Studies show that young children who live on farms or with pets develop allergies and asthma less frequently than those who are not exposed to animals and their environments. Likewise, children who suck their thumbs and bite their nails—habits that introduce a wide variety of microbes into the oral cavity, and thus into the rest of the body—also have heightened protection against allergic diseases compared with those who do not.
Microbes aren’t just good for training immune systems, though. They are also critical for our overall health and well-being. Adults who live on farms also have more diverse gut microbiota, as do people who have more frequent and closer social interactions than those who do not. Lack of gut microbial diversity is associated with a large range of health issues, including obesity, anxiety, and depression. (It is why the gut is sometimes referred to as “the second brain.”) Gut microbiota mediate both digestion and immunity by outcompeting pathogens for nutrients and space and by stimulating innate immune responses. Gut microbiota also play an important role in maintaining and regulating microbiota in other organs, such as the lungs and heart, and an imbalance may worsen respiratory illnesses, including COVID-19. Skin microbiota, often the first line of defense against an infection, can stimulate specific gene expression signatures in T cells, which protect against infection and accelerate wound healing. It is unsurprising that lack of diversity in microbiota is associated with pulmonary and cardiovascular diseases, autoimmune diseases such as diabetes and multiple sclerosis, and even cancer and mental disorders.
Microbial diversity has taken a major hit in recent decades, long before the pandemic, likely due to ubiquitous cleaning and sanitizing protocols, diets high in processed foods, and more sedentary, indoor living. But COVID-19 mitigations implemented over the past two years, such as lockdowns, masking, distancing, and widespread use of disinfectants, can only have compounded the problem. Ironically, staying inside—where we are exposed to fewer microbes and fewer people, and where many of us also drank more alcohol and gained weight—likely contributed to the very weakening of immunity that hampers our ability to fight off SARS-CoV-2.
We may already be seeing effects of these pandemic isolation policies on immunity. One possible explanation for the recent spate of severe hepatitis in young children is that, after two years of being sheltered away from other humans, children’s immune systems are less adept at fighting previously mild pathogens such as adenoviruses. (There is so far no evidence that SARS-CoV-2 itself is causing the hepatitis directly.) A related phenomenon, immunity debt, in which lack of exposure to others causes a high burden of infectious diseases when isolation is ended, likely contributed to RSV outbreaks in infants and toddlers last summer and fall.
Thankfully, many of our initial misguided mitigations to prevent the spread of COVID-19, such as wiping down groceries, filling in skateboard parks with sand, and swathing kids’ swing sets in plastic wrap, have been phased out. But too many still remain. There are, today, still “clean” pen bins at banks and doctors’ offices, QR code menus in restaurants, and plexiglass barriers in some schools, grocery stores, and medical waiting rooms. Children are still wiping down desks with bleach, Lysol, and other toxic products, and going to school with hand sanitizers clipped to their backpacks. Even the CDC has kept a detailed protocol for deep cleaning on its website, despite the now well-established consensus that the risk of surface transmission of SARS-CoV-2 is infinitesimally small. There is still a strong stigma about going out with even a minor cough or sniffle, articles keep urging us to rethink the handshake as a cultural norm, and, in the middle of allergy season, sneezing is viewed as a social transgression.
Measures such as separating pens have likely persisted because they are viewed as harmless. Who is a tub of “clean” pens actually hurting? And while individually these protocols may be relatively insignificant, the combination of all of them and their seemingly endless insertion into daily life may actually be doing harm by contributing to an environment that is “hyper clean,” and by fostering the idea that encountering any microbes at all is detrimental to health.
Coming off of two years of COVID-19 restrictions that kept us away from others, on top of constant messaging that spreading germs can be deadly, it is understandable that many people find the fantasy of sterility appealing. However, as we exit the pandemic phase, we must restore some of our equanimity toward the microbial world and regain an appreciation of how integral these organisms are to our survival. Vaccines, not Lysol wipes, are our best defense against the harms of SARS-CoV-2. And microbes are not the enemy. They are part and parcel of who we are: 8 percent of our DNA is actually viral remnants, and another 40 percent is thought to have viral origins, including the genes that give rise to the placenta, the organ that defines us as mammals and keeps us all alive for nine months in utero.
We would do well to remember that nature abhors a vacuum—when microbes that have lived with us for hundreds of thousands of years are cleaned away, other worse ones may take their place. We need to let ourselves swap microbes with other people and with our environments again. It’s literally how we rose up out of the swamp and became human.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.