This story is part of the In-Between, Slate’s series on how life is slowly getting back to normal.
In the spring and summer of 2020, before we quite understood how weirdly uneven the American experience of COVID would be, the media produced a spate of “COVID will change everything” articles. Remember “COVID will be the end of the brick-and-mortar college”? Or “COVID will kill public transit”? Obviously, we can’t say with any real confidence yet what COVID “has done,” in terms of societal shifts. But, looking at the cultural remnants of past infectious diseases, we can get a general sense of the sorts of things that might stick around as “hangovers.” Chances are Americans will be left with a grab bag of COVID-related adaptations: some useless, some useful; some cruel, some generous; some intelligent and wonderful, some regressive and restrictive. The verdict will be decades in the making.
Many familiar—and good!—features of our houses and cities came about as responses to 19th century epidemics of contagious disease. Terrifying bouts with cholera in the first half of the 19th century led to the building of municipal sewer systems and better regulation of housing; in the baby city of Chicago, in 1833, officials prompted by fears of the contagious waterborne disease banned disposal of animal carcasses in waterways and instituted practices of street cleaning and waste disposal. Because some people believed disease was spread by miasma, or “foul air,” architectural responses to 19th century epidemics included verandas, wider spacing between buildings, outdoor courtyards, and tree planting. The acceptance of germ theory in the late 19th century prompted changes to the way houses were built, like the switch from armoires to closets to make rooms easier to clean, and the inclusion of a half-bath on the first floor to contain visitors’ germs away from where the family might bathe.
Cholera-inspired sanitation regulations aside, cities’ experiences of epidemics didn’t always translate into upgrades in quality of life for those whose living conditions weren’t great. Last year, I interviewed Kathryn Olivarius, a historian who is writing a book about yellow fever and social inequity in New Orleans in the 19th century. What happened in that city, I wondered, after that disease ceased to be a threat? Wealthy people in New Orleans, Olivarius said in an email, kept up with their seasonal migrations: “The rich would still leave town at the beginning of summer and come back in the early autumn, well into the 20th century.” Their business schedules and social lives had been organized around the possibility of warm-weather epidemics, and the habit persisted after yellow fever became a bad memory. “Plus,” she wrote, “New Orleans is just hot in the summer, so it made sense.” The more meaningful hangover from yellow fever, she thought, was political. “I think that the legacies of small state thinking, disease denialism, and a libertarian rather than communitarian approach to health are an outcropping from yellow fever epidemics,” she added.
And indeed, quite often in the United States, epidemics lead to a proliferation of individual consumer solutions to disease. In the 20th century, as major threats from most infectious diseases receded with better sanitation, vaccines, and medical treatments, their memory lived on in the marketplace. What historian Nancy Tomes calls the “germ sell” was huge in the 1920s and 1930s, as advertisers used the fear of microbes to sell a wide array of consumer goods: VapoRub, Absorbine Jr. (a treatment for athlete’s foot), Cremo cigars (rolled by machine, to reduce risk of germ transmission from workers).
In her work, Tomes analyzes editorials published in advertising trade journals during the 1918–19 influenza pandemic, noting that some bigger and more prestigious advertisers tried to step carefully around current events. Some chose to avoid looking mercenary—toning down the “scare” copy, selling products like toothpaste with a straight appeal to the value of retaining good health. But after the pandemic, advertisers harked back to the bad memories of that experience by referring obliquely to it, calling for people to purchase their stuff to achieve the vague goal of “hygiene.” It helped that nobody really understood why the recent influenza had been so deadly—an uncertainty that left the door open for advertisers to hawk health remedies by referring to the need not to “neglect a cold, or it might turn into something worse.” This approach, Tomes writes, worked “a deep vein of anxiety” going back to the fear around the 1918–19 pandemic.
Paper products and cellophane packaging exploded in popularity after germ theory became common knowledge and after the pandemic concluded. Before World War I, Tomes points out, public health efforts to sell people on buying a paper cup instead of sharing a communal one at public water fountains failed completely. In those prewar, pre-pandemic times, the only kind of disposable paper product many Americans bought was toilet paper. After the war and the pandemic were over, wrapped food, paper cups, paper plates, Kleenex, and Kotex became quotidian purchases. In her book The Gospel of Germs, Tomes describes how the manufacturer DuPont capitalized on the mood, using advertising copy that invoked the possibility that the food people were buying had been touched by others: “Strange hands, inquisitive hands, dirty hands. Touching, feeling, examining the things you buy in stores”—ack! Give me that cellophane!
The new knowledge, and maybe people’s experiences of the flu, contributed to a widespread mood of consumer paranoia. Even as tuberculosis death rates declined in the 1920s, and chronic diseases replaced infectious ones as the leading causes of death in the United States, newspaper and magazine editors got hooked on covering “germ diseases”—especially epidemics. Tomes found evidence that the press had a very specific understanding of what worked: Disease stories, in the form of exposés that overthrew conventional wisdom, warned of a secret consumer danger and prodded people to change their behavior immediately. Editors looked to medical literature to find out about outbreaks of “unusual diseases” (brucellosis, amoebic dysentery, psittacosis)—little flare-ups that affected small numbers of people and were “serious enough to cause concern without being so overwhelming as to produce extreme denial or despair.”
The same memory of past epidemics that sold products was also used to perpetuate segregation. Sara Jensen Carr, whose book The Topography of Wellness: How Health and Disease Shaped the American Landscape is coming out later this year, found that housing covenants governing the development of suburbs from the 1920s through the 1960s used language around infectious disease to justify racial exclusion. “There was a lot of language in these housing ordinances that said that white families couldn’t sell to people of races with higher disease rates than them, and argued this was backed up by ‘science,’ ” Carr said. “Of course, this was another justification for something they would have tried to do, anyway.”
There were also ecological consequences to Americans’ epidemic memories. In the midcentury period, the fear of polio led American towns and cities to clamor for the use of DDT, a chemical that we can still find in our ecosystems almost 50 years after it was banned. The much-heralded use of DDT in the war effort, in service of controlling malaria and typhus, led to postwar public acceptance of the use of the chemical in the United States. And because some people thought that flies might carry polio (they didn’t), the miracle chemical began to look like the answer to the terrifying polio outbreaks that haunted American parents; for about 15 years, people sprayed DDT and thought that it would keep polio away. Historian Elena Conis’ article on the use of DDT for polio between the end of World War II and the early 1950s describes health officials in San Antonio spraying every block in the city, “asking householders to leave windows and doors open so the fog could penetrate their homes.”
The long-term impact of the midcentury polio epidemics was also felt in the political sphere. Polio survivors, historian Naomi Rogers said, made a big impact in the world of disability rights. “By the early ’60s,” she said, “you see some polio survivors start to say they’re just sick and tired of being treated as ‘imbeciles’ or ‘feeble-minded,’ and they began to try to organize for their civil rights.” Among the people Rogers pointed to are Ed Roberts, who sued the University of California, Berkeley, to accommodate his attendance, then became a leader of the independent living movement, and Paul Longmore, an activist and a leading light in the field of disability history. In the 1980s, many older polio survivors were struck by post-polio syndrome—a worsening of their symptoms, decades after their illness and initial rehabilitation. That experience radicalized still more survivors, and many of them joined other disability rights activists in lobbying for the passage of the Americans With Disabilities Act of 1990.
The post-epidemic effects that give historians of medicine the most pause in considering what the post-COVID world will be like are the ones that Nancy Tomes, in an interview, compared to the Maginot Line—the massive and sophisticated set of fortifications, built by France in the 1930s to keep Germany out, that Germany bypassed in 1940 by invading through Belgium at a part of the line that was poorly defended. In his history of venereal disease, Allan Brandt writes about how, after the midcentury discovery of antibiotics radically changed the treatment of syphilis and gonorrhea, public health officials struggled to keep up, spending more and more money on programs that served the wrong populations. In 1976 in New York City, Brandt reported, officials performed 116,000 premarital examinations at a cost of $2.3 million, looking for cases of disease in prospective brides and grooms; they found only 39 new cases of syphilis, at a cost of about $60,000 per case, when they would have been more effective looking for cases among “higher-risk populations, such as homosexuals, college students, and sexually active teenagers.”
And so the disease that you spend a lot of time fighting and worrying about creates grooves in your habits that don’t serve you well when the next disease comes along. “Some of it is the persistence of behaviors that you learn as a child, that you just repeat. Some of it is generational memory,” Tomes said. When she was researching her book about the culture that sprang up around infectious disease in the early 20th century, she was deeply interested in the way that the health practices people learned back then, which focused on things like fomites and spit, didn’t work at all to prevent the transmission of HIV—yet people learning about HIV couldn’t seem to pivot away from them.
“HIV/AIDS wasn’t spread by casual contact,” Tomes said, “and the amount of suffering inflicted on people who became HIV-positive because of this misapplication got me thinking about how we basically learn one set of rules that we apply to everything contagious. But in fact, the world of bacteria and viruses is far more complex than that. It’s possible the next viral variant that comes along will spread in some new way. It could spread through water, I don’t know! We are really unprepared.”
As COVID subsides, will we keep our masks and wear them as a matter of course during coming flu seasons? Will we lobby for more workers to have paid sick leave so they can stay home when they’re not well? Will we stock more toilet paper and canned beans, remembering those weeks when we couldn’t get any? Time will tell which, if any, of these hangovers will become habit.