Less than a mile from the vast jail compound on Rikers Island in New York lies an abandoned hospital that once incarcerated a working-class woman for 26 years. Mary Mallon, better known as Typhoid Mary, was a single Irish immigrant without children who worked as a cook for wealthy New York families in the early 1900s. Although she was healthy and had developed immunity to typhoid fever, she unwittingly infected others with the deadly bacteria she harbored in her organs.
Newspapers of the time called Mallon a disease factory, a germ, a witch. She was not immune to those personal attacks. Doctors and government health officials held her at the Riverside Hospital on North Brother Island near New York City for study, under the banner of public safety. Meanwhile, according to Susan Campbell Bartoletti, author of Terrible Typhoid Mary, other known carriers of typhoid in the state who had infected far more people with the fever, such as a farm worker named Tony Labella and an outdoor guide in the Adirondacks dubbed Typhoid John, roamed free.
What happened to Mary Mallon beginning in 1907 was not, however, an isolated incident in medical history. Rather, it is merely the most notorious case of a common community response to epidemics across history and throughout the world: confining those suspected of carrying a deadly strain of an illness and restricting their movement.
Quarantine, as it's been called due to the 14th-century Venetian practice of detaining ships for 40 days (quaranta giorni) to stave off the plague, can be a powerful tool for fighting disease outbreaks. It was credited with preventing deaths during the 1918 influenza pandemic, when cities around the United States and Europe closed schools and theaters and canceled funerals and sporting events, placing people exposed to the so-called Spanish influenza under house arrest or sending them to city hospitals. The origins of the tactic date back as far as the Old Testament: Leviticus 13:4–46 calls for the isolation of suspected “lepers” by priests.
The list of diseases that justify quarantine in the U.S. and elsewhere has expanded in recent decades as new outbreak fears have emerged. During the 2003 SARS outbreak, quarantine resurfaced in the eyes of public health officials as a response suitable for the pandemics of the modern, interconnected world—not merely a tool of the bygone eras before antibiotics and sanitation. Quarantines were enforced in Guinea, Sierra Leone, and Liberia during the Ebola epidemic of 2014–16, and imposed by New Jersey and New York on health care workers returning from West Africa. Quarantine, as a matter of law in the United States and much of the world, can be used to restrict the movement of people who seem healthy—if they are suspected to have been exposed to a contagious disease in the recent past. (For some diseases, this makes sense; during the latest major Ebola epidemic, however, people were not contagious unless and until they exhibited symptoms.)
Today, the threat of pandemics is rising, thanks to a growing world population, more frequent movement of people and goods, and a warming climate hospitable to disease-spreading bugs. That’s why it’s important for us to take a hard look at how quarantines are carried out if we hope to avoid the sins of the past. Long before and ever since the time of Typhoid Mary, the life-saving tool of quarantine has also targeted and harmed the poorest, most vulnerable members of society. In the United States, we would be particularly wise to exercise vigilance about government-enforced quarantine in this historical moment: We live under political leadership that has proved itself unashamed to use executive power to tear apart poor migrant families and turn away religious minorities. Wielding the legal authority of quarantine to target specific populations would be just a small step for a president and White House that has already taken giant leaps encroaching on civil liberties.
Over the past two decades, I have witnessed firsthand the dangers posed by quarantines—and their frequent sidekick, stigmatization of the ill. When I lived in Vietnam 13 years ago, the government had underway a “social evils” campaign, aimed at combating the spread of HIV and AIDS. Officials sent sex workers and intravenous drug users to facilities where they remained under watch and were forcibly rehabilitated from their former lifestyles. Soviet-style propaganda posters hung on roadsides and café walls demonizing HIV and AIDS patients alongside heroin addicts and prostitutes. The stigma of a positive diagnosis drove people who suspected they might have HIV underground and into the closet, where they posed greater harm to those they might have infected unwittingly and to themselves. At the rural leprosy hospital in Thai Binh province where I worked, elderly leprosy patients lived, played cards, attended Buddhist meditation sessions, and even sometimes married each other despite lacking some or all of their limbs. More and more young patients began coming to the isolated leprosy hospital to be tested for HIV, reporting that it was the only place they felt they could escape the eyes of their neighbors, who would stop drinking tea with their parents if they knew the illness afflicted a son or daughter. Leaders from aid organizations warned that the social evils campaign was undermining the ability to treat HIV patients and to curb the spread of the virus, while forcing seriously ill AIDS patients to live in the shadows or in social isolation.
The scene in Vietnam echoed what I had heard at the start of the millennium, when I lived in Havana, Cuba. Gay HIV and AIDS patients there told me about their experiences being imprisoned in government-run sanatorios, hospitals outside of cities where such patients were routinely isolated from their communities from the mid-1980s until the early 1990s. By 2000, HIV patients were no longer forced to stay at the countryside facilities, but for some, the sanatorios remained the best place to receive medical care and increased their chances of getting access to antiretroviral therapy.
Stories like these might sound typical of socialist dictatorships—the kind of incidents that would never happen in liberal democracies. But it’s not just oppressive regimes that marginalize and stigmatize the sick. The sordid past of democratic government quarantines should serve as a clear warning for the future.
Harvard medical historian Allan Brandt has documented how, during World War I, the United States government imprisoned nearly 30,000 women who were sex workers or “camp girls” who fraternized with men, out of fear that they were spreading sexually transmitted diseases to potential military recruits. In the early 20th century, poor and Jewish immigrants who traveled in steerage and third class to the United States from Europe were quarantined at much higher rates than rich travelers, who were barely examined for possible illness. In 1900, President William McKinley issued a quarantine of all Chinese and Japanese people in the city of San Francisco after a man was found dead from the plague in a Chinatown basement. The decree led many laborers of Asian origin to lose their jobs before it was ruled unconstitutional by a federal court for violating the Equal Protection Clause of the 14th Amendment.
More recently, in August 2014 during the Ebola epidemic, the democratic government of Liberia quarantined with military forces an entire poor neighborhood in Monrovia known as West Point. The community protests that emerged led to many injuries, one death, and the premature end of the quarantine before the virus’ incubation period of 21 days. Researchers have documented how many poor families in Liberia were prohibited from conducting burials and stigmatized for violations, while wealthier families could circumvent burial bans through secret bribes to funeral homes. Meanwhile, in the United States, public health experts called for New York and New Jersey to end their mandatory 21-day quarantines of health care workers who had treated Ebola victims, because of the indiscriminate application to people who were healthy and therefore not contagious. They also worried about the chilling effect of quarantines on doctors and nurses volunteering to travel abroad to help stop the epidemic at its source.
Fear often overshadows reason amid deadly disease outbreaks. “The use of segregation or isolation to separate persons suspected of being infected has frequently violated the liberty of outwardly healthy persons, most often from lower classes, and ethnic and marginalized minority groups have been stigmatized and have faced discrimination,” writes Eugenia Tognotti, a biomedical researcher at the University of Sassari in Sardinia, Italy. In a 2013 paper, she calls this characteristic “almost inherent in quarantine.”
Perhaps the greatest danger of forced quarantines that isolate the poor and vulnerable is that they often don’t work. By eroding trust in public health measures, they undermine the ultimate goal of protecting societies from the spread of disease. People find ways to break free. It is thus possible that giving patients and the public more choices—and the option to quarantine themselves—could in some instances be more effective at fighting off pandemics than police or military-enforced quarantine. One study credits the lower death rate in New York City, relative to cities such as Philadelphia and Boston, during the 1918 influenza epidemic in part to the city’s decision to impose less stringent restrictions on movement, including keeping schools open where kids could be educated about how the flu virus spreads and encouraging infected families to subject themselves to voluntary quarantine in their homes. A group of researchers who have looked back at successes and failures during the 2014–16 Ebola epidemic showed that containment led by local community leaders, as opposed to imposed by a central police or military force, had greater success in restricting movement and encouraging sanitary practices; another group argued for using the least restrictive measures needed to fight outbreaks. Governments and public health advocates need to further investigate such cases from the past to establish the conditions and norms under which quarantines can be just and effective in the future. It would be wise to do so before the panic of the next pandemic sets in.
Mary Mallon was mistrustful of doctors who insisted she was infected with the typhoid bacteria, and who incarcerated her and took blood samples against her will. She felt perfectly healthy at the time of her initial arrest, which likely contributed to cognitive dissonance and her rejection of the scientists’ hard truth about her role in the outbreak. During a brief period when she was released from her involuntary quarantine, Mallon returned to cooking her signature fresh peach ice cream under a pseudonym at a hospital—in violation of her agreement with local authorities—and she further spread typhoid fever. It is unclear whether Mallon would have acted differently had doctors and health officials taken a different approach. They could have taken the time to fully and calmly explain their concern about her carrying typhoid before they accosted her. They could have offered her a choice about whether to be studied as the cause of the New York City typhoid epidemic. They could even have initially let her remain in the community but made an effort to help her find work in a trade other than cooking. What we do know is that doctors’ and officials’ attitudes toward her as a poor, single, immigrant woman almost certainly shaped both how she was treated and her reaction to being quarantined.
The lesson we can learn from Typhoid Mary, and from much of the history of quarantine, is that society’s interest is best served not by punishing the sick and powerless, but by caring for them.
This piece originally misstated the date of Future Tense’s event on the 100th anniversary of the 1918 influenza pandemic. It was Oct. 25, not Oct. 26.