As the United States struggles to distribute and administer COVID-19 vaccines, we’re looking back at the history of vaccine rollouts in our country, including the logistical roadblocks to getting shots into arms and the challenge of communicating with a fearful public. The COVID vaccines have been widely shown to be safe and effective, unlike some historical examples that carried significant risks. But what can stories of past failures teach us about how to administer vaccines fairly?
On Tuesday, Feb. 16, at 1 p.m. Eastern, join Future Tense for a conversation with Atul Gawande and Helene Gayle, co-chair of the National Academies framework for vaccine distribution, about the COVID-19 vaccine rollout.
For many midcentury American families, a bout of the measles—kids home from school, feverish and rash-covered, eating Popsicles and reading comic books—was a rite of passage. To officials looking at the big picture, the public health toll of measles, a common and extremely contagious childhood disease, looked unacceptably high. Measles causes encephalitis in 1 case per 1,000, with further serious complications (deafness, intellectual disabilities) occurring in one-third of those cases. Most kids weathered the measles fine, but there were lots of cases, and thus lots of complications. In the late 1950s, the country saw an average of 4,000 kids get encephalitis from measles every year, while about 450 died.
The idea that measles could be conquered emerged from a climate of Kennedy-era liberal altruism mixed with pro-scientific optimism prompted by the success of the polio vaccine. When it came to vaccines improving public health, anything, it seemed, might be possible. So in 1967, the U.S. government launched a campaign to eradicate measles. Alexander Langmuir, the CDC’s chief epidemiologist, compared his motives for the campaign to Edmund Hillary’s famous reason for climbing Mount Everest: “Because it was there.” Langmuir embellished, “To this may be added ‘ … and it can be done.’ ”
The late-’60s measles campaign used a vaccine that was already a few years old. In 1954, Thomas Peebles, a scientist working with famed biomedical researcher John Enders at Boston Children’s Hospital, took advantage of a measles outbreak at the private Fay School in Massachusetts to collect blood samples and isolate a strain that could be used to make a vaccine. (It’s called the Edmonston strain, because the sick student who provided the blood was named David Edmonston.) Enders made this strain available to other researchers, and in 1962, Maurice Hilleman and his colleagues at Merck released an attenuated measles vaccine using that strain. This was called Rubeovax, and it needed to be administered with a shot of gamma globulin antibodies to reduce any reactions it might cause. (Gamma globulin, a product made from the blood of a person or animal who has already had a disease, bestows immunity for a much shorter period than a true vaccine.)
The innovation met, at first, with a lukewarm public reception. This wasn’t because the public was afraid of the vaccine’s effects. Though the measles-mumps-rubella shot would later become a focus of our present-day anti-vaccination movement, in the ’50s and ’60s, as historian James Colgrove, author of State of Immunity, put it in an interview, “there was very little active resistance to vaccines.” The success of New York City in vaccinating 6 million residents against smallpox in 1947, ahead of a feared outbreak when one man died of the disease in a city hospital, can partly be attributed to this postwar American mood of easy amiability toward the idea of vaccination.
No, the “meh” reaction to the new measles shots was more a matter of money. After the first flush of success of the polio vaccination campaign of the mid-1950s, immunization rates dipped among poorer populations in the early 1960s. To rectify this problem, the Vaccination Assistance Act, passed in 1962, funded grants to states to help with vaccine delivery, specifying that the money be used for vaccinations against polio, diphtheria, whooping cough, and tetanus.
AFL-CIO representative Andrew Biemiller said at a hearing on the act in 1962 that his organization supported government funding for vaccination because of data showing that, among kids under 5 in Atlanta, 78 percent of those from wealthier families had gotten three or more Salk shots, compared with only 30 percent of those from less wealthy families. “Cases of paralytic polio are concentrated in a city’s central core,” he pointed out, “where the poorer, the less privileged, the minority groups are to be found.”
As with polio, so with measles. The new measles vaccines cost about $10 (about $86 today) and were mostly administered by private physicians. Families of kids who didn’t have access to the health care system, or couldn’t afford to lay out for the shots, stayed away. The act passed. Three years later, it had to be reauthorized, and the language was changed to add measles to the list of diseases eligible for vaccine assistance. That’s when the Centers for Disease Control decided to go for eradication, picking Rubeovax for its shot because it was cheaper, even though it required the accompanying gamma globulin. “The difference in price between the two vaccines was enough to make the vaccine available to almost three million additional children,” historian Elizabeth Etheridge writes in her history of the CDC, Sentinel for Health.
The measles eradication campaign used many of the publicity tricks familiar from diphtheria and polio immunization drives of years past. It had the support of President Lyndon Johnson, who announced the effort personally, urging Americans to participate. In New York City, billboards, posters, television commercials, and radio announcements told people that measles was the next childhood disease that could be zapped. (One slogan used in the city was “Measles Bites the Dust.”) The city held “health happenings”—groovy—in underserved neighborhoods, with toys and games as prizes for kids who got vaccinated. Charles Schulz signed on, producing a series of Peanuts comic strips promoting the vaccine.
At first, this campaign saw results. In 1967–68, 11.7 million doses of measles vaccines were given to American children. In 1968, there were only about 22,000 cases of measles recorded nationwide, as opposed to the annual average of 450,000 from the previous five years. But slowly, over the next couple of years, the numbers crept back up, and in 1971, a survey showed a low 57.2 percent rate of measles immunization—only 41.1 percent in poor urban areas. Why?
“One of the issues was just a lack of perceived threat,” Colgrove, the historian, said. “I think that was maybe even a bigger obstacle than the lack of a delivery infrastructure.” There was, he said, a gap between the way health professionals saw the threat and the way the public thought about it. “The folks at the CDC had a global view and knew what a problem measles could be in low-resource settings, where kids who needed to be rushed to the emergency room couldn’t be rushed to the emergency room. And they were looking at the incredible contagiousness of measles.”
In postmortems of the campaign’s failure, some engaged in poor-blaming, with one study Colgrove quotes in his book arguing that poor people “take a rather cavalier attitude” toward illness. “If the lower and middle classes viewed illness with as great alarm as do the upper classes, their use of physicians would surely rise,” the author of this study wrote. But greater barriers were probably the short hours of child health clinics, which tended to fall during the working day, and long waiting times for the shots. By the end of the 1960s, Colgrove said, measles “outbreaks were becoming very much concentrated among poor kids in rural areas, like in Appalachia, and in inner-city neighborhoods.”
But perhaps a bigger problem for the measles effort, at least in the 1969–70 period, was the existence of rubella. In 1964, a German measles epidemic caused deafness, blindness, intellectual disabilities, and heart defects in about 20,000 American newborns. Rubella, as the disease came to be called in an effort to distinguish it from the “red measles,” is a mild disease for adults, but can be harmful to fetuses in utero. If measles was a minor disease in most parents’ minds, congenital rubella syndrome was a huge deal. Adding to the fear factor, rubella was very difficult to avoid contracting, as it’s highly contagious. Pregnant women with kids in school were terrified.
Scientists thought rubella recurred cyclically, coming back every five to seven years. So, as 1969–70 approached, public concern grew that another wave of babies born with congenital rubella syndrome was coming. Labs raced to produce a vaccine; finally, an effective one, developed by Stanley Plotkin at the Wistar Institute in Philadelphia, was licensed in 1969.
Officials decided that the easiest way to try to achieve herd immunity was to give the new rubella vaccine to children instead of to adult women and teenage girls, who would be carrying the fetuses who might be affected. There was a chance that already-pregnant women might be vaccinated by mistake, and adult women suffered some passing side effects from the vaccine (transient arthritis among them) that children did not. This was the first time kids would be vaccinated against a disease that didn’t directly threaten them but was very harmful to others.
Health organizations like March of Dimes, which had pivoted to a focus on the prevention of birth defects after its success with polio, promoted the rubella shot by selling the idea of it directly to kids. Leslie J. Reagan’s book about rubella, Dangerous Pregnancies, contains images from a March of Dimes comic that went home with schoolchildren in some cities, along with their consent forms for the rubella vaccine. Titled Rubella Robs the Cradle, the book was aimed at convincing children to submit to the new shots lest they become “red-haired Stevie,” the protagonist, who unknowingly infects his aunt with rubella and causes his cousin to be born blind. Reagan writes that, compared with the measles vaccine’s promotional materials, items promoting the rubella shot were more commonly aimed at minority communities; Rubella Robs the Cradle had a Spanish-language counterpart, Sarampión Alemán … Tragedia Que Acecha La Cuna.
As with measles, American health care’s uneven coverage of the population shaped the way the rubella vaccine was administered. “Locating and immunizing 100 percent of female teens or women of childbearing age could not perpetually be achieved nationwide in the United States, particularly without a national healthcare system,” Reagan writes. Other countries managed it: In Britain, Australia, and Israel, the vaccination went into schoolgirls’ arms only. The American strategy turned out to be “inadequate,” as Reagan puts it—outbreaks among adults continued into the middle of the 1970s, and babies continued to be born with CRS (though many fewer than in 1964). Officials shifted their strategies to immunize adult women, soldiers, and college students as well as children, and that seemed to work.
The diversion of resources from measles to rubella shows how easily vaccination drives aimed at getting rid of endemic diseases can be undermined by the ebb and flow of public concern if there’s not a long-term governmental commitment to supporting vaccine access. The competition for government resources between the measles and rubella vaccines ended when Merck’s Hilleman combined the measles, mumps, and rubella vaccines into the MMR shot in 1971. In the 1970s, almost all states adopted laws requiring children to be vaccinated against these diseases in order to enter school, and the Carter administration pursued vaccination programs in the late 1970s, commissioning memorable PSAs featuring characters from Star Wars. At the end of 1979, 90 percent of American children had been immunized with the MMR shot. And then, in 1998, along came Andrew Wakefield, the discredited British physician who helped catalyze the anti-vaccine movement. The rest is a story for our times.