Future Tense

The AstraZeneca Vaccine Crisis in Europe Isn’t About Science at All

Some countries in Europe have hit pause on the use of AstraZeneca’s COVID-19 vaccine. Jens Schlueter/Getty Images

There is a crisis brewing on the other side of the Atlantic over the safety of the Oxford-AstraZeneca vaccine against COVID-19. People in multiple European countries have reported blood clots and abnormal bleeding after receiving it, occasionally requiring hospitalization. In response, regulators across Europe, where this vaccine is most widely distributed, have suspended its use and are reviewing its safety. But scientific and medical experts are frustrated. They emphasize that the incidence of blood clots among vaccine recipients is actually no higher than in the general population and that the vaccine is safe. They worry that the regulators’ response will amplify vaccine hesitancy and increase potentially deadly COVID-19 infections. The science, they insist, is clear and should be trusted.


But this crisis isn’t about science at all. It’s about public trust, and scared citizens cannot be easily convinced by expertise that feels remote. Our solutions need to reflect that.


There is a long-standing perception that vaccine hesitancy is the result of public ignorance or a dismissal of science. But studies show that vaccine hesitancy is the result of mistrust in our governing institutions, including those dedicated to science and technology. Citizens are alarmed by the often cozy relationships between regulators and the industries they oversee and frustrated with the role of private interests in the research enterprise.

My own research, conducted through the Technology Assessment Project at the University of Michigan, adds two more types of institutional failure that lead to citizen distrust. We have seen serious problems due to what sociologist Diane Vaughan calls the “normalization of deviance,” in which unsafe bureaucratic practices come to seem normal so long as they don’t cause immediate catastrophe. Reviewing the space shuttle Challenger disaster, Vaughan found that NASA’s organizational culture made it essentially impossible for managers to hear engineers’ concerns about the weaknesses of the O-ring, the technical component ultimately blamed for the explosion because it failed in unusually cold launch temperatures. Similarly, organizational problems at the U.S. Centers for Disease Control and Prevention led to its faulty COVID-19 test early in the pandemic.


In addition, marginalized communities often feel that the decisions made by governments and technical experts simply do not represent them. Consider the recent water crisis in Flint, Michigan. Soon after leaders changed the city’s water source in order to save money, residents (54 percent of whom are Black) noticed that their water had a foul smell and was discolored. They began losing their hair, and their skin was breaking out in rashes. Some people died of Legionnaires’ disease. They cried out to local experts and government officials, including environmental and health regulators, but their concerns were summarily dismissed for months. This episode inflamed existing concerns that government officials did not respect community knowledge or needs, and to this day there is great skepticism that their water is safe to drink and use.


Seen in this light, we should applaud European regulators’ decisions to adopt a precautionary approach to the Oxford-AstraZeneca vaccine. To ensure public trust in vaccines—and technologies more generally—governments need to take adverse events and emerging community concerns seriously. By suspending vaccine distribution and reviewing the data, European governments are allowing citizens to feel that their concerns are being heard and to trust that their governments are truly representing their interests. In the short term, these governments can continue to build community trust by being transparent about their findings on the vaccine, including the uncertainties, and the risks and benefits that they are balancing as they decide how to proceed. This is particularly crucial in Europe, where there were high rates of vaccine hesitancy even before the cases of blood clots came to light.


But this episode also provides lessons for the long term. Reacting quickly and transparently during a crisis is not enough, especially because citizens often do not share experts’ priorities, such as achieving herd immunity. Instead, establishing and maintaining public trust requires systemic solutions.

First, medicine regulators need to create systems that require physicians to report all adverse drug reactions, coupled with clear rules about the types of data that might trigger further review. Both the data and the rules that guide government decision-making should be accessible to citizens. While most countries have reporting systems, few are mandatory. A mandatory system makes important data available, and, if communicated effectively, it can reassure citizens that their leaders are monitoring pharmaceuticals and are poised to act when necessary. It can also help curb the spread of false information, as citizens can view the data for themselves.


Second, regulators should include citizens’ perspectives—particularly from communities that are historically marginalized or that are likely to be skeptical—when reviewing vaccines. Community members might advise bureaucrats during initial review processes and sit on advisory committees alongside technical experts. This would provide crucial expertise to policymakers, helping them understand which risks worry citizens and how they balance risks against benefits. It would simultaneously provide these communities with a sense of inclusion and empowerment while giving them greater insights into the vaccine approval process that they can bring back to friends and family who might be hesitant.

Finally, it is crucial that medicine regulators, research funding organizations, and other science and technology policy institutions bring community concerns centrally into their day-to-day work. In other words, we cannot care about the concerns of alienated communities only when they affect us directly. To build long-term trust, leaders must learn more about the concerns and priorities of historically marginalized populations and shift research and regulatory priorities to address them.

Some might argue that such measures are too time-consuming and will distract from the goal of developing science and technology in the public interest. But if the past year has taught us anything, it’s that broad social trust is crucial to successful public health initiatives and, ultimately, to our survival.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
