COVID Skeptics Don’t Just Need More Critical Thinking

Without a shared approach to scientific expertise, “trusting the data” won’t lead us to the same conclusions.


Over six grim pandemic months, from March to September of 2020, researchers at the MIT Visualization Group tried to find out how COVID-19 skeptics were talking about science and data. They identified Twitter users who had clustered together into communities around COVID-related topics and collected the graphs, charts, and maps each community tended to share. (The group has put together a visual presentation of their resulting paper, where you can see which images each group favored.) At the same time, they found anti-mask groups on Facebook and “deep lurked” there, watching comment threads and observing Facebook Live streams where people taught one another how to use public health data to come to their own conclusions, independent of government interpretation.

The diverse and ever-evolving community of skeptics they tracked across half a million tweets and 41,000 visualizations should give serious pause to anyone convinced that anti-maskers and COVID deniers are simply uneducated and in need of a good dose of data literacy. In these groups, there was plenty of education going on. But what kind? “It’s certainly tempting to characterize COVID skeptics as simply ‘anti-science,’ ” the MIT group writes, reminding us that this is what people like Anthony Fauci have chosen to do over the course of the pandemic. “But this would make it impossible to meaningfully understand what they mean when they say ‘science.’ ”

I spoke to Crystal Lee, the leader of the group of researchers and a graduate student in MIT’s Program in Science, Technology, and Society, about their findings. Among other things, we talked about the omnipresence of Elon Musk, an unlikely cameo by Thomas Kuhn, and the urgent need to reconceptualize our simplistic understanding of the benefits of “critical thinking.” Our conversation has been edited and condensed for clarity.

Rebecca Onion: How do people in these groups you found characterize themselves? What kinds of expertise do they claim?

Crystal Lee: A lot of them emphasized what you might call their amateur status. They emphasize that they’re, like, “moms down the street,” concerned parents. They present themselves as normal people who are just engaging in critical thinking that’s not beyond the pale for any normal, well-educated person.

There’s that category, but there’s an interesting overlap between that “you don’t need special expertise to do this” person and the people who like to mention the fact that they have a Ph.D., have been doing science for a long time. Pointing to evidence like: There’s a Nobel Prize winner who also supports this [Stanford’s Michael Levitt]—rhetoric very much within the wheelhouse of scientific inquiry.

So it’s an interesting dynamic between the kinds of expertise people want to assert. And they use it in different ways in different kinds of arguments. When it comes to talking about scientific validity and the CDC, for example, they’ll say, “Look, the CDC backtracked after they issued these initial guidelines on masks. Can we really trust science?” If you’re putting on your hat as a concerned parent, you can acknowledge scientific expertise but also say, you know, “We shouldn’t have to rely on experts.” All these contested frames of expertise.

Looking at the networks on Twitter that you identified, you named one the “COVID skeptic network,” and their most-followed person was Elon Musk, which is a great example of someone with some kind of STEM bona fides, who has created this persona as man of the people. Along those lines, I was fascinated to see your analysis of the Facebook discussions where people were teaching one another, or some people were taking on the persona of instructor. Can you talk about how that works within a community of people who are skeptical of interpretations of data given to you by other people? Like, they’re skeptical, but then they’re teaching one another how to interpret data? This seems like an interesting contradiction.

This is another way that there’s a sense that we need to democratize science: It shouldn’t be left to the experts; if you think about it critically, it should be obvious. But there’s also a sense that, you know, obviously expertise exists! People have Ph.D.s. But since we can’t trust them, there are extra things we can learn in order to make decisions as informed citizens.

The alternate way of teaching and talking about data is in some ways, I think, unique to these groups, but it also bears resemblance to the kinds of things you’d see in data science classes taught at MIT, or elsewhere.

What do you mean? Is it the specific tools being taught that bear some resemblance, or the rhetoric around data, or something else?

So there is a convergence of tools [like Excel and Tableau]. But there’s also a convergence of data sources. So the skeptics are doing these livestreams, where they go together to some state health department website and talk about how you navigate the data portal. What data is important? How is the data collected? I feel like these are the kinds of questions you would get in a class. Thinking about scientific studies, like, Is the data reliable? What do the interpretations look like? Given that we have these raw data sets, what kinds of analyses can be run in order to find trends in the data? These are the same kinds of things you talk about when learning data literacy.

My question is, if they are using these same tools, using the same data sets, and asking the same questions as the scientists who create visualizations for the government, where are the points of departure? Where do the roads diverge in the woods?

The biggest point of divergence is the focus on different metrics—on deaths, rather than cases. They focus on a very small slice of the data. And even then, they contest metrics in ways I think are fundamentally misleading. They’ll say, you know, “Houston is reporting a lot of deaths, but the people there are measuring ‘deaths with COVID,’ in addition to ‘deaths by COVID’ ”—that distinction.

Yes, that’s a big one—but, of course, we know that many times the person died from a condition caused by COVID, and that’s what’s being reported.

Right. And another major thing is that people feel the data doesn’t match their lived experience. We know a lot of health departments now have websites and data portals and such, but especially in smaller communities, the statistics they have are from the state, and there’s some unevenness between the city or town level and the state level. The state numbers might be really bad and scary, but the rate might be lower in a specific town. So they’ll say, “Look, we don’t know anybody who has it, and our hospitals are fine.” That disconnect underlies the skepticism, and it leads them to reapproach the data—to reanalyze and re-present it in a way that makes more sense to them.

What do these kinds of skeptics have in common with, say, anti-vaxxers who misuse data from the Vaccine Adverse Event Reporting System, or climate skeptics who repurpose climate data?

Well, I’m less of an expert on those groups, but just looking at this from a history of science or science and technology studies perspective, what stuns me is how all these groups use concepts from those academic disciplines. In one of these COVID skeptic groups, they were invoking Thomas Kuhn on scientific paradigm shifts! All this rhetoric about what expertise means, and the kinds of scientific knowledge that are valid … they are all doing STS. Which is really interesting, and also horrifying.

This paper of yours fits into a little movement of people who are questioning the idea, which is basically received wisdom among American liberals, that more media literacy, data literacy, or critical thinking is what’s needed now to heal our partisan divisions. You cite a danah boyd speech from 2018 (“You Think You Want More Media Literacy. Do You?”)—and Charlie Warzel wrote in the New York Times recently about Michael Caulfield, a digital literacy expert who argues that deep research, in the context of the internet, isn’t always the best thing. I’m sure there’s a lot more. How do your findings build on this?

The other paper that was really helpful for me in thinking about this was Francesca Tripodi’s paper on conservative evangelical media practices, where she talks about how conservative evangelical Trump supporters are doing a lot of primary source research—looking at transcripts of what Trump actually said, rather than getting a mediated sense of what’s going on. They do their primary research first, then they find other sources that give them additional analysis, if they need it.

This is in line with a lot of that work. “Critical thinking” has so many different valences in different communities. For these anti-maskers, thinking critically about science means not being a sheep and accepting what the scientific establishment says. For other people, thinking critically means accepting science, and then thinking about how it applies to our daily lives, how it informs public policy.

There are a lot of people who agree with us that these kinds of empty calls for critical thinking don’t amount to anything, simply because people are working in fundamentally different epistemological frameworks. “We don’t have a shared reality” would be one way to put it. So it’s very difficult to talk about critical thinking in ways that bridge these different realities.
