On Friday, President Joe Biden took a swing at Facebook over coronavirus-vaccine misinformation that continues to proliferate on the platform. “They’re killing people,” Biden told reporters when asked what his message was to Facebook and other platforms regarding misinformation and the pandemic. “I mean they really, look, the only pandemic we have is among the unvaccinated, and that’s—they’re killing people.” This blunt, if brief, accusation was enough to kick off a weekend news cycle in which the social-networking giant aggressively pushed back against the White House and nearly every close watcher of the company weighed in on the extent to which Facebook deserves blame for the millions of Americans who continue to refuse to get COVID shots.
The tensions between the houses of Biden and Zuckerberg aren’t quite a sudden development. According to the Wall Street Journal, Facebook and the White House have been privately meeting for months to discuss ways to curb anti-vaccine content. The administration had reportedly been optimistic about the prospect of working with Facebook, but talks recently fell apart as officials decided that the platform has a flawed and insufficiently rigorous approach to confronting vaccine hesitancy. And it’s not just Biden who’s ramping up public pressure on Facebook. Earlier in July, White House chief of staff Ron Klain told the New York Times that people in focus groups that the administration is commissioning on vaccines are most commonly pointing to Facebook as the source of misinformation they’ve seen on the topic. Surgeon General Vivek Murthy additionally called on social media companies to “take responsibility for addressing the harms” of vaccine misinformation in his first advisory of this administration, and White House Press Secretary Jen Psaki has called on Facebook to work harder to remove anti-vax posts. On Monday, Biden addressed the issue with a lighter touch, saying, “My hope is that Facebook, instead of taking it personally that somehow I’m saying Facebook is killing people, that they would do something about the misinformation, the outrageous misinformation about the vaccine. That’s what I meant.”
After Biden’s initial comments, Guy Rosen, Facebook’s vice president of integrity, wrote a sharp-elbowed blog post claiming that the facts do not bear out the president’s accusations and suggesting that the administration is trying to shift the blame onto Facebook for its own failure to reach its goal of getting 70 percent of Americans vaccinated by July 4. (The country narrowly missed the mark, with a 67 percent vaccination rate.) “The fact is that vaccine acceptance among Facebook users in the US has increased,” Rosen wrote. “These and other facts tell a very different story to the one promoted by the administration in recent days.” He further asserted that more than 3.3 million Americans have used Facebook’s vaccine finder tool to schedule an appointment, and that the platform has already taken down more than 18 million pieces of COVID-19 misinformation. However, the post did not disclose how many people interacted with those 18 million pieces of content, as Harvard Shorenstein Center research director Joan Donovan noted.
Was Biden right in pointing the finger at Facebook for its role in facilitating the spread of vaccine misinformation, or was his ire misplaced? A bit of both, actually. As it turns out, the story is a lot more complicated than either party has acknowledged.
Ultimately, Facebook has improved the way it handles health misinformation during the pandemic, both taking down such content and limiting its spread, and the platform likely isn’t the main driver of vaccine hesitancy in this country. But at the same time, the anti-vax movement wouldn’t be as powerful and pernicious as it is without Facebook.
Stanford Internet Observatory research manager Renée DiResta, an expert on the online tactics of the anti-vaccination movement, argues in a worthwhile Twitter thread that after being snubbed by the mainstream media and more traditional outlets, anti-vaxxers began relying heavily on Facebook around 2009 to find more followers. They used groups, pages, and ads as the publicity-generating infrastructure that would sustain their cause for years to follow. Facebook was hesitant to take action because anti-vaxxers were increasingly framing their views as being political in nature, especially after a 2015 bill eliminating personal and religious exemptions for school vaccines in California. Facebook has until recently been extremely reluctant to moderate political content, and anti-vaxxers were able to take advantage of that permissiveness to prime people with medical misinformation and cultivate a receptive audience. By the time the COVID-19 vaccines came along, anti-vaxxers were ready to mobilize the infrastructure they’d built on social media. Facebook has played an important part in establishing an internet ecosystem that allows lies to spread much faster than the truth, and despite the platform’s current efforts to provide accurate vaccine information, it’s trying to climb out of a hole that it’s helped to dig for more than a decade.
That being said, it’s not clear that Facebook is the primary engine behind vaccine misinformation at this very moment. As Charlie Warzel points out in Galaxy Brain, while Facebook does have a role in amplifying this misinformation, it’s also likely been able to reach more people with pro-vaccine resources than many government campaigns have, due to its sheer size and reach. In addition, experts seem to place more of the blame for vaccine hesitancy on established conservative voices like Fox News and Republican politicians. While Facebook may help to amplify these views and spread them into private groups where one-on-one persuasion can happen, influential establishments and figures already have their own huge platforms through which to reach impressionable followers. On top of that, it’s tricky to determine how much of the blame to place on Facebook because there isn’t a lot of comprehensive information about the extent and nature of the problem on the platform. The White House has been repeatedly citing a report from the nonprofit Center for Countering Digital Hate that found that 65 percent of anti-vaccine misinformation on social media is coming from about 12 individuals. Facebook both disputes the report’s methodology and claims to have taken action against some of these individuals’ accounts.
As Donovan and others have noted, Facebook likes to release a lot of flattering statistics about its efforts to promote accurate information and take down lies, but it conveniently tends to omit data about the number of users who are seeing misinformation and the ways in which they engage with that content. Reporting from the New York Times suggests that the company is mulling restricting researchers’ access to tools, such as the valuable, Facebook-owned CrowdTangle, that they use to try to understand which pieces of content are spreading the most on the platform. Independent analyses, such as one conducted by NPR on the virality of stories about people dying after getting vaccinated, suggest that the problem may be more serious than Facebook would like to admit. NPR found that on almost half of the days from January to March 2021, such stories were among the most popular vaccine-related articles on Facebook, Pinterest, and Twitter.
It’s not clear what the White House can do itself to limit the spread of vaccine misinformation, given that direct government intervention could run afoul of the First Amendment. Beyond more short-term fixes, it’s also not clear what Facebook can or is willing to do about the fact that its size and engagement incentives make anti-vax content spread so quickly on the platform. For lawmakers currently scrutinizing Big Tech, though, all of this may add up to yet another reason to try to limit Facebook’s influence, and, while they’re focused on health misinformation, YouTube’s and Amazon’s as well.