Future Tense

Why Is YouTube So Much Worse Than Facebook and Twitter at Stopping Misinformation?

It took the platform months longer than its peers to ban antivaxxers. It’s a pattern.

Anti-vaccine protesters stage a protest outside of the San Diego Unified School District. Sandy Huffaker/Getty Images

On Wednesday, YouTube announced a major escalation in how it deals with content that poses a public health risk: The platform is banning misinformation related to any vaccine approved by local health authorities and the World Health Organization. YouTube’s medical misinformation policies previously prohibited the promotion of harmful untested treatments and false claims about COVID-19, but the Google subsidiary says that the pandemic has spurred it to scrutinize anti-vaccine content in general. “We’ve steadily seen false claims about the coronavirus vaccines spill over into misinformation about vaccines in general, and we’re now at a point where it’s more important than ever to expand the work we started with COVID-19 to other vaccines,” a company blog post explaining the decision reads. Among the inaccurate narratives the platform will target are claims that vaccines cause autism, that they contain hidden microchips, and that vaccines in general are ineffective and dangerous. In addition to establishing these new policies, YouTube also suspended the accounts of prominent anti-vaxxers such as Robert F. Kennedy Jr., Joseph Mercola, and Sherri Tenpenny.


The action is significant and was applauded by public-health experts. The timing … is something else. Facebook announced its own ban on vaccine misinformation in February, after years of advocates and researchers calling for such a policy. Twitter did so in March. And even though YouTube publicized its own policy on Wednesday, the platform notes in its blog post that “as with any significant update, it will take time for our systems to fully ramp up enforcement.” As the largest video platform by far, YouTube is a major part of the online anti-vaccine ecosystem, serving as a repository for deceptive videos that spread across the platform and then circulate widely on Facebook and Twitter. Critics have pointed out over the course of the pandemic that YouTube seemed to be escaping scrutiny in the public discourse about anti-vaccine misinformation, particularly when President Joe Biden suggested in July that Facebook was a major force propelling vaccine hesitancy in the U.S. It does tend to be tricky to track such misinformation on YouTube, since it often comes in the form of statements made in lengthy videos; it’s easier to keep tabs on snippets of text with anti-vaccine claims that make their way around Facebook and Twitter. At the same time, some of the accounts that YouTube banned on Wednesday had tens of thousands of followers and millions of views. They clearly weren’t unknown to the company.


This isn’t the first time that YouTube has been late to the party when it comes to misinformation. In 2020, YouTube took a largely hands-off approach to misinformation surrounding the presidential election compared with Facebook and Twitter. The platform only started taking action on misleading videos about widespread voter fraud a month after the election, once the “safe harbor” deadline for states to settle disputes over the results had passed. During the election, YouTube opted to attach information panels to all videos pertaining to the vote results, whether or not they were truthful. Facebook and Twitter, on the other hand, had been labeling particular posts as containing misinformation and limiting the spread of certain misleading content. YouTube was also the last major platform to disable former President Donald Trump’s account following the Jan. 6 Capitol riot that he helped incite, and it has arguably been the least aggressive in handling the fallout. While other platforms have either permanently banned Trump’s account or specified how long he will be suspended, YouTube has simply said it will allow the former president to post again once the threat of violence has subsided—meaning that YouTube may be his best chance to return to mainstream social media in the immediate future.


In yet another example, YouTube went public with its policies cracking down on the QAnon conspiracy theory last fall, a week after Facebook and months after Twitter. Even then, YouTube’s policies were not as forceful as those of other major platforms. While Facebook outright banned accounts and groups related to QAnon, YouTube stopped short of completely kicking the conspiracy theory off its platform and made more allowances for what it called “borderline content.”

It’s notable that YouTube CEO Susan Wojcicki also hasn’t had to come before Congress to answer questions about social media misinformation alongside Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey. While her boss, Google CEO Sundar Pichai, has testified before lawmakers on the issue, tech experts and journalists have been eager to see Wojcicki herself explain YouTube’s approach before Congress. There would seem to be a lot to explain.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
