Banning Alex Jones isn’t the only thing YouTube is doing to stem conspiracy theories on its platform. The company recently began linking to Wikipedia and Encyclopedia Britannica articles below some videos on subjects that are often muddied by misinformation. The company wrote in a July blog post that it would be doing this, and earlier this month the additional information began to appear on videos about climate change. Now YouTube appears to have broadened the program, adding information boxes to the bottom of videos on subjects that tend to attract communities pushing false theories.
While the move is clearly aimed at curbing the impact of accounts peddling conspiracy theories, it also affects how certain topics are treated even by reputable outfits. A video from the History Channel on the Oklahoma City bombing, for example, includes a link with a snippet of information from the Encyclopedia Britannica entry. The same snippet is included on a video titled “The Oklahoma City Bombing: 4 Unanswered Questions” from the popular YouTube channel Stuff They Don’t Want You to Know. Videos about the Holocaust likewise now include a link to the Encyclopedia Britannica entry on the subject. A BuzzFeed video of a Holocaust survivor titled “How I Escaped the Holocaust” and a video titled “Did Britain Murder Thousands of Holocaust Survivors?” from the verified channel Alltime Conspiracies, which specializes in topics like the Illuminati, both carry the encyclopedia links. The links do not appear to show up on mobile visits to the same YouTube videos, however.
YouTube CEO Susan Wojcicki shared the outlines of this plan at South by Southwest in March, explaining that the platform would begin drawing language from Wikipedia to contextualize videos that tend to court conspiratorial coverage and misinformation. At the time, the Wikimedia Foundation, which administers Wikipedia, released a statement noting that it had not been made aware of YouTube’s plan in advance of the announcement. “We are always happy to see people, companies, and organizations recognize Wikipedia’s value as a repository of free knowledge. In this case, neither Wikipedia nor the Wikimedia Foundation are part of a formal partnership with YouTube,” the foundation’s statement read.
Wikimedia Foundation executive director Katherine Maher noted on Twitter that the decision to scrape Wikipedia for information makes it more difficult for unpaid editors to contribute to Wikipedia.* “While we are thrilled to see people recognize the value of @Wikipedia’s non-commercial, volunteer model, we know the community’s work is already monetized without the commensurate or in-kind support that is critical to our sustainability,” Maher tweeted. “Scraping our content means people can’t contribute.” Maher’s point was that when people read an information box pulled from Wikipedia, as the YouTube program serves up, instead of visiting Wikipedia.org to get the information, they are less likely to see the opportunity to volunteer as editors and less likely to see calls to donate to the organization.
But there’s also the potential for the opposite problem—that this move will make Wikipedia articles a greater target. Unlike Alphabet, the parent of Google and YouTube and one of the most valuable companies in the world, the Wikimedia Foundation is a nonprofit that relies on volunteers and donations to stay operational. Wikipedia entries are written collaboratively by hundreds of thousands of users across the world, including people who believe in conspiracy theories and people with political agendas, though the highly active community of volunteer editors is often able to separate fact from fiction, the result of intense behind-the-scenes debates and strong rules requiring citations. “You might rely on us, but we rely on you. If you use @Wikipedia, please support Wikipedia! Your edit, your time, your advocacy, your donation - they’re all a contribution to essential, open, and free knowledge for all,” Maher added in a tweet at the end of her thread. “Google donates about $1 million every year to the Foundation as part of their matching gift donations and corporate giving, this includes our most recent fiscal year,” Samantha Lien, a spokeswoman for Wikimedia, told me in an email. The recent addition of Wikipedia content below certain YouTube videos is not part of any financial partnership, Lien added.
It’s not clear how the new YouTube citations will play out on Wikipedia. There are potential pitfalls: Wikipedia isn’t just a source; it’s a community, and in a way YouTube is exposing that community to trouble. Some people may see the Wikipedia link at the bottom of a video pushing a conspiracy theory and then try to add the video’s claims to the Wikipedia page. Articles that are linked to could become the scenes of intense additional debate, and a mob of conspiracy theorists convinced that, for example, a mass shooting is a hoax might try to strong-arm passages questioning the event’s veracity into the articles about it. Editors dedicated to including only verifiable information may face more harassment or simply tire and abandon the page. Exposing believers in conspiracy theories to Wikipedia and Encyclopedia Britannica isn’t a worrisome idea in itself, but the fact that Wikipedia is editable makes it potentially vulnerable to those intent on promoting a particular agenda, especially if multiple conspiracy theorists decide to work in a coordinated fashion.
While videos on YouTube about the Oklahoma City bombing all appear to include a link to the Encyclopedia Britannica page about the attack, videos about who was behind the Sept. 11, 2001, terrorist attacks do not yet have an accompanying link from Wikipedia or the Encyclopedia Britannica. Right now, YouTube’s program seems to be in rollout mode—though it’s not YouTube’s only tool for tackling conspiracy theories.
Sometimes, YouTube pushes back on misinformation by demoting it in its search results. Videos that question whether the survivors of the February school shooting in Parkland, Florida, are crisis actors do not include a citation to either encyclopedia, though they have become much harder to find through YouTube search than they were in the weeks that followed the massacre. Soon after the shooting, videos with conspiracy theories about the survivors quickly emerged on YouTube. A week after the shooting, a video promoting the idea that student David Hogg is an actor (he’s not) reached the coveted position of YouTube’s top trending video. YouTube was apparently not aware that the video had amassed such popularity and removed it “as soon as we became aware” of it, explaining in a statement to news outlets that the video “should never have appeared in Trending,” a reference to the list of popular videos highlighted on YouTube.
The toughest tool YouTube has is banning—both individual videos and entire accounts. Last week, YouTube removed prominent conspiracy theorist Alex Jones’ channel from the platform, which at the time had about 2.4 million subscribers. Jones was one of the most popular proponents of the false story that Hogg is an actor, as well as of other conspiracy theories that have gained traction over the years, like the dangerously false allegation that the Sandy Hook shooting was a hoax. More recently, Jones claimed without evidence that special counsel Robert Mueller is somehow involved in a pedophile ring. Jones was removed for violating the company’s policies against hate speech, though it’s not clear where the threshold for violating those policies lies.
YouTube did not specify which topics would be subject to the additional encyclopedia information, but said that it would continue to add new topics as it tests and rolls out the feature.
Correction, Aug. 14, 2018: Due to an editing error, this post originally misspelled Katherine Maher’s first name.