Future Tense

YouTube Boosted a Conspiracy Theory Video About a Florida Shooting Survivor Because It Contained a Real News Clip

A screenshot of a conspiracy theory video on YouTube from the channel “Today News” that was among the top videos suggested in the site’s search tool.
YouTube screenshot

Conspiracy theories about the teen survivors of the school shooting in Parkland, Florida, have begun making the inevitable rounds, and on Wednesday, one of them landed the coveted spot of YouTube’s top trending video.

The video in question suggested that 17-year-old David Hogg, a student at Marjory Stoneman Douglas High School who has appeared on TV news programs over the past few days advocating for stricter gun laws, is an actor. (He is not.) By the time the video was removed later on Wednesday, it had amassed more than 200,000 views, Motherboard reported. And that was just one of the videos hawking false theories aimed at defaming the mass-shooting survivor and budding gun-reform activist.

YouTube says it took the video down “as soon as we became aware of it” and explained in a statement to news outlets that the video “should never have appeared in Trending.” As for why it did, YouTube said, “Because the video contained footage from an authoritative news source, our system misclassified it.” In this case, the footage came from a local newscast.

In other words, YouTube’s “system” scans videos uploaded to the platform and can determine whether they contain footage from a reputable outlet. That match makes a video potentially eligible for YouTube’s “trending” category, a distinction that appears to elevate it in search results on YouTube, too. YouTube is copping to one of two things here, and neither is good: Either it identified the hoax video as a duplicate of an original video on the platform and elevated the former without registering that something was amiss, or some other factor has opened a loophole in which hoaxers can game its trending feature by splicing in clips from established news outlets.
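To make that loophole concrete, here is a minimal sketch, in Python, of how an eligibility check like the one YouTube describes could misfire. This is not YouTube’s code: every name in it (match_known_footage, trending_eligible, TRUSTED_OUTLETS) is hypothetical, and the logic is only an assumption based on the company’s statement, namely that a video gets treated as authoritative because it contains fingerprinted footage from an authoritative outlet, with no check of who uploaded it or what surrounds the clip.

# Hypothetical sketch of the failure mode described above -- not YouTube's
# actual system. Assumes a fingerprint index mapping video-segment hashes
# to the news outlets they were originally published by.

TRUSTED_OUTLETS = {"cnn", "cbs-los-angeles", "local-affiliate"}

def match_known_footage(segment_hashes, fingerprint_index):
    """Return the set of outlets whose fingerprinted footage appears in the video."""
    return {
        fingerprint_index[seg]
        for seg in segment_hashes
        if seg in fingerprint_index
    }

def trending_eligible(video):
    """The flawed heuristic: a video containing any clip from an
    authoritative source is treated as authoritative itself."""
    matched = match_known_footage(video["segment_hashes"], video["index"])
    # Bug: the check never asks who uploaded the video or what the
    # surrounding footage claims -- a hoax video that splices in 30
    # seconds of a local newscast passes just as the newscast would.
    return bool(matched & TRUSTED_OUTLETS)

# A hoax upload that lifts one segment from a real newscast sails through:
hoax = {
    "segment_hashes": {"original-hoax-clip", "stolen-newscast-clip"},
    "index": {"stolen-newscast-clip": "cbs-los-angeles"},
}
assert trending_eligible(hoax)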

When I searched YouTube for “David Hogg” later Wednesday afternoon (after the video that went to No. 1 had been removed), the top search results were still packed with conspiracy theories: labeling Hogg a “crisis actor,” suggesting he actually attends high school in California, alleging he has an arrest record, claiming his father used to work for the FBI. The first two results were actual news clips posted by CNN and CBS Los Angeles, but of the following 15, nearly all were peddling conspiracy theories; the exceptions were one video aiming to debunk a conspiracy theory, two recycled clips from CNN, and a video from a YouTube video blogger complaining that the video-sharing site removed all his channel’s ad money “after mentioning David Hogg.”

Multiple videos claimed to prove that Hogg is a “crisis actor” and that he forgot his lines. Another is titled “After Blaming Trump, School Shooting Survivor David Hogg’s Dirty Secret Was Just Exposed.” The fourth search result featured a thumbnail of Hogg with the word “exposed” stamped on his forehead and claimed the “NSA Has Proof.” Almost all of these videos appeared to use clips of television news reports from reputable outlets.

I asked YouTube what happens when it scans videos, finds clips from trusted sources in them, and classifies them in its trending category. In response, the company would only acknowledge that it’s been toying with how it displays search results around important news stories. “In 2017, we started rolling out changes to better surface authoritative news sources in search results, particularly around breaking news events,” a company spokesperson told me in a statement, adding that YouTube has “seen improvements, but in some circumstances these changes are not working quickly enough.”

The failures of tech companies to provide trustworthy information in the aftermath of a major news event have become chillingly routine. After the Texas shooting in November, I first reported that the massively popular video-sharing site featured unverified right-wing reports in its top results. And after the Las Vegas shooting in October, YouTube’s sister site, Google, offered results at the top of its page from the notoriously shady troll site 4chan. In December, Google said it would hire 10,000 people to clean up “problematic content” on YouTube. It’s not actually clear whether there is a team of people monitoring which search results float to the top on YouTube when it comes to queries about breaking news and the people embroiled in sensitive events, like, say, young survivors of a mass shooting. If that team is in place, it could probably benefit from taking more time to vet whether the results are promoting disinformation. If it’s software, it’s just not working.

Read more of Slate’s coverage of the Parkland shooting.

April Glaser is a Slate technology writer.