YouTube’s New Crackdown on Hate Speech Comes at a Curious Time


On Tuesday, YouTube refused to remove videos from a right-wing YouTuber attacking a journalist for his sexual orientation and his ethnicity. It took days for the platform—owned by Google—to even respond to a complaint about the channel, and when it finally did, it said the content didn’t violate its harassment and hate speech policies.

Less than a day later, YouTube and Google announced new policies aimed at "removing more hateful and supremacist content from YouTube." In a blog post, YouTube said content alleging that a group is superior in order to justify discrimination based on characteristics like age, race, caste, gender, religion, sexual orientation, or veteran status would be prohibited under its new hate speech policy. It will also remove some conspiracy theory videos that deny well-documented violent events, like the Sandy Hook Elementary shooting and the Holocaust.

YouTube’s announcement Wednesday comes as Silicon Valley tech giants continue to revise policies and rules aimed at tackling hate speech and misinformation on their platforms. Earlier this year, Facebook removed several inflammatory figures—including conspiracy theorist Alex Jones, alt-right personality Milo Yiannopoulos, white supremacist Paul Nehlen, and Nation of Islam leader Louis Farrakhan—for engaging in and promoting violence and hate.

But those moves have often been viewed as too slow or too narrow in scope. Extremist content often flies under the radar on the sites. Sometimes decisions also seem to split hairs: The Facebook page for Jones’ Infowars was taken down in August 2018, long before Facebook took broader steps to remove numerous other pages connected to Jones and Infowars. Meanwhile, conservatives—led by President Donald Trump—are criticizing the major tech companies for what they say is a crackdown on the right.

The problem often comes when the companies are forced to interpret and enforce their existing rules, as evidenced by YouTube’s decision Tuesday not to ban right-wing comedian and YouTuber Steven Crowder after Carlos Maza, a video producer and writer at Vox, said Crowder has harassed him online for years. Maza and others accused YouTube of giving a free pass to Crowder because of his channel’s popularity. (Update: YouTube tweeted Wednesday afternoon that it has “suspended this channel’s monetization. We came to this decision because a pattern of egregious actions has harmed the broader community and is against our YouTube Partner Program policies.” In response, Maza tweeted, “Demonetizing doesn’t work. … The ad revenue isn’t the problem. It’s the platform.”)

Crowder’s derogatory insults—posted in numerous videos shared on YouTube with his more than 3.8 million subscribers—ranged from attacks on Maza’s sexual orientation to invectives about his ethnicity. YouTube tweeted Tuesday that the right-wing personality’s commentary wasn’t enough to violate the platform’s current anti-harassment and anti-bullying policies, even though those policies prohibit content that is “deliberately posted in order to humiliate someone” or that “makes hurtful and negative personal comments/videos about another person.”

It appeared that YouTube’s investigation began only after Maza posted a compilation of Crowder’s insults on his Twitter account. That video shows Crowder referring to Maza as a “lispy queer,” an “anchor baby,” and the “gay Vox sprite.” In the Twitter post, Maza called for YouTube to take down Crowder’s offensive videos; in some of the clips, Crowder can be seen wearing a “Socialism is for F—” T-shirt that used a homophobic slur.

“As an open platform, it’s crucial for us to allow everyone–from creators to journalists to late-night TV hosts–to express their opinions w/in the scope of our policies,” YouTube wrote on one of its Twitter accounts Tuesday night. “Opinions can be deeply offensive, but if they don’t violate our policies, they’ll remain on our site.”

So Crowder’s YouTube channel will stay up—at least for now. YouTube said in its Twitter post that it is investigating other claims against Crowder’s page, and it isn’t clear how Google and YouTube’s new hate policies will affect Crowder’s channel. The YouTuber has defended his posts as debate and comedy—not harassment.

YouTube also announced that it aims to limit the circulation of borderline content by adjusting its recommendation algorithms to reduce traffic to videos on the edge of violating its policies. Instead, it hopes to elevate authoritative voices like top news channels and reward “trusted creators,” while improving enforcement of rules that strip advertising and monetization features from channels that brush up against its hate speech policies. Now comes the tough part: making the hard choices that will inevitably lead to criticism.