Future Tense

Facebook Temporarily Banning Alex Jones Plays Right Into His Hands

Alex Jones poses following an episode of Alter Family Politics on SiriusXM at Quicken Loans Arena on July 20, 2016 in Cleveland, Ohio. Ben Jackson/Getty Images for SiriusXM

After a lot of hemming and hawing, Facebook is finally giving Alex Jones and his InfoWars network a slap on the wrist. On Thursday evening, the company announced that it will ban the far-right conspiracy theorist from accessing and posting to his pages for the next 30 days. CNBC reported Friday afternoon that Facebook is now considering removing Jones’ pages from the platform for repeated violations of its community standards.

But as of publication of this story, the official InfoWars page is still chugging along, as is Jones’ verified page. That’s because the 30-day timeout is for Jones’ personal account, which means he can’t manage the verified pages. His fellow admins, however, can continue to pilot the pages in his absence. Jones’ verified show page has 1.6 million followers, and the InfoWars page has nearly a million.

The temporary ban comes after Jones violated Facebook’s community standards, a spokesperson told CNET. The social network said it had received complaints about four different videos, which have been removed, though we don’t know what was in them. Facebook had already warned Jones that he was on the brink of a suspension, following weeks of media pressure over InfoWars’ continued presence on the site despite the company’s efforts to clean false news off the platform.

On Wednesday, YouTube also took action to remove four of Jones’ videos (it’s not clear if those were the same ones that led to Facebook’s actions) and blocked him from live broadcasting for 90 days. Still, Jones’ surrogates quickly found a way around the block. Ron Gibson, a self-described member of Jones’ network, started airing InfoWars’ livestream from his YouTube channel after the ban. And, of course, Jones is still airing his daily broadcasts on his own site, where he’s imploring “Patriots to Spread the Word About Big Tech’s Murder of Free Speech,” as one recent headline reads. In this narrative, Jones is the victim and Facebook is the enemy. He argues that the supposed censorship of conservative voices is the reason why the social network’s stock took a historic dip on Thursday after its earnings call.

To be clear, that isn’t true at all. Facebook’s stock took a plunge after failing to increase user numbers this quarter, which certainly isn’t due to the censorship of conservative voices. Fox News, after all, has a daily show on the new Watch platform, not to mention the torrent of conservative commentary that spreads on Facebook daily.

But conspiracy theories are Jones’ product, and the 30-day suspension plays right into his hands. Now he can play the victim while drumming up pressure on Facebook to be even more reluctant to take meaningful action against pages like InfoWars that work to amplify false news, for fear of appearing biased against conservative voices. It was just last week that executives from Facebook, Google, and Twitter all sat through an hourslong hearing with the House Judiciary Committee where Republican lawmakers continued to insist that Facebook prioritizes a left-leaning stance on issues like LGBTQ rights and immigration.

For years, Jones has been using his Facebook and YouTube pages to spew dangerous conspiracy theories, false news, and hate speech, including most recently the unfounded claim that special counsel Robert Mueller has a history of pedophilia—which Facebook said didn’t violate its community rules. Then there was the allegation that the teenage survivors of the Parkland, Florida, shooting were paid actors working on behalf of gun-control advocates, the false claim that the Sandy Hook massacre was a hoax, and his role in spreading the Pizzagate conspiracy theory.

The question of whether Facebook would continue to host Jones’ pages came to a boil earlier this month, after the company told reporters that it would continue to allow Alex Jones and InfoWars on the site despite its ongoing efforts to fight the deluge of false news that floods the platform. That led some tech critics, including me, to call out Facebook for its unwillingness to do more to prevent the spread of false news. And while it’s good that Facebook is starting to do more, the way it executed Jones’ punishment is likely only going to empower him when he comes back next month.

During his suspension, Jones will likely continue to rally his massive number of followers against Facebook, claiming there’s a conservative bias, and fringe lawmakers may seize on this as just another example of the so-called bias. That pressure pushes Facebook toward inaction when it comes to taking meaningful steps to punish pages that promote hate speech and false news, and encourages the company to keep making what appear to be ad hoc decisions, acting only after media pressure.

So what should Facebook have done instead of suspending Jones? As the biggest social network in the world, Facebook has the responsibility of making sure that it’s serving the information needs of its users and keeping them safe. Sometimes that means not just warning repeat violators or putting them in timeout, but kicking them off for good. Milo Yiannopoulos was permanently banned from Twitter after leading a racist troll campaign against actress Leslie Jones in 2016, for example. Jones’ yearslong trolling of victims and survivors of mass shootings on Facebook might fall into a similar category.

It’s not clear how the videos that Facebook removed on Thursday violated standards while his previous posts hadn’t. If Facebook does end up banning Alex Jones’ network of pages, as CNBC reports it is considering, the company should clearly explain what was in violation and why it took so long to take action—even if that means admitting that it was wrong for not doing more sooner. Until Facebook clearly explains its process for determining when something violates its standards, and enforces its policies against such violations consistently, this cat-and-mouse game is likely only going to get worse.