On Wednesday, Facebook invited a handful of journalists to its offices in New York City for an update on its attempts to rid its platform of misinformation and propaganda. After making attendees watch a glossy, 12-minute video about Facebook’s efforts to fight false news, the company opened the room up for a discussion.
If Facebook had hoped to send journalists out the door feeling that the company is on the right track, the plan was derailed when CNN reporter Oliver Darcy asked John Hegeman, Facebook’s news feed head, a good question: If Facebook is truly serious about fighting false news, why does Infowars still have an account there? The fringe-right, conspiracy-theory news site run by Alex Jones has nearly 1 million followers on Facebook, and has used the platform to disseminate dangerously false information and theories about real news events. In response, Hegeman put it plainly: “The company does not take down false news.”
Hegeman went further, adding that saying something false “doesn’t violate community standards” and that Infowars has “not violated something that would result in them being taken down.” Infowars’ Facebook page currently contains at least 10 videos about a purported liberal plan to start a civil war, an overtly false narrative that Jones has been pushing for the past year and that culminated recently with claims that the battle was scheduled for July 4.
“I think part of the fundamental thing here is that we created Facebook to be a place where different people can have a voice. And different publishers have very different points of view,” Hegeman said. Journalists—whose livelihoods depend on their success at reporting the news accurately—were appalled, tweeting their shock at Facebook’s position, especially when it comes to a figure like Alex Jones, whose personal verified Facebook account has 1.7 million followers and whose conspiracy theories frequently have hateful, violent themes. Facebook pushed back on Twitter, tweeting in response to Darcy that figuring out what to do with pages that post false information is a hard problem to solve: “We don’t think banning them is the right option – better to demote posts rated as false and the Pages that spread them.”
A consensus quickly emerged, at least among tech critics: Facebook should take down Infowars’ page, obviously. Why should the operation that insisted the Sandy Hook massacre was a hoax, and that propagated the “Pizzagate” conspiracy theory that inspired a man to fire a gun in a D.C. pizzeria, be allowed a perch on the platform? There’s a counterargument, of course: any act of policing speech is thorny and surrounded by slippery slopes. But to debate a single act of content moderation is to miss the larger question: If we’re this worried about what disinformation means for society, why did we ever let Facebook get so big?
Facebook is obviously wrestling with itself over these issues—in February the company told reporters that “images that attack the victims of last week’s tragedy in Florida are abhorrent,” in response to a deluge of content, much of it likely inspired by Alex Jones’ viral conspiracy theory that the teen survivors of the deadly mass shooting at a high school in Parkland, Florida, were hired “crisis actors.” “We are removing this content from Facebook,” the company said in a statement to multiple reporters. And while the Alex Jones and Infowars pages surely ought to have enough strikes against them by now to warrant removal, Facebook isn’t wrong to note that content moderation is not always clear-cut. Much of the most dangerous and misleading viral content on Facebook that some people categorize under “fake news” isn’t flat-out conspiracy theory or even directly false; rather, it exaggerates reality, discredits verified factual reporting, and encourages bigotry in order to stoke hatred.
If something isn’t by-the-book false, should Facebook take the content down? Surely, it’s OK for individuals with unverified accounts and pages to be wrong sometimes, right? At what point do wildly different interpretations become lies? And is Facebook’s solution—demoting posts by algorithmically limiting their reach within the social network and removing the worst offenders, but generally leaving up the pages that publish these posts—a half-measure or the best possible approach to a tricky set of problems?
What’s clear is that Facebook doesn’t really want to be the decider of what’s true or false, what’s productive speech or dangerous provocation, and it certainly has no legal obligation to be. In a healthier marketplace, it would be less crucial that Facebook get these decisions right. If there were five other massively popular social networks to choose from, Facebook could take down Jones’ pages and they could live elsewhere; it wouldn’t be as big of a deal. And if Facebook didn’t take down Infowars’ pages, users who don’t want to do business with a company that hosts such content could vote with their feet.
But with 2.1 billion users, Facebook is the dominant social network in the world. If you have a small business, if you’re an artist hosting local events, if you have relatives across the world you want to keep in touch with, if you run a news operation, not having a Facebook page is basically not an option. And when Facebook fails, it’s not merely failing its users: It’s failing society.
Facebook was able to get so big thanks, in part, to politicians who spent the last decade looking away. The most recent successful major antitrust case in the U.S. was against Microsoft almost 20 years ago. Facebook is so inspiring an American success story that even the politicians grilling CEO Mark Zuckerberg at two hearings earlier this year couldn’t help but thank him for his contribution to capitalism; politicians depend on the social network to reach constituents just as media organizations need it to reach readers. Because its service is free and, well, amazing, it’s been very easy for most people not to scrutinize how Facebook has used the data we all give it, even though for years it was freely handing out much of that information to third-party companies, a practice that led to the Cambridge Analytica scandal. There have been no comprehensive privacy laws that Facebook has had to follow.

That lack of scrutiny has led to problems that are so worrisome precisely because they affect one-third of the people on the planet. Facebook was allowed to build an advertising empire that, until last year, let marketers and publications hypertarget ads to Facebook users based on shared interests like “how to burn Jews.” Russian operatives exploited the social network for years to manipulate U.S. voters and try to help sway the 2016 election for Donald Trump; they’re almost certainly at it again ahead of the midterms in November. This would be a problem for any company, but when that company is as big as Facebook, each of these problems is a potential crisis.
So here we are. The specifics of how the company applies ethical and editorial standards to its product wouldn’t be as consequential if there were somewhere else to go, but there’s nowhere else to go. (Twitter and Snapchat aren’t operating at the same scale, Instagram and WhatsApp are owned by Facebook, and YouTube, which is a Google subsidiary, has its own societally significant problems.) So it really matters how Facebook equivocates about conspiracy theorists like Alex Jones, even if critics can differ on the precise right approach. Misinformation at scale is dangerous, even as it’s surely quite profitable for a company like Facebook. And it’s clear that whatever Facebook’s responsibility is, it’s still falling short. The company fought to achieve its massive size, and has lobbied for years to avoid regulation. If no one will control Facebook, Facebook has to. It’s still failing to make the case that it’s up to the task.