Future Tense

How QAnon Will Survive Being Banned by Facebook

The social network helped turn the conspiracy theory into a normie obsession—one that might be too big to stop.

Photo illustration by Slate. Image by Facebook.

It’s been evident for a while that QAnon is now a conspiracy theory for normies. The theory, which emerged in 2017 and holds that President Donald Trump is waging a covert war against a satanic cabal of pedophiles that controls the Democratic Party and other major institutions, has captured the imagination of a shocking number of Americans, including at least one who’s about to become a congresswoman. One place QAnon has spread is that most normie of social networks: Facebook.

This week, Facebook took the unusual step of issuing a broad ban of QAnon on its platforms, barring groups, pages, and Instagram accounts promoting the conspiracy theory. The move considerably expands the scope of the company’s policies aimed at QAnon by removing such content even if it does not discuss potential violence. Within Facebook’s enforcement framework, this puts QAnon roughly on the same footing as extremist militias, though not on the level of outright hate groups or terrorist organizations, for which the platform removes individuals’ posts. The policy change has resulted thus far in the removal of hundreds of groups and pages, many of which had tens of thousands of followers. This is in addition to the 1,500 pages and groups related to QAnon that Facebook had already banned following a previous enforcement escalation in August.

Facebook has been a crucial outlet for QAnon. An internal Facebook investigation, which NBC News reported on last month after obtaining internal documents, uncovered thousands of QAnon groups and pages with millions of members and followers. QAnon’s influence has grown beyond its online roots, with dozens of candidates for state legislatures and Congress having endorsed the conspiracy theory and with supporters holding hundreds of “Save the Children” rallies across the country. The misinformation that the movement spreads touches everything from vaccines to wildfires. QAnon has become so big, so quickly, that there’s good reason to believe it will survive being deplatformed by the world’s largest social network.

Researchers and tech commentators generally agree that Facebook’s large-scale ban is a significant and necessary step, even if it’s long overdue. “It will absolutely have an impact on the movement’s ability to evangelize and execute coordinated harassment campaigns on Facebook’s properties,” said Brian Friedberg, a senior researcher at Harvard University’s Shorenstein Center who has been studying QAnon. Facebook has in the past served as a more accessible channel for followers of the conspiracy theory who aren’t particularly comfortable with the mayhem of imageboards, which are forums where people can anonymously post images and text. Q, the anonymous leader of QAnon, posts messages to followers on the imageboard 8kun, which is known for hosting disturbing and hateful content. Users who don’t want to deal with 8kun’s abstruse interface and explicit content often go to the much more user-friendly Facebook, where groups and pages relay Q’s messages. Aggregator sites like QMap, which present Q’s messages in a much more digestible form, also cater to a wider audience, though Facebook often serves as a hub for discussion.

Facebook’s decision to stymie QAnon content that isn’t calling for or discussing violence is particularly notable because, while its supporters have been linked to incidents of murder and kidnapping, the damage the movement has done is hardly confined to violent crimes. As Facebook itself pointed out in its announcement about the ban, QAnon supporters had been pushing the false claim that antifa groups were responsible for the wildfires on the West Coast, which overwhelmed local 911 dispatchers with calls referencing the rumor. The movement has also helped propel anti-vaccine and COVID-19 myths, and generally tends to make the online lives of others miserable, as when QAnon accounts recently attacked Chrissy Teigen after her miscarriage with baseless accusations of pedophilia.

In other words, a lot of damage has already been done. “I hope [the ban] has the impact now that it would have had two years ago when the movement was much smaller and more easily contained,” said Mike Rothschild, a QAnon researcher and author of The World’s Worst Conspiracies. “I fear that it’s too late. This is a very robust movement.” Friedberg added that Facebook’s decision to widen its ban right before Nov. 3 opens the platform to accusations of election interference that it could have avoided if it had acted much earlier. Trump has in the past declined to condemn, and even encouraged, QAnon followers, which has entrenched their belief in the conspiracy.

New Jersey Rep. Tom Malinowski, lead sponsor of a resolution that the House passed last week formally condemning QAnon, also applauded Facebook’s move, but said that the platform may be playing an interminable game of whack-a-mole if it solely relies on pulling down content rather than examining the algorithms that encourage the spread of misinformation and tend to reward sensational, emotional, and often polarized content. “It’s a bit like a farmer who suddenly notices his fields are overrun with noxious weeds. He pulls the weeds, which is good, but doesn’t stop to consider why they spread in the first place,” he told Slate. (I also interviewed Malinowski last week about death threats he received from QAnon adherents.) “The problem is the algorithms that are designed to feed us increasingly fearful versions of what we already fear and increasingly hateful versions of what we already hate.”

While this expanded ban will undoubtedly put a dent in the presence of QAnon on Facebook, its followers have become fairly canny when it comes to avoiding restrictions. For instance, when Twitter tried to crack down on QAnon accounts in July, leaders in the movement found ways to rebrand themselves as anti-trafficking advocates. Rothschild expects that the same sort of thing will happen with the Facebook ban. “They’ll latch onto other movements, they’ll disguise their hashtags, they’ll use more memes,” he said. “They’re good at this by now.”

It’s not clear that Facebook is willing or able to go further than this ban to stem the QAnon movement. Making sure that these groups aren’t returning under different names and code words could require Facebook to keep tabs on how people are developing QAnon-related phrases elsewhere on the internet. “Are we asking Facebook to surveil their users off of their platform? That’s really one of the only ways,” said Friedberg. “Social media platforms seem either unaware or unable to publicly acknowledge when their platforms are being manipulated by campaigns on other services.”

Malinowski ultimately believes that congressional intervention may be necessary to address the way in which algorithms fuel the online spread of QAnon and extremism more broadly. “The problem is that it would be hard for large online companies to make the necessary changes without losing money, because their very successful business models are the problem,” he said, referring to the way in which platforms feed users increasingly polarized content. “I don’t see it happening without greater regulatory pressure.” The congressman is currently working on legislation that would address the algorithmic promotion of extremist and conspiratorial content like QAnon, which he plans to put forward in a couple of weeks.

After Facebook began enforcing its new ban, QAnon followers on Twitter interpreted the development as a sign that Trump will soon execute his long-awaited plan to arrest the pedophiles in the “deep state.” Many believed that Trump would announce the arrests of hundreds of satanic pedophiles who control the Democratic Party at a Justice Department press conference on Wednesday morning. It turned out that the press conference instead had to do with the Islamic State. Even after the event ended, though, followers were still convinced that the Justice Department would eventually lock up the pedophiles.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.