The hours after a catastrophic event are filled with chaos, as conflicting reports emerge and people grab onto whatever information they can find. It’s an environment where misinformation thrives, where people share and retweet in a panic, and where responsible media outlets can’t fact-check fast enough to get reliable information to the public. After the Uvalde shooting, a lie about the shooter’s identity made it from one of the most wretched corners of the internet to a member of Congress’ personal Twitter account within hours.
The shooter (whose name I will not use) at Robb Elementary School in Uvalde, Texas, was killed around 12:45 p.m. Central Time on Tuesday. His name was released at a press conference at 4 p.m. Within a few hours, several people on 4chan had begun posting links to the Reddit profile of a trans woman, suggesting she was the shooter. There was no evidence for this; several 4chan posters even commented that the photos didn’t look like the other photos of the shooter floating around. Nevertheless, others picked the story up and ran with it.
The posts got traction on 4chan for a few reasons. First, content on 4chan is organized by recency, so having an active thread that people are responding to increases the visibility of the information, even if the person who posted it is just a random anonymous user.
Second, the content is subject to basically no moderation; 4chan has a small moderation team and a junior “janitor” team that provides maintenance, but relative to the massive moderation departments of the major media platforms, there’s very little there. Third, the board has a long culture of trolling and posting misinformation. Joan Donovan, research director at the Shorenstein Center, notes that 4chan was instrumental in creating the red-pill online cultural movement.
From 4chan, the misinformation that the shooter was a trans woman was picked up by many right-wing Facebook groups and probably made it into discussions on private messaging applications. This is the phase at which things are most difficult to track, because many of the groups have small footprints and a select membership, or are private by design (as in the case of group messaging services like WhatsApp). We can’t see the private messaging, but we can see the uptake from groups like “Young Conservatives of Southern Indiana” on Facebook, which posted one of the images taken from the trans woman’s Reddit profile along with the claim that she was the shooter. That group has about 4,000 followers—not a large audience by social media standards, but enough to significantly extend the claim’s reach to a broader audience that is on Facebook but not on 4chan or Reddit.
Unlike 4chan, Facebook actually does have a robust moderating team, but that moderation relies in large part on user reporting: someone would have to see the Young Conservatives post, check to verify it, and then report it as false. If the readers at Young Conservatives are generally disposed to buy this misinformation, that’s less likely to happen until the post spills out beyond their usual audience. In this case, that happened, and Facebook did take the post down. In a follow-up post, page administrators claimed that Facebook later judged the post not to violate community standards, but it no longer seems to be up on their page. By the time Facebook moderators acted, though, the claim had gained some visibility—and just because it was removed from one group doesn’t necessarily mean it was deleted everywhere.
As the post on 4chan started to spread through Facebook groups and private group chats, it got picked up by actual influencers: people with large numbers of followers on the major platforms. This is when the misinformation gets the attention of the general public. The claim that the shooter was a trans woman was picked up by Rep. Paul Gosar, a Republican from Arizona; Candace Owens; and Alex Jones. Jones has been banned from Twitter but still has a large following through his radio show. Gosar’s personal account, where he tweeted (and later deleted) the claims, has more than 159,000 followers. Candace Owens has more than 3 million.
Once the claim is blasted out to audiences of that size, it gets the attention of major media outlets. The claim was quickly fact-checked. Gosar took the tweet down after a few hours and about 100 retweets, when he was confronted with those fact-checks. (Notably, Owens did not. As I write this, more than a day later, her posts are still up.) But this leaves us with one more problem: A fact-check works only if people see and believe it. If someone sees Gosar’s original tweet before it’s deleted but doesn’t see the fact-check (because they don’t follow or trust other media outlets), they’re going to continue to believe the claim Gosar made. This particular bit of misinformation will continue to circulate in these circles long after the facts are checked and the tweets are deleted.
This case provides a clear look at a gritty and complicated landscape. In the hours after a disaster, people are grasping in the dark for any information they can find. If the only information they can find is an anonymous post on 4chan, especially one that aligns with their worldview, then that will be their starting point. This is not the first time such speculation has falsely identified an innocent person as an attacker. In the wake of the Boston Marathon bombing, pseudo-sleuths identified Sunil Tripathi, who had disappeared in the weeks prior to the attack, as a suspect; the hunch was based on some Redditors’ impressions of a resemblance between Tripathi and the FBI picture of the suspect. The story was then picked up by BuzzFeed and the BBC, as well as social media personalities. A week after the bombing, Tripathi’s body was found; he had died by suicide, apparently well before the marathon.
There are lots of forms of misinformation. The story of the trans woman misidentified as the shooter is not necessarily the kind that radicalizes future killers or influences elections, though it can be under the wrong circumstances. But it’s one of the important and persistent types of cases that comes up. It’s also a case that helps us trace a clear path from an anonymous comment on a message board to the Twitter account of a member of the U.S. House of Representatives.
It starts with an interface that promotes recency and has limited moderation, like 4chan or certain subreddits. It gets taken up by groups with small user bases on other platforms, including social media platforms like Facebook and group chats. Because these groups are smaller, they don’t attract the attention of moderators as quickly, and the information has time to spread, eventually catching the eyes of actual influencers—people with tens of thousands (even millions) of followers. Once it’s out, the fact-checking and moderation kick in, but even that is a limited stopgap. For years to come, some people will believe the shooter was a trans woman. And even if they find that it was debunked, they may consider the debunking a conspiracy to suppress the truth.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.