TikTok doesn’t know that eight people died in the surging crush of a crowd of 50,000 as Travis Scott performed at his Astroworld music festival in Houston, Texas, last Friday night. It doesn’t know that dozens sustained injuries as Scott encouraged the audience to “rage,” or that countless more were traumatized by struggling to stay upright, packed into a throng of people squished so tightly it was sometimes hard to draw a proper breath. TikTok doesn’t know any of this. And how could it? It’s not a person, as much as we like to personify it. It’s an app.
But what the app does know is that users are rapidly sharing videos from Astroworld. They’re commenting on these clips and reposting them to Twitter. People are downloading them and sending them to friends via text messages and Instagram DMs. The app is learning—not incorrectly—that people want more Astroworld content, which makes it more and more likely that some of those videos will find their way to your For You Page, even if you’d never heard of the event before it became one of the country’s deadliest crowd-control failures in recent history. Because those are the cues the app is receiving from users: attention to a certain topic going in yields more videos on that topic coming out.
This is how, in the days following the tragedy, I found myself seeing an increasing number of unsettling videos taken during Scott’s Astroworld show. The ones from the crowd are anxiety-inducing, to say the least; there’s a real whiplash from lying in bed, watching some lady making chicken in an air fryer or a twentysomething doing a dance I’ll never attempt to learn, to hearing people screaming for help that isn’t coming. Some of the videos come with a sensitive content warning from TikTok. Just as many don’t have any warning attached at all.
The footage is, in a certain way, critical to the aftermath of the tragedy. It’s on-the-ground reporting of what really happened at Astroworld and will no doubt be instrumental as investigations into the events proceed. But it’s not content that, given the option, I would have asked TikTok to show me. (Like most tech platforms, TikTok relies on a moderation system that is a hybrid of machines and humans. Videos that TikTok’s “technology” flags are passed along to a human moderator, who manually reviews them for content violations. But if a video passes that initial screen, it’ll go live immediately.)
I’ve been trying to teach my FYP that I don’t want to see these videos, with varying degrees of success. But the more harrowing, or at least differently harrowing, videos that I’m now served are the ones from before the tragedy occurred—TikToks of people lamenting having to sell their tickets because their plans changed at the last minute or, worse yet, excitedly getting ready to go to the festival. With those videos, my immediate reaction is to rush to the comments, hoping to find out that these people made it home safely from the show. And then any solace I feel is immediately replaced by a different sadness when I realize that the relief I’m feeling for that person comes at the human cost of another person’s life.
TikTok, like Instagram, shows you videos algorithmically, not chronologically, meaning videos don’t surface in the order they’re posted. They appear when a computer program deems them interesting enough for you to see, when enough people have engaged with them for the platform to decide that you’ll want to engage with them too. When it works as it’s supposed to, you get a feed full of things you’ll actually enjoy seeing, like videos on subjects that interest you. Or Instagram pictures from people you are actually friends with and not just random brands you followed for a giveaway and meant to unfollow but forgot.

But sometimes, the algorithm comes up short. Take, for example, the days following the 2016 election: My Instagram feed felt like some sort of strange and prolonged wake for the following week. Hopeful election-day posts from people heading off to the polls didn’t cross my feed until well after the results were called and four years of Donald Trump as president was our new reality. Were these posts that I would have enjoyed before the election was called? Definitely; the algorithm got that right. But what it couldn’t do was interpret that the meaning of those posts had completely changed. Something similar is happening again, five years later, with tragedies like Astroworld.
This is yet another reminder of just how ill-equipped tech platforms are to handle real, human emotions—and why it’s all but impossible to stop the internet from, say, serving you ads for baby formula and tiny booties if you’ve announced a pregnancy that ends in miscarriage. The unconscious, empathy-bereft algorithm is why, as Lauren Goode wrote for Wired in April, you’re doomed to live in a digital wedding industrial complex forever, even if you call off your nuptials. Algorithms do not grieve. They cannot differentiate between morbid fascination and regular, run-of-the-mill fascination. They traffic only in data. But people—and people’s lives—are much more than data, even if the programs we use only think of us that way.