The Industry

How a 119-Word Local Crime Brief Became Facebook’s Most-Shared Story of 2019

A tale of accidental virality in the age of algorithms.


On an otherwise ordinary Sunday in late January, a 32-year-old web editor for a chain of local radio stations in Central Texas ran across a news item that he found interesting. Ten minutes later, he had written and published what would become Facebook’s most-shared story of 2019 so far.

The story has nothing to do with Donald Trump, celebrities, teens in MAGA hats, or the Democratic primary candidates. It’s a 119-word local crime brief about a wanted suspect, and the man who wrote it never intended for it to reach a national audience, let alone amass more than 800,000 Facebook shares in the six weeks following its publication—nearly twice as many as any other piece of English-language content this year.

Exactly how this news stub went mega-viral is a mystery no one has quite solved, though there are clues, starting with its alarming yet geographically ambiguous headline: “Suspected Human Trafficker, Child Predator May Be in Our Area.” At a time when fortunes can be built and lost on Facebook traffic, the story’s wild success might seem like a bizarre accident, a glitch in the system. But it also suggests that, for all of Facebook’s efforts to improve its news feed over the years, the social network remains as capricious and opaque an information source as ever.

Aaron Savage has worked at the same group of radio stations, based in Temple, Texas, for his whole professional life. He started in 2005 as a part-time board operator working Houston Astros games for KTEM NewsRadio 14 and worked his way up to digital managing editor for the five Killeen/Temple-area stations owned by Townsquare Media, a national chain. His job includes writing and publishing several news posts a day on the stations’ websites and social media pages, which he administers by himself most days.

On Jan. 27, Savage noticed a crime brief on the website of KWTX News 10, a local TV station with which KTEM NewsRadio has a partnership. The story, “Texas Rangers Search Here for Human Trafficking Suspect,” jumped out to him as “something that needs to be presented to the public any way we can,” he told me in a phone interview. It gave the name and description of one Issac-John Bernard Collins, who was wanted for aggravated kidnapping, human trafficking, and sexual abuse of a young child. His case had not made headlines, but the allegations were disturbing, and authorities had reason to believe he was holed up in the Waco area.

Savage did a quick rewrite, cited and linked to the original post, gave it a headline that he felt conveyed the appropriate urgency, and published it on the local radio websites he manages. An hour and a half later, he posted a link to the story from the Facebook page of one of those stations, US 105 FM New Country, based in Temple.

And then, he says: “It just took off.” Savage, who monitors traffic data on his stories via the Facebook-owned analytics tool CrowdTangle, says the numbers for the story “went through the roof” overnight, quickly becoming US 105 FM’s most-shared post ever. Savage didn’t know until I told him in a phone interview last week that the story was named in a recent study by the analytics firm NewsWhip as the most-shared web content of 2019 so far.

[Image: a screenshot of the KTEM news item as shared on Facebook.]

It beat out, among other extremely viral stories, TMZ’s report of Luke Perry’s death, CNBC’s breaking story about the end of the U.S. government shutdown, and an aggressively SEO-optimized Daily Mirror story about the viral “Momo challenge.” The original news brief that Savage’s post was based on, by KWTX 10’s weekend anchor Ke’Sha Lopez, was nowhere on the list.

How did this happen? Neither a researcher for NewsWhip nor a spokesperson for Facebook and CrowdTangle could fully explain it. Nor could Savage himself, who called it “a little scary” that the post did so well. But each offered some intriguing hypotheses.

One of the critiques of Facebook as a news source has long been that its algorithm tends to circulate stories with provocative headlines, regardless of whether the content is reliable. That’s a big reason why misleading partisan propaganda and even outright fabricated news stories thrived on the platform in the run-up to the 2016 elections, even as hard news from traditional media sources withered. It’s why entire publications and media groups have sprung up over the years to make money gaming Facebook’s algorithm: ViralNova, Elite Daily, Distractify, the Independent Journal Review.

Facebook has responded with countless tweaks over the years designed to promote “high-quality content,” discourage clickbait and like-bait, fact-check bogus stories, and punish pages that peddle misinformation. In 2018, Facebook announced a set of major algorithm changes designed to prioritize news from “trusted” and “local” sources, and to boost content shared by users’ friends and family over content published by professional Facebook pages. It said users would see less news in their feeds overall, but what they did see would be more reliable, and that it would focus on facilitating “meaningful interactions” among users.

In practice, the levers that Facebook’s engineers pull tend to be blunter instruments than you might think. Facebook’s mechanism for determining “trusted sources,” for instance, turned out to be a two-question survey. It assumes news is “local” to you if it’s being shared by a publication that has an audience tightly clustered in your area—regardless of whether the story’s topic is actually local. It defines “meaningful interactions” partly based on the number of comments on a post. NewsWhip’s study shows that the outlets that receive the highest share of comments tend toward the tabloid-y side of the media spectrum: LifeZette, the Sun, the New York Post, and our old friend LADbible.com.

According to CrowdTangle data provided by Facebook, Savage’s story on the suspected child predator racked up more than 50,000 shares on the original US 105 FM post alone—even though the station’s Facebook page has only about 7,000 followers. Within days, the story was being shared by much larger pages, including the Longview Police Department, Donald Trump Republic 2016, and Good Times With Trump 2016-2024. Over the course of February, it found traction on an ever-wider variety of subnetworks, including the page of R&B singer Sarahi Allende and a group called Truckers Wall of Shame.

In each case, the story garnered not only traditional “shares,” in which people repost it to their own friends and family, but large numbers of comments in which users tagged other people they know, presumably to alert them to the danger. That pattern might help to explain why several stories about crime, including Amber Alerts, make NewsWhip’s Top 10 most-shared list. (NewsWhip published a separate ranking of “Most Engaged” stories, a metric that counts other interactions such as likes and comments in addition to shares. The US 105 FM post ranked 15th on that list, with TMZ’s Luke Perry story taking the top spot. You can download NewsWhip’s full study here.)

While Facebook couldn’t confirm exactly what aspects of its algorithm helped the story on its way, Savage’s crime brief appears to have ticked nearly every box that the social network is trying to prioritize. First, it was shared by a local news organization, making it more likely that people in the Waco area would see it at the top of their feeds. Second, it generated large numbers of comments, which Facebook counts as “meaningful interactions.” Finally, its sharing was driven heavily by individual Facebook users, rather than by professional publishers with large followings, which means that it would be helped along by the company’s focus on surfacing posts from “friends and family first.”

But the wild card may have been the story’s headline. While it was clear from reading the story that it was about Waco and Central Texas, the headline just said the predator was in “our area.” Anyone who read the headline without reading the story might reasonably have thought the story was about their area, even if they were far from Texas. When I mentioned that possibility to Savage, he agreed, and said he’s usually more careful to localize his stories in the headline as well as the text.

That hypothesis was supported when I called the crime hotline listed in the story, which is run by the Texas Department of Public Safety. The dispatcher, who declined to give his name, initially told me he couldn’t provide information about any individual case. But when I mentioned some of the details, they quickly rang a bell: “Is that the guy from the Facebook article?” the dispatcher asked me. That guy, it turned out, he could give details on. “He was taken into custody over a month ago—a few days after that article was posted.” (The Texas DPS’s communications department did not respond to multiple calls and emails seeking further information about the case.)

The apprehension was not made on the basis of a tip from anyone who read the story, he added. But that didn’t stop people from trying. The tip line “lit up like a Christmas tree, sir,” he told me. “I received three calls a day for over three weeks from coast to coast.” This despite the fact that Collins was apparently already in custody for much of the story’s long-tail shelf life—which rendered the article itself fundamentally misleading. It became, albeit unintentionally, a form of viral misinformation, which is exactly the sort of content that Facebook has been trying to avoid amplifying.

While no sources I talked to suggested this, it’s worth considering that there may also have been a darker factor at work in the story’s virality. The scary headline about a wanted child predator along Texas’ I-35 corridor came packaged with a picture of a suspect who appeared to be nonwhite, at a time when President Donald Trump’s vilification of immigrants and push for a border wall were dominating national headlines. Some of the hundreds of thousands who shared it probably feared for their personal safety or that of their children. But it’s also likely that many, including the popular Trump pages that helped it go viral, shared it because it appeared to reinforce a racist political narrative.

If you took all of Facebook’s public statements at face value, you’d expect today’s news feeds to be dominated by friends, family, and various communities sharing high-quality, original, meaningful content. In practice, the company’s algorithm changes tend to be more effective at curtailing existing strategies for gaming the system than advancing any particular high-minded social goals. If nothing else, the strange saga of the Collins crime brief is a reminder that the interactions between Facebook’s users and its algorithm are murky and complex, and often produce results that reflect no one’s idea of a healthy news diet.

For all the handwringing around Facebook stepping back from the news in 2018, NewsWhip’s analysis found that engagement levels for web content on Facebook so far in 2019 have bounced back to 2017 levels. In other words, the quirks of Facebook’s algorithm matter as much as ever in determining what people read, and which news sources thrive.

When I asked Savage what he took away from his brush with virality, he thought for a moment, then said: “I guess it’s Mr. Zuckerberg’s world, and we’re just living in it.”