When Facebook announced it would survey users to rank news outlets by “trustworthiness,” I had my doubts about its methodology. As CEO Mark Zuckerberg described it, the company’s plan seemed to involve simply asking a bunch of people which outlets they’ve heard of and which ones they trust. It would then use the results to elevate stories from “broadly trusted” outlets in its news feed rankings—helping to address, in theory, the problems of click-bait, misinformation, and hyperpartisanship that have plagued the platform. I called such an approach “painfully simplistic and naïve” and hoped aloud that I was misreading the company’s announcement.
It turns out that is exactly what the company plans to do. BuzzFeed’s Alex Kantrowitz got his hands on the actual survey and published it on Tuesday. It’s two questions long.
Since then, Facebook has understandably taken flak for its laughably basic approach to the thorny problem of how to evaluate media credibility. “We fucking told you so,” wrote Gizmodo’s Bryan Menegus, who had greeted Facebook’s original announcement of the survey with a post headlined, “I Can’t Fucking Believe How Dumb Facebook’s News Feed Update Is.” He wondered how Facebook itself would fare on its own trust survey—a good question!
But before we dismiss the project entirely, it’s worth asking what value such a straightforward survey might yet have—and what it tells us about the social network’s rapidly changing relationship to the news media. A closer look at Facebook’s plan suggests it’s not as dumb as it sounds.
The folly of ranking news outlets’ trustworthiness based on a two-question survey would seem to be obvious in the era of Donald Trump, whose presidency hinges on fueling public mistrust of the mainstream media. After all, the random Facebook users whom the company will be surveying are the same ones who were already spreading hoaxes and misinformation far and wide, right?
Well, yes—but context matters. The reason Facebook is such fertile ground for so-called fake news is not simply that its users are dumb and ignorant and credulous, as the Intercept’s Sam Biddle suggested. If the problem of online misinformation could be reduced to user error, it would have made no sense to blame Facebook for it in the first place.
On the contrary, the criticism of Facebook is that the structure of its news feed makes users more susceptible to half-truths and hoaxes than they would be otherwise—especially when that content plays on their biases and emotions. Before Facebook, people read the news by choosing their source first—turning on CBS, buying a copy of People or Newsweek, subscribing to their local newspaper. In the news feed, it’s the opposite: Individual story headlines jostle for your attention, and you only interact with the ones that jump out and grab you. The source is presented in tiny print, as an afterthought. That’s what makes it easy for a hoax site such as ABCNews.com.co to trick people into thinking they’re reading the real ABC News.
It’s not necessarily true, then, that the same people who share a story in their news feed from Liberal Daily would mark the site as “trustworthy” on Facebook’s survey. They might not even recognize its name.
That still leaves the problem that conservatives and liberals these days have a hard time agreeing on which outlets are trustworthy. When Facebook first announced the plan, I imagined the following scenario: Trump voters say they trust Fox News and don’t trust CNN. Liberals say the opposite, and we’re left with a stalemate. Meanwhile, conservatives express trust in the likes of the Independent Journal Review, while many liberals turn out to be unfamiliar with that name, resulting in a high ratio of trust to familiarity among those surveyed. Ergo, Facebook treats the Independent Journal Review as more credible than CNN.
But after seeing the survey and hearing more from Facebook about how the company plans to use the results, I’m less pessimistic. In fact, there are reasons to think it could do more good than harm, despite its glaring limitations.
For one thing, as basic as the survey is, it’s important that the second question leaves room for gradations of mistrust. That is, it encourages users to distinguish between outlets they don’t fully trust (perhaps that would be the Wall Street Journal for liberals or the Washington Post for conservatives) and those they don’t trust at all (let’s say, Breitbart and Occupy Democrats, respectively).
Second, Facebook clarified that it won’t rank news sources based on a simple ratio of the number of people who trust a given news source to the number who are familiar with it. Adam Mosseri, who heads up the news feed, told me via Twitter that the company will be drawing on its wealth of data about users to weight their responses. Specifically, it will be looking for news outlets that enjoy at least some measure of familiarity and trust from respondents with a broad range of reading habits. So much for the Independent Journal Review emerging as a big winner.
Mosseri declined to go into much more detail, unfortunately—and Facebook’s reluctance to disclose the specifics of its methodology remains a legitimate reason for concern. But he and company representatives did offer one other point that could help assuage critics’ worst fears. The trust survey, they insist, is less about ranking the relative merit of established journalistic outlets than it is about separating those outlets from the legions of lower-quality sources that populate people’s feeds. Mosseri said the company’s algorithm will apply the “trustworthiness” signal only to sites that meet a certain threshold of recognition among a diverse set of users.
While Mosseri didn’t name specific outlets, this suggests to me that the main goal is not really to figure out whether more people trust ABC News or the Denver Post. It’s to boost both ABC News and the Denver Post at the expense of hoax sites like ABCNews.com.co or the Denver Guardian. How it treats more mainstream outlets that have a partisan bent, such as Fox News or MSNBC, is harder to predict—which is why critics should keep up the pressure on Facebook to disclose such specifics. But it seems likely that they’ll fare better than sources that occupy positions farther out on the fringe and that lack even the basic journalistic standards found on cable news.
Even so, potential pitfalls abound. This approach could help to entrench big, corporate media on Facebook at the expense of reputable niche sites and upstarts. A 2014 survey found that BuzzFeed was the least trusted of several dozen major news sites among respondents from across the political spectrum. Today, BuzzFeed has established itself as a major, serious outlet, thanks in no small part to Facebook’s algorithm. Had Facebook been using this trust survey four years ago, BuzzFeed might never have had a chance.
All that said, it’s important to remember that this is just one of hundreds of signals that go into Facebook’s news feed ranking. That’s no excuse for sloppy implementation. But it does suggest that, as the Verge’s Casey Newton put it, the survey probably “won’t make or break the media” regardless of how it’s implemented.
That’s why I think the biggest takeaway here is not that Facebook’s news survey is flawed but that it exists at all. The company is constantly tweaking its algorithm, so the specifics of any given signal are highly likely to change. But until recently, Facebook had steadfastly rejected the notion that it had any role to play in assessing the credibility of sources whose content was shared on its platform. Now it’s beginning to embrace that role, however timidly. And building it into the news feed ranking system is a big step. It’s far more potent than, say, adding “disputed” tags to a small number of manually debunked false news stories. A change to the news feed algorithm—the company’s most prized and closely guarded asset—is how you know Facebook is taking a problem seriously.
It’s still putting the onus of editorial judgment on its users. But at least it’s finally sending a signal to publishers that their credibility matters to their news feed ranking and not just their ability to trick readers into hitting like. It helps that the company is simultaneously working on features like a new local news and events section, which could help the local news outlets that have been decimated by the shift to online news.
The company will almost certainly commit some blunders along the way, and those could have serious consequences. But it’s clear by now that its previous agnosticism to the truth was a deeply misguided philosophy. I’d rather see Facebook make mistakes in pursuit of a better one than cling to the old canards about algorithmic neutrality. But perhaps I’ll feel differently if I’m out of a job six months from now because Facebook’s survey respondents deemed Slate untrustworthy—or, worse yet, unfamiliar.