Facebook may have found a solution to the controversy over its “trending” news section. Not a satisfying solution, mind you—in fact, it’s an ugly, compromised, cowardly solution—but one that would at least deflect attention from the feature and head off future charges of editorial bias.
The potential solution is reflected in a test that the company is running on a subset of users, which Facebook confirmed to me on Wednesday. Mashable reported earlier in July on what appears to have been a cruder, earlier version of the test, and Huffington Post noticed the updated version on Tuesday. At least two of my Slate colleagues are now seeing it in their own feeds.
As a reminder, the trending box that got the company in so much trouble looked like this:
For those in the test group, the trending box now looks like this:
Spot the difference? The article summaries are gone, leaving only the keywords without context. In their place is a number showing how many people are talking about each keyword on Facebook.
The test is noteworthy for what it reveals about the company’s approach to controversy and how it perceives its role in the media. It’s noteworthy even if Facebook decides not to go through with these changes, but especially if it does.
The current trending feature is a fascinating thing—a mashup of algorithmically highlighted, of-the-moment topics, including celebrity gossip and viral memes, with human-written headlines so dry and passive and anodyne that they read like relics of a bygone print era. But it’s that very hybrid of machine and human selection that opened Facebook to criticism when former contractors told Gizmodo that their own judgments and biases came into play much more than Facebook had previously let on.
The flap’s political dimension was largely trumped up, as even Glenn Beck agreed. Yet it brought to light a very real tension between Facebook’s self-styled image as a “neutral” tech platform and the reality of its emergence as perhaps the country’s most influential arbiter of news. (Here’s a primer on the controversy for those who never quite got what the fuss was about.)
The outcry from political conservatives, as well as from others (like me) who scrutinize the company’s influence on the media, presented Facebook with a choice. It could step up, acknowledge the role human judgment plays in its products (and not just the trending news box), and take steps to make sure that judgment is being applied with rigor and care. Or it could shrink from the controversy, pulling the human decision-making safely behind the curtain of its code.
Facebook appears to have chosen … both. A promise from Mark Zuckerberg to investigate the claims and a high-profile summit with prominent conservatives both defused some of the faux outrage and served as a tacit admission that the company’s human leaders bear ultimate responsibility for the machinations of its algorithm. The company followed up by adding political bias to its agenda for employee bias training and publishing a statement of values for its News Feed algorithm.
But there remained the problem of the trending news feature, a relatively insignificant part of the Facebook app and website that is now a target for future allegations of bias, real or imagined. One option would have been to make it better, and more significant, by hiring experienced journalists to turn it into a sort of curated, public front page for the News Feed. But that would have made it even more of a lightning rod.
Instead, Facebook first announced a set of tweaks to its trending news guidelines that reduced at least the appearance of human involvement, if not the reality of it. Now, it may be headed further in that direction. Reducing that box to a series of keywords and engagement numbers, sans context, would make it less noticeable, less user-friendly, and ultimately less interesting. It would be like Twitter’s trending topics or Google Trends—only uglier and in no discernible order, rendering it useless even as a leaderboard. I would be very surprised if Facebook’s testing doesn’t show a marked decline in engagement with the feature.
Yet that may all be worth it to Facebook simply to minimize the risk of another brouhaha over its news judgment. If so, that will be understandable on some level: This is hardly a core feature of its service, and it’s not a hill that Facebook wants to die on. But it will also signal that the company is willing to compromise its product in order to avoid offending anyone.
Make no mistake, Facebook will still be exercising editorial judgment—if not in the trending box, then in the values that shape its News Feed algorithm, or in its treatment of live videos that show young men being shot by police officers. And those judgments will still be subject to criticism. But those are choices Facebook can’t avoid making, because the News Feed and live video are central to its business. The trending section, in its current form, is not—and if these changes go through, it never will be.