Facebook has been trying awfully hard to repair the past two years of damage to its reputation. It’s been running apologetic ads on TV. It’s sent CEO Mark Zuckerberg to testify before Congress. It’s rolled out new privacy settings. It’s employed new and improved content-moderation algorithms and filtering technologies, and hired more people to weed out the undesirable content that slips through the cracks. Advertisers are now under more rigorous scrutiny, and Facebook users must be informed about who paid for each political ad they see in their news feeds.
In the midst of all of this, Facebook has decided to launch a new initiative that will air exclusive TV-news programming from a range of content providers, including household names like Fox News, CNN, Univision, and BuzzFeed. In the view of the social media behemoth, this represents not just a way to get users to spend more time with the platform, but also another way to combat what it has dubbed “false news.” But the problematic media the platform helped spread isn’t best captured by the term false news, or the more ubiquitous fake news. Instead, it’s a category of content that Oxford University’s Computational Propaganda Project calls “junk news,” which its researchers say provides a more accurate description, since it also applies to all sorts of information that, while not completely fake, still contributes to problematically uninformed, misinformed, bigoted, and hyperpolarized views. It’s an important distinction in the diagnosis of Facebook’s problems, and it helps explain why the company’s new, sanctioned news programming may end up doing more harm than good.
Facebook famously does everything it can to avoid being categorized as a media company. It doesn’t want to subject itself to the many regulations that apply to traditional media companies. Nor does it want to play the editorial gatekeeping role of evaluating which content is factual and relevant enough to be considered newsworthy—or, at least, it doesn’t want to be held responsible for doing so (part of the reason why the company did away with its Trending Topics). While Facebook has started demoting posts that users and fact-checkers flag as false, the outsourcing of this judgment task allows it to keep insisting it’s just a technology company providing a forum for the exchange of viewpoints and information. Zuckerberg himself has said that making content decisions on behalf of users makes him “fundamentally uncomfortable.”
This stance was on full display on Wednesday, when John Hegeman and Sara Su, who lead News Feed at the company, hosted an event in New York City to showcase Facebook’s efforts to fight false news. One of the journalists invited, CNN’s Oliver Darcy, wrote in a tweet afterward that Facebook had failed to provide a “good answer” when asked why Alex Jones’ Infowars was still allowed on the site. Facebook reacted by using free speech as a shield: “We see Pages on both the left and the right pumping out what they consider opinion or analysis—but others call fake news. We believe banning these Pages would be contrary to the basic principles of free speech.” Over and over, the company insists that it wants to be a forum for public discourse, a place where users can exercise their free-speech rights. The platform is completely neutral.
Except when it isn’t.
For years, Facebook has drawn criticism for its decisions, both algorithmic and human, about what it censors and what it promotes. Important historical photos from respected sources get removed, and the accounts behind them get banned. But Russian trolls and purveyors of dangerous conspiracy theories remain mostly free to post outrageous falsehoods, so long as they don’t violate “community standards.” Facebook hasn’t been neutral about nudity, violence, racism, sexism, or homophobia, either, and it hasn’t been evenhanded or uncontroversial in enforcing its guidelines. If anything, Facebook has a reputation among academics, journalists, and users for its inconsistency in what it will and will not permit in your news feed.
Which brings us back to the company’s nascent foray into commissioning its own news programming. With a history of vacillating, incoherent semigatekeeping, Facebook hasn’t exactly won the world championship of content-curation discipline. But company leaders seem to see their investment in a new lineup of TV-news–style shows for Facebook Watch as a way to distance themselves even further from the gatekeeping process. Why? Because, in their view, they’re outsourcing editorial decisions to established media outlets like CNN, Univision, ABC News, and Fox News. In doing so, the company seems to be selling the new strategy as a twist on the old Fox News Channel slogan. With the new news content on Facebook Watch, it’s: They report. You decide.
But in doing so, Facebook is sliding further toward becoming exactly what it has said it’s trying to avoid. By establishing partnerships with a selected set of commercial news organizations, Facebook is making editorial choices. It’s giving its seal of approval to certain news outlets rather than others.
Facebook makes another editorial decision when it chooses how much time and how many resources to dedicate to its different partners. Fox News, for example, looks like it will be the biggest contributor of breaking-news updates with its short, twice-daily “Fox News Update” segments—heightening worries among those already concerned by the controversial partnership. It’s worth noting that, as Slate’s Will Oremus has written, “the word news is key here, because Fox News distinguishes internally between its news programming (think Shep Smith) and its opinion programming (think Tucker Carlson and Sean Hannity), with the news programming being far less partisan” and that Smith, who will anchor the Facebook Watch segments, “has a reputation as a relatively down-the-middle newsman.” That distinction is likely something Zuckerberg and Co. will use to justify the choice of a famously partisan news outlet.
But regardless of how relatively objective Smith may seem against the right-wing punditry and vitriol that’s otherwise found on the network, it’s still a questionable move. By giving Fox News a dominant spot, and by making a serious investment in the programming and promotion of it, Facebook helps legitimize a media outlet that journalism and media scholars consider to be little more than a propaganda organization. Even if the Fox News segments on Facebook turn out to have a high degree of journalistic integrity, it won’t change the fact that the Fox News Channel churns out junk news and falsehoods several hours a day on TV through its commentary programs led by hosts like Sean Hannity, Tucker Carlson, and Laura Ingraham. And that’s not counting the stories that are published on Fox News’ website.
Because almost half of all Americans get their news from Facebook, it matters when the company tells its users that they can trust Fox News (or, for that matter, can trust CNN, ABC News, Univision, BuzzFeed, Mic, Quartz, ATTN, or the other handful of branded news-programming providers it’s bringing to Facebook Watch, or has been paying to produce Facebook Live content). It matters that it has chosen to go down the partisan route at all instead of striking up partnerships with news organizations that have no commercial or political interests, like NPR or PBS. It matters that the company is spending millions commissioning this news programming—plus the untold value of promoting it in users’ feeds—at the same time it’s been deprioritizing news content from other outlets (including, to be transparent, this publication) and incorrectly flagging promoted posts from some publishers. It’s a choice. And it’s an editorial one.
What’s more, these editorial choices will only raise more questions. What is Zuckerberg going to do if one of these media partners produces a Facebook Watch segment that contains questionable reporting—a glaring omission of relevant facts, a lack of credible sourcing, a hyperpartisan or bigoted slant? What happens if one of these branded partners engages in that original sin: spreading false news? What if it does it repeatedly? Will Facebook hold this sanctioned content to higher standards than its other pages? Will someone at the company step in to correct it? Or will they simply let it slide?
Perhaps we should cut the ifs here—we know the missteps and biases that already plague some of these TV-news outlets will inevitably show up in their Facebook programming too. And handling them will be another editorial decision of the sort that Facebook says it wants nothing to do with. It’s a good thing Facebook is just a platform, otherwise it would be in trouble. Possibly antitrust violations–level trouble.
By favoring some media outlets over others, Facebook seems to be pursuing a strategy that, in some ways, echoes a case that landed the Associated Press in legal hot water in the 1940s. Among other practices at the time, the cooperative association, which gathered and distributed news, decided it was best for its business to refuse to let member newspapers sell or provide its news to outlets that weren’t members. (Members were also notorious for blocking new memberships from newspapers they saw as competitors, making the association even more exclusive.) It didn’t go well. The issue ultimately went to the U.S. Supreme Court, which decided that the AP was in violation of the Sherman Antitrust Act, even though it wasn’t a monopoly. The population’s access to news and information, it ruled, was of higher priority than the AP’s right to conduct its business as it saw fit. Justice Hugo Black, writing the opinion for the court, held that the First Amendment was contingent on the “widest possible dissemination of information from diverse and antagonistic sources,” which he deemed “essential to the welfare of the public.” Black also concluded that while freedom to publish is guaranteed by the Constitution, “freedom to combine to keep others from publishing is not.”
Facebook may believe it’s protecting First Amendment rights by pretending to be a platform rather than a media outlet. But Facebook surrenders its feigned neutrality when it selects a group of preferred content providers, gives them its seal of approval, and lets some of them have more access to users than others. It’s a choice that could be viewed as “combining to keep others from publishing”—or, at least, as keeping others from having the same reach on its mega-powerful media-distribution platform.
On the one hand, Facebook claims to be a neutral forum and defends this position as the reason why it should be exempt from the regulation that traditional media faces. On the other hand, the company makes highly consequential editorial decisions—and ones that, by virtue of its outsize role as a content distributor, may not always align with that “widest possible dissemination of information” the Supreme Court has deemed “essential to the welfare of the public.” How long will Facebook be able to call itself just a platform before it loses even more of the public trust it’s been fighting so hard to regain?
We’ll be watching.