“Connecting the world isn’t always going to be a good thing,” the head of Facebook’s news feed acknowledged this week in an interview with Slate’s technology podcast, If Then. Adam Mosseri, a vice president of product management at Facebook, was responding to allegations in a recent U.N. report that Facebook has fueled hatred of Rohingya Muslims in Myanmar. A U.N. rapporteur said this week that ultranationalist Buddhist Facebook pages have helped to incite violence in that country amid what has been called “a textbook example of ethnic cleansing.”
Mosseri, whose team manages the algorithm that determines what people see in their Facebook feeds, said on the podcast that the situation in Myanmar is both “deeply concerning” and “challenging for us for a number of reasons.” He said the company’s approach of partnering with third-party fact-checkers is not viable in Myanmar because Facebook has not been able to find such organizations to partner with in the country. Instead, it has had to focus on trying to enforce its community standards and terms of service, and “changing incentives” around things like click-bait and sensational headlines that can play a role in spreading hatred and propaganda. Mosseri called real-world violence one of the worst possible effects of social media, adding, “We lose some sleep over this.”
In the wide-ranging interview, Mosseri also addressed criticism that Facebook’s algorithm, which ranks posts in people’s feeds based on their online behavior (among other factors), reinforces people’s biases at the expense of civil discourse. The news feed team’s goal, he said, is no longer simply to cater to people’s short-term preferences but to try to better understand and serve their long-term interests—which Mosseri thinks will align better with the interests of society as a whole. Users’ long-term interests are “quite frankly, more difficult to understand and measure, and optimize for,” Mosseri said. “But I have found no evidence in my time working on news feed that there’s a correlation between easy to measure and important. That’s why it’s so critical that we get better at this and get better at this quickly.”
Earlier in the interview, Mosseri responded to criticism from publishers—including the San Francisco Chronicle’s editor-in-chief Audrey Cooper—that Facebook’s frequent “capricious” algorithm changes have made it impossible for the news media to build and sustain online audiences, thus undermining their business models. “We actively try not to be capricious in what we launch,” Mosseri said, but he added that the company needs to do a better job explaining its changes and why it’s making them.
“We’re an important part of a lot of publishers’ strategies, and I think that’s a good thing,” Mosseri said. But, he added, “It’s important that we’re not publishers’ only strategy.” For instance, publications that sell subscriptions should think of Facebook as a way to reach and acquire new subscribers, he said. That suggests a different approach from that of ad-supported publications, which would presumably use Facebook to drive page views.
At the interview’s conclusion, Mosseri was asked why Facebook can’t be more transparent about its algorithm’s inner workings, such as publishing the source code for the news feed or disclosing the exact methods behind its controversial new plan to rate news sources by trustworthiness. “I think we can be more transparent,” Mosseri said, though he added, “I don’t think releasing all the source code would be very helpful.” Explaining what Facebook is doing and why, at a high level, is crucial, he said. But giving the public too much visibility into Facebook’s algorithms, Mosseri argued, would create “stress and anxiety about details that don’t really matter.”
Facebook would also likely incur a “reputational cost,” he said. “When we make a mistake, we’re just going to get beat up for it. Which is why we want to be really careful. If we release a statistic and it turns out to be wrong in some way because there’s a bug in the code, or there’s some nuance to measurement that we missed—we’re going to get a lot of criticism very, very quickly.” Often, he added, “the people who are asking or demanding more transparency are the ones that are quickest to criticize us when we get things wrong.” Still, Mosseri said scrutiny is “healthy” for Facebook overall, even if it’s not always pleasant for him and his team. “At the end of the day I really do think criticism helps us shed light on our blind spots, helps us be more self-aware, and I think that’s painful but healthy.”
A partial transcript of the interview is below. You can listen to the full interview here.
April Glaser: So we’re going to play a clip now from Audrey Cooper, who is the editor in chief of the San Francisco Chronicle. Here, she’s speaking with Bob Garfield of On the Media earlier this year. She’s talking after Facebook announced major changes to the news feed:
At the end of the day, they make these very seemingly capricious decisions. They don’t get any buy-in for it. But the people who say news media shouldn’t have relied on Facebook, I just say, well, what was the option? To ignore where everybody else was reading news? I became a subscriber to the Washington Post and the New York Times and Mother Jones magazine because I saw their stories in my news feed and I would click through. I really got engaged with that content and I became a subscriber. Almost every major newsroom in America nowadays is funded that way, at least in part.
She’s speaking in part to the fact that when people go online to get news, there are really only a couple of places that they go: Google and Facebook. Those places act as curators for what people see and what they don’t see. Should publishers rely on Facebook to get their stories out there? Or was that a big mistake? Because she’s speaking to the fact that they’ve been having to kind of play this hurry-up-and-catch-up game with the company as it changes its news feed.
Adam Mosseri: I know we actively try not to be capricious in what we launch, which is what [Cooper] sort of started with. I do think we need to do a better job explaining what we’re doing and how we’re doing it so that people don’t miss our intent. I think we need to do a better job there. I think we are large enough that it’s important for publishers to think about our platform and about specifically how they can leverage it for whatever their needs are. Different publishers have different needs, based on their business models, based on their editorial point of view, based on their content strategies, etc.
But I also think that it’s important that we’re not publishers’ only strategy, that they think about their readership holistically. That they understand that there are important differences between people who come across their articles passively in the news feed when they are just trying to catch up on the day and a reader who seeks them out directly, in an intentful way, where the mindset is very different and I think the value of the reader is very, very different. So I think we’re an important part of a lot of publishers’ strategies and that’s a good thing. But I also encourage publishers to think about us as one piece of a larger set of opportunities. More specifically, to be really intentional about how they leverage the platform. If you have a subscription-based business model, which a lot of large publishers are moving toward, I think the most important thing that we can provide as a platform for you, as a publisher, is to be an acquisition channel for subscribers. If you’re not an advertising-based business, I think the value you can get out of our platform is very different. The strategies should reflect that difference.
Will Oremus: There was some criticism this week, from a perhaps unlikely source, of Facebook’s role in the news. A U.N. report, where they’re investigating whether there has been genocide against the Rohingya people in Myanmar, mentioned that Facebook has helped to fuel hatred of the Rohingya people there. When I asked folks on Twitter what should I ask you, because everybody has a lot of questions for you these days …
Adam Mosseri: I’ve noticed that, by the way.
Will Oremus: … one that came up again and again is how you think about your responsibility. You’re sitting there in Menlo Park with your team. You’re trying to be thoughtful about ways to change the news feed to better serve your readers and to give them what they want. But sometimes that turns into helping maybe to fan the flames of hatred in a country halfway across the world. What do you do? How do you think about that kind of problem? What do you do when you hear that kind of criticism, and what’s the process you take to address something like that?
It’s important for us to remember that technology isn’t naturally a good or a bad thing. It’s sort of agnostic and it’s how technology’s used that can be either good or bad. Similarly, connecting the world isn’t always going to be a good thing. Sometimes it’s also going to have negative consequences. The most concerning and severe negative consequences of any platform potentially would be real-world harm. So what’s happening on the ground in Myanmar is deeply concerning in a lot of different ways. It’s also challenging for us for a number of reasons.
There is false news, not only on Facebook but in general in Myanmar. But there are no, as far as we can tell, third-party fact-checking organizations with which we can partner, which means that we need to rely instead on other methods of addressing some of these issues. We look heavily for bad actors and for things like whether or not they’re violating our terms of service or community standards, and we try to use those levers to address the proliferation of some problematic content. We also try to rely on the community and be as effective as we can at changing incentives around things like click-bait or sensational headlines, which correlate with, but aren’t the same as, false news.
Those are all examples of how we’re trying to take the issue seriously, but we lose some sleep over this. I mean, real-world harm and what’s happening on the ground in that part of the world is actually one of the most concerning things for us and something that we talk about on a regular basis, specifically about how we might be able to do more, be more effective, and move more quickly.
Will Oremus: Lately there’s a whole new set of issues that have come to the fore, particularly since the 2016 election, which is: What if, in the process of trying to give people what they want, you end up undermining civil discourse in various ways, reinforcing filter bubbles, or allowing foreign agents to interfere in elections by posting content that plays on people’s emotions and gets them riled up? I mean, sometimes people want to get riled up about how evil liberals are or how evil conservatives are. Facebook seems to have, unintentionally, maybe optimized the stoking of that kind of division in society.
So how do you now think about what the goal of Facebook is, particularly vis-à-vis news and politics? I understand you’re working at the same time on these issues of meaningful interactions with friends and family, which is really the core, I think, of what you want news feed to be about. But when you’re thinking about its role in society, have you moved past in some way the idea that the goal is just to give people what they want or make them feel good? Are you moving toward some sense of broader societal or democratic obligations in terms of what you’re prioritizing in the feed?
Historically, we haven’t been trying to focus news feed on giving people exactly what they want or what makes them feel good, though I think our work is categorized as such pretty often. Rather, we’ve been trying, and maybe not been particularly effective, to connect people with what they would find meaningful, which I think is an important distinction to make. You’re asking not only about long-term interests but also about whether we have a responsibility to society as a whole, or to groups, or to communities of people. I think those things are related.
I think people’s long-term interests tend to be more aligned with the interests of the community, and their short-term interests seem to be more focused on their interests as an individual. We have been trying to broaden that: to consider how we can create the most effective, or the best, news feed possible, not only for an individual but also for communities at large.
One of the ways we’ve been doing that is by getting better at trying to understand people’s longer-term interests, which are, quite frankly, more difficult to understand and measure and optimize for. But I have found no evidence in my time working on news feed that there’s a correlation between easy to measure and important. That’s why it’s so critical that we get better at this and get better at this quickly.