Many of us were introduced to Frances Haugen—the Facebook whistleblower—on Tuesday during her Senate testimony, but Wall Street Journal reporter Jeff Horwitz met her 10 months ago. Until recently, he referred to Haugen as “Sean,” a code name used to protect her as a source.
Haugen had been working on a team at Facebook called Civic Integrity, which was supposed to keep the platform from being manipulated during the 2020 election the way it had been in 2016. Then, when Facebook announced it was dissolving the Civic Integrity team, Haugen got in touch with Horwitz and the two set up an in-person meeting. During their walk together through the East Bay Hills, Horwitz quickly learned that Haugen didn’t want to discuss antitrust or data protection. She wanted to tell him what she knew about how Facebook really worked, and why that scared her.
Eventually, she collected a series of internal documents from a central employee hub and gave them to Horwitz. The documents showed that Facebook knew the tenor on its platform was getting angrier, that Instagram was often harmful to teenagers, that efforts to keep human traffickers and drug cartels off the platform were failing, and that Facebook’s attempts to stop vaccine misinformation weren’t working. These documents would go on to form the basis of Horwitz and his Wall Street Journal colleagues’ explosive series, the Facebook Files, which led to Haugen recently testifying before Congress.
On Friday’s episode of What Next: TBD, I spoke with Horwitz about Haugen and whether “fixing” Facebook is at all possible.
Lizzie O’Leary: You spent a lot of time working on these stories, and you’ve been a part of this kind of “rollout” of Frances Haugen’s public identity. America got to see her on TV and in these hearings, but what are your impressions of her?
Jeff Horwitz: The woman who was testifying on Tuesday, that’s pretty much her. She speaks in very long, thoughtful sentences. She’s pretty strategic. She’s not very off the cuff. There are generally pauses before she answers questions, and she really understands this stuff pretty well. I’ve had people asking me questions like, “Her feelings on antitrust or Section 230, who coached her on that?” Pretty much everything she said in that hearing was either what she innately understood about the platform or stuff that she picked up in the course of learning about it in terms of her own investigations.
I have been surprised in your reporting, and in the congressional testimony, to hear Frances Haugen say, “I don’t hate Facebook. I love Facebook. I want to save it.” I’m not sure I would have gone through all of that and carefully looked at these documents and thought about federal whistleblower protection and still loved the place. What does she love about it?
I don’t think she loves the product in its current form. I think she considers it to be a threat to democracy and human life. But the general idea that this technology doesn’t have to be this way, and that a company genuinely committed to Facebook’s stated mission of connecting the world and bringing people closer together is possible? Yeah. She’s very much a believer in that. She used to talk about how coming at the company from a place of vengeance isn’t really something that is likely to lead to productive solutions. It’s going to make people defensive. It is going to prevent the sharing of information and people understanding where the legitimately hard decisions are in this. She told me early on that if people just ended up being angrier at Facebook as a result of what she’d done, it was kind of a waste.
During Tuesday’s hearing, the senators seemed to understand what Haugen was warning them about. They actually seemed quite prepared and surprisingly with it in terms of understanding what she wanted to zero in on: engagement and engagement-based ranking. What is engagement-based ranking, and how does it work?
OK. So we should probably go back to old Facebook. People would post things, they would appear in a newsfeed in chronological order, and you would look at them, scroll past the things you didn’t like, and click on the things you did. And Facebook quite reasonably decided that that was not the optimal way of serving what people wanted to see. Perhaps they could use signals, like: Did a whole bunch of people like something? If so, maybe you’ll like it. And then they could refine that to: Did a whole bunch of people who kind of behave like you like something? If so, maybe you’ll like it. They made these changes that were supposed to make the whole system more engaging, things that were going to make you like, click on, and reshare content. And there’s nothing about that that seems inherently problematic on its face. You’re just giving people what they would prefer to see based on an algorithm’s best guess. But they didn’t really think through what some of the ramifications of that were.
It turns out that certain kinds of content, the sensationalistic, the clickbait-y, anything shocking, did really well. And that began another cycle of clicky, outrageous content, over and over.
They realized, long after building a system that was meant to maximize engagement, that one of the most effective tactics for creating engagement was to get people angry. And publishers realized that, too. If you’re trying to build an audience, there is no surer way to do it than just posting things that make people outraged, over and over and over again. And whether it’s true or not, or overstated, isn’t really so much the point. It works.
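Horwitz’s description of the shift from a chronological newsfeed to engagement-based ranking can be sketched in miniature. The posts, signal names, and weights below are invented purely for illustration; they are not Facebook’s actual data or scoring model.

```python
# Toy posts with invented engagement signals; none of this is real Facebook data.
posts = [
    {"text": "Cute dog photo",      "age_hrs": 1, "likes": 12, "shares": 1,  "angry": 0},
    {"text": "Outrageous hot take", "age_hrs": 6, "likes": 40, "shares": 90, "angry": 75},
    {"text": "Local news update",   "age_hrs": 3, "likes": 25, "shares": 4,  "angry": 2},
]

def chronological(feed):
    """Old-style newsfeed: newest first, ignoring engagement entirely."""
    return sorted(feed, key=lambda p: p["age_hrs"])

def engagement_ranked(feed):
    """Rank by a guessed engagement score: reshares and reactions
    (including 'angry') weigh heavily, recency only slightly."""
    def score(p):
        return p["likes"] + 5 * p["shares"] + 5 * p["angry"] - p["age_hrs"]
    return sorted(feed, key=score, reverse=True)

print([p["text"] for p in chronological(posts)])
# The outrage post jumps to the top once engagement drives the ranking.
print([p["text"] for p in engagement_ranked(posts)])
```

The point of the toy is the reordering: under chronological ranking the angry post sits wherever its timestamp puts it, but once reactions and reshares are rewarded, it rises to the top without anyone ever deciding to promote outrage.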
Haugen seems to return again and again in her testimony to this overarching idea that Facebook’s financial success as a company is at odds with the well-being of its users. That feels like something that maybe you could understand intuitively. But somehow, seeing it written down internally, it gives it a different tenor. Do you think having the documents is key to all of this?
Absolutely. Yeah. The things that we wrote about in the Facebook Files series—human trafficking on Facebook, whether powerful people have an edge, whether Facebook makes people fight, whether Facebook might have some problems with its COVID response and be encouraging anti-vaccine zealots—these are not new topics. The thing that’s new is the grounding of these stories in the company’s own understanding of itself.
Understanding it internally and yet saying different things publicly.
Yeah, that too. I think something that Ms. Haugen is very, very into as an idea is that there needs to be a form of real data access. Outsiders with accreditations and appropriate precautions need to be able to get inside the company in the same way that literally any data scientist in the company can.
Facebook’s response to Frances Haugen is multipronged. On the one hand, you have company spokespeople saying, well, she didn’t know all that much anyway. Then another tack is like, well, maybe she stole documents. And then Mark Zuckerberg has a blog post saying this idea that we prioritize profit over safety and well-being, that’s just not true. What did you make of his response?
It’s very carefully worded. It’s true: They don’t set out to increase the amount of incendiary content. And no one says, “We should up the incendiary content button because it will increase engagement.” They don’t ever say that. What they do is they just simply increase engagement. They figured out how to increase engagement and they do it by building things that just happen to be better at spreading incendiary content. This is the system they’ve designed and they’re not really even arguing with it. They’re just calling it kind of a mischaracterization somehow.
Haugen mentioned raising the age at which kids could be on social media. And the big one: simplifying the newsfeed to be chronological, the way it used to be rather than engagement based. Does that seem possible? And would that really make a difference? There was all sorts of problematic content on the site back when it was chronologically based.
Absolutely. There are problems with people, and there are going to be problems on any social network used by people. And I don’t know that saying, well, there were problems back before the internet, back before Facebook, before algorithmic ranking, really answers the question. I think it’s a question of proportion and of what the effects are. There will always be bad stuff getting posted to the internet, and that’s OK. The internet turns out to be pretty resilient. The question is whether bad stuff and bad people are able to harness systems meant for the distribution of high-engagement content. And I don’t know that it has to be a full, straight chronological feed ranking, but I definitely would not underestimate Ms. Haugen’s intelligence or understanding of these things. It’s, I think, at least a great place to start the discussion.
There’s also a proposal by a Stanford Law professor to make the kind of internal research that you all reported on available for outside researchers. And that is a whole different degree of openness. What would that do?
That would be extremely helpful. It would allow policymakers and academics to actually be thinking about this stuff in a way that could be actionable. Because right now, I mean, literally a data science intern at Facebook has the ability to run experiments that the world’s leading research institutions don’t, and that’s a problem. Right now, all the information is behind a wall and everyone outside is just guessing, right? Even if you do know that Instagram isn’t good for teenage girls, which I think a lot of people have said they knew, if you can’t prove it, can’t test it, and can’t look at possible solutions internally, that knowledge isn’t worth much.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.