S1: Hey, everyone. Just before we get started, I wanted to let you know that today’s episode has some adult language.
S2: OK, on with the show.
S1: When Facebook went down this week, did you have a moment of, like, Oh my God, I did this?
S3: No, no, I did not. I had a moment of, like, is there some slight chance that this was tied to our work?
S1: That’s Jeff Horwitz. He’s a reporter for The Wall Street Journal. And his work, part of a series called The Facebook Files, has rocked Facebook, revealing not just the harms the company does, but what executives knew about them. On last week’s show, we talked to Jeff’s colleague, Georgia Wells, about the troubling impact that Facebook’s app Instagram has on teenage girls. This week we’re talking to Jeff because, honestly, everything just keeps blowing up in Facebook’s face these days. Jeff’s main source for the Facebook Files was the whistleblower who testified to Congress on Tuesday, and, just coincidentally, that was after Facebook’s apps went down for five hours on Monday.
S3: And then I concluded no, for a couple of reasons. First of all, it would be absolutely stupid. Taking down WhatsApp in particular is just something that anyone who understands what infrastructure looks like would never do. It’d be like saying, Hey, I’m going to take down the phone system because I have a problem with AT&T. It’s just too stupid.
S1: Still, it was nice to know that it wasn’t actually because of his reporting.
S3: I was relieved, to be 100 percent clear. When the company came out and said, It turns out we’re just really bad at the internet, that was a really happy, happy moment for me to hear.
S1: If the outage was just a weird coincidence, the results of Jeff’s reporting have been anything but. Facebook is now under a national microscope.
S4: Good afternoon, Chairman Blumenthal, ranking member, Blackburn and members of the subcommittee.
S1: Frances Haugen, Jeff’s source, has done what she set out to do: start a very public conversation about how Facebook really works and how to fix it.
S4: I saw Facebook repeatedly encounter conflicts between its own profits and our safety. Facebook consistently resolves these conflicts in favor of its own profits. I came forward at great personal risk because I believe we still have time to act. We must act now.
S1: Today on the show, the story of the Facebook whistleblower, told by the reporter who knows her best: who she is, what she wants, and whether fixing Facebook is possible. A lot of people were introduced to Frances Haugen on Tuesday during her Senate testimony. But Jeff met her 10 months earlier, and for most of that time, he actually called her Sean, a code name they used to protect her as a source. She’d been working on a team at Facebook called Civic Integrity, which was supposed to keep the platform from being manipulated during the 2020 election the way it had been in 2016.
S3: I’d gotten in touch right after the election with a whole bunch of people who worked on Civic Integrity, and I was really interested in understanding the break-the-glass measures, basically the things that Facebook had done successfully in an emergency environment to calm its platform down.
S1: These are the measures to sort of keep things under control.
S3: Yeah, exactly. A lot of them were anti-virality restrictions, things of that nature. And so I reached out to a whole bunch of people there, basically saying, Hey, this work seems really important, and it seems like it’s likely to go away. I think it’s too valuable to just stay inside the company. And I got nothing back, except word from another reporter that apparently my notes had been mocked by people in Facebook comms. Nothing, until a month later.
S1: By that time, Facebook had shut down the Civic Integrity Team and started distributing its employees throughout the company. Frances Haugen was one of them.
S3: Frances, who was one of the people I had written to, had just been at the meeting in which the civic integrity team was getting dissolved, and she got in touch. She wanted to know who I was, and then, you know, we arranged an in-person meeting and took a walk up in the East Bay hills. I was trying to get a sense of exactly what her motivations were and what she knew. And she was trying to get a sense of whether I was a dipshit and whether I kind of understood the general issues she was concerned about.
S1: Frances didn’t want to talk about antitrust or data protection, the kinds of things that have dominated Facebook’s scandals in the past. She wanted to share what she knew about how Facebook really worked internally, how it was designed to work and why that scared her.
S3: It isn’t just her. A lot of the people she worked with, and a lot of former employees I’ve spoken to, were pretty distraught about the quality of the discussion about Facebook externally. Right? It was all about, is Facebook doing terrible things with our data? Should Section 230 be lifted so people can sue the platforms into oblivion? Should WhatsApp be broken off from Facebook and Instagram? Is antitrust the solution to everything? And I think she definitely wished there was a lot more focus on what the actual mechanics of the product are and how that informs how it’s experienced, rather than on whether Facebook is biased against a particular party or wants someone to win a presidential election. She’s much more focused on system design, and on how making choices about how you distribute content is the same thing as making choices about what content gets distributed.
S1: This was also personal for Haugen. She had a close friend who was essentially radicalized online, viewing more and more incendiary content. Haugen talked with her mother, an Episcopal priest, about her options. Eventually, she collected a series of internal documents from a central employee hub and gave them to Jeff. The documents showed that Facebook knew its platform was getting angrier, that Instagram was often harmful to teenagers, that efforts to keep human traffickers and drug cartels off the platform were failing, and that Facebook’s attempts to stop vaccine misinformation weren’t working. And all of it alarmed Frances Haugen. We spent a lot of time working on these stories, and then you’ve been a part of this rollout of her public identity. And I guess I wonder if you could describe her for me, her demeanor, how she communicates. Because, you know, America got to see her on TV and in these hearings, but I’d love your impressions of her.
S3: Yeah, I mean, look, the woman who was testifying on Tuesday, that’s pretty much her. She speaks in very long, thoughtful sentences. She’s pretty strategic, not very off the cuff. There are generally pauses before she answers questions. And she understands this stuff pretty well. I’ve had people asking me questions like, well, what’s her feeling on antitrust or Section 230, or who coached her on that? And actually, pretty much everything she said in that hearing was either what she innately understood about the platform or stuff she picked up in the course of her own investigations.
S4: I joined Facebook because I think Facebook has the potential to bring out the best in us. I believe in the potential of Facebook. We can have social media we enjoy, that connects us, without tearing our democracy apart, putting our children in danger, and sowing ethnic violence around the world. We can.
S1: I have been surprised, in your reporting and in the congressional testimony, to hear Frances Haugen say, I don’t hate Facebook. I love Facebook. I want to save it. I’m not sure I would have gone through all of that, carefully looked at these documents, thought about federal whistleblower protection, and still love the place. What does she love about it?
S3: I mean, I don’t think she loves the product in its current form. I think she considers it to be a threat to democracy and human life.
S1: It’s hard to love something in that way.
S3: Right, exactly. But in terms of the general idea that this technology doesn’t have to be this way, and that a company committed to Facebook’s stated mission of connecting the world and bringing people closer together is a possible thing, yeah, she’s very much a believer in that. She’s talked about how coming at the company from a place of vengeance isn’t something that’s likely to lead to productive solutions. It’s going to make people defensive. It’s going to prevent the sharing of information, and prevent people understanding where the legitimately hard decisions are in this. You know, if it were just that Mark Zuckerberg was trading human lives for money, right? That isn’t a helpful frame to think about this, it’s not the way Mark Zuckerberg or anyone thinks, and it isn’t likely to lead to solutions. So, you know, she told me early on in this that if people just ended up being angrier at Facebook as a result of what she’d done, it was kind of a waste.
S1: When we come back: what Haugen wants to see happen. You’re listening to What Next: TBD. I’m Lizzie O’Leary, and I’m talking with Jeff Horwitz from The Wall Street Journal. During Tuesday’s hearing, the senators seemed to understand what Haugen was warning them about. There was a moment when the senators actually seemed quite prepared, and maybe I can say, as a former congressional reporter, surprisingly with it, in terms of understanding what Frances Haugen wanted to zero in on. And I’m talking about engagement, and engagement-based ranking. I wonder if you could describe what engagement-based ranking is and how it works.
S3: OK, so we should go back to old Facebook, right, where people would post things, they would appear in a News Feed in chronological order, and you would look at them and scroll past the things you didn’t like and click on the things you did. And Facebook quite reasonably decided that that was not the optimal way of surfacing what people wanted to see, and that perhaps they could use signals like: did a whole bunch of people like something? If so, maybe you’ll like it. And then they could refine that to: did a whole bunch of people who behave like you do like something? If so, maybe you’ll like it, right? And so they made these changes that were supposed to make the whole system more engaging, things that were going to make you like, click on, and reshare content. And there’s nothing about that that seems inherently problematic on its face. You’re just giving people what they would prefer to see, based on an algorithm’s best guess. And they didn’t really think about what some of the ramifications of that were.
S1: It turns out that certain kinds of content, sensationalistic clickbait, anything shocking, did really well, and that began another cycle of clicky, outrageous content, over and over.
S3: So they basically realized, long after building a system that was meant to maximize engagement, that one of the most effective tactics to create engagement was to get people angry. And publishers realized that, too. If you’re trying to build an audience, there is no surer way to do it than posting things that make people outraged, over and over and over again. And whether it’s true, or overstated, isn’t really the point. It works.
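A minimal sketch may help make the distinction Jeff is drawing concrete. This is not Facebook’s actual algorithm, which is proprietary and far more complex; it is a toy illustration of the difference between sorting a feed by time and sorting it by an engagement score, with the `Post` fields and score weights entirely hypothetical. The point of the sketch is that once the sort key becomes predicted interaction, an old but outrage-driven post can outrank everything recent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Post:
    author: str
    timestamp: int   # hypothetical: seconds since epoch; higher = newer
    likes: int
    comments: int
    reshares: int

def chronological_feed(posts: List[Post]) -> List[Post]:
    """Old-style feed: newest posts first, no engagement weighting."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_score(post: Post) -> int:
    # Hypothetical weights: comments and reshares count for more than
    # likes, on the theory that they predict further interaction.
    return post.likes + 5 * post.comments + 10 * post.reshares

def engagement_feed(posts: List[Post]) -> List[Post]:
    """Engagement-based feed: posts predicted to drive interaction rise."""
    return sorted(posts, key=engagement_score, reverse=True)

if __name__ == "__main__":
    posts = [
        Post("alice", timestamp=200, likes=2, comments=0, reshares=0),
        # Older post, but heavily commented and reshared (e.g. outrage bait).
        Post("bob", timestamp=100, likes=10, comments=4, reshares=8),
    ]
    print([p.author for p in chronological_feed(posts)])  # ['alice', 'bob']
    print([p.author for p in engagement_feed(posts)])     # ['bob', 'alice']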
S1: I think if you are someone who has followed Facebook for a while, or reports on Facebook, you know this. But at this hearing, it felt to me like lawmakers, and also the public, were getting it really for the first time. Why do you think it broke through?
S3: Honestly, I think we all kind of have an intuitive sense of it. I think hearing it explained well is another matter.
S4: Facebook is going to say, you don’t want to give up engagement-based ranking. You’re not going to like Facebook as much if we’re not picking out the content for you. That’s just not true.
S3: Look, this is like a different type of physics, and these are black-box systems. So to some degree, even if we have a general sense that what plays on the internet isn’t always good behavior, I don’t know that it was particularly obvious why that happened or how it came about. It might feel a little basic, but actually just being able to talk through the mechanics does feel to me like it was really important.
S4: And it’s interesting, because those clicks and comments and reshares aren’t even necessarily for your benefit. It’s because they know that other people will produce more content if they get the likes and comments and shares. They prioritize content in your feed so that you will give little hits of dopamine to your friends, so they will create more content.
S1: She seemed to return again and again to this overarching idea that Facebook’s financial success as a company is at odds with the well-being of its users. And again, that feels like something you could understand intuitively, but somehow seeing it written down internally gives it a different tenor. Do you think having the documents matters to all of this?
S3: Absolutely. Yeah. Look, the things that we wrote about in the Facebook Files series, human trafficking on Facebook, whether powerful people have an edge, whether Facebook makes people angry, whether Facebook might have some problems in its COVID response and its handling of anti-vaccine zealots, these are not new topics. The thing that’s new about them is the grounding in the company’s own understanding of itself,
S1: understanding it internally and yet saying different things publicly.
S3: Yeah, that too. I think something that Ms. Haugen is very, very into as an idea is that there needs to be a form of real data access. Outsiders, with accreditation and, you know, appropriate precautions, need to be able to get inside the company in the same way that literally any data scientist at the company can.
S1: I’ve been watching Facebook’s response to Frances Haugen, and it’s sort of multi-pronged. On the one hand, you have company spokespeople saying, well, she didn’t know all that much anyway. Another tack is, well, maybe she stole documents. And then Mark Zuckerberg has a blog post saying, this idea that we prioritize profit over safety and well-being, that’s just not true. What did you make of his response?
S3: It’s very carefully worded. It’s true, they don’t set out to increase the amount of incendiary content, and no one says, let’s boost the incendiary content because it will increase engagement. They don’t ever say that. What they do is simply increase engagement. They figure out how to increase engagement, and they do it by building things that just happen to be better at spreading incendiary content. This is the system they’ve designed, and they’re not really even arguing with that. They’re just calling it a mischaracterization somehow.
S1: Well, let’s talk about some of the possibilities for what comes next. She mentioned raising the age at which kids can be on social media. And then, I think, the big one: simplifying the News Feed to be chronological, the way it used to be, rather than engagement-based. I wonder, number one, is that possible? And number two, would that really make a difference? Because there was all sorts of problematic content on the site back when it was chronologically based.
S3: Absolutely. There are problems with people, and there are going to be problems on any social network used by people. And I don’t know that saying, well, there were problems back before the internet, slash back before Facebook, slash before algorithmic ranking, really answers the question. I think it’s a question of proportion and of what the effects are, right? Because there will always be bad stuff getting posted to the internet, and that’s OK. The internet turns out to be pretty resilient. The question is whether bad stuff and bad people are able to harness systems meant for the distribution of high-engagement content. And, you know, I don’t know that it has to be a full, straight chronological feed ranking, but I definitely would not underestimate Ms. Haugen’s intelligence or understanding of these things. Directionally, I think it’s at least a great place to start the discussion.
S1: There’s also a proposal by a Stanford law professor to make the kind of internal research that you all reported on available for outside researchers. And that is a whole different degree of openness. What would that do?
S3: That would be extremely helpful. It would allow policymakers and academics to actually be thinking about this stuff in a way that could be actionable. Because right now, I mean, literally a data science intern at Facebook has the ability to run experiments that the world’s leading research institutions don’t. And that’s a problem. Right now, all the information is behind a wall, and everyone outside is just guessing about what they think. Right? Even if you do, quote unquote, know that Instagram isn’t good for teenage girls, which I think a lot of people have said they knew, if you can’t prove it, you can’t test it, and you can’t look at possible solutions, that knowledge isn’t worth much.
S1: Jeff Horwitz, thank you very much. Thank you. Jeff Horwitz reports on Facebook for The Wall Street Journal. One quick note before we go: a little mistake in yesterday’s show. Mary referred to Janet Yellen as the Fed chair. That was her previous job; she is the Treasury secretary. That is it for us today. TBD is produced by Ethan Brooks and edited by Torie Bosch and Allison Benedikt. Alicia Montgomery is the executive producer for Slate podcasts. TBD is part of the larger What Next family, and we’re also part of Future Tense, a partnership with Slate, Arizona State University, and New America. And I have to recommend that you listen to Wednesday’s episode of What Next. Mary spoke with Ed Yong from The Atlantic, who is, in my opinion, the best journalist writing about the pandemic. They talk about whether we’re prepared for the next one. Spoiler alert: no. What Next: TBD will be back next week. I’m Lizzie O’Leary. Thanks for listening.