On Sunday, the former employee who’s made the past several weeks one big migraine for Facebook finally revealed her identity on 60 Minutes. Frances Haugen, who has filed federal complaints against the company, provided damaging internal documents to the Wall Street Journal for a blockbuster series last month, and is scheduled to testify before Congress, is a product manager who began working at Facebook in 2019. Haugen was part of the company’s civic integrity team, which focused on misinformation and interference surrounding elections, and left the company in May. She’s originally from Iowa and previously worked on ranking algorithms, which attempt to serve users the content they’re most likely to want, at Pinterest, Yelp, and Google. In her 60 Minutes interview, she claimed that the problems she saw at Facebook were worse than at the other social media companies where she had worked.
In complaints filed with authorities last month, Haugen alleged that Facebook has been misleading the public and its investors about the efficacy of its initiatives to moderate misinformation and hate on its platforms. Drawing on her leaks of internal documents, the explosive Wall Street Journal series revealed that the company’s platforms exacerbate body image issues among teenage girls, allow some elite users to skirt the rules, fall short in cracking down on human traffickers, and have failed to steer conversations away from vaccine hesitancy. It has amounted to Facebook’s biggest scandal in years, partly because it touches so many different aspects of the company’s operations. While many of these issues have been in the public eye for a while now, the documents reveal the extent to which the company itself realizes it’s causing harm in the world.
The thrust of Haugen’s argument on 60 Minutes was that Facebook consistently “prioritized growth over safety,” another long-standing criticism. She highlighted that Facebook gets far more engagement from content that inspires anger, which spurred political parties in Europe to warn the company that they felt compelled to shift their policy positions in order to gain traction on social media. Research she provided to 60 Minutes further showed that the company believes it acts on only 3 to 5 percent of hateful content on the platform. Facebook did try to reorient its algorithms around friends and family in 2018, but the documents Haugen publicized showed that the initiative backfired, making the platform more divisive rather than healthier.
Haugen told the Journal that she agreed to join Facebook’s 200-person civic integrity team because one of her friends had been consumed by a rabbit hole of misinformation and white supremacist ideas online. She found, though, that her team had inadequate resources for formidable projects, like building systems to detect misinformation targeted at particular communities. Facebook dissolved the team after the 2020 election but before the Capitol riot. (Facebook claims that it reassigned the team’s functions to various parts of the company.) Haugen decided to leave the company in April and spent the last month of her tenure rifling through Facebook Workplace, an internal social network, accessible to all employees, that contains thousands of sensitive documents. She gathered an enormous cache of documents and reached out to a nonprofit called Whistleblower Aid for legal representation. Haugen is set to testify before Congress on Tuesday and is seeking whistleblower protection from the Securities and Exchange Commission, which would shield her in the event that Facebook retaliates by accusing her of stealing company property.
Haugen says she leaked this information to improve Facebook, not to destroy or harm it. She is notably opposed to breaking up the company or changing Section 230 of the Communications Decency Act, a foundational internet law, to reduce the social network’s liability protections. Facebook said in a statement responding to the 60 Minutes report, “Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and do nothing is just not true.” Thanks to Haugen, that almost certainly won’t be the last time this week that Facebook has to explain itself.