On Tuesday, Facebook employee-turned-whistleblower Frances Haugen testified before the Senate Commerce Committee about the tens of thousands of internal documents she leaked to the press, federal regulators, and Congress about the company’s inner workings. Her main argument during the hearing, one that has become something of a mantra throughout her various public appearances, is that Facebook “puts profits before people.” That is, the company’s incentives to continually grow its user base and increase engagement are at odds with measures that would make its platforms safer and less hostile.
In what may be the most important exchange of the hearing, Haugen illustrated how this worrisome dynamic plays out in news feeds. Sen. John Thune, a South Dakota Republican, asked Haugen, “Could you talk more about why engagement-based ranking is dangerous, and do you think Congress should seek to pass legislation like the Filter Bubble Transparency Act that would give users the ability to avoid engagement-based ranking altogether?” Engagement-based ranking orders news feeds based on the extent to which users are interacting with certain posts, whether by sharing or liking or commenting.
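To make that definition concrete, here is a minimal sketch in Python of what "ordering a feed by engagement" means. Everything here is invented for illustration: the field names, the weights, and the scoring function are hypothetical stand-ins, and Facebook's actual ranking system uses far more signals and learned models.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int      # e.g. seconds since some epoch; larger = newer
    likes: int = 0
    comments: int = 0
    reshares: int = 0

def engagement_score(post: Post) -> float:
    # Invented weights: reshares and comments count for more than likes,
    # mirroring the idea that interaction signals drive ranking.
    return post.likes + 2 * post.comments + 3 * post.reshares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Engagement-based ranking: the most-interacted-with posts surface
    # first, regardless of when they were published.
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    Post("a", "old viral post", timestamp=1, likes=100, comments=10, reshares=5),
    Post("b", "fresh quiet post", timestamp=100),
]
feed = rank_feed(posts)
```

Note that the older but heavily shared post outranks the newer, quieter one; that inversion of recency in favor of interaction is the dynamic Haugen's testimony is about.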
Here’s Haugen’s full answer:
Facebook is going to say, “You don’t want to give up engagement-based ranking. You’re not going to like Facebook as much if we’re not picking out the content for you.” That’s just not true. Facebook likes to present things as false choices, like you have to choose between having lots of spam. Let’s say, imagine we ordered our feeds by time, like on iMessage. There are other forms of social media that are chronologically based. They’re going to say, “You’re going to get spammed. You’re not going to enjoy your feed.” The reality is that those experiences have a lot of permutations. There are ways that we can make those experiences where computers don’t regulate what we see, we together socially regulate what we see. But they don’t want us to have that conversation, because Facebook knows that when they pick out the content that we focus on using computers, we spend more time on their platform, they make more money. The dangers of engagement-based ranking are that Facebook knows that content that elicits an extreme reaction from you is more likely to get a click, a comment, a reshare. And it’s interesting because those clicks and comments and reshares aren’t even necessarily for your benefit. It’s because they know that other people will produce more content if they get the likes and comments and reshares. They prioritize content in your feed, so you will give little hits of dopamine to your friends, so they will create more content. And they have run experiments on people—producer-side experiments—where they have confirmed this.
According to Haugen, the internal research indicates that content eliciting an extreme, often angry reaction from users is more likely to get clicks, and Facebook’s algorithms promote that clicky content. This creates a cycle in which producers of such content are incentivized to put out ever more divisive posts in order to get that engagement and thus rank higher on news feeds. One report Haugen leaked found that even an algorithm change in 2018, which the company claimed would promote more “friends and family” content, actually exacerbated this dynamic.
Haugen contends that while engagement-based ranking hurts society at large, it makes Facebook more profitable by gluing users to the site. A healthier alternative, she argues, would be to order news feeds chronologically. She also anticipates one of Facebook’s counterarguments to chronologically ordered feeds: that users would be inundated with spam. Haugen characterized this as a false choice, since there are ways to demote spam content while still maintaining chronology.
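The "false choice" Haugen describes can be sketched in a few lines: filtering spam and ordering by time are independent steps, so a chronological feed does not have to surface spam. This is an illustrative sketch only; the post records, field names, and spam threshold are all invented, and real spam detection is of course far more involved.

```python
# Hypothetical minimal post records; "spam_score" stands in for the
# output of some separate spam classifier (not modeled here).
posts = [
    {"text": "BUY FOLLOWERS NOW!!!", "ts": 1000, "spam_score": 0.97},
    {"text": "Photos from the family reunion", "ts": 900, "spam_score": 0.01},
    {"text": "Neighborhood potluck this weekend", "ts": 800, "spam_score": 0.05},
]

def chronological_feed(posts, spam_threshold=0.9):
    # Step 1: drop (or demote) likely spam using the classifier's score.
    kept = [p for p in posts if p["spam_score"] < spam_threshold]
    # Step 2: order strictly by recency, newest first. No engagement
    # signal is consulted at any point.
    return sorted(kept, key=lambda p: p["ts"], reverse=True)

feed = chronological_feed(posts)
```

The point of the sketch is simply that the spam decision happens before, and independently of, the ordering decision, which is why "chronological feed" and "spam-free feed" are not mutually exclusive.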
The dynamic Haugen described illustrates how Facebook has built a social media ecosystem in which its financial success is often at odds with the well-being of its users—or, at the very least, in which the company focuses on mitigating the problems created by its powerful ranking algorithms rather than meaningfully changing the algorithms themselves. It touches on the various dysfunctions she has brought to light, like divisive content and misinformation, both of which tend to produce a lot of engagement. Engagement-based ranking, she says, even exposes young users to content promoting anorexic behavior. And in one of the more striking accusations Haugen leveled during the hearing, she said of such algorithms that “in places like Ethiopia, it’s literally fanning ethnic violence.”