Mark Zuckerberg thinks Facebook users are stupid. At least that’s what I’ve concluded from reading a post on his New Year’s “challenge,” in which he resolves to “host a series of public discussions about the future of technology in society.” The challenge itself didn’t force my conclusion, though it seems to be another in a long line of PR stunts meant to make users think Facebook is addressing its missteps. It was one particular question on Zuckerberg’s list that crystallized something for me: Facebook and its CEO are dishonest salesmen, and we’re the uninformed buyers they’re trying to dupe.
“Do we want technology to keep giving more people a voice, or will traditional gatekeepers control what ideas can be expressed?” Zuckerberg asks, as if the greatest challenge to free speech in our lifetime can be boiled down to a simple binary choice. The framing itself is misleading, painting “technology” as an unadulterated societal good that sprinkles people with First Amendment fairy dust at each login.
I believe in technology’s capacity to give voice to the voiceless. I started my career supporting democracy activists in Eastern Europe, where many inspiring opposition movements began online. But just as it amplifies, technology has the capacity to silence. Sometimes Facebook even aids in this silencing, acceding to authoritarian governments’ demands to block content deemed politically sensitive. The company has also been far too slow to recognize civil rights abuses on the platform, including the proliferation of hate speech, voter-suppression efforts, and discriminatory ad targeting. While it began a civil rights audit in 2018, it undertook a parallel investigation into so-called anti-conservative bias led by a former Republican senator and a conservative think tank.
These investigations, like Zuckerberg’s poorly phrased question, present a false dichotomy. Facebook wants us to believe that protecting the speech of marginalized groups and protecting that of conservatives are separate issues, when at their core, both come down to the equal and transparent enforcement of the platform’s community standards, complete with a planned appeals process. Right now, that enforcement is unbalanced, allowing serial abusers (like, until recently, Infowars’ Alex Jones) to use the platform to spread hate, while their targets are deplatformed by Facebook’s A.I. for minor offenses like using curse words. Without transparent enforcement—as anyone who has been a target of online abuse can attest—radical free speech will actually silence those it is meant to empower.
The second half of Zuckerberg’s question, regarding “traditional gatekeepers,” reveals an ironic disdain for the group, considering that Facebook is probably the world’s largest media gatekeeper. Here, Zuckerberg is capitalizing on the fact that most Facebook users don’t realize their news feed is not an organic compilation of their friends’ musings and trustworthy news. A recent report on news literacy released by Oxford University’s Reuters Institute for the Study of Journalism found that 40 percent of people don’t know how the “individual decisions about what news stories are shown to people on Facebook” are made. Another 23 percent believe that those decisions are made by journalists at news outlets or at Facebook. Only 29 percent know the truth: that the news feed is populated by an algorithm that decides what users might like to see—that is, what will keep users on the platform, clicking on revenue-generating ads—based on their past behavior and interests. That breakdown is even more stark among users over the age of 65, only 18 percent of whom understand how their news feed operates.
Facebook is already controlling what information we consume, but most users don’t realize it. The framing of the question makes clear that Zuckerberg would like to keep it that way. He wants users to believe that their Facebook feed is a choose-your-own-adventure book with infinite endings, when in reality it’s a horror film in which the user is being injected with a carefully crafted drug cocktail meant to maximally distract and debilitate. Deliberate misdirection is a hallmark of the Facebook experience. Of course it would manifest in its CEO’s New Year’s challenge.
To be clear, I do not reject the premise of these discussions outright. Silicon Valley’s idealist bubble deserves a few more needle pricks, and America needs a reminder of what civil discourse looks like. But if Zuckerberg and his company are to experience the personal and professional growth that he claims to seek, the challenge should start by ensuring the framing of the debate is as transparent, truthful, and accessible as possible. Rather than “Do you want a free Facebook, or do you want Big Bad Government and the Mainstream Media to interfere in your First Amendment Rights?” (because that’s essentially what the question says), he might ask: To what extent and in what areas should Facebook conduct content moderation? How can we better encourage discussion and free expression while protecting marginalized voices? How can we better educate users about how our service works so they can be more discerning consumers and not fall victim to disinformation campaigns?
In answering these questions, I hope that Zuckerberg turns not only to the “leaders, experts, and people in our community from different fields” outlined in his post, but to people outside of the Silicon Valley bubble and people who have been hurt by his company’s negligence, especially minorities. Rather than putting out disinformation about critics, as the company did when George Soros voiced his disapproval of the platform, Facebook should invite them to debate.
If Facebook truly aims to connect people and support democratic discourse and not simply turn a profit, it should start by changing its relationship with its users, not only in its approach to these debates but in its daily interactions. Empower users. Give them the unadulterated information they need to make informed decisions about their news consumption and their data. Communicate with them not like a salesman trying to push a bad deal but like someone trying to help a friend through a difficult decision. Then this will be a challenge worth taking. I’m ready to talk; are you, Zuck?
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.