Breaking Up Facebook Won’t Fix Its Speech Problems

There might be good reasons to separate Facebook, Instagram, and WhatsApp. Speech isn’t one of them.

Facebook CEO Mark Zuckerberg speaks during the F8 Facebook developers conference on April 30 in San Jose; Chris Hughes speaks at Harvard University on Feb. 5, 2013. Photo illustration by Slate. Photos by Justin Sullivan/Getty Images, Jonathan Wiggs/The Boston Globe via Getty Images.

On Thursday, in an eloquent and reflective New York Times op-ed, Facebook co-founder Chris Hughes added his voice to the growing chorus calling for the social network to be broken up.

Most arguments for antitrust action against Facebook focus on its data collection, privacy practices, and effects on innovation, but Hughes emphasizes the unilateral power that Mark Zuckerberg has over 2 billion people’s speech. He calls it “the most problematic aspect of Facebook’s power.” After describing the awesome power that Facebook, and Zuckerberg in particular, has to “monitor, organize and even censor” the world’s conversations, Hughes calls for the government to use its antitrust tools to break up the platform so that competition and market-based accountability can do their work, apparently sure that people would choose a healthier form of discourse if only they had the option on another platform.

We may, as a society, decide that the lack of competition and the invasions of privacy make breaking up big tech worth it. But it’s unlikely that such an approach would solve the speech-related issues. In some cases, it may actually make them worse. Hughes appears to fall prey to what’s known as the “streetlight effect”: the tendency to search for answers where it’s easiest to look, named after an old joke about a drunk man looking for his keys under a streetlight not because he lost them there but because that’s where the light is. In this moment of techlash, it’s easy to feel like we’re stumbling around in the dark in the face of the unprecedented rise of the private power of a handful of tech giants over so much of our public discourse. But we need to be specific about the problems we’re confronting before we can be confident that they can be solved (or at least mitigated) by making the companies themselves smaller.

Take Hughes’ statement that “unlike with pipes and electricity, there is no good argument that the country benefits from having only one dominant social networking company.” Isn’t there? Sitting here at Harvard (where, the founders are always so keen to remind us, it all started), it’s easy to feel that the world is small and I don’t need Facebook to stay connected to it. After all, I’m lucky to find that so much of it comes to me. But that’s not the case everywhere or for everyone. (It certainly felt different when I lived in Australia, which a former prime minister once called the “arse end of the world.”) It is easier for people in remote places, or with niche interests, to find others they have something in common with when everyone congregates in one place. And if “bigness” is your real concern, breaking platforms up shouldn’t be shorthand for introducing competition in the hope that some “better” platform will become that one place. So the question is, what are we really worried about?

If we’re worried about Russia or other foreign powers using Facebook and other social media platforms to influence elections—as Sen. Elizabeth Warren mentions when making her case for breakup—then we also need to acknowledge that these influence campaigns are often sophisticated, cross-platform operations that require well-resourced, expert, and coordinated responses. No one has made a good case for how breakup and competition would aid these efforts.

If we’re worried about echo chambers, where social media leads people to self-select into homogeneous communities and avoid exposure to differing views, then Hughes is right to fear that “more competition in social networking might lead to a conservative Facebook and a liberal one.” It’s not clear to me that a better world is one where we have partisan social media, or even age-segregated or regionally segregated ones for that matter.

If we’re worried about the coarsening of public discourse that comes from an attention economy “optimized for engagement,” then heightened competition in that economy might be adding fuel to the fire.

If we’re worried about the spread of misinformation or hate speech, and hope that fact-checking or counterspeech can offer remedies, then it helps to have a more unified public sphere in which to spread those correctives. The relentless pace and high costs of content moderation mean that even today’s well-resourced platforms are failing to act quickly enough to stop misinformation and hate. Are we sure several smaller, possibly Balkanized platforms would do a better job?

If we’re worried—and we absolutely should be—about Facebook’s completely inexcusable failure to respond quickly or adequately to the use of its platform to spread hate speech during an ongoing genocide in Myanmar, it’s not clear that a regional or smaller platform would inevitably do better. Keep in mind that a lot of the inflammatory posts were purposefully spread by members of the Myanmar military, which holds considerable power in the country.

Which brings me to another worry—the need for platforms to push back on authoritarian demands (or even democratic government demands) to censor speech. My confidence that platforms do this now isn’t strong, and I’m deeply concerned about what that means for oppressed peoples. But unlike Hughes, I don’t have faith that competition and market-based accountability will solve it. The amorality of profit has not been a powerful defender of human rights around the world. As David Kaye, the U.N. special rapporteur on freedom of opinion and expression, observed, Hughes doesn’t tell us how breaking up Facebook solves speech-policing issues globally.

In short, there’s a lot to be worried about. I’m worried! And none of this is to argue that Facebook shouldn’t be broken up. But we can’t have this conversation properly if we don’t start by acknowledging that there are complicated stakes here that need balancing. It may well be that splitting Instagram and WhatsApp off from Facebook is a good policy response to other problems, but it will not solve the most intractable issues of how to moderate speech on any individual platform.

Which is why I’m more on board with Hughes’ other recommendation: the need for greater government and democratic oversight of social media content-moderation practices. Hughes calls for the creation of a government agency to “create guidelines for acceptable speech on social media,” and I’m not sure I’d go that far. But like him, I’m concerned about the unilateral, unaccountable, and ad hoc way that Mark Zuckerberg can dictate speech rules for so many people. If I lack faith in market-based accountability, I am more optimistic about democratic and regulatory accountability and oversight. Done right, it can bring transparency and independence to the difficult task of deciding what speech is and isn’t allowed in these spaces that facilitate so much important public discourse, and what algorithmic or other tweaks are made to how that speech spreads. Both are currently sorely lacking in these decisions at Facebook (and in Silicon Valley more broadly).

But this will encounter resistance of a different kind from those who think that government involvement in these decisions would be more “un-American” than Hughes says Mark Zuckerberg’s unprecedented power is. First Amendment doctrine is deeply shaped by a distrust of government motives in speech regulation. As a non-American, my fear of government does not run so deep that I’d resign myself to hoping the market will fix these pressing problems. But others may not be so easily convinced. Perhaps it doesn’t matter—other countries seem ready to step into the vacuum left by U.S. inaction. But as the home jurisdiction of many of these companies, the U.S. has greater power to compel changes in company behavior and create enforceable oversight mechanisms. An “international grand committee” of lawmakers from nine countries could not get Zuckerberg to appear for hearings in the U.K. last year, but the U.S. Senate at least managed to make him show up (even if what occurred cannot really be described as meaningful oversight).

Hughes declares that “an era of accountability for Facebook and other monopolies may be beginning.” Put me down as one of the most hopeful that this is true. But when it comes to the need to constrain the power platforms have over speech, I’m not convinced market-based accountability is the answer. I’m going to keep fumbling around in the dark for solutions to these genuinely difficult questions, rather than join the party near the streetlight.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.