Future Tense

Well-Intentioned Section 230 Reform Could Entrench the Power of Big Tech

Twitter CEO Jack Dorsey testifies remotely during a Senate Judiciary Committee hearing on Capitol Hill on Nov. 17. Pool/Getty Images

In hearing after hearing over the past eight months, the CEOs of large tech platforms have repeatedly defended their products against now-familiar accusations from members of Congress: A shocking Facebook group should have been removed and wasn’t, an important tweet was unjustly deleted, or a horrific video was viewed by millions on YouTube before it was taken down.

Across both parties, the message is clear: Tech platforms have too much power to regulate speech, and Congress should curb this power by reforming Section 230 of the Communications Decency Act. Dozens of members of Congress have introduced bills that would do just that. According to Slate’s Section 230 Legislative Tracker, 12 bills were introduced in the last four months of 2020 alone, and several more have been introduced in 2021, with others inevitably coming soon.


But despite legislators’ goal of taking a bite out of Big Tech, most reform proposals would concentrate even more power in the hands of large technology platforms.

First, several proposed reforms would shift power away from individual users by increasing platform censorship. Currently, websites can use Section 230 as a defense against liability for the user speech they host, and they can raise that defense early in the litigation process, before they are forced to bear the expensive burdens of discovery. Any Section 230 reform that increases these legal costs will push platforms to censor the user content that creates them.

A number of proposed reforms would have this effect. Former President Donald Trump, President Joe Biden, and several members of Congress have endorsed full repeal of Section 230, which would create massive new legal liabilities for any platform that hosts user-generated content. But even more modest reforms could have the same effect. Facebook recently proposed that platforms should have to earn Section 230 protections by abiding by certain “industry best practices,” such as having in place “robust practices for identifying illegal content and quickly removing it.” To prove that they have earned the protections, platforms would need to spend time and money making their case in court.


Second, several proposals would introduce barriers to entry by raising compliance costs for startups trying to compete with tech companies that have much deeper pockets. For smaller platforms, mandated appeals channels and transparency reports could be prohibitively burdensome, diverting employees from engineering into compliance work and making it even harder to innovate and compete. It shouldn’t come as a surprise that a coalition of smaller platforms including eBay, Etsy, and Reddit warned that such proposals could “unintentionally harm smaller sites.”

Third, other reforms could stifle competition on product quality. Concerned with “algorithmic amplification of harmful, radicalizing content,” some proposals would limit Section 230 protections when platforms use algorithms to sort content, even though companies like TikTok have used the strength of their algorithms to become a competitive threat to Big Tech. If relying on algorithmic ranking could cause platforms to lose Section 230 protections, TikTok may be less able to use its algorithm as a source of competitive advantage. Similarly, requiring that a platform be “neutral” would make it harder to use differentiated content moderation policies to compete with Big Tech.


Policymakers should be wary of changes that could put even more power in the hands of Big Tech. But that doesn’t mean there are no good ways to expand platform liability. As I argue in an article published last week in the Antitrust Chronicle, the focus should be on proposals that shift power in the opposite direction: away from large tech platforms and toward individual users and governments.


The first step should be to expand platform liability by modernizing federal criminal law for the digital age. Section 230 contains an exemption for federal criminal law, so platforms cannot use it as a defense in a federal criminal case. Congress can use this exemption to address two areas of concern: the use of online tools to suppress voting and to organize riots.


To criminalize online voter suppression, Congress should consider the Deceptive Practices and Voter Intimidation Prevention Act. The bill was first introduced in 2007 by then-Sen. Barack Obama and is included in H.R. 1, which recently passed the House and is awaiting a vote in the Senate (where it seems unlikely to pass). The bill would expand platform liability by creating federal criminal penalties for voter suppression. If passed, it would make it a federal crime to make false statements concerning the “time, place, or manner” of an election, the “qualifications for or restrictions on voter eligibility,” or public endorsements.

Congress should also modernize federal criminal law on incitement to riot, which would address the use of online platforms to organize offline violence like the Jan. 6 attack on the Capitol. Existing law on incitement dates to 1968 and needs to be updated.


In addition to legislative reforms, platforms should work with policymakers to implement product solutions that empower users. For instance, platforms could give people more choice over algorithms, enabling users to toggle between feeds tailored for “political junkies,” “sports fans,” or “pet lovers.” They could also offer people the ability to select from “middleware” third-party algorithms, as my colleague Barak Richman recently proposed.


Additional product changes could make it easier for governments to enforce existing law. Platforms could enable people to report content that violates state laws to the offices of state attorneys general, to report false voting information to election-monitoring organizations, and to report harassment to victim support services.

Finally, policymakers should embark on Section 230 reform with respect for the uncertainties ahead. Despite our best intentions, reforms that aim to curb platform power may instead concentrate power in platforms’ hands, just as some argue that privacy legislation in Europe has entrenched Big Tech.


In the face of this uncertainty, policymakers should approach reform with curiosity, incorporating tools that will enable us to evaluate impact and adapt as needed. For example, Congress and the Federal Trade Commission should make it easier for platforms to share data with researchers so that researchers can assess what’s working and what’s not. Regulators should work with researchers to develop metrics for evaluating user empowerment, barriers to entry, and product quality so that they can monitor progress against these benchmarks.

To ensure that Section 230 reform aimed at constraining the power of Big Tech doesn’t have the opposite effect, we need a new reform agenda focused on empowering users, putting government in the position to set the rules, and institutionalizing a curious approach to monitoring progress.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
