A boomtown in the middle of a gold rush, Reddit attracts a reported 163 million monthly users. Like most boomtowns, it’s long been a virtually lawless place. Its official list of rules has always been spare and brief, less a code of law than a hastily scribbled note nailed to a hitching post. But there’s a new sheriff in town, and he’s looking to change things. The trouble is that they may not be changing for the better. To the contrary, they’re probably about to get much, much worse.
Steve Huffman, Reddit’s new CEO, is not new to the role of digital frontier lawman. A co-founder of the site, he policed it in its early years. Waxing nostalgic about those days while discussing policy changes in an “Ask Me Anything” post on Thursday, Huffman wrote, “Occasionally someone would start spewing hate, and I would ban them. The community rarely questioned me.”
As the site grew larger, such interventions became less practical. Huffman says that he, his successors, and their deputies resorted to expulsions only in the face of “painful controversies.” Though he fails to specify whom these controversies pained (the site’s users? its reputation? beloved celebrities?), Huffman acknowledges that the company’s response to them demonstrated “inconsistent reasoning” and led to “no real change in policy.”
The results of that consistent inconsistency should be familiar to anyone who follows the site. Its users determine almost everything about its day-to-day appearance, creating its topic-based subreddit communities and voting on what should rise to the top of the trash pile. Even as the company has banned revenge porn and particularly awful subreddits, its long-standing commitment to what Huffman describes as “unfettered free speech” has continued to make it an ugly place, especially if you poke around in the wrong corners.
In the hopes of improving the site he helped create, a site he obviously loves, Huffman laid out a clearer set of new rules designed to preserve and advance the site’s status as “a place to have open and authentic discussions.” In doing so, he outlined a list of newly prohibited types of content, including incitements to “harm or violence against an individual or group,” overt harassment and bullying, and illegal materials such as copyrighted content (not discussions of illegal activities, as some publications initially reported). More importantly, though, he wrote that other forms of content—things not quite icky enough to be banned but bad enough to raise hackles—would be identified with a label, much as the site currently flags some materials as not safe for work. Once this new system goes into effect, posts and communities tagged in this way will be visible only to users who have both logged into the site and opted into seeing objectionable materials.
Some of those who reported on Huffman’s remarks were rightly skeptical. Noah Kulwin of Re/code described them as “a digital band-aid on what is a much, much larger problem.” And BuzzFeed’s Charlie Warzel wrote that Huffman was merely continuing “in the tradition of past Reddit leadership, offering yet another declaration of values and ideals—an ideological framework from which the company can operate.” But the changes don’t merely fail to fix things. They may actually intensify the problems with the site. Here are four reasons to worry about Huffman’s proposals.
First, many of the prohibited forms of content have always been off limits on Reddit. As Warzel notes, the new guidelines “are far more explicit” than ever before, but there are few substantial changes. It’s unclear, then, whether this is a real step forward or mere hand-waving.
Among other things, Reddit will continue to ban “Publication of someone’s private and confidential information,” an activity popularly known as doxxing. While protecting the privacy of users online is a critical step toward preventing bullying and harassment, Reddit’s commitment to this policy actually supported one of its most vile users on one occasion. When Gawker revealed the identity of Michael Brutsch (known on Reddit as Violentacrez), an especially active troll, the site’s volunteer moderators rose up in his defense, citing the ban on doxxing—even though the actual publication of Brutsch’s details took place off Reddit. Though it may be an outlier, the Violentacrez case demonstrates that the site’s users have an idiosyncratic relationship to its rules.
Second, many of the rules—especially those that are actually new—remain vague. When Huffman explains that adult content will have to be tagged NSFW (which has been the case for years), he declines to explain what “adult” means, instead falling back on the old Justice Potter Stewart saw: that you know pornography when you see it. Similarly, the only guideline for determining which communities will be flagged for separation from the rest of the site is whether they violate “a common sense of decency.”
Here, Huffman employs the language of empty moralism. He claims that sections of the site that actively promote harm to others, such as /r/rapingwomen, will be banned, but those that are merely racist or otherwise offensive, such as /r/coontown, will merely be reclassified and separated from the rest of the site. Beyond these examples, he offers no explicit framework for what makes a community indecent, apart from writing that they will be those that “I and many others find indecent.”
Without a more clearly developed account of what does and does not offend our ostensibly “common sense of decency,” this policy is more likely to create chaos than to quell it. Indeed, the site’s moderators are already worrying about how these changes will affect their own communities.
Third, while quarantining offensive subreddits may be a step in the right direction, it is too weak a change to make a real difference. This new rule closely resembles one proposed in Slate by David Auerbach earlier this week, but with one critical difference. Auerbach suggested tagging subreddits that promulgate hate but went further, arguing that users who post to or moderate these subreddits should also be tagged. This means that as long as they continued to participate in such communities, they would be identifiable as hate speech supporters, even when they go elsewhere on the site. Huffman’s approach sections off troublesome communities but does nothing to prevent their members from spewing the hate incubated within them elsewhere on the site.
Fourth, Reddit doesn’t just need rules; it also needs tools to enforce them. As Auerbach explained in an earlier article, Reddit’s recent user rebellion was a response from the site’s volunteer moderators, who felt that they hadn’t received adequate support from administrators. Huffman admits that new features are necessary but makes little effort to describe what such tools will entail. In one response to user questions, he claims that he and his team are keeping their “tactics close to our chest for now.”
Tactics, not rules, are exactly what Reddit needs. Huffman’s openness and willingness to engage with his site’s users are admirable, as is his clear desire to elevate the site. But if he’s going to play sheriff, it’s not enough to simply put on his badge. He’ll also need to keep a ready hand on his holster.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page.