Welcome to Source Notes, a Future Tense column about the internet’s information ecosystem.
In February, Molly White, age 27, received yet another creepy message: “We have three people who are going to be taking a tour [of your apartment] and looking at it, but you will not know who they are because we wont disclose that ahead of time, just know that when they do they will be wearing a hidden camera and we will be sharing the deets.” For White, the threat was one more example of the harassment she has received because she is one of Wikipedia’s most prolific female contributors.
Curbing malicious behavior is one goal of the Wikimedia Foundation’s new “Universal Code of Conduct,” which applies to Wikipedia’s many language editions and related projects. The code, which was released Feb. 2, enshrines concepts such as mutual respect and civility and makes clear that harassment, abuse of power, and content vandalism are unacceptable. According to the foundation, more than 1,500 Wikipedia volunteers representing five continents and 30 languages participated in the creation of the new code.
Wikipedia has not been without conduct rules in the past. The English version of the encyclopedia has a formal civility policy, and the website’s legal terms of use also restrict severely harmful behaviors. But until recently, those policies were not universal across other language editions of Wikipedia, nor were they written in plain language intended to communicate expectations to ordinary users without a law degree.
I have argued previously that more people should contribute to Wikipedia. Volunteering for the site helps curb internet misinformation and makes the project more comprehensive as a shared resource. But editing Wikipedia is not without its costs. Many would-be contributors have no doubt been turned off by the project’s more toxic personalities. Others have been harassed off-wiki by people who are upset about how their biographical page has been written. But if more people felt safe and welcome in contributing to Wikipedia, then the website could at last expand and diversify its editing ranks, and Wikipedia could advance its mission to become the sum of all human knowledge—or, at least, that’s the theory behind the new code.
In the past, the foundation has enforced the terms of use by banning users who violate them. “Beyond that, the Wikimedia Foundation has historically left community governance up to the individual communities, some of which have developed fairly robust policies and systems of enforcement,” said White. For example, volunteer administrators like White—editors who are selected by their peers—can block users who violate the policies against incivility. The English Wikipedia and some other projects also have an Arbitration Committee, or ArbCom, composed of volunteer users, which reviews disputes and has been described as Wikipedia’s “Supreme Court.” That’s not the case for other language editions, like Scots Wikipedia, which has had its own unique problems.
While most of us know what harassment looks like on TikTok or Twitter, it’s perhaps less clear, to those outside of the Wikipedia community, what it looks like in the context of an internet encyclopedia. Wikimedia’s 2018 Gender Equity Report found that 14 percent of the women who were interviewed had experienced poor community health on Wikipedia, reporting issues ranging from a general lack of support to harassment. “I have had porn posted to my userpage,” said one interviewee. Another interviewee reported that there was a lot of “aggression when discussing [Wikipedia] biographies of women.”
Meg Wacha, the president of the Wikimedia New York City regional chapter, told me that the overwhelming majority of the editors they have interacted with have been friendly and supportive. “We enjoy spending time together and enjoy doing this work. For the most part, those gatherings are really joyful experiences,” Wacha said, adding that in prior years, Wikipedia community members have hosted incredible dance parties. Then again, the Wikimedia New York City organization has occasionally encountered examples of highly abusive behavior, such as violations of a user’s personal space or user space (for example, turning an editor’s personal userpage into a wall of shame) and doxing (revealing personal information about an editor).
White’s story of contributing to Wikipedia unfortunately includes several incidents of doxing. She started editing the site in 2008, when she was 13, and for years she only went by the username of GorillaWarfare. She thinks that most Wikipedia editors assumed she was male, partly because her username gives that impression and also because an estimated 90 percent of the site’s contributors are men, according to survey data. Then, in 2011, White agreed to allow the Wikimedia Foundation to use her photo in a fundraising campaign that featured appeals for donations from various Wikipedians and foundation employees. “I was more naïve then and didn’t expect what I now see as a bit of a given, which is that some people zeroed in on the photograph of the young woman,” White said in an email.
Since then, White’s private Facebook account has been posted to 4chan and Encyclopedia Dramatica, a trolling site known as Wikipedia’s evil twin. Strangers have contacted her employer to complain about her contributions to Wikipedia. She regularly receives harassing emails, including threats of violence (often extremely graphic or sexual), and people have dug into her relatives’ Facebook pages, which have fewer privacy settings than hers. Wikimedia Foundation CEO Katherine Maher discussed White’s harassment in detail in a 2016 speech, and White gave permission to use her name for this article. “Some time not long after the doxing I began to be open about my name, gender, and city,” White wrote. “I found that to some extent, if you tell people the basic stuff, they don’t always try as hard to dig for the information I do care about keeping private.” White said her strategy worked pretty well up until around 2018, when she began editing pages on more contentious topics like incels.
Most of White’s harassment has taken place off Wikipedia and on other sites. But other types of harassment can take place on Wikipedia itself, such as the horror known as hounding. Consider this hypothetical: A Wikipedia editor from a minority group is adding content to the encyclopedia. The hounding editor, likely someone with a grudge, begins systematically removing all of the other editor’s contributions. For example, the hounding editor might delete a biographical Wikipedia article about a woman by arguing that the subject lacks notability. After that, the hound will remove a photograph, claiming the copyright cannot be verified. Next, they delete the other editor’s text contributions by suggesting they are not written in the proper encyclopedic style, that is, from a neutral point of view.
The hounding editor keeps deleting and deleting … taken individually, there is at least a colorable argument for each move. After all, two editors can reasonably disagree on what is proper. But step back slightly, and it becomes clear what’s really happening. The hounding editor is tracking the other editor, following all of their moves, and disrupting them, often with the aim of causing frustration and distress. It’s the Wikipedia equivalent of stalking. Wacha told me that, in the past, enforcers would consider most of these behavior issues in isolation, focusing on specific edit disputes rather than the entire picture. One positive of the new code, from Wacha’s perspective, is that it includes language addressing patterns of abuse. The code explicitly states, “In some cases, behavior that would not rise to the level of harassment in a single case can become harassment through repetition.”
Most behavioral problems on the English-language version of Wikipedia have been handled by community representatives, including both volunteer administrators and ArbCom. This emphasis on governance by users makes Wikipedia very different from Facebook and Twitter.
“Facebook is basically a snitch system,” said Jillian C. York, a free expression activist and the author of Silicon Values. A user reports bad behavior, which is then reviewed by either a Facebook employee or a poorly paid contractor who may or may not have a good understanding of the social context. “Wikipedia is very different. It’s community members holding each other to a set of rules, which for me is the ideal form of content moderation,” York said. York noted that there are some similarities between Wikipedia and Reddit, which has likewise traditionally granted significant power to community moderators.
Back in 2019, I covered the banning of an editor named Fram, who was repeatedly accused of harassment. Fram was initially banned by the Wikimedia Foundation’s Trust and Safety team of employees for violating the site’s terms of use. But the ban by the foundation’s employees sparked a surprising uproar among Wikipedia editors who thought the foundation was exceeding its authority. Many volunteer editors argued that the community itself should be responsible for policing behavior and enforcing punishment. Eventually the Fram matter was referred to ArbCom, which decided to remove Fram’s administrator tools but not block him from the site entirely.
The messaging from the Wikimedia Foundation about the new code has been careful to make clear that Wikipedia is not intending to become more centralized or professionalized in the manner of Facebook and Twitter. Rather, the code is intended to “empower” communities themselves. This month the code entered Phase 2, which is dedicated to community approval and discussions about enforcement. “A code of conduct is only ever as good as its implementation, and the support it provides to the people who are subject to it,” Wacha said, adding that the implementation and enforcement are probably more important than the text of the code itself. Other Wikipedia editors told me that they expected the existing community governance would continue on established projects—like ArbCom on English Wikipedia. At the same time, the new code might help smaller projects that have not previously had sufficient levels of safety and support.
To those outside the Wikipedia community, it might seem rather strange that Wikipedians are so passionate about the issue of governance by and for their decentralized group of volunteers. But that’s what arguably makes Wikipedia so special compared with other sites. “Wikipedia’s code is meant to ensure that a community of people behave in a certain way. The difference here is that we’re talking about a community,” York said. “Facebook is not a community, no matter what Mark Zuckerberg wants to believe.”
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.