Welcome to Source Notes, a Future Tense column about the internet’s information ecosystem.
On Sept. 4, 2001, the MIT Technology Review published an article titled “Free the Encyclopedias!” introducing Wikipedia, the free web-based encyclopedia. The article described Wikipedia, which had started in January of that year, as “intellectual anarchy extruded into encyclopedia form” and proclaimed that Wikipedia “will probably never dethrone Britannica.”
One week after the MIT Technology Review story, the Wikipedia community responded to the spectacular tragedy of the Sept. 11, 2001, attacks by kicking into encyclopedia-editing overdrive. In short order, the Wikipedia community created approximately 100 Sept. 11–related articles, at a time when Wikipedia as a whole had only about 13,000 articles, covering topics such as the attacked buildings, flights, and perpetrators, as well as “terrorism,” “box-cutter knife,” and “collective trauma,” according to research by Brian Keegan of the University of Colorado Boulder.
The Internet Archive’s Wayback Machine includes a snapshot of the Wikipedia page for the September 11, 2001 Terrorist Attack as it existed on Oct. 9, 2001. What is striking is that the 20-year-old page doesn’t look all that different from the way Wikipedia looks today: heavily text-based, the same plain white background, and those same old-fashioned cobalt-blue links.
Even though the basic model of Wikipedia hasn’t shifted much since its early days, the public perception of Wikipedia has changed dramatically over the past 20 years. Throughout the early 2000s, mainstream media remained largely skeptical toward Wikipedia, debating whether the internet encyclopedia was diminishing the importance of expertise and ceding truth itself to popular opinion. But in recent years, the press coverage has trended more positive, with journalists praising Wikipedia as the “good cop” of the internet and a “ray of light” in a depressing world.
As we approach the 20th anniversary of Sept. 11, Facebook users are likely to see 9/11 tributes selected by an algorithmic assessment of their content preferences, part of the personalized, polarized social media experience. On the other hand, every English Wikipedia user who visits the current page for the September 11 attacks this week will see the same article regardless of their demographic profile. Wikipedia’s approach—sameness within a language edition—is actually sort of boring in the sense that it is one-size-fits-all. And Wikipedia also remains quite labor-intensive. Instead of displaying content based on an algorithm, the Wikipedia process requires a lot of human vetting, discussion, and compromise. Case in point: Nearly 6,000 user accounts have contributed to the September 11 page, according to the site’s MediaWiki software.
With its quaint interface and nonprofit model, Wikipedia is in many ways a product of the early 2000s. What’s less obvious is how those shocking events of the early 2000s—including the coverage of 9/11 and Operation Iraqi Freedom, and the media treatment of so-called wikiality—helped make Wikipedia into the so-called last bastion of shared reality that it is today.
Flashback to 20 years ago: In the immediate aftermath of 9/11, the Wikipedia community responded with a significant amount of trauma-induced altruism, including links to donating blood and money and efforts to create thousands of biographical Wikipedia pages for the victims and survivors. This effort was controversial, Keegan notes. Some contributors argued that Wikipedia should emulate traditional encyclopedias, like Britannica, which tended to focus on fewer, more obviously notable subjects. Those in favor of keeping the 9/11 victim biographies countered that Wikipedia is not paper, meaning that because Wikipedia is a digital project, it did not face the same practical constraints as a print encyclopedia.
Ultimately, the more conservative argument prevailed, Keegan writes, and these 9/11 pages were moved to a separate “memorial wiki” outside of Wikipedia. (This digital memorial unfortunately deteriorated and was effectively shuttered by September 2006.) As a general rule, Wikipedia editors love to codify their rules with policies, and in 2004, Wikipedia’s official policy was revised to state that Wikipedia is not a memorial site. But consider how different Wikipedia would be if that decision had instead gone the other way. Perhaps there is some hypothetical parallel universe where Wikipedia includes tribute pages not only for the casualties of 9/11 but also for those of COVID-19—an alternative-reality Wikipedia where people can leave comments on the virtual graves. The early-2000s decision to define Wikipedia by what it is not—not a memorial site, not a social networking service, not a blog or a breaking news site—helped Wikipedia retain its identity as an encyclopedia. That’s why Wikipedia is now lauded for distributing accurate medical information about COVID-19, and not for helping people process their grief.
Following the 9/11 terror attacks, the United States and its allies invaded Afghanistan, and this history is documented in digital form on the Wikipedia entry for the War in Afghanistan. Reviewing the edit history of this page shows that the article’s title has long been the subject of controversy. Back in 2003, editors debated several title options, including “2001 Afghanistan war,” “US-led invasion of Afghanistan,” and “US-led attack on Afghanistan” (emphasis added). As they discussed these content decisions, editors grappled with applying the Wikipedia policy of neutral point of view and considered whether words like invasion and attack expressed an unwanted bias. Flashing forward to Aug. 15 of this year, editors changed the article’s name from “War in Afghanistan (2001-present)” to “War in Afghanistan (2001-2021)” after a broad group of Wikipedia editors argued that most reliable news sources were reporting that the war had come to an end. This move was in keeping with the Wikipedia policy of verifiability, which was established in August 2003 and allows people visiting Wikipedia to check that the information comes from a reliable source.
Throughout the presidential election campaigns in 2004, the New York Times chronicled how both George W. Bush’s and John Kerry’s Wikipedia pages were subject to attacks, with vandals replacing Bush’s photo with Hitler’s or replacing Kerry’s entry with the single line: “John Kerry is a girl.” A lot of elder millennials remember this era of Wikipedia, sometimes with a touch of nostalgia, when the site was an anarchical free-for-all characterized by frequent vandalism. Although there are still instances of vandalism today, those issues are often fixed very quickly, usually within a few minutes. Today’s rapid self-correction is partly a byproduct of Linus’ law—“given enough eyeballs, all bugs are shallow,” or to paraphrase, “point enough smartphones at Wikipedia, and the crowd will quickly spot graffiti.”
But it’s not just the proliferation of smartphones that has helped Wikipedia defend itself against vandalism. Back in 2005, shortly after the Bush vs. Kerry election and the Seigenthaler biography incident (a short-lived Wikipedia hoax that now has its own Wikipedia page), the Wikipedia community revised its page protection policy to allow for a new mechanism: “semi-protection.” Certain Wikipedia pages, including the entries for the 9/11 attacks and the war in Afghanistan, have been semi-protected, which means that they can only be edited by users who have registered a user account, and not by anonymous IP users. There are other requirements, too: the account must be at least four days old and have made at least 10 edits to Wikipedia. Sometimes the mechanism of semi-protection is erroneously characterized by popular media sources as locking the content of an article. But the truth is that it’s not very hard to rack up 10 edits, and semi-protection has been effective at screening out a lot of would-be vandalism. Keep in mind, the user who dunked on John Kerry’s page by calling him a “girl” was probably a drive-by vandal, a first-timer who was not otherwise making significant positive contributions to Wikipedia. Perhaps like a lot of us, this person wouldn’t have made the same dumb changes if they were required to cool off for four days.
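To make the semi-protection threshold concrete, the eligibility rule can be sketched as a simple check—a hypothetical illustration of the “autoconfirmed” criteria described above, not MediaWiki’s actual code:

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the check that gates semi-protected pages:
# an account must be at least four days old and have made at least
# 10 edits. (Function and variable names are illustrative.)
MIN_ACCOUNT_AGE = timedelta(days=4)
MIN_EDIT_COUNT = 10

def can_edit_semi_protected(registered_at: datetime,
                            edit_count: int,
                            now: datetime) -> bool:
    """Return True only if the account clears both thresholds."""
    old_enough = (now - registered_at) >= MIN_ACCOUNT_AGE
    enough_edits = edit_count >= MIN_EDIT_COUNT
    return old_enough and enough_edits

# A drive-by vandal with a brand-new account is screened out,
# while an established contributor passes:
now = datetime(2021, 9, 10)
print(can_edit_semi_protected(datetime(2021, 9, 10), 0, now))
print(can_edit_semi_protected(datetime(2021, 9, 1), 25, now))
```

The point of the four-day cooling-off period is visible in the first case: even a determined vandal who registers an account on the spot cannot touch a semi-protected page that day.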
Looking back at Wikipedia’s media representation in the mid-2000s, the site was sometimes used as a boogeyman to express broader concerns, including the belief that the informed, objective perspective—let’s call it the Truth—was being replaced by a feel-good non-truth determined by the masses. Stephen Colbert launched his satirical news program The Colbert Report with a segment dedicated to what would be dubbed 2005’s word of the year: truthiness. “We’re not talking about truth. We’re talking about something that seems like the truth—the truth we want to exist,” Colbert said. He urged his viewers to take the truth into their own hands and “save” the declining populations of elephants in Africa by changing their numbers on Wikipedia, causing the site’s servers to crash. Especially in 2005–06, Colbert routinely made Wikipedia the butt of the joke, introducing the term wikiality to describe a model in which truth was decided by the will of the majority and not facts. For Colbert’s audience, it was hilarious when the comedian edited Wikipedia live on his TV show, such as when he changed George Washington’s Wikipedia page to say that the former president did not own slaves.
Truthiness as a concept in 2005 paralleled discussions from two years prior when the United States invaded Iraq. When Colbert’s show launched, there was a growing sense that the Bush administration had decided to invade Iraq because it felt like there were weapons of mass destruction in Iraq and not because it had actually verified that those WMDs existed, a textbook example of truthiness. Whether or not it was a fair conclusion, truthiness and wikiality were taken to be the real-world manifestations of the Wikipedia philosophy as the encyclopedia that anyone can edit.
But notice how Wikipedia did not actually enable what we now call the post-truth mindset. Even before smartphones were prevalent, the false statement that George Washington did not own slaves was removed from Wikipedia in about three minutes. Later, Colbert’s account was blocked from editing Wikipedia entirely. (Wikipedians told me that Colbert himself has no hard feelings and remains a big fan of the site.) Wikipedia’s self-defense mechanisms, like semi-protection and Linus’ law, continued to grow stronger throughout the internet encyclopedia’s first two decades. There are now Wikipedia articles dedicated to debunking 9/11 conspiracy theories, and dedicated volunteer groups like “Guerrilla Skepticism on Wikipedia” have sought to root out instances of pseudoscience from the encyclopedia. Today, if someone attempts to add incorrect information to the Wikipedia page for the COVID-19 vaccine, the falsehood is likely to be removed by one of the page’s 400 dedicated watchers who have signed up to monitor recent changes. That’s because Wikipedia’s focus is not on what the user wants to be true—truthiness based on feelings—but on applying the site’s policies for reflecting reliable sources. Or as Wikipedians themselves put it, the rule is verifiability, not truth.
Upon review, it’s clear that Wikipedia’s growth trajectory after 9/11 helped brace the site early on from misinformation. However, the site is not immune to every challenge in this post-truth information environment. “Wikipedia, which was born in the Bush era, is good with fighting away blatant lies, but not so much politicized narratives,” said Omer Benjakob, a reporter who covers Wikipedia for the Israeli newspaper Haaretz. Benjakob offered the example of Croatian Wikipedia, which external experts determined had been shaped by the “Croatian radical right” to reflect a distinctly nationalist bias. Remember, too, that the distribution of knowledge across the 323 separate language editions of Wikipedia remains highly uneven. For example, the English Wikipedia page for the “War on Terror” presents a different picture than its Arabic language counterpart, which refers to the subject as “America’s War on Terror.”
Clearly Wikipedia faces major challenges going forward, especially on the global front. And yet with the benefit of hindsight, it appears that some of the early fearmongering directed toward Wikipedia was badly misdirected. In 2010, while President Barack Obama was ordering an additional troop surge in Afghanistan, writers were still publishing condescending op-ed pieces like “Does Wikipedia Still Suck?” (the author concluded that most articles sacrificed depth for breadth), and Time magazine named Mark Zuckerberg its person of the year.
But it seems obvious now that instead of worrying about “sucky” Wikipedia, which has been genuinely helping people throughout this so-called infodemic, people should have been worried about the new social media platforms. After all, the new breed of millennial dictators use social media as their weapon of choice—not Wikipedia, which they cannot control. Meanwhile, researchers are figuring out that Wikipedia, or something very similar to it, might be our last best hope to course-correct.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.