As the various federal investigations into the ways Russian operatives tried to meddle in the 2016 U.S. election continue, the number of American tech companies that we know were involved in the disinformation campaign keeps growing. It wasn’t just through Facebook, YouTube, Twitter, and Instagram that Russian trolls tried to sow division and strengthen the candidacy of Donald Trump. Websites like PayPal, Tumblr, and Reddit were nodes of their activity, too.
On Monday, Reddit admitted that it too has found evidence of Russian meddling on its platform, including “a few hundred” suspicious accounts the company has removed that either are Russian in origin or have links to known Russian troll campaigns. It was a multi-faceted infestation, with propagandistic ads making up just a tiny part of it, CEO Steve Huffman said in a Reddit post. Although the company found some ads on Reddit potentially bought by Russians engaged in disinformation efforts, there weren’t many of them either before or after the election, he said, adding that “ads from Russia are blocked entirely, and all ads on Reddit are reviewed by humans.” Huffman also noted that most of the suspicious accounts were removed in 2015 and 2016 during previous attempts to purge abusive users from the site.
Reddit didn’t specify whether these accounts are linked to the Internet Research Agency, the St. Petersburg troll farm that special counsel Robert Mueller and the congressional Russia inquiries have focused on as they’ve probed how Russian operatives used social media in 2016. But the admission comes just days after a Daily Beast report on a leak of internal data from the Internet Research Agency that included information on how the troll operation instrumentalized Reddit to promote its fake U.S. activist websites and social media groups. One of the known Russian troll efforts, BlackMatters.Us, which Slate first reported was hosting an active PayPal account, had links from its site up-voted on Reddit, sometimes thousands of times. And posters from the Internet Research Agency involved in the Black Matters campaign even stated that they would host an Ask Me Anything session on Reddit last fall, but it never happened.
According to Huffman, the most troubling activity on Reddit from known and suspected Russian operatives wasn’t in the form of ads or accounts run by trolls, but rather from actual Reddit users who were amplifying troll-made content of their own volition. One known Internet Research Agency account on Twitter, @TEN_GOP, which posed as the account for the Republican Party of Tennessee, had its tweets “amplified by thousands of Reddit users,” Huffman wrote, adding, “sadly, from everything we can tell, these users are mostly American, and appear to be unwittingly promoting Russian propaganda.”
Sen. Mark Warner, the top Democratic member on the Senate Intelligence Committee, said in a statement to the Washington Post on Monday that he “would encourage all of the social media companies to take a much closer look at how their platforms and services could be used to manipulate their users’ trust and attention.” And congressional investigators reportedly plan to ask Reddit, as well as Tumblr, some questions about how exactly Russian trolls used their platforms in the run-up to the 2016 election. Google, Facebook, and Twitter all submitted to a minimarathon of three congressional hearings in October and November last year, where the companies were grilled by unhappy lawmakers eager to know how much these powerful tech companies knew about Russian meddling on their platforms, what they could have done to stop it, and how they’ll prevent more disinformation from circulating in the future. The companies revealed that Russian operatives created thousands of misleading accounts on their websites.
Facebook alone estimated that roughly 126 million people likely saw content or followed accounts made by the IRA between January 2015 and August 2017. And Twitter revealed in January that it had found 3,814 accounts believed to have been created by the Russian troll group, and it sent emails to 677,775 users who had inadvertently followed or interacted with content made by the trolls. Jonathan Albright, the research director of the Tow Center for Digital Journalism at Columbia University, found that content from just six of the hundreds of Facebook accounts backed by the Internet Research Agency had been shared about 340 million times. As Reddit’s statement Monday makes clear, we’re still learning about how widespread this campaign was.
Though it’s still not clear how deeply Russian-backed agents used Reddit to try to stoke divisions and confuse voters, it’s important to remember that content on Reddit often enjoys a much more active level of engagement than the articles you scroll past in your Twitter and Facebook feeds. Reddit users are known to have lively conversations, plan campaigns, and take action when stirred by content someone in their community posts. Content, we’re now learning, that they may not have realized originated in a nondescript office building in St. Petersburg.