It should be well known by now that Russian operatives made memes and fake activist pages to try to sway the 2016 presidential campaign for Donald Trump. But what most people don’t know is that they were also selling sex toys, recruiting Americans to work with them through job listings, offering free self-defense classes, soliciting photos for a calendar, and even offering counseling to followers of a page called Army of Jesus who were struggling with porn addiction. Two new reports delivered to the Senate Intelligence Committee—one from the cybersecurity group New Knowledge and the other from Oxford University’s Computational Propaganda Project and the social networking–analysis firm Graphika—expand what the public knows about how a Kremlin-linked troll farm, the Internet Research Agency, widened divides in American political life during and after the 2016 election. Together, the reports comprise the most extensive research yet into exactly how Russian agents instrumentalized U.S. technology companies to launch what may be the largest state-sponsored effort to manipulate voters and derail an election in U.S. history.
While the two reports present new findings, they also deliver some familiar conclusions: Social media companies weren’t completely forthright with Congress about the extent of Russian disinformation activities when the companies testified after the 2016 election. Both sets of researchers also found that Instagram, owned by Facebook, and YouTube, owned by Google, played a far greater role in the propaganda campaign than previously thought, and that Russian agents were particularly focused on and effective at exploiting long-standing racial tensions in the U.S. For example, the New Knowledge report notes that of the 1,107 videos on YouTube the researchers reviewed, 1,063, or 96 percent, were about either police brutality or the Black Lives Matter movement. Yet when YouTube testified to Congress in 2017, the company said in a statement, “These channels’ videos were not targeted to the U.S. or to any particular sector of the U.S. population.” While that comment may refer to paid targeted advertising, researchers with New Knowledge wrote that the testimony “appears disingenuous.”
The researchers at New Knowledge also found that Instagram—through which, Facebook previously reported, the IRA trolls reached 20 million users, compared with 126 million on Facebook—was actually a far more fruitful platform for engagement than Facebook was, despite the smaller number of users reached. Instagram clocked in at about 187 million user engagements (meaning likes, shares, and comments) while Facebook amassed about 77 million, according to the research, which tracked Internet Research Agency accounts from early 2015 to the fall of 2017. The Russian trolls were clearly aware of their success on Instagram. The report from Oxford University found that activity on Instagram from the Russian troll operation skyrocketed after the election, from 2,600 posts a month in 2016 to about 6,000 posts per month in 2017, before the accounts were shuttered by Facebook.
More than any other topic, this activity focused on infiltrating racial-justice movements and stoking racial tensions, an effort the New Knowledge report calls an “expansive cross-platform media mirage targeting the Black community.” That mirage included a complex architecture of accounts across multiple platforms for single fake activist groups. One such group, BlackMattersUS, hosted accounts on Twitter, Instagram, Tumblr, Google Plus, Facebook, and Gmail. The faux racial-justice group also hosted a page on PayPal that it shared on its website BlackMattersUS.com, as Slate first reported in November 2017. There, the Russian trolls solicited donations from Americans interested in helping to finance their purported work supporting black community leaders. (PayPal took the account down after Slate asked the company for a response and would not disclose whether any money had been donated through the account.)
The attempt to collect funds via PayPal may well have been part of the larger ruse of appearing to be an authentic activist group, a guise the trolls went to great lengths to maintain. BlackMattersUS, which described itself as a “nonprofit news outlet that delivers raw and original information on the most urgent issues important to the African-American community in America,” even scraped real events planned by black activists and entrepreneurs across the country to repost on its event page, to the great surprise of the event organizers. One such event, the MLK Grande Parade in Houston, a large annual parade honoring Martin Luther King Jr., was advertised on the Russian troll site. “I think it’s a load of BS. My name is Charles—it’s not comrade,” Charles Stamps, who has organized the event since 1994, told me when I called him last fall and told him that BlackMattersUS was promoting his event. “I don’t appreciate it, and we certainly don’t want to be affiliated with it.” Russian trolls also concocted voter-suppression efforts that “were targeted almost exclusively at the Black community on Instagram and Facebook,” according to the New Knowledge report. These included memes, like one on Facebook that read “Do Not Vote for Oppressors” next to a drawing of Hillary Clinton and Donald Trump holding hands. Another Instagram meme shared before the election by the account @woke_blacks read, “Want a quick lesson in racism & hypocrisy? Imagine in 08 if Obama had 5 children by 3 different women & was bragging about grabbing pussies,” next to the hashtag “#Boycot2016.”
This May, the House Intelligence Committee released a collection of about 3,500 Facebook and Instagram ads purchased by Russian trolls to attempt to sway the 2016 election. And it was more than a year ago, in October 2017, that executives from Twitter, Facebook, and Google first testified to Congress about Russian meddling on their platforms. Since then, the story has only become more complete—and more embarrassing for these powerful internet platforms, which hosted an enormous amount of inauthentic activity related to U.S. politics. We still don’t know how much it swung anything. That’s the thing with propaganda: It’s not obvious when it works. We have no clue how many Americans opted not to vote because of Russian voter-suppression efforts or troll campaigns that aimed to smear Hillary Clinton as an untrustworthy and ill candidate. But we do know that social media companies, even after testifying to Congress about the problem, have either been underreporting the activity or failing to look hard enough. Yes, Facebook did set up a “war room” ahead of the 2018 elections, and Twitter has gotten better at dismantling inauthentic accounts and bots—all of this with the help of thousands of content-moderation hires—but there’s still no clear legal requirement for these companies to do better, to keep a closer eye, to disclose political-ad buyers, or to be transparent about their cleanup efforts to the public.
With more information in its hands about how these companies built tools that allowed malicious state actors to walk through the front door and meddle with American political life, Congress has what it needs to start reining in these companies so that even after the public’s attention moves elsewhere, a repeat of 2016 isn’t possible. And yet: Google and Facebook spent record sums on lobbying this year, and as yet there’s no single piece of legislation on the table to regulate these companies that Congress seems likely to rally behind—even as the reasons to take the threat of manipulative social media campaigns seriously keep piling up.
This post was updated with some screenshots of Kremlin-linked memes.