The Industry

The Cambridge Analytica Scandal Is What Facebook-Powered Election Cheating Looks Like

The Trump campaign’s data firm got its hands on 50 million Facebook users’ information—and then reportedly lied about deleting it.

Cambridge Analytica’s chief executive officer, Alexander Nix. AFP Contributor/Getty Images

On Friday night, as Americans began settling into the weekend, Facebook dropped a pretty substantial piece of news: The company said in a detailed blog post that it had suspended from its platform the political-data firm Cambridge Analytica, which worked with the Trump campaign during the 2016 election, after learning the company hadn’t deleted Facebook user data it had obtained in violation of the social network’s policies.

Facebook said that a Russian-American psychology professor at the University of Cambridge named Dr. Aleksandr Kogan had obtained user data through a personality app he built in 2014 called “thisisyourdigitallife,” which scraped data from the profiles of people who took the quiz as well as those of their friends—something that was allowed under Facebook’s policy for third-party apps at the time. Although only about 270,000 people took the survey, the New York Times reports that Kogan was able to obtain data on 50 million users, likely through connections between friends on the network. While the project promised users the data collection was only for research purposes, it nevertheless funneled the data to Cambridge Analytica, which had funded the development of the app to the tune of $800,000. Kogan had also received funding from the Russian government for his research into the psychology of Facebook users, the Guardian reported.

We don’t yet know how effective Cambridge Analytica’s targeting, based on psychological profiles its CEO has described as its “secret sauce,” truly was, but after working with Ted Cruz’s presidential campaign in the Republican primary, Cambridge Analytica was tapped by the Trump campaign. The company also did data analytics work on the successful Leave.EU campaign that resulted in a vote for Britain to exit the European Union. What we do seem to have, following these revelations about the source of some of the company’s data, is yet another in a long string of instances and coincidences—like the meetings between Trump campaign staff and Russian operatives—that look a lot like a willingness to cheat to win the White House, possibly in violation of the law. And in this case, cheating with data that was taken through Facebook’s front door.

Cambridge Analytica isn’t your typical data analytics firm. Its primary backer is Robert Mercer, the secretive billionaire and former CEO of the New York investment firm Renaissance Technologies whose family is also one of the main funders of Breitbart News and was the largest donor to Trump-backing Super PACs during the 2016 election. Steve Bannon, President Trump’s former chief strategist, was the vice president of Cambridge Analytica’s board at the same time he was chairman of Breitbart News.

The data firm started partnering with U.S. political campaigns around 2015 with the promise that it had the ability to do what it called “psychographic” targeting, which allowed Cambridge Analytica to create psychological profiles to “effectively engage and persuade voters using specially tailored language and visual ad combinations” that appeal to each person on an emotional level, according to Cambridge Analytica’s website. The company claims it’s able to build these profiles by leveraging “up to 5,000 data points on over 230 million American voters.” That last number is a lot bigger than the 50 million people reported in the New York Times, and it’s unclear where that remaining data came from.

Cambridge Analytica began contracting for Trump’s campaign in June 2016, roughly six months after the Guardian reported that the company, while working for Cruz, was targeting voters using data harvested from tens of millions of Facebook users without their knowledge. Facebook confirmed in August 2016 that Kogan’s app had collected and misused user data and said it was taking steps to delete it. Still, Facebook didn’t alert users that their data had potentially ended up in the hands of Republican operatives.

The personality quiz created by Kogan appeared in 2014 on a platform for freelancers run by Amazon called Mechanical Turk.* The quiz, posted by Kogan’s company Global Science Research, offered to pay participants $1 or $2 to complete it and also required participants to download an app and consent to sharing data about themselves and their social network. At the time the quiz and app were running, Facebook permitted app developers to access data about a person’s network, like the names of their friends, as well as their likes and other personal details about themselves and their network, according to Facebook’s head of security, Alex Stamos, in a series of now-deleted tweets. The company says it has now updated its policies to let users decide what information Facebook can share about them.

Facebook says, however, that Kogan violated its rather permissive data collection rules when he passed the data he collected to Cambridge Analytica, and that in 2015 it demanded that Kogan and everyone he handed the data to destroy it and certify that they had done so. Facebook says it received certifications that the data was deleted from both Kogan and Cambridge Analytica, as well as from Christopher Wylie, who helped to found Cambridge Analytica. (Wylie and other former employees of the company gave interviews to the New York Times and the Guardian about Cambridge Analytica.) But apparently not all the data was deleted, and the New York Times reports that it was able to view a portion of the Facebook user data that still exists. The contract Kogan initially had with Cambridge Analytica also allowed him to keep a copy of the data he scraped for his research.

Kogan claimed in an email to Cambridge Analytica in 2014 that the data he was collecting would be able to predict a person’s neuroticism, political views, agreeableness, and interests in things like militarism, horoscopes, and the environment. These data points could be what Cambridge Analytica is referring to when it claims that it’s capable of psychologically targeting American voters.

“Without Facebook, we wouldn’t have won,” said Theresa Hong, a member of the digital arm of Trump’s presidential campaign, in an interview with the BBC last year while giving a tour of Trump’s digital campaign headquarters, dubbed Project Alamo, in San Antonio, Texas. Alamo was the name of the dataset used by Cambridge Analytica, according to Hong, who said Cambridge Analytica shared offices with the Trump campaign’s digital operation and confirmed that Facebook and Google sent liaisons to those offices to help Trump’s campaign. Hong showed the BBC how Cambridge Analytica could identify whether it was targeting, say, a working mother concerned about child care: She probably wouldn’t be interested in “a war ridden destructive ad” popping up in her Facebook app but might respond to something more “warm and fuzzy,” lacking Trump’s voice, Hong said. “It wasn’t uncommon to have about 35 to 45 thousand iterations of these types of ads every day.”

Cambridge Analytica also bragged publicly about its use of data and voter modeling during the election, which sparked the interest of David Carroll, a professor at Parsons School of Design who studies media, data targeting, and campaigns. In January 2017, he filed a request under the U.K.’s data protection law to see whether any of his personal data had been shipped to Cambridge Analytica’s parent company, Strategic Communication Laboratories, based in London. Last year, Carroll learned that Cambridge Analytica had given him scores on a scale of 1 to 10 in certain hot-button political categories, like a “gun rights importance rank.” Based on these scores, he was ultimately listed as “unlikely” to vote Republican. On Friday, the same day Facebook moved to suspend Cambridge Analytica’s account, Carroll filed a legal claim in the U.K. to learn what the company did with his data and whether it was handed to anyone else.

Cambridge Analytica was roped into Special Counsel Robert Mueller’s investigation of Russian interference in the 2016 election this past December, when Mueller’s office requested to see employee emails, the Wall Street Journal reported at the time. The firm also may be in hot water with the Federal Election Commission, which has rules about how non-U.S. persons are allowed to work on political campaigns. Alexander Nix, the CEO of Cambridge Analytica, is British, and many of the company’s employees were either European or Canadian.

On Saturday, Cambridge Analytica responded to Facebook’s allegations and the reports in the New York Times and Guardian, writing that Wylie was a contractor and not a founder of the company and claiming that all of the Facebook data provided by Kogan was erased as soon as the company learned it was obtained illegitimately.

In his now-deleted Twitter thread, Facebook’s Stamos said that Kogan’s handing of Facebook user data to Cambridge Analytica wasn’t technically a breach, since he “didn’t break into any systems, bypass technical controls, or use a flaw in our system to gather more data than allowed.” All of which means Kogan took a startling amount of user data that he was permitted to take—even if he wasn’t supposed to hand it over to a voter-targeting operation afterward. Other app developers may have harvested user data the way Kogan and Cambridge Analytica did under Facebook’s old rules, too, and there’s no telling where that data might have ended up. Since 2015, Facebook has changed its policies, limiting the amount of user data third-party apps can siphon from the network—though it still allows sophisticated political targeting through its advertising platform, the kind the Trump campaign was able to use, likely abetted by the Cambridge Analytica data set. None of this is surprising, of course, but it should be galling. Facebook’s primary job is making money, after all—and even careful policies can’t always control how scrupulously its customers use its primary product: our data.

Correction, March 19, 2018: This article originally misspelled the name of the platform Mechanical Turk.
