When he testified to Congress this month, Facebook CEO Mark Zuckerberg portrayed Cambridge University researcher Aleksandr Kogan as a rogue app developer who deceived the social network by harvesting data on users’ personalities that he then sold to the consulting firm Cambridge Analytica. He assured lawmakers that Kogan had been banned from the platform. And, as TechCrunch’s Natasha Lomas noted at the time, he went on to take a shot at Cambridge University more broadly, testifying that Facebook has uncovered “a whole program” associated with Cambridge in which a number of researchers were building similar apps.
What Zuckerberg didn’t mention was that Facebook itself had worked directly with Kogan and his Cambridge colleagues for years—and that it continues to this day to employ two of Kogan’s close research associates. In an interview with CBS’ 60 Minutes on Sunday, Kogan said one of them, his former co-worker Joseph Chancellor, was fully involved in harvesting the user data that they then sold to Cambridge Analytica. On Monday, Facebook spokesman Andy Stone confirmed to Slate that, with respect to Chancellor, “a review of the situation is ongoing.”
Kogan, the researcher at the center of the Cambridge Analytica data scandal, spoke out Sunday in greater depth than he has since the story broke in mid-March, shining a global spotlight on Facebook’s failure to protect its users’ personal data. In separate interviews with 60 Minutes and BuzzFeed News, Kogan tried to make it seem like he was the one betrayed by Facebook rather than the other way around.
No one should buy the idea that Kogan is a victim of any kind, or that Facebook’s alleged complicity exonerates him for hoodwinking users into giving up data not only on themselves but also on their friends, which he then sold. But he did make a persuasive case that Facebook had created conditions in which that sort of violation was almost inevitable—and that its scapegoating of him is disingenuous.
Kogan emphasized that, far from sneaking in the back door to Facebook’s platform, he had enjoyed a close working relationship with researchers at the company. From Kogan’s 60 Minutes interview:
I visited their campus many times. They had hired my students. I even did a consulting project with Facebook in November of 2015. And what I was teaching them was lessons I learned from working with this data set that we had collected for Cambridge Analytica. So I was explaining, like, “Here’s kind of what we did. And here’s what we learned. And here’s how you can apply it internally to help you with surveys and survey predictions and things like that.”
Facebook confirmed that it had a history of working with Kogan, but said it was never aware of his activities with Cambridge Analytica. The company provided the following statement from Ime Archibong, its vice president of product partnerships:
Kogan—a Cambridge University researcher—first approached Facebook in 2013 to do standard research using anonymized, aggregated data. And in October 2015, Kogan had a brief consulting contract with Facebook. At no point during these two years was Facebook aware of Kogan’s activities with Cambridge Analytica. It was not until December 2015 that we first learned Kogan had broken Facebook’s terms of service by selling to Cambridge Analytica Facebook information collected via an app he built.
That seems plausible, and Kogan didn’t deny that he had sold the data or hidden that fact from Facebook. Kogan also acknowledged publicly for the first time that he knew Cambridge Analytica planned to use his data for political purposes, likely for Republican candidates. The guy is hardly innocent, and “I figured everyone else was doing it too” is a weak excuse. There is a real difference between using people’s Facebook data for academic research and selling it to political targeting firms.
Yet while Facebook has banned Kogan himself, Kogan pointed out that it still employs his former co-worker and research partner, Chancellor. 60 Minutes’ Lesley Stahl asked Kogan: “Did he [Chancellor] have anything to do with the study you did for Cambridge Analytica?” Kogan’s reply: “Yeah. I mean, we did everything together.” Kogan told BuzzFeed that Chancellor even informed Facebook of his research when he applied for his job there in 2015.
The Intercept first reported Facebook’s relationship with Chancellor in 2017; my Slate colleague April Glaser reported in March that Chancellor appeared to still work there. (You can see his employee page for Facebook Research here.)
BuzzFeed also highlighted Kogan’s relationship with Pete Fleming, who is now head of research at Instagram, which Facebook owns. Kogan said he worked on “at least 10 papers” with Fleming over the years. That doesn’t mean Fleming did anything wrong, of course. But it further undermines Facebook’s implication that Kogan was some kind of rogue researcher acting entirely on his own. Facebook did not answer a question about Fleming’s ties to Kogan.
60 Minutes bolstered its Kogan segment by interviewing Sandy Parakilas, a former Facebook employee who has spoken out about what he believes was the company’s laissez-faire attitude toward user data. Parakilas told 60 Minutes that he tried to tell Facebook higher-ups about problems with their app developer policies, but “I think they didn’t want to know,” because if they knew then they could be held responsible. Parakilas said that was because the company was prioritizing user growth and monetization over privacy. (Facebook notes that it did in fact shut off developers’ access to the personal data of users’ friends beginning in 2014.)
In the BuzzFeed interview, Kogan downplayed the significance of the data he sold to Cambridge Analytica. He said whistleblower Christopher Wylie had exaggerated the data’s usefulness in political campaigns, adding that Wylie lacks the qualifications to be considered a data scientist. Kogan said his personality profiles were not precise enough to be used for effective microtargeting.
Separately, Cambridge Analytica released a statement Sunday in which it agreed that Kogan’s data was not useful for political targeting. “Cambridge Analytica’s research showed that the personality types licensed by GSR/Kogan underperformed compared to more traditional ways of grouping people by demographics,” the company said. It added that it did not use personality typing at all in its work for Donald Trump’s presidential campaign.
Kogan told BuzzFeed that he believes that Facebook’s own privacy policies were a more serious issue than anything he or Cambridge Analytica did. While that still doesn’t excuse him, he’s probably right.