After a silence so long it made headlines of its own, Facebook CEO Mark Zuckerberg responded on Wednesday to the Cambridge Analytica data scandal that has plunged his $500 billion company into crisis.
In a post on his Facebook page, Zuckerberg accepted responsibility for “mistakes” without exactly apologizing; he offered a qualified defense without seeming defensive; and he proposed a series of relatively modest fixes that probably should have been implemented years ago. In short, Zuckerberg successfully addressed several of the specific issues highlighted by the Cambridge Analytica data leak while steering clear of the larger questions it has raised about Facebook’s core business.
Zuckerberg’s statement seems unlikely to mollify the company’s more persistent critics or to extricate Facebook from the legal and regulatory trouble it’s facing. Within minutes, for example, Sen. Ed Markey, a Massachusetts Democrat, had responded: “You need to come to Congress and testify to this under oath.” But it might be just enough to defuse what has probably been the most damaging news cycle in Facebook’s history and to satisfy casual users and observers that Facebook is doing at least something to solve whatever problem it is that everyone seems so upset about.
“We have a responsibility to protect your data, and if we can’t then we don’t deserve to serve you,” Zuckerberg began, in probably the statement’s most sincere-sounding line. (It could be read as a subtle nod to the #DeleteFacebook campaign.) “I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again.” From there, his tone brightened: “The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there’s more to do, and we need to step up and do it.”
Note that Zuckerberg doesn’t say what those “mistakes” were. One strong candidate for that distinction would be the Facebook policy that gave third-party apps access to data on users’ friends without those friends’ knowledge. That’s the policy Facebook changed in 2014, and it’s what Zuckerberg is referring to when he says that actions have already been taken. But it would have been nice to hear him admit that it was a deeply misguided and greedy policy in the first place.
It’s also possible that by “mistakes,” Zuckerberg is alluding to the company’s failure to disclose to users that their data had been passed to Cambridge Analytica back when Facebook found out about it in 2015. Again, the fact that Zuckerberg declined to get specific about what he thinks Facebook did wrong is maddening to those who follow the company closely, not to mention those who appreciate a proper apology.
Zuckerberg goes on to lay out a timeline of relevant events, starting with the company’s 2007 launch of the Facebook Platform, which first allowed third-party-app developers to plug into the social network. He explains that the permissions Facebook granted them to access data on users’ friends were meant to power certain types of social apps: “Your calendar should be able to show your friends’ birthdays, your maps should show where your friends live, and your address book should show their pictures.” It’s those kinds of permissions that researcher Aleksandr Kogan allegedly abused years later to gather data on unwitting Facebook users for purposes of targeted political messaging.
Another subtle nod to critics comes at the end of Zuckerberg’s timeline. Facebook’s first response when the story broke on Saturday was to quibble over news organizations’ use of the term data breach. That came across to many critics as defensive, and several (including me) pointed out that there was a more important kind of breach here: a breach of users’ trust. Facebook has grown adept of late at echoing the language of its critics, and Zuckerberg did that in his statement today:
This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.
Then comes the key part: Facebook’s actual plan. It’s a pretty straightforward three-pronged proposal:
1) Facebook will investigate apps that had access to lots of user data, audit any of those with suspicious activity, ban those found to have misused people’s information, and inform the users who were affected. That includes notifying Facebook users whose information was gathered by Cambridge Analytica.
2) It will further restrict apps’ access to users’ data in several ways, such as cutting them off from users who haven’t opened the app in three months or more.
3) It will place a temporary notice at the top of users’ news feeds encouraging them to review the data they’ve consented to share with various apps.
It seems clear now that the long silence from the company’s leaders was at least partly a function of their determination not to speak until they could commit to some specific fixes. In some ways that’s admirable: Proposing concrete solutions is an important way to show that you’re taking a problem seriously. And it should help to turn down the heat on the company at least somewhat. Notifying affected users, in particular, is something many critics have demanded.
Yet Zuckerberg and Facebook have developed a habit of acknowledging criticism and proposing solutions in the same breath. That can feel a bit facile at times, because it suggests that said problems are superficial enough to be solved with a few simple tweaks. Contrast that with Twitter CEO Jack Dorsey’s recent admission that his platform has become fundamentally unhealthy, and he doesn’t know exactly how to fix it yet. Inaction may signal weakness in the short term, but it could lead to more meaningful change in the long run.
Everything Zuckerberg is proposing seems sensible—provided you view the Cambridge Analytica scandal as one that primarily concerns third-party-app permissions. But the problems Facebook now faces seem broader and deeper than that. I’ve argued that what people are really upset about is Facebook’s core business model. At this point, the entire arrangement by which social media companies trade access to critical online services in exchange for wide-ranging surveillance of people’s behavior and preferences is being called into question. And a lot of critics, including a growing cadre in Congress, think the answer lies in legislation and regulation—not audits of bad actors and minor adjustments to data policies.
The Cambridge Analytica story, in this view, is just the fuse to a powder keg of discontent with the company’s central project of mining users’ personal data to help advertisers target them. If that’s the case, then what Zuckerberg has just announced is a plan to disable the fuse after the bomb has already gone off.