Future Tense

The Many Mea Culpas of Mark Zuckerberg

The Facebook founder has this down to a formula.

The evolution of Zuckerberg apologies.
Photo illustration by Lisa Larson-Walker. Photos by Robert Galbraith/Reuters (2), Stephen Lam/Reuters.

After five days of mounting outrage over the revelation that a strategic communication firm used pilfered Facebook data to target voters, many were wondering why Facebook’s founder Mark Zuckerberg hadn’t issued a statement. Finally, Wednesday afternoon, at a moment when many users were contemplating whether to #deletefacebook, Zuckerberg released a post on his Facebook page, followed later in the evening by a two-part interview on CNN that repeated much of the same content.

Zuckerberg’s statement addresses “the Cambridge Analytica situation.” In it, he offers a defense of Facebook’s actions and promises change. He also provides a timeline of the event, detailing when a researcher named Aleksandr Kogan created a personality quiz app, deployed it on Facebook, collected data from millions of users, and then later shared the dataset with Cambridge Analytica. Zuckerberg outlines steps Facebook has already taken to mitigate this kind of behavior and the steps it will take moving forward. He closes with an earnest personal narrative about how seriously he takes this matter.

The post follows a pattern visible across a decade and a half of Zuckerberg mea culpas, issued in response to:

• the 2006 user uproar over the introduction of the News Feed;
• the 2007 outrage over Facebook allowing user profiles to be discoverable by search engines;
• the 2008 complaints about Facebook’s practice of indefinitely keeping copies of user data from deleted accounts;
• the 2008 backlash when the “Beacon” program was introduced;
• the 2009 user indignation about expansions to Facebook’s user data retention policies;
• the 2010 concern over the way Facebook was handling privacy and divulging identifying information to advertisers;
• the 2010 worry that Facebook was “breaking things”;
• the 2015 critique that Facebook acted in an anti-competitive manner by introducing a “Free Basics” program in India;
• and, more recently, the responses to Facebook’s influence on the 2016 U.S. elections and other global elections.

I know this pattern because of my work on the Zuckerberg Files. Along with Anna Lauren Hoffmann at the University of Washington and Michael Zimmer at the University of Wisconsin–Milwaukee, I researched and analyzed nearly every single thing that Mark Zuckerberg ever said in public from 2004–2014. The pattern I’ve observed goes like this: Acknowledge, diffuse blame, make the problem manageable, empower users, invoke personal care.

Step 1. Acknowledge

Typically, Zuckerberg situates Facebook as attentive to users’ concerns, but positions whatever critique is being made within the context of an ever-changing Facebook. Credit where credit is due, he has gotten a lot better at this step since his 2006 post, “Calm down. Breathe. We hear you,” which came in response to user outrage over the introduction of the News Feed, which broadcast profile changes to other users. In that case, his tone was outright dismissive of the issue.

This is how it shook out Wednesday:

We have a responsibility to protect your data. … I’ve been working to understand exactly what happened and how to make sure this doesn’t happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago.

In these opening lines, Zuckerberg lets the public know that he, personally, is responsive to their cries. The next step is selectively avoiding responsibility.

Step 2. Diffuse blame

Avoiding blame is something that Zuckerberg does well. He has sometimes dodged it by suggesting outraged users are part of a feedback mechanism that feeds the evolution of the platform. However, in this case, he has a ready-made patsy in researcher Aleksandr Kogan (whom he mentions nine times by name in the 900-word post) and Cambridge Analytica, stating, “This was a breach of trust between Kogan, Cambridge Analytica, and Facebook.” Zuckerberg’s tone is almost congratulatory toward his own company for having the foresight to change its policies in 2014, “to prevent bad actors from accessing people’s information in this way.”

While moving blame, Zuckerberg positions himself individually (as opposed to Facebook as a whole) as an agent for positive change. For example, he writes in his latest post (emphasis mine):

I’ve been working to understand exactly what happened

I’m serious about doing what it takes to protect our community.

At the same time, when he does finally cop to some blame, he does so by shifting the onus for previous undesirable outcomes onto the institution as a whole.

We have a responsibility to protect your data. … But we also made mistakes, there’s more to do. …

Step 3. Make the problem manageable

When Zuckerberg has faced blowback, one of the things his responses do is frame the problem so that it appears manageable through platform redesign or policy response. For example, in a 2010 Washington Post op-ed responding to the way the company handled privacy, Zuckerberg wrote, “Sometimes we move too fast—and after listening to recent concerns, we’re responding.” In Wednesday’s post, Zuckerberg does it again, stating,

In this case, we already took the most important steps a few years ago in 2014 to prevent bad actors from accessing people’s information in this way. But there’s more we need to do and I’ll outline those steps here. …

By making the problem appear addressable through these changes, he simultaneously steers the issue away from Facebook’s economic model itself. The problem is no longer fundamentally that Facebook builds profiles of individuals against which advertisements are sold (which some have argued actually is the problem in this case); instead, it becomes a matter of design.

Step 4. Empower users

In Zuckerberg’s rhetoric, new tools that empower users are the panacea for what ails Facebook. New tools were the response to the sudden threat of search engine visibility in 2007, and to the overcomplexity of already existing privacy tools in 2010. These tools reposition users as being in control. We can see this same rhetorical device deployed in his Wednesday post:

[W]e want to make sure you understand which apps you’ve allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you’ve used and an easy way to revoke those apps’ permissions to your data.

Of course, what this statement doesn’t address is why it took this event for Facebook to make privacy tools more centrally located.

Step 5. Invoke personal care

Finally, Zuckerberg tends to conclude his responses to controversy with a reminder of how much he cares. For example, responding to the 2006 controversy over the introduction of profile changes alerting news feeds, he wrote, “This may sound silly, but I want to thank all of you who have written in and created groups and protested. Even though I wish I hadn’t made so many of you angry, I am glad we got to hear you.” This same note is hit in the closing remarks of Wednesday’s statement:

I’m serious about doing what it takes to protect our community. … I want to thank all of you who continue to believe in our mission and work to build this community together. I know it takes longer to fix all these issues than we’d like, but I promise you we’ll work through this and build a better service over the long term.

This passage is doing a lot of work, but I want to draw attention to one thing in particular. It pushes the idea that he (Zuckerberg, not regulators) will be the one to save Facebook (though in the CNN interview, he did mention being open to certain forms of industrywide regulation). And it presupposes that he actually has the power to do so. This is important because, if users or investors lose faith in Zuckerberg’s ability to create change, they lose faith in Facebook.

So should we put faith in Zuckerberg’s mea culpa this time? To answer that question, let’s return to what Zuckerberg wrote in his 2010 Washington Post op-ed. He told us the five principles under which Facebook operates:

1) You have control over how your information is shared.

2) We do not share your personal information with people or services you don’t want.

3) We do not give advertisers access to your personal information.

4) We do not and never will sell any of your information to anyone.

5) We will always keep Facebook a free service for everyone.

We need to see this most recent mea culpa for what it is. Not just a formula for crisis response, but the pattern of a tech leader who cannot publicly reckon with the fact that his company and its sprawling, unchecked third-party information ecosystem rotted away those first three principles.
