One of the most revealing lines in a newly released trove of internal Facebook emails comes from a 2015 discussion about a new type of data that the company wanted to start collecting. Specifically, Facebook wanted to read users’ “call logs” on Android devices—lists of everyone they had called and texted on their phones, even when they weren’t using Facebook.
The company planned to obtain users’ permission to do that as part of an update to the mobile Facebook app. Normally, Android’s policies would dictate that users see a pop-up notification before downloading the update, notifying them of Facebook’s request for permission to track their call logs. That’s Google’s attempt to make sure its Android users know what they’re signing up for before they agree. But the internal emails—seized by a U.K. parliamentary committee as part of an investigation into Facebook—show that the social network saw this requirement of informed user consent as an obstacle to be circumvented.
Ultimately, the company found a loophole: If Facebook shipped its app update with only one permission request, it wouldn’t trigger the permissions pop-up. That meant that users would be “opting in” to the call log tracking when they downloaded the update, without Android making them aware of what they were opting into. A Facebook product manager put it this way: “This is a pretty high-risk thing to do from a PR perspective but it appears that the growth team will charge ahead and do it.”
The sentence is remarkable not because it’s shocking, but because it shows Facebook operating precisely as its critics have long suspected: putting its own growth above the interests of its users, casually and as a matter of course. The rest of the email thread, which included Facebook’s former “privacy sherpa” (and Survivor winner) Yul Kwon, shows no one objecting to tricking users into handing over sensitive data. The only concern mentioned is the public relations risk.

The same email suggests holding off on another invasive permissions request—not out of respect for users, but out of concern that backlash over the call log request would draw public scrutiny to the second sensitive permission, too. Facebook ultimately went ahead with the update and the collection of Android call logs, and succeeded in keeping it quiet—all the way until March of this year, when a couple of users stumbled upon the logs after downloading their Facebook data. At the time, Facebook responded by publishing a “fact check” that made it seem as though the company had been upfront with users about exactly what it was requesting. While there was apparently a fine-print mention of the data request within the Messenger and Facebook Lite apps, we now know the company went to some lengths to avoid calling attention to it.
The exchange about Android call log permissions highlights a common theme in the emails, which had originally been collected as part of a U.S. court case between Facebook and the developers of an app that let users scan the site for pictures of their friends in bikinis. (There are no good guys in this lawsuit.) The theme is that Facebook leaders, up to and including CEO Mark Zuckerberg and COO Sheryl Sandberg, were privately explicit about their ruthless business tactics, even as they projected a public image of the company as earnestly idealistic and driven by its mission to connect the world.
In one lengthy 2012 email, Zuckerberg explains his decision to allow developers access to Facebook’s platform only if they, in turn, encouraged their apps’ users to share back to Facebook. He calls this “reciprocity.” From the email (italics mine):
We’re trying to enable people to share everything they want, and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that type of content and to make that app social by having Facebook plug into it. However, that may be good for the world but it’s not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform—even the read side—is to increase sharing back into Facebook.
Like a loyal assistant coach, Sandberg responded by echoing what she felt was Zuckerberg’s key point. She wrote (italics mine):
I think the observation that we are trying to maximize sharing on Facebook, not just sharing in the world, is a critical one. I like full reciprocity and this is the heart of why.
How many times did Facebook say publicly that its mission was to make the world more open and connected? Yet here are its two longtime leaders making it crystal clear to all of their deputies that they don’t give a damn about connecting the world unless Facebook profits.
It’s reminiscent of the line in the HBO farce Silicon Valley in which tech executive Gavin Belson proclaims that he doesn’t want to live in a world “where someone makes the world a better place better than we do.” Zuckerberg and Sandberg don’t want to live in a world where someone makes the world more open and connected than they do.
There’s much more in the trove that reflects poorly on Facebook, all of it consistent with the picture of a company so driven to dominate that it never stops to consider that what it’s doing might be wrong—not just “high-risk” from “a PR perspective,” but deceptive, unethical, or harmful. It’s also consistent with a 2016 memo written by a Facebook executive that made the case that user growth was a “de facto good,” justifying everything else the company might do. Not to put too fine a point on it, it’s broadly consistent with the picture painted by the 2010 release of a 2004 instant messaging exchange between a then-19-year-old Zuckerberg and a friend shortly after he launched Facebook, in which he called Facebook users “dumb fucks” for trusting him with their data.
True to form, Facebook responded to the release of the emails on Wednesday with a blog post that’s framed as a clarification but written as a riposte. The company describes the emails as “cherrypicked” by Six4Three, the maker of the bikini app, to make Facebook look bad, which is almost certainly true but does nothing to discredit their content. It then takes on the major topics one by one, doing plenty of its own cherry-picking as it addresses some points while sidestepping others. For instance, in a brief paragraph about Android call logs, Facebook underlines the phrase “opt in” but doesn’t mention the maneuvering it undertook to hide just what it was that users were “agreeing” to.
As the damaging stories about Facebook’s inner workings have piled up, some observers have started asking what exactly the company has done that’s so bad. In the three weeks since a blockbuster New York Times investigation, we’ve learned that the company has employed shady PR tactics, attacked critics, deflected blame, allowed politics to influence its decisions, downplayed charges of malfeasance, wielded its market power to throttle rival tech firms, exploited users, and more. All of which is ugly—but hardly unique among powerful multinational corporations. So why all the scrutiny on Facebook specifically?
The emails help to clarify the answer. The backlash isn’t about any one thing Facebook did. It’s about the discrepancy between how the company acted and how it presented itself all these years. Publicly, they were a band of idealistic engineers bent on connecting the world, and if much went awry along the way, we could blame it on their youthful naïveté and missionary zeal. Internally, we now know, the company’s leaders were unscrupulous businesspeople bent on expanding their empire—and the only thing that made them think twice was the possibility of bad press.