Last week, Vox published an interview with Facebook CEO Mark Zuckerberg. The chat was part of a robust (for the press-wary Zuckerberg, at least) campaign of contrition following revelations that the data-consulting firm Cambridge Analytica had harvested and brokered the personal information of millions of Facebook users. Zuckerberg’s apologies began with a March 21 Facebook post, followed by a CNN interview, full-page ads in major newspapers, and a one-hour call with journalists. This week, he’ll spend two days testifying before Congress.
It’s easy to think that the Cambridge Analytica moment is a lesson in privacy, but that would miss the forest for the trees. What the scandal—and more importantly, Facebook’s reaction—really teaches us is that we’re living in a new system of global private governance platforms. Facebook might not be around forever, but something like it certainly will be. How we integrate these platforms that govern our fundamental rights into our traditional concepts of democracy and government is not a project just for Mark Zuckerberg, but for all of us.
The first lesson of the Cambridge Analytica fallout is what these platforms are. The terms corporation and company don’t fully capture the roles Facebook and Google play in our lives. In order to provide services, these platforms use personal information about their users. We want and expect them to do this, but we also expect good faith, trustworthiness, and nonmanipulation. Offline, we have long recognized similar expectations in our professional relationships with doctors and lawyers. In his Facebook post, Zuckerberg seems to understand this duty too: “We have a responsibility to protect your data … This was a breach of trust between [researcher Aleksandr] Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it.” In this sense, as Yale Law School professor Jack Balkin has argued, Facebook isn’t just a company—it’s part of a “new category of businesses for the digital age … an information fiduciary.”
Cambridge Analytica also tells us what these platforms are doing. In addition to using our information to provide services, Facebook and similar networks create systems that both liberate and constrain our basic rights. Through policies and code, these private platforms govern rights such as speech, assembly, and innovation. In his interview with Vox, Zuckerberg made clear that Facebook is in the business of governing users, describing how the platform has had to “build out a whole set of policies and governance” to adjudicate user disputes. Going forward, the site aims to “set up a more democratic or community-oriented process that reflects the values of people around the world.”
Though it helps to hear it from Zuckerberg himself, the fiduciary and governance role Facebook plays in our lives was evident long before the past few weeks. Privacy advocates have been warning about this for years. Less clear was what exactly users could do about it. That is the most important lesson from Cambridge Analytica: what it reveals about how to hold these platforms accountable.
If we think of Facebook as a governor, we must recognize a major difference in accountability between platforms and an actual democratic government. A democratic government is directly accountable to its citizens through elections. In contrast, platform-governance systems rely largely on indirect feedback to align their policies with the expectations and norms of their users. Creating those policies means complying with (and pushing back against) governments and laws, but most significantly, it means divining user preferences through the media, civil society, or collective action.
Facebook relies on the norms of its users to remain relevant and to protect its reputation. That makes it powerful but also fragile. Democratic nation-states do not have to worry about relevance or reputation by the second or the hour; at best, elected officials worry about such things once per election cycle. But the history of the internet is littered, like so many AOL free-trial CDs, with platforms that seized our attention only to fall from view. (Remember Yo?) In the past two weeks, Facebook lost $50 billion in shareholder value. With the #DeleteFacebook movement, users (at least, those who can) are “voting” through market exit.
But mere accountability through economic pressure won’t solve the real problem. As Maggie Koerth-Baker recently wrote, “the trouble with quitting Facebook is that we like Facebook.” This means not simply using blunt market tools to bring about Facebook’s end, but rather using market pressure as a lever to instill values that can shape its future. History tells us this is possible. Over the past few hundred years, Western society has developed norms about government that involve the peaceful transition of power and republican democracy. When a government doesn’t reflect our values, we communicate those values by electing new leaders. We usher in a new era—we don’t throw out the entire government and start again.
Just as a government looks to representatives, these internet platforms need reliable mechanisms for users to communicate the values they want reflected on the site. Some of these values are obvious. Information-fiduciary responsibility, for instance, should become the norm, much like client-patient confidentiality is expected. But how we want to be governed online, what we want to be able to see and say, and the technological due process we expect are harder for users to communicate and for Facebook to implement. Facebook’s task now is to build a better system for users to share their input. (Facebook briefly experimented with letting users vote on policies years ago, but the effort failed for complicated reasons.) We also have to start working to better express what we want from these services. In the interim, that means you, as a user, thoughtfully developing expectations of what these platforms are and what you want them to be.
In an online world that spans nearly 200 countries and countless languages and cultures, that will be hard. But no one said this would be easy. You are not just a user of Facebook—you are a citizen of a new age of private internet governance.