Every week, it seems, there is a new controversy about content on tech platforms: the deplatforming of Alex Jones; YouTube’s decision not to take down the conservative commentator Steven Crowder; Facebook’s decision to keep up the “drunk Nancy Pelosi” video. All stem from initial policy decisions made in some glass-walled office in Menlo Park that ripple around the world, setting off a global online explosion that seems to be heard everywhere but Silicon Valley.
In November, Facebook announced the most ambitious and proactive idea of how to deal with these issues and rebuild trust in the way these consequential decisions are being made. It proposed an independent Oversight Board to hear disputes regarding the platform’s Community Standards and give transparent and binding decisions.
Six months ago Facebook vice president of global affairs and communications Nick Clegg announced in a company press release that the platform would begin a multipronged external outreach to collect advice and perspectives on how to construct the Oversight Board to oversee policy and appeals about speech on its platform. The release included a draft charter outlining key issues; a list of six “public listening” meetings Facebook would hold in New York City, Mexico City, Singapore, Nairobi, Delhi, and Berlin; and other plans to reach out to global stakeholders.
On Thursday, Facebook released a compilation of what it heard at those six meetings and all the other external advice it sought worldwide. The company heard from more than 650 people from 88 countries at 22 smaller global roundtables; gathered feedback from more than 250 experts in one-on-one meetings; and ran an online “public consultation” process, which encouraged users both to answer polls and to submit essays on what they thought the board should look like. The result is an almost 250-page tome: the first 45 pages are a summary report, and the remainder detailed appendices.
In the end, the consensus reflected in the report is pretty much exactly what you’d expect from an attempt to find global common ground—which is to say, not much at all.
As many, including us, have discussed, Facebook is having a constitutional moment: It is ceding its power over content policy to an independent board. The report sheds some light on how this all began. Though multiple people have called for an appeals process and transparency in Facebook’s content moderation process for years, the report cites a game-changing white paper by Harvard Law professor Noah Feldman written in March 2018. In remarks made public for the first time in the report, Feldman calls for the creation of a “Supreme Court to protect and define free expression and association on Facebook. Along with a lower appeals court, the court would interpret and apply an iconic, one-sentence values commitment that Facebook would adopt.”
It was an idea that came at the right time and place, and was taken up by Facebook CEO Mark Zuckerberg, who went public with the notion in April 2018 in an interview with Vox, and then formally announced the project in November of that same year. All they needed to figure out was how such an adjudicative body could govern a global community, what it would look like, what the scope of its review and jurisdiction would be, whose norms it would reflect, the composition of the board, what it would mean for it to be a diverse body, the values it would reflect, and how it would be independent, accountable, and transparent. You know, the little things.
Thursday’s report does not answer any of these questions, but it does reflect the discrepancy in global opinion about what the answers should be. Perhaps most centrally for the “constitutional question,” the report quotes one of us arguing that the “difficult choices” about which values to prioritize in a constitutional context should be decided by Facebook and not the board. In drawing up constitutions, real-world framers make these kinds of choices often: The U.S. Constitution, for example, prizes liberty most of all, while other countries’ constitutions make dignity their centerpiece. Facebook should make clear its central values, which will guide the board as it translates these values into practice and determines over time what the precise balance should be.
In carrying out this task of translation for America, the U.S. Supreme Court has used the Federalist Papers to try to understand the original intent and priorities of the founders. Written between 1787 and 1788 under the pseudonym Publius by Alexander Hamilton, James Madison, and John Jay, the Federalist Papers are 85 pieces of propaganda to promote the ratification of the U.S. Constitution. The essays were widely read at the time, though admittedly by an elite audience because of low literacy rates and the lack of press distribution. But the Federalist Papers were more than just propaganda—they have remained incredibly influential on courts in interpreting the intent and meaning behind the United States Constitution and the balancing of its values.
So is Thursday’s report a kind of Federalist Papers for the new constitutional project that is the Oversight Board? There are some similarities. For one thing, the report definitely has elements of propaganda. It is meant to show the hard work (and it has been hard and groundbreaking) that Facebook has been doing to try to take the global temperature on speech governance, harassment, due process, transparency, and accountability on online speech. It could have provided a lasting compendium of materials that the board could use for years to come to understand what the Facebook community needs and wants from this body. Sadly, it’s not clear that that will be the case.
Facebook can’t be blamed for failing to extract a simple set of common values from a community 2.3 billion users deep. Some things are important to some, other things more important to others. Regional differences matter a lot, but so does the fear of balkanization of the internet and the threat of overlocalizing decisions. Alongside a recurring theme of angst and dissatisfaction with Facebook’s current systems, respondents emphasize the same terms over and over again: “due process,” “transparency,” “independence,” “diversity.” Everyone can agree on those terms, but it seems like no one can agree on what they mean when actualized. There are hard trade-offs involved, and likely no perfect answers, made all the more difficult because the world has never seen an institution quite like this before. But that was true at the founding of the U.S., too, and the uncertainty and disagreement were alleviated by Publius’ powerful and eloquent articulation of his vision for the country. With the Oversight Board due to hear its first case by the end of the year, it is time for Facebook to provide similar guidance. This report, while interesting and impressive, was not it, but hopefully the final charter, coming in the next few months, will be.*
Update, June 27, 2019: This piece was updated to clarify an additional report will be released later this summer.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.