Future Tense

The Independent Facebook Oversight Board Has Made Its First Rulings


Facebook’s Oversight Board unveiled its first round of decisions on Thursday, the first concrete actions in an experiment intended to bring more checks and balances to Big Tech.

The Oversight Board, announced by Mark Zuckerberg in 2018, was created with the promise “to uphold the principle of giving people a voice while also recognizing the reality of keeping people safe.” The board was formed as an independent body of 20 experts from all over the world and in a variety of fields, including journalists and judges. Although the board governs independently from Facebook (and Instagram), early critics worried it would amount to nothing more than a PR stunt. Out of more than 150,000 cases submitted, six were chosen in December. The cases were emblematic of bigger, thematic issues in content moderation, such as censorship of hate speech, female nudity, and COVID-19 misinformation. Five decisions were released Thursday. All are binding.


According to the board, the cases were debated by five-member panels, each of which included a representative from the place where the post in question was written. The panels sometimes requested public comments and integrated them into their decisions. Before a decision was finalized, a majority of the full board had to agree.

Michael McConnell, director of the Constitutional Law Center at Stanford Law School, said he agreed to join the board because it recently dawned on him that “the real decisions about what people can say and how they can say it in our world are no longer based on Supreme Court decisions,” but on decisions by companies like Facebook. McConnell, a former federal judge, felt that joining the board was like joining a judiciary for modern times.


Of the five rulings shared Thursday and detailed below, four were in favor of overturning Facebook’s original decision to remove a post.

1. The post: A user in Myanmar shared viral photos of Syrian toddlers who drowned on the beach and a comment, written in Burmese, that seemed to say Muslim men are psychologically damaged. The post contrasted the silence around the treatment of Muslims in Myanmar with the killing of people in France over cartoons depicting the Prophet Muhammad.


The decision: Even though the beginning of the post featured hate language, the board suggested that when the whole post is considered, it is actually a “commentary on the apparent inconsistency between Muslims’ reactions to events in France and in China.” The board re-translated the post, which provided a slightly different meaning than Facebook’s original translation, and after consulting experts on the nuances of anti-Muslim hate speech, ruled that the post be restored.


2. The post: A picture of churches in Baku, Azerbaijan, was posted with a caption, in Russian, that asserted Armenians had historical ties with Baku that Azerbaijanis didn’t. The post used the term taziks to refer to Azerbaijanis, which is considered “a dehumanizing slur attacking national origin.”


The decision: Considering the violence between the two countries and the language of the post, the board upheld Facebook’s removal of the post, agreeing that it had harmful intent. “While Facebook takes ‘Voice’ as a paramount value, the company’s values also include ‘Safety’ and ‘Dignity,’ ” the board wrote. Given the context of the post, the board said Facebook should, in this case, value the latter.

3. The post: A series of photographs of breast cancer symptoms—including pictures of female nipples—posted with the clear intention of raising awareness of the disease during “Pink October.” The post was initially removed automatically by Facebook in October and then restored after an internal review found it was indeed in line with community guidelines. Facebook requested that the case be dropped, declaring it “moot,” but the board still saw value in reviewing it, claiming that “the incorrect removal of this post indicates the lack of proper human oversight which raises human rights concerns,” including freedom of expression, especially for women.


The decision: The fact that Facebook’s automated system failed to recognize the Portuguese words for “breast cancer” raised concern for the board. It ruled that the post be restored and suggested that Facebook clarify when automated systems are moderating users’ content and make sure a human being is reachable for appeals. The board also recommended that Instagram implement guidelines detailing that nipples may be shown in breast cancer awareness posts. (It’s worth noting that Facebook has different protocols for male and female nipples.)

4. The post: A quote, misattributed to Nazi propagandist Joseph Goebbels, that said arguments appealing to pathos are more important than the truth. The account user claimed the post was supposed to serve as a commentary on Donald Trump.


The decision: The board agreed that the post wasn’t displaying support for Goebbels, so freedom of expression prevailed. The board suggested that Facebook clarify that if one is posting about “a dangerous individual, the user must make clear that they are not praising or supporting them.”


5. The post: A user implied that a French agency wasn’t authorizing hydroxychloroquine and azithromycin for the treatment of COVID-19, even though the combination was “harmless” and could be a cure. The post was originally removed for spreading COVID-19 misinformation: the claim that this drug cocktail could cure the virus.


The decision: The board determined that since these drugs require a prescription in France, the misinformation could cause only minimal harm. The board ruled to restore the post and suggested that Facebook revise how it approaches misinformation, for instance by correcting posts rather than removing them.


Facebook will have seven days to restore the posts. The company released a statement acknowledging the binding nature of these decisions and said that it has already restored the posts. “Their recommendations will have a lasting impact on how we structure our policies,” vice president of content policy Monika Bickert wrote. The board’s decisions and recommendations will apply not only to the five posts above, but also to posts with “identical content with parallel context.” One caveat was noted in reference to the COVID-19 misinformation case: Facebook wrote that its decision to remove COVID-19 misinformation is in accordance with advice from the CDC and WHO, so it won’t change that approach.

The Oversight Board’s rulings seemed to value free speech above all else. Yet Nate Persily, co-director of Stanford’s Cyber Policy Center, noted on Twitter, “This is not where most public opinion is, of course. Whether in the U.S. or the world, most observers think the problem with FB is that it takes down too little speech, not too much.” McConnell remarked that these free speech decisions were made carefully, and with great difficulty, because the board’s priority of upholding freedom of expression had to be weighed against, and held in tension with, the “harm that can take place as a result of social media activity.”


The Real Facebook Oversight Board, a citizen campaign against the board, released a statement featuring quotes from advocates, claiming that the board’s decision to restore posts of hate speech could lead to a “troubling precedent for human rights.”

Next up on the board’s agenda is its biggest case to date: Should Donald Trump be allowed back on Facebook? The conclusion won’t be revealed for more than 80 days. But based on these recent decisions, speculation is rampant. “I don’t think we can read too much into it, but what we do know is that they are willing to overturn decisions by Facebook,” Persily said of the board. “And so, we should not be surprised if they do reinstate Trump.”

Although future decisions will continue to come out, including one more in the next few days, McConnell believes that we won’t understand the implications of Facebook’s Oversight Board for many years. What McConnell knows now is that he and his fellow board members will fight for transparency and work to hold Facebook accountable. “And I think to have a group like that looking over Facebook’s shoulder can only do good. How much good it will do, I don’t know. But it’s got to do some good.”

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
