Inside Facebook’s Supreme Court

S1: There's this famous photo from 1972, taken during the Vietnam War. The photographer was a journalist named Nick Ut. And while the photo is officially called The Terror of War, people often refer to it as Napalm Girl.

S2: You've probably seen it. It was taken moments after a U.S. commander ordered South Vietnamese planes to drop napalm near a village just north of Saigon. In the photo, children are seen running away from that village. And in the center is a nine-year-old girl, Kim Phuc, naked, crying and running toward the camera. The image is unforgettable.

S3: In the U.S., newspaper editors made exceptions to their policies banning frontal nudity and published the photo widely. It became one of the most indelible war photos ever taken, and it won the Pulitzer Prize.

S4: Four years ago, a Norwegian writer named Tom Egeland posted that same photo to Facebook.

S5: It is a photo of a nine-year-old girl running naked in the streets, and it was removed under the child exploitative imagery rule.

S1: That's Kate Klonick, a lawyer and writer whose research is focused on Facebook.

S5: And it happened to have been posted by a very famous Norwegian author, who got very upset and threw a fit on Facebook after it was removed. And the Norwegian prime minister then posted it, and it was removed. And then there was a letter to Mark Zuckerberg published on the front page of a Norwegian newspaper that began, Dear Mark Zuckerberg.

S3: And it was a whole lecture about censorship.

S6: First, you make rules that don't distinguish between child pornography and important war documentary photos. Then you practice these rules without sound judgment.

S1: Eventually, Facebook allowed the photo to go back up. But the whole controversy, Kate says, was a turning point.

S7: I think that they had failed, up until that point, to recognize that these content moderation decisions could not be solved just by taking things down. It was just as important to make people feel that they were heard and that they had a voice as it was to remove content that they didn't want to see.

S1: In the years after the controversy over the Napalm Girl image, Facebook has continued to struggle with these decisions. It has built a huge content moderation operation and endured scandals over doctored videos and deplatforming. Facebook simply couldn't settle on a clear system that determined what stays up and what comes down. So right now, the company is trying something new: creating an independent oversight board, separate from the company, that will decide what content stays and what goes. A group users can appeal to, and one that will try to police what 2 billion people put online. Think of it as a Supreme Court for Facebook. You're a lawyer. You've written and thought a lot about this. What was your first thought when you heard that they were doing this?

S7: I mean, I follow this stuff more than probably anyone. And even I was kind of like, wow, I'll believe it when I see it.

S1: So Kate asked to watch and Facebook said yes.

S8: For most of the past year, she's been an independent observer embedded inside the company as it wrestles with what its Supreme Court can and should be.

S9: And there have been moments where I have been very skeptical that it’s all going to come together and survive. And then there’ve been moments of kind of grandiosity in which I’m just kind of like, wow, this is maybe going to change the world forever.

S10: Today on the show: building Facebook's Supreme Court. I'm Lizzie O'Leary, and this is What Next: TBD, a show about technology, power, and how the future will be determined.

S8: Stay with us.

S1: Over the last year, Kate Klonick spent hundreds of hours inside Facebook as the structure of this Supreme Court for content moderation, called the Oversight Board, was taking shape. You've been embedded within Facebook. How did that happen?

S7: Yeah, it happened because I think I'm the only one who's dorky enough to ask. And I mean that sincerely. I think that even people who covered tech saw the announcement of the oversight board as, okay, this is just a PR stunt. But this particular project is about building transparency and accountability, and so I think there was a slightly different set of rules around it than there would be for following other parts of Facebook around. So I pitched this idea: I'd like to embed, I don't want an NDA, and I want to tape everything, because if this goes belly up and I have to write about it, I don't want you coming back at me and saying that this didn't happen.

S1: Do you remember your first day there?

S5: Oh, God, yes. I mean, I happened to be there the first week that all of the interns had started, and there were something like eight thousand interns. So the place was just a zoo, completely packed. And I started having meetings and going to meetings with the governance team that was building this out. And it was very much, I don't know, a little bit like what Jane Goodall must have felt when she wandered into the jungle for the first time. I didn't speak the language. I didn't understand what was going on. I didn't know which questions to ask. I would say the first couple of days were about learning the language of Facebook and learning how to talk to them in constructive ways.

S1: I do think that the language of technology can be confusing and intimidating. What did the people you met with say that they were trying to do, you know, in English?

S5: You know, a lot of them were lawyers. So that was actually, kind of weirdly, this common language that we could speak while I was learning all the acronyms. And I basically analogized a lot of this to a court system; I analogized a lot of this to constitution building, or institution building.

S1: And everyone was seeing it the same way? You've used the example of Facebook as a nation state. Mark Zuckerberg has described social media as a fifth estate. And I wonder if you agree with that in principle going forward, and if it scares you at all.

S11: Oh, I don't think of Facebook as a nation state. I think of it as something else. I see it as sitting on top of nation states. I would say that for years, for millennia, the tradeoff was between nation states and citizens. And the story of speech was one of censorship by nation states, and of trying to curb that censorship by pushing back with democracy. What platforms did, what the Internet did, was allow citizens to not worry about that anymore and to route around the problem of state censorship. Of course, now we know that states are also co-opting platforms to route around the problem of democracy in all of this. I'm not pro platform. I'm not anti platform. If there is anything that I am, I'm just pro user-citizen, which I think is the one stakeholder here that is not centralized enough to be able to represent itself and understand this brave new world of everything that's happening.

S1: I guess that makes me wonder about sort of the meta questions of who should be in charge of policing all of these things. And if we as users are going to be comfortable with an independent board, or if some people are going to say, yeah, that doesn't work for me, I want the government involved here.

S11: Yeah, but the question then is: what government? People always ask, why can't the government just do that? But this is operating within pretty much every single nation state and government besides China. So, which government, and to what ends? Specifically in the United States, you're hamstrung by the First Amendment, for better or worse, from having the government come in and say what Facebook can or cannot put on its platform. And I'm not saying that's not the answer for certain types of problems. I just don't think the comfortable heuristic of, you know, we'll pass a law against it, works in this scenario. It requires something greater.

S4: In November 2018, Mark Zuckerberg began the process of making this idea of an oversight board into a reality. In a series of blog posts, the company announced that it would hold workshops around the world to hear what people wanted in the board. They went to Singapore, Delhi, Nairobi, Berlin, New York, and Mexico City to work out exactly how to design a Supreme Court for content.

S12: There was a six-month global consultation period where they went all over the world, asked experts and stakeholders, held workshops, and consulted with over twenty-five hundred people to figure out what people thought the board should look like.

S4: Kate was there for some of those sessions.

S1: At the end, Facebook wrote a report summing up what people said they wanted and how little agreement there actually was. You wrote that in the end, the consensus reflected in their report is pretty much exactly what you’d expect from an attempt to find global common ground, which is to say not much at all. What do you do with that when you’ve got all these different people from all over the world and yet you’ve got to make one thing from all of their responses?

S7: Yeah, I mean, if you read the report from that global consultation, it's literally: we heard that one set of people really cared about this, and another set of people cared about the exact opposite.

S12: Freedom of expression means a lot in America and Europe, and actually means a little bit less in the global south, where they're mostly concerned with safety. And so trying to balance those with one set of rules is incredibly hard. Honestly, one of the things that we might eventually see come out of the oversight board is a formalization of the Balkanization of the Internet, which is to say that we're going to split it up into regions, with different rule sets for different regions. In fact, I was in the room at one point when they were talking about having a board member from the local region on every panel when they heard cases. So if you heard a case from Sri Lanka, you would have someone on the panel who was representative of Sri Lanka. But with a 40-person board, that quickly became impossible, and everyone was kind of like, this just doesn't make sense, we're not going to do this. And that was in May. But by the end of June, one of the biggest things that had surfaced in the workshops was that people wanted a local member on the panel. And so they were like, OK, well, we'll just have to make it happen.

S1: You mentioned speech being a challenge. Are there other areas that came up consistently as part of the process that were thorny?

S12: The decision of how to select board members is really, really hard. What Facebook basically came up with was that they would pick the first three or four co-chairs, and those co-chairs would then work with Facebook to select the initial cadre of board members until they got to maybe 15 or 20.

S7: Once they get to 15 or 20, the board will be announced, which is expected in late March or early April.

S11: And going forward from that, they'll have a hiring committee and they'll choose their members entirely by themselves.

S12: But there's a question of taint: whether that initial selection taints the entire process and means that it's not actually independent. There are all types of other structural and financial independence protections built in for the board members. But for me, this was a big one, and I think it was for a lot of the people in the workshops I attended during the global consultation period, too. The thing that was brought up over and over again was: how are you going to do this without the hand of Facebook always having been in the process?

S4: In January, Facebook made its first attempt to answer that question in detail. It released a set of bylaws covering everything from the makeup of the board to the individual appeals process for content.

S1: So how is it going to work? Let’s say that I post something. It gets taken down for whatever reason. Then what happens?

S7: So what will happen is, let's say you post a picture of your cat, and someone flags it for violence against animals, and it gets taken down incorrectly. You appeal that once to Facebook through their internal mechanisms, and then you appeal it again. If you get to that level, you will get a code. You will take that code, go to a website that is not Facebook, and copy and paste the code in to generate your file and to give permission for Facebook to allow your private information to be reviewed by an outside body. Then it goes before a case selection committee, which functions a little bit like cert in the Supreme Court: they decide whether or not your case is worth reviewing. If your case is selected, it's brought before a five-person panel of oversight board members, and they then write a decision explaining why they gave their answer and saying whether they would take the content down or keep it up.

S1: If a decision around a particular piece of content might be representative of a larger issue, then that small five-person panel can flag it for the larger 40-person board.

S7: If they've made a policy recommendation, Facebook has to respond to it and say why they are implementing that recommendation or why they're not, and that is also a public statement. Then all of those decisions are put into a database that is searchable on the website, and we'll be able to have a common-law-type database that allows us to see similar facts or similar rulings.
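
To make the flow Kate describes a little more concrete, here is a minimal sketch of the appeal pipeline as a small state machine in Python. Everything in it is hypothetical: the Stage names, the AppealCase, Decision, and DecisionIndex types, and the keyword search are illustrative stand-ins for the process described in this episode, not Facebook's or the Oversight Board's actual systems.

    # Hypothetical sketch of the appeal flow described above.
    # None of these names correspond to real Facebook or Oversight Board APIs.
    from dataclasses import dataclass, field
    from enum import Enum, auto
    from typing import List, Optional

    class Stage(Enum):
        REMOVED = auto()          # content taken down by Facebook
        INTERNAL_APPEAL = auto()  # appealed through Facebook's internal mechanisms
        CODE_ISSUED = auto()      # user receives a code to take off-platform
        CONSENT_GIVEN = auto()    # pasting the code authorizes outside review
        SELECTION = auto()        # committee decides to hear the case, like cert
        PANEL = auto()            # five-member panel reviews and writes a decision
        DECIDED = auto()

    @dataclass
    class Decision:
        case_id: str
        restore: bool                                # put the content back up, or not
        rationale: str                               # the published explanation
        policy_recommendation: Optional[str] = None  # if set, Facebook must respond publicly

    @dataclass
    class AppealCase:
        case_id: str
        stage: Stage = Stage.REMOVED

        def advance(self) -> Stage:
            # Move to the next stage in the pipeline, stopping at DECIDED.
            order = list(Stage)
            position = order.index(self.stage)
            if position < len(order) - 1:
                self.stage = order[position + 1]
            return self.stage

    @dataclass
    class DecisionIndex:
        """The searchable, common-law-style database of published decisions."""
        published: List[Decision] = field(default_factory=list)

        def publish(self, decision: Decision) -> None:
            self.published.append(decision)

        def search(self, keyword: str) -> List[Decision]:
            # Crude keyword search standing in for "similar facts or similar rulings."
            return [d for d in self.published
                    if keyword.lower() in d.rationale.lower()]

    # Example: walk one case through the pipeline and publish its decision.
    index = DecisionIndex()
    case = AppealCase(case_id="cat-photo-001")
    while case.advance() is not Stage.DECIDED:
        pass
    index.publish(Decision(case.case_id, restore=True,
                           rationale="Misflagged as violence against animals."))

In this toy model a case simply advances one stage at a time; the real process, as the next exchange notes, is supposed to fit inside a 90-day window from the user's initial appeal.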

S1: And all of this is supposed to happen within 90 days of making that initial complaint.

S11: Yes, exactly. Which is insane. It's insane for two reasons. One is the idea that you can do all of that within 90 days. The other part is that it seems both very slow and very fast. It feels very fast to give that amount of due process to someone; I mean, court cases languish for years, right? But at the same time, it seems really slow, because in the life of the Internet, within 18 hours things might no longer be relevant. From the user perspective, it's really supposed to be about signaling erroneous decisions, or decisions people want Facebook's policy changed on, to an outside board, when time is not really of the essence.

S1: I listened to you detail all these steps, and they're frankly sort of dizzying in their complexity.

S13: Tell me about it.

S1: Boy, it seems like a lot for an individual user.

S13: It is a lot for an individual user. But I think we're at this point where these transnational private companies privately govern our public rights of speech. These issues are certainly not new; they've been happening for the last 20 years. And people have become so much more literate about them in such a short amount of time. I think this is just the tip of the iceberg of that kind of civics and cultural literacy around this issue.

S1: This appeals process Kate is describing only applies to content that has already been removed from the platform. So if you're a user and you want to see something come down, the oversight board can't help. It is narrow, though, because this is just about content that gets taken down. It's not about whether my neighbor is posting vaccine misinformation that I would like to see taken down.

S13: Yeah. At the beginning, it's only going to be about removal. And if you think about this from a privacy perspective, that makes a lot of sense. Let's say that you flag a piece of content that your neighbor posts, right, and Facebook says, no, we're keeping it up, and you want to appeal that decision. For you to appeal that decision, given the process that I just described, you would be sending someone else's data off of Facebook and into the outside world. From a privacy perspective, you just can't. That's very difficult to do.

S1: When I think about how the typical person uses Facebook, do you think this is going to change their experience all that much?

S13: I don't think that most people will appeal this type of content, but I think it might be part of a broader industry change that ends up happening. I see this going one of three ways. On one hand, it might end up being that this just stays at Facebook. The other way you could think about it is that other websites, Twitter, Google, whatever, chip into the trust and then also want to use the oversight board and the people on it as adjudicators of their own types of content. And the third way I could see it going is that each of these platforms decides to create its own oversight board to review its own content based on its own rules and values. And in that case, what I see happening for users is a market of rules.

S14: You are very explicitly going to certain types of platforms to be able to say certain types of things, with the understanding that you can say certain types of things on some platforms and not on others.

S15: Kate Klonick, thank you so much.

S7: Yeah, thank you.

S15: Kate Klonick is an assistant professor at St. John's University School of Law. She's also a fellow at the Information Society Project at Yale. All right, that's it for today. What Next: TBD is produced by Ethan Brooks and hosted by me, Lizzie O'Leary. It's part of the larger What Next family. TBD is also part of Future Tense, a partnership of Slate, Arizona State University, and New America. This year, Future Tense is collaborating with the Tech, Law and Security Program at American University Washington College of Law on the Free Speech Project, an editorial and event series. The series will examine the ways technology is influencing how we think about speech. Okay, we'll be back on Monday. Thanks for listening. Talk to you next week.