Facebook Flips on Holocaust Denial

S1: If you’re American, you probably think of free speech as the default just the way things are, and I don’t know where it enters the system, I don’t know if it’s in the water or if it’s in the kindergarten curriculum.

S2: Evelyn Douek is not American. But it’s certainly something that I have encountered for years, this First Amendment fundamentalism.

S1: She’s an Australian who lives in Massachusetts, and she’s one of the most dynamic and nuanced thinkers about online speech. She lectures at Harvard Law School. You came here to study kind of First Amendment law, to look at this stuff as an outsider. What was your impression of the U.S.’s fundamentalist adherence to free speech?

S2: I feel a little bit gaslit as a foreigner. When you come to America, as I did four years ago, to study comparative constitutional law and free speech, one of the most striking things about American free speech doctrine is this example of the Nazis who wanted to march in Skokie.

S3: I know jumping straight to Nazis is kind of leaping into the free speech deep end. But Evelyn’s describing one of the most famous First Amendment cases, one that really tests American values. And the story goes like this: In 1978, a group of neo-Nazis wanted to march in the Chicago suburb of Skokie, Illinois, largely because a lot of Holocaust survivors lived there, 7,000 concentration camp survivors in the predominantly Jewish Chicago suburb of Skokie. Not surprisingly, there was a huge legal fight.

S4: Skokie officials blocked Nazi demonstrations with court injunctions. When the Nazis appealed to the state Supreme Court, judges refused to hear the case.

S3: But what might surprise you, if you don’t know the story, is that the American Civil Liberties Union, indeed a Jewish lawyer with the ACLU, defended the Nazis’ right to march under the First Amendment, saying the right to free expression was integral to who we are as a country.

S2: It’s just such an iconic story: literal Nazis were going to be allowed to march in the street. And as a foreigner, you come here and you learn that, and not only do you learn that, it’s not like this inconvenient embarrassment of First Amendment law, it’s this really proud moment. One of the truly great victories for the First Amendment was that it will protect the speech that we hate, because it is, you know, better to have it out in the open. It’s better to meet it with counter speech. And we just can’t trust the government to suppress it. As an Australian, it’s very striking. I don’t even have a right to free speech. We don’t have a bill of rights in our Constitution. It’s a completely foreign idea.

S1: This fight over unfettered free speech, and in fact where it collides with anti-Semitism and Holocaust denial, broke into the news cycle again this week.

S5: There’s a split screen: the Supreme Court confirmation hearings are going on on one side. And then on the other side, Facebook releases a blog post.

S1: The company, which has always said it values free expression above everything else, announced that it would ban any content that denies or distorts the Holocaust. Two days later, Twitter did the same thing. It might seem like banning Holocaust denial is a pretty easy call, but it was only a few years ago that Facebook said it wouldn’t prohibit Holocaust denialism on its platform, which is part of why Evelyn says this moment is a really big deal.

S2: I think this is like a really iconic moment in the history of the company and its thinking and its evolution around its rules. There is no more emblematic rule that Facebook had about its sort of adherence to First Amendment principles.

S6: Today on the show: Facebook’s decision to finally prohibit Holocaust misinformation, and what it means for free speech debates, the Internet, and the potential for change. I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stay with us.

S1: I want to talk about how seismic a shift this is. If we think back to just two years ago, Mark Zuckerberg gave a now very well-known interview to Kara Swisher and said he didn’t believe that posts that deny the Holocaust should be taken down.

S7: I don’t believe that our platform should take that down, because I think that there are things that different people get wrong. I don’t think that they’re intentionally getting it wrong, but I think that they are. It’s hard to impugn intent.

S1: Boy, it is a big journey from “people get things wrong, even though I might find it personally offensive” to “my own thinking has evolved.”

S8: The big thing that they always sort of hung onto was: we don’t want to be arbiters of truth, and we will not take content down purely on the basis that it’s false. We might take it down on the basis that it’s nudity, or that it’s hate speech, or that it has other sorts of effects. But we won’t take content down just because it’s wrong. And that’s what’s reflected in that quote from Mark Zuckerberg to Kara Swisher: you know, some people get things wrong sometimes. Then the pandemic literally changed that decision overnight. In the context of a global public health emergency, they abandoned that. They said, we will take down false information about the pandemic because it poses a public health risk. And now we’re playing ball; now companies are taking content down on the basis that it’s false. And we’re now seeing it in a lot of other areas. We saw it in the context of the wildfires in the West. My country was on fire for months in December and January, and there were lots of false rumors about the cause of the fires, and Facebook didn’t take anything down. And then Oregon was on fire a couple of months ago, and suddenly they were taking down misinformation about the cause of those fires. And I think that’s as stark a contrast as you can draw.

S1: It’s so interesting to hear you peg this to the pandemic, because I think about all the data points that came before that. This is after the 2016 election. It is after the Charlottesville Unite the Right rally, which took place in 2017. Do you think the coronavirus pandemic is sort of launching us into a new, I guess, era of thinking about content and speech on its own? Or is it kind of, I guess, a catalyst for something that was going to happen anyway?

S5: Yeah, you’re absolutely right that it’s part of the broader trend.

S8: It was a particularly visible and sort of obvious example of the trend, in the same way that the pandemic has made many sort of fundamental assumptions and structures in society more visible. And we’ve sort of seen things progressively moving more and more along that line of, OK, we can’t be all speech all the time. Let’s balance interests and draw hard lines. And I think that the pandemic was just another step along that road, if you think about it that way.

S1: These announcements from Facebook and Twitter about banning Holocaust denial are in line with other content moderation decisions we’ve seen this year, like the outright ban on QAnon content. But in other ways, Evelyn says, the decision on Holocaust denial marks a deeper and more fundamental shift in how speech is policed online.

S2: Holocaust denial is one of these iconic things about the First Amendment, and I believe that one of the reasons why Facebook stuck to that principle for so long, of allowing it on its surfaces, was because it still considered itself a fundamentally American company, attached to these First Amendment ideals, this robust marketplace of ideas. Which is bizarre when you think about it: these are clearly global companies now, and, you know, most of their user base is outside of America. But there was still something there that it couldn’t let go of. And so I think we’re really not in First Amendment land anymore. We are now in this unknown landscape of trying to work out what norms we can attach ourselves to.

S1: I think when we talk about free speech, and you’re reflecting this a little bit, it’s so ingrained in the American psyche that sometimes we don’t even know what it means, such that we apply this phrase, free speech, to things like Facebook, which is a company, not a government.

S5: Totally. You know, Facebook will add a fact check to something, Twitter will remove a tweet, and everyone will cry free speech, censorship. And that’s weird.

S2: That is strange, you’re right. They’re private companies. They have a product. They have been censoring lots of stuff all the time; you know, adult nudity has been banned on Facebook from the beginning, and they take down literally billions of pieces of spam all the time. And we sort of just shrug, or don’t even think about that. But we do have some sort of sense that this is where public debate is happening now, and this is where we meet and we talk about issues of systemic importance, particularly in the pandemic, where we literally can’t go anywhere else to talk about this stuff. We are sort of in these spaces, and no legal system really has a good idea yet of how to cope with the fact that there are clearly still some free speech interests at stake in what these companies decide to allow on their platforms. They clearly are decisions that affect democracy in a really fundamental way, and so the companies should have some sort of public-regarding interest in the way that they construct those systems. But you’re absolutely right, they’re not governments. And maybe we feel very differently about Facebook’s decision to ban something than we would about a government’s decision to ban something, and I think we should. But we just don’t really have the vocabulary yet, or the structure, to really think about that and what that means.

S8: We can’t pretend that these companies are hands off. We can’t pretend that they’re just opening a marketplace of ideas. They need to think much more seriously about their social responsibility.

S1: Of course, that also presents dangers. If we think about Facebook, say, suppressing speech in Myanmar or, you know, somewhere with an autocratic government, then there’s the question of, wait a minute, should they be standing up for a more American version of free speech?

S5: You’re absolutely right. Like, how do we draw the line between, you know, a legitimate local law that reflects sort of local community values and a contextual understanding of what speech is in that society, and how do we think about autocratic regimes? So a classic example: you have Western politicians using “fake news” to describe content that they don’t like, and sort of railing against these platforms for allowing political mis- and disinformation across the political spectrum. Right? We’ve heard all of that, and that discourse gets co-opted by authoritarian regimes to do exactly what you’re talking about: to shut down the communication environment and impose very punitive laws that will punish these companies for allowing content that those regimes disagree with on their platforms. Is it Facebook’s job to decide which countries are sufficiently democratic that Facebook should comply with their laws around speech? Well, yeah, actually, at the moment it is. But is that where we want to end up? I don’t think that it is, but we need to do a lot more thinking about how we impose structures around that.

S9: We’ll be right back.

S1: Recently, one structure has gotten some traction with both academics and the platforms themselves. The idea, which comes from a report to the U.N. Human Rights Council, is to take an existing framework from international human rights law and use it as a basis for content moderation; basically, prioritize broad human rights over unfettered free speech. But Evelyn’s not convinced.

S5: She just published an academic paper explaining why. Something that has gotten a lot of momentum behind it is the idea that, well, let’s replace these First Amendment ideals and norms with international human rights law. That seems great: these are international companies, that’s the international part. They affect human rights, that’s the human rights part. And we need laws and norms, so that’s the law part. It seems like a very natural fit. The problem that I see is that it still doesn’t answer these more difficult questions of how a company is different to a government. It also just doesn’t move fast enough. Like, my favorite example from the past couple of weeks is the hashtag Proud Boys. That was for a very long time a rallying cry for a white supremacist group. And then literally overnight, the meaning of that hashtag changed, because gay communities started posting it and trying to reclaim it and make it about out and proud boys kissing, and it was a symbol of gay love. And the slow-moving infrastructures of law are not well equipped to deal with how literally language changes overnight from being hate speech to a celebration of gay love.

S1: So international human rights law will only get us so far. When we move from thinking academically about hate speech and content moderation to the physical infrastructure, the human infrastructure, underneath these giant companies: are there any models of content moderation, you know, short of burning it all down, that you think we ought to explore?

S10: One of the things that I spend a lot of time thinking about, and something I’ve been saying for a very long time, is we need to get out of this take-down, leave-up false binary of content moderation. We spend so much time thinking about what should be allowed or disallowed. But actually the far more important decisions are about what is amplified and what is suppressed, what is shoved in front of you and what is not, and all these other things that we can do in between that can make a profound difference. So we can talk about labeling things, we can talk about adding warning screens, we can talk about adding friction. And this is something I’m really excited about: let’s slow things down a bit. Everything’s so fast. And I think that companies are starting to do that. Twitter is starting to do that, right?

S1: I was really excited about Twitter’s announcement last week around friction that it is introducing, slowing down the ability to retweet things sort of without thinking, without really reading what you’re looking at and what you’re spreading.

S5: Right. And reducing amplification of certain things. I mean, this is the simplest thing.

S10: Twitter introduced a nudge: when you go to retweet a news article, if you haven’t previously clicked on that news article on Twitter, it will just pop up a little warning screen and say, hey, do you want to read this before you retweet it? And this is a balm to journalists everywhere. Yeah, right. That’s not interfering with anyone’s free speech. That is not interfering with anyone’s autonomy. That is the simplest, gentlest possible nudge. And it has a big effect.

S8: Nothing about the way that these companies or these platforms are currently constructed is inevitable or natural. We think that they are because we’re used to them.

S1: You know, in so many ways it feels like we, journalists, scholars, the public, have these conversations about content moderation over and over again. But listening to you, you are actually saying, no, no, there are differences and they are happening. And we’re just maybe applying a very dramatic binary way of thinking that misses some of this stuff.

S5: Yeah, I think you may have just called me an optimist, and that’s very disconcerting. But I think you’re right. I am optimistic. I really do think a lot has changed in the last couple of years.

S8: There’s one more thing that excites Evelyn and gives her a little hope: the potential to test out, with real evidence, our ideas about speech. So much of free speech law and theory is based around these sort of hypothetical conjectures about how speech works in society, you know, like the marketplace of ideas or the chilling effect and these kinds of things. For the first time in history, we have data about how those conjectures actually might work in practice. And when I say we, I mean Facebook right now and these companies, not us. Right? So this could be this really exciting paradigm where we start to answer some of those questions, and we show that the marketplace of ideas was only ever, you know, a terrible analogy and a totally hypothetical reality that could never exist in practice, and here’s how it actually works when you put people on a platform hypercharged for engagement. And then we could start to answer those really important questions about what is going on here that explains these disturbing trends.

S3: Evelyn Douek, thank you so much. No, it’s been great fun. Thank you for having me.

S6: Evelyn Douek is a lecturer at Harvard Law School and an affiliate at the Berkman Klein Center for Internet and Society. And that’s our show for today. TBD is produced by Ethan Brooks, edited by Allison Benedikt and Torie Bosch, and hosted by me, Lizzie O’Leary. It’s also part of the larger What Next family. TBD is also part of Future Tense, a partnership of Slate, Arizona State University, and New America. If you’re interested in learning more about content moderation and other tech-related speech issues, check out Future Tense’s Free Speech Project, which you can find at Slate.com/FutureTense. Have a great weekend. Mary Harris will be back in your ears on Monday. I’m Lizzie O’Leary. Thanks so much for listening.