Deplatforming the President

S1: Hey, before we get into this episode, I want to let you know that this show was recorded late Thursday morning about a fast-moving news event, so it’s possible that some things may have changed by the time you listen. Also, there are some bad words in this episode, so if you’re listening with kids, heads up. There were a lot of tweets, a lot of posts in advance of this, a lot of online chatter on the part of people who took part in this. Were you expecting to see what we saw?

S2: I knew something terrible was going to happen when you had the president invite people to the Capitol to protest.

S1: That’s Danielle Citron. She’s a law professor at the University of Virginia who writes about online speech and privacy.

S2: He wasn’t saying, really, let’s protest. He said, come, we will take our country back. And so it was very clear he was inciting people who need very little incitement.

S1: But even though this is what Danielle studies and she knew the Trump supporters were talking on social media about going to D.C., she thought, hey, it’ll be OK, things won’t get that bad. And then Wednesday turned violent.

S3: My kids, one of them called me and said, this is really bad, mom, what is going on? I’m scared. And I just said, OK, honey, I’m with you. Then I sent off my tweet to Jack to say he’s incited violence, proof’s in the pudding like enough’s enough. Take him off Twitter.

S1: Lots of people tweeted at Jack, as in Jack Dorsey, Twitter’s CEO. They tweeted him to complain about the company’s policies, especially when it comes to violence or harassment or President Trump. But Danielle is not just anybody. Twitter listens to her. She’s been advising them for more than a decade. And this week is quite clearly not just any week. For the first time, social media platforms are, however briefly, banning the president and his rhetoric. Today on the show: why the platforms are finally acting, why they didn’t before, and whether it’s just too late. I’m Lizzie O’Leary, and this is What Next: TBD, a show about technology, power, and how the future will be determined. Stick with us. Danielle’s relationship with Twitter started back in 2009. If you don’t remember what things were like on the platform back then, well, they were a lot goofier, more personal. People tweeted about what they ate for lunch, but the seeds of the big free speech fights and safety concerns were already there. It was back then that Danielle wrote a memo to some of the policy folks at Twitter and talked with them about how to make the platform safer. But at the time, Danielle says, the company was mostly interested in banning impersonators or getting rid of copyright infringement.

S3: All that started to change about five years ago, when harassment got worse and politics got much more pointed. And after the 2016 election, Jack brought a few of us together to talk with him directly about how we can save democracy. And all of a sudden I’m being listened to.

S1: And what did he ask you then?

S3: Yeah, he’s like, I am really worried. What are you worried about, and what do we do? And so we talked about the different threat landscape. What are the vectors? What are we worried about? Disinformation, incitement, harassment, doxxing. You know, we talked and took apart the different problems, and we talked about possible ways for them to think of themselves and their speech policies. My position was: your site should see itself as something of a public trust, and you need to act like a public trust, whether it’s the BBC or you. Your algorithms are optimized for engagement. Well, that’s got to change, right? You can’t be amplifying destructive lies. It was like a beginning conversation, and I felt hopeful.

S1: But at that point, Danielle was part of a group called Twitter’s Trust and Safety Council, a mix of experts on everything from bullying to gun violence to human trafficking. They advise the company on best practices and how to make its platform safer.

S3: I did feel like at some point in this presidency, in these four years, they stopped really needing the Trust and Safety Council. We stopped our in-person meetings. And since then, it feels like we’re a bit further away. That is, they sort of tell us what they’re doing and they don’t seek our input beforehand, which is disappointing. It used to be the reverse, of course, which was the whole point. And I was worried, because what we’ve seen in the last six months is a flood of disinformation that’s led to dead bodies. Health disinformation about masks led to people not wearing masks. He used the bully pulpit in ways that clearly we know have led to public health disasters and safety problems.

S1: Of course, Twitter has made changes, but Danielle doesn’t think they’re enough. Labeling is great, but the lies are still there.

S3: And we know that the negative and the novel stick, versus the bland, the boring, the accurate information. They know that. So I think I feel let down.

S1: You wrote a piece for Slate saying it’s time to take him off Twitter. But one thing in there that I found really interesting was that you initially also argued for the public’s legitimate interest in knowing what Donald Trump or any other public official was up to.

S2: I stand by that. The policy is smart.

S1: But where did it change for you? When did you say, oh, wait a minute, this doesn’t work for me anymore?

S4: See, it’s not that the policy doesn’t work; it’s its application. Because when you ask, is it in the public’s interest to keep this person’s Twitter account, you’re looking holistically and asking, are they doing more harm than good? And it’s pretty clear from the last six months there ain’t no public interest happening here. It’s been destructive to the public interest: a flood of disinformation that leads to illness and death, a flood of lies and incitement, and wink-nod, come to the Capitol, we’re going to march on the Capitol, I love you. He was inciting violence without question. Like, if I’m a prosecutor, I feel like I can look myself in the eye and say, you know, I’m going to investigate this, because taken all together in context, it’s incitement.

S1: You know, it sounds like you’re saying they’re not adhering to their own policy.

S4: That’s exactly what I’m saying. They’re not meaningfully applying it. It has just become a free pass, and that’s not what we meant it to be.

S5: There are two places where I want to sort of dig in here. One is the idea that sometimes banning things makes them a bigger deal. When you look at, say, the Hunter Biden story. Yeah, yeah.

S1: The Hunter Biden story. Taking that down initially apparently led to twice the traffic for it. Is there a risk here where, you know, taking away the president’s Twitter account or shutting down some of his tweets for a while leads to a culture of martyrdom, creates kind of more online interest in what is happening here?

S4: So let’s talk about the costs and the benefits of that. Let’s say, OK, on the one hand, a cost is the potential for martyrdom and greater popularity. That’s often true of a response to so-called censorship. At the same time, in this very moment, Twitter is a force multiplier. Its algorithmic reach and capacity are profound. So the number of people who would be on Twitter reading his disinformation and incitement in this moment is really troubling.

S2: So as we think about the long- and short-term risks, I think about the risk of keeping him on in this moment, with the risk of physical violence being as high as it is. You know, risk is often tough for us to assess. But right now, in this moment, it ain’t that tough.

S3: So I think I’m going to live with the threat that he’s more popular in the longer run, you know, like he’ll have a following. But I don’t want him to have the 50 million people that he has right now.

S1: The other question I have is whether the horse is already out of the barn, you know? His followers, the people who were at the Capitol on Wednesday and others who are watching on social media, they can do the same thing. They can tweet, they can put Facebook posts up. They are doing that at this moment.

S4: And that’s why we need really robust content moderation. So the idea that these companies don’t know what to do is like an absurdity.

S1: So why are we here? I mean, we have had some version of this conversation 10, 12 times on this show. Why are we here in this moment if those roots are known?

S4: Because of choices that companies make about how their speech policies are applied in practice, and the choices they make about funding the people at the first pass. We have underfunding of the people on the front lines of content moderation, both their salaries and their working conditions.

S2: It’s not like we don’t understand how to write speech policies. I mean, these policies, we don’t get to see them because they’re goddamn not transparent. They need to be. But if you’re on the inside and you see them, they’re pages and pages and pages long of instructions to content moderators. But we need to fund the moderators better. We need more of them, right?

S1: Well, at the end of the day, we are still talking about private companies. And I think this is the thing that confuses and frustrates people, because on the one hand, you have the president of the United States communicating on a platform, but that platform is a company. It’s not a public utility. It is not something that is, you know, being scrutinized by regulators.

S4: But you said something really important. Yeah, it’s a private company that’s not being scrutinized by regulators, and we normally don’t say that.

S6: Phrases like “it’s a private company” suggest it’s lawless, that it faces no legal regulation. That is not the case for most any and every company in this country, in America. It’s important to note that we have treated tech companies that operate at the content layer, that have their fingers in the content layer, and said interactive service providers are not responsible for user-generated content, whether they over- or under-filter, thanks to Section 230 of the Decency Act. That’s the Communications Decency Act.

S1: In essence, Section 230 protects platforms from getting sued for what users might say on their websites. It’s almost impossible to overstate how important this little piece of law is to social media companies’ business models, which is why we’ve talked about it on this show so many times.

S6: With that comes freedom to moderate. Great. But it also comes with the freedom and a license to not give a shit, to ignore, and to underfund moderators. And the United States has in many respects become a safe haven for bad actors because they enjoy Section 230 immunity. That’s why you have nonconsensual pornography aflame, you know, online, and that’s why you have sites whose raison d’être is destruction, because those operators get to say: hands off now, you can’t sue me. I don’t care. I know predators are on my site. Too bad, so sad.

S1: But the idea that the government should be the arbiter of what is OK and what is not is also frightening.

S2: Terrible idea. And I think the folks drafting Section 230 in 1996, Ron Wyden and Chris Cox, were right to say, you know, federal agencies would be outgunned in this space. They really don’t have the skills or the funding to be speech arbiters, and we don’t want them to be. I agree, we don’t want them to be. But that doesn’t mean that law shouldn’t operate, just like The New York Times can be sued or Slate can be sued for knowingly, maliciously, and recklessly publishing lies. Right? Which you don’t do, because you know that there are some protections, but they have limits, and we are designed to hit those limits. Right? None of that can operate; the common law, with Section 230 as it is, does not get to operate. There is no legal incentive to, because they don’t have to internalize the cost of all this harm. There aren’t legal incentives that force them to go the next mile.

S1: As we’ve been talking, Mark Zuckerberg just banned President Trump from Facebook and Instagram until after the transition.

S3: That’s... thank you. Glad. Thanks, appreciate it. Good stuff. OK, good. Jack, follow suit. I’m grateful for that. So maybe he’s listening. But it is all about these very strong CEOs making decisions that make sense.

S1: Yeah, it’s weird that Mark Zuckerberg gets to make the call about whether the president of the United States gets to write whatever he wants.

S3: That’s true, it is weird, right? And it’s really weird, I think, Lizzie, because we don’t have regulation in this space, and there are a lot of decisions companies make all the time. And Facebook’s policy vis-à-vis public officials has long been, like Twitter’s: we’re going to look at them a little differently, we’re going to see what’s in the public interest. They just weren’t applying it. It was just a free pass. If what it means is a free pass, then goddamn say that. Be clear. It’s not a matter of public interest. Just say they can do whatever they want.

S1: Right now, everything is heated. Everything feels fractious and scary and uncertain, and I wonder, after the dust settles, whenever that is: Is this really the moment where there will be fundamental change in terms of how these platforms view speech? Because I’ve had this conversation 35 different ways and nothing has changed. Is this the moment things change?

S3: I’m pushing for it. I think, you know, there are times when the window opens.

S2: For legislation, and it’s usually when lawmakers are most interested because they see their own self-interest on the line, so maybe the storming of the Capitol is enough. Maybe.

S3: We’ve been talking about it in really specific terms with offices on both sides of the aisle, so I do think it’s a moment. I’m going to be annoying and push for it, like, and this has been brewing for at least two or three years. I’ve been talking to lawmakers about 230. Maybe we need a little more time, but maybe this is the moment of self-interest. We will see.

S7: Danielle Citron, thank you very much. Thank you so much for having me. Danielle Citron is a professor at the University of Virginia School of Law and the vice president of the Cyber Civil Rights Initiative. All right, that’s our show for today. TBD is produced by Ethan Brooks and edited by Allison Benedikt and Torie Bosch. Alicia Montgomery is our executive producer. TBD is part of the larger What Next family, and it’s also part of Future Tense, a partnership of Slate, Arizona State University, and New America. And if you’re interested in learning more about content moderation and other tech-related speech issues, check out Future Tense’s Free Speech Project, which you can find at Slate.com/FutureTense. And I’m going to borrow one piece of advice from my Slate colleagues and say: this weekend, you can just get outside and go for a walk. Mary Harris will be back on Monday. I’m Lizzie O’Leary. Thanks for listening.