SCOTUS on the Internet: “It’s Complicated”
Dahlia Lithwick: Welcome back to the show.
Speaker 3: I love being back and love talking with you. Nothing would keep me from you.
Dahlia Lithwick: Well, we’re just going to have to do a whole book show in the summer, because what you’re doing is really cutting-edge. And I think one of the themes of this show is that content moderation on the Internet feels like it’s the last war. And I want to sort of end there, because it feels like we’re fighting a fight that is almost obsolete even as we talk about it.
Dahlia Lithwick: But I do want to start, if you would, because everybody talks about Section 230 and nobody quite knows what it does. And Section 230 is a juggernaut; this case has been a long time coming here at the Supreme Court. It’s described as, quote, “the 26 words that created the Internet.” Can you just walk us through, please, as though we’re seven, what Section 230 was designed to do, why it became the bête noire of conservatives in particular but really of people across the spectrum, and what the purpose was? Because I think that now we talk about it as though it was just designed to immunize Internet companies from everything, and that’s not right.
Speaker 3: Yes. And thank you so much for inviting me to talk about that history and the specific provisions and how they work together. Because you’re right, so often we just gloss over the thing. It’s like magic fairy dust: First Amendment, free speech. And that’s, in many respects, a misunderstanding of the project of the provisions, how they work together, and the original purpose and history.
Speaker 3: So, you know, Section 230 is part of the 1996 Communications Decency Act, which was truly, shockingly, an anti-porn statute, criminalizing the knowing facilitation of pornography. And one would think: how do you have an Internet without porn? But Congress in 1996 thought maybe we could have one. And the Supreme Court sort of knew better and struck all of the statute down except for Section 230.
Speaker 3: Now, the title of Section 230(c) is “Protection for ‘Good Samaritan’ blocking and screening of offensive material.” Okay, so you might think: ha, Congress is principally talking about how to incentivize and protect the private filtering and blocking of offensive content. That’s the title.
Speaker 3: And there are a number of provisions that work together. There’s a purposes part and a findings part of the statute, but where we’re going to focus our energy, and where we ought to, are two provisions of Section 230(c). Sorry to get really in the weeds, but I think it’s really important to talk about not only the titles but the language, and what Representatives Cox and Wyden were trying to do and what they were responding to. So Section 230(c) says, again, protection for “Good Samaritan” blocking and screening of offensive material, and Section 230(c)(1) is called “Treatment of publisher or speaker.” And I’m going to loosely explain the two parts and then what they’re trying to do.
Speaker 3: Okay. So (c)(1) says “treatment of publisher or speaker,” and it says (I’m doing this by heart, but I’ve said it so many times, I think we can trust me) no provider or user of an interactive computer service shall be treated as the publisher or speaker of information provided by another information content provider. So that’s (c)(1). Then (c)(2): its title is “Civil liability,” and that’s the only time the statute talks about immunity. It says no user or provider of an interactive computer service shall be held liable for voluntarily and in good faith (and I’m riffing here) filtering or blocking objectionable content. They gave a couple of examples, like harassment and stalking and, you know, lewd, offensive, obscene speech. But basically that’s the idea.
Speaker 3: And so these work together. They wrote (c)(1) and (c)(2) together. And the idea is to provide incentives for companies, the early Internet service providers and their users, to moderate content and to act as the good Samaritans blocking and filtering offensive content. Because Cox and Wyden knew that there’s no chance federal agencies could deal with all this by themselves.
Speaker 3: The Internet was like a kind of glimmer in their eyes. We had the early bulletin boards. We had access to the Internet truly only through these walled gardens, like AOL and Prodigy, the very early ISPs. And they wrote these two provisions in response to a New York state trial court opinion, Stratton Oakmont v. Prodigy, which, by the way, fascinatingly, is a defamation lawsuit brought by the fraudster of the time, the Wolf of Wall Street, Jordan Belfort. Long story short, here’s the suit he brings.
Speaker 3: So Jordan Belfort runs a boiler room. He’s later convicted of fraud and goes to jail and then becomes this, like, evangelist for “don’t do crimes.” But his firm sues Prodigy because, on one of their bulletin boards, called Money Talk, somebody posted basically accusing them of fraud, which was true. But like any good fraudster, what do they do? This feels so Trumpian, right? They say: hey, defamation, you’re defaming me. And the New York court looks at Prodigy and its business practices and says: hey, Prodigy, you were trying to moderate and filter, basically, dirty words and profanity. You were using filtering software and you were moderating. You were removing content that slipped past your filtering, you were detecting profanity and other prohibited, objectionable content, and you were taking it down.
Speaker 3: And so what the New York trial court found is that because Prodigy was trying to moderate content, that increased their liability: they then became the publisher, the strictly liable publisher, of any content, any defamation (because it’s a defamation lawsuit), that they failed to remove. So trying to clean up the Internet earned them a penalty.
Speaker 3: And Cox and Wyden were horrified. Apparently, Chris Cox was an early user of both Prodigy and CompuServe; he apparently was a big fan of the early bulletin boards. And Ron Wyden, then a fellow House member, also was really invested in what the Internet would become. They wanted the flowers to bloom, and they also wanted to make sure that companies engaged in content moderation. They didn’t want them to be disincentivized. They didn’t want them to fear that if they engaged in content moderation, they’d be punished for it, right, that it would increase their liability.
Speaker 3: Okay. So (c)(1): this is in conversation on the floor, it’s in the committee reports. You dive in like I did (I started writing about this stuff in 2008), you read all the committee reports, you read the House debates, and it’s clear from both Cox’s comments and Representative Goodlatte’s that they wrote Section 230(c)(1) as a direct response to, and to repudiate, Stratton Oakmont v. Prodigy. So the idea of “we’re not going to treat you as a publisher or speaker”: what that means is that we’re not going to treat you as a publisher or a speaker of someone else’s information, in the project of trying to block (remember the title) or filter offensive content. So that’s (c)(1), right? It’s a direct repudiation, a congressional overruling, of Stratton Oakmont.
Speaker 3: (c)(2) is the only provision that says anything at all about an immunity. It’s called “Civil liability,” right, that’s the title of (c)(2). And that provision says that if you block and filter, and you do it voluntarily and in good faith, then you’re specifically immune from civil liability.
Speaker 3: Okay. So you’ve got to look at the two provisions together. What Cox and Wyden were trying to do was to provide incentives, or, as they say in the findings and purposes, to remove the disincentives, to using any technologies that would block and filter objectionable and offensive content. And they made these two moves. One, we’re going to overrule Stratton Oakmont. And the second was to ensure an immunity from civil liability for doing that kind of blocking and filtering in good faith.
Speaker 3: So you might say: huh, okay, that’s a narrow project. It’s an important project. It’s one where they’re trying to incentivize content moderation, right? They wanted companies to engage in that. And the purposes are part of the statute; that’s 230(b). There are actually five purposes. One has to do with free speech and encouraging a true diversity of political viewpoints online.
Speaker 3: But another, equally important purpose is to remove disincentives to blocking and filtering objectionable content. And a third, and this is kind of at the heart of my work, says it wants to ensure that we enforce all laws against cyberstalking, cyber harassment, and other forms of online abuse. So that’s a multilayered purpose, right? It’s not just “free speech, free speech”; all the oxygen in the room is not about free speech. Okay, so that’s where we are in 1996.
Dahlia Lithwick: And then in 1997, you get a case that sets the lower courts down this path of kind of delinking (c)(1) and (c)(2), right?
Speaker 3: One of the very first opinions, and an important decision, is Zeran v. AOL. Fourth Circuit Judge Wilkinson writes it. The Fourth Circuit finds that “treated as a publisher or speaker” sweeps in not just publisher liability but distributor liability, so it holds immune from responsibility any interactive computer service, even if they know about defamation and don’t take it down. And this is all still defamation.
Speaker 3: I’m still like, okay. But the big error, I think, that the court makes in Zeran is to say the only purpose of Section 230 is to enable free speech. And that’s just wrong. You know, what’s that line from Scalia’s A Matter of Interpretation, where he says you look out into a field and you see a lot of poppies and you pick only the ones you like, the red ones and not the blue ones, like you only pick out your friends? That’s just not true to the statute, right?
Speaker 3: And what happens is, well, the Supreme Court, as you noted so well, had never dealt with a 230 problem before. It has consistently refused requests and invitations to hear cases, to grant cert. And we’ve written and cried; amicus briefs have urged them to take on cases, and they never did before this one. So it’s been 25-plus years, and the lower courts have made a hash of it.
Speaker 3: The lower courts have treated Section 230(c)(1) as the only relevant provision. They fail to see the two pieces together, and that the project is one of incentivizing content moderation. They should have read (c)(1) as related only to leaving up content; that’s where they should have kept it, right? The idea was: if you fail to remove, or you leave up, content that’s posted by somebody else, then you shouldn’t be held responsible for that, especially if you’re trying to be a good Samaritan. You can add that gloss; I think it’s really important, because that’s the title. Instead, the rule has become: anything posted online that consists of ones and zeros is conceivably speech, and so it deserves an immunity.
Speaker 3: And that takes us to the case before us now, where you have YouTube not just leaving up, not just failing to remove, ISIS videos. Because if that’s what this case was about, I would say: what a loser, why did you bother? That is clearly covered by Section 230(c)(1), exactly what Cox and Wyden had wanted, even outside the defamation context. Right? If it was just about failing to remove and leaving up information provided by someone else, I wouldn’t be losing my marbles. But instead, the theory of liability in this case
Speaker 3: is about what YouTube did: its algorithms, which it designs. You know, there’s some nonsense in the questions that algorithms are neutral, and I don’t know what drugs these folks are taking, but they know nothing about the design of algorithms. These are engineers, really highly paid engineers, and there ain’t nothing neutral about what they’re doing. These folks are designing algorithms, sets of instructions. They may be very highfalutin; they may be machine-learning algorithms and deep-learning neural networks. But these engineers build them.
Speaker 3: And what these algorithms are doing, because the model of YouTube’s business is online behavioral advertising, is mining our data. They collect massive reservoirs of our data, and not only based on what we’re doing on YouTube; they’re buying it from data brokers. It’s a very expensive proposition, right? So they have massive reservoirs of our personal data, and they are refining it. These engineers work really hard, round the clock, building algorithms that are super intelligent, that tell them which videos we are most likely to click on, given our profiles, given our clicking, given our liking, given our personal data, because then they’ll earn advertising fees from that.
Speaker 3: And so this case is about the theory of liability that the petitioners, the plaintiffs, allege: that your algorithms, YouTube, your systems built on amassing personal data, are making recommendations. It’s your recommendations that are at the heart of the lawsuit. It’s not about the failure to remove, or keeping up, or leaving up ISIS videos, no matter what those videos say. It’s about your recommendations, based on your algorithmically fine-tuned, sophisticated, fancy, very expensive algorithms mining our personal data. And so the Court, I mean, Justice Jackson is on it.
Dahlia Lithwick: Hold on. Give me one second. Give me one second, okay? Two things that you just said.
Speaker 3: So sorry.
Dahlia Lithwick: Never apologize for being the smart person, my sister. That’s why you’re here. But I want to reflect back two things, and tell me if I’m right, because I think you said two important things. One is that the reason we’ve all been screaming about Section 230 for a very, very long time is that it is a statute wrapped in a lot of political noise, and we’ve been screaming about the political noise, which is, as you say, free speech, free speech, free speech, without understanding that this is a complicated push-me-pull-you statute that is incentivizing moderation, incentivizing what you’re calling good Samaritan efforts to regulate, and, in so doing, immunizing against certain kinds of civil liability. That’s the first thing you’re saying. And so when we hear, right, from Ted Cruz: why is Alex Jones punished but not, you know, liberals? That’s a political conversation about free speech that does not map onto the statute. That’s the first thing you just said, right?
Speaker 3: Totally. Like, the statute wants them to do this. What Ted Cruz is saying is, let’s get rid of 230, period, including (c)(2), which is not at issue in this case but which is the whole point of the statute: to encourage the blocking and filtering of content that’s objectionable in the eyes of the interactive computer service or the user.
Dahlia Lithwick: Good.
Speaker 3: Right. That’s totally a political move. It’s not statutory interpretation.
Dahlia Lithwick: Right. And what you’re saying, and this is so important, Danielle, is that the court kind of took a political case. They took a case that they’d been asked to take, and there was great urgency (I know Justice Thomas was really champing at the bit to decide it) as a political case. But it’s really a tough, as you say, intent and purpose and legislative-language case to map onto that political story. The second thing you just said is really important, and I want to say it again for listeners: it’s clear that the plaintiffs in Gonzalez couldn’t have sued Google just for hosting the videos. Right? That’s 230; that’s a slam dunk. So the claim here is that YouTube’s algorithm, which was pushing out this violent, radicalizing material, is the problem. That’s what we’re actually talking about in this case. And so what everybody thought was happening at the court is, in fact, not what’s at issue. Right? Those are the two things you just said to me. Yes? Okay. Now let’s do Justice Jackson. Go.
Speaker 3: So in all the questioning, there’s so much misdirection happening. There’s the line of questions, I think from Gorsuch and Kavanaugh, that these algorithms are neutral, which is, A, untrue as a matter of technical fact, and, B, has nothing to do with the question. There’s nothing in Section 230 that would ever even suggest neutrality. In fact, it’s an anti-neutrality statute. It says: hey, your job here is to moderate, and we’re going to give you incentives to moderate.
Speaker 3: Okay, that’s the first bit. There’s another set of misdirection, which is (I think Kavanaugh says this): isn’t this a problem that’s going to wreck the economy? Again, he’s engaging in politics. That’s not the issue here. I thought you were a judge; you’re going to read the briefs. You know, I thought these justices would be well prepped by their clerks and would ask searing questions about the legislative history and the actual words and how these provisions fit together. And then there’s Justice Kagan, who’s like: we can’t figure out the Internet, we’re just nine people who don’t understand anything. Which is simply not true, right? They’ve issued lots of opinions about complicated network technologies in the Fourth Amendment context. It’s not that they can’t figure out network technologies; they get it when they’re interested.
Speaker 3: The only justice who’s on it is Justice Jackson. I was swooning, you know, like, truly. She has read the briefs, she has read the legislative history, she has read the literature and the scholarship. I was like: oh my goodness, yes. She truly digs into the statutory words, into exactly what they say and how they work together. She digs into the title. She knows the history, that Cox and Wyden wrote Section 230 in response to Stratton Oakmont v. Prodigy. She knows precisely what the Prodigy finding was and what Cox and Wyden were responding to. She echoes explicitly the multifaceted purposes that Congress lays out in 230(b). She lays them out. She gets it. You know, it’s not going to be palatable to some.
Speaker 3: The folks who view this as a political problem are not going to like the fact that in 1996, Congress didn’t imagine a world in which ISPs would be using algorithms and mining data as a business model, that they would be pressing content on us for their own purposes, that their business would be putting content before us so that we clicked and shared on their ads.
Speaker 3: They were imagining the early bulletin boards where people put up content. You know, Money Talk was a bulletin board; there were countless bulletin boards that these ISPs created. And Cox and Wyden didn’t want them to bear responsibility for everything that was posted, to be strictly liable for that content if it was defamation, because they wanted them to try to moderate and to engage in those kinds of filtering efforts.
Speaker 3: But what we have today is a different Internet. I literally laughed out loud when Kavanaugh was like: golly (it sounds so Morton Horwitz), what would happen to these companies if they had to bear liability? And the answer is: these are the five biggest, fanciest, billions-of-dollars-market-cap dominant players. I’m not going to cry for them here. They would operate as their offline counterparts do and have to bear responsibility, and only where they recommended.
Speaker 3: I mean, if the theory works out for the plaintiffs, and we don’t know that it would, they would bear responsibility for their own actions. We would enter a world in which these network tools and services that engage in exploitation of our data would have to be responsible for some of what they did. They would not be responsible for over-filtering; (c)(2) still stands, right? We’re not going to lose our marbles here. But it would be a proper interpretation of (c)(1). Finally, we wouldn’t be in this land of over-interpreting (c)(1).
Speaker 3: Right. But the justices seem to think they’re out over their skis here, that they can’t interpret the statute. But, you know, as Justice Jackson showed us, she’s brilliant; they can interpret the statute. And then, if Congress wants to revise the playing field, so be it. Because it’s almost like a super immunity that we have here. It’s an unqualified immunity, so to speak, an absolute super immunity: anything that happens on the Internet, they’re immune from it. And that shouldn’t be the case. That wasn’t the point of (c)(1).
Speaker 3: And so I hope they listen to Justice Jackson and heed her lessons, because she read the briefs. She understands the stakes. She understands the theory of liability the plaintiffs are pushing. It’s not about treating YouTube as a publisher or speaker for leaving up, for failing to remove, ISIS videos. The theory of liability isn’t about the ISIS videos and not taking them down. The theory of liability is about the business model of YouTube: using an algorithm to recommend, using our data in a very sophisticated way to press content.
Dahlia Lithwick: Okay. So I was evidently going to waste your time, and our listeners’, by playing both that Kavanaugh quote and the Kagan quote, which I think you just pretty persuasively demonstrated are a little bit orthogonal to what we really want to do here. Which is for you to explain to us: if the court were taking it seriously, in the fashion that Justice Jackson did take it seriously, and they wanted to do a thing that is not “too big to fail” (right, like, oh, there’s too much money, we can’t do anything), is there a fix here that the court could pick its way through? Can you write that opinion for me?
Speaker 3: I could, easily. I feel like I’ve written it in a series of law review articles, you know, where I explain that the overbroad interpretation of the statute has led us to a land that misunderstands Section 230(c)(1) and (c)(2) and how they operate together. And, you know, we have instructions, a blueprint, from Cox and Wyden, and we can go back to the origins, we can go to the language.
Speaker 3: So the decision would read, and I’m imagining this is what Justice Jackson would write: Section 230 does not immunize YouTube from civil liability here, because (c)(1) is inapplicable. What’s at issue is YouTube’s own conduct: the algorithmic recommendation system that they built and make tons of money from, in which they use our data and recommend things. This lawsuit isn’t about treating YouTube as a publisher or speaker of information that they failed to remove or left up. So it’s a hard problem, of course, because there are all these downstream consequences, which is the policy question that comes next.
Speaker 3: But, Danielle, you might say, doesn’t Justice Jackson, or any justice, have to wrestle with the fact that so much of the tools and services we use online employ all types of tools that mine our data to make recommendations? And will that open these companies up to liability? And the answer is: it might. Plaintiffs would need genuine theories of liability, right? And those genuine theories of liability would have to get past
Speaker 3: Rule 12(b)(6) motions to dismiss, on the grounds of legal cognizability, even after we deal with the question of immunity. So there’s no blanket immunity, but then, of course, you’ve got to have some theory of relief that works. So I guess my policy response (and this is not a legal, analytical, statutory-interpretation response) to the concern that companies are going to face liability, the way any other industry has to face liability for its business model, is: well, let’s see what happens.
Speaker 3: And if Congress wants to step in and provide a Section 230 2.0, where they explicitly draft a law that says this is a super immunity, that it covers anything that happens at the content layer, including recommendations, then, if they want to write that statute, do it, friends. But that’s not the statute that was written in 1996 and that has been interpreted in an aggressively overbroad way.
Speaker 3: I thought the court was nine justices who are really super smart, and when the lower courts are making a terrible hash of things, they go and figure it out; they fix matters. I mean, that was my understanding always; I was an avid listener to you, after all. I know what they’re supposed to be doing. I also know what they haven’t been doing. But I also know what their purpose is: these nine brilliant people in black robes. And they can do it.
Dahlia Lithwick: So just to be perfectly clear, you’re saying: look, the problem is that YouTube is mining our data and pushing out crazy crap. There’s a version of the algorithm theory that is right here; it was not pursued correctly. You’re also saying there is a super immunity, but it’s not going to get resolved just by going after YouTube. In other words, there’s some merit to the claim here; it’s not argued correctly and it’s not understood correctly by the court, but there is a pathway to fixing this. And I think you’re ultimately saying, by the way, that where the court can’t fix it, the court can get out of the way and let Congress fix it, and not make it worse. That’s what you’re saying?
Speaker 3: Yes. Yes. Like, my version of the world is, I didn’t really want the court to take this case. Dr. Mary Anne Franks and I (so, I’m the vice president of the Cyber Civil Rights Initiative, and Dr. Franks is our president) wrote an amicus in which we offered what we understand to be the true principal purpose of 230 and its early understandings. You know, we sort of walked through Stratton Oakmont v. Prodigy. And the court could get it right and we could still be unsatisfied. In my scholarship, I have offered reforms for Section 230: narrow reforms that get at the bad Samaritans, that focus on the kinds of costs that the current interpretation of Section 230 has left on the table to be borne by victims, who bear all the harm of the privacy violations and cyberstalking.
Speaker 3: So I’m talking to Congress, because I think that’s the right spot for all of this. But Justice Jackson, I think, rightfully wants to reset the hash the lower courts have made of Section 230. They have applied it even where the theory of liability has been about what companies have done themselves, the design of their sites.
Speaker 3: I’m thinking of Carrie Goldberg’s case against Grindr, where the theory of liability is products liability: hey, Grindr, it’s how you built this app that is the wrong. And courts have dismissed those claims. You know, I’d love it if the courts also got it right, if they didn’t just treat 230 as a free pass and could interpret it in a correct way. But, you know, the political questions are going to remain. And so if we’re unsatisfied: okay, Congress, I’ve got some solutions for you. I’ve drafted a statute for you in my scholarship, and I’ve been working with some of the folks on the Hill. So it’s not like we can’t do it. It’s just two different projects.
Dahlia Lithwick: And certainly, whatever the project was that happened on Tuesday, it has nothing to do with the project you and Dr. Franks are talking about, right?
Speaker 3: What Dr. Franks and I tried to do in our amicus brief was basically (and we spoke to Justice Jackson; I think she heard us, right?) to level-set and get back to the proper understanding of Section 230: moor us in the legislative history, moor us in Stratton Oakmont v. Prodigy. Let’s look at the language, right? Let’s not get confused by some definitional section that Gorsuch unmoored from the statute itself. He’s a textualist, but he forgot about the text, to be honest. Right. But Justice Jackson was on it. She got it. I just don’t know if the rest do.
Dahlia Lithwick: Can I ask you just briefly: I know you didn’t tweetstorm Wednesday’s argument in the Twitter companion case, and it somewhat avoided the pitfalls of Section 230 but got mired in sort of the same thicket of causation and foreseeability under a different statute, the Justice Against Sponsors of Terrorism Act. At Tuesday’s argument, it sounded like Justice Amy Coney Barrett was suggesting that if they resolve this in the Twitter case, they could resolve both at once, and the court wouldn’t have to think about 230. I’m trying to think of a way out here. In other words, as you said, the court is way over its skis; there’s no way to do this small. Is that the way out, if the court wants one?
Speaker 3: It seems to be. That is, if they decide to tackle that question and find that there’s no aiding and abetting, that it couldn’t conceivably be aiding and abetting under this amended part of the Anti-Terrorism Act, that there’s no cognizable claim under aiding-and-abetting liability, yeah, then it seems like they could decide that, right? And that would be a way out. What do they call it, a 230 escape hatch? They can decide it on the JASTA amendment to the Anti-Terrorism Act and just say: okay, there’s no aiding and abetting, and therefore we don’t have to deal with any of this immunity question.
Dahlia Lithwick: So, Danielle, we started with me saying, and I don’t mean to be grumpy about this, but sometimes I think that justices who can’t figure out how their garage door opener works are always fighting the last war. And this isn’t, like, ageism. It’s even the younger justices, right?
Dahlia Lithwick: Technology is changing so fast that there’s a weird, weird way in which we are now fighting about content moderation on YouTube while, like, ChatGPT is going to, like, break the world, right? And so I guess I just want to ask: I just told you to write this opinion as though you’re Justice Jackson; now I want you to tell me, are we just always going to be belated in the way we approach this? By the time this stuff gets to the court or goes back to Congress, technology has changed so, so dramatically that that which was existential three years ago is just a fond memory. We’re regulating dinosaur content moderation. Is this just the nature of the beast now?
Speaker 3: So I’m going to disagree a little bit, because I think if we get our values right, if we can figure out what really matters to us, then we can tackle the technologies that come our way, so long as we write our statutes in a way that’s sophisticated enough and careful enough that we’re not going to be outpaced by the technology. And you’re absolutely right that technology changes so fast. But if we can map out the values that matter to us (that’s what our work on intimate privacy is, sort of mapping out the things that matter), if we figure out what we care about and the values we want to protect, the values that are a precondition to human flourishing, gosh, then we can do it. I mean, we shouldn’t throw up our hands and say we can’t figure this thing out. It doesn’t mean it’s easy.
Speaker 3: So, working with lawmakers on the state and federal level, I can tell you it’s never easy. But we see judges do a good job. You know, once we were told at the Cyber Civil Rights Initiative that you couldn’t draft a statute to criminalize nonconsensual intimate imagery because it would never pass through the crucible of strict scrutiny. And we have, in five cases, come out the other side with statutes that are totally consistent with the First Amendment. We can do that if we do it with enough care. We can protect intimate privacy; we can protect free expression. We’ve got to figure out the values that we think are most in jeopardy, and we’ve got to act carefully. We can do it. I’m not deterred, so to speak.
Speaker 3: But we’ve also got to read the briefs. We’ve got to read the history. Like, the court can’t just show up to the club unprepared. And that struck me: the most disappointing part of the argument was that the only person prepared, brilliant as ever, was Justice Jackson. Everyone else winged it. They were literally the frat boys with all the beer. They’re like: we can’t figure this out; what are you talking about? Somebody whispered in their ear at the party the night before: neutrality, it’s too hard, wrong institution.
Dahlia Lithwick: Compared to a bookstore. Compared to a bookstore? Yeah.
Speaker 3: Yeah, totally. Like they had their little set thing. Somebody wrote them a cheat sheet. But they didn’t do their homework. Except for Justice Jackson. So that upsets me. I’m like, I’ve been studying so hard. What do you mean?
Dahlia Lithwick: So, Danielle, at the risk of asking you to, like, explain your whole career standing on one foot, I do think maybe you could play us out with a list of those values that you want us to center. Because, you know, we’ve talked about revenge porn and violence, and I think it would be useful going forward. If you and I can agree that the court is not going to radically rewrite Section 230, and that they probably want this case to go away, but also that this was not the day to do what the court played at doing this week, then what are the values we should be centering as we think about ChatGPT and all the ways in which technology is changing at lightning speed?
Speaker 3: Yeah. I mean, these technologies, these tools and services, are indispensable to our lives, so we all should have a meaningful chance to use them, and at the same time to use them for free expression and sexual expression and all the ways that we want to make the most of our lives and work: you know, fall in love, meet people, network, create opportunities for democratic engagement. We want to do all those things. And at the same time, those tools can be weaponized against us. All the while that we are doing things that are really important to our careers and our ability to engage with other people and to love, those tools are engaging in persistent, continuous, indiscriminate surveillance of our intimate lives.
Speaker 3: And in doing that, you know, in all the ways we use these tools, we’re not thinking, when we use our Amazon Echo, that it’s recording and storing in the cloud, and then potentially leaking, our private conversations in our kitchens. We’re not thinking, as we use our period-tracking apps and our dating apps, as we search adult videos on Pornhub, as we use our search engine (which is the key to our soul, right? what we’re searching and what we’re thinking and what we’re browsing), that all of that information is being used, shared, stored, sold, and exploited against us in ways that have implications for our life insurance premiums, for the jobs that we do or don’t get.
Speaker 3: And so the values that I want us to sit with and think about: we’re using all these platforms in ways that are so pro-social, and at the same time we are the object; we’re being turned into objects and manipulated and exploited. And I want us to think about how important the privacy around our intimate life is, right, around our bodies, our health, our sexual orientation, our sexual activities, our close relationships: the privacy that we want, that we expect, that we deserve.
Speaker 3: Right. As we use these tools and services, in the bedroom, everywhere (I mean, my phone goes everywhere I go), preserving and protecting the privacy of the data around our intimate life is so important for us to be able to figure out who we are and develop our identities. It’s so important for us to enjoy self-esteem and social esteem.
Speaker 3: So when a content platform encourages people to post nonconsensual intimate imagery, the cost, to so many people, more often women and sexual and gender minorities and racial minorities, is that you become just a fragment. When people see those images, you become just a body part, right? You’re not a subject; you’re an object. You lose your social esteem. And if we didn’t have intimate privacy: like, if we use these tools, as I am right now to call you on the phone, to get to know each other and to form friendships, right, and fall in love, if we don’t have that privacy, we can’t form thick relationships. We need intimate privacy to be reciprocally vulnerable and to trust each other.
Speaker 3: And Charles Fried, I always quote him, because it’s the greatest quote in the world, from his 1970 book An Anatomy of Values, where he said privacy is the oxygen for love. And it is. And that’s on the line. You asked me: what are the values, what’s on the line when we use these network tools and services, just to go back to our YouTube, right? What’s on the line is our capacity for love, our capacity to communicate with privacy so we trust each other. What’s on the line is our ability to get jobs and keep jobs, our ability to figure out who we are and express ourselves in ways that feel safe. Because privacy isn’t just “me”; it’s “we,” it’s “us.”
Speaker 3: And so, as we think through legislating, or even the common law, courts, policymakers, as we think through what matters: the stakes, when we’re talking about online life and all these tools, are our intimate privacy, our civil rights, and our liberties. And we often forget that when a site amplifies, recommends, makes money off of, uses our data to recommend nonconsensual intimate imagery, there is a cost to the sexual expression and the free expression of the privacy victims. Because they’re leaving online life. They’re shutting down their LinkedIn. They are not using YouTube. They are literally completely removing themselves from any online engagement, and offline engagement too; their friends don’t talk to them. You’re vanquishing the speech opportunities of victims. And so we’ve got to have all of those values in mind.
Speaker 3: You know, as we think about all the kinds of policies: content moderation is a beautiful thing. And I have to say, having worked with companies for 12 years or more, 15, we have seen industry self-regulate in the ways that 230 was meant to encourage. We see companies responding to nonconsensual intimate imagery. I wish we could touch those 9,500 sites whose raison d’être is intimate image abuse, right? I can’t. But companies are engaging in that project of content moderation in ways that protect victims so they can express themselves. And so I guess I want those values on the table. Those are the kinds of conversations I’ve been having with lawmakers, with judges, with companies, you know, with all of us, so that we have them in view as we make these decisions.
Dahlia Lithwick: Danielle, sometimes I think of you as the world’s biggest brain in a vat, and I forget a little bit, until I hear you say things like what you just said, that you’re also the world’s biggest heart in a vat. And it’s just such a treat to have you unpack this on the show for us today. These cases are both so important, but, as you are urging, so important to get right.
Dahlia Lithwick: Danielle Citron is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law at UVA. She writes and teaches about privacy, free expression, and civil rights. Her scholarship and advocacy are so important: she was named a MacArthur Fellow in 2019 based on her work on cyberstalking and intimate privacy, and her brand-new book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, was published this past October. If anything Danielle said resonated with you today, please, please buy it and read it, because this is cutting-edge work. Danielle, I cannot thank you enough for helping us unpack what felt really hypertechnical and abstract, because it was hypertechnical and abstract, but also wrong. So thank you so much for helping us pick through it today.
Speaker 3: And thank you, Dahlia. You are also my heart and my brain, and so I read and listen to you. I have Lady Justice in both audio and print; I’m 98 percent through the audio, but I had already read the book. So you inspire me so much. It’s so meaningful for me to be on Amicus and to read you and to hear you all the time as I take walks. So thank you.
Dahlia Lithwick: We are now at the Slate Plus bonus segment in which Mark Joseph Stern tells me all the stuff that I forgot to think about this week because I was too busy thinking about the other stuff. So, Mark, welcome back.
Mark Joseph Stern: Hi. So happy to be here.
Dahlia Lithwick: And I wonder if you want to start with loan forgiveness, which the court is taking on next week. Oh, I know, I know: we’re going to start with the lows and work our way to the highs, Mark. It seems to me the central issue here is whether President Biden exceeded his authority by implementing this loan forgiveness plan without going through Congress. It’s going to be a thing. Tell us what’s to come, and tell us how extremely well this is going to go at the Supreme Court next week.
Mark Joseph Stern: So, I mean, yes, that is in some ways the central issue: whether Joe Biden had the legal authority to forgive up to $20,000 in student loans for about 40 million borrowers. But to me, the bigger question is whether the Supreme Court will totally rewrite and manipulate the rules of standing simply to let a couple of spiteful right-wing litigators and red states tear down one of Biden’s signature executive policies. These cases, I think, are tricky on the merits; I’ll acknowledge that there are good arguments in both directions.
Mark Joseph Stern: Congress passed this law, the HEROES Act, after 9/11. It says that the Secretary of Education can forgive student loans when a borrower is negatively affected financially by a national emergency. COVID was a national emergency; it was declared as such by two separate presidents. The question is: well, Congress wasn’t really thinking about a pandemic when it passed this law. Congress wasn’t really thinking about 40 million borrowers getting their loans forgiven at once when it passed this law.
Mark Joseph Stern: Can we say that the law applies by its plain terms, because the text says that loan forgiveness is all right, when we know that at the end of the day it wasn’t actually Congress’s intent or Congress’s desire to allow this kind of one-size-fits-all, broad-based loan forgiveness? I think that it’s clearly legal. I think that the HEROES Act text should control, and the text says that the Secretary of Education can forgive loans. It’s pretty straightforward. The other side is going to invoke the major questions doctrine that the court used in West Virginia v. EPA and say: well, if Congress wanted to forgive 40 million people’s loans all at once, it would have passed a law expressly saying that, and because it didn’t, we have to strike this down.
Mark Joseph Stern: Setting that dispute aside, I think the number one issue here really is: how the hell do these plaintiffs have standing to sue? And it’s two different sets of plaintiffs. In the first case, it’s literally two borrowers who are disgruntled that they did not have an opportunity to file a formal comment on student loan forgiveness before it went into effect, before Biden sprang into action. And they’re saying that because they didn’t have an opportunity to comment, they have standing to challenge it and to prevent 40 million people from getting their loans forgiven.
Mark Joseph Stern: An especially absurd theory of standing, given that the HEROES Act expressly states that the Secretary of Education is not required to take public comments when he forgives student loans under this law. And yet a Trump judge in Texas went all the way: he said, of course these folks have standing, said the program was illegal, and shut it down nationwide. That’s the first case.
Mark Joseph Stern: The second case comes from a group of red states led by Missouri. Most of the states’ claims are totally frivolous. Missouri has a little bit more of a foot in the door because it has this quasi-governmental agency called MOHELA. I call it “mo-HEL-la”; I don’t think there is necessarily a settled pronunciation, but it sounds like a kind of Coachella rip-off, so I’m going to stick with that.
Mark Joseph Stern: MOHELA is a student loan servicer. It is an independent entity under Missouri law; it has the power to sue and be sued by itself. MOHELA has chosen not to sue to challenge student loan forgiveness; MOHELA is not in this litigation. But Missouri has claimed the right, dubiously, to sue on MOHELA’s behalf, and it claims that it is going to lose out on money because a bunch of student loans that are serviced by MOHELA are going to be forgiven: MOHELA won’t be able to collect on them, and that will sort of trickle down to Missouri losing out on funds. It is an incredibly convoluted theory of standing. I guarantee you that if the subject matter here were environmental protection or consumer rights or discrimination or immigration or something that liberals care about, there would be no standing; it would be obvious. I mean, this Supreme Court is such a stickler about standing. It has said that you don’t have standing even when, to give one recent example, some credit company says in your file that you’re a freaking terrorist. The court says you still don’t have standing to sue when they call you a terrorist in your credit report.
Mark Joseph Stern: And so, to me, these cases are really depressing, because there are weeks like this one where the Supreme Court issues some decisions where it seems like they really are doing law, where it seems like they’re putting their heads together to reason out the best solution. And then there are weeks like the coming one, where it’s just going to be six partisans in robes getting really pissed off at Joe Biden for trying to do something a little bit good for some hard-working Americans. That is the kind of week that really makes me question my chosen profession, deep down in my soul, and it is, I fear, going to lead to a Supreme Court decision that permanently blocks student debt relief.
Dahlia Lithwick: Mark, if you are going to make MOHELA a thing, I think I’m going to call this “Mo’ HELA Blues,” in tribute to the old film. And let’s see if our two deep-down-in-our-broken-souls efforts at making something a thing can work. So listen, you just kind of stole my thunder for my next question, which is: this week, it seems that the court accidentally handed down some pretty good decisions. Leah Litman did a good piece for us about one of them; yes, it was by the skin of the teeth. But some pretty good decisions came down this week. What’s going on?
Mark Joseph Stern: Yeah. So two really important decisions that are sort of on the periphery not getting a ton of coverage, but I think they are important.
Mark Joseph Stern: I encourage everyone to read Leah’s piece for Slate on one of these cases. It’s called Cruz v. Arizona. It’s a pretty convoluted case procedurally, but basically, Arizona had enacted this scheme that, as Elena Kagan put it during arguments, is downright Kafkaesque. For many years, Arizona denied people convicted of capital crimes and sentenced to death a basic right that the Supreme Court had articulated years before, in the 1990s, which is: if prosecutors are arguing that you have to be sentenced to death because you’re dangerous, that you’re going to be dangerous for the rest of your life, and that if you’re ever let out of jail you’re going to kill more people,
Mark Joseph Stern: but the only alternative to a death sentence is life in prison without the possibility of parole, so that you won’t ever be let out of prison to kill people, then you get to tell the jury that. You get to tell the jury: hey, listen, these guys are trying to kill me because they say I’m going to be dangerous forever, but you don’t have to kill me. Even if you believe them, you can just put me in prison for the rest of my life, spare me a death sentence, and not worry that I’ll be out on the streets killing again. This is a constitutional rule that the Arizona courts ignored for years and years in a series of egregious decisions.
Mark Joseph Stern: Finally, the Supreme Court corrected the Arizona courts in 2016 and said: actually, you do have to follow the precedent that we issued several decades ago. And the Arizona courts responded by basically flipping the middle finger at the Supreme Court. Specifically, the Arizona Supreme Court just laughed off that decision and said, oh, this wasn’t a significant change in the law, and so none of these people who were totally screwed over by our ignoring the Supreme Court of the United States get to appeal the fact that their constitutional rights were egregiously violated; we’re just going to preserve their capital sentences and pretend like nothing really changed. And by a 5-to-4 vote (depressing that it was that close, but we’ll take it),
Mark Joseph Stern: the U.S. Supreme Court said: no, you can’t do that; that’s not how any of this works. John Roberts and Brett Kavanaugh joined the three liberals in a great opinion written by Justice Sotomayor, basically calling out the Arizona Supreme Court for defying precedent and saying that these death row inmates actually get to exercise and vindicate the constitutional rights that we have laid out.
Mark Joseph Stern: Very good decision, paired with another one. I won’t get as much into the weeds, but it’s called Helix Energy, and it’s a question that has been splitting the lower courts: whether people who get paid on the higher end of the scale can still qualify for overtime pay. It’s a big kind of textualist dispute; the Fifth Circuit divided over it along pretty unusual ideological lines. But the Supreme Court, by a 6-to-3 vote, said yes: even if people get paid more than
Mark Joseph Stern: the low wages that we typically associate with overtime, if they fall into this particular category, they can still claim overtime pay. They can still demand that they get compensated time-and-a-half for their overtime work. And these employees who had been screwed over get to demand back pay in court. A good decision for labor. Not a huge number of people are affected by it, but it is good to see that, again, when the stakes are super high and the questions aren’t loaded with ideological or partisan concerns, sometimes a majority can still kind of feel its way to the right answer on this court.
Dahlia Lithwick: And Mark, before I let you go back to your dark, sad, curled-up-in-a-ball place, I think we do need to talk a little bit about the Wisconsin Supreme Court primary this week. Last show, we talked to a sitting justice on the Wisconsin Supreme Court about what it was like to sit on a Trump election-denialism case, and we also talked about what life is like on that court. This week brought, I think, one of the most consequential pieces of signaling about what folks are thinking, at least in Wisconsin, about their Supreme Court elections, in the form of the primary. Kind of a huge, huge news story.
Mark Joseph Stern: It’s a huge deal, really overlooked, I think, by most of the press. I mean, the primary was a four-way race, and it’s officially nonpartisan, but it was really between two Republican candidates and two Democratic candidates. One of the Democratic candidates had really drawn support from most of the party. Her name is Janet Protasiewicz, and she has proven to be a darling of both left and centrist Wisconsinites. She was a fundraising genius, she ran a really smart campaign, and she easily came out on top and will face off against the Republican in the April general election. Who is that Republican?
Mark Joseph Stern: Well, here is my favorite part of this story. So Wisconsin voters had a choice. There were two Republicans in this race. One of them is a bona fide folk hero named Judge Jennifer Dorow, who oversaw the trial of the 2021 Waukesha Christmas parade attacker. She conducted herself admirably; she presided quite well over a very difficult proceeding. It was televised, and she started to draw a huge number of fans from both Wisconsin and all around the country. People sent her fan mail. They praised her to the hilt. People dressed up as her for Halloween because they loved her so much.
Mark Joseph Stern: And so she thought, I think quite reasonably: well, maybe, given that I have this massive fan base, I can be the Republican nominee in the Wisconsin Supreme Court race. But it was not meant to be, because running against her was Daniel Kelly, an incredibly bad candidate who previously sat on the Wisconsin Supreme Court but was ousted in 2020 by friend-of-Amicus Jill Karofsky, losing by nearly ten points. And losing once was not enough for Daniel Kelly. In the interim, he has tarnished his reputation even more by playing a critical role in the post-election effort to overturn the 2020 election. He allegedly helped the conspiracy to appoint fake electors in Wisconsin who would have handed the state’s electoral votes to Trump instead of Biden, even though Biden won by about 21,000 votes. He is part of the alleged coup. And Wisconsin Republicans said: we like that. They picked him over Jennifer Dorow. They picked the scandal-tarnished loser over the beloved folk hero.
Mark Joseph Stern: And so Janet Protasiewicz will face off against Daniel Kelly in April. I’m here to tell you it’s looking really good for the Democrats in this race. And that is, of course, a huge deal, because the court currently has a slim conservative majority, 4 to 3. But if Janet Protasiewicz wins this race, the court will flip to a 4-to-3 liberal majority. And that majority has all but said already that it is eager to protect reproductive rights and restore access to abortion in the Badger State. So this election is not only crucially important for voting rights, for environmental justice, for racial justice, for education, for all that stuff; it is also kind of a referendum on abortion in Wisconsin. And as we have seen time and time and time again, when an election comes down to a referendum on abortion in America, it seems like abortion always wins.
Dahlia Lithwick: Yeah, and I think you said this, Mark, but let’s just say it again: Protasiewicz explicitly ran on the idea that women should be allowed to make decisions and determinations about their bodily autonomy and their reproductive lives. It’s not something that she shied away from. And so I think you’re exactly right, Mark, to say this really will be hugely consequential. And to the extent that state supreme courts are the next battlefield, this is a really important battlefield. But it’s also a really important piece of messaging about not running away from an issue that progressives have run away from for a very long time.
Mark Joseph Stern: 100%. And, you know, I think so often about the last opportunity that liberals had to gain a foothold on the Wisconsin Supreme Court, in a 2019 race, and the liberal in that race, Lisa Neubauer. I mean, she’s a good judge; there’s nothing wrong with her. But she ran away from the issues. She specifically ran away from abortion. She tried to run as the candidate who said very little and simply glided into office as a totally nonpartisan, non-ideological, kind of centrist judge. And it didn’t work. She lost, because people were not motivated to vote for her. They didn’t see what rights she would protect; they didn’t see what she would do for the state. That strategy is old and busted. It does not work. And Janet Protasiewicz is showing Dems, not just in Wisconsin but everywhere around the country, what does work: standing up for the issues you believe in, centering reproductive autonomy among all of those issues, and saying, I will not back down from this fight.
Dahlia Lithwick: That is such a very, very high note to end on. I will add no gloss to it because those moments are few and far between in our conversations. Mark Joseph Stern covers the courts, the law, democracy, abortion, the death penalty. I don’t know what all else, all the things for us at Slate.com. Mark, as ever, it is great to talk to you and it is really, really great to be on a Zoom with you where you are, in fact, smiling with your whole face.
Mark Joseph Stern: For the first time in Amicus plus history. Always a pleasure. Dahlia, thank you.
Dahlia Lithwick: This ad free podcast is part of your Slate Plus membership.
Justice Elena Kagan (from the argument audio): I mean, we’re a court. We really don’t know about these things. You know, these are not, like, the nine greatest experts on the Internet.
Dahlia Lithwick: Hi, and welcome back to Amicus. This is Slate’s podcast about the courts, the law, and the Supreme Court. I am Dahlia Lithwick; I cover those things for Slate. And this week, the high court decided to take the Internet out for a spin, and the court came back confused. Regulating free speech on the Internet turns out to be kind of hard, especially for a court that hasn’t really thought very much about this issue, ever.
Dahlia Lithwick: So joining us today to talk about some cases that could literally strip Internet publishing right down to the studs is the wonderful Professor Danielle Citron, who is going to help us, and the justices, I hope, know what we don’t know when it comes to little things like content moderation and search algorithms and theories of causation. Happily, Professor Citron has been thinking about these issues for a really long time. Later on in the show, Slate Plus members are going to get to hear from Mark Joseph Stern as he pops in to discuss some of the first decisions of the term, which came down this past week, as well as the fate of President Joe Biden’s $400 billion student debt relief program, which will be heard by the court next week.
Dahlia Lithwick: But first, the Internet: not really a thing that can be fixed over a few hours in February of 2023, but props to the justices who were willing to give it a try. In a pair of cases this past week that ranged over many, many hours on two separate argument days, the consensus now seems to be that these issues are far too complicated, the consequences too vast, for the Supreme Court to just step in and take a big swing at regulating all Internet speech.
Dahlia Lithwick: In the first argument, on Tuesday, the justices heard for nearly three hours from the family of an American student killed in a 2015 ISIS attack in Paris. The family argued that Section 230 of the Communications Decency Act, the federal law that protects websites’ right to moderate their platforms as they see fit, needed to be narrowed by the court. At the heart of that case is the question of whether tech companies could or should be held liable for harmful content posted on their platforms by their users.
Dahlia Lithwick: The next day, on Wednesday, the court heard a companion case about whether Twitter should be held responsible for an Istanbul terror attack. That case sidesteps the Section 230 question, proceeding instead under a different federal law that allows some lawsuits for, quote, knowingly providing substantial assistance to terrorists. That argument also felt like a seemingly all-or-nothing mess. We want to be really clear that Internet violence, and the incitement of violence, is a serious problem. The question is whether the court could resolve it in a few hours this week.
Dahlia Lithwick: Joining us to discuss both cases is Danielle Citron. She is the Jefferson Scholars Foundation Schenck Distinguished Professor in Law and Caddell and Chapman Professor of Law at UVA, where she writes and teaches about privacy, free expression, and civil rights. Her scholarship and advocacy have been recognized nationally and internationally. In 2019, Citron was named a MacArthur Fellow based on her work on cyberstalking and intimate privacy. Her brand-new book, The Fight for Privacy: Protecting Dignity, Identity, and Love in the Digital Age, was published by W.W. Norton and Penguin Vintage UK in October 2022, and it was named one of Amazon’s top 100 books of 2022. Danielle, congratulations on the book. It is so thrilling to have you back on Amicus.
Dahlia Lithwick: And that is a wrap for this episode of Amicus. Thank you so much for listening, and thank you so much for your letters and your questions. You can always keep in touch at amicus@slate.com, or you can find us at Facebook.com/amicuspodcast. We always love your letters. Today’s show was produced by Sara Burningham. Alicia Montgomery is executive producer of podcasts at Slate, and Ben Richmond is our senior director of operations. We’ll be back with another episode of Amicus in two short weeks. Until then, take good care.