Could the Supreme Court Kill the Internet as We Know It?

Speaker 1: Number 14? No. Number nine, Eleazer Smith, Appellant, versus the People of the State of California.

Lizzie O’Leary: In October of 1959, the Supreme Court heard the story of Eleazer Smith, a Los Angeles bookstore owner.

Speaker 1: The appellant in this case, if the court please, was convicted and given a 30-day jail term for the possession in his bookstore of a hardcover book entitled Sweeter Than Life. The facts are undisputed. There’s no question but that Mr. Smith owned the store, and the book was, in fact.

Speaker 3: In the store.

Lizzie O’Leary: The book, which was an erotic novel, was also illegal in 1950s L.A.

Speaker 3: So Smith is arrested and he is convicted under a Los Angeles ordinance that says if you merely sell obscene materials, even if you’ve not read them and you don’t know about them, you can go to jail.

Lizzie O’Leary: That’s Jeff Kosseff. Jeff’s a law professor who, despite the overused expression, literally wrote the book on the most important law underpinning the modern Internet: Section 230 of the Communications Decency Act. And to understand that law’s origins, Jeff says, we need to know about Eleazer Smith.

Speaker 3: And fortunately for him, a civil liberties attorney takes his case all the way up to the U.S. Supreme Court.

Lizzie O’Leary: The court held that the L.A. ordinance violated the First Amendment because it applied whether or not a bookseller had even read the book in question.

Speaker 3: So what it effectively does is that it imposes a duty on the distributor of this content to pre-screen everything. And the Supreme Court said, you know, that’s just too much of a chilling effect on constitutionally protected speech. So they strike down this Los Angeles ordinance. And that really is what sets the playing field over the next few decades for all sorts of claims, usually against bookstores and magazine stands, both civil claims, like defamation lawsuits, as well as criminal actions, like obscenity cases, that are brought against the distributor of content that someone else has created.

Lizzie O’Leary: It also set the stage for the creation of a law, Section 230, that has shielded Internet giants and social media companies from legal liability for what users say on their platforms, in the same way that Eleazer Smith wasn’t liable for what was inside a book in his store. But now the Supreme Court is set to hear a case on Section 230 that could fundamentally alter Big Tech’s business model. Today on the show: 26 words created the modern Internet. What happens if the court hits control-alt-delete? I’m Lizzie O’Leary, and you’re listening to What Next: TBD, a show about technology, power, and how the future will be determined. Stick around.

Lizzie O’Leary: Part of the problem with being one of the best-known experts on this particular law, Jeff, is that people like me are always asking you for a very nerdy party trick. I know that you are probably sick of this, but you did write a book with the title The 26 Words That Created the Internet. Can you read them for me?

Speaker 3: Sure. No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

Lizzie O’Leary: You didn’t even really have to read that, did you? You just know it. You’ve even memorized it.

Speaker 3: Yeah.

Lizzie O’Leary: How would you characterize the role that those 26 words had in setting the stage for the modern Internet?

Speaker 3: So they’ve allowed companies of all sizes to build Internet businesses around content that users create rather than content that the companies create, because it’s really a risk-shifting strategy. It’s saying that any legal risk of defamation or anything else is going to be put on the people who post the content, and it won’t be placed on the platform that hosts it.

Lizzie O’Leary: Jeff says that while the underlying law could be read in a few different ways, the courts have generally interpreted it pretty broadly.

Speaker 3: I could have a website that encourages users to go onto my site and defame you, and I could have like 200 comments that people are saying, Yes, I’m following your instructions and they post defamatory things about you. You could call me and say, Hey, this is false. I have proof it’s false. I want you to take it down. And I could take it down. But section 230 also says, I don’t have to take it down.

Lizzie O’Leary: To understand the landscape in which Section 230 was written, you have to remember the internet of the early 1990s. Back then, if you were a regular person who wanted to get online, you were probably using a service like CompuServe or Prodigy. Both hosted forums and message boards where people posted all sorts of things. But the companies took very different approaches to what users said on their sites. CompuServe didn’t do much moderation of the content on its service, while Prodigy did. This being the Internet, users invariably said some offensive things, and both companies got sued. But it was CompuServe’s hands-off approach that was the more successful legal strategy.

Speaker 3: They’re able to get the case dismissed, because what the judge says is, you’re just like Eleazer Smith’s bookstore. You didn’t know about this content. You had no reason to know. So we’re not going to hold you liable for defamation. You know, a few years later, Prodigy tries the same thing and its efforts are rejected. And the judge says, you’re different than CompuServe, because Prodigy actually wanted to be a family-friendly service. So it marketed itself as a place where they actually do a lot of moderation and they keep all sorts of garbage off their services, because they want, you know, kids to be able to use it for their homework at night without the parents fearing that they’ll be exposed to inappropriate material.

Speaker 3: So the judge says, because you do all of this, you’re more like a newspaper than a newsstand. And just like a newspaper, you are liable for every single thing in your pages. And this is in 1995. And it starts to get a lot of attention from Congress and in the media, because it stands for the proposition that if you moderate content, you actually can increase your liability.

Speaker 1: I have here and have had the opportunity to share with several members of the Senate on both sides of the aisle what I refer to as a blue book.

Lizzie O’Leary: In 1995, Congress rewrote the telecommunications law for the first time in 60 years, and a lot of people, including lawmakers, thought the Internet was a terrifying place full of weirdos and pornography. Nebraska Senator James Exon famously brought a blue binder onto the Senate floor full of porn he said he’d printed from the Internet.

Speaker 1: I would hope that all of my colleagues would, if they’re interested, come by my desk and take a look at this disgusting material, pictures of which were copied off the free Internet only last week, to give you an idea of the depravity, on our children, possibly our society, that’s being practiced on the Internet today.

Lizzie O’Leary: At Exon’s urging, the older, less tech-savvy Senate attached the Communications Decency Act to their version of the telecom bill. The CDA made it illegal to knowingly send or show minors indecent content online. But over in the House, members were taking a very different approach.

Speaker 3: So you have a Republican, Chris Cox from Orange County, California, and a Democrat, Ron Wyden, who at the time was in the House from Portland, Oregon. And they want to come up with an alternative, and that’s Section 230. What Section 230 does is it solves this Prodigy problem by saying that if you’re an interactive computer service provider, you won’t be treated as the publisher of content that someone else provides. Rather than have the government criminalize certain types of constitutionally protected speech, we’re going to put it in the hands of these online services and also the users.

Speaker 3: So in some sort of congressional magic, both the Communications Decency Act and Section 230 get put in the same part of the telecom bill, even though they conflict with one another. But the day that President Clinton signed it into law, you have civil liberties groups challenge the constitutionality of the Senate bill, and that goes within a year and a half up to the Supreme Court. And they strike down the CDA. So basically, all that’s left of this Internet part of the telecom law is Section 230. And as an interesting footnote, there was barely any coverage of Section 230 when it was being introduced and passed. There was no talk about, you know, liability protections, because you were dealing with companies like Prodigy and CompuServe and AOL.

Lizzie O’Leary: Right. Nobody imagined the current Internet.

Speaker 3: Yeah. I mean, they didn’t have the influence that any of the big platforms have right now.

Lizzie O’Leary: When we come back, Section 230 is left standing. And guess what? The platforms are pretty influential now.

Lizzie O’Leary: As tech platforms have grown in their influence and power, Section 230 has become a pretty popular target for politicians on both the left and the right. Republicans claim the law allows platforms to censor conservatives, while Democrats say it lets them wriggle free of their responsibilities to moderate content. But the courts have pretty vigorously upheld the law and ruled that it gives companies a broad liability shield, which is why the Supreme Court’s recent decision to even hear a Section 230 case is so significant. This case, Gonzalez v. Google, centers on a young American college student, Nohemi Gonzalez, who was killed in the 2015 ISIS attacks in Paris. Her family sued Google, claiming that YouTube, which is owned by Google, violated the Anti-Terrorism Act when its algorithm recommended ISIS videos to other users.

Speaker 3: Now, what the plaintiffs in the Gonzalez case really focused on was, they said, you know, we’re not just seeking to treat YouTube as a publisher. What we’re going after YouTube for is the targeted promotion of certain content. If people are searching for things related to ISIS, YouTube will then, at least according to the complaint, start recommending similar content. The policies have shifted internally for YouTube since then. And this is part of the radicalization and propaganda process.

Speaker 3: So what the plaintiffs in the Gonzalez case are saying is that it’s that sort of algorithmic promotion of terrorist content that is not within the scope of Section 230. Now, courts, including the Ninth Circuit, where this came from, and the Second Circuit, have rejected that. And they say, you know, that’s part of the editorial and curation process that Section 230 protects. So just like we wanted Prodigy to have the discretion to block pornography without suddenly being able to be sued by all sorts of people like a publisher, same thing for these sorts of cases. And there have been some judges who have dissented, including in this case in the Ninth Circuit, who have said, no, this is different, and that, you know, YouTube and other social media sites really contribute to the harms in how they target the content.

Lizzie O’Leary: But what there isn’t, Jeff notes, is a split in opinions between appeals courts, which makes it even more notable that the Supreme Court took this case.

Speaker 3: Usually when the Supreme Court takes a case, it’s because you have what’s known as a circuit split. So you have maybe the Ninth Circuit and the Fourth Circuit read a statute in totally different ways. But on this issue, Section 230 really has not been read in different ways by different courts. So it’s pretty interesting. I was pretty surprised that the Supreme Court took this case because there’s not really a split of authorities here.

Lizzie O’Leary: Well, there’s something in the Ninth Circuit’s opinion that I thought was interesting. The majority there wrote that Section 230 shelters more activity than Congress envisioned it would. And when I look at that and think about the court taking this case, it seems to me like the Ninth Circuit left a lot of space for either the Supreme Court or Congress to step in and say, this is how Section 230 should be interpreted. Do you think I’m overreading that?

Speaker 3: I think that’s definitely one possibility, because the Supreme Court’s never taken a Section 230 case before. I mean, this is a really difficult case with tragic circumstances. And in other cases like this, judges have said, you know, we have to find that Section 230 protects the activities here, but we don’t like it. And, Congress, you have to do something about it, or someone has to do something about it.

Speaker 3: This happened actually in 2016. There was a lawsuit from child sex trafficking victims against Backpage, where they were trafficked. And this went up to the federal appeals court in Boston. And the judges basically started off saying, you know, this is a really difficult case. We have to rule against the plaintiffs, but Congress really is the only one that can do something. And then Congress, two years later, did do something, and they amended Section 230 to exclude certain types of claims regarding sex trafficking and prostitution.

Lizzie O’Leary: Why do you think the court took this case?

Speaker 3: You know, I wish that I could get into the minds of at least a few of the justices, because, I mean, you would generally need four justices to take this case. The only person we have some insight into is Justice Thomas, because he’s written that he at least strongly questions the interpretation of Section 230 by the lower courts. But he can’t be the one who alone grants cert.

Lizzie O’Leary: He basically said it’s a little unclear whether this current state of immunity is really what the law intends. Right?

Speaker 3: Yeah, exactly. So he definitely has his view, but I don’t know for certain, and there’s no way to tell how some of his colleagues view it. And I mean, there are so many possible outcomes, because you don’t really know what’s going on behind the scenes and why the justices want to hear the case.

Lizzie O’Leary: But there’s no question that the political climate on both sides of the aisle is ripe for some sort of reexamination of Section 230, even if lawmakers haven’t been willing or able to do it themselves. I mean, for as long as I’ve been doing this show, we have been talking about Section 230. And one of the things I find interesting about it is that Republicans and Democrats bring it up for varying reasons. And it is sort of a political hobbyhorse. I wonder if you think this is reaching a head.

Speaker 3: I think it might be. I mean, obviously it’s in the news a lot. It’s hard to fully distance the Supreme Court from the political realities. I think there’s a lot of misunderstanding about Section 230 from both the left and the right. They both have real concerns about the action or inaction of the big tech platforms, but they seem to think that Section 230 changes will be some sort of magical solution. The left really wants content that they view as harmful to be taken down more frequently. But the problem with that is that much of that content is constitutionally protected. So even without Section 230, the government can’t directly or indirectly cause the content to be taken down.

Speaker 3: So, things like what the government deems to be COVID misinformation, I mean, that comes up a lot. The right tends to be more concerned about what they view as liberal tech platforms unfairly censoring conservative viewpoints, often because of their hate speech policies or misinformation policies. And I mean, there are certain instances where I think that’s very valid; there have been cases where platforms have not acted in the best interests of their users and maybe too quickly declared things to be misinformation, and they err on the side of taking things down. But the First Amendment also blocks the government from forcing them to keep it up. And that’s a good thing also.

Lizzie O’Leary: Because that’s their corporate free speech, right?

Speaker 3: Yeah, it’s their corporate free speech. And, you know, I kind of have an old-school belief in the First Amendment. I don’t want the government forcing the takedown of material, and I also don’t want the government forcing private companies to leave up the material. I want the government out of it. And that used to be really not a controversial viewpoint, but there really aren’t all that many people who truly share that belief anymore.

Lizzie O’Leary: The court has the ability to alter the way Section 230 is interpreted and by extension, change the Internet. But it’s unclear how far the justices are going to go. What kind of power does the court have here in terms of how Section 230 could be applied kind of going forward?

Speaker 3: They could address the very narrow circumstances involving this specific case and how it applies with the Anti-Terrorism Act, and that would affect some cases. But they also could perhaps say that any algorithmic promotion of content is not covered by Section 230. I think if they did that, that would really radically change both Section 230 and how platforms operate.

Speaker 3: You question how some social media sites would even function without any algorithmic promotion. I mean, how would TikTok operate? I think it’s just become so much a part of how we use the Internet that I don’t think it’s really practical. I don’t know how a search engine would operate if you said that, because just by virtue of using an algorithm, you are going to favor some content over other content. And I don’t think that we want to have a legal interpretation that says, you know, let’s not use algorithms anymore. I don’t think that’s good for anyone, because that also blocks a whole lot of stuff that I don’t think many people would say is useful. I mean, spam. We don’t want to let every spammer basically have equal access as our family members do. I think that there are just a lot of practical problems.

Speaker 3: The Supreme Court also, I mean, if they really want to open it up, they could say, and this is a reading that could have been taken of Section 230, I think it’s a less accurate reading, but the courts could have early on said that all Section 230 says is that you’re not treating the platforms as publishers, but they could still have that distributor liability that the magazine stands have, so that they could be liable if they know or have reason to know of the content. That would change things pretty radically, because then every time a platform got a complaint about content, they would face a choice of either taking it down or possibly defending a defamation case in court.

Lizzie O’Leary: That seems like it would put them out of business.

Speaker 3: Yeah, I mean, you think about Glassdoor, which is actually a site that relies on Section 230 quite a bit, even more than kind of big tech, because they have fairly controversial content that businesses don’t really like. If Glassdoor has these negative employee reviews, and the company emails Glassdoor and says, you know, we want you to take it down because we say it’s inaccurate, right now Glassdoor doesn’t have to do that. But if you basically say Section 230 is just the magazine-stand standard of liability, then Glassdoor would have a really strong reason to take down everything that people complain about. And then suddenly every workplace that you look for on Glassdoor looks like it’s perfect, because all the negative reviews are taken down.

Lizzie O’Leary: You have spent so much time thinking about Section 230, immersed in it. And every time it comes up in a big political fight, or some member of Congress talks about it, or now when it is headed to the Supreme Court, I wonder what you think people miss.

Speaker 3: I think they miss the fact that you’re not going to be able to fix everything about the Internet by changing Section 230. And that’s a bold statement from someone who wrote a book about Section 230 calling it the 26 words that created the Internet. But despite the fact that it had this really fundamental impact on these Internet businesses, there’s so much that you can’t address by changing a statute.

Speaker 3: A lot of it is about human nature. There’s so much of a focus on the supply of bad stuff on the Internet. And I think the person who probably captured the issue best was President Obama, who earlier this year gave a speech about misinformation at Stanford. And he said, why are we always talking about the supply when there’s a demand problem also? What is it about all of this harmful content on the Internet that attracts eyeballs, and that gets people to believe it and gets people to act on it?

Speaker 3: And that’s not something that you’re going to deal with through Section 230. That’s an education issue. That’s social inequality. That’s, I mean, criminal law. There are all sorts of things. Just changing whether Facebook or Twitter can be sued for defamation, which is really what so many of the Section 230 cases come down to, is not going to address the fact that people are doing some really harmful and dangerous things. And so I’m happy to see Section 230 still in the national dialogue. But I also think that we need to look at so many other social factors beyond that.

Lizzie O’Leary: Jeff Kosseff, thank you so much for talking with me.

Speaker 3: Thanks so much.

Lizzie O’Leary: Jeff Kosseff is an associate professor of cybersecurity law in the U.S. Naval Academy’s Cyber Science Department. He also wrote the book The 26 Words That Created the Internet. You should check it out. And that is it for our show today. What Next: TBD is produced by Evan Campbell. Our show was edited by Tori Bosch. Diane Levine is the executive producer for What Next. Alicia Montgomery is vice president of audio for Slate. TBD is part of the larger What Next family, and we’re also part of Future Tense, a partnership of Slate, Arizona State University, and New America. And if you like us, we have a request for you: become a Slate Plus member. Just head on over to slate.com/whatnextplus to sign up.

Lizzie O’Leary: And by the way, we are following the saga of Elon Musk’s attempt to maybe, who knows, buy Twitter. If you are hungry for Elon content, we have done five, yes, five, episodes that touch on him, Twitter, this whole story. So check them out. I’m Lizzie O’Leary. Thanks for listening.