On Sunday evening, TikTok was granted another temporary reprieve when a judge blocked the Trump administration from banning it from app stores. But the app is still in a fight for its life as its Chinese owner, ByteDance, faces a deadline of Nov. 12 to either sell or spin off the U.S. arm of TikTok.
The Trump administration’s Aug. 6 executive order banned TikTok and another Chinese app, WeChat, as a supposed national security threat. But as courts review the order, they aren’t paying much attention to the First Amendment speech rights of TikTok users.
That’s a major oversight, because the First Amendment should save TikTok. We just need the courts to agree.
TikTok first sought to fight back against the executive order in federal court in Los Angeles, where it is headquartered. But its arguments centered on Fifth Amendment due process violations—the government was demanding a sale of the company, with some part of the proceeds going to the U.S. Treasury, without giving it a chance to defend itself from the charge that it was a national security threat. A minor argument at the end said the action violated First Amendment rights—but only those of the company, in its computer code. The company dropped the suit last week as it sought to negotiate with U.S. buyers, including Oracle and Microsoft.
Then a similar suit popped up in federal court in San Francisco, brought by a TikTok employee, Patrick Ryan, who worried that cashing his paycheck could be an act of treason under the broad language of the president’s order. (Disclosure: I’m the executive director of the First Amendment Clinic at the Sandra Day O’Connor College of Law at Arizona State University, which wrote a friend-of-the-court brief with the Electronic Frontier Foundation in this case; ASU is a partner with Slate and New America in Future Tense.) Again, the case relied on the Fifth Amendment, with no reference to the users’ rights. But after the government promised not to enforce the order against employees for being paid by the company, the suit was rendered moot.
At least WeChat users saw a little respect from the federal court examining the similar attempt to shut down that service. On Sept. 20, the court was persuaded that because the service is the “primary source of communication and commerce” for its Chinese-American users—it provides news and social media activities in Chinese and allows contact with users in China, where other American social media platforms are restricted—the users had demonstrated serious First Amendment concerns that “are the equivalent of censorship of speech or a prior restraint” on the service.
But then again, that decision was just a preliminary injunction. And the Trump administration is now back in court to convince the judge to overturn the WeChat injunction, promising a secret filing this week to make the case that the service is a national security threat. So the First Amendment interests are still on thin ice.
But TikTok came back to court last week, this time in Washington, D.C., to again argue against the president’s ban. This time, TikTok expanded the First Amendment interests to include not just the company’s code, but the company’s role as a user and speaker on its own service, thus giving it a hook to argue for all users’ First Amendment rights. TikTok argued that the executive order functions as a prior restraint of users’ speech and must be subject to “strict scrutiny”—meaning it is only valid if it is justified by a compelling government interest. As a fallback, TikTok argued that because it affects speech, it must at least be subject to “intermediate scrutiny”—justified by a “substantial” government interest. (The difference between a substantial and a compelling interest is just the sort of question that keeps lawyers employed.)
And while the court granted the preliminary injunction Sunday evening after a rare weekend hearing, it didn’t mention the First Amendment in its decision, instead relying on an exception to the International Emergency Economic Powers Act, which was the basis of the authority for the executive order, for “informational materials” and “personal communications.”
These disputes over TikTok and WeChat come amid a much bigger conversation over the legal rights and obligations of social media companies, even as courts have made clear in recent years that these forums deserve strong legal protections. The U.S. Supreme Court in 2017 struck down a North Carolina law barring registered sex offenders from using the internet and social media platforms in Packingham v. North Carolina. But that decision’s First Amendment findings were firmly rooted in a case from 20 years before, Reno v. American Civil Liberties Union, when “social media” as we know it today did not exist.
Partially quoting Reno, the court stated that, “While in the past there may have been difficulty in identifying the most important places (in a spatial sense) for the exchange of views, today the answer is clear. It is cyberspace—the ‘vast democratic forums of the Internet’ in general, and social media in particular.” The court continued: “In short, social media users employ these websites to engage in a wide array of protected First Amendment activity on topics ‘as diverse as human thought.’”
And TikTok takes things a step further. Posts made to the platform are widely shared and often connected by themes. In fact, it’s the algorithm that chooses what to show a user that is credited with TikTok’s popularity, and it’s the ultimate ownership of that algorithm that is the sticking point in the sale of the company. With posts being viewed by thousands, if not millions, of strangers, the service has, as the New York Times reported, “become an information and organizing hub for Gen Z activists and politically-minded young people.” Another Times article said it “has amplified footage of police brutality as well as scenes and commentary from Black Lives Matter protests around the world, with videos created and shared on the platform frequently moving beyond it.”
Much of the highest-profile political activism on TikTok has focused on President Trump. Most notably, a group of TikTok teens claim to have launched a campaign to inflate the attendance expectations at Trump’s Tulsa, Oklahoma, rally in June. Another TikTok user, Sarah Cooper, has gained notoriety for her satirical posts about the president, where she points out what she sees as the absurdities of some of his statements merely by lip-synching short audio clips of his speeches.
So if the fight over TikTok involves politically controversial and socially active speech, but the legal battle centers on the Fifth Amendment claims and other statutory limits on presidential powers, the way to elevate the First Amendment interests is to emphasize to courts that the freedoms of the Bill of Rights are all tied together.
This means that when the court is considering a “due process” claim, but that claim has a fundamental and drastic effect on First Amendment rights, the speech interests “supercharge” the other constitutional interests and demand the highest standard of scrutiny under the law. This takes us back to the difference between strict and intermediate scrutiny, and the nature of the interest that must be demonstrated by the government. And thus, when the First Amendment is so clearly implicated, courts must always apply the strictest scrutiny, which generally means that the speech-restrictive law will fail this difficult test. So a law that completely shuts down a social media platform should never be tolerated.
The Supreme Court has most clearly recognized this interplay in the context of the Fourth Amendment, in the 1965 case Stanford v. Texas. When a search warrant implicates First Amendment interests, the court held, the requirement that a warrant “particularly describe the ‘things to be seized’ is to be accorded the most scrupulous exactitude when the ‘things’ are books. … No less a standard could be faithful to the First Amendment freedoms.”
These rights also require limited activity by the government, not sweeping decisions to shut down an entire social media platform. The high court held in 1963 that because the “First Amendment freedoms need breathing space to survive, government may regulate in the area only with narrow specificity.” The court “will not presume that the statute curtails constitutionally protected activity as little as possible.”
If there’s a national security threat due to access to users’ information by foreign powers, that access can be regulated consistent with the First and Fifth Amendments—by imposing controls on monitoring or reporting on user data, for instance. But the social media platform cannot be silenced.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.