In recent months, law enforcement, led by FBI Director James Comey, has waged war against the “going dark” problem—criminals using secure communications technologies, particularly encryption, to evade justice. Its solution to this problem is to encourage or require technology companies to build in back doors to allow the government to circumvent, say, encryption on your iPhone. But in reality, we are currently in a golden age of surveillance. The “going dark” argument should not be used as a reason to support back doors or other special access by law enforcement to encrypted communications.
Last Wednesday I had the privilege of testifying before the Senate Judiciary Committee on the balance between public safety and encryption. I have been researching and writing on encryption for two decades, including serving on President Obama’s Review Group on Intelligence and Communications Technology. My testimony stressed three arguments.
First, I agree that there are indeed specific ways that law enforcement and national security agencies lose previous capabilities due to changing encryption technology. These specific losses, however, are more than offset by massive gains, including: (1) location information; (2) information about contacts and confederates; and (3) an array of new databases that create digital dossiers about individuals’ lives.
The adoption in the past 20 years of text messaging, an area highlighted by law enforcement as an example of “going dark,” specifically shows enormous gains to law enforcement. Although relatively few text messages were sent 20 years ago, by 2010 the number exceeded 6 trillion texts per year. For the predominant share of those messages, the content is available from the provider. Even for the subset where the content is encrypted, law enforcement can gain access to the metadata.
Being able to access texts and other metadata is enormously helpful in mapping the social graphs of suspects. Before we all communicated online, most of our social interactions (except our phone calls) left no records, and the content of communications left no trace unless law enforcement happened to have an active wiretap on a phone call. Today, however, metadata leaves traces of every electronic communication a suspect has, showing whom they speak to, how often, how long, and from where. Identifying these other confederates gives law enforcement the opportunity to use a number of other tools to access encrypted content, ranging from confidential informants, to surveillance on the co-conspirators, to offering immunity to one participant to gain access to the content of communications with the others.
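The kind of metadata aggregation described above can be sketched in a few lines. Everything in this example is hypothetical, invented purely to illustrate the point: even without reading a single message, records of who contacted whom, how often, for how long, and from where are enough to surface a suspect's closest confederates.

```python
from collections import defaultdict

# Hypothetical metadata records: (sender, recipient, duration_seconds, location).
# All names, numbers, and fields are invented for illustration only.
records = [
    ("alice", "bob",   120, "midtown"),
    ("alice", "carol",  30, "midtown"),
    ("bob",   "alice",  45, "airport"),
    ("alice", "bob",   300, "harbor"),
]

def contact_graph(records):
    """Aggregate metadata into a weighted social graph: who talks to whom,
    how often, for how long, and from which locations."""
    graph = defaultdict(lambda: {"count": 0, "total_duration": 0, "locations": set()})
    for src, dst, duration, location in records:
        # Treat contact as undirected: alice->bob and bob->alice are one relationship.
        edge = graph[frozenset((src, dst))]
        edge["count"] += 1
        edge["total_duration"] += duration
        edge["locations"].add(location)
    return graph

g = contact_graph(records)
# The alice-bob edge aggregates three contacts totalling 465 seconds across
# three locations, flagging bob as alice's most frequent contact -- all
# derived from metadata alone, with no message content examined.
```

The design choice of keying edges by an unordered pair is what turns raw call records into a social graph: frequency and duration on an edge measure the strength of a tie, and the location set adds a movement trail on top.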
Law enforcement has expressed particular concern about encrypted text messaging services, such as WhatsApp. For text messages, it might be tempting to say that law enforcement could call the glass half empty (some texts are encrypted) or half full (some texts are in the clear). With more than 6 trillion messages filling the cup, though, it takes chutzpah to say the glass is empty. Text messages are a prime example of a golden age of surveillance, and not of going dark.
Second, government-mandated vulnerabilities would threaten severe harm to cybersecurity, privacy, human rights, and U.S. technological leadership while not preventing effective encryption by adversaries. As occurred in the 1990s, a diverse coalition of cybersecurity experts, technology companies, privacy experts, human rights activists, and others has expressed vociferous and united opposition to government-mandated encryption vulnerabilities. These concerns include:
- Technology companies, even before Edward Snowden, had multiple reasons to deploy strong encryption to enhance cybersecurity and customer trust. The ongoing development of encryption should thus not be seen primarily as a short-term response to Snowden’s revelations.
- Overwhelming technical problems and costs result from mandates to create vulnerabilities in encryption.
- U.S. government support for encryption vulnerabilities increases cybersecurity problems in the “least trusted countries” and globally, and undermines U.S. human rights policies. The United States should be a strong example for cybersecurity and human rights, rather than an excuse used by repressive regimes to surveil U.S.-based businesses and individuals and clamp down on political dissent.
- Mandated vulnerabilities are bad industrial policy—they threaten U.S. technological leadership without preventing bad actors from using strong encryption.
An impressive new technical study by a group of experts was released on July 6 just before the hearing, titled “Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications.” The new study highlights three general problems. Providing mandated access “would force a U-turn from the best practices now being deployed to make the Internet more secure.” Furthermore, building in exceptional access would substantially increase system complexity, “making security testing difficult and less effective.” Finally, exceptional access would create concentrated targets for bad actors: “Recent attacks on the United States Government Office of Personnel Management show how much harm can arise when many organizations rely on a single institution that itself has security vulnerabilities.”
One might perhaps wonder whether the technical experts are stretching a point by making such definitive statements. Based on my two decades of work on these issues, I can report that the technical experts say the same things in private as they write in blue-ribbon reports. The passion that the most eminent technical experts show here reflects conviction born of hard-fought experience, not a lobbying ploy.
Third, the Review Group on Intelligence and Communications Technology report, released in December 2013, unanimously and clearly recommended that the U.S. government vigorously encourage the use of strong encryption, stating:
We recommend that, regarding encryption, the US Government should:
(1) fully support and not undermine efforts to create encryption standards;
(2) not in any way subvert, undermine, weaken, or make vulnerable generally available commercial software; and
(3) increase the use of encryption and urge US companies to do so, in order to better protect data in transit, at rest, in the cloud, and in other storage.
With full awareness of the “going dark” concerns, we sharply criticized any attempt to introduce vulnerabilities into commercially available products and services, and found that even temporary vulnerabilities should be authorized only after administration-wide scrutiny. Based on the top-secret briefings and our experience, we found these policies would best fight cybercrime, improve cybersecurity, build trust in the global communications infrastructure, and promote national security.
At heart, providing access exceptions for U.S. law enforcement and intelligence agencies will be harmful, rather than helpful, to national security. The inability to directly access the content of a small fraction of these communications does not justify the damage that would result to privacy and to U.S. economic, diplomatic, and security interests.
Special thanks to Justin Hemmings for assistance with this project.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, visit the Future Tense blog and the Future Tense home page. You can also follow us on Twitter.