Recently, the FBI has been attacking the “going dark” problem—that is, its inability to read all electronic communications—from both legal and technological angles. It wants to be able to fine communications companies for refusing to comply with subpoena requests for the content of customers’ emails and chats. It’s also trying to create ways of decrypting any communication sent via a Web service, like Gmail messages, Facebook chats, or Twitter direct messages. It believes it can work with companies to build secure methods for lawfully intercepting communications on the Web.
But a report released last week by the Center for Democracy and Technology and some of the top names in computer security points out that building so-called “back doors” into these Web services would also increase the risk that bad actors gain access to the communications of all users of those services. Creating a back door in software is like creating a lock to which multiple people hold keys: the more people who have a key, the higher the likelihood that one will get lost. Yet this is precisely the power that would be granted by proposed extensions to the Communications Assistance for Law Enforcement Act (CALEA II).
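The lock-and-keys analogy can be made quantitative with a toy probability model (the per-holder leak rate is an assumed figure, not a measured one): if each copy of an escrow key can leak independently with probability p, the chance that at least one copy leaks grows quickly with the number of key holders.

```python
# Toy model: each escrow key copy leaks independently with
# probability p over some period. p = 1% is an assumption
# chosen purely for illustration.
p = 0.01

for n in (1, 2, 5, 10, 50):
    # Probability that at least one of n independent copies leaks.
    at_least_one_leak = 1 - (1 - p) ** n
    print(f"{n:2d} key holders -> {at_least_one_leak:6.2%} chance of a leak")
```

Even at a 1 percent per-holder leak rate, fifty key holders give roughly a 40 percent chance that some copy escapes—the risk compounds with every additional party who can open the lock.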
Even the most trustworthy and reputable organizations can’t sustain the vigilance required to keep software keys safe. Take the National Institute of Standards and Technology, which the government designated as a key holder for the “Clipper chip,” a backdoor system for monitoring telephones. Key holders would release the keys needed to wiretap individual phones to law enforcement agencies that provided proper legal justification. (It’s not clear that NIST ever actually became a key holder, but the designation demonstrates the government’s trust in the organization’s security.) In the 1990s, NIST heavily lobbied for the adoption of key escrow as a method of enabling lawful interception. Recently, NIST fell prey to a malware attack that could have leaked sensitive information. Ironically, the intrusion forced NIST to temporarily restrict access to a computer security vulnerability database that it maintains.
Keys aside, there will be plenty of other opportunities for hackers to poke holes in the security of a system with a back door. Secure communication channels are incredibly difficult to implement in software. Indeed, the security mechanisms that everyday Web users trust to protect email, insurance transactions, banking information, and so on are riddled with exploitable vulnerabilities. Adding extra avenues for law enforcement to access those channels automatically enlarges that attack surface.
The government has already demonstrated that it can’t really be trusted to design secure back doors. The aforementioned “Clipper chip,” which was designed by the NSA in 1993, was a primitive and flawed attempt to create a way to secure telephone conversations—and also allow government eavesdropping. If both callers used a special phone, anyone with a wiretap on either end of the call would hear garbled noise, unless authorities used the lawful-intercept key. Despite engineering by some of the most talented mathematicians and computer scientists in the world, Matt Blaze, then an AT&T engineer, soon found flaws in the design that allowed users to permanently disable the chip’s lawful-intercept capability—so the calls could take place, but the government couldn’t eavesdrop. And since the voice-garbling encryption technology was built into the phone’s electronics, the flaw was nearly impossible to patch. It’s been 20 years since NSA engineers designed the Clipper chip, and in the interim, implementing secure systems has only gotten more difficult, while the threats to those systems have become vastly more pervasive. As Timothy B. Lee notes on the Washington Post’s Wonkblog, “This is more than a hypothetical concern. In 2005, the Greek government discovered that an unknown party was intercepting the phone conversations of Prime Minister Kostas Karamanlis and dozens of other senior officials in the Greek government.”
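The weakness Blaze published turned on the short 16-bit checksum protecting Clipper’s Law Enforcement Access Field (LEAF): a device could simply try random bogus LEAFs until one happened to pass the checksum test—about 2^16 attempts on average—yielding a LEAF that validated but was useless to eavesdroppers. A toy simulation of that brute-force effort (the real checksum function is not the SHA-256 stand-in used here; that substitution is purely illustrative):

```python
import hashlib
import os

CHECKSUM_BITS = 16  # Clipper's LEAF checksum was only 16 bits long


def checksum(leaf):
    # Hypothetical stand-in for the LEAF checksum: truncate a
    # SHA-256 digest to 16 bits. The actual Clipper function
    # differed, but any 16-bit checksum is equally forgeable.
    digest = hashlib.sha256(leaf).digest()
    return int.from_bytes(digest[:2], "big")


def forge_leaf(target):
    # Brute-force random candidate LEAFs until one's checksum
    # collides with the value the peer device expects.
    tries = 0
    while True:
        tries += 1
        candidate = os.urandom(16)
        if checksum(candidate) == target:
            return candidate, tries


target = checksum(b"genuine LEAF for this call")
bogus, tries = forge_leaf(target)
assert checksum(bogus) == target
print(f"forged a passing LEAF after {tries} tries")
```

With only 2^16 = 65,536 possible checksum values, the expected number of tries is tiny by cryptographic standards—a search any 1990s phone could finish quickly, which is why the flaw gutted the escrow guarantee.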
In the end, the risks that come with intentionally creating back doors are too high to balance out an unknown increase in public safety that the practice would provide. The engineering expertise to design these kinds of locks simply does not exist, and it will be impossible to ensure the keys won’t fall into the wrong hands. Policymakers shouldn’t waste any more time even talking about the proposal. The FBI has subpoenas, warrants, and National Security Letters to get email; pen registers, traditional wiretaps, and cellphone eavesdropping tools to get telephone data; and all of the powers set forth in the original CALEA to monitor most Internet communications. As the CDT report puts it, “The FBI’s desire to expand CALEA mandates amounts to developing for our adversaries capabilities that they may not have the competence, access, or resources to develop on their own.”