On Tuesday, the New York Times and the Washington Post published explosive allegations claiming that anti-virus software made by Kaspersky Lab, which is headquartered in Russia, had been programmed to target U.S. intelligence assets throughout the world. If these allegations are true, they imply that consumer security technology has been weaponized and turned into spyware for national governments. The implications will likely go far beyond Kaspersky, which has denied knowingly allowing the Russian government to use its software for this purpose.
With the Kaspersky scandal undermining faith in popular software, now might seem like a good time for our government to reassure consumers that this sort of thing won’t happen to U.S. firms. Instead, on the same day the Kaspersky news broke, Deputy Attorney General Rod Rosenstein took a different approach. In a speech at the U.S. Naval Academy, Rosenstein exhorted American tech companies to deploy what he called “responsible encryption.” What Rosenstein means is that he thinks Google, Facebook, and Apple should modify their software so that they can always hand over their customers’ data to the U.S. government on demand.
The fact that the Justice Department has concerns about encryption shouldn’t come as a surprise. Over the past several years, firms like Apple have been moving to encrypt much of the data we store and transmit on our phones. If you use an iPhone, or an application such as WhatsApp, chances are that most of your data is protected using end-to-end encryption. This approach has major security benefits: It prevents anyone but you (or your communication partner) from reading your messages, which means your data is also protected from anyone who hacks into your provider. But while this encryption prevents criminals from stealing your data, it also locks out law enforcement and national security agencies.
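The core idea is simple to sketch. In an end-to-end design, only the two people communicating hold the keys; the provider just relays ciphertext it cannot read. The toy Python below illustrates that property with a textbook Diffie-Hellman exchange and a hash-based stream cipher. It is an illustration only, not anything resembling what Apple or WhatsApp actually ship — real systems use vetted protocols such as the Signal protocol, and the parameters and helper names here are invented for the example.

```python
# Toy end-to-end encryption sketch. NOT secure -- for illustration only.
import hashlib
from secrets import randbelow

P = 2**127 - 1   # a Mersenne prime; toy-sized, unsafe for real cryptography
G = 5            # toy generator for the Diffie-Hellman group

def keypair():
    """Generate a private exponent and the matching public value."""
    priv = randbelow(P - 2) + 1
    return priv, pow(G, priv, P)

def shared_key(my_priv, their_pub):
    """Both sides compute G^(ab) mod P and hash it into a 32-byte key."""
    secret = pow(their_pub, my_priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_stream(key, data):
    """Encrypt/decrypt by XORing data with a hash-derived keystream."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Alice and Bob each keep a private key; only public values cross the wire.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

ciphertext = xor_stream(shared_key(a_priv, b_pub), b"meet at noon")
# A provider relaying `ciphertext` sees no private key and cannot decrypt.
plaintext = xor_stream(shared_key(b_priv, a_pub), ciphertext)
assert plaintext == b"meet at noon"
```

The point of the sketch is the last step: the two private keys never leave the endpoints, so anyone in the middle — including the provider, or someone who hacks the provider — holds nothing that decrypts the message. Rosenstein's proposal would change exactly this property by giving the provider a decryption capability of its own.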
U.S. law enforcement agencies have long wanted to do something about this. Rosenstein’s latest proposal, which could someday be enshrined in legislation, would require American firms to replace strong encryption with something different. The replacement would have an impossible mission. It would still be expected to keep out criminals—and even well-funded foreign intelligence agencies—but it would also allow tech firms to decrypt and hand over their customers’ data when required by warrant.
If this request sounds like a problem for technology firms, that’s because it is. In the wake of recent breaches and disclosures of classified NSA documents, the U.S. tech industry has been fighting to hold onto its credibility in competitive international markets. Encryption has been a vital part of that effort. The new Kaspersky allegations will almost certainly raise the stakes. At a minimum, they’ll provide foreign governments with new opportunities to raise barriers against U.S. products. At worst, they will raise real questions about the integrity of U.S. security and cloud service firms.
In Rosenstein’s view, these concerns are misplaced. The U.S. is a nation of laws, he argues, and any access to data will be based on warrants lawfully obtained. But to some extent, U.S. law doesn’t matter. We sell our products throughout the world. If American law enforcement gains access to encryption, then other nations’ security agencies will demand the same capability. If they don’t get it, they could ban our products. In democratic nations, granting this access could be an acceptable trade-off. But inevitably, the same requests will come from authoritarian governments like those of China, Russia, and other states with a very different approach to human rights. With these capabilities mandated by the United States, our firms will have no way to decline.
Even worse, any technology that allows U.S. agencies to lawfully access data will present an irresistible target for hackers and foreign intelligence services. The idea that such data will remain safe is laughable in a world where foreign intelligence services have openly leveraged cyberweapons against corporate and political targets. In his speech, Rosenstein claims that the “master keys” needed to enable his proposal can be kept safe, but his arguments are contradicted by recent history. For example, in 2011 hackers managed to steal the master keys for RSA’s SecurID authentication product—and then used those keys to break into a slew of defense contractors. If we can’t secure the keys that protect top-secret documents, it’s hard to believe we’ll do better for your text messages.
At the end of the day we, as a society, have a decision to make. We can adopt the position that your data must always be accessible—first to the company that made your software and then to its government. This will in some ways make law enforcement’s job easier, but at a great cost to industry and our own cybersecurity. It will make us more vulnerable to organized hackers and could potentially balkanize the tech industry—exposing every U.S. software firm to the same suspicions that currently dog Kaspersky.
Alternatively, we can accept that to protect user data, companies have to let it go—and the single most powerful tool technologists have developed to accomplish this goal is encryption. Encryption can secure your data, and in the long run—properly deployed and verified—it can help our software industry compete across the world. This will not be without costs: It will make (some) crimes harder to solve. But the benefits will be real as well.
Software and service providers are not deploying encryption merely to frustrate the U.S. government. Providers know their business far better than the Justice Department does—when they choose to deploy encryption, it’s because their business depends on it. And while it may frustrate law enforcement, in this case Silicon Valley’s interests and consumers’ interests are aligned.