The Government’s iPhone Demands Undermine Security for All of Us

FBI Director James Comey testifies while flanked by Deputy Attorney General Sally Quillian Yates during a Senate Judiciary Committee hearing on July 8, 2015.


On Tuesday evening a magistrate judge in Riverside, California, issued an unprecedented order, under a law that dates back to 1789, commanding that Apple help the FBI access the encrypted contents of an iPhone that belonged to one of the perpetrators of the San Bernardino shootings. Apple is refusing to do so, as it should, because the order represents an incredible overreach of government power. Some are reporting that Apple has agreed to unlock some iPhones in the past, though we don't know in what circumstances. Whatever Apple's past agreements with law enforcement, it has decided to fight this order, as well as a similar one in New York.

But this story isn't just about the one iPhone, or even all iPhones. If the government prevails, this precedent threatens to undermine the security and trustworthiness of the software running on all of our digital devices. Losing that trust would spell disaster for a digital economy that relies on it heavily.

In short, the FBI is demanding that Apple do two things: first, create a version of iOS that removes the limit on incorrect passcode guesses (normally, too many failed attempts can trigger the device to wipe itself); and second, sign that new version of iOS so that the iPhone will accept it as a legitimate update. iPhones, and many other devices we use on a day-to-day basis, have security mechanisms that allow them to run only authorized updates. By forcing Apple to sign this new code, the FBI would effectively be creating a backdoor for its own use. And if a court can force Apple to do that, there's no reason it couldn't force any software vendor to code and ship government malware as part of its standard update process.
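The trust check at the heart of this dispute can be sketched in a few lines. This is purely an illustration, not Apple's actual mechanism: real iOS code signing uses asymmetric cryptography (Apple signs with a private key, and devices verify with a built-in public key), whereas this toy sketch uses an HMAC with a shared key to show the same basic idea of "install only what the vendor has signed." All names and keys below are hypothetical.

```python
import hashlib
import hmac

# Stand-in for the vendor's signing key. In a real system this would be an
# asymmetric private key held only by the vendor, never by the device.
VENDOR_KEY = b"hypothetical-vendor-signing-key"

def sign_update(update_blob: bytes, key: bytes = VENDOR_KEY) -> bytes:
    """Vendor side: produce a signature over the update image."""
    return hmac.new(key, update_blob, hashlib.sha256).digest()

def device_accepts(update_blob: bytes, signature: bytes,
                   key: bytes = VENDOR_KEY) -> bool:
    """Device side: agree to install only updates whose signature verifies."""
    expected = hmac.new(key, update_blob, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)

# A genuine, vendor-signed update verifies and installs.
legit = b"os update: security patch"
sig = sign_update(legit)
assert device_accepts(legit, sig)

# Modified code without a valid signature is rejected.
tampered = b"os update: disable passcode retry limit"
assert not device_accepts(tampered, sig)
```

The point of the sketch is that the signature, not the update's contents, is what the device trusts. That is why the order demands Apple's signature and not just the modified code: a signed backdoor would sail through this check looking exactly like a genuine security patch.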

By attempting to subvert the secure process that allows and encourages us to accept critical security updates from our software vendors, the government proposes to take direct aim at the trust that currently underpins digital security and the digital economy. Our relationships with our digital devices are carefully balanced acts of trust. We store our digital lives on them: our photos, emails, calendars, and address books. If this precedent is set, the trustworthiness of everyone's computers and mobile phones will suffer. That trust is what keeps businesses like Apple, Google, and Microsoft in business, both here in the United States and abroad. And that trust is still struggling to recover from the damage done by government surveillance programs revealed by Edward Snowden almost three years ago.

Even if that trust were not crucial to business interests, it is fundamental to the cybersecurity of the Internet as a whole. Automatic update mechanisms such as the one the FBI seeks to subvert are vital to the security of the network. When vulnerabilities are found in software, patches have to be distributed as quickly as possible. Getting average users to apply security updates is already an uphill climb, and knowing that such an update could covertly undermine their personal device security could lead even more people to decline updates as a rule. If enough people choose not to update their devices, we begin to lose the collective security that is necessary to defend the Internet from the attacks of organized crime and repressive governments. Unsecured devices can then be hacked and added to botnets: networks of subverted machines that are commanded to send spam, attack websites, and hack even more devices.

That is why, according to leaked internal memos, experts at the White House previously considered and rejected exactly this idea: because of the potentially devastating effect of users losing trust in the updates and vulnerability patches sent by software and hardware makers.

Apple is absolutely right to fight this order. There must be a line in the sand that says the government cannot compel companies to subvert their own software for the purposes of a government investigation. Can the government conscript programmers at Apple to write spyware that will be used against their own customers? Can they be forced to sign it and portray it to those users as the genuine article? Those are the questions at issue. The government certainly should not be able to do this based on a novel reading of a centuries-old statute, without some sign that Congress has weighed and affirmed such a step. A full-throated public conversation about the government's ability to compel malware needs to happen before the FBI tries to tell private companies how to design their products.