The government is once again using a terror scare to pursue new powers for routine law enforcement—and in the process making all Apple users less safe from criminals. Worse, it’s not clear the phone in question is going to be all that valuable.
On Tuesday night, a magistrate judge in Los Angeles ordered Apple to disable security features on an iPhone seized in the investigation of the Dec. 2, 2015, San Bernardino, California, attack. The judge, Sheri Pym, did so under the All Writs Act, a grab-bag 1789 law that permits courts to issue orders “necessary or appropriate” in aid of their jurisdiction. In this case, the government claims Apple must disable its security features so the government can execute an already approved warrant.
This moment when Apple would be ordered to hack a user’s phone has been coming for some time—but the timing suggests the government is using last year’s San Bernardino attack opportunistically to expand its authority to spy. Even as the administration has presented the appearance of a debate, the FBI has been working to obtain a secret precedent to force Apple to create a back door.
Since September 2014, some in the administration—led by FBI Director James Comey—have been complaining that encryption makes law enforcement harder. Last summer, the White House reviewed whether to ask for legislation requiring tech companies to provide a back door for law enforcement but decided not to. Yet on the very same day that Comey was publicly assuring Congress the administration wasn’t asking for a back door, his agency was secretly asking a judge—in a drug case in Brooklyn—to force Apple to help break into its customers’ phones. In that case, Judge James Orenstein suggested (though never ruled) that it was inappropriate to use the All Writs Act to do something Congress had considered but decided against.
On Tuesday, two months after seizing a phone the San Bernardino attacker, Syed Rizwan Farook, used for work, the FBI submitted another request for an All Writs Act order for a back door into it. At one level, the request—to shut down the security features that limit the number of incorrect passcode attempts and impose delays between them, so the government could “brute force” its way in by trying a range of possible passcodes—seemed quite modest. “Apple has the ability to modify software that is created to only function within the SUBJECT DEVICE,” the government said, presenting this as a limited request. It even offered to let Apple keep the phone while the government tested a series of passcodes on it remotely. But ultimately, the government was asking Apple to write a custom version of its operating system to access data that might—or might not—be on the phone.
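Those two features are what make guessing passcodes slow in the first place. As a rough illustration: with the retry limit and delays removed, the remaining cost per guess is the hardware-bound key derivation, which Apple’s security documentation has put at roughly 80 milliseconds. A back-of-the-envelope sketch (the 80-millisecond figure and the function here are illustrative assumptions, not numbers from the government’s filing):

```python
# Rough estimate of worst-case brute-force time once the retry limit
# and escalating delays are disabled. The ~80 ms floor per guess is
# an assumption drawn from Apple's published description of its
# hardware-bound key derivation, not from the court filing.

SECONDS_PER_GUESS = 0.08  # assumed hardware-enforced minimum

def worst_case_hours(passcode_digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    combinations = 10 ** passcode_digits
    return combinations * SECONDS_PER_GUESS / 3600

print(f"4-digit: {worst_case_hours(4):.2f} hours")  # ~0.22 hours
print(f"6-digit: {worst_case_hours(6):.1f} hours")  # ~22.2 hours
```

In other words, on these assumptions a four-digit passcode falls in minutes without the software protections—which is why the order targets those features rather than the encryption itself.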
This isn’t just troubling from a civil liberties perspective—it’s also downright puzzling when you consider the FBI’s own actions regarding the phone. As the request noted, the government first got a warrant for and seized this phone—from a Lexus sitting in Farook’s garage—on Dec. 3, just hours after the attack. In the weeks since, Apple provided the government all the content backed up from the phone before Oct. 19, 2015, as well as metadata showing that Farook and his wife and partner in the attacks, Tashfeen Malik, communicated using the device between July and November. (In an affidavit, FBI Agent Christopher Pluhar suggested, but did not actually claim, that they communicated using a program—probably iMessage—that could have been but was not backed up to iCloud.) On Jan. 29, the government had to renew its warrant to access the content of the phone. Yet Pluhar’s affidavit notes that throughout that whole time, “FBI has been unable to make attempts to determine the passcode.” Contrary to Comey’s description, “We’re still working on it,” the phone apparently just sat there. Had the phone been that urgent, you would have thought the FBI would have asked for an All Writs Act order in December. Compare that with the urgency with which the FBI held and questioned Farook’s friend, Enrique Marquez, for 10 days after the attack without charging him (which would have entitled him to a lawyer).
Remember, Farook and Malik had at least two other phones, ones they attempted to destroy. Despite those efforts, the FBI has, judging from Comey’s public statements, been able to access the contents of those phones. The attackers also succeeded in hiding a hard drive, the contents of which are likely far more interesting than this phone’s, but which the FBI appears, rightly, to assume no one—including the manufacturer—can recreate. But the pair left this phone in a Lexus in their garage, untouched. While Pluhar suspects Farook turned off the auto-backup function sometime after Oct. 19, 2015, the killer certainly didn’t treat this work phone with the attention he gave his other phones or that hard drive.
Pluhar notes that “Farook was in communication [using the phone] with victims who were later killed during the shootings,” which is unsurprising since Farook targeted his co-workers and this was his work phone. But if he was engaged in critical conversations with his co-workers, the FBI should have been able to recover them from the default auto-backup the county used. The FBI didn’t address why it couldn’t access these communications from iCloud. Plus, public reporting has already revealed that Farook had heated disagreements with one of the men he murdered—so it’s unlikely that this work phone will shed substantial new light on his office life.
The phone also is unlikely to reveal direction from anyone overseas. National Security Agency Director Mike Rogers told Yahoo last week that, based on metadata obtained from Farook and Malik’s communications, investigators “didn’t find any direct overseas connections.”
So this appears to be a situation where the FBI has already answered the terrorist question: whether the couple took instructions from a foreign terrorist organization. The FBI has repeatedly said they did not. It’s possible that there is information they are holding back—but given the debate currently raging, if that’s the case, they should disclose it. The FBI long ago answered the law enforcement question: Who committed 14 murders? Surely it would be better if the FBI had this information—it could be helpful in some ways. But the FBI already has the most important information about the attack.
Obtaining this information in this fashion comes with a cost: Perhaps most troublingly, because the FBI wants to load the back door via an update, the approach risks degrading trust in automatic updates, a key way individuals protect themselves from hacks. Apple CEO Tim Cook suggested in an open letter to customers that writing the customized operating system would give someone—potentially the federal government—the knowledge to replicate it for other users: “Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.”
There’s one more reason to look skeptically on this request: At a Senate Intelligence Committee global threats hearing earlier this month, committee Chairman Richard Burr invited the FBI director to renew his call for back doors. That is when Comey revealed that the FBI had still been unable to access a phone from the San Bernardino attack. But he also admitted that the issue is most pressing for law enforcement officials, not counterterrorism investigators. Encryption “is actually overwhelmingly affecting law enforcement,” Comey said, claiming encryption was disrupting investigations into kidnapping, murder, and even car accidents. Burr reiterated that impression, saying, “I’ve had more district attorneys come to me” asking for back doors “than I have the [intelligence community heads] at this table.”
Chances are good that Congress and presidential candidates would not applaud the FBI’s efforts to weaken Apple phones so loudly if this were about solving car accidents. But because it involves terrorism—or, more likely, the details of a terrorist attack that stem from conflicts between colleagues—there’s a big rush to make Apple’s programmers dismantle the security features they’ve built, writing a new operating system that can then be used for those car accidents.
Ultimately, Magistrate Judge Orenstein probably has it right. We would do well to force Congress to debate where the balance between device security and law enforcement access should lie before we start dismantling important protections—in part, to solve car accidents.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.