For years, U.S. law enforcement has tried, and failed, to convince Congress to require tech companies to provide backdoor access to encrypted data and communications. But it might not need such legislation after all: It may end up with a back door to a back door, by way of the land Down Under.
The Australian government is working to enact legislation to make it easier for law enforcement and intelligence agencies to access electronic communications information, including data protected by encryption, by forcing companies to weaken the security of their products. And the United States appears to be pushing for, and ready to exploit, that new law—a practice known as policy laundering.
The U.S. Justice Department and FBI have been complaining for years that they are “going dark”—that is, losing the ability to obtain criminals’ communications. Meanwhile, tech companies, security researchers, and privacy advocates have continued to explain that providing “exceptional access” to government threatens the individual rights and cybersecurity of all users. There is simply no way to guarantee that back doors created for the U.S. (or any) government will be secure against exploitation by others.
U.S. officials have recently argued that it is possible to develop so-called responsible encryption to permit such law enforcement access, but they still say that legislation is likely necessary because “[t]echnology companies almost certainly will not develop responsible encryption if left to their own [no pun intended?] devices.” But responsible encryption is a myth. Moreover, as the public learned in May, the U.S. government had been dramatically overstating the number of locked phones that it could not access due to encryption. Following this revelation, although the FBI continued to call for encryption legislation, the U.S. government’s crusade seemed to die down.
It now appears that the United States’ efforts were largely pushed underground—or rather, Down Under. The United States brought its campaign for encryption back doors to the Five Eyes—an intelligence-sharing alliance composed of Australia, Canada, New Zealand, the United Kingdom, and the United States that dates back to World War II. Limited information is publicly available about how intelligence sharing works among the Five Eyes. But we do know that alliance members have also begun collaborating on law enforcement policy. Since 2013, these nations have participated in a “Five Country Ministerial,” which has evolved into an annual convening on strategy and information sharing regarding law enforcement and national security issues. The 2017 Joint Communiqué outlines plans to collaborate and share information on issues including global migration and refugee systems, cybersecurity, and encryption.
In August, the Five Country Ministerial 2018 doubled down on the issue of encryption. It released a Statement of Principles on Access to Evidence and Encryption in which the governments noted that if they continue to “encounter impediments” in their efforts to access encrypted communications, they may pursue legislative mandates for encryption back doors.
That same month, the government of Australia released an “exposure draft” of its Assistance and Access Bill 2018, which would grant the government new authorities to access electronic communications information, including encrypted data. Many provisions of the bill were modeled on the United Kingdom’s Investigatory Powers Act, which also raises serious threats to privacy and creates new tools that the government may try to use to mandate back doors, though those powers have not yet been tested. Australia’s bill includes some language that sounds good: It says that it would prohibit the government from requiring communications providers to create—or prevent them from repairing—“a systemic weakness or systemic vulnerability.” That sounds like it’s explicitly not calling for a back door. But this promising language is undermined by the bill’s authorization of powerful new tools. Through “technical assistance notices” and “technical capability notices,” the government could demand that communications providers take certain actions, including modifying their products. This means the government could require tech companies to install software or otherwise weaken product security to enable the government to access users’ data.
The Australian government opened a public comment period on the proposed bill, and by the deadline of Sept. 10, it had received more than 14,000 submissions, including comments filed by New America’s Open Technology Institute, where I work. (New America is a partner with Slate and Arizona State University in Future Tense.) Our comments, joined by an international coalition of 21 civil society organizations and 10 tech companies and trade associations, outline key human rights and cybersecurity risks posed by the bill. Despite the overwhelming number of comments, only 10 days later, the Australian government rushed to introduce a somewhat modified version of the bill in Parliament. The Parliament opened a new public comment procedure (which ends Oct. 12), but the revisions are quite minor. For example, the updated bill added language authorizing Australian courts to hear challenges to government data demands, but it still fails to provide a process for bringing such challenges, or a clear and meaningful standard that courts would follow in assessing any such cases.
Even with the revisions, the bill would grant broad authorities to the Australian government that threaten human rights and cybersecurity. For example, the bill appears to permit the type of demand made by the FBI when it sought to compel Apple to unlock an iPhone used by the 2015 San Bernardino, California, shooter. Although the FBI argued it sought only to unlock one phone, as Apple explained, building the requested software tool would have made the technique widely available, thereby threatening cybersecurity for other users. To make matters worse, it would be far more difficult to bring legal challenges to these new authorities if granted to Australia than it would be to contest similar powers in jurisdictions like the United States and the U.K., because Australia does not have a bill of rights.
Once enacted into law in Australia, these powerful new tools could help provide the United States with a back door to an encryption back door. The U.S. government cannot ask the Australian government to collect and hand over data that the United States is legally prohibited from collecting on its own, but some data may be shareable under the secret terms of the Five Eyes alliance. Beyond that, if Australia gains the tools to force providers to undermine the security of their products, the United States and other governments could exploit those same tools. For example, if Australia could compel Apple to build a new operating system to circumvent iPhone security features—as the FBI demanded in the San Bernardino shooter case—then, once the system was built, Apple could no longer argue that it lacked the capacity to turn over data to the U.S. government in similar cases. Similarly, if Australia forced Facebook to re-engineer WhatsApp’s encrypted chats to be accessible in response to Australian legal demands, those chats would also be vulnerable to other governments’ demands. There is also a risk that the U.S. government could seek to expand its own authority directly, by pointing to Australia as the new model for “responsible encryption” legislation.
Whether as a pathway or as a model, the Australian bill creates risks to cybersecurity and privacy that extend well beyond that nation’s borders.