On Friday, U.S. Attorney General William Barr, joined by his Australian and British counterparts, sent a letter to Facebook CEO Mark Zuckerberg urging him to abandon plans to deploy strong encryption on all the Facebook-owned messaging services. “Security enhancements to the virtual world should not make us more vulnerable in the physical world,” the letter reads. “Companies should not deliberately design their systems to preclude any form of access to content, even for preventing or investigating the most serious crimes.”
Facebook intends to extend strong encryption to all of its messaging services, sealing off Facebook Messenger and Instagram messages from prying eyes (and computers) within the next few years. “Law enforcement, obviously, is not going to be psyched about that,” Zuckerberg said in a closed-door meeting at Facebook in July. “But we think it’s the right thing to protect people’s privacy more, so we’ll go defend that when the time is right.”
Human rights and security advocates have been saying for years that efforts to limit or weaken new forms of encrypted communication are futile because there have long been techniques to hide communication that lie beyond the reach of regulation. Bad people will find a way to hurt people, regardless of what Facebook and Apple do with their encrypted systems. And as more states once seemingly committed to human rights and the rule of law lurch toward authoritarianism and deploy intrusive surveillance, the value of encrypted communication grows.
If claims like this seem familiar, they should be. We have been having this argument over encryption for more than 30 years, and it’s no deeper or more nuanced than it was in 1995. The last time this debate flared up was in 2016, when the FBI managed to crack the encryption protecting the iPhone owned by the terrorists who slaughtered 14 people in San Bernardino, California.
Law enforcement has tried every few years to head off another proposed service or device that would make encryption more useful or more available. Security and human rights experts have responded that limiting encryption or allowing “backdoors” through which law enforcement or intelligence agencies could sneak into otherwise secure systems would endanger lives and risk harming vulnerable people.
There is no single or simple answer to this dilemma. Strong encryption is dangerous. Weak or no encryption is dangerous. Backdoors are dangerous. They are all dangerous to different groups of people living under different conditions.
Pick your losers. Pick whom you care to protect. The kinds of people you value most will indicate whether you support the spread of strong encryption or not.
Barr’s concerns are not unfounded. Zuckerberg’s position is sincere and justified. But Barr won’t concede that he is more interested in protecting children from predators than religious minorities from authoritarian states. And Zuckerberg won’t admit that his motives minimize the concerns of abused children and maximize the interests of adults who wish to hide their activities from hostile governments. After decades of sincere and informed debate, we are no closer to figuring out the best way to protect data and communication from bad actors (including belligerent state actors) without shielding worse actors.
Encryption scrambles text so that humans can’t read it. A computer algorithm must have the right code or “key” to unscramble it, ensuring that the intended recipient can read the message and no one else.
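The scramble-and-key idea can be sketched with a toy one-time-pad-style XOR cipher. This is an illustration only, not how production systems such as WhatsApp’s end-to-end encryption work; real messaging services use vetted ciphers and key-exchange protocols, and the function and variable names here are invented for the example:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the message with the matching key byte.
    # Because XOR is its own inverse, the same operation both
    # scrambles and unscrambles the text.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # random key, one byte per message byte

ciphertext = xor_cipher(message, key)    # unreadable without the key
plaintext = xor_cipher(ciphertext, key)  # only the right key recovers the text
assert plaintext == message
```

Anyone holding the ciphertext but not the key sees only random-looking bytes; the intended recipient, who holds the key, recovers the message exactly.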
Strong, “end-to-end” encryption is the key attraction for using WhatsApp, the Facebook-owned messaging service with more than 1.5 billion users around the world. It keeps oppressive governments that wish to crush dissent from monitoring what people say to each other. It also—just as importantly—keeps responsible governments from tracking terrorists, criminals, violent extremists, and spies who use such services, thus undermining efforts to prevent attacks and save lives.
Beyond that, and perhaps more importantly, strong encryption limits the ability of services like Facebook and of law enforcement agencies to prevent, catch, and punish those who would sexually abuse children and post video and images of the abuse.
Just this week, the New York Times published a disturbing and revealing investigation into the proliferation of the worst kinds of video and images of adults abusing children. It demonstrated that predators use both the encrypted WhatsApp service and the unencrypted Facebook Messenger service to distribute the illicit content. Facebook has been able to account for and report these crimes because Messenger is currently unencrypted. It will not be for long. Child abusers are surely in eager anticipation of that move.
Those with skills and motivation—including criminal syndicates, terrorist organizations, child abusers, political dissidents, and human rights activists—have been able to use encrypted email and text services as well as virtual private networks and other location-masking services like Tor for decades. There is nothing that governments can do to undo that power. Encryption is just math. And math can’t be caged.
Faced with encryption, law enforcement agencies must infiltrate suspect networks or convince suspects to turn over information. That’s probably how Robert Mueller’s investigation of the Trump campaign’s ties to Russia got access to some of Roger Stone’s WhatsApp messages with WikiLeaks.
The difference in 2019 is that the largest communication service in the history of our species, one that serves almost 2.5 billion people, will soon encourage its users to move their communication habits to encrypted channels. More people than ever will enjoy the protection of strong encryption for the first time.
As with all things Facebook, the global scale makes this different. No one can predict the large-scale results of the global normalization of encrypted communication.
We do know it will be harder for Facebook to patrol its service to filter out noxious content—something the company purports to be committed to doing better in the future. We can predict that child abusers and terrorists will have at least a slightly easier time recruiting new collaborators.
This won’t matter to most Facebook users. They will send the same messages to the same people they would if the service were not encrypted. But at the margins, those who would be tempted to engage in illicit or dangerous activities might be emboldened by the ease and ubiquity of a global, encrypted network. Right now, the uninitiated might stumble into the view of law enforcement before they realize that they could have locked down their Facebook Messenger messages.
Beyond serving the cause of human rights and free speech, Zuckerberg has two more motivations to install encryption in his services. If Facebook users do less on the regular Facebook News Feed and do more in private groups and via encrypted messages, then Facebook can’t be held responsible for failing to keep its system free of calls for violence, harassment, or hate speech. Also, Zuckerberg looks around the world and into the near future and sees but one serious competitor: China-based WeChat, which is slowly finding users beyond the People’s Republic. WeChat will never seal itself off from the prying eyes of China’s security services. So it will never be a safe service for dissidents or activists or people who just want to practice religion freely.
Facebook offers a benefit to many who need and deserve such a benefit. It also invites the worst among us to do even more harm. Once again, Facebook is poised to behave just like Facebook always has. Its officials only imagine the benefits of their services. If things go terribly wrong, the company will struggle to cope or correct. And at a massive global scale, problems don’t stay small.
Some people are going to suffer from this move. Other people will benefit. Which people do you care more about? Vulnerable children or political dissidents?
Let’s not pretend this future will be pain-free. Let’s not pretend we won’t have to choose the winners and choose the victims.