This article is part of the Policing and Technology Project, a collaboration between Future Tense and the Tech, Law, & Security Program at American University Washington College of Law that examines the relationship between law enforcement, police reform, and technology. On Friday, Nov. 20, at 1 p.m. Eastern, Future Tense will co-host “Technology, Policing, and Earning the Public Trust,” an online event about the role of technology in law enforcement reform. RSVP here.
This summer, when officials in a few cities started using facial recognition software to identify protesters, many cried foul. Those objections turned ironic when protesters used facial recognition to identify police officers who had covered their badges or nameplates during protests. Powerful technology beloved by police had become a tool for accountability: David defeats Goliath.
Possibly satisfying—but profoundly naive. Protesters, civil libertarians, and ordinary Americans have far more to lose than gain from the normalization of facial recognition software. These incidents simply highlight the pressing need for more comprehensive regulation of this increasingly cheap and powerful tool, one that threatens to alter the balance of power between citizens and their government.
Law enforcement likes to talk the most about the least objectionable use of this technology: identifying suspected criminals from security footage. Police in Maryland, for example, identified the man who killed five people at the Capital Gazette newspaper by running his photo against the Maryland Image Repository System, which contains millions of mug shots and driver’s license photos. But law enforcement is already using this technology for mass surveillance and data gathering, not just for one-off identifications where there is reasonable suspicion that an identifiable suspect has committed a crime.
Except for a few localities that have banned its use, the only real limitation on law enforcement’s use of facial recognition technology is fear of public outrage. In China, facial recognition is used to identify jaywalkers, who receive text messages warning them of punishment for second offenses. The hard truth is that there are few current legal obstacles to American police using the technology in much the same way. Congress has resisted calls for federal regulation. Decades of judicial curtailment of privacy protections—usually in the name of the drug war—have reduced the scope of Fourth and Fifth Amendment protections, and Americans out and about on public streets have surprisingly few privacy rights that police must observe. The only real obstacles to a China-like panopticon in the U.S. are funding and policymakers’ hesitance to outrage Americans.
Americans concerned about their privacy should not rely on the scruples of politicians and police, however. The NYPD, for example, recently issued a facial recognition policy that permits the technology’s use in investigating any crime, no matter how minor, including shoplifting. Florida courts recently greenlighted a lawsuit by a Coral Gables resident who alleged that local police use automated license plate readers to identify nearly every vehicle that enters or exits the city, and at many points in between. These readers note the license plate number, date, time, and location of hundreds of thousands of cars every day. The data is stored for three years and shared with other law enforcement agencies in the state—which means that police in one Florida beach town can build, and then share with other departments, an astonishingly detailed history of a person’s movements through town and into other localities. Other Florida towns have pressed to expand their own use of plate reader technology.
Coral Gables didn’t really deny the allegations. Instead, like other police departments with similar programs, it talked out of both sides of its mouth, insisting that the law-abiding have nothing to fear while denying that residents have any privacy interest in their public movements in the first place. But the legal precedents cited by police rely on an unspoken assumption that resource constraints operate as their own kind of protection. Police don’t have the time or money to track and monitor every person moving about public byways and property, so they will necessarily limit their monitoring to those suspected of criminal activity. The average citizen’s anonymity is protected, while public safety is enhanced.
In an age of technologies such as license plate readers, “geofencing” (which uses smartphones’ location data to identify every device present in a given area at a given time), and especially facial recognition, this unspoken assumption has been upended. Surveillance and monitoring data can be collected and stored on a mass basis, giving law enforcement the ability to build astonishingly detailed portraits of people’s lives. When the plaintiff in the Coral Gables case obtained his license plate reader records under a state sunshine law, the report ran for more than 80 pages. In our mass surveillance future, every trip to the doctor’s office, girlfriend’s apartment, library, church, gay bar, pharmacy, or liquor store can be identified, stored, and analyzed by law enforcement—with no need for individualized monitoring or individualized suspicion.
Facial recognition programs do have weaknesses: They are currently far better at identifying white men than women or people of color, making it likely that the technology will misidentify Black men and women as criminal suspects. But improvements have been dramatic—the 127 leading software algorithms got 20 times better at searching photo databases between 2014 and 2018—and near-flawless facial recognition technology is likely within the industry’s grasp. Law enforcement, especially the FBI, has poured billions of dollars into databases of photographs to improve facial recognition efforts, and is expanding efforts to identify people by voice print or even walking gait.
This technology is spreading—getting better, cheaper, and more powerful every year—and legal precedents and policy debates haven’t kept up. This is why protesters should not be so quick to embrace the technology, even to hold police accountable. Nine times out of ten, Goliath beats David. Hoping for the occasional miracle isn’t a policy.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.