Should Police Bodycams Come With Facial Recognition Software?

A technology embraced to protect citizens could have major civil liberties implications.

Photo: A police officer with a body-worn video camera in North Charleston, South Carolina. (Ryan Johnson/North Charleston/Flickr)

This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. On Wednesday, Nov. 30, Future Tense will host an event in Washington, D.C., on the future of law enforcement technology. For more information and to RSVP, visit the New America website.

Imagine you’re at a large protest, with thousands of demonstrators gathered. Police stand nearby, tasked with protecting those speaking out and maintaining safety. At each officer’s chest, a red light shows a body camera is recording, ensuring the officers do not engage in improper conduct. But what if that red light also meant a program was scanning, recording, and cataloging the face of every person in the crowd? Would you feel safe?

It would be a sad irony if police body cameras, brought into communities to check police power, became tools that improperly expanded it. But as they are rapidly being deployed in cities across the country, often without clear policies designed to protect privacy, we may be failing to fully consider the risks of pervasive surveillance these devices pose. And in addition to existing concerns, a huge new issue is rapidly approaching: body cameras that use facial recognition technology.

Such cameras are certainly on the horizon. This summer Taser International, by far the nation’s biggest producer of police body cameras, announced plans to incorporate facial recognition technology into future models. And so far, not a single city places adequate limits on its use. Even if these enhanced cameras don’t lead to a world where “every cop will be RoboCop,” as Taser vice president Steve Tuttle once suggested, we need to talk about the real ways in which facial recognition could be used—and misused—and what limits should be put on it.

Facial recognition works by creating and scanning for “face prints,” unique biometric identifiers derived from distinct facial features, such as the distance between the centers of an individual’s eyes. Computer programs can scan photos or video to develop new face prints and can scan footage (including real-time footage) to identify individuals who match face prints in an existing database. Software on the market already claims to be capable of comparing millions of faces per second.
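At its core, a face print is just a list of numbers, and matching is a search for the closest enrolled print within some distance tolerance. The sketch below illustrates that idea only; the function names, the toy three-number “prints,” and the 0.6 tolerance are illustrative assumptions, not any vendor’s actual algorithm (real systems use high-dimensional embeddings and carefully tuned thresholds).

```python
# Toy sketch of face-print matching: a "face print" here is a short
# list of numbers; real systems use high-dimensional embeddings.
import math

def face_distance(print_a, print_b):
    """Euclidean distance between two face-print feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(print_a, print_b)))

def find_match(unknown, database, tolerance=0.6):
    """Return the name of the closest enrolled print within tolerance,
    or None if no enrolled print is close enough."""
    best_name, best_dist = None, tolerance
    for name, enrolled in database.items():
        d = face_distance(unknown, enrolled)
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name

# Example: a tiny "enrolled" database of two hypothetical prints.
enrolled = {"alice": [0.1, 0.2, 0.3], "bob": [0.9, 0.8, 0.7]}
print(find_match([0.12, 0.21, 0.29], enrolled))  # close to "alice"
print(find_match([5.0, 5.0, 5.0], enrolled))     # no match: None
```

The tolerance parameter is exactly where the misidentification risk discussed below lives: set it too loose and innocent people match; set it too tight and real matches are missed.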

Law enforcement agencies have already begun using facial recognition to a significant degree. According to a recent report from the Georgetown Law Center on Privacy and Technology, 1 in 2 American adults are already in photo databases—including not just police records but also civilian databases like those belonging to state Departments of Motor Vehicles—that law enforcement uses for facial recognition. Furthermore, the report says, at least 1 in 4 of all state and local police departments have the ability to run face recognition searches through their or another agency’s systems. Body cameras with built-in facial recognition capabilities could dramatically expand the ability to develop profiles and run scans, giving law enforcement unprecedented powers.

Consider that in cities like Chicago and Washington, D.C.—both of which already employ police body cameras—there are an average of 50 police officers per square mile. Such a tool could make the real-time lookout for dozens, perhaps even hundreds, of people relatively effortless.

So how might police departments use these tools?

The least controversial use of facial recognition would be to identify individuals in relation to emergencies—setting police cameras to scan the city for the face of a missing child or a suspect in an ongoing kidnapping during an Amber Alert, for example, or for an active shooter. It’s hard to imagine persuasive objections to this specific use of the technology, since responding to an imminent threat is a commonly accepted exception to Fourth Amendment rules that generally require warrants for certain police action.

Law enforcement agencies might also use this biometric data to try to identify other fugitives at large in nonemergency situations—sending out face prints of individuals who have outstanding arrest warrants to body cameras, which could then scan police footage for chance matches. Such a system could take “Most Wanted” posters into the 21st century and help catch dangerous fugitives much more efficiently. The requirement of an active warrant would also give judicial oversight to the technology’s deployment.

However, even here there are significant concerns. Facial recognition technology is known to misidentify individuals and is more prone to do so for minorities, as a U.S. Government Accountability Office study recently revealed. For police officers operating in real time, especially those authorized to use force, the cost of misidentification could be catastrophic. Without measures to reduce the risk of misidentification, the body cameras hailed as tools for reducing police abuse may cause it through different means.

And what kinds of accused offenders could this technology trawl for? It’s one thing to set body cameras to identify wanted violent offenders who come into view; it’s another to scan the streets for those accused of petty offenses. Many cities have sizable numbers of active arrest warrants for minor crimes. (For example, a Department of Justice investigation revealed that the municipal court in Ferguson, Missouri—population 21,000—had outstanding arrest warrants for 16,000 people in 2014, the vast majority for minor offenses like unpaid fines for traffic violations or overgrown grass. A judge has since withdrawn thousands.) Facial recognition software could give police officers unprecedented abilities to exercise “arrest at will” authority over a large proportion of the population. It’s not hard to imagine how, without curbs on its use, police could abuse this power against protesters, minorities, or anyone else an individual officer might hold a bias against.

Beyond the ability to target anyone with an outstanding warrant, this technology could also offer a powerful new means of location tracking and monitoring of the entire public. Law enforcement agencies commonly use location tracking for investigations—following an individual in public, attaching GPS tracking devices to cars, or obtaining cellphone locations from telecommunications companies, for example. But each of these methods requires intensive resources or, in some cases, a warrant.

Enhanced with facial recognition technology, police-worn body cameras could turn beat cops themselves into a citywide mass automated tracking tool. And without judicial oversight, location tracking would not necessarily be limited to suspected wrongdoers. Law enforcement could use the technology to identify, monitor, and intimidate individuals engaged in entirely lawful activities, absent any judicial review or prior arrests.

Most disturbingly, this could occur not only by tracking an individual, but also by targeting sensitive locations or events. Imagine if police could review body camera footage and use facial recognition to catalog every participant in a protest or every person who walked into a local mosque. Facial recognition and body cameras could quickly create a much more powerful, digital version of J. Edgar Hoover’s secret “enemies” lists.

This isn’t exactly a reach—consider that, in recent years, law enforcement agencies have questionably used targeted surveillance operations to monitor Black Lives Matter demonstrators and Muslim communities. And it’s not hard to imagine how this technological capability could be disastrous for civil liberties. Even the threat of such activity may have a devastating chilling effect, causing individuals to avoid public demonstrations or houses of worship out of fear that their faces could be recorded, scanned, identified, and cataloged. In the face of these risks, it’s critical that we require judicial authorization for developing face prints before combining facial recognition technology with body cameras.

This dystopian future, of course, isn’t inevitable. But in order to prevent it, we need to put smart checks in place before we give law enforcement such tools. From stingrays to spy planes, we are seeing the consequences of powerful surveillance technology creeping into local law enforcement without adequate limits. Police body cameras hold promise to help provide oversight and build much-needed community trust. But let’s make sure they, especially with the enhanced capabilities of facial recognition software, don’t just become another potential means for police abuse.