Police in South Wales have used automatic facial recognition, or AFR, software to identify a suspect and subsequently arrest him, making this the first arrest aided by this technology in the U.K., according to Wales Online.
The police arrested the man, who had a warrant out for his arrest, on May 31 after spotting him via a “Slow Time Static Face Search.” According to the South Wales Police, that search draws on a database of 500,000 mugshots. The man was spotted by a camera in a police surveillance van, and his face was then matched against the database.
The arrest came as U.K. police were preparing to use real-time software to scan faces near the Cardiff central train station and the Millennium Stadium as part of their security plan for a major June 3 soccer game. But a spokesperson for the South Wales Police told Ars Technica that the May 31 arrest was unrelated to the match.
The U.K. is already blanketed in cameras. There are 5.9 million closed-circuit television cameras in the United Kingdom, or about one camera for every 11 people, according to the British Security Industry Association.*
Two years ago, police in Leicestershire began putting all those cameras to broader use, running trials of NEC’s automatic facial recognition software, NeoFace. According to the company’s website, NeoFace “enables faces to be recorded and archived at a distance, act as a crime deterrent, and help identify a person in real-time.”
NEC has also partnered with U.K. police in other instances, as when NeoFace was integrated with FaceWatch, software that lets business owners review CCTV stills and videos, report crimes to the police, and share images of suspects. With NeoFace, stores could be alerted when someone on FaceWatch’s watch list—a database of people who have been part of a previous incident at that location—entered the premises.
However, law enforcement’s growing use of facial recognition has been controversial, particularly in the United States, where police are also starting to use similar software to make arrests.
A study from Georgetown Law’s Center on Privacy & Technology published in October found multiple problems with the use of facial recognition software in police investigations. According to the report, the technology is largely unregulated, yet more than 117 million adult Americans are in some kind of facial recognition network. One in four police agencies in America can run face recognition searches on its own database, run them on another agency’s system, or access a system where it can run a search. The study, titled “The Perpetual Line-Up,” made records requests to more than 100 police agencies and found that none of them needed warrants to use the recognition software to identify someone. Furthermore, the press release for the study says, “Of the 52 agencies that acknowledged using face recognition, only one obtained legislative approval for its use and only one agency provided evidence that it audited officers’ face recognition searches for misuse.”
Aside from being unregulated, facial recognition software appears to have blind spots, particularly when it comes to race. A 2011 study from the University of Texas–Dallas found that algorithms developed in some European countries and the United States were better at identifying Caucasians than other racial groups. Other research suggests that such software has consistently struggled to identify black faces and features.
*Correction, June 6, 2017: This post originally misstated the number of cameras in the U.K. relative to the population. There is one camera for every 11 people, not 11 cameras per person.