The ACLU released a report on Thursday revealing that Rekognition, Amazon’s facial recognition tool, had falsely matched 28 members of Congress to mug shots. Members of the ACLU purchased the version of Rekognition that Amazon offers to the general public and ran public photos of every member of the House and Senate against a database of 25,000 arrest photos. The entire experiment cost $12.33, which, as ACLU attorney Jake Snow writes in a blog post, is “less than a large pizza.”
Almost 40 percent of the members of Congress whom Rekognition falsely matched were people of color, even though they make up only 20 percent of Congress. Six members of the Congressional Black Caucus were among the false positives, including Georgia Rep. John Lewis and Illinois Rep. Bobby Rush. “These results demonstrate why Congress should join the ACLU in calling for a moratorium on law enforcement use of face surveillance,” Snow wrote in the post. Facial recognition systems are notorious for misidentifying women and people of color. While some companies that produce the software have responded to these concerns with public bias testing, Amazon has not disclosed any data on the matter.
One of the misidentified congressmen, New Jersey Rep. Frank LoBiondo, told Slate, “It wasn’t a very flattering mug shot,” and declined to comment further. Illinois Rep. Luis Gutiérrez, who was also falsely matched, issued a lengthier statement:
Are you sure Amazon isn’t just talking to Donald Trump, Tucker Carlson and Breitbart? Because they think all Latinos are criminals.
But on a more serious note:
If this technology has not been proven effective and has a systematic bias against people of color, then it will hurt, not help law enforcement and anyone else who uses it. I have to say, I am not surprised. Disappointed, yes, but not surprised.
Another misidentified congressman, Arizona Rep. Raúl Grijalva, also gave Slate a statement:
“If the facial recognition software cannot even correctly identify 28 public figures, how can we expect it to accurately identify millions of Americans? I have serious privacy and public safety concerns that this tool is already marketed to police departments across the country without public input and detailed analysis of its negative consequences. We need more information on this technology before we empower law enforcement with a tool that can be used to terrorize immigrants, facilitate negative police interactions, and erode important privacy protections.”
Police in Orlando, Florida, ended a pilot program using Rekognition last month. The department tested the software to determine whether it could identify officers in a video stream from eight surveillance cameras. Police in Maryland recently used a different facial recognition tool to identify the shooter who attacked the Capital Gazette newsroom in June. A sheriff’s office in Oregon has also been using similar software to identify suspects for around a year.
Activists, members of Congress, and even Amazon employees have raised privacy and discrimination concerns with the use of such software. In May, more than 40 civil rights groups sent a letter to CEO Jeff Bezos asking him to stop offering Rekognition to governments, arguing, “People should be free to walk down the street without being watched by the government.”
At around the same time, the Congressional Black Caucus sent a letter to Amazon urging caution in developing the software, which noted, “We are troubled by the profound negative unintended consequences this form of artificial intelligence could have for African Americans, undocumented immigrants, and protesters.” Then, last month, more than 100 Amazon employees signed a letter calling on Bezos to stop selling Rekognition to law enforcement, which read in part, “This will be another powerful tool for the surveillance state and ultimately serve to harm the most marginalized.”
Amazon issued a statement to Business Insider quibbling with the confidence thresholds the ACLU used in its test:
“We have seen customers use the image and video analysis capabilities of Amazon Rekognition in ways that materially benefit both society (e.g. preventing human trafficking, inhibiting child exploitation, reuniting missing children with their families, and building educational apps for children), and organizations (enhancing security through multi-factor authentication, finding images more easily, or preventing package theft). We remain excited about how image and video analysis can be a driver for good in the world, including in the public sector and law enforcement. With regard to this recent test of Amazon Rekognition by the ACLU, we think that the results could probably be improved by following best practices around setting the confidence thresholds (this is the percentage likelihood that Rekognition found a match) used in the test. While 80% confidence is an acceptable threshold for photos of hot dogs, chairs, animals, or other social media use cases, it wouldn’t be appropriate for identifying individuals with a reasonable level of certainty. When using facial recognition for law enforcement activities, we guide customers to set a threshold of at least 95% or higher.”
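Amazon’s point about thresholds is easier to see with numbers. The sketch below uses hypothetical similarity scores (not real Rekognition output) to show how the same set of candidate matches shrinks when the confidence threshold is raised from the 80 percent default the ACLU used to the 95 percent Amazon recommends for law enforcement:

```python
# Hypothetical similarity scores a face-matching system might return
# when comparing one probe photo against a mug-shot database.
# The subject IDs and scores here are illustrative only.
candidate_matches = [
    {"subject_id": "A-1041", "similarity": 82.5},
    {"subject_id": "B-2093", "similarity": 88.1},
    {"subject_id": "C-3307", "similarity": 96.4},
]

def matches_above_threshold(candidates, threshold):
    """Keep only candidates whose similarity meets or exceeds the threshold."""
    return [c for c in candidates if c["similarity"] >= threshold]

# At an 80 percent threshold, all three candidates count as "matches".
print(len(matches_above_threshold(candidate_matches, 80)))  # 3

# At 95 percent, only the strongest candidate survives.
print(len(matches_above_threshold(candidate_matches, 95)))  # 1
```

In the actual Rekognition API, this cutoff corresponds to a threshold parameter on the face-search call (e.g., `FaceMatchThreshold` in `SearchFacesByImage`), which defaults to 80 unless the caller raises it.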