What Next

Amazon Encourages Police to Use Untested Facial Recognition Technology

Right now, no one is regulating how cops use facial recognition technology, and it looks like that’s just fine with Amazon.

A woman's face is targeted as she's placed into a police car by law enforcement.
Photo illustration by Slate. Photo by LightFieldStudios/Getty Images Plus.

The sheriff’s office in Hillsboro, Oregon, was the first law enforcement agency in the country to use Amazon’s facial recognition technology. And it happened almost by accident.

Chris Adzima was a techie who worked in the sheriff’s office designing an iPhone app to track inmate behavior. He caught one of Amazon’s annual conferences for web developers, which have the feel of a corporate pep rally crossed with a TED Talk. The big unveiling that year was Amazon’s facial recognition feature.

“They had all of these amazing, sort of creepy ideas, but law enforcement was not one of them,” says Drew Harwell, the Washington Post’s national technology reporter. “But this guy Chris Adzima at Washington County, he thought, That would be great for us.”

Adzima’s idea was to use Amazon’s facial recognition technology to ID people in surveillance footage by comparing it against the mug shots of people arrested over the past decade and a half.

“All it really took him was a couple weeks of work and a couple hundred bucks, and he pushed the sheriff’s office onto Amazon’s server and into this facial recognition destiny it’s living in today,” says Harwell.

After the technology arrived, Harwell says, the police kind of got superpowers. Before this, the sheriff’s office in Hillsboro would fax or email around photos of people they were looking for. But now deputies can access an internal website where they can upload surveillance footage, and the system will return an “answer” in the form of potential mug shot matches.

“A couple of them said it felt like magic,” he says.
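
The plumbing behind that magic is mostly off the shelf. Here is a rough sketch of how a mug shot search like Washington County’s might be wired up against Amazon’s Rekognition API using the boto3 Python client. It is an illustration, not the county’s actual code; the collection name, booking IDs, file paths, and similarity threshold are all invented.

```python
# Illustrative sketch only: the collection name, booking IDs, and threshold
# are invented, and this is not Washington County's actual code.
import boto3

rekognition = boto3.client("rekognition")
COLLECTION_ID = "mugshot-collection"  # hypothetical face collection

# One-time setup: create the collection that will hold the indexed mug shots.
# rekognition.create_collection(CollectionId=COLLECTION_ID)

def index_mugshot(photo_path, booking_id):
    """Add one booking photo to the searchable collection."""
    with open(photo_path, "rb") as f:
        rekognition.index_faces(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            ExternalImageId=booking_id,  # ties the face back to a booking record
        )

def search_surveillance_still(photo_path):
    """Compare one surveillance still against every indexed mug shot."""
    with open(photo_path, "rb") as f:
        response = rekognition.search_faces_by_image(
            CollectionId=COLLECTION_ID,
            Image={"Bytes": f.read()},
            FaceMatchThreshold=80,  # similarity cutoff, in percent
            MaxFaces=5,
        )
    # Each match carries a similarity score and the booking ID indexed above.
    return [(m["Face"]["ExternalImageId"], m["Similarity"])
            for m in response["FaceMatches"]]
```

What makes the pitch so easy, per Harwell’s account, is how little is there: index the booking photos once, and any deputy with a surveillance still can get back a ranked list of candidate matches in a single call.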

Who’s regulating how these cops use these tools?

“No one,” Harwell says. “There’s no federal law governing facial recognition.”

But that all may be changing. This week in Washington, the House Oversight and Reform Committee considered whether cops who use facial recognition are going too far. At the same time, Amazon shareholders met to decide whether to scale all this back. It’s why I contacted Harwell for this story—because it feels like we’re starting to see the rise of a new and important tool, like fingerprinting or DNA analysis, and the fight over regulating it is just getting started.

I spoke to Harwell for What Next, the daily news podcast I host for Slate. You can read this lightly edited version of our conversation, or hear the full discussion using the audio player below. And you can always find What Next via Apple Podcasts, Spotify, TuneIn, Stitcher, Overcast, or Google Play.

Using facial recognition technology to compare surveillance footage to an existing bank of mug shots could help police find suspects, though they would still need other evidence to convict someone. But it could also have unintended consequences.

“Worst-case scenario is that they take that footage, they run it through the system, out pops some potential matches, including one that sort of looks like the suspect, but maybe not, and that person is arrested,” Harwell says. “Maybe the deputies misconstrue other evidence and end up bringing that person in and charging them with a crime they didn’t commit. So, the worst-case scenario is a misidentification, that leads to a false arrest, that leads to a potentially dangerous situation, and potentially it leads to an innocent person going to jail.”

One other danger from this technology? If police are looking at the same mug shots over and over again, they may start targeting all the “usual suspects.”

“Thinking about the data set [Hillsboro uses] here, it’s just people who have been arrested since 2001 and have had their photo taken. So, if this is your first time doing a crime, facial recognition is not going to find you,” says Harwell.

He continues: “On the flip side, if you were arrested but you hadn’t done anything wrong, and were innocent and never charged, your photo is still in the system, and you’re more likely to come up in a search down the road and maybe get brought in a second time, just because you had the bad luck of looking like someone who had been caught on camera doing something wrong.”

It’s still unclear how many other police departments are using Amazon’s facial recognition technology.

“The company has only given us a few examples, [and] only a few sheriff’s offices and police departments have admitted that they’re using it,” Harwell says.

But facial recognition isn’t just an Amazon technology. Harwell points out that other surveillance and government contractors have also built similar systems that law enforcement officials are using.

“Some research by some folks at Georgetown actually found that up to 50 agencies across the country, including the FBI and including the NYPD, had used facial recognition in some way,” he says. “In some cases, it was [used to try] to find a fake ID or somebody who’s fraudulently stealing people’s identities—we can use facial recognition in that way. But in other cases, it’s been cameras in a public square gathering data, looking for wanted fugitives.”

Privacy advocates and groups like the American Civil Liberties Union have been critical of tech developers and companies like Amazon for the deployment of these tools.

“For them it’s all about these larger questions of, we have this technology that is imperfect, can lead to misidentification, can lead deputies to false arrests, has proven inaccurate, and has proven in some cases to be biased based on stuff like skin color,” Harwell says. “They’re worried about these larger social issues and how this technology can exacerbate some of those problems.”

He continues: “But for the tech side, it’s really about what this technology can do. They feel like some of these social quibblings are edge cases—they may happen 1 percent of the time—but if we’re able to use this technology to find missing kids, and find bad guys, and save deputies, maybe it’s worth it.”

In addition to questions of accuracy and transparency, the use of facial recognition technology also sparks questions of privacy.

“Even if it’s perfectly accurate, there’s still the worry that maybe we don’t want our faces scanned when we’re out in public,” Harwell says.

But Amazon is eager to have police departments use this technology, because it can help the artificial intelligence behind the tool learn.

“These systems learn faces by looking at millions of them—that’s how they pick out the differences in the width between eyes and how someone looks when they’re grimacing, and all the different little microexpressions and little tics that we have in our faces,” Harwell says. “That’s how the A.I. learns.”
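
Under the hood, systems like this typically boil each face down to a long vector of numbers, an “embedding,” learned from those millions of example photos; deciding whether two images show the same person then comes down to measuring how close those vectors are. Here is a toy illustration of that comparison step; the numbers are invented, and this is not Amazon’s actual model.

```python
import numpy as np

# Toy face "embeddings": a real model produces hundreds of values per face,
# learned from millions of photos. These numbers are invented for illustration.
probe_face = np.array([0.12, 0.85, 0.33, 0.47])  # face from surveillance footage
mugshot_a  = np.array([0.10, 0.80, 0.35, 0.50])  # similar-looking person
mugshot_b  = np.array([0.90, 0.05, 0.60, 0.11])  # clearly different person

def similarity(a, b):
    """Cosine similarity: values near 1.0 are treated as a likely match."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(similarity(probe_face, mugshot_a))  # high score: reported as a "match"
print(similarity(probe_face, mugshot_b))  # low score: filtered out by a threshold
```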

It appears that Amazon has incentivized police agencies to use its system. Departments often spend large amounts on technological tools, and for just a few dollars a month—less than what an agency will spend on some walkie-talkies—police can have facial recognition technology.

“The idea is that if Amazon can get this into lots of different public agencies and get lots of people pouring in different mug shots or surveillance photos, then the system is going to be that much more refined,” Harwell says. “Amazon poses it like this is a good thing, feeling like, We’re getting more accurate, we’re able to identify anybody, all of these issues of bias are potentially going away because we are feeding in more and more images into the machine. And yet for some people, a more accurate surveillance box is not reassuring to them. For them, the accuracy worries are just one tiny element of these broader concerns of Big Brother.”

Amazon has been particularly aggressive in marketing facial recognition technology to local law enforcement agencies, especially since many may already be using Amazon Web Services, a cloud computing service so widely used across the country that it took in more money than McDonald’s last year.

“Lots of the modern web is built on the Amazon cloud, and so [law enforcement] may just be able to effectively click a button and start using facial recognition today,” says Harwell. “So, that is a really alluring sales pitch that Amazon could make to some of these police forces.”

Even though Google has a large suite of A.I. tools and services, it does not sell facial recognition technology that can be used the way Amazon’s can, and Microsoft has said there needs to be stronger government regulation of this technology.

“Amazon has been a lot more reserved in their criticism of facial recognition—they’ve sought to defend this technology against all comers, and yet the pressure is building,” Harwell says.

While Amazon shareholders rejected a plan to rein in the company’s facial recognition services, both Republicans and Democrats on the House Oversight and Reform Committee condemned the technology and criticized A.I. software as a danger to privacy and civil liberties. Just last week, San Francisco voted to ban facial recognition technology use by law enforcement.

“That ban in San Francisco was a symbolic victory for critics of facial recognition,” Harwell says. “San Francisco police were not using facial recognition, as far as we know. They had tested it for a number of years, but they weren’t actively scanning [people]. You can still get your face recognized or scanned in a business in San Francisco, because the law doesn’t affect anybody but city agencies and city police. You could also potentially get your face scanned at the San Francisco airport, because there are totally different laws governing air travel, and DHS wants 97 percent of people flying out of the country to have their faces scanned within the next four years.”

He continues: “So maybe this technology isn’t imminent, maybe we shouldn’t just all prepare to have our faces scanned on every block that we walk down, because there is a potential legal framework here to rein it in. But also there is a concern that the toothpaste is out of the tube on facial recognition. It is so cheap, it’s so technically easy to turn on, and it’s so unbound by law that it’s going to be easier for people to turn it on than it is for critics to push back and get it turned off.”