The hammer of Thor, or Mjölnir, is an ancient Norse symbol that has been appropriated by white supremacists since at least World War II. Hans Schweitzer, an artist who produced posters for the Nazi Party, even signed his posters Mjölnir. Despite this, the symbol is also used by nonracist Norse pagans and by Marvel fans who aren't in the know: Scarlett Johansson, who plays the spy-assassin Black Widow in The Avengers, sports a Mjölnir tattoo on her wrist, styled as a bracelet with an "I [heart] NY" charm.
Johansson's innocent tat gets at the heart of a tangled battle over using technology to fight crime and track hate groups. For instance, GARI, an app out of Purdue University that's being used by dozens of law enforcement agencies in Indiana, purports to match and interpret graffiti and tattoos to track gang movement and growth. And in early June, the Electronic Frontier Foundation drew attention to tattoo-tracking research conducted by the National Institute of Standards and Technology (NIST) in conjunction with the FBI. This sort of work may seem like a good idea to some, but there are problems, not least that it could sweep up innocuous uses of symbols that have multiple meanings.
The idea was to create a data set of tattoos taken from arrestees and prisoners—the 15,000 images were provided by the FBI—and to distribute that data set to outside parties. Then research institutions, biometric companies, universities, and others could create and test algorithms to identify tattoos and match common images to try to draw connections between people or similar images from other mediums. This Tattoo Recognition Technology Challenge, or Tatt-C, ran from Sept. 23, 2014, to May 4, 2015.
There were a host of research-related problems—like the fact that the prisoners and arrestees whose tattoos were included in the massive database were likely not able to give consent. And while technology to establish connections between tattoos and images in other mediums (such as graffiti) is pretty nascent, it's somewhat alarming that the project's top-performing algorithm was able to match visually similar images, or related tattoos using non-tattoo imagery, with a mean average precision of only 15.1 percent. Matching tattoos to other individuals with similar tattoos fared even worse, with a mean average precision of only 5.2 percent.
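For readers unfamiliar with the metric, mean average precision rewards a system for ranking the correct matches near the top of its results for each query, then averages that score over all queries. A minimal sketch of the calculation—using hypothetical query data, not drawn from the actual Tatt-C results—might look like this:

```python
def average_precision(ranked, relevant):
    """Average of the precision values at each rank where a
    truly relevant item appears in the ranked result list."""
    hits = 0
    precisions = []
    for rank, item in enumerate(ranked, start=1):
        if item in relevant:
            hits += 1
            precisions.append(hits / rank)
    return sum(precisions) / len(relevant) if relevant else 0.0

def mean_average_precision(queries):
    """Mean of average precision across all queries."""
    return sum(average_precision(r, rel) for r, rel in queries) / len(queries)

# Two hypothetical tattoo queries: (ranked results, set of true matches).
queries = [
    (["a", "b", "c", "d"], {"a", "c"}),  # correct matches at ranks 1 and 3
    (["x", "y", "z"], {"z"}),            # single correct match at rank 3
]
print(mean_average_precision(queries))  # roughly 0.583
```

A score of 15.1 percent, on this scale, means the correct matches tended to sit far down the ranked lists—useful, perhaps, as a research baseline, but far from reliable identification.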
But beyond the ethical considerations and questions of accuracy lies a bigger problem: The tattoo-matching program raises significant concerns about freedom of expression, freedom of association, and religious liberty. Furthermore, these algorithms could generate spurious linkages and errors, flagging people with tattoos of culturally or spiritually significant symbols as members of hate groups.
By their nature, symbols and images often mean different things to different people. Oren Segal, director of the Anti-Defamation League’s Center on Extremism, points out that even symbols used by hate groups typically have dual meanings—like with Scarlett Johansson’s tattoo. “The potential for confusing what somebody’s body art or tattoo means happens all the time,” said Segal. That’s why the ADL’s hate symbols database, Hate on Display, includes a disclaimer asking readers to evaluate symbols in the context in which they appear.
Even white supremacist tattoos are often appropriations of symbols belonging to other cultures or artists. For example, the crossed hammer tattoo used by the racist skinhead group the Hammerskins has also been used by Pink Floyd fans, as the image was appropriated from the movie The Wall. The Chicago street gang Gangster Disciples uses six-pointed stars similar to the Jewish star as a symbol, which Segal said has led to confusion after acts of vandalism. Even swastikas are considered a sacred symbol and hold spiritual significance for Native Americans, Buddhists, and Hindus. Then there are people who, Segal points out, may have been part of racist or hate groups when they were younger but have since changed their lives. They may still have tattoos associated with groups they've long since moved on from.
Segal is quick to note that there is still value in understanding the meaning of symbols since tattoos often provide clues into someone’s motivation (especially when investigating a crime). That said, misidentification could have a negative impact on people’s lives, whether that’s placing someone in a risk group category based on an out-of-context symbol, or algorithmic profiling errors inaccurately associating people with groups they’re not affiliated with. Algorithmic profiling based on body art also raises significant free speech concerns for members of unpopular political groups, for example, who bear tattoos reflective of their beliefs but aren’t necessarily involved in criminal activity.
And then there are religious concerns. Coptic Christians, for example, are a widely persecuted group, and some may bear religious tattoos. Algorithmic tattoo identification software that falls into the wrong hands could be used for nefarious purposes, such as imprisoning, torturing, or even murdering people for their religion or political beliefs. “I think that one of the bigger pictures here is, at what point are scientists going to say, ‘Hey, we should not be engaging in creating the tools that can be used to oppress society’ ?” said Dave Maass, an investigative researcher with the Electronic Frontier Foundation.
This problem, of course, already exists on a smaller scale, since various law enforcement agencies already maintain tattoo databases. But algorithms make the process much faster and expand it to a much larger scale. Telling government organizations not to collect massive amounts of data is likely to prove fruitless. "The natural instinct is always to try and collect data and see what happens," said Suresh Venkatasubramanian, an associate professor at the University of Utah's School of Computing. The trouble is that "the outcome of collecting data is mission creep of the data, where it is being used beyond the original purpose for which it was collected."
For example, ProPublica reported in May that predictive software originally created to determine who should be put in drug treatment or mental health counseling programs is now being used to try to predict criminal reoffenders. Furthermore, ProPublica revealed that a risk assessment scoring tool originally intended to help reduce crime is now being used in sentencing—and the software used to predict future criminals is racially biased against blacks.
“I think people often collect data in good faith. … But once the database exists, it’s very easy to say, ‘Well, can we look at this related thing, because we have the data anyway?’ and then, ‘Can we look at this other related thing, because we have the data anyway?’ ” said Venkatasubramanian. “And I think what invariably happens is that people start using the data for things beyond what it probably was meant to be used for.”
He also points out that people who get tattoos typically don’t check FBI databases to make sure the tattoos aren’t related to different gangs in their area. “We need a much higher degree of caution with the way we use these algorithms,” he said. And though algorithmic methods can process data faster than humans, “you lose that deliberative aspect that comes with looking at it.” Humans, of course, have biases of their own, but the lack of clarity in the way that algorithms come to conclusions can be problematic.
NIST has scrubbed one line from its webpage about the project. It previously read, “Tattoos provide valuable information on an individual’s affiliations or beliefs and can support identity verification of an individual.” A representative said the sentence was removed to improve communications about the project’s purpose, though a slide NIST used in a presentation explicitly stated that tattoos “suggest affiliation to gangs, sub-cultures, religious or ritualistic beliefs, or political ideology” and “contain intelligence, messages, meaning and motivation.” (An EFF blog post stated that NIST scrubbed the “religious” line from the slide following criticism.) Unfortunately, retroactive website and slideshow changes do not minimize the threat to civil liberties that the use of automated computer algorithms might pose.
This article is part of Future Tense, a collaboration among Arizona State University, New America, and Slate. Future Tense explores the ways emerging technologies affect society, policy, and culture. To read more, follow us on Twitter and sign up for our weekly newsletter.