This article is adapted from Digital Punishment: Privacy, Stigma, and the Harms of Data-Driven Criminal Justice by Sarah Esther Lageson, published by Oxford University Press.
On a frozen December day in Minneapolis, William walked into a free legal aid seminar to try to fix his criminal record. Lumbering toward a lawyer, his arms full of paperwork, William tried to explain his situation quickly. “I want to show you my record here that I got from my probation officer. Here.” Frustrated, William waved papers in the air.
After an employer and a landlord both denied his applications following private background checks, William started to suspect something was wrong with his criminal record. When he finally got a copy, the data made no sense. One arrest was dated to 1901. Another arrest was linked to an active warrant.
“Now, here’s a thing about it. I got one [conviction] in ’82; that was the last time I was in jail.” William paused to scan the document. “And that was that charge here. All of this,” he said, pointing to the paper, “is not me.” It seemed as if someone with a similar name—and a far more extensive criminal history—had been matched to William’s identity in a state police or court record database. He quickly realized that not only was his record incorrect, but it had spread across databases used by background check companies—and was posted on the internet. It was as if someone had stolen his identity—but instead of using his identity to buy something, they used it to slip stolen goods into his pocket.
The lawyers warned William of the Kafkaesque bureaucracy he would face. He had to fix the mismatched identity with the state police, ask the court to fix the 1901 data error, and close the mistaken (but open) warrant. Because he could not afford a lawyer, William had to rely on free legal aid or deal directly with the courts and state bureaus himself. This wasn’t what he wanted to hear. He had been trying for months to get help. The first time he’d tried to meet with a volunteer attorney, he was given an incorrect address and walked around downtown Minneapolis for hours trying to find the office. All of this confusion and frustration led him to the seminar today. He was about ready to give up.
“It’s too much. It’s too frustrating,” William said. “You know, you ain’t done nothing in 30-something years and then all of a sudden you want to get an apartment and you can’t. You’re just stuck the way you are at. That’s just terrible. It’s a bad feeling. It’s like I’ve been on a standstill.”
The problems William faced are rapidly multiplying across the country, in various forms. Incorrect or misleading records from years past pop up on Google searches. Criminal convictions that accurately appear on one background check don’t appear on another. Sealed, expunged, and juvenile records that are legally hidden from public view continue to live on across databases and websites.
Criminal records and background checks have become a lucrative and central part of American life, ushered in by the creation of more records as our criminal justice system expanded over the past several decades and by growing demands for access to those records. This is because the American public uses criminal records not only to make important decisions about whom we employ or rent to, but also as fodder for entertainment, voyeurism, and public shaming.
Data brokers pay courts for bulk data sets that are repackaged with other sources of public and consumer data and then sold to background check services, market research companies, and even back to law enforcement. Websites post mug shots and charge people staggering fees to have their photos removed. Mobile apps purport to update users about sex offenders and recent arrestees in their neighborhood while simultaneously collecting and monetizing subscriber data. Google search results for a person’s name are accompanied by a litany of titillating background check and reputation management advertisements, and the search engine giant profits from this clickbait.
All of this “data” is riddled with errors and misleading information. Records now begin at the earliest stages of arrest and extend across a person’s entire lifetime, whether or not they are found guilty. As records are downloaded, sold, and shared, they quickly become decontextualized and stale. This proliferation leads to a particular form of anxiety: Criminal record subjects are nearly always uncertain about where their records can be found and what will appear on them, even if charges were dismissed or their record was sealed or expunged by the courts. Often, a person never knows what is on her criminal record because there isn’t one single criminal record to consult. The internet’s version is often wildly different from the state’s version.
The consequences seep into everyday life. People begin to engage in digital avoidance—doing everything within their power to prevent someone else from Googling them, even if this means avoiding positive parts of life, like seeking better employment or housing, setting up an online dating profile, volunteering at their kid’s school, or meeting people in their neighborhood.
Shana, in Florida, was arrested once in her life after a disruption at a nightclub nearly a decade ago. A few years ago, her mug shot appeared on a website demanding hundreds of dollars to have it taken down. She looks terrified in it. “Embarrassment is an understatement,” she said. “You are ashamed of your identity. It creates a self-doubt that permeates nearly every aspect of your life. … I have thoughts and feelings that I cannot ever be who I was. There is a sense of paranoia and fear of who might search your name and see the trail of tabloid sites. These thoughts are a daily thing now. It is beyond horrible.”
Justin, in rural Indiana, was pulled over and booked for reckless driving in 2005 while in his early 20s. He appeared on an online roster of arrestees the next morning, and his arrest is still lodged in Google search results for his name to this day. This low-level record was legally sealed, but he cannot land a job. “It’s been painful,” he says. “You wouldn’t know the kind of guilt and shame I experience when I am overlooked by every employer I apply to because these records continue to exist, or the amount of pain it causes me to feel like I’ve failed my family.”
Albert, in New Jersey, passed the background check for a new apartment. At age 82, he was 12 years past a forgery conviction and in the process of expunging his record. But vestiges of his record stayed on the internet. Days before Albert was due to move, his new landlord called him and told him, “I forgot to tell you it’s my policy to Google everyone’s name, and I see that you have a record here for fraud.” Albert sighed as he recounted the disappointment. “There was no changing his mind.” He lost the apartment.
“I have to sort out four sources of this record,” he continued. “The police, the jail, the court, and the internet. The internet. That’s the biggest problem.”
Though the circumstances vary, these experiences all point to the failures of data and technology companies to effectively modernize criminal justice operations. Through uneven rollouts and competing legal and political mandates, data-driven criminal justice churns out millions of publicly available criminal records each year—a messy spillover far from the original intent of criminal recordkeeping. The data are often outdated, incorrect, and bought and sold in private markets by data companies.
Documenting everything from a police stop to a prison sentence, thousands of different types of records take on a digital life of their own as they are bought and sold and reposted across the internet. The result is “digital punishment,” where mere suspicion or a brush with the law can have lasting consequences.
There’s a strong set of incentives for the criminal justice system to release data to the private sector. Lacking the necessary budget and expertise to maintain digital records themselves, busy and overburdened criminal justice agencies have turned, over the past two decades, to technological solutions offered by IT companies. Newly digital operations produce volumes of data, including the names, photographs, and home addresses of people arrested or charged with a crime, transforming what used to consist of millions of paper records into a valuable commodity.
Regardless of factual or legal guilt, these records rapidly multiply across the private sector background checking and personal data industries. Once the personal data industry controls the information, there is no stopping its spread, leading to the errors in William’s record that cost him a job, or the Google search results that cost Albert his new apartment.
Not only does digital punishment unequally stigmatize people already targeted by the criminal justice system because of their race or neighborhood, but it also creates privacy inequalities. Members of these already sidelined communities are less likely to have the ability to address, remedy, or overcome a criminal record. The ability to curate an online reputation or challenge a government record is inextricably linked (and proportional) to one’s relationship with technology and one’s capacity to argue for the right to privacy in the first place.
Though the errors in his record are not his fault, William is tasked with fixing them. Shana’s arrest photo is a profitable commodity for entertainment and extortion markets. But she must come up with the time and money to track down the source every time her booking photo or criminal record appears on the internet, and then she needs the legal skills to negotiate with, pay off, or sue every company that profited from her arrest data.
In digital criminal justice operations, a person’s disgrace is almost always up for sale and available instantly for public consumption. A hard-earned and nearly spotless reputation can be tarnished indefinitely with just a few clicks of a mouse. The result is that many people now live in terror of their digital reputation.
Perhaps we’ve opened Pandora’s box and our digital biographies have become irreparably cemented to our identities. Artificial intelligence, machine learning, facial recognition, and biometrics are increasingly incorporated into the tracking, surveillance, and recordkeeping practices of the state. There will be voices pushing for transparency in data collection practices and privacy protections for individual data, while corporations that buy and sell criminal records will seek to evade regulation.
Eventually, background checks will probably get better, due to consumer demand and improvements to information technologies. But the reality for the short term is that millions of people in America will spend the rest of their lives digitally marked, their identities warehoused into vast collections of mug shots, jailhouse rosters, and court documents. There is no easy escape from digital punishment—punishment that is perpetual, and not determined by judge or jury.
Americans have long been susceptible to claims about the need to crack down on crime and accept increasingly harsh penalties, leading to the war on drugs and mass incarceration. Data-driven advances in criminal justice operations have expanded an already wide net. The explosion of digital punishment has come without critical discussion of causes and consequences. But policy shifts can slow digital punishment.
There is enormous potential for reform that better addresses the relationship between criminal punishment, individual privacy, and governmental oversight in the digital age. Medical records and credit reports are regulated and protected in the U.S., partly to protect this personal information from falling into the wrong hands or being leveraged against a person. Criminal records could be treated the same way. Background check companies could be held legally accountable for reporting incorrect or outdated data.
The private sector could remedy much of this harm even without legal regulation. Google could import aspects of the European “right to be forgotten” for those whose records have been sealed, expunged, or illegally disclosed. Facebook could stop allowing mug shots to be posted before a criminal conviction, or at least turn off the racist and terrifying commentary that follows.
Digital punishment is not the inevitable outcome of digital life. Technological advances do not determine their own fate; people and organizations use technologies and share data for specific ends. Deciding how to collect, organize, and disclose records is a human-powered process. How we use criminal records, and the power of the private market to distribute this data, is a political choice. The openness and lifelong punishment of a criminal record look very different outside of America, where rehabilitation policies are valued over punishment.
The justice system and information technology systems are both operating at unprecedented levels of activity. Humming along, each system touches more and more lives each day. And there are serious consequences to this confluence. As criminal records drift online, the internet exacts a criminal-like penalty—guilt-by-Google—before, or even without, prosecution or conviction. The widespread public release and sale of criminal justice data is leading to new forms of everlasting punishment.
Our data handling and processing practices need not be at the mercy of tech innovation and invariably encroaching systems of surveillance, though. Our current state of affairs is the result of very human processes. Policies are the result of choices. And we can always choose differently.
From Digital Punishment: Privacy, Stigma, and the Harms of Data-Driven Criminal Justice by Sarah Esther Lageson. Copyright © 2020 by Sarah Esther Lageson and published by Oxford University Press. All rights reserved.
Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.