Future Tense

What if Facial Recognition Technology Were in Everyone’s Hands?

[Photo illustration: a woman walking down the street, overlaid with a facial recognition readout listing her name, age, occupation, criminal history, and interests. Photo illustration by Slate. Photos by Getty Images Plus.]

We know we are not anonymous online. Our every move in the digital sphere is tracked, collected, analyzed. It’s all fascinating to our spies, who know our identity at every step. They can pinpoint us by the way we write our emails, use the mouse on our computer screens—even how we hold and swipe our cellphones.

Soon we may not be anonymous in public either. Increasingly ubiquitous facial recognition technology, used not just by law enforcement but by companies and even individuals, will enable us to identify one another as we walk down the street or mingle in the crowd.

This is the world that Clearview AI, among others, will make possible. Clearview has famously and controversially assembled a vast database, scraped from the web, that lets it identify millions of people—perhaps including you.

Law enforcement is enthusiastic about the capabilities afforded by Clearview. The technology can help track possible offenders from the mere snippet of a photograph. In one notable case, police sought to determine the identity of a man from a picture “in a Syrian user’s account” documenting child sex abuse. When they ran the image through Clearview’s database, they found a curious match—an Instagram photo featuring bodybuilders at an expo in Las Vegas. None of the people in the foreground matched the suspect; lurking far in the background, however, “at the edge of the photo’s frame,” they found the man, and Clearview supplied his identity. In this picture, the man’s image was tiny—“half as big as your fingernail.” That’s all it took for Clearview to pinpoint the suspect across its database of 3 billion images.

Unsurprisingly, there are several legal challenges against Clearview. The company is subject to complaints from European privacy and digital rights groups. Authorities in the U.K. and Australia are exploring taking action against the company. And Canada’s privacy commissioners have already determined that “Clearview’s face scraping is ‘illegal’ and creates a system that ‘inflicts broad-based harm on all members of society, who find themselves continually in a police lineup’ ”—presumably because the database runs through every person’s face in a given search.

The U.S. has been notably slower to challenge Clearview. The exception is Illinois, where the ACLU is suing the company for violating that state’s biometric privacy act. There is reason to doubt that these challenges will get very far on these shores, given the power of tech lobbies, which have thus far blocked federal privacy regulation, and of police unions, which demand the help of such technology, especially amid the current upsurge in crime. As the saying goes, it is hard to put the genie back in the bottle. If so, the rest of us may have little choice but to contend with the possibility that Clearview could significantly alter daily life in strange and surprising ways.

Armed with a stranger’s identity, supplied by Clearview, you could do a search and uncover intimate, alarming information about them on the spot, especially if they have been forthcoming on social media (as many of us are) and Facebook lists their affiliations, proclivities, and tastes. You could draw conclusions about their finances from their alma mater, ZIP code, and profession. You could find their bankruptcy records, learn whether they are divorced or have been arrested. Until recently, search results for a childhood friend of mine, now CEO of a local company, featured his adoption records.

This technology raises a host of important questions: What is the value of anonymity? How might I benefit from being able to walk around in public without anyone knowing who I am? If or when people can learn about me at first glance—my education, my job, my wealth status, whether I am adopted—how will it influence how or if they will approach me? I start to think of all the conversations that will not be had.

Consider how Clearview might change the singles scene. You could stand at the threshold of the bar and immediately appraise the faces before you. This could be a welcome aid, helping you separate the wheat from the chaff and decide whom to spend your precious time and energy on. Who has the right job and degree? Who is Catholic, Jewish, Muslim? Democrat or Republican? Who makes a lot of money and lives in the right ZIP code?

Retailers will likely appreciate Clearview’s service too. Store attendants could use it to size you up as you walk through the door. What do your affiliations suggest about the kind of consumer you are? Are you worth their attention and effort? You might be quickly ignored when the Harvard-trained lawyer saunters in behind you.

On a more serious note, parents could use Clearview to determine if there are sex offenders in their midst: at the park, in the grocery store, on the street. It’s worth noting that the sex offender list is quite flawed and misleading. It lumps together child abusers with people who have committed much lesser crimes and are quite harmless. Parents will not care about that distinction, which the list does not spell out, and will steer clear of anyone on it.

Some, even many, will object to the privacy violations of this technology. I wager such hesitation will be brief and we will overcome our squeamishness about invasive media once again. If you’re over a certain age, when you first got a smartphone you might have been squeamish about constant GPS tracking, which revealed your location at any moment. Most of us quickly forgot our concerns when Waze guided us through the nearest shortcut, or Yelp identified the best local restaurant and ushered us directly to its door.

Similarly, Clearview’s promise to let you peel back people’s facades with little effort is just too tantalizing, particularly if you don’t think about how your own facade can be peeled back as well.

Clearview will spare me face-to-face, in-depth research where I must engage in careful conversation with strangers to figure them out and pose sensitive, probing questions. I can cut to the chase and decide instantaneously if the person before me is worth my time—or if it’s someone I might, or should, avoid.

Of course, this technology will also allow us to pass judgment on people more easily, and possibly to arrive at judgments that are unfair. Empathy and civility demand that we be open, that we refrain from leaping to conclusions about people, that we give them time to show their true colors, despite appearances. We must give people the benefit of the doubt, because they deserve at least that modicum of respect. Wouldn’t we hope for the same?

Clearview could exacerbate class consciousness. Our CVs, public affiliations, and accomplishments will precede us—quite literally—as we walk down the street. In general, we will be measured according to superficial markers.

I understand we are already subject to this kind of judgment, to some degree. When I enter a party, people make assumptions about me on the basis of visible clues: my hairstyle, my clothes, my weight, my skin color; how I hold myself and walk or gesture; whether I betray confidence, hesitation, or insecurity. We know all this is conjecture, however. Clearview, by contrast, lends an air of authority and certitude to our judgments. It relieves us of guesswork, after all, and provides data, which, by its nature, seems definitive.

Imagine the freedom that will be lost. There is freedom in anonymity, after all. There is freedom in not being known or recognized. There is freedom in being ignored, even, and allowed to do what I want—within reason, of course. It is liberating when I step off the train in New York City and mix with the crowd. People care not a whit about me, where I go, or what I do. The possibilities seem endless.

More important, Clearview’s technology threatens personal autonomy. It deprives me of the ability to define or introduce myself to others on my own terms. It denies me the opportunity to tell my story in my own order, leading with the titles and accomplishments I think are important.

Charm, wit, humor, grace: these will all be endangered in Clearview’s world if I am denied the chance to greet others organically, timidly, or step by step, or to reach out in subtle and surprising ways. And of course, Clearview would relieve us of the possibility of encountering and negotiating otherness. I could seek out only my own kind, as on social media but now in the public realm, and gravitate toward them. I could avoid anyone unlike me entirely.

How you are advertised or branded publicly—that will be out of your control. Clearview will make it harder to escape legacies or affiliations you are less than proud of and would rather shed. What if you are trying to overcome an ignominious past? What if you are trying to straighten out your life and rehabilitate, or simply move on, improve—transform? Will Clearview reveal as much?

More to the point, will people care? If they spy a red flag on your record, will it overwhelm or cloud their judgment? In the name of convenience, people will look for excuses not to engage, and simply move on.

Philosophers have long argued that the more we know about others, the more empathetic we are. Superficial data will not do. Intimate, sensitive, comprehensive information is in order. In the absence of technology, the ancient Stoics recommended we use our imagination: If someone offends you, fill in a likely backstory—imagine the circumstances shaping their behavior. What is going on in the life of this person that makes him so rude, angry, or unhappy? Listen patiently and take the time to find out what is driving him. Research, reflect, and pay attention. Learn about him, engage, open up to him, and he will open up in turn.

Clearview does not offer the requisite insight for compassion. Paradoxically, it offers too little information and too much all at once. It provides just enough for you to judge people quickly, and it provides too little to properly, fairly—charitably—assess them. The technology does not help us see people’s deepest motivations, their earnest intent, their trials and tribulations.

Can technology ever do so? It seems at odds with the practice of empathy. Our digital economy moves at light speed and prioritizes convenience at every turn, often needlessly. Empathy, by contrast, demands time, energy, personal investment, and attention. It requires that we not take shortcuts in understanding people, that we engage in hard work that is sometimes tedious and uncomfortable, but always rewarding. The insistent, pervasive demand for convenience, which Clearview caters to, is a poor foundation for a moral society and a culture of compassion.

Future Tense is a partnership of Slate, New America, and Arizona State University that examines emerging technologies, public policy, and society.
