As technology advances, will we use it to promote equity, or to serve and preserve systems of oppression? This question is central to Meg Elison’s “Safe Surrender,” which explores a future in which humans are in regular contact with extraterrestrials called Pinners, who exchange diplomats, trade goods, and even interbreed with Earthlings. In “Safe Surrender,” a grown-up human-Pinner hybrid (a “hemi”) struggles to find their identity and make sense of their origin: surrendered at birth by a mother who did not want, or perhaps felt she could not care for or protect, a hybrid infant.
In Elison’s not-totally-foreign, not-so-distant future, the racial prejudices, inequities, and oppression that plague humankind today map easily onto extraterrestrials. Hemis are treated by many as “a mistake that shouldn’t exist” and are frequently abandoned as infants. Most humans can’t identify a hemi just by looking at them, but those who can often disfavor them in subtle or unsubtle ways. The protagonist recalls a teacher who described the “many things that make us different” as “great,” yet exhibited clear bias against hemi children. Like Black children today, hemis “got in trouble more easily, and served harsher punishments.”
Just as the teacher scrutinizes hemi students constantly, the government of the future also keeps a close watch via a ubiquitous video surveillance network. Footage from the network is monitored in real time by an A.I. on the lookout for suspicious behavior, a more muscular version of the Domain Awareness System currently operating in New York City. At one point, the protagonist remarks that hemi kids could never learn to be thieving street urchins like characters in old orphan stories because “[e]very eye on the street is programmed to notice that sort of thing.” And footage from those cameras isn’t ephemeral: It’s stored, and searchable, forever.
It might seem like those eyes are watching everyone equally, but like surveillance of the past and present, they’re not. History has taught us that the surveillance apparatuses of incumbent powers tend to focus disproportionately on those with relatively little power, and surveillance in Elison’s future is no different. When the protagonist runs a query of video captured on the day of their abandonment, searching for clues about their parentage, the A.I. highlights only the Pinners in the crowd, quietly assuming that whoever abandoned a hybrid infant must have been an alien. The assumption reflects an Earthling bias.
Elison’s treatment of DNA also speaks to how humans of the future (and of today) will have to make deliberate decisions about whether to use technological advances to feed or to starve oppression. Seeking pieces to the puzzle of their birth, the protagonist requests access to their DNA. The request is partially denied. “Some of your DNA is available to you, under the law,” explains the clerk. But “most of it isn’t specific to you, because it belongs to other individuals. It’s a complex legal issue—if you had total access to your own genes, you might be invading the privacy of people who share your bloodline.” It’s a point that brings to mind the Golden State Killer. After a decadeslong search, law enforcement finally arrested a suspect in April after submitting the killer’s DNA to GEDmatch, a small DNA-analysis company that runs a free genealogy website. GEDmatch linked the submitted DNA to genetic data that some distant relatives had already shared with the service, and investigators tracked the suspect down from there. The suspect, Joseph DeAngelo, had never used the service (and may not have even known the distant relatives who did), but that didn’t stop him from being traceable.
It’s easy to see the good in using familial DNA to do something like catch a serial killer. But what happens when the wrong person is picked up on a partial familial DNA match and, as one journalist warned years ago, “the imperfect technology starts ruining lives”? Worse, what happens when businesses start looking for ways to use customers’ or applicants’ DNA to inform important decisions? Imagine, for example, if lenders could find out a borrower’s chances of developing a devastating chronic illness. Existing laws don’t sufficiently protect against this. The Genetic Information Nondiscrimination Act prohibits some forms of discrimination based on an individual’s DNA profile, but not all. The Americans with Disabilities Act prohibits some forms of discrimination based on an individual’s disability, but not based on their likelihood of developing one.
To prevent this form of technology-supported discrimination and injustice, lawmakers might expand on existing nondiscrimination laws. In Elison’s future, lawmakers have done just that, which seems like progress. And they have gone a step further: They have directly restricted access to DNA, even to one’s own. But hemis like the protagonist have their DNA logged by the government for an “ongoing interbreeding genome project.” That should set off alarm bells. What does the government know about genetic distinctions among humans, hemis, and Pinners, and how might that information be misused to serve or preserve interspecies inequity? Just as the U.S. government used census data to help force Japanese Americans into internment camps during World War II, could the government of Elison’s future be equipping itself with this kind of data in case it needs it later to fuel a massive anti-Pinner or anti-hemi effort? Does having the data make that outcome more likely?
There are, to be sure, positive uses of racial and ethnic genetic data. But Elison’s story reminds us that when we use science and technology to categorize people, we should pause to reflect on what, exactly, is driving us to want to identify the differences, and ask whether the motivation is something sinister. As Elison’s protagonist muses, “I wonder if the ability to tell [the difference between a hemi and a human] is always accompanied by the certainty that one is better than the other.”