Future Tense

Are You Still You if Your Brain Is Uploaded?

Certain death isn’t the only problem with that company promising a form of digital immortality.

Next week, representatives from a startup called Nectome will pitch their idea to an audience of investors in Silicon Valley. According to its website, the company is “Committed to the goal of archiving your mind.” That commitment is not in doubt, but one can question what Nectome means by “mind” and “your.” The idea is to take living people—preferably those already on their deathbeds—pump embalming fluid into them, killing them in the process, then freeze them and scan their preserved brains into a computer. (Wondering whether Nectome can get away with this? According to MIT Technology Review, “The company has consulted with lawyers familiar with California’s two-year-old End of Life Option Act, which permits doctor-assisted suicide for terminal patients, and believes its service will be legal.”) The intended result: an afterlife in silico.

According to Nectome, 25 people have each paid $10,000 to be on the waiting list—all of them demonstrating magical thinking. Not just magical thinking as in unrealistic optimism about the capabilities of technology, though that’s probably the case, too. I mean magical thinking as in believing in actual magic, in the form of a soul.

Suppose the technology somehow worked perfectly. You’re frozen, your brain is scanned, your mind is reconstituted in a machine, and its circuits are switched on. The machine now has thoughts and emotions and memories identical to those of your prefrozen brain. Let’s say the artificial brain is also connected to an artificial body sending it signals much like a human body would: touch, sight, taste, etc. That machine would feel just as you would if you went to sleep and woke up in a new body. But in a critical sense it would not be you.

Why not? Imagine you did not die in the freezing process. You wake up on the table and drink a cup of hot chocolate. What’s that thing over there in the corner, making bleeping noises and claiming to be you? Surely you and the machine can’t both be you. You don’t feel its feelings and it doesn’t feel yours. You are distinct beings. So if that thing is not you when you wake up, why is it you if you don’t wake up? You may have thought it was you because, given only two serial strands of consciousness—one in a functioning brain pre-scanning and one in a functioning machine (but not brain) post-scanning—you intuitively link them. But connecting the pre- and post-uploaded minds into a continuous narrative of “you” implies something immaterial and supernatural has leapt from body to machine. It implies a soul.

(I will not try to disprove souls here, and I acknowledge that a proof of anything’s nonexistence is hard to come by—I haven’t seen a unicorn but maybe they’re hiding. However, I will point out that scientists have yet to find credible evidence that consciousness can exist independently of a physical substrate. Similarly, there’s no evidence for free will, the power of an immaterial consciousness to influence its material host, poltergeistlike.)

The thought experiment above may have left you with a nagging question. If you wake up in your body and call the machine in the corner a brand-new person, how do you think it feels? From its perspective, it went to sleep and woke up in a new body, and someone else is now inhabiting its old one. That thing is the impostor. What’s the solution? You both have souls? You’re both you? I think the solution is that, in a sense, neither one is you.

To some degree, any notion of personal identity suggests belief in a soul. When you nap and wake up, those are two different instances of consciousness; the second just happens to have the ability to simulate what the first instance of consciousness must have been like, what psychologists call episodic memory. Now take out the nap. Your consciousness in one instant can’t access your consciousness in any previous instant, no matter how infinitesimally recent. Every moment contains a different instance of consciousness. There’s no continuity there, nothing that persists of its own accord, just a perpetually generated series of projections, like movie frames. We’re born and we die infinite times per second. (Or maybe only 10^44 times if time is discrete.) That doesn’t mean we can’t tell stories about continuous identity. I feel like the same person I was a minute ago, and we hold humans accountable for what “they” (their bodies) did in the past. These are valuable conveniences. But they’re stories.

Nectome also sells magical thinking in the more secular sense of irrational optimism. Before you even get to notions of a soul, how faithfully can technology archive a mind in the first place? As I’ve written in Slate, we don’t know how much information the brain contains, so we don’t know how much we need to archive. Do we need every cell, every synapse, every molecule, every atom? And then will the archival format be conscious? Computer simulations of water aren’t wet. It’s possible the archive would have to be run on a computer that is physically identical to the original brain to capture all its functionality. In which case: Why re-create a clone brain when you can just fix the original frozen one? Archiving may not offer anything beyond regular cryonics and reanimation, if they ever work.

Nectome’s product is the ultimate vaporware. Customers are offered something, and by dint of purchasing it, they may make it impossible ever to see whether the product actually comes to be, because they’ll be dead. Which also makes it the ultimate killer app. Maybe the technology will someday work, and people will wake up in a machine. But will they really? No, that will be someone else. See my comments on: souls, lack thereof.

Maybe customers are OK with dying and just having an entity somewhat like themselves exist in the future, as a kind of legacy. They want their current ideas to have a life of their own and influence the world in perpetuity. That’s fine. It’s also a somewhat weaker form of magical thinking. Our desire for legacy is in part based on the belief that our consciousness will survive as part of our legacy, in what psychologists call “symbolic immortality.” People regularly sacrifice themselves for the survival of larger ideals with which they identify—God and country, etc. In the trade-off between actual and symbolic survival, they must see some equivalency, some way in which their soul will benefit later from the price it pays now.

That’s magical thinking, which, as I said, is fine. We all want to leave a legacy. Nectome should just advertise its services as such, at best.
