Last month, one of Westworld’s science consultants stopped by my school. David Eagleman, a neuroscientist who advised the show’s staff for season two, and his co-author Anthony Brandt visited to chat with students about their new book on creativity, The Runaway Species: How Human Creativity Remakes the World.
When it was time for Q&A, I had big questions for him—but they weren’t about the book. I wanted to grill him on something from Westworld that bothered me for the duration of season two: the continuity of consciousness problem.
Season two of Westworld features multiple plotlines in which characters seek a sort of technological immortality, either by having themselves resurrected as androids or by being uploaded into a new virtual reality. One plot involves the park’s biggest shareholder, James Delos, having his consciousness installed in the body of a Westworld “host,” or android. The attempt ends in tragedy—it turns out that unless the copying is perfect, the copied subject will break down, go mad, and have to be destroyed and recreated. The cliffhanger at the end of the season is the reveal that one of the show’s villains, William (a.k.a. the Man in Black), has also been mysteriously resurrected in the future as a host.
The show treats these robotic rebirths as attempts at immortality, whether they work or not. Whenever the process even moderately succeeds, the show implies an unbroken through-line between the dead person’s consciousness and that of the resurrected one. But a thought nagged me whenever these scenes appeared on screen: copying a person is not the same thing as prolonging that person’s existence. Once a copy exists, there’s no reason to believe that the original consciousness—the person’s unique sense of being—transfers to the copy. There’s no guarantee of continuity, and the show didn’t seem to care much about that.
This hadn’t been an issue in season one, even though it features “copies” of real people designed by park mastermind Robert Ford. Bernard Lowe is Ford’s android remake of Westworld co-founder Arnold Weber, and Ford makes host versions of his own family members—and even of himself as a child. Yet the show implies no continuity between the human consciousnesses of the originals and the android sentience of the simulacra.
The change in season two to treating android copies as a means of immortality bothered me because it ignores this basic philosophical question. I’d hoped that the sci-fi community had already worked this one out with Star Trek, but the case of Westworld suggests otherwise.
The Transporter Paradox
Fans of Star Trek might know this problem better as “the transporter paradox,” a philosophical quandary that predates the show but perhaps finds its best expression therein. The question of the paradox is as follows: If an individual’s brain destructs or disintegrates for any reason, and technology subsequently allows the brain to be rebuilt exactly from either the same or different component parts, is the original consciousness of the individual preserved?
On Star Trek, transporter technology breaks down people and reconstructs them in a new place. So, is the narrative of that person’s life snuffed out, with a paradoxically new yet technically identical person beamed down in their place? If you believe that an individual’s consciousness and identity have coherence only until they are disrupted or destroyed, then you may well think that a Star Trek transporter essentially murders the person it teleports and builds a clone to replace them. The new person would even have all the thoughts and memories of the old, so they would have no sense of their entire reality coming into being only moments ago.
The viewer may be willing to buy into the logic of the show and assume the matter is irrelevant or somehow solved within the fantasy of the future. That said, the show itself deals with the paradox in the Star Trek: The Next Generation season-six episode “Second Chances,” in which a transporter accident creates two William Rikers, both of whom are technically the same person. Neither man can claim to be the original, and they struggle with how the malfunction of a device has stripped them of their uniqueness, leaving two identical men with the same memories but discrete consciousnesses.
Just as transporters on Star Trek destroy and then perfectly recreate individuals, the android technology of Westworld allows its scientists to make exact copies of dead people. Thus, the transporter paradox equally applies to Westworld, since the show invites us to view these “resurrected” people as non-distinct from their original selves. What struck me about season two was how uninterested the show was in challenging this assumption, even though a foundational series like Star Trek had—at least tentatively—engaged it.
Just Asking Questions
I held back my question for a while during Eagleman and Brandt’s Q&A, not wanting to take time from my students. However, when the discussion ranged beyond the book and students seemed to be running out of things to ask, I saw my opportunity. I raised my hand and said, “On Westworld, a lot of the characters are after some form of technological immortality. They want to have their brains replicated in machine form, or they want to be uploaded into a computer. Dr. Eagleman, you’re a neuroscientist, so can you tell me if this preserves the person’s consciousness in any meaningful way? Or, is it just making a copy or clone of the person, who is, from their own perspective, dead no matter what?”
Eagleman responded enthusiastically. He said that the scientists of the future would very likely devise technology that perfectly mirrors the functionality of the human brain. He described the brain as a machine itself—an organic and complex one, but a machine nonetheless—and in that respect, there was little doubt that it could be reverse-engineered. The problem, as he saw it, was that such copies might struggle to make new memories and record significant new information, leaving little for them to look forward to. Even this, he hoped, might be overcome.
I appreciated his response, but I did not quite see it as an answer—it was like we were talking about two entirely different things. He was eager to predict that scientists would copy individuals’ brains in the future, while I wanted to know if that was the same thing as prolonging the singular consciousness trapped inside a person’s brain. At first, I thought that maybe he had misheard my question, but I looked up his own writings on the subject and found this blog post, in which he writes out much of what he told me and then concludes that the “downloading” of a brain results in eternal digital life for the downloaded person.
Upon reflecting on Eagleman’s answer, I found that it did resolve many of my larger questions about Westworld season two. I had wondered why the show did not bother to address the continuity of consciousness problem in any way. I had mused at how it depicted computer brains and digital playgrounds as paths to personal immortality, albeit complex and expensive ones. I was beginning to understand that the creators were likely more concerned with the scientific question of “Could it be done?” than the philosophical one of “Does it matter?”
Frankly, I’m not sure that it matters much for the downloaded person. I agree with Eagleman that scientists will likely be able to copy a human brain. I even believe that the information in a person’s brain could theoretically be copied into a digital environment, such as the various virtual reality settings of Westworld season two. However, I’m also fairly certain that the death of the meat-brain is the death of the prime individual. Make as many copies as you want: we have yet to envision a means of translating the organic brain into inorganic immortality in a way that preserves the singular continuity of the human mind, and Westworld fails to even hypothesize a solution. What the show does envision is the creation of a new and distinct continuity of consciousness that has no tether to the old one.
Here’s a thought experiment. Imagine that we do have the technology to copy the human brain. You watch a scientist make a perfect copy of your brain and install it into a 3D-printed version of your body. It’s just like you, memories and all. Now, you watch the scientist put a gun to the copy’s head and pull the trigger. Did you just die? Of course not. Your own consciousness survives, and a separate consciousness, identical to yours, dies. So, why would you think a copy of your brain made after your death would be any different? It would be a unique entity.
Here’s another thought experiment. Imagine we have the technology to copy minds into a computer afterlife. Scientists can download the exact data from your brain and transfer it into a digital environment. Now, imagine that they do it while you’re still alive. A digital version of you is happily at play in Computer Heaven, and—oh, what’s this?—a scientist is pressing the delete key. File terminated. The meat-you is still pacing the lab, perhaps a little traumatized, but are you dead? Of course not. So, again, why would you think a digital download put online after your demise would be any different?
What I’m trying to demonstrate is that the creation of a perfect copy of a person does not entail the transfer of the individual’s consciousness to the copy. People interested in this technology want to live forever. They hope, as Eagleman writes, that “we will not have to die anymore.” What it seems like they will get is a distinct version of themselves that lives on after the original dies—a sentient statue in memoriam.
For some, that may be enough. It’s easy to imagine that an Elon Musk or a Donald Trump might think of themselves as important enough that the world should never be without some version of them. But, for characters like those in Westworld—and many of us in our world—who would simply like to forestall death indefinitely and continue our particular experiences in a new body or a new environment, there should be little comfort in these technologies. What’s dispiriting and, in fact, retrograde is how eagerly and without question the show embraces them.
Reinventing the Soul
As best I can tell, the idea on Westworld is that an assemblage of your exact thoughts and personality traits summons you into existence, even if you’re dead. When we watch William perform an adapted model of the Turing test on the Delos android, he’s checking for accuracy and sanity. He wants to make sure that the new Delos will respond exactly as the old one would—and he must also ascertain that the process hasn’t driven the creature mad. The whole thing feels mystical, like a summoning or resurrection.
Similarly, at the end of the season, the renegade hosts flee to the Valley Beyond, a virtual promised land that serves as an afterlife for them. Their bodies fall to the ground when they walk through the portal, but their digital selves live on, transferred to a new server secure from the interference of humans.
That’s all well and good for the hosts, who are digital natives. Their computer brains were made to interface with the park’s technologies, leaving them perfectly capable of relocating their personality files and memories—information in a relatively pure form—from one server to the next.
The human form of immortality—rebirth as a host—makes far less sense. A recreated Delos or William is just a copy, but the show never addresses the gap between the dream of immortality and the reality of mere iteration. Instead, it seems to take the special aura of the replicated individual as a given, with the tacit assumption that remaking a person “brings them back.” The show never outright says as much, but it’s implicit within the characters’ hopes for resurrection and the corporation’s dedication to getting it right.
So, what this science fiction television series of the late 2010s seems to have done is reinvent the concept of the soul. Its fantasy notion of bringing a specific human being back to life, original consciousness in tow, echoes both the pre-medieval Western tradition of resurrection and the Eastern corollary of reincarnation.
If we think of Westworld as inspired by Silicon Valley startups, this is actually on-brand—as tech tycoons are notorious for applying their genius to inventing things that, for the layman, appear to already exist, such as Elon Musk’s accidental reinvention of the bus stop or his potentially unnecessary revision to the brick. These ideas of the soul are ancient, primal examples of magical thinking, and it’s odd to see something like them surface unquestioned in a science fiction show that otherwise takes its science seriously.
How to Make a Brain on a Budget
If we separate the scientific advance of creating a computerized brain from the impossibility of preserving the individual thread of consciousness, the feat of making new brains on demand seems considerably less impressive. After all, about half the human population already has the onboard technology to 3D-print human brains—they just take nine months to gestate and then about two decades to fully mature. What’s more, the human beings attached to those brains carry the genetic heritage of their parents, meaning that they lend their creators the low-tech continuity of lineage, a counterpart to the high-tech continuity of an android copy. After all, if your individual consciousness won’t survive no matter what, a kid is a pretty cheap alternative to a clone.
In the penultimate episode of Westworld season two, the dying William kills his own daughter Emily as he seeks out the Forge, the site where the Delos Corporation makes host-copies of real humans. I assumed that the finale would deal with the sacrifice of William’s genetic lineage in his pursuit of immortality’s false hope. I was wrong. In the examples of James Delos as well as William, the show pitches android reincarnation as a kind of hell because the process is flawed, but it still treats the transference of the individual consciousness as legitimate, and it plays Emily’s death merely as more evidence of William’s ruthlessness. In the world of the show, her murder hardly matters, because she becomes an android in the epilogue, too.
I’m left feeling that Westworld tiptoes to the precipice of the big questions about how our meat brains are incompatible with the computer devices we will create to enshrine them, but that it peers into the dark depths and walks away, not liking the answers. The fantasy of living forever is enticing. The likelihood of it remaining a fantasy is not.
The students at my STEM high school have me wondering if this is primarily a problem for those of us who are millennials or older, though. Having watched the world transition from analog to digital, we view the technology of the latter with the assumptions of the former. But after Eagleman and Brandt spoke, a student walked up to me and asked, “Did your question today come from playing SOMA?” She was referring to the indie horror game that proposes consciousness cannot be transferred, only copied. In another conversation, a different student compared the continuity of consciousness problem to cloning a pet, something we can already do. “It’s just a comfort to the owner. It’s not the same dog,” she said. “Remaking a person would be like that. It might help their loved ones with grief or it might be weird, but it wouldn’t mean anything for the actual dead person.”
These young people acknowledge the vanity of immortality science and recognize it for the raw expression of ego that it is. Every human being that has ever lived has died, will die, or is presently dying. In the arts and humanities, brilliant people have devoted considerable intelligence to coming to terms with that. However, there persists the fantasy that we can innovate around mortality, disrupt death in the way that Uber struck at taxis and Netflix killed Blockbuster. We will outsmart the grave. Tech CEO Hamlet will spike Yorick’s skull like a football in the endzone.
The continuity of consciousness problem, which my students grasp better than many adults do, argues otherwise, and I wish that Westworld had adequately explored that theme. At any rate, the kids—the genetic continuity of their parents, who will never extend their consciousness beyond the grave—are alright.