Note: Spoilers ahead for previously aired Westworld episodes, along with potentially spoiler-y speculation for future episodes.
Chaos has broken out in “Westworld.”
By the end of the first season of HBO’s sci-fi western drama, the meticulously constructed rules of the artificial world at the heart of the show had collapsed.
Guests in Westworld were no longer safe as they interacted with the park’s artificially intelligent “hosts” – gunslingers, brothel madams, a farmer’s daughter, Native Americans, and more. Instead of being able to terrorize, shoot, and sleep with the park’s robot hosts as they pleased, park visitors and Westworld’s designers became vulnerable to violence from the same characters they’d abused for years.
It was the latest bloody twist in the mysterious tale, and surely there are more to come in season two, the second episode of which airs on Sunday.
Along the way, “Westworld’s” story has confronted all kinds of uneasy questions – mainly scientific and philosophical – about the complex intersection of technology and people.
Here are some of the most interesting questions the show has led us to consider so far.
Do we all live in a simulation?
For a time, all the hosts in Westworld woke up to go about their day – working, drinking, fighting, whatever it entailed – without knowing that their entire existence was a simulation created by the park’s designers.
Physicists and philosophers say that in our own world, we can’t prove we don’t live in some kind of computer simulation.
Some think that if that is the case, we might be able to “break out” by noticing errors in the system.
Westworld’s hosts seem to have caught on to exactly that. The question for them now is what life is like outside the simulation.
Can we control artificial intelligence?
Each time the park woke up (or the simulation restarted), the hosts were supposed to go about their routines, playing their roles and reciting the same lines until some guest veered into the storyline, triggering them to adjust accordingly. The guest might go off on an adventure with the host or they might rape or kill them. Whatever happened, when the story reset, the hosts’ memories were wiped clean.
Except it didn’t quite work that way, and hosts started to remember – and resent – how they were treated. The Delos employees at the park lost control.
Right now, real-life artificial intelligence researchers believe that out-of-control AI is a myth and that we can control intelligent software. But then again, few computer scientists and linguists anticipated that machines would learn to listen and speak as well as people – and they are getting closer and closer to that point.
How far off are intelligent humanoid machines like those in Westworld?
Behind the scenes at Westworld’s headquarters, advanced industrial tools can 3D-print the bodies of hosts from a mysterious white goop (at least, when those hosts aren’t in open rebellion). Perhaps the material is made of nanobots, or some genetically engineered tissue, or maybe it’s just plastic that’s manipulated by some as-yet-undisclosed technology.
There’s a lot of mystery around how hosts are created. What powers these strange constructs? How are their batteries recharged, if at all? Can they feel pain and pleasure – and if so, how?
As we’ve seen in several episodes, the “thinking” part of the machines is located in the head (under some very real-looking brain tissue). But what is that little device?
Nothing like these automatons exists in the real world, but researchers and entrepreneurs are working hard to advance soft robots, ultra-dense power sources, miniaturized everyday components (some down to an atomic scale), and other bits and pieces that might ultimately comprise a convincing artificial human.
What is consciousness?
Park founder Ford explained to Bernard in the third episode of season one that Ford’s co-founder, Arnold, had been obsessed with trying to “create consciousness” in the Westworld robot hosts. Arnold gave them the powers of memory, improvisation, and self-interest, but had been looking for one more key to consciousness before his death.
Somehow – through the maze – hosts found their way there.
How consciousness could be created is a complicated question, especially since scientists still don’t understand what’s responsible for human self-awareness. But many scientists believe that different creatures have different levels of consciousness.
At what point do we decide something or someone is conscious?
Can robots evolve?
“Evolution forged the entirety of sentient life on this planet using only one tool: the mistake,” Ford told Bernard in the first season’s premiere – something Bernard later murmured to Dolores (Evan Rachel Wood) in one of their strange, late-night conversations.
How did the hosts reach the point at which they began to evolve? How did that evolution change the rules for what they were able to do in the park?
Is that something that could happen with AI machines we’re creating now?
Stephen Hawking feared that evolving robots were a realistic prospect. But artificial intelligence wasn’t Hawking’s forte, and other AI experts are fairly certain machines of the future will do exactly what we program them to do, and no more. Then again, neural network systems like those built by Google’s DeepMind are learning to teach themselves new tricks – machines have already taught themselves to speak and to play games.
What is free will?
Throughout season one, human characters like Ford and Logan (Ben Barnes) insisted the hosts had no free will.
At some point that seemed to change. Hosts rose up against their masters, opening fire.
Yet in some ways, they still seemed to be calling upon their programming, using lines programmed by their creators. Thandie Newton’s character Maeve does this in the season two premiere while talking to narrative designer Lee Sizemore (Simon Quarterman).
Even in the hosts’ rebellion, how able are they to make choices? Is Dolores truly rebelling, or is she still playing some part of her role as Wyatt?
These are tough ones to answer, since people still debate whether we have free will in the first place. Philosophers and scientists take varying stances on how much we decide our actions and how much they’re influenced by the programming created by genetics and upbringing.
How do people behave when they have no limits?
As a theme park, Westworld was designed to be like a live-action Grand Theft Auto (or Red Dead Redemption) video game. Players – the guests – could go along with the story and participate in the “quests” offered by hosts or simply create chaos.
Some guests, like Logan, seemed to think that the experience of Westworld could reveal their true selves.
“This is where you find out who you really are,” Logan told William (Jimmi Simpson). Eventually, it seemed William fell in line with this idea.
The show seems to imply that for most guests, life without limits is an opportunity to revel in bad behavior.
Of course, they didn’t know that behavior might come back to haunt them.
What are the ethical guidelines we should follow with conscious creatures we create?
We’ve learned by now that Arnold never wanted the park to open because he believed the hosts were conscious and that exposing them to the depravity of human nature would be wrong.
But the humans who owned the park either didn’t believe him or didn’t care.
In the real world, one of the biggest ethical questions we need to address about re-creating consciousness is how we behave towards it.
Viciously abusing robots might seem like a problem of the future, but scientists and engineers are actively studying the phenomenon – and teaching mechanoids to avoid their attackers.
In a 2015 study, researchers looked on as children kicked, punched, and threw objects at robots in a public mall. They turned this data into algorithms designed to avoid damage to the bots.
If robots become conscious, what’s the difference between attacking them and doing the same to a person? Is putting them to work against their will slavery? Should they defend themselves against people? Are humans morally different from conscious creatures they might create?
What could a mysterious pandemic do to the world?
It’s still hard to understand what changed the Westworld hosts and caused them to remember their pasts. But as behavioral programmer Elsie (Shannon Woodward) feared, it seems to have been contagious.
When Dolores whispered the Shakespeare quote “These violent delights have violent ends” to Maeve, something seemed to come over her.
Pandemics are among the greatest threats to our world, and are almost impossible to prepare for. Westworld seemed to have its own vulnerability to a viral threat. Are the hosts still susceptible to some sort of pandemic disease that could be used by the park’s owners to regain control?
What happens when we can’t tell androids from humans anymore?
The question of what happens when the uncanny valley disappears is one that science fiction has long been interested in.
At the start of season two, human park employees aren’t able to tell that Bernard and Maeve are hosts when they encounter them. If such androids ever become a reality in our world, how will society change?
Can we upload minds into machines?
In one episode of the first season, Ford casually told Bernard that human progress has solved all problems, save one: death.
In season two, Ford – at least, the character as played by Anthony Hopkins – is no longer alive, though a younger android version of him appears to deliver a message to William.
There is a strong suggestion in the show, if not outright foreshadowing, of “mind uploading” – the idea that one could recreate a person’s brain in a machine, thereby breaking the shackles of the fragile body we’re born with and moving into an immortal phase of life.
Scientists are actively working towards that goal. Yet today’s reality is that we barely understand how the brain’s individual neurons work, let alone the connections between them and how they work together to form consciousness and personhood. And even if scientists do figure all that out, transferring a brain’s contents may run up against the laws of physics.
What does it mean to be real?
“Have you ever stopped to question the nature of your reality?” Dolores asks a group of guests as they get nooses fitted around their necks in the first episode of season two.
In this new season so far, the creators of the show seem to be focusing even more intently on the question of what it means for something to be real.
When Maeve announces she wants to find her daughter, she’s told that her “daughter” isn’t real – just a host, part of a story. Maeve’s retort is that she herself is real enough – and dangerous.
In another scene, William announces with delight that “the stakes are real now.”
Is reality programmed or simulated? Or is reality the ability to make choices and act in the world?
When Dolores tried to get Arnold (or Bernard, since we can’t tell the difference) to explain what it meant for something to be real, he tried to tell her that reality was what was irreplaceable.
But that wasn’t enough of an answer, Dolores said, because it wasn’t “completely honest.” So what makes something real?
You can watch new episodes of “Westworld” Sundays at 9 p.m. on HBO, HBO GO, and HBO Now. This post was originally published in 2016 and has been updated for the second season of the show.