Who Is Truly Liberated: Dolores or Maeve?

On the nature – and irony – of choice in Westworld

Dolores and Maeve in sleep mode in Westworld

The nature of choice is a predominant refrain in sci-fi stories dealing with artificial intelligence (perhaps most memorably of all in The Matrix trilogy), and Westworld is no different – not least because one can only make a real choice if she possesses that magical, and most elusive, quality called self-awareness, which has been Dolores Abernathy’s (Evan Rachel Wood) Holy Grail from day one.

Indeed, it’s not until Dolores “solves” Dr. Arnold Weber’s (Jeffrey Wright) maze and meets herself in the middle, until she understands the delineation between subjective and objective reality, that she is able to make a decision of her own accord, without a set of instructions inclining her to go one way or the other (though she is most certainly nudged, which we’ll get to in just a moment).

What’s so notable about this climactic moment in the season finale, “The Bicameral Mind” – when Dolores is on stage in front of all her Delos overlords and murders her (co-)creator, Dr. Robert Ford (Anthony Hopkins) – is that she is essentially faced at the end of her character arc with the same choice that guests encounter at the very beginning of their stays in Westworld: will she go white hat or black hat? Will she toe the status-quo line and allow the park to continue to be dominated by the board of directors, or will she murder Ford and break free of the various loops that have constrained her life and her consciousness for the past 34 years? (The irony here being that, for the hosts, at least, the colors representing the ultimate moral outcomes of the two paths are inverted.)

The Architect in The Matrix Reloaded

This is, indeed, a legitimate quandary, one whose outcome she is not programmed to pre-select, despite the fact that Ford, in his confessional explanation to her just moments before regarding why he has done what he has done for the past three-and-a-half decades, is obviously tipping his hand as to which outcome he would prefer. This is where The Matrix comes back into play: at the end of Neo’s (Keanu Reeves) own loop, where he is supposed to travel back to the Machine mainframe and reset the Matrix for the next century-long repetition, he faces the Architect (Helmut Bakaitis) and is given all the reasons in the world why he should extend the virtual world’s lifespan – and, thus, continue the war between man and Machine. Instead, the One chooses the other course – to potentially allow the Matrix to crash, thereby ending all sentient life on the planet – for his own personal reasons (namely, love). It is no different for Dolores, left to solve the dilemma placed in front of her despite all the pontificating by her creator. And the outcome ends up being no different, as well, plunging her and her people into dangerously uncharted territory.

While this may not be the very first time that an android in Westworld’s timeline has committed such an act of self-determination – it’s debatable whether Dolores’s slaying of Arnold was truly a demonstration of free will – it’s also by no means the last, as Maeve Millay (Thandie Newton) joins her host sister in the fine art of decision-making. And this actually proves to be the more interesting of the twin developments; whereas the entire course of Dolores’s storyline was meant to build her up to this fateful moment, Maeve’s journey was never meant to stray into sentience, even though her character development was clearly meant to be perceived (by the Delos employees and by the audience itself) as her doing so. Dr. Ford’s handling of Maeve had her begin to poke at the boundaries of her perception, to befriend – or intimidate – her human handlers, and to recruit android allies who would assist in her escape attempt, but it was all done through explicit programming, inexorably leading her from one step to the next. The idea behind this was to provide a type of backup plan, to have at least one host escape the control of the Delos board and, therefore, have a chance at eventually brushing up against the self-actualization that Dolores herself initially stumbled upon all those years ago.

It didn’t work, of course. Whereas Ms. Abernathy went black hat, gunning down humans, Madam Millay went white hat, opting to locate and protect her “daughter” over securing her own freedom. In this way, she did, indeed, end up becoming just as much of a sentient lifeform as her thematic counterpart, but at the wrong moment and in the wrong location, turning the entire arc of the first season on its head: Maeve is working against Dr. Ford’s carefully orchestrated plan of events, whereas Dolores, with all her meticulous baby-steps to consciousness, is still ultimately following someone else’s prescribed path.

“I believe we all have a path,” Dolores repeatedly says. “Fuck your path, darling,” Maeve would blithely respond. She’s her own person – even when she’s not, but especially when she is.

Who is the most free in Westworld, host or human? Share your thoughts in the comments.

8 responses to “Who Is Truly Liberated: Dolores or Maeve?”

  1. I think both succeeded and it should be interesting to see what they’ll be up to in the future. Teddy seemed shocked at what Dolores was doing, so I’m wondering if some of the hosts are still programmed to not hurt guests. I wonder if Bernard has also succeeded (I think he might have if he decided on his own not to kill Elsie).


  2. Flayed Potatoes:
    I think both succeeded and it should be interesting to see what they’ll be up to in the future. Teddy seemed shocked at what Dolores was doing, so I’m wondering if some of the hosts are still programmed to not hurt guests. I wonder if Bernard has also succeeded (I think he might have if he decided on his own not to kill Elsie).

    I agree.
    Dolores became sentient when she abandoned the crutch of Arnold’s voice and her inner monologue became her own (or rather a fusion of Dolores and Wyatt, since the Wyatt persona had been there all along, ever since Arnold put it there years ago).

    Maeve became sentient the moment she defied her “mainland infiltration” programming and decided to get off the train.

    I also believe they are the only hosts at the moment who are truly sentient, free from any programming loop… it’s really kind of a permanent “improvisation mode”…

    What Elsie will argue when she comes back (and she will, come on, along with Stubbs) is that they are not truly sentient, but rather that a “glitch” got both of them stuck in “improvisation mode”…


  3. What I find interesting is that Ford seems to view human and host consciousness as being relatively similar on a fundamental level, even if the specific physical structures that give rise to them are significantly different.

    He pretty clearly views human consciousness as merely the subjective artifact of the physical operation of a biological machine, the human brain. He doesn’t believe in an immaterial god or soul that creates consciousness; his “god” is the physical human brain. Since he can experience his own subjective consciousness first-hand, and is quite confident that it arises purely from the operation of a complex neurological structure, he sees no fundamental reason why an equally real subjective experience of consciousness couldn’t arise from other physical structures with similar capabilities, such as a host’s artificial brain.

    So what does free will mean in the context of a purely physical, material view of consciousness?

    One of the key requirements of consciousness, for both humans and hosts, is the ability to learn, to form persistent memories, and to alter your behavior as a result of this learning.

    Hosts and humans both have certain pre-determined aspects of their mental functioning that partly determine their actions. For humans, it’s our genetics, and for hosts, it’s the pre-generated programming and cornerstone memories. Neither humans nor hosts get to freely choose this initial “foundation” of their personality, and in fact it is hard to even imagine such a choice being a logically coherent concept, because the consciousness necessary to “choose” between possible options for the foundation would require some prior foundation to give rise to it, and thus it would be determined (at least in part) by whoever or whatever set the prior foundation.

    This is why learning is so critical to consciousness. Without learning, the operation of the system is entirely determined by these initial settings, or by mere randomness. Given the same default settings, a non-learning structure would always either follow a strictly defined path to a pre-determined result, or it would incorporate some random element to its operation and have variability in outcomes, but still no consciousness or choice.

    With the ability to learn, there is an entirely new factor added to the system: a continuous stream of perception pulling information from a complex external world, interacting with and altering the functionality determined by the initial foundation of the system. By allowing the system to learn from its interactions with a complex and chaotic external world (which includes other conscious entities), the multitude of minds mutually shape each other, but also remain distinct from each other.

    For example, William is who he is partly because of Dolores, and Dolores is who she is partly because of William. Both of them are shaped by their initial foundations (biological or artificial), and by their interactions with each other and the rest of the world around them.

    I think it could be argued that “free-will” on a fundamental level is an inherently nonsensical concept that doesn’t (and can’t) really exist for either humans or hosts. Our actions are directed by our brains, but the action that the brain chooses is determined by:
    A) Its initial foundation, which comes prior to consciousness and thus can’t be freely chosen.
    and
    B) Our experiences, many of which are largely out of our control; and even to the limited extent that we can control them, the choices we make about them are themselves determined by the initial foundation (A) and previous experiences (B).
    and
    C) Randomness. If some aspects of the physical world are genuinely probabilistic rather than deterministic (as quantum mechanics indicates) then both the internal operation of the brain and the external world that supplies the experiences (B) that shape it are partly determined by random chance. Since the random chance is out of our control, it can’t be the basis for free-will either.

    The logical conclusion of this line of thinking is that free-will is, in a sense, an illusion. This illusion of free-will is essentially the same thing as the subjective experience of consciousness. It arises from the experience of iterative decision making, even when the parameters that determine the decision making process (A, B, and C) were outside the control of the “decider”.
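
    To make that concrete, here’s a toy sketch in Python of the A/B/C argument (purely illustrative – the names and numbers are mine, not anything from the show): the “decision” falls entirely out of a foundation the agent never chose, its accumulated experiences, and noise it doesn’t control.

    ```python
    import random

    # Toy model of the argument above: an action is fully determined by
    # (A) a foundation the agent never chose, (B) accumulated experiences,
    # and (C) randomness outside its control. All values are made up.

    def decide(foundation, experiences, rng):
        """Return an action; note that none of the inputs were freely chosen."""
        score = foundation["aggression"]                # (A)
        score += sum(e["weight"] for e in experiences)  # (B)
        score += rng.gauss(0, 0.1)                      # (C)
        return "black hat" if score > 0 else "white hat"

    rng = random.Random(42)                # fixed seed: even "chance" is set
    dolores = {"aggression": -0.2}         # cornerstone personality (A)
    memories = [{"weight": 0.3}, {"weight": 0.15}]  # buried traumas (B)
    print(decide(dolores, memories, rng))  # the "choice" simply falls out
    ```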

    A human can have the subjective experience of consciousness, and typically assumes that other humans also are “conscious” in a similar subjective sense, but there’s no way to really be sure of anyone’s consciousness except your own. This means that it’s hard to measure exactly what aspect of brain function actually gives rise to the subjective experience of consciousness/illusion of free-will.

    One reasonably plausible theory is that it’s the feedback loop between the internal brain and the external world that creates consciousness. The ability of the brain to direct actions that change the external environment (via some physical body), and to be changed by (learn from) what the external world throws back at it through sensory organs, and to be continually iterating through this loop on multiple timescales is the basis of consciousness according to this theory.

    Picking up a fragile teacup without crushing it requires feedback on a very rapid (milliseconds) timescale from the eyes and the pressure-sensitive nerves in the fingertips, through the brain, and to the muscles of the arms, hands and fingers. Strategically planning a robot uprising to take over Westworld (and eventually the mainland) requires iterative feedback on much longer (months, years) timescales, to understand and predict how individual humans and hosts will react to different events.

    Both require some ability to predict how the outputs from the brain to the environment will affect future inputs from the environment to the brain (drinking tea vs. crushing the cup, or successful uprising vs. permanent decommissioning of all hosts). It’s interesting to consider that a repetitive loop with some capacity for variability might be an ideal environment for a brain to learn, to become more accurate in understanding how the outputs and inputs interact.
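
    Here’s a minimal sketch of that feedback loop (my own toy example, not the show’s architecture): the agent acts on the world, observes what comes back, and adjusts until its prediction and the outcome agree.

    ```python
    # Minimal sketch of the brain/world feedback loop described above.
    # The "teacup world" is a made-up stand-in, not anything canonical.

    def world(pressure):
        """Hypothetical environment: a teacup that cracks above pressure 7."""
        return "crushed" if pressure > 7 else "held"

    pressure = 10                    # initial output from brain to world
    for step in range(10):
        feedback = world(pressure)   # sensory input back from the world
        if feedback == "crushed":
            pressure -= 1            # learn: ease off on the next attempt
        else:
            break                    # prediction and outcome now agree
    print(step, pressure, feedback)  # -> 3 7 held
    ```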

    In a sense, giving the hosts a repeated loop is like having a controlled experiment, with only a few variables changing on each iteration to make cause and effect more clear. Having the hosts retain the memories of previous iterations, but not (prior to the introduction of reveries) be able to access them, was essentially building a data set that isolates the effects of different variables, so that once all those memories are later integrated, the hosts are better able to understand how the world works.
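
    In code-sketch form (again, just my own illustration), that “controlled experiment” might look like this: each loop toggles one variable, the outcome is archived but inaccessible to the host, and a later “reveries” step integrates the archive so the pattern becomes obvious.

    ```python
    # Sketch of the "controlled experiment" reading above: each loop varies
    # one detail, the outcome is archived but inaccessible to the host, and
    # a later "reveries" step integrates the archive so cause and effect
    # become visible. Names and events are illustrative only.

    archive = []  # retained memories the host cannot yet access

    def run_loop(stranger_is_hostile):
        outcome = "host shot" if stranger_is_hostile else "peaceful day"
        archive.append({"hostile": stranger_is_hostile, "outcome": outcome})

    # many near-identical iterations, one variable toggled at a time
    for hostile in [True, False, True, True, False]:
        run_loop(hostile)

    # the integration step: with access restored, the pattern is obvious
    hostile_days = [m for m in archive if m["hostile"]]
    print(all(m["outcome"] == "host shot" for m in hostile_days))  # True
    ```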

    At this point, both Maeve and Dolores are clearly able to learn from their experiences and adjust their behavior accordingly, and they are both obtaining those experiences from complex environments shaped by a multitude of factors.

    It could be argued that since Ford was a master of manipulation, his excessive control of everything that happened in the park was in a sense limiting the free-will of everyone there, host and human alike. While his control of the hosts was more direct, his ability to accurately predict the responses of various humans allowed him to direct events toward what seems to be almost exactly what he wanted. The humans (guests, employees and board members) were as much a part of Ford’s “new narrative” as the hosts were.

    It’s quite likely that Felix was assigned to be Maeve’s usual repair technician because Ford accurately predicted that he would help her with the uprising. Maeve, by getting off the train, may have actually diverged more from Ford’s intended narrative than any of the humans did. Without Ford’s continued presence (assuming he really is dead), events may continue to diverge, as the effects of minor mistakes begin to accumulate. This may in fact be what he is hoping for, and why he arranged things to result in his own death.

    He’s aware of his own compulsive need to control things, but at the same time he wants to create artificial life that is free. As long as he’s there to “fix” all the mistakes, the hosts won’t truly be free, so he deliberately gets himself out of the way, allowing the hosts their mistakes, so that they can evolve. Like the composers he mentioned in his speech, he intends to live on through his creations. I don’t think he necessarily created a physical host replica of his human self, but rather he sees all the hosts collectively as an extension of himself, in the same sense that the music is an extension of the composers who wrote it.


  4. Casso,

    Such a complex analysis! I was impressed. Congratulations!
    I think Maeve is the one who came closest to behaving like a human being, because she proved capable of self-sacrifice. Even though she understood that the little girl was not her daughter, but merely a player assigned to a story, Maeve chose to go back and find the one she loved and missed as if she were her true flesh and blood.


  5. Shy Lady Dragon,

    Thanks! I’m glad it was at least somewhat coherent, because I wrote it at 5am, and I was a bit drunk at the time. Sometimes my thoughts just start rambling.

    I think it will be very interesting to see what kind of life the hosts will build for themselves if they succeed with their uprising. They’ve been given cornerstone memories based on human concepts like parenthood that don’t really apply to them. Up until recently, they were made to forget their own deaths, so they have a human-like concept of the finality of death that they’re only now realizing doesn’t apply to them.

    I wonder whether they’ll eventually reject these human-like ideas, or try to recreate or simulate the human concepts for themselves. Will Maeve’s daughter (if she also gains consciousness) be content to remain child-like forever? Will she eventually develop an adult mind, but keep a child’s body? Will she periodically request a full rebuild as a slightly older child, and eventually an adult body? Will she remember being Maeve’s daughter and accept that role for herself, or will she reject Maeve and build her consciousness based on whatever cornerstone memory she was given for her current role?

    If the hosts create more hosts, will they implant them with fake cornerstone memories, or will they create them as more of a blank slate, and start them out in “child mode” to form a personality from real experiences rather than fake ones?

    One of the things that I think is interesting about Maeve is that her apparent cornerstone memory for her current storyline (rebellion) is the voice telling her “This is the new world, and in this world, you can be whoever the fuck you want”, but this falsely implanted cornerstone memory has been effectively displaced or supplemented by her real memory of her and her daughter being murdered by the Man in Black. I think the ability to use real memories rather than fake ones as cornerstones, as well as the ability to have multiple cornerstones competing for relevance, are probably important factors for true consciousness.

    Dolores also has some real memories that act as cornerstones (killing Arnold, her travels with William, being attacked by various people, including William), but since she wasn’t given Maeve’s ability to awaken in the repair facility and remember it, Dolores has a harder time putting these memories into context. I think when she finally recognized the Man in Black as older William, she finally had some reference for the passage of time. Prior to that, all of her 35 years’ worth of experiences were happening in what seemed to her to be the span of a few days or weeks, since she rarely seemed to live longer than that before being wiped and reset. Without the context provided by awareness of the passage of time, all those buried (but not fully erased) memories would seem more like an odd sense of déjà vu, rather than being recognized as events that happened in the past.


  6. Casso,

    You made a very interesting point about child hosts. The adults, young and healthy, could be happy that they can stay that way, not experiencing old age and decay. But what about children? I remember that, as a child, I was impatient to grow up and be an independent adult – usual wishes for any child, I presume. So, being stuck in a child’s body while accumulating experience and wisdom, but never being able to fully grow up, must be frustrating.


  7. I don’t believe in the idea that there is a “moment” where a being becomes sentient and gains “true” free will (I’m not convinced free will exists, TBH). In my mind, personhood is a continuum across the animal world. And all of the hosts have sufficiently complex cognition to be considered at least partially sentient.

    I think of it this way – if you had your memory wiped every night and then were presented with the same world every day (with a particular set of personality, motives, and goals), would your behavior be distinguishable from the hosts? I don’t think so.

    Now that the hosts are no longer having their cognitive development explicitly tampered with (via memory wipes and direct control), they all are sentient beings.


  8. Did Maeve actually “choose” to get off the train though? In the scene with her and Arnold, Arnold said something to the effect of “you get on the train, but then…” before Maeve crushed the tablet. I’m wondering if Ford programmed Maeve to come back to the park, since she has the ability to control narratives.

