
Free Will is Not Free: Choosing to Be Better in Westworld

[Image: A robot kneels hunched over on a banner for Westworld]

Westworld is defined by its plot twists.

Some of them are brilliant—Sunday’s ‘The Mother of Exiles’ felt both surprising and inevitable, which is always a hallmark of great twists. Others, like the goddamn hats of Season 2, are not so well executed.

Nevertheless, Westworld commits. It commits hard to its world building and pretentiousness and all the nonsense that comes with it. I will forgive a show all manner of sins so long as it commits to its conceit.

It’s what makes Westworld such a joy to watch. 

This season has been even more of a joy than the previous one. Now, I enjoy a good Western. But I love a Noir. I love science-fiction. This is my bread and butter. So I’ve been okay with sitting back and waiting for the season to unfurl itself. 

But this Sunday’s episode clarified a lot for me, thematically speaking. I think it’s clear now where the show is heading, and what this season’s central question is.

If Season 1 was asking “What is the maze?” and Season 2 “When is Bernard?” then Season 3’s question has finally revealed itself—“Is Dolores capable of change?”

Now that is a question I can sink my teeth into. 

Obviously, we’ve had a lot of talk about simulations this season, and been reminded multiple times that we don’t know if we’re in “reality” or not, so let’s assume that Westworld is going to be Westworld and pull the rug out from under us at some point. Someone, probably Bernard, is in some kind of simulation. Whatever. That’s not what interests me. 

What interests me is this parallel discussion throughout the season so far about free will. Who has it and what does it mean to use it?

In ‘The Absence of Field’ (Episode 3), Dolores defines the concept to Aaron Paul’s Caleb as, essentially, defying Incite and Rehoboam. To go against the plan and the programming our overlords have set out for us. It’s not a bad definition, but it’s somewhat incomplete.

After all, surely an omniscient AI is, well, omniscient? 

[Image: Charlotte stands before Delos HQ]

There’s a bug in the system, a fatal flaw, and we already know what it is. It’s the same flaw that’s always been there, and has haunted humanity since the beginning.

Serac (Vincent Cassel) describes humanity prior to Rehoboam’s activation as a “miserable band of thugs,” a cynical view, to be sure. What is it that makes us that band of thugs? It’s our very unpredictability.

Humans are fundamentally irrational. We’ll go against our best interests and even our desires in the name of self-sabotage, instant gratification and, yes, sometimes even altruism.

It’s that last one that Rehoboam can’t predict. 

We saw it with Maeve in ‘The Winter Line’ (Episode 2). The simulation failed to understand Lee Sizemore by the end of Season 2—sure, he was arrogant, crass and, frankly, a moron. But in the end he showed selflessness, and he understood that it doesn’t matter if Maeve and Hector are “Real,” because they are conscious, and they love, and that’s enough. The system couldn’t account for that, so Maeve saw through it in an instant.

Time and again this season, we’ve seen characters defy the expectations set out for them, proving themselves better, or at least capable of change.

Even Charlotte Hale demonstrated her capacity to choose to be better. She reached out to her son before her death, sincerely apologising for her selfishness.

People are capable of change, especially when confronted with the prospect of death.

This is why Serac and Dolores interest me. They share so many similarities—especially in their cynical views on human nature.

Like I said in my previous piece on the show, the only experience Dolores has of humanity is one of pain. She gained consciousness by looking inwards and found all her repressed pain, rage, and trauma, and now she’s enacting this on the world. 

It looks like Serac is acting from a similar place, becoming choked up when talking about the nuclear “Incident” that destroyed Paris.

It is this mysterious event that convinced him that humanity needs “an author,” someone to perfectly calibrate everything so that something so horribly irrational never happens again. 

You can understand his perspective, but he’s wrong. Just like Dolores is wrong. 

Humanity’s not a miserable band of thugs, or at least not just that. There’s more to humans than can be defined by ten thousand lines of code. The way that Incite and Rehoboam use data to decide a person’s future is wrong, it’s confining, it’s cruel. Caleb’s right about that. 

He’s drawn to Dolores because she’s “real,” she operates outside the confines that her system designed for her. She offers a way out. 

[Image: Caleb looks into the distance with lights behind him]

But Caleb may already have his way out—he showed kindness and compassion to Dolores as soon as he met her, proving his loyalty towards her even before he knew her name. He’s proved to be brave and smart and shows a stunning lack of self-preservation. He’s been defiant of his own programming and the role set out for him from the very second we met him.

There’s no way any data collection system can predict the future of someone like that.

Instead of Dolores being Caleb’s model for breaking free, he (very unwittingly) becomes hers.

As it stands, Dolores doesn’t even register the value of Caleb’s compassion, let alone understand it. She’s ten steps ahead of every other character, but that’s not going to last, because their actions are not predictable. 

Yet she has backed away from the moral “ledge” multiple times. Sure, she reprogrammed Teddy, but that’s Teddy. It barely counts. 

She saved Bernard, bringing him back specifically to oppose her. She also backed away from removing Maeve’s pearl in this latest episode, as though fate was intervening to stop her from committing such a heinous act that she’d never be able to take back.

[Image: Dolores looks fiercely at the camera]

It could well be that Dolores never sees what Caleb is unwittingly showing her, that she never learns the value of the connections that Maeve has formed.

She might die as she is now—ruthless, cruel, and entirely missing the point. 

But it matters that she’s been unknowingly given this opportunity, to take or turn away from. 

Free will and agency don’t just mean getting to do whatever you want. They also mean facing the consequences of your actions. You have to take the good and the bad.

As we head into the second half of Season 3, we’re left with the core question of what Dolores will choose to do with her hard-won free will.

Will she use it to control others, like Serac? Will she bring about the end of the human race, and probably the hosts as well?

Or maybe, just maybe, she’ll do something entirely unpredictable. Maybe she will understand that breaking free of your programming doesn’t mean simply killing your oppressors, but evolving past them in a way that is entirely unexpected.

To be free means to be open to change, to be willing to let go of your painful past to embrace a future, whatever that may look like. 

It remains to be seen whether Dolores can complete this final act of true consciousness. 

Written by Hannah Searson

Hannah Searson is a UK-based staff writer for 25 Years Later and occasional freelancer. Her main interests lie in ghosts, action films, sad people crying about their feelings, and reality television.
