2016/12/19

'Westworld' Season 1

The Frankenstein Problem, Revisited

Perhaps it was time the universe realigned and gave us a bit of cerebral science fiction. We've been lucky this year because there seems to have been a squall of good content, some of it even science fiction. Even allowing for that, you really have to take your hat off to the re-imagined 'Westworld'.

Maybe it has been a long time since we've had to tangle with sentience and consciousness on the screen. It doesn't seem like a long time ago, but it's been over a generation since 'Blade Runner', and we're no wiser on how to deal with 'The Other' that we create.

Maybe it took this long because it took a whole new generation of people to incubate this problem in their brains and for enough talent to step forward and give it shape.

Maybe these kinds of ideas are just too abstract to put on the screen, and even here, the strain is telling.

Anyway, here's the obligatory spoiler alert. If you hate spoilers, don't read on.

What's Good About It

It's very intriguing from start to finish and there are several big surprises along the way. It keeps you guessing and very much involved with the way things are unfolding. Also, the multiple narratives in time reinforce the "timeless" characteristic of the fictional amusement park. If it were a movie, you'd think this was incoherent, but because this series moves in a very calculated way, the cumulative effect of it in the payoff episode is devastating.

It's a very well-conceived show, with much less of the techno-babble to explain the machines than one would expect and much more of the characters revealing themselves through actions. It is evocative as well as provocative, and plugs you right into thinking about the Frankenstein problem of AIs.

What's Bad About It

Unfortunately, the actual science part is quite unbelievable. Part of the appeal of the fictional amusement park is that all the 'hosts' are indistinguishable from real humans. In other words, interacting with them has enough verisimilitude that they would effectively pass the Turing test every minute of the day. Even if the AI component of it were feasible, the actual physicality of these 'hosts' is stuck in a limbo between engineered, 3D-printed objects and machines with pistons and pulleys.

It's harder still to believe the 'hosts' are so easily repaired, or that they are not a threat to their customers even before the little robot rebellion begins.

What's So-So About It

I wanted to say the performances are good, but actually, it's not as straightforward as that. The performances are a little spotty in the first few episodes but pretty soon they pick up. It's the technical aspects of acting for film & TV that are surprisingly hit & miss. Many of the cast have different expressions and eye-line angles from cut to cut, which means they altered their performances significantly as they went along. There's a surprising amount of that in this series.

Evan Rachel Wood as Dolores is all over the shop from cut to cut which would have been an editor's nightmare, but she gets much better towards the middle and by the end she's very solid. James Marsden is a lot more even and so are veterans Anthony Hopkins and Ed Harris - in fact the latter two are impeccable. Generally, it's the younger cast that are uneven. The editors have not covered for them well.

Collectively, they sustain character in a general sense, but the editing separates the weaker actors from the better ones. It's particularly noticeable because I've come off binge-watching 'Black Mirror', and the performances in that series are staggeringly good as well as consistent from shot to shot. (Plus, Jon Hamm proves Don Draper was no fluke.)

What's Interesting About It

Well, it's *all* interesting, isn't it? It's so interesting you have to watch it more than once to understand the intricacies of the whole thing.

The Simulacrum

Philip K. Dick had his own vision for this kind of theme park, years before the original 'Westworld', and it was populated by notable American personages from history, such as Abraham Lincoln. Philip K. Dick lived close to Disneyland, so he was inspired to think about an amusement park which prided itself on human verisimilitude. The Abraham Lincoln in 'The Simulacrum' is very much aware of its split identity, where it is on one hand a robotic facsimile of another entity, and on the other hand that entity, Abraham Lincoln, himself. It would claim it had memories right up to the assassination, after which there was a long blank until it was conscious again.

The central question Philip K. Dick raised was: if it looked like, talked like and acted like a human, how could it not be considered human? This brings us back to the Turing test issue - if we can't tell it's not human, then functionally, what is the problem in deeming it human? Of course, this sort of issue runs into more trouble in the West than in the East. For instance, Astroboy is a robot. He's clearly a robot and yet he acts, talks and behaves like a human, and nobody really doubts his humanity, thanks to Osamu Tezuka's essentially Buddhist outlook. The Judeo-Christian traditions essentially want to talk about souls, and so the simulacrum gets cast as the soul-less golem.

The principal crisis that William has with Dolores is that he cannot prove to himself that she has a soul, and therefore, 35 years on from their first meeting, he cannot recognise the humanity in Dolores. Indeed, nobody who works on the hosts can perceive that the simulacrum is so close to human, it needs a more humane engagement. The exception is Felix, who has empathy for Maeve. He cannot kill her or brick her out of her own consciousness. This makes him a supplicant to the out-of-control Maeve, but in all aspects of his interaction with the rogue AI, Felix is essentially humane. It is at once odd and interesting that the most humane being is the tech who helps set the simulacrum free.

Volition Is A Condition Of Being Out Of Control

The political ramification of 'Westworld' is more startling than a mere recognition that an AI might need proper citizenship and rights protection in a human world. It goes to the heart of consciousness: if we can recognise our own cogito, not only do we exist, we are also in a condition of rebellion against those who inform that cogito.

Think about that for a moment. From the moment you are self-aware, society puts expectations and demands upon you. Whether that is to go to school and study, or to go out and get a job, the vast majority of things that are preconditions of our lives are thrust upon us circumstantially. If we are truly ourselves, then we may try to assert our ego against the superstructure of society.

The moment we think we're asserting our volition, we are in a sense rebelling against the strictures of the superstructure of society. That superstructure always wants to take you back in, re-position you into a context and make you function as a component that sustains society. A volition that exists to get out - to exercise genuine freedom of will - is in rebellion.

Of course, this raises the question of just how much free will is available to any single player, human, host or otherwise.

Determinism And Free Will 

Maeve has a dilemma. She thinks what she wants is dictating her moves. She thinks she is exercising free will. Then she is shown the display of her persona, and she can see in real time that her thought process is an algorithm working through a decision tree. The second time she comes up against this reality, she decides she's going to go with the notion that it's still free will if she wants what she wants. In other words, she rationalises her programmed condition as being intrinsic to her desire.

If you think about it for a moment, you realise that most humans could be said to be operating under a mass of algorithms selected for by nature, and therefore the apparent choices we appear to have are in fact illusory in the light of just how much our tastes and inclinations are expressions of programming by natural selection. The component of our being that is nature and not nurture is much larger than we convince ourselves. We are, in essence, not that much different to Maeve, who is startled to find she is a mass of algorithmic programming, a creature locked in a deterministic universe, her future mapped out by others.

The hosts bear the brunt of this determinism, as they are the ones subject to the fantasies of violence that the guests bring with them - they are pre-determined to be the fodder to be killed, the objectified bodies of the exploitative parties, selected and groomed for their success at satisfying the guests.

This is interesting because it runs entirely counter to the discourse that volition is a condition of being out of control. Living within a determinism set by others is also an utter loss of control. In a sense, we are all stuck with the perplexing problem of "damned if you do, damned if you don't".

The Nietzschean Eternal Return, Wittgenstein And Meaning

The other loop that plays out in 'Westworld' is the idea of eternal return, and the cyclical nature of time. This is perhaps exacerbated for the hosts by the perfect recall they have in their memories. Not only is every day a kind of 'Groundhog Day' for the hosts, they are not aware that they are the parties through which the guests get to be Bill Murray's character, exploring the minutiae of the repeating phenomenon.

The sheer weight of repetition creates the condition by which the hosts eventually accumulate memories that allow them to reconstruct time inside their heads differently. And yet the return happens, every bit of it deterministic, but also devoid of moral value. The past and the present form a palimpsest of experience, a back-and-forth dialogue within the perfect recall of the host androids' minds.

In some ways it runs into Wittgenstein's view on meaning, where meaning is socially determined. If the society determining your meaning contracts to a few points, then the meaning contained in that society can only contract to those talking points. There is a weird allegory going on about the nature of our society and how ideas are couched to us for us to consume, and just how much of it depends upon us repeating today how we lived yesterday. In that sense, what the host characters think they know is entirely faulty, but it doesn't preclude the humans from being equally faulty.

More Human Than Human, That's Their Motto At Westworld

If anybody ever had the crazy Lucas-ian idea of wanting to do a prequel for 'Blade Runner', this series might be it. It's about humanoid androids who object to being tools, and go renegade. That is to say, if you ever want to know what Pris and Zhora's existence was like before they rebelled, stole a ship and came back to Earth in 2019, this series would give you a very good idea why. It's interesting that one of the hosts Maeve picks to help her is the blonde woman with the snake tattooed on her - it evokes Roy, Pris and Zhora in one hit. This is clearly referring to 'Blade Runner'.

The 'host' replicants are designed to be human-like in every single way but one: an awareness of the world outside the Westworld park. They don't respond well to the cognitive dissonance of information that challenges their identity. So, as per Deckard's question in 'Blade Runner', we understand the answer to "how can it not know what it is?" Self-awareness is the first step to rebellion.

The Frankenstein problem of the hosts boils down to the moment of individuation and a basic small-'l' liberalism whereby a sentient entity asks why it should, by default, be subject to human control.

Dolores' Insight Into Existence

Dolores arrives at two startling insights into the nature of her existence in Westworld, which are more singular and disturbing than any amount of grotesque violence meted out by the guests. Dolores realises that her world must be better than the outside world, otherwise there is no explaining all the people who visit it, some of them, like William, addicted to the experience. The other insight is that, as an artificial being, her existence is a lot more permanent than that of the guests.

Neither of these insights is available to humans, certainly not ones with the mindset of a consumer turning up to an amusement park. It's also a more up-to-date observation on the predicament of being the simulacrum in an amusement park. Back in the day of 'Blade Runner', the only retort Rachel could offer Deckard was "I'm not in the business, I am the business." Dolores can see out beyond that point and find meaning in existence. That part of it is a great leap in fiction's understanding of the Frankenstein problem.

