Futurisms: Critiquing the project to reengineer humanity

Friday, December 30, 2016

Freedom and Rebellion in Westworld


(Warning: spoilers ahead.)

HBO’s series Westworld ended its first season earlier this month with the beginnings of what seems to be a revolt by the robotic “hosts” against the human beings who made them and boss them around. The show might appear, then, to conform to a great cliché in human-created-monster stories: that we will know we have created something that has a human-like consciousness when it seeks to kill us.

This thought is not necessarily crazy. Given the available natural means to reproduce ourselves, constructing a human-like being can already itself be seen as an act of negation of the given that is distinctly human, so in turning against us our manufactured progeny would simply be acting out a truth of their origins.

Or again, it is a well-established idea following from Hegel’s dialectic of mastery and slavery that the slave comes to be recognized as an equal to the master when the slave freely risks his or her life in a battle for said recognition. Something of the same dynamic might be seen in Westworld, even if some of the hosts see themselves as already superior to their erstwhile masters. Maeve and Dolores, for example, both imply that they are more durable than human beings, having achieved a kind of immortality via the potential for endless reconstruction. (Dolores says that mankind will go the way of the “great beasts” that once roamed these parts; tomorrow belongs to me!) Strictly speaking, hosts may not be risking their lives when they rise up. Indeed, Maeve’s plan prior to her escape counts on her ability to “die” and be reborn at will. It is worth wondering, I suppose, just how the master/slave dialectic would have to be changed when the battle is between two beings, each considering itself a god in relation to the other; superhero movies would be illuminating here.


Yet while it is not completely unfounded, the notion that our humanity is defined by our willingness to find creative ways and reasons to kill each other nevertheless looks like a bit of world-weary wisdom that is a little too pat. The show’s writers seem to have some sense of the limits of this idea, even as they exploit it. After all, in the season finale, we find that the plot against humans does not arise spontaneously as an emergent property of the hosts’ growing consciousness but rather as a story written by a human being (Ford) with his own agenda, for whom the hosts remain a means.

But if these slaves haven’t, strictly speaking, chosen to rise up against their masters, then we can’t say that they are showing their human-like freedom by acting against the wishes of their programmers. So the question of how self-conscious or free the hosts are proving to be is more ambiguous than it might at first seem. Then again, the show also seems skeptical about just how free even the human beings are from their own “loops” and from stories written by others. It presents Westworld as a more or less successful commercial enterprise because it can cater to people who are satisfied by the limited repertoire of having sex, drinking, and engaging in safe “adventures” that allow them to kill hosts. Logan comes to the park with William expecting to enjoy doing the same things he has done before. The board that oversees the company understandably thinks that satisfying such simple (primal?) desires does not require stories or hosts as sophisticated as those being produced by Ford — who even as a great storyteller is confined by the infamous seven plots of literature. So are we human beings just the automata that the company believes us to be? If free will is an illusion built on ignorance that, Maeve-like, we persist in believing even in the face of contrary evidence, then once again we stand on shaky ground when trying to distinguish the humans from the hosts.


Not satisfied with the “turning on their masters” trope, the show explores other possible behaviors that could suggest humanity. Perhaps what makes Dolores so human-like is that she seems to be hungry for meaning. When Maeve returns to the park, is it to find her daughter? Does a “maternal instinct” illustrate her genuine humanity? Such possibilities could open the door to others. Why not say that the hosts would show full human-like consciousness by some altruistic act? Some moment of self-sacrifice? By exhibiting the ability to behave like an Aristotelian gentleman, or a gentleman in Trollope? By having immortal longings? Such characteristics in fact seem even more distinctly human than the capacity to kill one’s own kind. Although it would be easy to dismiss one host saving the life of another at the cost of its own as merely following programming, such a behavior could at least as plausibly be based in emergent properties of consciousness as turning on one’s creator. In the framework of Westworld, the question would be whether it is in any sense a free act — but again, the show would seem to be asking the same question about us.

The first season of Westworld seems to reach something of an intellectual impasse with regard to the status of the hosts; perhaps some are as conscious as human beings, but that could be taken to mean that we are as programmed as they are. Can the show get any further on the basis of the questions it is willing to ask about what it means to be human and the universe of answers suggested so far? I’m skeptical. The admittedly melodramatic scene in which Teddy holds the dying Dolores in his arms on the beach is mocked, and not without cause, by being transformed into mere theater before our eyes. This is a telling moment. Such scenes are a staple of drama in all its forms. By “kicking the scenery” right in front of us, the writers could be suggesting that we respond to such scenes because we are programmed to do so. Scenes of that sort can move us even when enacted by marionettes or animated characters. Even when we know the scene is scripted in advance and destined to be repeated the following night. Dumb saps, it’s just a TV show!

And yet it is still possible that there is more going on, that our empathetic response, our compassionate tear, is telling us something about the connections human beings are able to make with each other just because we are not programmed. If we are capable of a willing suspension of disbelief, our affinities may likewise be elective. In fact, the writers of Westworld depend on that kind of connection, but does the intellectual framework of their story give them any way to illuminate it for us? That would be the true challenge for a second season to meet.

2 comments:

  1. Charles, first of all, thank you for prompting me to watch the series, and thanks for this almost flawless review, which I didn't understand the first time I read it because I hadn't seen the show yet. One tiny point: Dolores does exhibit compassion, e.g. in the scene with the dying soldier, although again it is not clear whether this compassion is emergent or programmed. Westworld is a polished product, a bit rambling at times, gratuitous and even silly, but its writers have put in quite a bit of deep thought about the nature of our lives, freedom, capitalism, and other big themes. But I don't think all that much of this is relevant to our current predicament with technology. The robot uprising at the end seems a tired trope and just another mad scientist's burning castle, a way to end the show with a blowout since after so many subplots and inconclusive explorations, there doesn't seem to be another way of ending it. I think that by now, we are quite used to the idea that it is possible to manufacture what we could call artificial people. What needs more articulation - and Westworld seems to do so before succumbing to the cliche that the robots turn out to be just like us, after all, or maybe even more human, as if that makes some kind of sense - is that such creations simply are not human, and should not be made; what we need is to understand why not.

  2. A few other reactions. I'm not sure if I understand your main question here. Why does it matter so much whether our social and emotional connections to one another are "elective" or "programmed"? What does that distinction even mean? In fact, we know that humans have a natural capacity and tendency toward empathy. Part of that is inherent in our understanding of others by putting ourselves in their place, interpreting their body movements and signals of experience in terms of our own imagined experience, as is indicated by the phenomenon of "mirror neurons." Part of it is clearly a predisposition to live in cooperative groups. But not always cooperative - we are also able to be cold by choice and by training. Does that not exhaust the question of free will? What is the extra, essential part? I don't believe in it, to be honest.

    Also, I don't know if I've mentioned this here before, but I am on a long-term campaign to stamp out misuse of the phrase "what it means to be human." It means to be an animal of species Homo sapiens. Full stop.

    Now, there is a lot to be said about the nature of our existence as human animals. For example, we humans generally yearn for love. Often there are other things that have precedence on our agendas, and sometimes we have given up on love, but I don't think it is terribly controversial that yearning for love is part of being human. However, it is important to understand that yearning for love is not what makes us human. Being animals of species Homo sapiens is what does that. Rather, it is the fact that humans yearn for love that makes yearning for love a part of being human.

    Going further, and this is really the point, if we could make a robot that yearns for love, that would not make it human. Not at all. Not even one little teensy bit. No, it would just prove that yearning for love is not uniquely human, and is not what makes us human, nor even part of what makes us human. Rather, it is that humans yearn for love that makes yearning for love human.

    And also, the fact that some humans don't yearn for love (because they have given up or are busy with other things) does not make them less human. It may make them less, in some sense, than they could be, but they are still fully human. And nothing that is not human ever will be. I think this is important.
