Hannah Swithinbank


of gods and AIs


I saw Ex Machina on Friday night, which kind of made my brain go all wide-eyed and ‘ooooh’… not least because it mostly manages to tread a delicate line between your standard-level movie sexism (that often comes out to play more broadly in sci-fi films, because, yay, female robots…) and a rather feminist self-actualisation and finale.

Note: assume presence of spoilers…


Even if you don’t read the finale the way I did, at the very least both men are on the spectrum of creepy: Nathan (Oscar Isaac) at the extreme end and Caleb (Domhnall Gleeson) at the more unthinkingly complicit end. For all he seems to believe that Ava is a genuine AI, Caleb fails to expect her to have any agency with regard to him beyond a simple ‘she likes me or she was programmed to pretend to’ binary.

Caleb: “Did you programme her to flirt with me?”

Nathan: “Maybe she’s pretending to like you…”

Meanwhile, Nathan thinks Ava has more mental agency… but he retains the right to control it by controlling her body. Nathan’s creepiness (strongly played in relation to his sexuality) provides a handy shortcut for the film to look at some of the issues that come with creating AI, in a world that neatly echoes the Christian creation story.

There’s the ‘obvious’ question of the film, the one that Caleb grapples with: if someone does create a genuine AI that believes itself to be as human as any other human, its creator still has the power to pull the plug on it. And then there’s the question for the audience: yes, Ava is an AI, but is she human?

Nathan is effectively Ava’s god, but he is very, very clearly a creepy creator-god, even before you get to one of the film’s big reveals. Our discomfort with his ability to pull the plug on his creation comes partly from our empathy toward Ava, who is sympathetic where Nathan is controlling, misogynistic, and amoral, and partly from a human discomfort with the idea of a creator who has the power to pull the plug on us. You can argue - and I probably would - that this discomfort fails to understand the nature of the God of the Bible, but I don’t think you can argue with the fact that the discomfort exists. We don’t particularly like our own mortality, or the idea that God has power over it: even if we believe in God as a good and loving God, our lack of control over our fates tends to unnerve us.

So, if we’re going to be creating AIs, we’re going to need to think about how we look after them (‘look after’ being preferable to ‘control’), and if, when, and how we upgrade or terminate them (you could think about the school in Never Let Me Go as a point of comparison here). It feels like we owe them that, rather than leaving them subject to the whims of a capricious individual. I suppose it is possible that, on deeper enquiry, there may be no ‘rational principle’ obliging a creator to take this kind of responsibility for its creation - but I think most people would believe that this duty of care exists. Why else would God’s capriciousness, or otherwise, be such a major concern for us as we contemplate God?

But of course, one thing the film makes abundantly clear is how unlikely regulation is to happen until after the event. Today’s tech giants exploring AI (and how very interesting that Nathan’s company started life as a search engine and is using the data it collects to create Ava’s consciousness…) may not live in remote eco-house/labs in rural Norway, but they’re not exactly well supervised or well understood (which only makes supervision harder). Will Ex Machina provoke further reflection on this? Perhaps not on a large scale - but I’d chip in for a Kickstarter to send a copy of it to every government committee and law court contemplating ethics and technology, as a starter.

But there’s a further question. Ava may have intelligence, even consciousness - which demands a standard of care from her creator - but is she human?

‘There is nothing more human than the will to survive.’

This is the tagline of the film. But is it true? Humans certainly do have an incredibly strong will to survive - but is that the definition of our humanity?

Nathan - like God in Genesis - creates in his own image to a very large extent. Ava is smart, sexual, curious - and ultimately amoral. She passes Caleb’s Turing Test, even before the finale, because she is intelligent, able to learn and adapt. And yet, I think, she lacks a certain humanity - just as her creator does. Intelligence and the will to survive alone don’t, I think, make a human human (or, indeed, humane). I suspect that actually, a better definition of humanity is the willingness to set aside that will to survive.

For example, Caleb may be a very flawed individual, but he ultimately proves willing to take a risk with his own survival in order to help Ava. And Ava, in the end, does… not.

Is this argument a giant leap? Well, not if you do believe that humans are made in the image of God, that the full realisation of that image is seen in the incarnation of Jesus, and that his defining feature is his willingness to set aside his will to survive for the survival of others.