Hannah Swithinbank

In which Transcendence doesn't Transcend.

I really wanted to like Transcendence. Up to and including the arrival of the trailer in the cinema, I was pretty excited by it, and intrigued by what it might have to say about AI and the potential future technological evolution of humanity. The fact that it tanked in America is pretty much neither here nor there, from my point of view - but the more I saw the trailer (and it has played a lot) the less excited I got: somehow it just couldn't quite sustain that initial excitement.

I wanted a smart, exciting movie about the human-AI interface, which I expected to veer to the negative - but which I hoped was going to be a lot of fun anyway. I got - well, a mixed bag, really. In reviewing it, Mark Kermode described it as, at heart, a B-Movie (despite its budget, which is very not a B-Movie budget), and I have to say, I almost wish it was.

I might have enjoyed a big silly movie about a battle between techies and luddites, with ideas sloshing around somewhere at the extremities - but because Transcendence wants to be more than that - it definitely wants to be a smart, engaged movie, probably more reflective and moody than fun and exciting - it falls down a pretension hole, with lots of moody shots of raindrops and sunflowers, and ends up ultimately unsatisfying.

Will Caster is clearly a Ray Kurzweil figure: a scientist interested in finding out what happens next - what humans can do next.

For one hundred and thirty thousand years, our capacity to reason has remained unchanged. The combined intellect of the neuroscientists, mathematicians and engineers pales in comparison to the most basic A.I. Once online, a sentient machine will quickly overcome the limits of biology; in a short time, its analytic power will become greater than the collective intelligence of every person born in the history of the world. Some scientists refer to this as the Singularity. I call it Transcendence.

Kate Mara is Luddite-in-Chief Bree. She doesn't get to have a character, or indeed much to do or many lines, beyond one short speech outlining her rejection of AI, but you can tell she's a malevolent badass because of her bleach-blonde hair and copious eyeliner. Heaven forbid you should give your major antagonist a well-rounded character.

And Paul Bettany is the man in the middle, Max, our entry point between the two extremes, guiding us through the choice between them - although the film seems keener for him to come down on one side of the debate than he does, or than he needs to. With the exception of Bettany, it is exactly as unsubtle as it sounds, and he is ill-served by everything going on around him - even by Morgan Freeman, who is there, I think, to Be Wise (which is a bit silly, because Max was doing perfectly fine).

But it all happens in a bit of a void. There's an assassination of a major figure at the opening of the film, followed up by the rounding up of a lot of 'terrorists', and then... nothing. Cillian Murphy and his FBI gang disappear back off to Washington, leaving Bree's group alone in their hideout, and Will and Evelyn alone in theirs, which is nonsensical. Remote places just aren't that remote and disconnected, even if people were uninterested in finding them - which they wouldn't be. Joseph, who was with them, would still have been looking to make contact with Max and with Evelyn, and frankly, every agency in the land would be looking into Evelyn, given the way that AI-Will games the markets to give her an unending supply of money to build her new lab.

And the void extends beyond that: the whole thing feels completely detached from the rest of the world. Max holds Bree off for a while by telling her to wait for a tipping point in public opinion... but where is the public? There's no sense of political or media investigation, or scientific investigation, into what Evelyn Caster does after her husband 'dies'. How is public opinion going to change - or are we just assuming that AI-Will will do something so obviously evil that everyone is up in arms? Certainly, it doesn't happen - there's nothing that drives the denouement beyond Bree deciding that it's 'time'.

Later, Evelyn warns everyone that getting rid of AI-Will might cause a global blackout that would go down rather badly (which, duh), and so it does - but everyone seems to carry on much as before, to judge by the bookending of the film. I mean, it's nice to see a vision of the future where we don't all go feral after a major cataclysm, but this is just a bit too much of a post-apocalyptic idyll in Berkeley.

The most interesting bit of the film is one that doesn't really get explored at all - and that's Evelyn's relationship with AI-Will. As in Her, one of the central relationships in the movie is between a human and an AI, and again we see the relationship wobble - but unlike Her, Transcendence doesn't really explore why.

Evelyn is apparently a perfectly delightful person and a brilliant scientist, committed to making the world a better place - but we never really get to see her as a person in her own right, or at work on her own projects. Her entire personality revolves around Will. Her work getting Will downloaded into the AI succeeds because of someone else's research and Will's own ability to recode himself. Once they head for the desert, everything is Will's work, even though he phrases everything as 'we', and Evelyn - she wanders, and watches, and a distance emerges.

But the question the film asks isn't what happens when a human and an AI are in a relationship, and where and why the cracks grow. The question it asks is whether the AI is really Will - because Human-Will is good, and so AI-Will must also be good to be really Will. But what actually seems to be the case is that AI-Will is a significantly more 'powerful' version of Human-Will - in terms of mental processing power and loss of physical weaknesses - while Evelyn is just as she was. Will is not only no longer tangible to her (and the film makes clear how much losing touch is a loss in their relationship), he's also developing at a much faster rate, and he's not giving her the chance to keep up. He seems so substantially and existentially different from Human-Will that the relationship Evelyn has with him is stretched to breaking point. And yet, in the end, their problems are answered by her 'realisation' that the AI is really Will, doing everything he does because he loves her.

It could, and should, be a complex state of affairs, with no tidy resolution. There isn't going to be a neat answer to any of the questions around human interaction with, and possible assimilation to, technology. People are going to make choices and decisions that other people are going to disagree with. There's going to be mess, there's going to be pain, and it's not all going to be resolved, as Transcendence tries to do, by love conquering all.