Jake Schreier’s gentle comedy Robot & Frank, released on Friday in the UK, belongs both to an old genre, the misfit buddy movie, and a comparatively new one – as yet unnamed – in which robots are portrayed with something approaching realism. An introductory subtitle identifies the setting as the near future, but the script doesn’t do much with this licence. Its basic scenario – senile man, robot carer – is plausible and even imminent. Most films that take place the day after tomorrow require something very decisive to occur tomorrow, and build an “if” clause into their contract with the audience – “The Near Future [if Mars were inhabitable]” or “The Near Future [if bees attack]”. Not this one.
During a reflection on the “Robotic Bedpan” in his new book Who Owns the Future?, the computer scientist Jaron Lanier wonders: “Would a robot nurse be emotionally acceptable?” But he knows that the question is of only academic interest. In practice, as the world’s population ages, “acceptability” will matter far less than affordability. To recoil in horror at this prospect is to forget that it rarely takes long for the newfangled to become the new normal.
Cinema, the most machine-based of art forms, has shown a certain amount of hostility towards silicon-based life forms – think of the murderous computer HAL in Kubrick’s 2001, or Roy Batty in Ridley Scott’s Blade Runner. Exceptions are rare – think of David, the android played by Haley Joel Osment in Spielberg’s Kubrick-developed AI.
Robot & Frank offers a more flattering picture, both unsuspicious of robots’ intentions and unqueasy about their involvement in our lives. On the other hand, the film is justifiably gloomy about the economic consequences of such developments, one of the main subjects of Lanier’s book. Frank’s son buys him a robot where he might previously have hired a housekeeper or nurse. The local library, run by Susan Sarandon, makes use of a computerised “book relocator” – until the printed holdings are sent off to be pulped.
The term “robot” comes from the Czech for “drudgery”, and we certainly prefer an image of machines as excelling at only particular kinds of tasks. We want them to be limited – subhuman – in some crucial way. When IBM’s computer Deep Blue beat Kasparov, it was easy to reply that chess is all about permutations. When IBM’s computer Watson won the American quiz show Jeopardy!, we could take comfort from the idea that any search engine worth the name is capable of connecting keywords to find the answer to a question (or, as Jeopardy! is played, the question to an answer). We can’t give up the idea that something sets us apart as a species – something, that is, other than the ability to create machines good with numbers and facts.
Creativity appears to fit the bill. In 1949, the neurosurgeon Geoffrey Jefferson argued that only once a machine has written a sonnet and, crucially, knows that it has written it will we be able to say that “machine equals brain”. For the time being, human intelligence retains this advantage over its artificial counterpart – though technology poses a threat to creativity of a different kind, partly by shaking the economic foundations of the music and publishing industries, and partly by moving at such a swift pace.
Futurology will always be a mug’s game – of course, I could be wrong – but for changing reasons. Imagination used to leave science behind; now it’s the other way around. The columnist and scriptwriter Charlie Brooker recently found to his irritation that a “digital afterlife” idea he had used in his TV series Black Mirror was not as “fanciful” as he had thought.
Having ignored to his cost both Moore’s Law – computer power doubles every two years – and the tech-guru Ray Kurzweil’s Law of Accelerating Returns – technological change increases exponentially – he came up with his own “rule”: “If you can picture something on the cusp of plausibility, it’ll definitely be real by Christmas.” Since his dystopian fantasy was already a subject for journalism, he shouldn’t “have bothered writing a script”. Technology doesn’t need a creative capacity to put writers out of business.
Underneath the flippancy, Brooker was making a useful observation: writers need to revise their sense of the “futuristic”. One solution might be to project a little further – no more messing around with the “near future” or “2019”. The recent film Cloud Atlas, adapted by Andy and Lana Wachowski and Tom Tykwer from David Mitchell’s novel, shows how to keep science fiction one small step ahead of science fact. One strand of this digital tribute to the written word – the season’s other old-man-falls-in-love-with-Susan-Sarandon film – takes place in 2144, a year that sounds sensibly remote and meaningless. And though a society reliant on a class of robot-slaves could well materialise sooner than that, the film is unlikely to be overtaken by the time it comes out on DVD – if DVDs are still around in August.
Peter Aspden is away
Listen to Leo Robson reading this column at www.ft.com/culturecast