[Image: Stephen Hawking with Intel principal engineer and project lead Lama Nachman]

The astrophysicist Stephen Hawking has warned that artificial intelligence “could outsmart us all” and is calling for humans to establish colonies on other planets to escape what he describes as a “near-certainty” of eventual technological catastrophe.

His dire predictions join recent warnings from several Silicon Valley tycoons about artificial intelligence, even as many of them pour more money into it.

Prof Hawking, who has motor neurone disease and uses a system designed by Intel to speak, said artificial intelligence could become “a real danger in the not-too-distant future” if it became capable of designing improvements to itself.

Genetic engineering will allow us to increase the complexity of our DNA and “improve the human race”, he told the Financial Times. But he added it would be a slow process and would take about 18 years before human beings saw any of the benefits.

“By contrast, according to Moore’s Law, computers double their speed and memory capacity every 18 months. The risk is that computers develop intelligence and take over. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded,” he said.
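As a back-of-the-envelope illustration of the contrast Hawking draws (the calculation below is not from the article itself), a doubling every 18 months compounds dramatically over the same 18-year horizon he gives for genetic engineering:

```python
# Moore's Law as cited by Hawking: speed and memory capacity
# double every 18 months.
years = 18                 # his estimated wait for genetic-engineering benefits
months_per_doubling = 18   # Moore's Law doubling period, per the quote

doublings = (years * 12) // months_per_doubling
growth_factor = 2 ** doublings

print(doublings)      # 12 doublings in 18 years
print(growth_factor)  # a 4096-fold increase
```

On this simple reading, by the time genetic engineering produced its first benefits, computers would be roughly four thousand times more capable, which is the asymmetry underlying his concern.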

Both PayPal co-founder Peter Thiel and Elon Musk, the entrepreneur behind the electric car Tesla and SpaceX, the private space flight company, have warned of the dangers of complacency over the consequences of unconstrained advances in artificial intelligence.

Google is making a push into “quantum computing” and has established an ethics committee to monitor the work of DeepMind, the artificial intelligence start-up it bought for £400m earlier in the year. Consumers are coming into increasing contact with “smart” machines, including flying drones and prototypes of self-driving cars.

A recent paper from researchers at the Oxford Martin School at Oxford University has warned governments to plan for the risk of “robo-wars”, in which autonomous weapons could identify and decide to kill targets without human intervention.

Prof Hawking said he appreciated the benefits AI has brought. He mesmerised an audience as he displayed upgrades to communications devices that let him speak and write twice as fast as before. The new system was designed by Intel and SwiftKey, a British start-up that uses statistical modelling to analyse text and make predictions about what a user is going to write.

But he also warned that there was potentially a more destructive element to technology. “We face a number of threats to our survival, from nuclear war, catastrophic global warming, and genetically engineered viruses; the number is likely to increase in the future, with the development of new technologies, and new ways things can go wrong,” he said.

The chance of “a disaster to planet Earth” becomes “a near certainty in the next 1,000 or 10,000 years”, he added.

“We need to expand our horizons beyond planet Earth if we are to have a long-term future . . . spreading out into space, and to other stars, so a disaster on Earth would not mean the end of the human race. Establishing self-sustaining colonies will take time and effort, but it will become easier as our technology improves.”

Prof Hawking revealed he recently joined Facebook, and uses his system to make Skype calls and write scientific papers. His croaky, computerised voice remains the same – something he was “adamant about”, said Lama Nachman, who led the project for Intel.

At Tuesday’s presentation, a cursor flicked over a simulation of Prof Hawking’s desktop, showing some of the most common words in his lexicon: “holes”, “cosmology”, “horizon” and “pepper”.

Prof Hawking also joined the head of GCHQ, Britain’s spy agency, in urging tech groups to do more to stop their networks being used by terrorists.

“More must be done by the internet companies to counter the threat, but the difficulty is to do this without sacrificing freedom and privacy,” he told the BBC.

Copyright The Financial Times Limited 2022. All rights reserved.
