A still from the science fiction film ‘I, Robot’

Some of the biggest names in science and technology have called for a global ban on “killer robots”, amid warnings that crossing that threshold would start a new global arms race.

The intervention by more than 1,000 experts in the field of artificial intelligence came in an open letter, which was also signed by Professor Stephen Hawking, the cosmologist, Elon Musk, Tesla’s chief executive, and Steve Wozniak, the co-founder of Apple.

Although “robot soldiers” are still confined to the drawing board, rapid advances in computing power and artificial intelligence have raised the prospect that militaries could field them within two decades.

The petition, which will be presented on Tuesday at the International Joint Conference on Artificial Intelligence in Buenos Aires, warns that the development of weapon systems that can independently identify and attack targets without any human intervention would create the “third revolution in warfare” after the invention of gunpowder and nuclear weapons.

It paints a stark scenario of future conflict akin to something from the Terminator film franchise.

“Autonomous weapons are ideal for tasks such as assassinations, destabilising nations, subduing populations and selectively killing a particular ethnic group”, the letter states.

“We therefore believe that a military AI arms race would not be beneficial for humanity. There are many ways in which AI can make battlefields safer for humans, especially civilians, without creating new tools for killing people.”

The United Nations is so concerned about the development of what it calls Lethal Autonomous Weapons that last year it convened its first ever meeting to discuss the risks posed by the new technology.

Stuart Russell, a specialist in artificial intelligence at the University of California, Berkeley and one of the signatories of the letter, has previously warned that a robotic weapons system could leave the human race “utterly defenceless”.

The letter is the second this year co-ordinated by the Future of Life Institute (FLI) to criticise the introduction of artificial intelligence to the battlefield, but the latest petition goes further in explicitly calling for a ban on these weapon systems.

The FLI was founded in 2014 by volunteers including Jaan Tallinn, a co-founder of Skype. Both Mr Musk and Prof Hawking are members of FLI’s scientific advisory board.

The idea of autonomous robotic killers appeals to many in the military for three reasons: such systems could offer a crucial advantage against an adversary; their deployment does not put any of a country’s own troops in danger; and, in the longer term, they should be cheaper than advanced weapons systems, such as combat aircraft, that have to support and protect a crew.

The Pentagon is one of the biggest backers of robotics research, and in 2013 one of its research arms, the Office of Naval Research, awarded a $7.5m grant to researchers at Tufts, Brown, Rensselaer Polytechnic Institute, Georgetown and Yale to study how autonomous robots could be taught the difference between right and wrong.

The fear among western military planners is that if they fail to pursue the technology, they could cede the lead in a new arms race to potential adversaries, such as China.

The UK, which has far stricter rules of engagement than the US, recently opposed an outright ban on the development of these systems.

In air warfare, a number of autonomous systems are already in various stages of development.

The US navy landed an autonomous drone, built by Northrop Grumman, on an aircraft carrier in 2013. The programme is meant to develop into a carrier-capable autonomous unmanned strike aircraft, but it has been held up, ostensibly while the Pentagon reviews what roles the aircraft should perform.

Similarly, the UK’s BAE Systems has developed a so-called unmanned combat air vehicle demonstrator, dubbed Taranis, which is designed to fly and select targets autonomously.
