Elon Musk is not alone in betting on the marriage of machine and mind © AP

Elon Musk, purveyor of electric cars and flame-throwers, has sometimes been compared to a Bond villain. His talk of integrating humans and microchips does little to dispel the bad boy image. Mr Musk has invested more than $100m in Neuralink, a company experimenting with brain implants. But he is not alone: companies and governments around the world are betting on the marriage of machine and mind.

A report from Britain’s Royal Society this week predicted those efforts will pave the way for a cybernetic future. Before humanity can reach that goal, though, the report sees two big challenges. Integrating complex electronic systems with organic wiring may prove the simpler of the two; developing policy for computer-brain interaction, while the ramifications remain unclear, will be harder still. It is nevertheless vital to try to mitigate the risks ahead.

Most work on neural implants focuses on their medical applications and their potential to transform healthcare. The report envisions that human creativity combined with immense computing power could radically change other areas of life as well. Workers in hazardous conditions could remotely control robots. Virtual reality experiences could reach a new level, beaming images directly to our minds. Thoughts could be shared wordlessly over a connected network, giving humanity something approaching telepathy.

While many of these possibilities are years away, the Royal Society is right to argue that the process of assessing possible pitfalls should begin now. Once these devices are embedded in society (and surgically implanted in human brains), it will be far harder to reverse course. The risks are varied. Neural implants that make their users smarter, but are accessible only to the wealthy, could further entrench inequality, creating a permanent “augmented” elite. Connecting human brains to a network could, in theory, give marketers, political campaigners and even employers access to individuals’ thoughts and emotions. There is even the prospect that malicious actors could hack implants, leaving victims little more than ghosts in their shells.

The business side of neural interfaces also presents challenges. The fact that Facebook and Alphabet are already studying the technology creates the danger of power being further concentrated in the hands of a few companies. Product lifespans are another cause for concern. It is unclear what will happen to the firmware inside users’ heads if companies go bust. Customers might need to have new implants installed every time a product update is released.

Several of the Royal Society’s proposals for ensuring that neural implants are developed safely are well worth putting into practice. The first is that international frameworks, not Big Tech, should set the rules for how the technology is developed and deployed. The tech giants’ dominance in other fields has already left regulators on the back foot.

Public dialogue is vital too. The people who will have to live with the new reality should have a say in navigating its ethical and societal risks. Finally, these debates should not be a one-off exercise. As technologies evolve, so must norms and regulations.

William Gibson’s seminal novel Neuromancer turned 35 this year. Its imagined future, where humans would enter cyberspace merely by sticking cables into their brains, may come to pass before it reaches 70. That is cause for both hope and some fear. Neuromancer envisioned a society where cybernetically enhanced and genetically augmented superhumans were the norm. For those without upgrades, life would be nasty, brutish and short.

Copyright The Financial Times Limited 2019. All rights reserved.