Google has taught its controversial human-sounding robot to identify itself as a machine when it makes a phone call, after an early demonstration caused a storm over whether people were being tricked by the artificial intelligence behind it.
The adjustment to the system, known as Duplex, comes as Google gets ready for its first tests in the real world. Duplex is designed to place calls automatically to handle simple interactions with businesses, such as booking a table at a restaurant.
Duplex stole the show at Google’s annual developer conference in May, when the robot voice was heard talking to an unsuspecting restaurant employee. Its natural-sounding conversation, complete with “ums” and “ahs”, left the human on the other end of the line unaware they had been talking to a machine.
The demonstration was seized on by Google’s critics as a sign of the company’s blindness to the impact of its powerful AI, putting the technology ahead of ethical considerations about how it affects the people who encounter it.
Nick Fox, vice-president of product and design, this week defended the performance as an early technical demonstration, rather than a sign of what the final service was intended to be like. He said Google had been planning all along to address issues raised by Duplex, such as transparency and user control.
In demonstrations of the updated technology this week, the robot voice announced itself in one call to a restaurant as “Google’s automated booking service”, and in another as the Google Assistant. But it still used all the same non-verbal tics and casual language to emulate a human caller, ending one call: “OK, awesome”.
Scott Huffman, head of engineering for the service, said he was surprised that many people had taken the natural-sounding conversation from Duplex as a sign that AI was coming close to matching human intelligence. The system only works when it has been trained in extremely narrow contexts, or “domains”, he said, since the range of the language used is limited and it is easier to predict what people will say next.
Despite this, Duplex has attracted plenty of attention in the AI world, because even very simple free-form conversation is hard to master. The system can complete four out of five calls automatically, Google said, and hands off to a human in the company’s operations centre when it gets flummoxed.
Even at 80 per cent accuracy, the service, which Google does not plan to charge for, would require a large number of humans to back it up.
The technology echoes Facebook’s use of human back-up for an ambitious general-purpose digital assistant of its own, called M. The social networking company had hoped that the system would become smarter as it was trained on users’ queries, eventually dispensing with the need for people to support it, but the project was eventually abandoned.
Mr Huffman said the Facebook system had been far more ambitious in taking on general questions, rather than limiting itself to narrow areas where it had a better chance of success. He added that Duplex was still showing steady improvements that suggested its accuracy would rise well above 80 per cent.
Duplex has been trained initially on three tasks: asking about a business’s holiday hours, booking a table at a restaurant and making a hair appointment.
Google said it planned to start with a “very limited” trial of the first of these, before expanding to the others. Making calls to other businesses could follow, though each task would take separate training, using a team of humans to coach and supervise the system until it is ready to handle calls automatically.
“We’re going to be very slow, thorough and thoughtful here,” Mr Fox said.