Google’s New “Duplex” Voice System Creates Some Troublesome Questions

UPDATE (May 11, 2018): Teachable Moment: How Outrage Over Google’s AI “Duplex” Could Have Been Avoided

UPDATE (May 9, 2018): Calls From Google’s “Duplex” System Should Include Initial Warning Announcements

– – –

Google today demonstrated a project under development called “Duplex” — which permits their Google Assistant infrastructure to conduct unassisted two-way voice phone calls for tasks like making appointments. In their demos, employees at the firms being called apparently had no idea that they were talking to a machine. Very impressive, indeed.

But though I’m personally a fan of Assistant — I have several Assistant “Home” devices myself, plus Assistant on my phones — something about those demos today made me immediately uneasy. When I mentioned this on my mailing lists during the day, I was surprised by how many people responded with variations of “I’d be upset if I was conned into thinking that a real person was calling me when it was really a computer.”

And yeah, it would bug me too. I’m not completely sure why. I suspect it’s an aspect of human nature — and Google tends to occasionally have something of a blind spot in that respect. My guess is that most people don’t much mind talking to machines so long as they know that they’re talking to machines. But Duplex even throws in natural-sounding “uh” utterances and such — technically perhaps to cover processing delays at times, but there’s no denying that humans would tend to perceive these as direct attempts to convince them that a human was at the other end of the line.

There was something else. I knew that I had seen something like this before, in an old sci-fi movie. I couldn’t find references on Google that matched, so I had to use my own unaided brain for the lookup. I finally got it.

In the 1977 film “Demon Seed,” a woman is held captive in her own home by an advanced AI system created by her estranged husband, a system that controls all of the smart windows, doors, and other appliances of the house. Her husband isn’t present and has no idea that this is happening. At various points in the plot, the system makes video phone calls and interacts with people at the front door via an intercom, in every case convincing them that they’re actually speaking with the woman herself and that all is well.

Leaving aside the sci-fi melodrama, there are some interesting questions here. Is it ethical for computerized systems to interact with humans in a manner that pretends to be human? Even if it’s ethical, is it desirable? What are the possible blowbacks and other undesirable outcomes?

Perhaps we can ask the same sort of question traditionally asked of undercover cops: “Are you a police officer?” In our case, we wonder how the system would respond if the called party asked Duplex “Are you a human being?”

Of course it’s relatively early in the development of this tech. But perhaps it’s not too early to begin thinking about these questions. Google itself suggested that an appropriate level of “transparency” would be needed for these systems — I wholeheartedly agree. But what does that actually mean? Perhaps an announcement at the start of each call informing the called party that they are talking to an automated system? The various implications are decidedly nontrivial.

And we must also be thinking about how such systems could be purposely misused. Obviously not in the dramatic manner of that film I mentioned above, but in other ways that involve scams and frauds of various kinds, perhaps as a sort of evil evolution of the current robocalling scourge.

Yet technology itself is never actually good or evil — it’s how we choose to use tech that sets these parameters. And there are all sorts of ways that Duplex could do a lot of good — if it is deployed in ways that help people without trying to fool them, and if it has sufficient safeguards to minimize the risks of abusive applications.

Google has their work cut out for them on this one, because Duplex is a quintessential example of an AI system where getting it working is only half the battle. The other half is ensuring, to the greatest degree possible, that it’s used only for good purposes, and not in evil ways.

–Lauren–