The unveiling of Google's Duplex a few weeks ago was a game-changer: on that, almost everyone can agree. But what game did it change?
For one: chatbots.
For me, I'm hopeful I'll never again have to waste time on simple transactional phone calls with companies. Of course, that's some time off. But eventually, each of us will be able to tell Google, Siri, Alexa, Cortana, Bixby, or whatever AI assistant we use to get things done.
And they will, even if it means interacting with an API-less "real" world.
For now, Duplex means five things will change about bots and chatbots.
Duplex made us sit up and take notice. Why? It used a combination of natural language understanding and natural language generation to sound ... natural.
"Duplex indeed put the world in an awe the moment it said 'Umm' on the phone call," says Avi Ben Ezra, founder of SnatchBot. "That is what made people swallow the fact [that] AI is taking over now."
(Full disclosure: I consult for SnatchBot.)
To compete, bots -- particularly the simple kind that have been built on if-then statements -- will need to get significantly smarter. Google is using its AI expertise to understand and generate language, and that takes some hard-core technology.
Thanks to Duplex, people's expectations of what's possible just changed.
Now, whether communicating via voice or text, bots need to be able to act and react in ways that make sense based on human conversational flows. In other words, being mid-flow in customizing your pie with the Pizza Hut bot shouldn't stop you from mentioning that you want delivery at the office, not your home.
And the bot should be smart enough to accept that input, react naturally, and resume the tomatoes-or-olives toppings conversation without a hitch.
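In bot-building terms, that behavior is usually called slot filling: the bot tracks which pieces of the order it still needs, accepts any of them in any turn, and resumes whichever question is still open. Here's a minimal sketch of the idea in Python (a hypothetical example, not Pizza Hut's actual bot, with toy keyword matching standing in for real natural language understanding):

```python
class OrderBot:
    """Tracks slots for a pizza order. Any slot can be filled at any turn,
    even mid-flow, and the bot resumes whichever question is still open."""

    def __init__(self):
        self.slots = {"toppings": None, "address": None}

    def next_question(self):
        # Ask about the first unfilled slot; confirm when everything is set.
        if self.slots["toppings"] is None:
            return "Tomatoes or olives?"
        if self.slots["address"] is None:
            return "Deliver to home or the office?"
        return "Order complete: {toppings}, delivered to the {address}.".format(**self.slots)

    def handle(self, utterance: str) -> str:
        text = utterance.lower()
        # Fill whichever slot the utterance mentions -- order doesn't matter.
        if "office" in text:
            self.slots["address"] = "office"
        elif "home" in text:
            self.slots["address"] = "home"
        if "tomato" in text:
            self.slots["toppings"] = "tomatoes"
        elif "olive" in text:
            self.slots["toppings"] = "olives"
        return self.next_question()


bot = OrderBot()
print(bot.handle("I'd like a pizza"))           # Tomatoes or olives?
print(bot.handle("Oh, deliver to the office"))  # Tomatoes or olives?
print(bot.handle("Olives, please"))             # Order complete: olives, delivered to the office.
```

The interjection about the office gets recorded, and the toppings question simply comes back -- exactly the "without a hitch" behavior described above. A real bot would replace the keyword checks with an intent-and-entity model, but the state-tracking skeleton is the same.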
That said, there are different use cases.
"Duplex is a better version of a personal assistant," says Julie Blin, former strategy exec at Samsung Mobile. "I think they are complementary."