The bots are coming! Run!
An army of bots is descending on us, run!
Yes, but should we run from them, or towards them? I am talking about conversational bots of course, the ones that, among other applications, are promising to change Customer Service As We Know It. They answer chats and calls, understand the callers’ intent and answer all their questions – or at least the easier ones. Countless articles have been written on the market disruption that these services will generate, with savings in the billions and better CX.
Natural Language Processing (NLP) bots can be a very effective tool to serve your customers as they contact you, both on the chat and voice channels. No wonder they are all the rage: for customers, they are something (someone?) to talk to 24/7, 365 days a year, that can answer their most common queries or refer them to a human agent for more complex ones. For companies, they represent huge potential savings, partly by substituting for human agents but more importantly by efficiently qualifying customers' needs, which leads to shorter calls and better CX. For agents (while maybe eliminating some entry-level positions), they perform many of the initial, boring tasks in talking with customers, leaving the more specialized and rewarding ones – those that really need the human touch.
But not all bots are created equal. To begin with, there is quite a difference between NLP for voice and chat interactions. Even though voice is now easily reduced to text with high accuracy, the way people communicate in a spoken conversation differs from the way they write. They may use slang terms when they speak, mispronounce words, have a regional accent, or occasionally forget details that have to be filled in later. There are no capital letters in speech, which can make it hard to pick out people's names or addresses. Speech is also likely to be vaguer than writing, as there is less time to think about how best to express our needs. Conversely, people make spelling mistakes when they chat, or try to express their needs in fewer words – especially while typing on a smartphone keyboard. Voice is also more "real time" than chat: we would think nothing of a chat response arriving after 10 seconds, but that would be an unacceptably long pause in a voice conversation.
These are the reasons why, while there is a flurry of product announcements concerning chat bots, the number of voice-enabled NLP systems is much smaller.
Another set of considerations concerns the underlying technology of NLP systems. Most systems today are made possible by the latest advances in AI. Techniques such as unsupervised learning, statistical language modeling, and deep learning have gone mainstream and power general-purpose cloud services like Alexa / Lex, Google Cloud Natural Language API, Cortana, and IBM Watson. But these systems, while very powerful, have some drawbacks when we want to implement a specific application for a precise customer service domain.
For instance, it is hard to achieve very high intent recognition accuracy on a very large domain, and it is not easy to restrict the domain to the set of queries likely to arrive for a given service – which would improve intent recognition performance. At least for the moment, there is normally no "follow-up" question if an intent is not recognized completely: either the system understands the question the first time, or it goes back to square one – perhaps asking the caller to phrase the question differently. These systems also offer little visibility for monitoring: you can see whether an intent was understood or not, but debugging why a dialog fails is almost impossible – at least if you are not a computer scientist. It is common knowledge that, while ultimately successful, projects tend to take far more time and effort than originally planned.
So, walk towards the bots with your eyes wide open, and without running (lest you trip).
I work for Interactive Media, and we take a different approach to bots for customer service. Our system is especially useful for voice interactions and optimized for limited-size domains – the right size for customer service applications. While it has some AI capabilities, the system is mostly rule-based, which allows modeling the semantics of the language and the structure of the service using a very complete Service Creation Environment. The same environment lets you iterate and debug the application, since it gives access to the caller requests that were not understood, together with complete logs. The technology also allows the dialog to proceed step by step: if the caller does not specify a final intent (or the system does not understand it) but a partial intent has been found, the caller gets a context-aware follow-up question. In several projects now in production, development and deployment time has proven predictable, and performance in recognizing intents and acting on them has been very high.
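To make the idea of partial intents and context-aware follow-up questions concrete, here is a minimal, purely illustrative sketch in Python. The topics, keywords, and function names are my own assumptions for the example – this is not Interactive Media's actual product or rule language, just the general pattern: match what you can, and if only part of the intent is found, ask a question scoped to that part.

```python
# Illustrative rule-based intent matching with a context-aware follow-up.
# The rule table and wording below are hypothetical examples, not a real product.

RULES = {
    # topic -> set of recognized actions within that topic
    "billing": {"refund", "invoice"},
    "shipping": {"track", "return"},
}

def match(utterance):
    """Return (topic, action); either may be None if not recognized."""
    words = set(utterance.lower().split())
    topic = next((t for t in RULES if t in words), None)
    action = None
    if topic:
        action = next((a for a in RULES[topic] if a in words), None)
    return topic, action

def respond(utterance):
    topic, action = match(utterance)
    if topic and action:
        # Full intent recognized: route the caller.
        return f"Routing you to {topic}/{action}."
    if topic:
        # Partial intent: ask a follow-up question scoped to the topic,
        # instead of sending the caller back to square one.
        options = " or ".join(sorted(RULES[topic]))
        return f"I see this is about {topic}. Is it about your {options}?"
    return "Sorry, could you rephrase your question?"
```

For example, "I want a refund on my billing" routes directly, while "a question about billing" triggers the topic-scoped follow-up rather than a generic "please rephrase". A real system would of course use richer linguistic rules than keyword lookup, but the dialog structure is the same.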
I would be delighted to give you more details about our bots.
So, variety in bots is a good thing – different applications require different approaches. All obeying Asimov’s three laws of robotics of course!