Will we one day be having meaningful artificial conversations, or is it too much to ask our chatbot friends? Howard Williams, marketing director at chat specialist Parker Software, investigates where chatbots need to improve if they’re to become true conversationalists.
George Bernard Shaw famously insulted a contemporary by claiming that they had ‘lost the art of conversation but not, unfortunately, the power of speech.’ Chatbots may have the power of speech, but will they ever be capable of skilled conversation?
A compelling conversation is a dance of intricate steps. Both parties must take turns in the flow of words, maintaining fluidity while understanding subtle shades of meaning. For businesses, the question is whether this kind of engaging experience could be reproduced in a chatbot.
Context is king
Engaging conversationalists listen. So, to hold an engaging conversation, a chatbot needs to replicate the feeling of being listened to. Listening certainly sounds easy enough, but it's only achievable by maintaining an awareness of the conversation's context. It needs to feel like the chatbot is actively understanding you.
Currently, chatbots aren’t always great at remembering the context of a conversation. They’re trained to answer FAQs and handle a few conversational pleasantries, but that’s not the same as contextual listening.
The context of a conversation requires chatbots to remember past details. For instance, a customer tells a chatbot they’re looking for an open restaurant. The chatbot then asks what the customer wants, and is met with ‘Chinese’ as a response. The chatbot should know that they mean food, not information about Chinese culture.
This is easy enough for a human. For a chatbot taught to repeat mechanical answers in response to key words and phrases, however, stringing together meaning is not quite so straightforward. Even this simple example requires deeper layers of contextual understanding – an area in which chatbots are decidedly lacking.
Keeping the context
When a chatbot misses context, the customer often has to repeat themselves or rephrase to a more formulaic sentence. That doesn’t make for a smooth experience, and it certainly doesn’t create an engaging chatbot conversation.
One possible way to achieve chatbot context retention is to integrate your chatbot with your CRM. The more data a chatbot has, the higher the chance of an engaging chatbot conversation. So, by giving your bot access to customer CRM data, you give it an overview of previous conversations, purchases and behaviours. For example, the chatbot knows that the customer’s most ordered Chinese food is chicken chow mein, and can suggest their favourite dish.
Again, it’s not quite the same as contextual listening, but it threads context and personalised content into the conversation. This helps give the illusion of memory retention. And at the very least, an engaging conversationalist chatbot will need to be capable of ‘remembering’ key contextual details. For a conversation that feels human-like, chatbots need the ability to store contextual keywords and refer to them effectively during the conversation.
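That ability to store and recall contextual keywords can be sketched in a few lines. Below is a minimal, hypothetical slot-filling design (the class name, keyword lists and replies are all illustrative assumptions, not a real product's API): each detected keyword fills a named "slot" that later turns can refer back to, which is how the bot knows that 'Chinese' means food.

```python
class ConversationContext:
    """Stores contextual keywords ("slots") gathered during a chat."""

    def __init__(self):
        self.slots = {}

    def remember(self, slot, value):
        self.slots[slot] = value

    def recall(self, slot, default=None):
        return self.slots.get(slot, default)


# Hypothetical keyword list; a real bot would use an NLU model or CRM data.
CUISINES = {"chinese", "italian", "indian"}

def handle_turn(message, context):
    text = message.lower()
    if "restaurant" in text:
        context.remember("intent", "find_restaurant")
        return "What kind of food are you in the mood for?"
    for cuisine in CUISINES:
        if cuisine in text:
            if context.recall("intent") == "find_restaurant":
                # The stored intent tells us "Chinese" means food, not culture.
                context.remember("cuisine", cuisine)
                return f"Looking for open {cuisine.title()} restaurants now."
            return f"Did you want to find a {cuisine.title()} restaurant?"
    return "Sorry, could you rephrase that?"


context = ConversationContext()
print(handle_turn("I'm looking for an open restaurant", context))
print(handle_turn("Chinese", context))  # read as food thanks to the stored intent
```

Without the stored `intent` slot, the second turn would be ambiguous; with it, the single word 'Chinese' is enough.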
Another aspect of an engaging conversation is emotional empathy. An engaging conversationalist can respond appropriately to the emotions of their conversation partner. This is another area where chatbots fall short, often aggravating emotional situations and amplifying a negative experience.
For example, a customer is annoyed or angry. The chatbot does not recognise this or adapt its response to appease the customer. The customer grows even more irate when the chatbot doesn’t give them the answer they want, fails to understand their query, or fails to react appropriately to frustration markers such as insults or profanity.
We might not be able to teach robots to love, but we need to teach them to appear as though they have emotional understanding. In other words, we must enable chatbots to respond to different emotions appropriately.
An area of AI that may help with boosting a chatbot’s empathy is sentiment analysis. Sentiment analysis is a process that determines the emotional tone behind a series of words. With sentiment analysis and the appropriate training, a chatbot would be able to mimic empathy, by responding based on whether the tone of a conversation is positive or negative.
This would help a chatbot provide a more engaging conversational experience. The bot, for example, could recognise humour and respond playfully, or pick up on sarcasm and avoid a robotic response.
However, sentiment analysis is currently restricted to classifying the emotion behind a conversation as simply positive or negative. For sentiment analysis to truly help a chatbot become an engaging conversationalist, it would need to develop further into understanding the many dimensions of human emotion.
Plus, it would need to tie in to contextual retention. A furious customer saying, “Great job”, is mocking the chatbot, and the chatbot needs to be able to recognise the ongoing frustration behind this seemingly positive phrase. Replying with a set ‘thanks’ is only likely to enrage the customer further.
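The tie-in between sentiment and context can be illustrated with a toy sketch. The word lists and scoring below are illustrative assumptions (real systems use trained models, not hand-rolled lexicons); the point is that a running, conversation-level tone can override the polarity of a single phrase, so 'Great job' from a furious customer is read as sarcasm.

```python
# Hypothetical word lists; a production system would use a trained model.
POSITIVE = {"great", "thanks", "perfect", "love"}
NEGATIVE = {"useless", "terrible", "angry", "stupid", "hate"}

def phrase_sentiment(message):
    """Crude per-message polarity score: +1 per positive word, -1 per negative."""
    words = [w.strip("!.,") for w in message.lower().split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

class ToneTracker:
    """Keeps a running tone across the conversation, not just per message."""

    def __init__(self):
        self.running_tone = 0

    def read(self, message):
        score = phrase_sentiment(message)
        if score > 0 and self.running_tone < 0:
            # A "positive" phrase in an angry conversation is likely sarcasm.
            label = "sarcastic"
        elif score < 0:
            label = "negative"
        elif score > 0:
            label = "positive"
        else:
            label = "neutral"
        self.running_tone += score
        return label

tracker = ToneTracker()
print(tracker.read("This bot is useless and stupid"))  # negative
print(tracker.read("Great job"))                       # sarcastic, not positive
```

A bot with only per-message sentiment would thank the customer here; one with a running tone can recognise the mockery and change tack instead.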
Engaging conversationalists can keep the pace of the conversation going. What makes humans shine as conversationalists is their ability to adapt to conversational cues. A chatbot will need to replicate that flexibility if it is to supply an engaging conversation.
When a bot is confused, it’s common for it to just repeat the question until the user either gives up or supplies an answer it likes. When a human is confused – shown by them repeating themselves or explicitly expressing their confusion – sending the same answer won’t help them understand.
Either way, the conversation grinds to a halt. So, a chatbot needs to be capable of recognising when it’s about to repeat the message it just sent, or if it’s repeated itself too many times, and change its strategy accordingly.
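Recognising an imminent repeat is a simple check. Here's a minimal sketch, with an assumed (hypothetical) design where the bot keeps a short window of its recent replies and inspects each new one before sending it.

```python
from collections import deque

class RepetitionGuard:
    """Flags when the bot is about to repeat itself too often."""

    def __init__(self, max_repeats=2, window=5):
        self.recent = deque(maxlen=window)  # only the last few replies matter
        self.max_repeats = max_repeats

    def should_change_strategy(self, reply):
        repeats = sum(1 for r in self.recent if r == reply)
        self.recent.append(reply)
        return repeats >= self.max_repeats

guard = RepetitionGuard()
fallback = "Sorry, I didn't catch that. Could you rephrase?"
for turn in range(4):
    if guard.should_change_strategy(fallback):
        print("Switching strategy: offering a human agent instead.")
        break
    print(fallback)
```

After sending the same fallback twice, the guard trips on the third attempt and the bot can switch to one of the recovery strategies discussed next.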
Adapting the strategy
Chatbots need to be able to compromise and adapt when things aren’t working. Just as customers shouldn’t be forced to repeat themselves in a smooth conversation, the chatbot shouldn’t either.
There are a few strategies a chatbot could be taught to use when the conversation grinds to a halt.
- Escalate the chat to a human agent
If a chatbot can recognise when it’s constantly repeating an answer, it could instead offer to connect the customer with a human agent. This could be the best strategy for keeping a conversation smooth if the tone is negative, or if the customer asks for a human agent explicitly.
- Change the interface
If the chatbot is struggling to understand the customer after supplying options, instead of asking again, it could offer a different interface for the customer to use. For example, a browser or survey for the customer to supply the needed information. The chatbot can always pick up the conversation again once this information has been supplied.
- Offer an alternative solution
Particularly when the human is repeating their questions or getting confused, an engaging chatbot should be able to adapt its answer and offer different solutions. If a customer doesn’t understand a fix, a human agent would find a different way to explain, after all.
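Choosing between these three strategies can be sketched as a simple decision rule. The function below is a hedged illustration, assuming the bot already tracks the conversation's tone and how often it has repeated itself; the strategy names and thresholds are made up for the example.

```python
def choose_fallback(tone, repeat_count, asked_for_human):
    """Pick a recovery strategy when the conversation stalls."""
    if asked_for_human or tone == "negative":
        return "escalate_to_agent"   # hand the chat to a human agent
    if repeat_count >= 2:
        return "change_interface"    # e.g. offer a survey to collect the details
    return "offer_alternative"       # rephrase or suggest a different solution

print(choose_fallback(tone="negative", repeat_count=0, asked_for_human=False))
# escalate_to_agent
```

The ordering encodes the priorities described above: an angry customer or an explicit request always goes to a human first; persistent confusion switches the interface; otherwise the bot simply tries a different explanation.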
Robotic isn’t smooth
A conversation may involve talking, but talking doesn’t necessarily make a conversation. Our current chatbots are nowhere near capable of replicating human conversation. But with some clever design and maturing AI, that doesn’t mean chatbots will never be capable of holding engaging conversations.
Before this can happen, chatbots need to avoid sounding like robots. We need to teach them to mimic empathy, to speak using natural, colloquial language, and to be capable of maintaining the context of the conversation.
We’ve seen the rise of the functional, formulaic chatbot. Next, we may see chatbots mastering the art of true conversation.
About the author
Howard Williams works in customer experience at Parker Software. He leads the activities of Parker Software’s global customer team, with a focus on the consumer, their experience, and how it can be continually improved.