Why good chatbots need context, not tree-based flows

Have you ever interacted with a chatbot? Does the following exchange seem familiar?

Say you’re interested in visiting an attraction and want to find out how much the entrance tickets cost, so you ask,

User: “How much are tickets for 2 adults and 1 child to the cloud forest?”

Surprisingly, the chatbot didn’t know the answer, despite having the relevant API integrations.

Bot: “Sorry, I’m still learning.”

With a bit of guidance, the chatbot redirects you into a guided (rule-based) conversation flow: it suggests you say “Buy tickets” first, then “Ticket prices”, and finally “Cloud Forest” to get to the answer.

Bot: “Tickets are available on the website.”

Still not quite the answer you were looking for.

Why couldn’t the bot answer the question directly?

The vast majority of virtual agents use a natural language understanding (NLU) model, yet users are still stuck with unnatural dialogues.

You can’t explain a chatbot’s intelligence simply by saying that one NLP platform is better or worse than another. It’s a convenient explanation, but it doesn’t apply here. Why? The purpose of a well-trained NLU model is to map an input (the user utterance) to an output (the user intent). For example, both “Send curry chicken pizza to 20 Sunshine Avenue” and “I want fish and chips” resolve to the same “Food Order” intent.

However, that is where intent detection ends. As a conversation designer or developer, you need to consider what happens after the intent is detected. That missing piece is context, and it’s what lets the bot give a direct response as often as possible.
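To make the distinction concrete, here’s a minimal sketch of what intent detection gives you versus what context has to carry forward. The field names and the `food.order` intent label are illustrative assumptions, not tied to any particular NLU platform:

```python
# Illustrative sketch (not tied to any specific NLU platform): two very
# different utterances resolve to the same intent, with different entities.
nlu_result_1 = {
    "intent": "food.order",
    "entities": {"dish": "curry chicken pizza", "address": "20 Sunshine Avenue"},
}
nlu_result_2 = {
    "intent": "food.order",
    "entities": {"dish": "fish and chips"},  # no delivery address yet
}

# Intent detection stops here. Deciding what to do next (remembering the dish,
# asking only for the missing address) is the job of context, and the
# conversation designer has to model it explicitly.
```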

What is context?

In real life, if you and a friend finally meet up after months of lockdown, all the moments from your last trip that both of you remember shape the context. That context has specific parameters, such as the cities you visited and the people you met along the way. Context is also perishable: those pre-COVID holiday moments aren’t the first thing on your minds if you’ve already met up multiple times since then and talked about other things.

When you’re programming chatbots, you often want to do something with the specific information the user has uttered. For example, a good practice for your virtual agent is to proactively extract the food name and delivery address during the conversation session and commit them to a memory state (the context). The bot should not ask for information the user has already provided earlier in the conversation.
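Here’s a minimal sketch of that idea in Python, assuming a simple in-memory session dictionary and hypothetical slot names (`dish`, `address`); a real bot would use whatever context mechanism its platform provides:

```python
# Hypothetical slot-filling sketch: commit extracted entities to a session
# context (the memory state), then prompt only for what is still missing.

REQUIRED_SLOTS = ("dish", "address")
PROMPTS = {
    "dish": "What would you like to order?",
    "address": "Which address should we deliver to?",
}

def update_context(context, entities):
    """Merge newly extracted entities into the session context."""
    context.update({slot: value for slot, value in entities.items() if value})
    return context

def next_prompt(context):
    """Return a question for the first missing slot, or None if all are filled."""
    for slot in REQUIRED_SLOTS:
        if slot not in context:
            return PROMPTS[slot]
    return None

session = {}
update_context(session, {"dish": "curry chicken pizza", "address": "20 Sunshine Avenue"})
print(next_prompt(session))  # -> None: everything is known, so don't ask again
```

The same pattern scales to any intent: the context is simply the set of slots the user has filled so far, and the prompting logic reads from it instead of from the raw utterance.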

Unfortunately, some chatbots today can’t remember the essential parameters needed to hold a helpful dialogue, so users end up repeating critical details just to help the bot along.

How did context-less chatbots happen?

These are some possibilities:

  1. Designing only happy paths in the tree-like conversation design tools found in some low-code software
  2. Treating intents as turns or checkpoints in the flow, rather than as goals the customer has in mind
  3. Presenting conversation mind maps or flowcharts to software engineers without specifying how to handle user corrections and chat detours
  4. Having difficulty accounting for the large number of permutations in a non-linear application, unlike a web or mobile app with finite flows to success or failure states

How should context work in chatbots?

User: “what are the ticket prices for 2 adults and 1 child to the cloud forest again?”

This time, the chatbot extracts the entities it looks for in a ticket price inquiry intent: the participants and the attraction site. Since there is enough data to look up ticket prices, the chatbot presents a couple of relevant rich cards.
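A rough sketch of how that turn could work, again with hypothetical slot names (`participants`, `attraction`) and a placeholder in place of the real price lookup:

```python
# Hypothetical handler for the ticket price inquiry intent: because a single
# utterance fills every required slot, the bot can answer directly.

REQUIRED_SLOTS = ("participants", "attraction")

def lookup_prices(context):
    # Placeholder for the real API call / rich-card response.
    return f"Here are the ticket options for the {context['attraction']}..."

def handle_price_inquiry(context, entities):
    context.update(entities)                        # commit what was just said
    missing = [slot for slot in REQUIRED_SLOTS if slot not in context]
    if missing:
        return f"Sure! Could you tell me the {missing[0]}?"
    return lookup_prices(context)                   # enough data to answer now

session = {}
reply = handle_price_inquiry(session, {
    "participants": {"adult": 2, "child": 1},
    "attraction": "cloud forest",
})
print(reply)  # -> ticket options for the cloud forest, with no extra questions
```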

Suppose you made a mistake. You correct it by saying,

User: “what about 1 adult, 1 child and 1 senior instead?”

Instead of triggering a fallback (“Sorry, I didn’t understand”), the message leads to a parameter-based intent. The chatbot already remembers your preferred attraction site and only needs to account for the new participant information. It also knows you’re in the ticket price inquiry state, so without requiring you to repeat yourself, it tells you the new total price.

Bot: “The standard rates are $20 per adult, $12 per child and $10 per senior. The total is $42.”
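One way to picture that correction turn, as a hedged sketch: the new utterance only carries participant entities, so the handler overwrites that one slot, keeps the remembered attraction, and recomputes using the standard rates from the reply above (the table and function names are illustrative):

```python
# Hypothetical handling of a parameter-based correction: only the slots
# present in the new utterance are overwritten; everything else is remembered.

STANDARD_RATES = {"adult": 20, "child": 12, "senior": 10}  # from the bot's reply

def quote_total(context, rates):
    participants = context["participants"]
    return sum(rates[group] * count for group, count in participants.items())

# Context left over from the previous turn.
session = {
    "participants": {"adult": 2, "child": 1},
    "attraction": "cloud forest",
}

# "what about 1 adult, 1 child and 1 senior instead?" only updates participants.
session.update({"participants": {"adult": 1, "child": 1, "senior": 1}})

print(quote_total(session, STANDARD_RATES))  # -> 42, the attraction is unchanged
```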

You go on to mention that you’re a local resident.

User: “i’m a local”

Again, without making you repeat the attraction site and the number of people, and without changing the current conversation topic, the chatbot looks up ticket prices based on all the information gathered so far. Success!

Bot: “Local rates are $12 per adult, $8 per child and $8 per senior. The total is $28.”
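And a last sketch of the final turn: “i’m a local” simply adds a residency parameter to the same accumulated context, which selects a different rate table (the tables mirror the replies above; the structure and names are assumptions):

```python
# Hypothetical residency update: the context keeps accumulating, and the rate
# table is chosen from the resident_type slot.

RATES = {
    "standard": {"adult": 20, "child": 12, "senior": 10},
    "local":    {"adult": 12, "child": 8,  "senior": 8},
}

def quote_total(context):
    rates = RATES[context.get("resident_type", "standard")]
    return sum(rates[group] * count for group, count in context["participants"].items())

# Context accumulated over the whole conversation so far.
session = {
    "participants": {"adult": 1, "child": 1, "senior": 1},
    "attraction": "cloud forest",
}

session["resident_type"] = "local"   # "i'm a local"
print(quote_total(session))          # -> 28, with no repeated questions
```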

In a follow-up article, I’ll run through serverless code examples for building context in conversations.

Opinions expressed are solely my own and do not express the views or opinions of my employer. If you enjoyed this, subscribe to my updates or connect with me over LinkedIn.
