Category: Chat

  • Here is How NLP Powers Conversational AI? | Editorialge

    The ability infused into machines to interact in the most humane ways possible has a special appeal to it. Yet the question remains: how does conversational AI work in real time, and what kind of technology powers its very existence?

    Let’s find out:

    Simply put, Conversational AI is a segment or sub-domain of Artificial Intelligence — aimed at enabling humans to interact more seamlessly with computing entities.

    Remember those quirky website chatboxes that return specific answers to even some of the trickiest questions? Those are exactly the Conversational AI products we are talking about. Yet the question remains: how do these chatbots become so perceptive? Surely it isn’t just about feeding raw training data, like human voice and text, into the relevant models. There has to be more.

    The answer is NLP or Natural Language Processing.

    Often touted as a computer science vertical, NLP, or natural language processing, focuses on helping computers understand voice and text better, allowing them to interact more effectively with the world, or rather with us, the humans.

    The power of NLP allows humans to communicate with intelligent systems in specific languages, like English. NLP also feeds into machine learning, which is the pathway through which computing entities develop ‘artificial’ intelligence.

    NLP-based intelligence is fed through NLP engines, the core preprocessing components responsible for interpreting speech and text and feeding the structured inputs into the system. But there is a lot more to an NLP engine.

    What can Natural Language Engines do?

    As mentioned, NLP engines are important. Why, you ask? Read on:

    • Vocab enhancement: Every conversational approach must keep the training stream open for new words. NLP engines help with that, assisting with word, phrase, synonym, and descriptor additions.
    • Context determination: Every conversation is made up of multiple stages. At one stage the customer might throw out a random question, whereas at another it might be a specific query to close the deal. A natural language processing engine helps the computer determine the context associated with each stage.
    • Entity identification: Imagine a customer typing the following command: ‘How can I procure the mentioned offer that is to start on 25th July 2022?’ As you can see, the statement comprises a date, numbers, and even a description. Powerful NLP engines help machines identify these disparate entities.
    • Utterance detection: A question like ‘When can I expect the email?’ can also be asked as ‘Where is my email?’ NLP engines help intelligent machines get accustomed to every variation of a question by factoring in the nature of the utterance, both verbal and textual.
    • Intent: ‘I would like my coupon code now’ looks a tad different from ‘Where the heck is my coupon code?’ The first is a standard statement of desire; the second might be born of frustration. Top-of-the-line machines are great at differentiating between the two, thanks to efficient natural language processing engines.
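    To make the entity-identification step concrete, here is a minimal sketch of rule-based extraction applied to the example command above. The regular expressions and entity names are illustrative assumptions for this article; a production NLP engine would rely on a trained named-entity recognition model rather than hand-written patterns.

```python
import re

def extract_entities(text):
    """Naive rule-based entity extraction for dates and standalone numbers.
    A production NLP engine would use a trained NER model instead."""
    entities = {}
    # Match dates of the form "25th July 2022".
    date = re.search(r"\b\d{1,2}(?:st|nd|rd|th)?\s+\w+\s+\d{4}\b", text)
    if date:
        entities["date"] = date.group()
    # Standalone numbers (e.g. a year); "25th" is absorbed by the date rule.
    entities["numbers"] = re.findall(r"\b\d+\b", text)
    return entities

print(extract_entities(
    "How can I procure the mentioned offer that is to start on 25th July 2022?"
))
```

    Even this toy version shows the idea: the raw utterance is turned into structured fields that the rest of the system can act on.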

    Yet, analyzing the intent requires us to dig deeper.

    How does NLP help to analyze the Intent?

    Before we delve deeper, it is important to understand that conversational intent can be casual (including every emotion) or business-oriented. That is exactly where NLP-powered intent analysis comes in as an important tool used by NLP engines.

    In simple words, intent analysis helps machines assess the exact or near-exact intention behind any user input, often by extracting the relevant entities. That is how a chatbot actually learns to recognize a suggestion, a piece of news, or even a complaint.

    As far as the approach is concerned, NLP tools (specifically the engines) help parse multiple intents, feeding them into the machine as high-quality training data. A structured dataset, therefore, helps with intent analysis. On top of that, the structured dataset is further fragmented to include some tricky words.

    Output-wise, the NLP engine analyzes the sentence and tries to infer intent based on the data it was fed, plurality, positioning (the position of words in a sentence), and conjugation. For speech, other factors are also taken into consideration. That is more or less how intent analysis takes place in the background.
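    As a toy illustration of the intent analysis just described, the sketch below scores each candidate intent by keyword overlap with the user's words and picks the highest scorer. The intent names and keyword lists are made up for this example; real NLP engines learn these mappings from structured training data rather than hand-picked keywords.

```python
def detect_intent(text, intents):
    """Score each intent by keyword overlap with the utterance; highest wins.
    Real engines use trained classifiers instead of keyword counting."""
    words = set(text.lower().replace("?", "").split())
    best, best_score = None, 0
    for intent, keywords in intents.items():
        score = len(words & set(keywords))
        if score > best_score:
            best, best_score = intent, score
    return best

# Hypothetical intents for the coupon/email examples above.
INTENTS = {
    "request_coupon": ["coupon", "code", "discount"],
    "track_email": ["email", "inbox", "message"],
}

print(detect_intent("Where the heck is my coupon code?", INTENTS))
```

    Note that this sketch captures *what* the user wants but not the frustration behind it; distinguishing tone is where sentiment features come into play.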

    According to detailed research conducted by MarketsandMarkets, the NLP market is expected to be valued at $26.4 billion by the year 2024. That is a CAGR of 21%.

    And NLP engines’ role in enhancing the quality of conversational AI isn’t limited to emotion, utterance, entity, or intent analysis. NLP is also relevant to aspect mining, topic modeling, text summarization, and more, helping chatbot software, self-driving cars, home automation setups, and digital assistants get more intelligent over time. With NLP, the possibilities are virtually endless.

    Author Bio

    Vatsal Ghiya is a serial entrepreneur with more than 20 years of experience in healthcare AI software and services. He is the CEO and co-founder of Shaip, which enables the on-demand scaling of our platform, processes, and people for companies with the most demanding machine learning and artificial intelligence initiatives.

    Originally published at https://editorialge.com on July 26, 2022.


    Here is How NLP Powers Conversational AI? | Editorialge was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • What Is Quality Education and How to Use It

    In this day and age, we’re lucky that most of us have access to education. In 2020, 90 per cent of the world’s population had completed primary education, and 66 per cent had completed secondary education.

    Education is an essential building block of our society because it provides knowledge, skills and an environment to help people grow — and in turn, they will help society grow.

    But out of the billions of people who have completed primary education, how many get to experience quality education? And what does that even mean?

    In this article, we’ll explore the definition of quality education, its importance, the dimensions of quality education, and how educators and institutions can get there.

  • How Chatbots Can Make Your Business’s Apps Better

    The popularity of chatbots and mobile apps is undeniable. This is evidenced by extensive data describing how much they are used and how the market is expected to benefit.

    For example, 2020 data from Statista shows that the Google Play store offers over 3 million Android apps and the Apple App Store over 2 million iOS apps. Moreover, 2021 market share projections predict that 59% of global revenue will come from mobile phones. Meanwhile, 2016 reports estimated the chatbot market to be worth a little over USD 190 million. By 2025, it is expected to grow at a CAGR of nearly 24% to a value of USD 1.25 billion.

    With this in mind, marrying chatbots to mobile apps is like a match made in (business) heaven. Here are a few ways chatbots can make your business app better.

    Benefits

    Excellent Customer Service

    Chatbots, especially those powered by artificial intelligence, are efficient tools capable of responding 24/7 to customer queries, thereby reducing client wait time and increasing the likelihood of sales.

    Customized Experience

    Based on intent and app behavior, a chatbot can make highly personalized product recommendations and make choices that usually suit the settings and needs of the user, thus creating a personalized journey for each client.

    Strengthens Marketing Efforts

    With the use of chatbots in mobile apps, businesses may gather a wealth of demographic and engagement data that can be utilized to improve each customer’s experience and support wider business marketing strategies.

    Better Onboarding

    A significant percentage of consumers never use an app again after the first time, according to numerous studies. Why? A lack of clarity about what the app contains and how to use it. An intelligent chatbot can be particularly helpful in these situations by interacting with the user from the outset. A chatbot can guide users through all app features, significantly reducing the chance of misunderstanding.

    Use Cases

    Many industries have already recognized the benefits of using chatbots in their mobile apps. These include:

    Finance

    Companies like Bank of America and PayPal have integrated chatbots in their applications to engage with customers via text and sound about questions, investment opportunities, payment tracking, and other financial matters.

    Healthcare

    Chatbots provide critical and possibly lifesaving advice in a fraction of the time one would probably have to wait to see a doctor. These apps are not meant to replace doctors but to support an already overwhelmed system, and medical bots can provide interim health guidance. A virtual radiologist bot developed by radiology experts at the University of California, Los Angeles (UCLA) can assist patients and those looking for care by providing insightful answers to their concerns. The AI-based bot can give the doctor clinical patient data and give the patient a thorough treatment plan. Bot Libre’s parent company, Paphus Solutions, has developed several medical, first aid, and health and fitness chatbot apps for clients across North America, Europe, and Asia using the Bot Libre platform.

    Shopping

    Stores like eBay and Whole Foods have created apps with intelligent chatbots that help customers browse products and, in the case of eBay, act as personal shopping assistants offering suggestions based on uploaded pictures. Chatbots in the retail industry have become increasingly popular. A study showed that 70% of millennials and Gen Z were willing to use chatbots to purchase consumer products from applications.

    Learning

    As learning is no longer defined by the four walls of a classroom, virtual educational tools have increased in popularity. Bot Libre’s English Tutor app includes an intelligent chatbot that teaches English through regular conversation, offering a wide range of topics for the user to choose from. The app offers 24/7 learning availability, provides new and exciting learning techniques, creates a marketplace for English teachers, and is self-paced. Another example comes from Duolingo. Through their app, users can practice their chosen language with a bot that grows in intelligence as more people use it.

    With open source platforms like Bot Libre, people can build their own chatbots with little to no programming and attach to their mobile apps using the open source Bot Libre mobile SDK and web API.

    For further information contact: sales@botlibre.com

    Instagram, Twitter & YouTube — @Bot Libre


    How Chatbots Can Make Your Business’s Apps Better was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • The Most Successful Airport Chatbots Examples

    A few years ago, only 9% of airports were utilizing chatbots to communicate with customers. Adoption since then has continued to increase, with a projected spend of $3.69 billion USD by 2027 for the airports and aviation industries in AI-related services. With this rise of Conversational AI chatbots, airports are no longer forced to choose between high-quality customer service or low costs — they can have both. In this article, we’ll take a closer look at the top 5 examples of Conversational AI chatbots for airports in 2022, and the key use cases that each solution covers.

    Airport Chatbot Example #1: Melbourne Airport

    Melbourne Airport is the second-largest airport in Australia, with four terminals serving more than 2.2 million passengers in April 2022 alone. Melbourne Airport is famous for its innovative approach to customer service, such as hybrid desks and the installation of self-service check-in kiosks, digital signage, and chatbot implementation for its call center automation. Melbourne Airport provides a particularly good airport AI chatbot example, as it covers most customer use cases and provides digital assistance to users on both its website and Facebook Messenger.

    Chatbot Example for Airport: Melbourne Chatbot

    Use Cases covered by AI chatbot from Melbourne Airport

    • Real-time flight updates can be tracked by the airport chatbot: with information about the flight number, destination, and airline, current flight status can be checked, and with an API chatbot integration all updates can be sent to the client’s messenger service.
    • FAQ page automation with the airport’s latest updates and parking information: this reduces support ticket volume. All common questions are collected under the FAQ section within the chatbot, so there is no need to ask a live agent.
    • Food & beverage and shops search within each of the four airport terminals: the AI-powered chatbot provides a full range of information that contains a venue overview, working hours, and link to the official website.

    Airport Chatbot Example #2: Aéroports de Lyon

    Aéroports de Lyon, which is part of the VINCI group (a leading private airport operator), is an international airport based in Lyon, France that served more than 11 million passengers in 2018 alone. As one of the main regional airports in Lyon, they provide a range of services for passengers such as parking, shopping, restaurants, and hotels, and much of their customer base asked questions about these topics to their AI-powered chatbot.

    Chatbot Example for Airport: Aéroports de Lyon Chatbot
    • Flight information automation: With the help of the Lyon airport AI chatbot, passengers can check their flight status, check-in, and ask flight related questions.
    • Shop and restaurant finder: Users can ask the chatbot to look up cafes, restaurants, and boutiques based on which terminal they’re in at the airport. The bot also offers a recommendation engine to help users find a place to shop or eat suited to their needs.
    • Parking information: Lyon airport provides approximately 16,000 parking spaces. By automating access to parking information, users can check the FAQs about the service as well as other activities, including booking parking in advance. After its launch, the AI-powered chatbot increased the parking conversion rate by 40%.

    Also read: Explore how the leader in the luxury travel industry increases conversion rate 3x within the chatbot.

    • Baggage problem solving: Users can learn about what to do with their bags during connections and how to file a report for lost or stolen bags.
    • Navigation option via airport chatbot: In case a passenger wants to spend some free time in the airport, the AI-powered chatbot can navigate the user through shops, cafes, and restaurants, which includes the ability to book a reservation.
    • FAQ page automation: The Lyon airport chatbot provides continually updated information on the current COVID situation in each country. Since requirements differ from country to country and change frequently in global travel, the ability to surface the latest travel requirements (such as testing criteria and documentation) for specific locations is essential.

    Grow your in-airport sales today! Learn more about how airports can use Conversational AI chatbots to improve their services.

    Airport Chatbot Example #3: Geneva Airport

    Geneva Airport is located in Switzerland and in 2021 served 5.92 million passengers. To assist passengers in finding related information and bridge the gap between customers, airlines, facilities, and the airport itself, the Geneva Airport launched their AI chatbot on the Facebook Messenger platform.

    With the help of their chatbot, Geneva Airport’s automated FAQ page allows users to identify country entrance restrictions due to COVID-19, covering everything a passenger needs to know to prepare for the journey, including check-in services, baggage requirements, and the travel documents needed. Flight information and updates are also part of the chatbot, which sends proactive notifications to passengers about any necessary changes.

    Airport Chatbot Example #4: Brussels Airport Assistant “BRUce”

    In 2018, Brussels Airport launched its chatbot on Facebook Messenger for testing and collecting feedback about the bot. Now, after a series of tests, improvements, and launches, BRUce — the Brussels Airport Virtual Assistant — is available via WhatsApp, Facebook Messenger, and the airport website. It helps customers check flight information and answers common airport FAQs.

    Chatbot Example for Airport: Brussels Airport Assistant “BRUce”

    Additionally, shops, parking and train information, COVID-19 restrictions and tests, and all other airport FAQs are automated within the airport chatbot. More recently, services such as ordering a lounge pass and booking parking are also now available through the BRUce AI chatbot.

    Guide to Business Process Automation. Did you know that 60% of companies have at least 30% of their activities automated?

    Airport Chatbot Example #5: Gatwick’s Airport Assistant “Gail”

    Gail, Gatwick’s automated chat assistant, was launched in 2019 based on Facebook Messenger. Gatwick Airport, a major international airport in England, wanted to improve customer experience and the quality of conversations with clients. Over just a year, Gail managed to understand and answer about 80% of users’ questions.

    Chatbot Example for Airport: Gatwick’s Airport Assistant “Gail”

    Gatwick’s airport chatbot takes the role of a guide and helps customers save time by helping them find all important information before entering the airport itself. Users can find a flight, check the latest Coronavirus info, book a COVID-19 test, and plan their free time in the airport by visiting shops and cafes. This AI-powered chatbot provides easy-to-access travel information 24/7.

    We introduced only 5 examples of how airports can implement chatbots and improve customer service by using Conversational AI solutions. Airport chatbots cover the most popular use cases and can be developed for the airport’s website as well as other channels for user engagement, such as Facebook Messenger, WhatsApp, and Google Business Messages.

    Businesses have seen sales increase by 67% with chatbot implementation. Ready to build your own Conversational AI solution? Let’s chat!

    Get in touch with us!


    The Most Successful Airport Chatbots Examples was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • Multimodal Conversational AI assistants

    Artificial Intelligence (AI) adoption has skyrocketed over the last 18 months. And Gartner says that chatbots are just one step away from the Slope of Enlightenment on its AI hype cycle. At the same time, AI technologies are helping to accelerate business growth and engineer trust. Together with conversation design, Conversational AI is transforming customer experience, customer support, and digital customer service for an onscreen world.

    From mobile-first experience to Conversational AI multimodality in customer interaction

    “Mobile-first experience” — this is the paradigm that has been the number one goal in the strategy of IT companies since Google announced this concept back in 2010. Now in 2022, it’s time for companies to expand on that approach and think about multimodality.

    To determine if multimodal experiences are best for your users, you need to ask yourself the following questions:

    • Do your users have access to multimodal devices?
    • How valuable is that for those users?
    • What natural conversations are your users having?
    • What are they looking for? And how could a bot help them achieve it?

    The mobile world shows the flexibility and scalability of companies’ offerings, and virtual assistants are no different. But not everyone has a multimodal assistant in their household, and enterprise adoption is still in its infancy.

    Featured resources: Free guide to Conversation Design and How to Approach It.

    Multimodal Conversation Design is exciting because it marries voice and chat together, and they can fill in gaps that each experience may not offer. For example, today’s voice technology is still limited, such as the challenges around understanding certain accents. Multimodal technology can support this pain point by leveraging visuals for the user to lean on instead of the voice experience. This offers a more accessible experience to all users.

    “During consultation for the automotive industry, when we looked at English support it became very clear that for English US, English UK, Australian etc cultural context is extremely important to consider. So the way you would name a car part in English US would be different from English UK, and you really need to customize your language model.” — Quirine van Walt Meijer, Senior Designer in Conversational AI at Microsoft.

    Conversational AI starts with stable, well-trained language models as a basis; from there, you look outwards at the context: which channels are interesting, and which modalities can best surface the brand or user experience. Language is the biggest factor in Conversational AI; once you start building a conversation, you will probably encounter dialects or different languages inside one country. Check out our investigation of the different names for soft drinks in the United States in a recent post, Dialect Diversity in Conversation Design.

    Regional Word Variations Across the US

    It’s essential for conversation design teams to understand how the end-users talk about products, services, and things the virtual assistant will need to know. Always collect sample dialog from a diverse representative sample of the bot’s end users to ensure the system will understand all the different types of jargon and phrases.

    Read also: Three Secrets Behind Impactful Troubleshooting Chatbot Conversation Flows

    Best use cases for Multimodal Conversational AI Assistants

    A great multimodal experience is one that feels seamless, easily switching contexts. A good example is a self-driving vehicle booking agent that you reach via a textbox, but that also talks to you inside the vehicle via voice. Check out more Multimodal Conversation Design use cases and opportunities for enterprises.

    Multimodal Conversational AI Assistants

    The Future of Multimodal Conversation Designed Experiences

    In the not-so-distant future, every time a brand launches a conversational experience, it will be across multiple channels, each specially designed for that channel. Brands need to invest in offering automation to their customers across multiple voice and chat channels, creating more accessible solutions. By allowing more entryways for users to self-serve, a company’s ROI will only increase.

    Want to Reduce Customer Support Costs? We analyze your customer pain points and address them with automation. Get in touch with us!


    Multimodal Conversational AI assistants was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • The Evolution of Conversational AI

    Talking to machines through the years

    Old fashioned red telephone
    Photo by Miryam León on Unsplash

    Conversation comes naturally to us. It’s remarkable just how fluently we can converse on any number of topics, and adapt our style with ease to speak with any number of people.

    In contrast, our conversations with machines can be clumsy and stilted. Conversational AI has been a long-standing research topic, and much progress has been made over the last decades. There are some large-scale deployed systems that we’re able to interact with by language, both spoken and written, although I’m sure very few people would call the interactions natural. But whether it’s a task-based conversation like booking a travel ticket, or a social chatbot that makes small talk, we’ve seen continual evolution in the way the technology is built.

    The first chatbot

    One of the first, and still most famous, chatbots, called Eliza, was built around 1966. It emulates a psychotherapist, using rule-based methods to discover keywords in what a user types and reformulate them into a pre-scripted question to ask back to the user. There are implementations still around today which you can try.
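    Eliza's keyword-and-reformulation trick can be sketched in a few lines. The patterns and response templates below are invented for illustration; the original program used a much richer script of ranked keywords, decomposition rules, and reassembly rules.

```python
import re

# Hypothetical mini-Eliza: match a keyword pattern in the user's input and
# reflect the captured phrase back as a scripted question.
RULES = [
    (re.compile(r"\bI am (.+)", re.IGNORECASE), "Why do you say you are {}?"),
    (re.compile(r"\bI feel (.+)", re.IGNORECASE), "How long have you felt {}?"),
]

def respond(user_input):
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            # Insert the user's own words into the canned question.
            return template.format(match.group(1).rstrip("."))
    # Fallback when no keyword matches, just as Eliza deflected.
    return "Please tell me more."

print(respond("I feel anxious about work."))
```

    The superficiality Weizenbaum wanted to expose is visible here: the program never understands anything, it only mirrors the user's phrasing.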

    Eliza’s inventor, Joseph Weizenbaum, conceived Eliza as a way to show the superficiality of communication between people and machines. And so he was surprised by the emotional attachment that some people went on to develop with Eliza.

    “Press 1 to make a booking, press 2 for cancellations…”

    The personal computer wasn’t a reality until the late 1970s. So at the time of Eliza there wasn’t really a way that people could interact with a text-based chatbot, unless they happened to work with computers. Chat technology instead began to be used in customer service scenarios over the phone. These systems were dubbed Interactive Voice Response (IVR). DTMF (dual-tone multi-frequency) was initially a key part of these systems for enabling user input. DTMF assigns each keypad number two frequencies when pressed, which can be decoded by the receiver to figure out which number the user pressed. This is the mechanism behind the scenes when call centres ask you to “Press 1 for bookings, press 2 for cancellations…”, etc.
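    The DTMF scheme is easy to sketch: each key press emits one tone from a low-frequency row group and one from a high-frequency column group, and the receiver maps the detected pair back to a key. The frequencies below are the standard DTMF assignments; real receivers detect the tones in the audio signal first, which this sketch skips.

```python
# Standard DTMF keypad: each key is identified by one row tone and one
# column tone, so two simultaneous frequencies uniquely name a key.
ROWS = [697, 770, 852, 941]   # low-frequency group, Hz
COLS = [1209, 1336, 1477]     # high-frequency group, Hz
KEYS = [["1", "2", "3"],
        ["4", "5", "6"],
        ["7", "8", "9"],
        ["*", "0", "#"]]

def decode(low_hz, high_hz):
    """Map a detected (row tone, column tone) pair back to the pressed key."""
    return KEYS[ROWS.index(low_hz)][COLS.index(high_hz)]

print(decode(697, 1209))  # "1" — as in "Press 1 for bookings"
print(decode(770, 1336))  # "5"
```

    Using two tones per key makes the signalling robust: a single stray frequency in speech or line noise cannot be mistaken for a key press.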

    The first commercial IVR system for inventory control was invented in 1973, with commercialisation of IVRs picking up in the 1980s as computer hardware improved. Through the 1990s, as voice technology improved, limited vocabulary speech-to-text (STT) was increasingly able to handle some voice input from users, alongside continued use of DTMF. Phone conversations also need a way to respond to the user with voice. Initially, this would have been pre-recorded audio, and later text-to-speech (TTS).

    In early systems, the natural language processing (NLP) used to interpret what users said was typically rule-based. To make life easier, questions asked by the system might be very direct in order to reduce the number of things a person could say in response, e.g. “Please say either booking or cancellation”, or “Please state the city you are departing from”.

    The conversation flow — i.e. what to say next — in these systems was handcrafted, like a flowchart. Standards were developed for writing conversational flows. VoiceXML is one such standard that came into being in 1999. It allowed VUI designers to focus solely on designing the conversation, while software engineers could focus on the system implementation.

    Learning how to converse

    Handcrafting conversation flows is complex, and leads to sometimes clumsy interactions and brittle systems that can break when users say something unexpected. From the early 2000s, researchers looked into ways to learn conversation flows rather than handcraft them. Many of the models at this time were based on reinforcement learning, and were able to learn a conversation flow (or ‘dialogue policy’) through interacting with simulators and by having lots of conversations with real people.

    One of the difficulties of deploying such statistical systems for dialogue policy is in the lack of control they offer to developers. In a world where companies like to maintain control of their brand in customer service interactions, it’s difficult to accept randomness in performance that might reflect poorly on them. A particularly egregious case is that of Tay — a social chatbot released by Microsoft in 2016 which quickly learnt to post offensive and inflammatory tweets, and had to be taken down.

    As the internet grew, so too did the places in which conversational AI technology was deployed. Web browsers, instant messaging and mobile apps quickly became channels in which text-based chat was now viable.

    The deep learning boom

    Through the 2010s, deep learning had a big impact on STT and TTS systems, significantly improving their ability to handle a wider range of language. Deep learning also started to have an impact in the NLP community. Understanding the meaning of what a user says in a conversation is cast as two machine learning tasks — intent recognition and slot (or entity) recognition. Commercial platforms like Amazon Lex and Google Dialogflow are based around the ideas of intent and slot.

    Intent recognition is a text classification task which predicts which of a predefined set of intents a user has expressed. For example, a ticket booking system might have MakeBooking or MakeCancellation intents. Slot recognition is a named entity recognition (NER) task which aims at picking salient entities (or slots) out of the text. In a ticket booking scenario, DestinationCity and SourceCity might be among the slots a system aims to recognise. Together, the intent and slots can be used to infer that “I’d like to book a ticket to London” and “Please can I buy a ticket and I’m going to London” effectively mean the same thing to a ticket booking system. The system can use the recognised intent and slots to communicate with a wide range of systems (databases, knowledge graphs, APIs, etc.) and act on a user’s request.
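    A toy version of the intent-and-slot idea might look like the following. The keyword rule and the city list are illustrative assumptions; platforms like Amazon Lex and Google Dialogflow train statistical models for both tasks rather than using hand-written rules like these.

```python
import re

# Hypothetical intent-and-slot parser for a ticket-booking bot.
CITIES = ["London", "Paris", "Berlin"]

def parse(utterance):
    # Intent recognition as crude text classification: booking keywords
    # map to the MakeBooking intent.
    if re.search(r"\b(book|buy)\b", utterance, re.IGNORECASE):
        intent = "MakeBooking"
    else:
        intent = "Unknown"
    # Slot recognition as crude entity spotting against a known city list.
    slots = {"DestinationCity": c for c in CITIES if c in utterance}
    return intent, slots

# Two very different surface forms reduce to the same structured meaning.
print(parse("I'd like to book a ticket to London"))
print(parse("Please can I buy a ticket and I'm going to London"))
```

    Both utterances parse to the same (intent, slots) pair, which is exactly what lets the system treat them as the same request when it calls a booking API or database.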

    Using machine learning for NLP leads to conversational systems that can robustly handle a wide range of user inputs. Still, it’s common to have a layer of handcrafted rules alongside the ML model to handle edge cases or guarantee the system will behave appropriately for particularly important or common user queries. Further, even when machine learning can interpret individual user utterances, the overall conversation flow still usually remains handcrafted.

    Deep Learning for Dialogue

    The intent-and-slot approach has its limitations as a way of modelling dialogue. For now, though, it’s a common way to build both voice and chat bots in real-world applications.

    Deep learning continues to impact the trajectory of conversational AI. Deep neural networks (DNNs) were first used for learning dialogue policies. Then, the natural collision of using DNNs for both NLP and dialogue policy is to build a single model that directly predicts appropriate responses in a conversation. An example of this kind of model is Google’s Meena, a large neural network trained to respond appropriately in conversations about different topics.

    These end-to-end neural dialogue models build on large non-conversational language models like BERT and GPT-3. However, they’re difficult to use in commercial products because of some key issues. It’s difficult to have any control over the conversation flow, and they can sometimes produce biased or inappropriate responses. This isn’t great for company branding! Also, they struggle to retain a consistent persona throughout a conversation, forget what they’ve previously said, often produce relatively boring responses, and cannot easily link with external sources of information like knowledge bases or APIs to take action or to find the right information. These models of dialogue are new, however, and current research is addressing these limitations.

    Conversational AI has been the topic of extensive research and development for decades, and a lot has changed in that time. It’s impossible to do justice to all of the research that’s happened, and is still going on, so this is a small snapshot of how the field has developed. Things will look very different in a few years’ time as the challenges of the current technology are addressed.


    The Evolution of Conversational AI was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • How to Add Chatbot to React Native

    Building a chatbot on a React Native app may have been a complicated affair in the past, but not so today, thanks to Kommunicate’s Kompose chatbot builder.

    In this tutorial, we are going to build a chatbot application from scratch using Kompose ( Kommunicate Chatbot) and React Native.

    We’ll do the integration in 2 phases:

    1. Create a Kompose chatbot and set up the answers.
    2. Add the created chatbot to your React Native app.

    Let’s jump right into it.

    Phase 1: Create a chatbot in Kompose and set up the answers

    Step 1: Set up an account in Kommunicate

    If you do not have an account in Kommunicate, you can create one here for free.

    Next, log in to your Kommunicate dashboard and navigate to the Bot Integration section. Locate the Kompose section and click on Integrate Bot.

    If you want to build a bot from scratch, select a blank template and go to the Set up your bot section. Select the name of your Bot, your bot’s Avatar, and your bot’s default language and click “Save and Proceed”.

    You are now done creating your bot. The only remaining setting to worry about is “Enable bot to human transfer”, which hands the conversation over to a human when the bot encounters a query it does not understand. Enable this feature and click “Finish Bot Setup.”

    From the next page, you can choose if this bot will handle all the incoming conversations. Click on “Let this bot handle all the conversations” and you are good to go.

    You can find your newly created bot under Dashboard → Bot Integration → Manage Bots.

    Step 2: Create welcome messages & answers for your chatbot

    Go to the ‘Kompose — Bot Builder’ section and select the bot you created.

    First, set the welcome message for your chatbot. The welcome message is the first message that the chatbot sends to the user who initiates a chat.

    Click the “Welcome Message” section. In the “Enter Welcome message — Bot’s Message” box, provide the message your chatbot should show users when they open the chat, then save the welcome intent.

    After creating the welcome message, the next step is to feed in answers/intents. These can cover common questions about your product or service.

    The answers section is where you have to add all the user messages and the chatbot responses.

    Go to the “Answer” section, click +Add, then give an ‘Intent name’.

    In the Configure user’s message section, mention the phrases you expect from users that will trigger this intent.

    In the Configure bot’s reply section, mention the responses (text or rich messages) the chatbot will deliver to users for that message. You can add any number of answers and follow-up responses for the chatbot. Here, I have used a custom payload by selecting the “Custom” option under “More”.

    Once you have configured the responses, click “Train Bot”, which is at the bottom right, to the left of the preview screen. Once training succeeds, a toast saying “Answer training completed” will appear at the top-right corner.

    Phase 2: Add the created chatbot to your React Native project:

    Step 1: Setup the React Native development environment

    https://reactnative.dev/docs/environment-setup

    Step 2: Create a React Native app

    Create a new React Native app (my-app) by using the command in your terminal or Command Prompt:

    npx react-native init my-app

    Step 3: Now, navigate to the my-app folder

    cd my-app

    Step 4: Install Kommunicate to your project

    To add the Kommunicate module to your React Native application, install it using npm:

    npm install react-native-kommunicate-chat --save

    Step 5: Add Kommunicate code to your project

    Navigate to App.js in your project. By default, a new project contains demo code that is not required. You can remove that code and write your own to start a conversation in Kommunicate.

    First, import Kommunicate using:
    import RNKommunicateChat from 'react-native-kommunicate-chat';

    Then, create a method to open a conversation before returning any views. Next, add a button which, when clicked, opens the conversation. Add these React elements and return them:

    import React from 'react';
    import {
      Button,
      SafeAreaView,
      ScrollView,
      StatusBar,
      Text,
      useColorScheme,
      View,
    } from 'react-native';
    import { Colors, Header } from 'react-native/Libraries/NewAppScreen';
    import RNKommunicateChat from 'react-native-kommunicate-chat';

    const App = () => {
      const isDarkMode = useColorScheme() === 'dark';

      const backgroundStyle = {
        backgroundColor: isDarkMode ? Colors.darker : Colors.lighter,
      };

      const startConversation = () => {
        // The APP_ID is obtained from the Kommunicate dashboard:
        // https://dashboard.kommunicate.io/settings/install
        let conversationObject = {
          appId: 'eb775c44211eb7719203f5664b27b59f',
        };
        RNKommunicateChat.buildConversation(conversationObject, (response, responseMessage) => {
          if (response === 'Success') {
            console.log('Conversation created successfully with id: ' + responseMessage);
          }
        });
      };

      return (
        <SafeAreaView style={styles.con}>
          <StatusBar barStyle={isDarkMode ? 'light-content' : 'dark-content'} />
          <ScrollView
            contentInsetAdjustmentBehavior="automatic"
            style={backgroundStyle}>
            <Header />
            <View
              style={{
                backgroundColor: isDarkMode ? Colors.black : Colors.white,
              }}>
              <Text style={styles.title}>Here you can talk with our customer support.</Text>
              <View style={styles.container}>
                <Button
                  title="Start conversation"
                  onPress={() => startConversation()}
                />
              </View>
            </View>
          </ScrollView>
        </SafeAreaView>
      );
    };


    Originally Published at https://www.kommunicate.io/ on 21/06/2022


    How to Add Chatbot to React Native was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • Conversation Design Discussion: Designing for voice agent

    Welcome to part 2 of our Conversation Design discussion. In part 1 we learned about multimodal Conversational AI assistants: best use cases and their future. Today we’ll dive into voice assistants: how to build and test new customer experiences, collect feedback on it and make a decision to invest in Conversational AI solutions.

    Today, voice agents are essentially chatbots with voice communication: they help you with quick FAQs and tasks. But if we look at the future, it will be a multi-sensory, multimodal experience. It’s not just about the modality; it’s about what other factors come into play, such as what we’ve seen from the Metaverse. With mixed reality offering so much opportunity, how do companies support it in the best way? Check out our Voice Assistant Use Cases for Business to automate repetitive or labor-intensive tasks.

    Voice agents and other potential user interactions to provide better customer experience

    We often see that companies start off with chatbot development and then replicate the experience to a voice channel. A big mistake companies make is not recognizing and designing for cognitive load. When listening, humans can remember around 2–3 pieces of information in one communication, but when engaging with text-based communication, we can give users 5–9+ pieces of information. Additionally, they can read and re-read the prompt without the bot having to repeat it for them.
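    One practical way to design for this is to cap how much a bot offers per turn based on the channel. A minimal sketch, with illustrative limits based on the cognitive-load figures above (these numbers are not a standard):

    ```javascript
    // Per-channel caps on options presented in a single turn (illustrative values).
    const MAX_OPTIONS = { voice: 3, chat: 7 };

    // Truncate a menu to what the channel's cognitive load allows.
    function presentOptions(options, channel) {
      const limit = MAX_OPTIONS[channel] || 3;
      return options.slice(0, limit);
    }

    const menu = ['Billing', 'Orders', 'Returns', 'Shipping', 'Warranty', 'Other'];
    console.log(presentOptions(menu, 'voice').length); // 3
    console.log(presentOptions(menu, 'chat').length);  // 6
    ```

    In a real assistant the remaining options would typically be reachable through a follow-up turn ("anything else?") rather than dropped.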

    Cognitive load is a big challenge with voice agents because some of the voices, mostly default ones, may sound a little monotone to users. As time goes on, however, we are seeing more human-like voices on assistants, with better inflections that help users focus and listen. Having those variations of voice may give Conversation Design a little more flexibility in how much information we can give a user than today.

    Testing in Conversation Design process

    Much of the quality in Conversation Design experiences lies in quality testing, especially in voice. If you’re creating conversational flows, it’s essential to act out and read your dialogs aloud; this is the basics of the conversation design process. We recommend back-to-back conversations so users can respond organically to prompts without seeing the speaker’s facial expressions. Check out more insights from our specialists in troubleshooting chatbot conversation flows to level up the answers of your bot.

    Download a Conversational flow chart diagram with the scenario of building dialogues for your chatbot

    These testing exercises allow the designer to spot potential points of friction and confusion, correct them immediately, and retest everything before launching the experience. This mitigates unanticipated errors and promotes strong conversations and completion of your assistant’s conversational flows.

    Conversation Design Process workflow

    How to collect feedback about your AI chatbot or voice agent?

    Collecting feedback is the most critical step in the Conversation Design process in ensuring your conversational solution is useful and enjoyable for customers to use. As your solution expands in features, user feedback will allow your team to make mindful and impactful decisions to further improve the customer experience.

    A common mistake companies make is having their voice or chat bot ask the user for feedback after every prompt, on whether it answered their question or was helpful. This requires the customer to give multiple pieces of feedback in one interaction, which can quickly cause frustration or have them ignore it entirely. Think of your own experiences talking to human agents: do they ask you how they did every minute? It’s unlikely, because it’s uncommon for humans to give feedback multiple times in one conversation, and doing so turns the interaction into a very robotic experience.

    It’s challenging enough for users to fill out a single survey at the end of an experience, so asking for multiple rounds of feedback is even less likely.

    A great way to ask for customer feedback is at the end of newly launched flows, or in the form of an ‘Anything Else’ menu at the end of longer, more complex flows. As more feedback is received on new flows, conversation designers and bot tuners can further optimize the experience, and once enough positive feedback is received the feedback prompt is removed.
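    The gating logic described above can be sketched in a few lines. The flow properties and the response threshold here are illustrative assumptions, not from any particular platform:

    ```javascript
    // Retire the feedback prompt once this many responses have been collected
    // (illustrative threshold).
    const FEEDBACK_TARGET = 50;

    // Only append a feedback prompt at the end of newly launched flows that
    // haven't yet gathered enough responses.
    function shouldAskFeedback(flow) {
      return flow.isNewlyLaunched && flow.feedbackCount < FEEDBACK_TARGET;
    }

    console.log(shouldAskFeedback({ isNewlyLaunched: true, feedbackCount: 12 }));  // true
    console.log(shouldAskFeedback({ isNewlyLaunched: true, feedbackCount: 50 }));  // false
    console.log(shouldAskFeedback({ isNewlyLaunched: false, feedbackCount: 0 }));  // false
    ```

    Keeping the check in one place like this makes it easy to remove the prompt everywhere once a flow has been validated.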

    How to make a decision on if your business should invest in Conversational AI?

    Before investing in conversational AI, firms should consider the following factors:

    • Technical abilities and constraints: The ability to integrate a conversational AI solution and connect it via API is a must. If your firm has limited technology, with few opportunities for the bot to integrate with the systems end users engage with today, your user experience may be slower than the current state and expected ROI will be constrained. Remember, if users are being pushed to switch to a chat or voice channel, that experience must be an improvement over the current state, otherwise customers won’t make the channel shift.
    • Economic viability: Ensure your firm has the financial investment to support discovery, design, development, and optimization costs for a conversational AI solution. More importantly, ensure there is a budget to support ongoing improvements and new feature builds.
    Results of Conversational AI implementation according to Gartner research

    How to Choose Conversational AI Platform. Get the checklist

    It’s so critical for businesses to ensure their internal systems are flexible enough to allow for integrations. These will help produce frictionless, impactful and engaging experiences that end-users will want to interact with. This is why conversation design is so critical in the bot building process, so that what is launched is something that these users will want to use.

    Want to learn more about how voice can transform your customer experience? Reach out to our expert today!


    Conversation Design Discussion: Designing for voice agent was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • Russian TV show to have virtual human, Elena created by Sber.

    Russian company Sber is going a step further by introducing a virtual human to host shows. Yes, the Russian tech giant has announced its virtual human, Elena, who is currently hosting RBC TV’s Markets and Investor Calendar shows. To transform pictures and recordings into an AI character that can stand in for a human host, Sber has used its Visper platform.

    Virtual Host

    With the Visper platform, SberDevices expects users to create the same kind of avatar as Elena. Visper is capable of creating any kind of AI character, from realistic virtual humans like Elena to animated cartoon characters. Sber believes these characters can be highly beneficial for online interaction.

    “The Visper platform is quite young, but it is evolving rapidly, and so are the characters we create with it. For example, they have recently learned to speak English,” SberDevices stated.

    Other broadcasters also seem eager to adopt such technology for hosting their shows. For example, the Japanese company Nikkei’s innovation lab recently introduced a platform for producing virtual-human videos, including a news-anchor character.

    “This is an interesting and important experiment for RBC. Similar technology is developing throughout the world, and we will soon see similar virtual hosts on many channels,” stated RBC TV’s managing director.



    Russian TV show to have virtual human, Elena created by Sber. was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • GOOGLE CLOUD COLLABORATES WITH COHERE TO SET UP NLP MODELS

    Google Cloud has recently announced a multi-year collaboration with Cohere, a startup that helps developers build natural language processing models for their applications more easily. Training and serving these models requires a lot of infrastructure and hardware, which Cohere does not currently own at scale. So Google will act as the infrastructure provider, supplying Cohere with Google Cloud’s machine-learning hardware and infrastructure.

    Cohere and Google

    To extract conclusions and trends, Cohere draws on enormous amounts of unstructured text data, and this collaboration will support Cohere’s language-model development with Google’s infrastructure. Additionally, Google will give Cohere access to its Tensor Processing Units. Cohere will also be able to use Google Cloud’s servers to provide wider access to its models and grow sales. Developers will be able to adjust the models with a minimum amount of time and coding.

    Just before the collaboration was announced, Cohere released an API for its language models. Now, with this partnership, it is set to expand its appeal by providing affordable language models.

    Cohere CEO Aidan Gomez stated while announcing the collaboration: “By partnering with Google Cloud and using Cloud TPUs, Google’s best-in-class machine learning infrastructure, we are bringing cutting-edge NLP technology to every developer and every organization through a powerful platform.”

    “Until now, high-quality NLP models have been the sole domain of large companies. Through this partnership, we’re giving developers access to one of the most important technologies to emerge from the modern AI revolution.”

    Cohere isn’t the only company collaborating with Google Cloud: several others, including Wendy’s, have recently partnered with Google Cloud, in Wendy’s case to incorporate AI and voice tools into its restaurant chain.

    “Leading companies around the world are using AI to fundamentally transform their business processes and deliver more helpful customer experiences,” Google Cloud’s CEO explained.



    GOOGLE CLOUD COLLABORATES WITH COHERE TO SET UP NLP MODELS was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.