Month: January 2021

  • Users of KUKI wanted for a study

    Hello all! I am excited 🤩 to carry out a study for my PhD on chatbots, and specifically Kuki (Mitsuku). I am interested in participants aged 18-30 (autistic users also welcome) who have been chatting with Kuki for 3 months or more. If you are willing to participate, please contact me here or at my email address: [ax23@kent.ac.uk](mailto:ax23@kent.ac.uk)

    submitted by /u/annaksig

  • My Celebrity Hip Hop Artist ChatBot is live 🔥 Phase 2 begins this week 🤞

  • Alexa gains support for location-based reminders and routines

    Alexa users can now customize the commands built into Alexa's routines. The new feature lets users work location-based reminders into the mix of…

  • Facebook Shares 100-Language Translation Model, First Without English Reliance

    Facebook unveiled and open-sourced a language translation model this week that can translate between any two of 100 languages. The M2M-100 was…

  • Case Study | myTEXT

    App with chatbot as reading companion for delinquent teenagers

    Overview

    Time: September to December 2020

    Tasks: User research, experience strategy, information architecture, interaction design

    Tools: AdobeXD, Notion, Miro

    Team:

    • Martin — Product Manager
    • Ramona — Product Manager
    • Christine — UX/UI Designer (LinkedIn)

    Objective

    How might we help teenagers finish their reading assignment with the least possible effort, while also igniting the joy of reading in them?

    Context

    In this project I worked for KonTEXT, a reading project for delinquent teenagers. Teens who are sentenced to reading a certain number of pages come to KonTEXT for supervision and guidance. They regularly meet with mentors who reflect with them on the book they read and on their lives. However, many teenagers struggle to finish their reading assignment. Therefore, KonTEXT wants to help the youths by giving them a reading companion on their phone in the shape of an Android app with a chatbot.

    User Interviews

    First, we needed to get to know our users to understand what was keeping them from finishing their reading assignments on time. So we conducted interviews with 6 people who had either recently finished their reading assignment or were still working on it.

    We wanted to find out about the problems they had with reading, as well as strategies they used to master the challenge and things that motivated them.

    Here's what the teenagers we interviewed said.

    Main takeaways from user interviews

    • 4 of 6 participants had no problems getting started with reading
    • 5 of 6 made some kind of plan for reading (by defining a time in their day where they would read or by setting a goal of a certain number of pages per day)
    • 3 of 6 said the motivation was not important, because they were forced to read and just had to accept it

    What helped these youths succeed was that they actively organised their reading and accepted that they had to do it. None of the youths we interviewed had major problems with reading; some of them even enjoyed reading in their free time before the assignment. That surprised us, because our team members who were also mentors for KonTEXT assured us that many of their mentees had serious problems.

    Mentor Interviews

    To understand if the teenagers we interviewed were just outliers, we wanted to talk to some youths who actually struggled with the program. But most of them were unwilling to speak to us, even when we offered a reward. The ones who agreed to meet with us didn’t show up for their appointments. So we decided to interview the mentors at KonTEXT instead.

    From the results we could see that the youths we interviewed had not been representative of our user group. Most teenagers actually disliked reading and were not at all motivated.

    Here’s how the teenagers we interviewed compare to the average.

    We wanted to find out how many of the mentees had problems with reading, what kinds of problems they had and which strategies and tools they used to succeed.

    Some quotes from mentors about the problems the teenagers face

    Main takeaways from mentor interviews

    • Motivation is often named as the biggest problem: Reading is perceived as a punishment for their crimes and so they reject it
    • Many teenagers lack the organisational skills to plan their day and their reading, which is exacerbated by stress
    • Since only 25% of the teenagers had big problems with the assignment, we decided to take the other user group into account as well: people who don't struggle much, but could use some extra motivation and organisation tips

    Personas

    To put a face to our research results, I created personas for the two user groups. I wanted us to keep in mind the needs of users struggling with the reading assignment and those who were (mostly) taking it in stride.

    Our two personas: Elias and Armin

    Main takeaways from personas

    • Motivation: Users need to see their progress and get encouragement to stay motivated.
    • Organization: Users need help with organizing their reading, so they can finish their assignment on time.
    • Strategies: Users need tips to approach reading more strategically, so they can read effectively.

    We understood that the app needed to give personalized tips to be helpful to both user groups. The level of support it offers needs to be adjustable. We decided to handle that in part with the chatbot, which would be able to give personalized advice.

    Navigation and main features

    We then utilized user stories, user flows and journey maps to define the features of the app.

    The main features of the app would be:

    • A chatbot that motivates users and gives tips
    • Reading tools that provide aid while reading
    • Statistics that help track reading progress
    • A reading plan to stay organised
    • An activities section that allows users to dive deeper into the topics of their book

    Site map and key screens of the app

    Reading plan

    The initial idea for the reading plan was that it would tell users whether they had read the required number of pages each day. But in the usability test we discovered that all of our testers had a very different mental model of the reading plan. To them it was supposed to be a calendar that lets you plan your reading: they saw a planning tool instead of the progress-tracking tool we had envisioned.

    So we decided to change the reading plan to match our users’ mental model.

    Main features of the reading plan

    The new reading plan tells users when they have their next appointment with their mentor and how many pages they have yet to read until then. They can also set days in the calendar to be reading days by simply tapping them. The app automatically calculates how many pages they have to read per day to reach their goal.
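
    As a rough illustration of the calculation behind the plan (not the production code; the function and field names below are hypothetical), the pages-per-day value can be derived from the remaining pages and the reading days left before the next mentor appointment:

    ```python
    from datetime import date
    from math import ceil

    def pages_per_day(pages_total: int, pages_read: int, reading_days: list[date]) -> int:
        """Split the remaining pages evenly across the reading days the user has ticked."""
        remaining_pages = max(pages_total - pages_read, 0)
        days_left = [d for d in reading_days if d >= date.today()]
        if not days_left:
            return remaining_pages  # no reading days planned yet: everything is still due
        return ceil(remaining_pages / len(days_left))

    # Example: 180 of 300 pages still to read, three reading days planned -> 60 pages/day
    print(pages_per_day(300, 120, [date(2021, 1, 20), date(2021, 1, 23), date(2021, 1, 26)]))
    ```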

    Key takeaways

    • Show users how many pages they have to read per day so they can divide their assignment into manageable portions
    • Let users set a reminder for all reading days so they don’t forget to read and then have to read all the pages the day (or night) before their appointment

    Chatbot

    In the beginning, the chatbot's main job was to talk with the teenagers, as a peer, about their lives and about the book they were reading. That changed a lot after the usability tests, where we asked our users for feedback on the chatbot.

    Key takeaways from the usability tests

    • Users don't want to be asked about personal things by the chatbot
    • Users see the role of the chatbot as a mentor and guide, not as a friend

    We redefined the role of the chatbot to be a coach for the teenagers. He is older than our users, so he is someone the teenagers can look up to. The chatbot can give personalized reading tips, encourage and motivate the users, and guide them through the app. The bot can also talk about his own experiences, because he is based on a real person.

    Persona of our chatbot Maximilian

    We collaborated with a former criminal to create chatbot interactions based on his personality and experiences. The idea is that the teenagers can relate to these experiences and reflect on their own lives through them. In the future, users will be able to choose between a handful of chatbots when they start using the app and select the one they can best relate to.

    Visual design

    For the visual design of the app, we chose shades of blue to go with the strong brand color red. We made a style guide and component library and also started creating a design system to prepare for the coming development phase.

    A selection of UI elements

    Retrospective

    What worked well:

    • Involving the whole design team in research (e.g. as note takers)
    • Facilitating synthesis workshops for research with the design team

    What I would do differently:

    • Involve team members outside of the design team
    • Outline the strategy in the beginning of the process

    What I would do next:

    • Conduct another round of user testing to validate changes
    • Define the MVP and bring in the devs

    Thank you for reading through this case study!

    You can find more of my work on my portfolio website stefaniemue.com.

    Case Study | myTEXT was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • Why Is Everyone talking about Chatbot For Website?

    Chatbots for websites are currently one of the biggest technology disruptors. There's no doubt that people are talking about them everywhere, so check out how and why they are becoming so essential and spreading like a new wave through the digital marketplace.

    submitted by /u/botpenguin1

  • How to add WordPress Chatbot into your website. A Comprehensive Setup Guide

  • Chatbot framework with intent extraction from data

    Hey! I am quite experienced with ML, but I am taking my first steps with chatbots. I am trying to create a simple chatbot around some topic, and I have a quite large database of sample conversations. Coming from an ML background, I suspected there would be some chatbot framework that supports intent extraction from a provided dataset. That's not the case, at least in Dialogflow. That really surprised me; I imagined that extracting intents from thousands of messages would be pretty basic functionality for a chatbot.

    That's why I came here: maybe I misunderstood something about DF, or maybe there are no such solutions. Can you tell me if there are existing frameworks which do intent extraction from a dataset?
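
    One common way to bootstrap intents from a message dump, outside of any particular framework, is unsupervised clustering: vectorize the messages, cluster them, and treat each cluster as a candidate intent to review and label by hand. A minimal sketch with scikit-learn (the file name and cluster count are placeholders):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    # One user message per line (placeholder file name)
    with open("sample_conversations.txt", encoding="utf-8") as f:
        messages = [line.strip() for line in f if line.strip()]

    # Vectorize messages and cluster them into candidate intents
    vectorizer = TfidfVectorizer(stop_words="english", min_df=2)
    X = vectorizer.fit_transform(messages)
    kmeans = KMeans(n_clusters=20, random_state=0, n_init=10).fit(X)

    # Print the most characteristic terms per cluster as a starting point for intent names
    terms = vectorizer.get_feature_names_out()
    for i, center in enumerate(kmeans.cluster_centers_):
        top = center.argsort()[::-1][:5]
        print(f"intent_{i}: " + ", ".join(terms[t] for t in top))
    ```

    Each cluster's messages could then be imported into Dialogflow (or another framework) as training phrases for the corresponding intent.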

    submitted by /u/IDontHaveNicknameToo

  • UX Design Resources | Tools for Voice & Conversation Design

    Voice User Experience and Conversational Design are two of the fields I have found especially interesting while learning about UX Design. They have often been described as the next frontier in user experience design and leave ample space for experimentation. Before talking about tools and tricks, let’s talk about how these two areas are different.

    Voice UI Design

    Voice interface design uses speech recognition to allow users to engage with technology using voice commands. As the world becomes increasingly fast-paced and information-dense, voice technologies are challenging the dominance of the graphical user interface and can make the experience much smoother. (Definition via UXPin)

    Conversation Design

    Conversation Design is the process of designing a natural, two-way interaction between a user and a system, via voice or text, based on the principles of human to human conversation. (Definition via Voxgen)

    Based on my research and conversations with professionals working in this area, below are some of the tools and resources I have started compiling to learn about the field myself.

    Alexa Conversations (beta)

    Introducing Alexa Conversations (beta), a New AI-Driven Approach to Providing Conversational Experiences That Feel More Natural

    Amazon Skill Blueprints

    Create Alexa skills in minutes

    Botsociety

    Your conversational design suite. Design and prototype your next chatbot or voice assistant.

    Designing for Conversation

    In this course, you will learn about the design methods we recommend you use to develop engaging conversational voice user interfaces (VUI).

    Digital Assistant Academy

    Become a Certified Voice Interaction Designer

    Draw.io

    Easily build flowcharts for conversation design

    Einstein Bot

    Innovate and automate fast with AI across Salesforce.

    VoiceFlow

    Design, prototype, and build voice apps

    Wit.ai

    Build Natural Language Experiences. Enable people to interact with your products using voice and text.

    UX Design Resources | Tools for Voice & Conversation Design was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.

  • Happy Alexa, Sad Alexa…

    In November 2020, Amazon Alexa added new speech presets ("musical" and "conversational") to the "Excited/Disappointed" Speech Synthesis Markup Language (SSML) presets that debuted in November 2019.* Although Alexa does provide code examples for each preset, they are still pretty vague (good internal coding), so from an audio/neuroscience view I was interested in what made the voices sonically different and how that compared to the current literature on acoustic characteristics of language and emotional prosody, using the examples from their website.

    *I am a student and not affiliated with Amazon. All sounds, code and news are publicly available.
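
    For reference, the four presets under discussion are applied by wrapping the spoken text in SSML tags. A minimal sketch of how a skill response might build them in Python (tag and attribute names are reproduced from Amazon's public Alexa SSML documentation as I remember them, so double-check against the current reference):

    ```python
    # Sketch only: builds SSML strings for the presets discussed above.

    def emotion(text: str, name: str, intensity: str = "medium") -> str:
        # name: "excited" or "disappointed"; intensity: "low" | "medium" | "high"
        return f'<amazon:emotion name="{name}" intensity="{intensity}">{text}</amazon:emotion>'

    def domain(text: str, name: str) -> str:
        # name: "music" (the "musical" style) or "conversational", among others
        return f'<amazon:domain name="{name}">{text}</amazon:domain>'

    ssml = "<speak>" + emotion("We won the game!", "excited", "high") + "</speak>"
    print(ssml)
    ```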

    With the original and new audio examples hard-panned L and R respectively, I was able to use Blue Cat Audio's FreqAnalyst Logic Pro X plug-in for a side-by-side peak-frequency comparison. I would've loved to use a spectrogram for a more accurate frequency/time comparison, but A) Logic doesn't have a good one to my knowledge and B) spectrograms are harder to compare side by side visually. (Any suggestions on how I could improve my testing methods are gratefully taken; I think Reaper has one that I'm not familiar with?) I also compared the temporal aspect of the voices' pauses through visual comparison of the waveforms. In every example shown below, the original voice is always represented in blue and the new preset in red. Since each example uses the same words as its counterpart, there should be no frequency differences due to phoneme differences. The audio files were also at the same gain/volume.
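
    For anyone who wants to reproduce this kind of comparison outside a DAW, here is a rough sketch of an equivalent peak-spectrum comparison in Python (file names are placeholders; this assumes the two renditions are exported as separate WAV files rather than hard-panned into one stereo file):

    ```python
    import numpy as np
    from scipy.io import wavfile

    def peak_spectrum(path):
        """Return (frequencies, magnitude spectrum in dB) for a WAV file."""
        rate, data = wavfile.read(path)
        if data.ndim > 1:                 # fold stereo down to mono if needed
            data = data.mean(axis=1)
        data = data / (np.abs(data).max() or 1.0)
        spectrum = np.abs(np.fft.rfft(data))
        freqs = np.fft.rfftfreq(len(data), d=1.0 / rate)
        return freqs, 20 * np.log10(spectrum + 1e-12)

    f_orig, s_orig = peak_spectrum("neutral.wav")          # placeholder file names
    f_new, s_new = peak_spectrum("conversational.wav")
    print("original peak at %.0f Hz" % f_orig[s_orig.argmax()])
    print("new preset peak at %.0f Hz" % f_new[s_new.argmax()])
    ```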

    The "musical" speech (below) changes intonation and timing: the voice reaches the syntactic emphasis of the sentence faster (SomeOfTheMost POPULAR). The musical voice is also louder in the upper frequencies. Amazon suggests this mode to "emulate DJs or radio hosts" and to "style the speech for talking about music, video, or other multi-media content". This, however, isn't a definable auditory preset, so there's no specific comparison to be made to the current literature in the same way as for emotional language; the use of "musical" in this context has sparked debate about the nomenclature among some sound researchers.

    FIGURE 1

    The "conversational" style (below) is slower overall than the neutral original, adding comma-like pauses where human speech would likely take a breath for clarity. See the added pause between "virtual assistant" and "AI technology". I don't know if this pause was added via code or by the AI; based on the technical details provided by Amazon, it doesn't look like pauses are added manually. However, I know there are SSML options to create pauses based on a specific time duration or on "strength" (syntactic emphasis, e.g. paragraph vs. sentence breaks and commas). Potentially related: the pause in the new (red) speech is completely silent, as if there were a noise gate on the signal, compared to the blue original, which shows tails of the voice. Using a gate or some other forced clipping could be a way of emphasizing the pause without changing the actual timing as much.
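
    For context, the two pause mechanisms mentioned above are exposed through the standard SSML break tag; a small sketch of both forms (the helper functions are hypothetical, the attribute values are standard SSML):

    ```python
    # Standard SSML <break> supports either a fixed duration or a named "strength".
    def pause_by_time(ms: int) -> str:
        return f'<break time="{ms}ms"/>'

    def pause_by_strength(strength: str = "strong") -> str:
        # strength: "none" | "x-weak" | "weak" | "medium" | "strong" | "x-strong"
        return f'<break strength="{strength}"/>'

    ssml = (
        "<speak>Alexa is a virtual assistant"
        + pause_by_time(500)          # explicit half-second pause
        + " built on AI technology.</speak>"
    )
    print(ssml)
    ```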

    FIGURE 2

    The "disappointed" speech is (uniquely) temporally identical to neutral; however, the new speech is lower in the upper frequencies. Neither of these differences matches standard emotional prosody from what I know. The sad voice does appear to have a lower first formant, though (Juslin/Laukka, 2001).

    The "excited" speech is considerably faster than the original and shows some of the largest frequency variation from the neutral speech. Using Juslin/Laukka again, you can see the first formant is lower in the original in comparison to the excited speech, which follows their findings on emotional prosody and the acoustic characteristics of emotional speech.
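
    Since the formant comparison here is done by eye from the frequency display, a rough programmatic cross-check is possible with LPC analysis. A sketch using librosa (the file name is a placeholder, and a real formant track would need pre-emphasis, framing, and voiced-segment selection rather than one whole-file estimate):

    ```python
    import numpy as np
    import librosa

    def estimate_formants(path, order=12, n_formants=2):
        """Very rough formant estimates from LPC roots over the whole file."""
        y, sr = librosa.load(path, sr=None, mono=True)
        a = librosa.lpc(y, order=order)            # LPC polynomial coefficients
        roots = np.roots(a)
        roots = roots[np.imag(roots) > 0]          # keep one root per conjugate pair
        freqs = np.sort(np.angle(roots) * sr / (2 * np.pi))
        return freqs[:n_formants]

    print(estimate_formants("excited.wav"))        # e.g. approximate F1 and F2 in Hz
    ```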

    Happy Alexa, Sad Alexa… was originally published in Chatbots Life on Medium, where people are continuing the conversation by highlighting and responding to this story.