Developing Conversational AI to be More Human


New research led by Angus Addlesee, a Machine Learning Engineer at Wallscope, aims to assist the elderly and ease the strain on carers.


Speaking to our houses could allow elderly people to live more independently in their own homes for longer. The population is ageing rapidly and Scotland’s carers are already stretched to their limits, but smart homes and conversational robots could help ease these pressures. A new line of research aims to carve a path towards this future.

The research is being carried out by Angus Addlesee at Heriot-Watt University and is funded by Wallscope and The Data Lab.

Overview

Human conversation is filled with phenomena that current conversational agents (like Siri, Cortana, Alexa and Google Assistant) completely ignore.


We correct ourselves in the middle of sentences, interrupt each other and sometimes forget the word we are thinking of. We also give feedback while someone is speaking to show whether or not we understand, such as saying “yeah” or “mhmm”, nodding or screwing up our face. None of these signals are picked up by current conversational systems, yet they are important and guide our everyday conversations.

For people with Dementia, it can be difficult to realise that you have to change the way you hold a conversation when speaking to Amazon Alexa. Pausing and forgetting the word you are looking for also becomes more common, which is incredibly frustrating and can cause unnecessary stress, as the system assumes you have finished speaking and replies “I’m sorry, I do not know how to do that”.

This research aims to make conversational systems more natural so that people with cognitive impairments, like Dementia, can use them with less risk of frustration and stress. This, in turn, lets smart home devices assist people so that they can live more independently, and for longer, in their own homes.

Smart Home Devices

People are increasingly using smart home devices for convenience and security. We have smart lights, smart ovens, smart heating, smart doorbells and the list goes on. These same devices can also be used to assist those who could really benefit from them.

For example, imagine a 73-year-old woman and let’s call her Jane.


Jane lives alone in her house but unfortunately has early signs of Alzheimer’s Disease and more advanced arthritis in her knees.

Jane has to go upstairs to turn on the heating in her bedroom, so she can either walk upstairs twice, or walk up just once and try to sleep in a cold room while it heats up.

Smart heating avoids this problem: Jane can turn on her upstairs heating from the comfort of her sofa and only walk upstairs once her bedroom is warm. The problem is not fully solved, however, as more and more devices are created with ever more complex functionality, and ever more buttons are added to their controls.


You can give these control panels to elderly people and those with cognitive impairments, but this often just leads to confusion, frustration and unnecessary stress.

Conversation

Humans communicate with each other in the most natural form of interaction: conversation.

Conversational agents (like Siri and Alexa) are booming in popularity, making our everyday lives easier and letting us interact with the expanding number of ‘smart’ devices in our homes. They have been designed for the mass market but, unfortunately, the mass market does not include elderly people or those with cognitive impairments such as Dementia.

A better conversational system could let someone like Jane turn up the heating in her upstairs bedroom with just her voice. She has been having conversations her whole life, so she doesn’t need to learn how to ask “Can you turn the heating on in my bedroom?”.
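To make that concrete, here is a toy sketch of how an agent could map such a request onto a smart-heating action. It is purely illustrative: the pattern, the default room and the `set_heating` call are invented for this example and are not taken from any real assistant or from this project.

```python
import re

# Hypothetical smart-home backend; a real system would call the heating
# controller's own API here.
def set_heating(room: str, on: bool) -> str:
    return f"Turning the heating {'on' if on else 'off'} in the {room}."

# A tiny rule-based intent matcher: just enough to show the mapping from a
# natural request to a device action.
HEATING_PATTERN = re.compile(
    r"turn (?:the )?heating (on|off)(?: in (?:my|the) (?P<room>[\w ]+))?",
    re.IGNORECASE,
)

def handle(utterance: str) -> str:
    match = HEATING_PATTERN.search(utterance)
    if not match:
        return "Sorry, I didn't catch that."
    room = (match.group("room") or "living room").strip()
    return set_heating(room, match.group(1).lower() == "on")

print(handle("Can you turn the heating on in my bedroom?"))
# -> Turning the heating on in the bedroom.
```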

Angus has previously written about current research that aims to make these conversational agents more naturally interactive, and has also looked into how Dementia affects conversation as it progresses.

Existing collections of recordings of speech from people with Dementia are being used, and Angus is also collecting a new dataset of more spontaneous conversation. Using these datasets, the researchers can investigate which changes in speech cause the most problems for current systems and improve on them.
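As a rough illustration of what that kind of analysis could involve (a sketch under assumed data, not code from the project), transcripts with word-level timings can be reduced to a few simple features, such as speech rate, immediate repetitions and filled pauses:

```python
from dataclasses import dataclass

# Tokens that often mark a pause to search for a word.
FILLED_PAUSES = {"um", "uh", "erm", "eh"}

@dataclass
class Word:
    text: str     # the spoken token
    start: float  # start time in seconds
    end: float    # end time in seconds

def speech_features(words: list) -> dict:
    """Reduce one utterance to a few features that tend to change as
    Dementia progresses: speech rate, immediate repetitions, filled pauses."""
    if not words:
        return {"speech_rate_wpm": 0.0, "repetitions": 0, "filled_pauses": 0}
    minutes = (words[-1].end - words[0].start) / 60
    repetitions = sum(
        1 for prev, cur in zip(words, words[1:])
        if prev.text.lower() == cur.text.lower()
    )
    filled = sum(1 for w in words if w.text.lower() in FILLED_PAUSES)
    return {
        "speech_rate_wpm": len(words) / minutes if minutes > 0 else 0.0,
        "repetitions": repetitions,
        "filled_pauses": filled,
    }

# A short, made-up utterance with word timings.
utterance = [
    Word("turn", 0.0, 0.3), Word("on", 0.3, 0.5), Word("the", 0.5, 0.6),
    Word("the", 0.9, 1.0), Word("um", 1.0, 1.8), Word("heating", 2.4, 2.9),
]
print(speech_features(utterance))
# -> speech rate ~124 wpm, 1 repetition, 1 filled pause
```

Comparing such features across recordings would be one simple way to see which speech changes trip up current systems.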

If someone with Dementia pauses to think of a word, for example, acoustic signals can be used to detect that they have not finished their sentence, so the system avoids interrupting. The system can also take into account the increased word errors, repetition, use of prepositions and slower speech rates that commonly occur in dialogue as Dementia progresses.
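A minimal sketch of pause-tolerant end-of-utterance detection is below. It stands in for the richer acoustic cues mentioned above with just two signals, silence length and a lexical “incompleteness” cue; the thresholds, word list and function name are all assumptions made for illustration:

```python
# Words that hint the speaker has more to say (a dangling article or a
# filled pause). The list, thresholds and function name are illustrative.
TRAILING_CUES = {"the", "a", "to", "um", "uh", "erm"}

def utterance_finished(transcript_so_far: str,
                       silence_seconds: float,
                       base_threshold: float = 0.8,
                       extended_threshold: float = 2.5) -> bool:
    """Decide whether to respond yet, instead of cutting in at the first
    short pause."""
    words = transcript_so_far.lower().split()
    # If the last word suggests an unfinished sentence, wait much longer.
    if words and words[-1] in TRAILING_CUES:
        return silence_seconds >= extended_threshold
    return silence_seconds >= base_threshold

# The speaker pauses to find a word: a standard agent would already have
# replied "I'm sorry...", but this check keeps listening.
print(utterance_finished("can you turn on the um", 1.2))       # False
print(utterance_finished("can you turn on the heating", 1.2))  # True
```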

You can keep up to date with this research on Medium, LinkedIn or Twitter.


