Comforting chatbot

Chatting with customer service is now considered normal. But what if 'Eva', 'John' or 'Julia' were capable of not just solving technical problems but also providing us with emotional support? Janneke van der Zwaan investigated this possibility in the NWO research programme Responsible Innovation. She will defend her doctoral thesis on March 10.

Someone who is upset, angry or depressed is less capable of making wise decisions and solving problems. That was the starting point for the research of Janneke van der Zwaan, who will shortly defend her doctoral thesis at Delft University of Technology. Sometimes a person needs a sympathetic listening ear before they can be receptive to good advice. That applies just as much in the virtual world as it does in real life. But an empathic chatbot did not yet exist.

Bullied children

Van der Zwaan is an expert in the area of artificial intelligence. She developed a prototype for an empathic virtual 'buddy' for children who are being bullied. His name is Robin. 'I was mainly interested in the effects of Robin's appearance and behaviour on his conversation partners', says Van der Zwaan. 'I therefore kept the programme behind the robot as simple as possible. For example, I made use of multiple-choice answers. Chatting in natural language is in itself already very difficult, and that would only obscure our view of the interaction.'

Imitating people

Robin looks like a sort of SpongeBob who can do two things: chat (according to a preprogrammed question and answer model) and show emotions through his facial expressions. The emotions Robin can show have also been programmed. How can a fully programmed chatbot do something so inherently human as offer comfort? Very simple, explains Van der Zwaan: 'By imitating people.'
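As a concrete illustration of this imitation, a fully preprogrammed buddy can be sketched as a small script of turns, where each turn pairs an utterance with an emotion label that drives the facial animation, and multiple-choice replies stand in for natural-language understanding. The Python below is a minimal hypothetical sketch, not Van der Zwaan's actual software; all names and the dialogue flow are illustrative assumptions, with the example lines borrowed from the article.

from dataclasses import dataclass, field

@dataclass
class Turn:
    utterance: str   # what Robin says
    emotion: str     # which preprogrammed facial expression to show
    choices: dict = field(default_factory=dict)  # reply text -> next turn id

# Every utterance, emotion and reply option is scripted in advance;
# nothing is generated on the fly.
SCRIPT = {
    "start": Turn("What happened exactly?", "concerned",
                  {"Someone keeps sending me mean messages": "comfort"}),
    "comfort": Turn("I am really sorry for you. How do you feel now?", "sad",
                    {"Angry and upset": "advice"}),
    "advice": Turn("Smart, that you blocked this bully. Well done!", "happy"),
}

def run(turn_id="start"):
    while True:
        turn = SCRIPT[turn_id]
        print(f"Robin [{turn.emotion}]: {turn.utterance}")
        if not turn.choices:
            break
        # Multiple-choice replies sidestep natural-language understanding.
        options = list(turn.choices)
        for i, option in enumerate(options, 1):
            print(f"  {i}. {option}")
        turn_id = turn.choices[options[int(input("> ")) - 1]]

Keeping the dialogue engine this small mirrors the stated research aim: the interest lay in Robin's appearance and behaviour, not in natural-language processing.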

Models

She based the software behind Robin on models for human conversations in which comfort is provided. Some of these are 'from textbooks', prescribed as an effective method for holding coaching conversations. Others were derived from hundreds of informal conversations in which friends, acquaintances or colleagues tried to support each other. From the various models, Van der Zwaan distilled a single conversation model with questions such as 'what happened exactly?', 'how do you feel now?' and 'have you tried this before?', and answers such as 'I am really sorry for you', 'smart, that you blocked this bully' and 'well done, that you talked to someone about the bullying'.
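The article does not describe the internal form of this distilled model, but one plausible, deliberately simple reading is an ordered list of coaching-style phases, each pairing a question with a scripted empathic or encouraging acknowledgement. In the hypothetical Python sketch below, the questions and two of the acknowledgements are quoted from the article; the phase names, the remaining acknowledgement and the canned replies are illustrative assumptions.

# Ordered (phase, question, acknowledgement) triples; the buddy walks
# through them in a fixed order, imitating a human coaching conversation.
CONVERSATION_MODEL = [
    ("explore",  "What happened exactly?",      "I am really sorry for you."),
    ("feelings", "How do you feel now?",        "That sounds really hard."),
    ("coping",   "Have you tried this before?", "Well done, that you talked to someone about the bullying."),
]

def hold_conversation(answer_for):
    """Ask each question in turn and follow the child's answer with the
    scripted acknowledgement."""
    for phase, question, acknowledgement in CONVERSATION_MODEL:
        print(f"[{phase}] Robin: {question}")
        print(f"         Child: {answer_for(question)}")
        print(f"         Robin: {acknowledgement}")

# Canned answers standing in for a child's multiple-choice replies.
canned = {
    "What happened exactly?": "Someone keeps calling me names online.",
    "How do you feel now?": "Sad and a bit scared.",
    "Have you tried this before?": "I blocked the bully yesterday.",
}
hold_conversation(canned.get)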

Tests

Van der Zwaan first tested the comforting cyberbuddy among one hundred students, then among professional care providers, and finally among a group of children. The majority of people were convinced that a bullied child would feel better after chatting with Robin. The professional care providers found Robin's appearance too square and unnatural, whereas the children liked this aspect. The care providers and the students found the combination of empathic answers and practical advice particularly comforting, but the children were most happy with the practical advice.

First step

'On balance, Robin therefore appears to be a successful empathic virtual buddy', says Van der Zwaan. 'This is still just a first step. The animations can be refined far more, and greater variation needs to be brought into the conversations. But the underlying principle works; this study has demonstrated that. Even people who initially felt resistance towards the idea of a virtual character that got involved with human emotions still "came round" once they had talked with Robin.'

Taboos

Bullying is not the only area where empathic chatbots can play a role. Van der Zwaan: 'I deliberately chose this subject because children find it difficult to talk about it with their parents. The threshold for bringing up the subject with a virtual buddy is far lower. There are more such taboo subjects; examples are abuse, coping with grief, and loneliness. A buddy could also be used as a coach for people who are on a diet, need to take medicines or want to quit smoking.'

Further development

Van der Zwaan is now looking for investors with whom she can develop her prototype further. Of course a virtual buddy can never replace real human contact, she emphasises, let alone treatment. 'However, chatbots can certainly play a role in the first step towards a broader package of support, or be a part of such a package, as long as they are warm and empathic. My research has shown that we can construct such "warm" virtual personalities.'

