Adding social touch to robotics

A squeeze of the arm, a pat on the shoulder, or a slap in the face – touch is an important part of social interaction between people. Social touch, however, remains relatively unexplored territory for robots, even though robots increasingly operate in society at large rather than only in the controlled environment of a factory. Merel Jung is conducting research at the University of Twente CTIT research institute into social touch interaction with robots. Using a relatively simple system – a mannequin's arm with pressure sensors, connected to a computer – she has succeeded in getting it to correctly recognize sixty percent of all touches. The research is published today in the scientific journal Journal on Multimodal User Interfaces.

Robots are becoming more and more social. A well-known example of a social robot is Paro, a robot seal used in care homes, where it has a calming effect on elderly residents and stimulates their senses. Positive results have been achieved with the robot for this target group, but we still have a long way to go before robots can correctly recognize, interpret, and respond to different types of social touch in the way that people can. It is a relatively unexplored area of science, but one in which much could be achieved in the long term. Examples that come to mind are robots that help children with autism improve their social contacts, or robots that train medical students for real-life situations.

Sixty percent

Merel Jung is therefore carrying out research at the University of Twente into social touch interaction between humans and robots. To enable a robot to respond correctly to being touched, she has identified four different stages: the robot must perceive the touch, recognize it, interpret it, and then respond in the correct way. In this phase of her research, Jung focused on the first two stages – perceiving and recognizing. With a relatively simple experiment, involving a mannequin's arm fitted with 64 pressure sensors, she has succeeded in distinguishing sixty percent of almost 8,000 touches (distributed over fourteen different types of touch at three levels of intensity). Sixty percent does not seem very high at first glance, but it is a good figure if you bear in mind that there was no social context whatsoever and that various touches closely resemble one another – consider the difference between grabbing and squeezing, or between stroking roughly and rubbing gently. In addition, the people touching the mannequin's arm had been given no instructions on how to 'perform' their touches, and the computer system was not able to 'learn' how the individual 'touchers' operated. In similar circumstances, people too would not be able to correctly recognize every single touch.

In her follow-up research, which Jung is currently undertaking, she is concentrating on how robots can interpret touch in a social context. It is expected that robots, by interpreting the context, will be better able to respond to touch correctly, bringing social touch between humans and robots one step closer to reality.
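The article does not spell out the recognition pipeline, but the setup described above – a grid of 64 pressure sensors producing labelled recordings of fourteen gesture types – lends itself to a standard gesture-classification approach. The sketch below is a hypothetical illustration in Python: the hand-crafted features, the random-forest classifier, and the function names are assumptions for the sake of example, not the method used in the published study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical sketch: each touch is recorded as a sequence of frames from an
# 8x8 pressure-sensor grid (64 sensors), labelled with one of 14 gesture types.
# The features and classifier below are illustrative choices, not the pipeline
# reported in the study.

def extract_features(frames: np.ndarray) -> np.ndarray:
    """Summarise a (num_frames, 8, 8) pressure recording as a feature vector."""
    per_frame_total = frames.sum(axis=(1, 2))      # overall pressure over time
    per_sensor_mean = frames.mean(axis=0).ravel()  # average spatial pattern
    return np.concatenate([
        per_sensor_mean,
        [per_frame_total.mean(),   # mean intensity
         per_frame_total.max(),    # peak intensity (e.g. slap vs. pat)
         per_frame_total.std(),    # variability over time (e.g. stroke vs. grab)
         frames.shape[0]],         # duration in frames
    ])

def train_and_evaluate(recordings, labels):
    """recordings: list of (num_frames, 8, 8) arrays; labels: gesture names."""
    X = np.stack([extract_features(r) for r in recordings])
    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    scores = cross_val_score(clf, X, labels, cv=5)
    print(f"mean cross-validated accuracy: {scores.mean():.2f}")
    return clf.fit(X, labels)
```

With fourteen gesture classes, guessing at random would be correct only about seven percent of the time, which puts the reported sixty percent in perspective.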

More information: Merel M. Jung et al. Automatic recognition of touch gestures in the corpus of social touch, Journal on Multimodal User Interfaces (2016). DOI: 10.1007/s12193-016-0232-9

