Technology can transfer human emotions to your palm through air, say scientists
Human emotion can be transferred by technology that stimulates different parts of the hand without making physical contact with your body, a University of Sussex-led study has shown.
Sussex scientist Dr Marianna Obrist, Lecturer at the Department of Informatics, has pinpointed how next-generation technologies can stimulate different areas of the hand to convey feelings of, for example, happiness, sadness, excitement or fear.
For example, short, sharp bursts of air to the area around the thumb, index finger and middle part of the palm generate excitement, whereas sad feelings are created by slow and moderate stimulation of the outer palm and the area around the 'pinky' finger.
The findings, which will be presented tomorrow (Tuesday 21 April) at the CHI 2015 conference in South Korea, provide "huge potential" for new innovations in human communication, according to Dr Obrist.
Dr Obrist said: "Imagine a couple that has just had a fight before going to work. While she is in a meeting she receives a gentle sensation transmitted through her bracelet on the right part of her hand moving into the middle of the palm. That sensation comforts her and indicates that her partner is not angry anymore.
"These sensations were generated in our experiment using the Ultrahaptics system.
"A similar technology could be used between parent and baby, or to enrich audio-visual communication in long-distance relationships.
"It also has huge potential for 'one-to-many' communication – for example, dancers at a club could raise their hands to receive haptic stimulation that enhances feelings of excitement and stability."
Using the Ultrahaptics system – which creates sensations of touch in mid-air to stimulate different parts of the hand – one group of participants in the study was asked to create patterns describing the emotions evoked by five separate images: calm scenery with trees, white-water rafting, a graveyard, a car on fire, and a wall clock. The participants could manipulate the position, direction, frequency, intensity and duration of the stimulations.
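As a rough sketch of the design space the participants worked in, the five adjustable parameters could be modelled as a simple record per stimulation event. All names and values here are hypothetical illustrations, not the study's actual data format:

```python
from dataclasses import dataclass

@dataclass
class HapticPulse:
    """One mid-air stimulation event on the palm (all field names hypothetical)."""
    x_mm: float           # horizontal position on the palm, millimetres
    y_mm: float           # vertical position on the palm, millimetres
    direction_deg: float  # direction of movement across the skin, degrees
    frequency_hz: float   # modulation frequency of the focal point
    intensity: float      # relative intensity, 0.0 to 1.0
    duration_ms: float    # how long the pulse lasts, milliseconds

# An "excitement"-style pattern as described in the article: short, sharp
# bursts around the thumb, index finger and middle of the palm.
excitement = [
    HapticPulse(x_mm=-20, y_mm=10, direction_deg=0, frequency_hz=200,
                intensity=0.9, duration_ms=80),
    HapticPulse(x_mm=0, y_mm=0, direction_deg=45, frequency_hz=200,
                intensity=0.9, duration_ms=80),
]
```

A "sadness" pattern would instead use longer, lower-intensity pulses positioned near the outer palm and little finger, per the mapping the article describes.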
A second group then selected the stimulations created by the first group that they felt best described the emotions evoked by the images. They chose the best two for each image, making a total of 10.
Finally, a third group experienced all 10 selected stimulations while viewing each image in turn and rated how well each stimulation described the emotion evoked by each image.
The third group gave significantly higher ratings to stimulations when they were presented together with the image they were intended for, showing that the emotional meaning had been successfully communicated from the first group to the third.
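The comparison behind that finding can be sketched as follows. The ratings here are invented for illustration, and the real study would have used a proper statistical test rather than a simple comparison of means:

```python
# Hypothetical 7-point ratings from the third group: each stimulation rated
# against the image it was designed for ("matched") versus the other images
# ("mismatched"). These numbers are invented, not the study's data.
matched = [6, 5, 6, 7, 5, 6, 6, 5, 7, 6]        # one rating per stimulation
mismatched = [3, 2, 4, 3, 2, 3, 4, 2, 3, 3]

mean_matched = sum(matched) / len(matched)        # 5.9
mean_mismatched = sum(mismatched) / len(mismatched)  # 2.9

# If emotional meaning was communicated, matched ratings should be
# reliably higher; this shows only the direction of the effect.
print(mean_matched > mean_mismatched)
```

In the study itself, "significantly higher" would refer to a statistical significance test across participants, not just a difference in averages.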
Now Dr Obrist has been awarded £1 million by the European Research Council for a five-year project to expand the research into taste and smell, as well as touch.
The SenseX project will aim to provide a multisensory framework for inventors and innovators to design richer technological experiences.
Dr Obrist said: "Relatively soon, we may be able to realise truly compelling and multi-faceted media experiences, such as 9-dimensional TV, or computer games that evoke emotions through taste.
"Longer term, we will be exploring how multi-sensory experiences can benefit people with sensory impairments, including those that are widely neglected in Human-Computer Interaction research, such as a taste disorder."
Catherine Bearder, Liberal Democrat MEP for South East England, said: "I am thrilled Dr Obrist has been awarded this EU funding for her incredible research into such a ground-breaking side of science.
"This is an example of the EU investing in those research projects it sees as having great potential to change our lives."