Researchers teach robots to touch

Touching and grasping objects are surprisingly complex processes, and ones that contemporary robots still handle clumsily. Principal investigator Jukka Häkkinen, Ph.D., and post-doctoral researcher Jussi Hakala, D.Sc. (Tech), have developed an imaging method for measuring human touch.

"When humans grasp something, a very complicated subliminal calculation takes place about which muscles are needed in the process, as well as which neural pathways are used to control them and at what intensity. In the field of psychology, these brain mechanisms have been extensively studied," says Jukka Häkkinen, a psychologist and principal investigator at the University of Helsinki. He is one half of the pair behind the Grasp Sense method.

With the help of thermal and depth cameras, Grasp Sense can be used to measure the heat signature left on the surface of objects by human touch. Data collected on human touch can be utilised in robotics. Thus far, grasping and touching have posed a challenge for the development of robots to be used, for example, in logistics and healthcare.
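
The article does not spell out the image-processing steps behind Grasp Sense, but the core idea of recovering touch locations from thermal and depth data can be sketched in a few lines of code. The example below is a hypothetical illustration only: the temperature threshold, camera intrinsics, function names and synthetic frames are assumptions, not details of the actual method.

```python
import numpy as np

def touch_mask(thermal_before, thermal_after, delta_kelvin=0.5):
    """Return a boolean mask of pixels warmed by a hand between two thermal frames.

    Assumes both frames are registered (same viewpoint and resolution) and
    expressed in the same temperature units.
    """
    warming = thermal_after - thermal_before
    return warming > delta_kelvin

def touch_points_3d(mask, depth, fx, fy, cx, cy):
    """Back-project touched pixels to 3-D points with a pinhole depth-camera model.

    fx, fy, cx, cy are camera intrinsics; depth is in metres, aligned with the mask.
    """
    v, u = np.nonzero(mask)            # pixel coordinates of touched pixels
    z = depth[v, u]
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.column_stack([x, y, z])

# Synthetic demonstration: a warm fingerprint left on a flat surface.
before = np.full((120, 160), 295.0)    # ambient surface temperature in kelvin
after = before.copy()
after[50:60, 70:85] += 2.0             # heat left behind by a fingertip
depth = np.full((120, 160), 0.8)       # surface 0.8 m from the camera

mask = touch_mask(before, after)
points = touch_points_3d(mask, depth, fx=200.0, fy=200.0, cx=80.0, cy=60.0)
print(f"{mask.sum()} touched pixels, centroid at {points.mean(axis=0)} m")
```

In practice, a system of this kind would also need to register the thermal and depth cameras to each other and account for the surface cooling back down over time; the sketch ignores both issues.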

"Robots need to know exactly the object's three-dimensional structure, material and weight distribution, whereas humans have the ability of intuitive grasp. Our goal is to transfer human skills to robots," says Jussi Hakala, a post-doctoral researcher and the other developer of the Grasp Sense. Hakala's research has focused on 3D imaging and display technologies.

Problems in robotics concern whether a robot can maintain its hold on an object and, at the same time, avoid crushing it. For the care robots of the future, this aspect is becoming increasingly important.

"Their grip must be pleasant, unwavering and reliable," notes Häkkinen.

Earlier, Häkkinen conducted a research project funded by the Academy of Finland that focused on measuring eye movements during grasping tasks.

"I examined how various grasping tasks impact the orientation of eye movements. The term 'just in time selection' is used in connection with . In other words, the eyes are focused on collecting the exact information required for the next 500 milliseconds," explains Häkkinen.

This led to the idea of also measuring the actions performed by the hands during grasping tasks.

"Video-based methods are not accurate enough, so my first thought was to use finger paint," Häkkinen laughs.

Later, he hit upon the idea of using the heat signatures left by touch, and began to see the many possible applications for the method.

In addition to robotics, the Grasp Sense method could be applied to designing various everyday objects. Touch data might be useful in designing objects that must be pleasant, ergonomic and precise to use.

According to Häkkinen, the same technology could also be used to create models for hospital hygiene by installing cameras on hospital ceilings. With the help of thermal cameras, models revealing the most touch-intensive surfaces could be created, making it easier and more effective to keep them clean.
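
No implementation is described in the article, but the hygiene-mapping idea amounts to accumulating detected touch events over time and ranking surfaces by how often they are touched. The sketch below assumes a stream of boolean touch masks, such as those produced by the hypothetical touch_mask function above; the surface regions and random data are made up for illustration.

```python
import numpy as np

def accumulate_touches(touch_masks):
    """Sum a sequence of boolean touch masks into a per-pixel touch-count map."""
    heatmap = np.zeros(touch_masks[0].shape, dtype=np.int64)
    for mask in touch_masks:
        heatmap += mask
    return heatmap

def rank_surfaces(heatmap, regions):
    """Rank labelled surface regions by total touch count, most touched first.

    `regions` maps a surface name to a boolean mask of its pixels.
    """
    totals = {name: int(heatmap[mask].sum()) for name, mask in regions.items()}
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

# Synthetic example: two surfaces in a 100 x 100 pixel ceiling-camera view.
rng = np.random.default_rng(0)
frames = [rng.random((100, 100)) > 0.995 for _ in range(500)]  # sparse random touches

door_handle = np.zeros((100, 100), dtype=bool)
door_handle[40:45, 10:15] = True
bed_rail = np.zeros((100, 100), dtype=bool)
bed_rail[70:72, 20:60] = True

heatmap = accumulate_touches(frames)
print(rank_surfaces(heatmap, {"door handle": door_handle, "bed rail": bed_rail}))
```

A cumulative map like this could then feed cleaning schedules, with the most touch-intensive surfaces cleaned first or most often.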

More information: graspsense.com/

