Researchers teach robots to touch

May 8, 2018, University of Helsinki

Touching and grasping objects are surprisingly complex processes, an area where contemporary robots are still clumsy. Principal investigator Jukka Häkkinen, Ph.D., and post-doctoral researcher Jussi Hakala, D.Sc. (Tech), have developed an imaging method for measuring human touch.

"When humans grasp something, a very complicated subliminal calculation takes place about which muscles are needed in the process, as well as which neural pathways are used to control them and at what intensity. In the field of psychology, these brain mechanisms have been extensively studied," says Jukka Häkkinen, a psychologist and principal investigator at the University of Helsinki. He is one half of the pair behind the Grasp Sense method.

With the help of thermal and depth cameras, Grasp Sense can be used to measure the heat signature left on the surface of objects by human touch. Data collected on human touch can be utilised in robotics. Thus far, grasping and touching have posed a challenge for the development of robots to be used, for example, in logistics and healthcare.
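The core idea, detecting the residual warmth a hand leaves on a surface, can be sketched as a simple frame-differencing step on thermal images. The function name, array sizes and temperature threshold below are illustrative assumptions for the sketch, not details of the published Grasp Sense method:

```python
import numpy as np

def touch_map(before, after, threshold=0.5):
    """Return a boolean mask of pixels warmed by touch.

    before/after: 2-D arrays of surface temperatures (deg C) from a
    thermal camera, captured before and after the object was handled.
    threshold: minimum temperature rise (deg C) counted as a touch trace.
    """
    delta = after.astype(float) - before.astype(float)
    return delta > threshold

# Synthetic example: a 5x5 surface at 22 deg C with a warm "fingerprint".
before = np.full((5, 5), 22.0)
after = before.copy()
after[1:3, 1:3] += 2.0        # a fingertip leaves a ~2 deg C trace
mask = touch_map(before, after)
print(int(mask.sum()))        # number of "touched" pixels
```

In practice the depth camera would supply the 3-D geometry onto which such a mask is projected, so the touch traces can be located on the object's surface rather than just in the image plane.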

"Robots need to know exactly the object's three-dimensional structure, material and weight distribution, whereas humans have the ability of intuitive grasp. Our goal is to transfer human skills to robots," says Jussi Hakala, a post-doctoral researcher and the other developer of Grasp Sense. Hakala's research has focused on 3D imaging and display technologies.

Problems in robotics relate to whether a robot can maintain its hold on an object while, on the other hand, avoiding crushing it. For the care robots of the future, this aspect is becoming increasingly important.

"Their grip must be pleasant, unwavering and reliable," notes Häkkinen.

Earlier, Häkkinen conducted a research project funded by the Academy of Finland that focused on measuring eye movements during grasping tasks.

"I examined how various grasping tasks impact the orientation of eye movements. The term 'just-in-time selection' is used in connection with this phenomenon. In other words, the eyes are focused on collecting the exact information required for the next 500 milliseconds," explains Häkkinen.

This led to the idea of also measuring the manual actions performed during grasping tasks.

"Video-based methods are not accurate enough, so my first thought was to use finger paint," Häkkinen laughs.

Later, he came up with the idea of using the heat signatures left by touch, and began considering the method's many potential applications.

In addition to robotics, the Grasp Sense method could be applied to designing various utility articles. Touch data might be useful in designing objects that must be pleasant, ergonomic and precise to use.

According to Häkkinen, the same technology could also be used to create models for hospital hygiene by installing cameras on hospital ceilings. With the help of thermal cameras, models revealing the most touch-intensive surfaces could be created, making it easier and more effective to keep them clean.
