Operating smart devices from the space on and above the back of your hand

May 3, 2017, Saarland University
A novel input method expands the input space to the back of the hand and the 3-D space above it. Credit: Oliver Dietze

The method relies on a depth sensor that tracks movements of the thumb and index finger on and above the back of the hand. In this way, not only can smartwatches be controlled, but also smartphones, smart TVs and devices for augmented and virtual reality.

They're called the "Apple Watch Series 2", "LG Watch", "Samsung GEAR S3" or "Moto 360 2nd Gen", but they all share the same problem. "Every new product generation has better screens, better processors, better cameras, and new sensors, but regarding input, the limitations remain," explains Srinath Sridhar, a researcher in the Graphics, Vision and Video group at the Max Planck Institute for Informatics.

Together with Christian Theobalt, head of the Graphics, Vision and Video group at MPI, Anders Markussen and Sebastian Boring at the University of Copenhagen and Antti Oulasvirta at Aalto University in Finland, Srinath Sridhar has therefore developed an input method that requires only a small camera to track fingertips in mid-air, and touch and position of the fingers on the back of the hand. This combination enables more expressive interactions than any previous sensing technique.

Regarding hardware, the prototype, which the researchers have named "WatchSense", requires only a depth sensor, a much smaller version of the well-known "Kinect" game controller for the Xbox 360. With WatchSense, the depth sensor is worn on the user's forearm, about 20 cm from the watch. As a sort of 3D camera, it captures the movements of the thumb and index finger, not only on the back of the hand but also in the space above it. The software developed by the researchers recognizes the position and movement of the fingers within the 3D image, allowing the user to control apps on smartphones or other devices. "The currently available depth sensors do not fit inside a smartwatch, but from the trend it's clear that in the near future, smaller depth sensors will be integrated into smartwatches," Sridhar says.
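The core idea of distinguishing touches on the back of the hand from gestures in the space above it can be sketched as a simple per-frame classification. The data structure and threshold below are purely illustrative assumptions, not the researchers' actual implementation, which the article does not detail:

```python
from dataclasses import dataclass

# Hypothetical per-frame fingertip observation from a wrist-worn depth
# sensor. The field names and units are assumptions for illustration.
@dataclass
class Fingertip:
    finger: str      # "thumb" or "index"
    x: float         # position on the back-of-hand plane, in mm
    y: float
    height: float    # estimated distance above the skin surface, in mm

# Fingertips closer to the skin than this count as touching it
# (an assumed threshold, not a value from the paper).
TOUCH_THRESHOLD_MM = 8.0

def classify(tip: Fingertip) -> str:
    """Label a tracked fingertip as a touch event or a mid-air event."""
    return "touch" if tip.height < TOUCH_THRESHOLD_MM else "mid-air"
```

In a real system the hand surface is uneven and must be estimated per frame, so `height` would come from a fitted surface model rather than a flat plane.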

But this is not all that's required. According to Sridhar, with their software system the scientists also had to solve the challenges of handling the unevenness of the back of the hand and the fact that the fingers can occlude each other when they move. "The most important thing is that we can not only recognize the fingers, but also distinguish between them," explains Sridhar, "which nobody else had managed to do before in a wearable form factor. We can now do this even in real time." The software recognizes the exact positions of the thumb and index finger in the 3D image from the depth sensor, because the researchers trained it to do this via machine learning.

In addition, the researchers have successfully tested their prototype in combination with several mobile devices and in various scenarios. "Smartphones can be operated with one or more fingers on the display, but they do not use the space above it. If both are combined, this enables previously impossible forms of interaction," explains Sridhar. He and his colleagues were able to show that with WatchSense, in a music program, the volume could be adjusted and a new song selected more quickly than was possible with a smartphone's Android app. The researchers also tested WatchSense for tasks in virtual and augmented reality, in a map application, and used it to control a large external screen. Preliminary studies showed that WatchSense was more satisfactory in each case than conventional touch-sensitive displays. Sridhar is confident that "we need something like WatchSense whenever we want to be productive while moving. WatchSense is the first to enable expressive input for devices while on the move."
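The music-player demo described above combines both input channels: a touch gesture on the skin for a continuous control (volume) and a mid-air gesture for a discrete one (next track). A minimal sketch of such an event mapping, with event names and player methods that are assumptions rather than the researchers' actual API, might look like this:

```python
# Illustrative mapping of combined touch / mid-air events to music-player
# actions, in the spirit of the WatchSense demo. The event vocabulary and
# the MusicPlayer class are hypothetical.

class MusicPlayer:
    def __init__(self) -> None:
        self.volume = 0.5   # normalized 0.0 .. 1.0
        self.track = 0

    def handle(self, event: str, amount: float = 0.0) -> None:
        if event == "touch-drag":        # finger sliding on the back of the hand
            self.volume = min(1.0, max(0.0, self.volume + amount))
        elif event == "midair-swipe":    # flick in the space above the hand
            self.track += 1              # skip to the next song

player = MusicPlayer()
player.handle("touch-drag", 0.2)   # raise volume from 0.5 toward 0.7
player.handle("midair-swipe")      # advance to the next track
```

The design point is that the two gesture spaces do not collide: surface contact and mid-air motion are disambiguated before events reach the application.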

From May 6, the researchers will present WatchSense at the renowned Conference on Human Factors in Computing Systems, or CHI for short, which this year takes place in Denver in the US.


More information: handtracker.mpi-inf.mpg.de/projects/WatchSense/


