3-D Air-Touch display operates on mobile devices

Jan 30, 2014 by Lisa Zyga feature
Concept of playing a virtual 3D STACKO game on a 3D mobile display using Air-Touch technology. Credit: Wang, et al. ©2013 IEEE

(Phys.org) — While interactive 3D systems such as the Wii and Kinect have been popular for several years, 3D technology has yet to become part of mobile devices. Researchers are working on it, however: one of the most recent papers demonstrates a 3D "Air-Touch" system that allows users to touch floating 3D images displayed by a mobile device. Optical sensors embedded in the display pixels can sense the movement of a bare finger in the 3D space above the device, enabling a number of novel applications.

The researchers, Guo-Zhen Wang, et al., from National Chiao Tung University in Taiwan, have published a paper on the 3D Air-Touch system in a recent issue of the IEEE's Journal of Display Technology.

"The 3D Air-Touch system in mobile devices can offer non-contact finger detection and limited viewpoint for operating on a floating image, which can be applied to 3D games, interactive digital signage and so on," Wang told Phys.org. "Although current technology still has some issues, such as yield rate, sensor uniformity and so on, we predict that this technology could become available in the near future."

Because of the small size and portable nature of mobile devices, implementing a 3D system on these devices is different from 3D systems used on TVs and other large screens. Often, large 3D systems require either additional bulky devices or cameras for motion detection. For mobile systems, these additional devices would be inconvenient and the cameras have a limited field of view for detecting objects in close proximity to the display. Some proposed 3D systems for mobile devices use sensors near the screen, but these systems require bright environmental lighting, so they don't work well in dark conditions.

Optical sensors that are embedded in the mobile device detect finger movement. The depth range is currently 3 cm. Credit: Wang, et al. ©2013 IEEE

Working around these restrictions, Wang, et al., designed a 3D system in a 4-inch display screen in which optical sensors are embedded directly into the display pixels, while an infrared backlight is incorporated into the device itself. The researchers also added angular scanning illuminators to the edges of the display to provide adequate lighting. Overall, these three components provide a 3D system that is compact, has a wide field of view, and is independent of ambient conditions.

The researchers explain that the algorithm for calculating the 3-axis (x, y, z) position of the fingertip is less complex than conventional image processing, allowing for rapid real-time calculations. First, the infrared backlight and the embedded optical sensors are used to determine the 2D (x, y) position of the fingertip. Then, to calculate the fingertip's depth, the angular illuminators emit infrared light at different tilt angles. Analyzing the accumulated intensity in different regions yields the scanning angle with maximum reflectance, which gives the 3D location of the fingertip.
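The two-step idea can be sketched in a few lines of code. This is an illustrative reconstruction, not the authors' published algorithm: the centroid estimator for (x, y) and the angle-to-depth calibration table are assumptions made for the example.

```python
import numpy as np

def xy_from_backlight(frame):
    """Step 1: the IR backlight reflects off the fingertip; estimate
    (x, y) as the intensity-weighted centroid of the sensor image.
    Illustrative sketch, not the paper's exact method."""
    ys, xs = np.indices(frame.shape)
    total = frame.sum()
    return (xs * frame).sum() / total, (ys * frame).sum() / total

def depth_from_scan(angle_frames, angle_depths):
    """Step 2: the edge illuminators sweep through tilt angles; the
    angle whose frame accumulates the most reflected intensity marks
    the fingertip's depth plane. The angle-to-depth mapping here is
    an assumed calibration table."""
    totals = [frame.sum() for frame in angle_frames]
    return angle_depths[int(np.argmax(totals))]
```

For example, a sensor frame with a single bright reflection at pixel (10, 5) yields (x, y) = (10.0, 5.0), and the depth is read off whichever scan angle reflected most strongly.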

(a) To calculate the 2-axis (x and y) positions of a fingertip, the IR backlight is reflected by the fingertip. (b) To calculate the depth (z) of the fingertip, the system uses IR scanning devices on opposite sides of the display panel. Credit: Wang, et al. ©2013 IEEE

Experimental results showed that the prototype 3D Air-Touch system performed well. 2D touch systems require a maximum positioning error of no more than 0.5 cm; the 3D touch prototype has a maximum error of 0.45 cm at its largest depth, with smaller errors at shallower depths. The prototype's depth range is currently 3 cm, but the researchers predict that this range can be extended by improving sensor sensitivity and scanning resolution.

In the future, the 3D touch interface might also be extended from single-touch to multi-touch functionality, which could enable more applications. However, multi-touch will require overcoming the occlusion effect, which occurs when one fingertip blocks another so that the sensors cannot distinguish between the two. The researchers also plan to work on 3D air-gesture operation, such as making 3D signatures in the air.
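The occlusion effect can be pictured at the sensor level: each fingertip produces a bright reflection blob, and when one finger moves behind another the two blobs merge into one. A minimal connected-component count (a pure-Python sketch, not taken from the paper) shows how the sensor array would lose track of the second finger:

```python
def count_fingertips(mask):
    """Count 4-connected blobs of True cells in a 2D boolean grid.
    Each blob stands for one fingertip's IR reflection; under
    occlusion two blobs merge, so the count drops to one."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for i in range(h):
        for j in range(w):
            if mask[i][j] and not seen[i][j]:
                blobs += 1                      # found a new blob
                stack = [(i, j)]
                while stack:                    # flood-fill the blob
                    y, x = stack.pop()
                    if not (0 <= y < h and 0 <= x < w):
                        continue
                    if seen[y][x] or not mask[y][x]:
                        continue
                    seen[y][x] = True
                    stack += [(y + 1, x), (y - 1, x),
                              (y, x + 1), (y, x - 1)]
    return blobs
```

Two well-separated reflections are counted as two fingertips; once they overlap into one connected region, the count collapses to one and the second finger is indistinguishable from the first.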


More information: Guo-Zhen Wang, et al. "Bare Finger 3D Air-Touch System Using an Embedded Optical Sensor Array for Mobile Displays." Journal of Display Technology, Vol. 10, No. 1, January 2014. DOI: 10.1109/JDT.2013.2277567



User comments (5)


Jan 30, 2014
Wait wait, back up. Since when have we been able to *project* 3D images that can be touched??? Where's my hologram machine?
Jan 30, 2014
@Shavera: This article was just highlighting the idea of how to detect hand position/movement in a 3D environment for 3D object interaction. Unfortunately, this article says nothing about the possibility of even being able to project 3D into the air with a mobile device. It really looks like they are tackling the easiest problems first, and then will worry about how to even accomplish a 3D projection, which really should be done first. It's kind of like working out the rules for a game when you don't even have the design for the game yet.
Jan 30, 2014
@Tangent Cart before the Horse?
Jan 31, 2014
Interacting with virtual images, particularly since the images aren't 3-D, is not that useful. Not a satisfying experience for users. But the technology opens things up for developing a richer set of gestures that might be used to quickly, efficiently interact with the device. Especially if these researchers can get the technology to work with multiple fingers. Imagine something like ASL, but using fingers.

Tongue-in-cheek comment on the picture - not realistic. In the real world users, while seated next to each other, would each have the app running on his/her own phone. All social interaction would be through the phone.
Feb 03, 2014
3D mobile phones and tablets are technically possible and I believe a few of them have been made, including by LG previously. I have seen one myself. Its resolution was rather low but I am sure it can be improved.

In fact, it is easier to build glasses-free 3D projection for mobile phones and tablets, as there is only one viewer most of the time. The technology involves tracking the user's eyes, and that's easier if there is only one user (as opposed to the situation with 3D TV).