Elliptic Labs develops ultrasonic gesture control for hand-held devices

Oct 11, 2013 by Bob Yirka

(Phys.org) —Norway-based Elliptic Labs has revealed that it has not only developed an ultrasonic gesture-control chip for hand-held devices but is already in talks with Asian hand-held hardware makers to embed the new technology. Representatives from Elliptic Labs have told reporters that they believe the chip will be available to consumers inside mainstream devices as early as next year.

To date, the vast majority of gesture-control devices use infrared light, a system that has worked very well for gadgets such as Microsoft's Kinect. But as owners of such devices can attest, they all share one major drawback—limited range. The chip developed by Elliptic solves that problem by using sound waves instead of light. That means, as the company demonstrates in a video on its website, that users can control a device within a 180-degree field. The chip allows a device to "see" a hand held higher or lower than the screen, for example, or off to the left or right. Even more remarkably, it can do so from as far away as three feet. Company CEO Lila Danielson says that the biggest advantage of ultrasound over infrared is that it uses just a small fraction of the power. And because the chip is tiny, it is a natural fit for tablet computers and smartphones.
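
The chip itself is proprietary, but the underlying principle, sonar-style echo ranging, can be sketched in a few lines of Python. Everything below is an illustrative assumption rather than Elliptic Labs' implementation: it estimates a hand's distance from the round-trip delay of an ultrasonic ping and guesses left or right from which microphone hears the echo first.

    # Toy sketch of the basic idea behind ultrasonic gesture sensing:
    # a speaker emits an ultrasonic ping, microphones record the echo,
    # and the echo's time of flight gives the hand's distance. This is
    # an illustration of the general principle, not Elliptic Labs' code.

    SPEED_OF_SOUND = 343.0  # metres per second, in air at about 20 C


    def distance_from_echo(round_trip_seconds: float) -> float:
        """Distance to the reflecting hand, from a round-trip echo delay."""
        return SPEED_OF_SOUND * round_trip_seconds / 2.0


    def rough_direction(delay_left: float, delay_right: float) -> str:
        """Crude left/right estimate from which microphone hears the echo first."""
        if abs(delay_left - delay_right) < 20e-6:  # ~7 mm path difference: call it centred
            return "centre"
        return "left" if delay_left < delay_right else "right"


    # Example: an echo returning after ~5.3 ms corresponds to a hand
    # roughly 0.9 m (about three feet) away, the range quoted above.
    print(distance_from_echo(5.3e-3))        # ~0.91 m
    print(rough_direction(5.2e-3, 5.3e-3))   # "left"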

Gesture control on hand-held devices would most likely be used to turn pages (when hands are dirty from cooking, for example) or to move through slides or songs in a playlist. Being able to swipe the screen from a distance gives users an additional degree of control.
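
How an app might act on such gestures is easy to sketch. In the hypothetical Python below, the gesture names and the lookup table are assumptions made for illustration, not a published Elliptic Labs API; it simply maps a detected gesture to the page-turning and playlist actions described above.

    # Hypothetical mapping from mid-air gestures to application actions.
    ACTIONS = {
        "swipe_left": "next_page",
        "swipe_right": "previous_page",
        "swipe_up": "next_track",
        "swipe_down": "previous_track",
    }


    def handle_gesture(gesture: str) -> str:
        """Translate a detected gesture into an application action."""
        return ACTIONS.get(gesture, "ignore")


    print(handle_gesture("swipe_left"))   # next_page, e.g. turning a recipe page
    print(handle_gesture("wave"))         # ignore: unknown gestures do nothing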

Elliptic Labs won the CEATEC 2013 Innovation Award in the Computing and Networking category this year for the chip, cited for its extremely small size and the ease with which it can be embedded in virtually any device. It was at that ceremony that the company wowed an audience by demonstrating the chip's capabilities with an Android-enabled smartphone. The company also notes that the chip can be easily integrated with new or existing features of a device: a person could snap a photograph on a smartphone, for instance, then send it to another enabled device with a simple flinging gesture in the air.
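
The photo-fling scenario can likewise be sketched. In the hypothetical Python below, the device-discovery and transfer functions are placeholders invented for illustration; they do not correspond to any real Elliptic Labs or Android API.

    # Illustrative sketch of the photo-"fling" idea: when a fling gesture
    # is detected and a photo was just taken, hand the photo off to a
    # nearby enabled device. All functions here are placeholders.

    def find_nearby_device():
        """Placeholder: discover another gesture-enabled device nearby."""
        return "friend's tablet"


    def send_photo(photo_path: str, device: str) -> None:
        """Placeholder: transfer the photo to the chosen device."""
        print(f"Sending {photo_path} to {device}")


    def on_gesture(gesture: str, last_photo: str) -> None:
        """Fling the most recent photo to a nearby device."""
        if gesture == "fling" and last_photo:
            target = find_nearby_device()
            if target:
                send_photo(last_photo, target)


    on_gesture("fling", "IMG_0042.jpg")   # Sending IMG_0042.jpg to friend's tablet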


More information: www.ellipticlabs.com/

User comments: 3

Moebius
1 / 5 (1) Oct 12, 2013
Sounds to me like the death knell for the touch screen. Why have a touch screen if a gesture at the glass will do the same thing? I don't want to be waving my hands 12 inches away from the device like a magician conjuring a rabbit but finger gestures at the screen would work just as well as actually touching the screen. Maybe better since touch isn't 100% reliable.
arq
not rated yet Oct 12, 2013
because touching seems more natural than finger gesturing and after a while gesturing is slightly more tiring than touching
VendicarE
not rated yet Oct 12, 2013
This has major fail written all over it.
