UTA patents headset that allows persons to point to objects of interest using their eyes

July 3, 2018 by Louisa Kellie, University of Texas at Arlington
The 3-D point of gaze headset scans a user's eye movements, enabling hands-free navigation of mobile platforms such as electric wheelchairs. Credit: UTA

People with disabilities such as ALS (also known as Lou Gehrig's disease) or spinal cord injury often lose the use of their legs, arms or hands. Even at advanced stages of such conditions, patients may still retain control of their eyes. Some technologies use eye tracking to let disabled persons interact with a computer and communicate messages to a caregiver, but these devices are often difficult to calibrate without expert assistance and do not allow patients to express their wishes right away.

The University of Texas at Arlington has been awarded a patent for a revolutionary new technology that scans a user's eye movements, enabling them to navigate mobile platforms, such as electric wheelchairs, without the use of hands. The technology would also allow users to signal to a robotic platform when they would like to use an object, such as a glass of water.

"My interest in this technology grew out of seeing how my mother-in-law struggled with eye-tracking devices as an ALS patient," said inventor Christopher McMurrough, now a computer science and engineering lecturer at UTA.

"The latest version of our device can be worn as a pair of ski goggles with cameras on top and eye-trackers embedded in the lenses, making it very easy for patients to use it over long periods of time as it moves with them," he added.

This new technology, called a 3-D point of gaze headset, grew out of McMurrough's doctoral thesis in computer science and engineering as a student at UTA. The device combines 3-D mapping, using a 3-D camera on top of the glasses, with eye tracking. The data is then fed into a program that models the user's surrounding environment and determines what is currently being looked at by projecting the line of sight outward and intersecting it with the 3-D view from the camera.
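The core idea of intersecting a gaze ray with the camera's 3-D view can be sketched roughly as follows. This is a minimal illustration, not the patented implementation: the function name, coordinate conventions, and the distance threshold are all assumptions, and a real system would work on calibrated eye-tracker and depth-camera data.

```python
import numpy as np

def point_of_gaze(origin, direction, cloud, max_dist=0.02):
    """Estimate which 3-D point the user is looking at.

    origin:    (3,) gaze ray origin (eye position) in camera coordinates
    direction: (3,) gaze direction vector from the eye tracker
    cloud:     (N, 3) point cloud from the head-mounted 3-D camera
    max_dist:  max perpendicular distance (meters) from the ray to count a hit
    """
    d = direction / np.linalg.norm(direction)   # unit gaze direction
    v = cloud - origin                          # vectors from eye to each point
    t = v @ d                                   # projection length along the ray
    ahead = t > 0                               # ignore points behind the user
    perp = np.linalg.norm(v - np.outer(t, d), axis=1)  # distance from ray
    hits = ahead & (perp < max_dist)
    if not hits.any():
        return None
    # The nearest hit along the ray is the first surface the gaze intersects.
    idx = np.argmin(np.where(hits, t, np.inf))
    return cloud[idx]
```

For example, with the eye at the origin looking straight down the z-axis, a point at (0, 0, 1) directly on the ray would be returned in preference to one farther away at (0, 0, 2), matching the intuition that the gaze lands on the first surface it meets.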

Experiment. Credit: University of Texas at Arlington

"This could also be important to track medical problems that affect , such as stroke, and other conditions," McMurrough said. "We are also seeing that it could have implications for video games and emerging augmented reality applications."
