New techniques for eye-gaze tracking could change computer interaction

April 24, 2015

Mice, and now touchscreens, are a daily part of how we interact with computers. But what about people who lack the ability to use a mouse or touchscreen? Or situations where these devices would be impractical or outright dangerous?

Many researchers have explored eye-gaze as a potential control mechanism. Gaze trackers have become sophisticated and small enough to feature in devices such as smartphones and tablets. But on their own, they may not offer the precision and speed needed to perform complex computing tasks.

Now, a team of researchers at the Department of Engineering has developed a computer control interface that uses a combination of eye-gaze tracking and other inputs. The team's research was published in a paper, 'Multimodal Intelligent Eye-Gaze Tracking System', in the International Journal of Human-Computer Interaction.

Dr Pradipta Biswas, Senior Research Associate in the Department's Engineering Design Centre, and his colleagues made two major enhancements to a standalone gaze-tracking system. First, predictive software interprets factors such as the velocity, acceleration and bearing of the gaze to estimate the user's intended target. Next, a second mode of input is employed, such as a joystick.
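The paper is the place to look for the team's actual prediction model; as a rough illustration of the idea, the Python sketch below scores candidate targets by how closely the gaze's direction of travel points at them. Everything in it – the name predict_target, the two-sample velocity estimate, the distance weight – is an illustrative assumption rather than the published algorithm, and acceleration is omitted for brevity:

```python
import math

def predict_target(samples, targets):
    """Guess which on-screen target the gaze is heading towards.

    samples: two or more (t, x, y) gaze points, oldest first
    targets: list of (x, y) centres of selectable widgets
    """
    (t0, x0, y0), (t1, x1, y1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy = (x1 - x0) / dt, (y1 - y0) / dt   # gaze velocity
    if math.hypot(vx, vy) < 1e-6:
        # Gaze is steady: fall back to the nearest target
        return min(targets,
                   key=lambda t: math.hypot(t[0] - x1, t[1] - y1))
    bearing = math.atan2(vy, vx)              # direction of travel

    def score(target):
        tx, ty = target
        heading = math.atan2(ty - y1, tx - x1)
        # Wrap the angular difference into [-pi, pi]
        diff = math.atan2(math.sin(heading - bearing),
                          math.cos(heading - bearing))
        # Prefer targets in the gaze's path, then nearer ones;
        # the 0.001 pixel-to-radian weight is an arbitrary choice
        return abs(diff) + 0.001 * math.hypot(tx - x1, ty - y1)

    return min(targets, key=score)
```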

"We hope that our eye-gaze tracking system can be used as an assistive technology for people with severe mobility impairment," Pradipta said. "We are also exploring the potential applications in military aviation and automotive environments where operators' hands are engaged with controlling an aircraft or vehicle."

The selection problem

One challenge that arises when designing such a system is that once the user's gaze reaches a target, how do they indicate that they want to select it? On a typical personal computer, this is accomplished with a click of the mouse; with a phone or tablet, a tap on the screen.

Basic eye-gaze tracking systems often use a signal such as a blink to indicate this choice. However, blinking is often not ideal. In combat situations, for example, a pilot's eyes might dry out, preventing them from blinking at the right moment.

Pradipta's team experimented with several ways to solve the selection problem, including manipulating joystick axes, enlarging predicted targets, and using a spoken keyword such as 'fire' to indicate a target.
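Two of those alternatives – enlarging the predicted target and confirming with a joystick – can be combined in a single hypothetical frame handler, reusing the predict_target sketch above. Again, every name here is an assumption made for illustration, not the team's code:

```python
def select_with_joystick(gaze_samples, targets, enlarge, confirm_pressed):
    """One frame of a multimodal selection loop: the predicted target
    is enlarged as feedback, and a joystick button, rather than a
    blink, confirms the choice.

    enlarge:         UI callback that grows the predicted widget
    confirm_pressed: True while the joystick trigger is held
    """
    predicted = predict_target(gaze_samples, targets)
    enlarge(predicted)   # a bigger target is easier to home in on
    return predicted if confirm_pressed else None

# Stub run: the gaze sweeps up-right from (100, 100) to (160, 130),
# so the target at (400, 250), which lies in its path, is selected.
samples = [(0.00, 100, 100), (0.05, 160, 130)]
targets = [(400, 250), (120, 480)]
print(select_with_joystick(samples, targets,
                           enlarge=lambda t: None,  # no real UI here
                           confirm_pressed=True))   # -> (400, 250)
```

A spoken keyword such as 'fire' could stand in for confirm_pressed in the same slot, giving the voice-based variant the team also tested.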

Unsurprisingly, they found that a mouse remains the fastest and least cognitively stressful way to select a target – helped, perhaps, by the fact that most computer users are already comfortable with it. But a multimodal approach combining eye-gaze tracking, predictive modelling and a joystick can almost match a mouse in accuracy and cognitive load. Further, among computer novices given sufficient training with the system, the intelligent multimodal approach can even be faster.

The hope is that these findings will lead to systems that perform as well as – or better than – a mouse. "I am very excited for the prospects of this research," Pradipta said. "When clicking a mouse isn't possible for everyone, we need something else that's just as good."

More information: "Eye-gaze Tracking Based Interaction in India," Procedia Computer Science, Volume 39, 2014, Pages 59-66, ISSN 1877-0509, dx.doi.org/10.1016/j.procs.2014.11.010

4 comments

antialias_physorg
not rated yet Apr 24, 2015
A very simple thing I'd like to see implemented is eye tracking to determine which window has focus. It's so frustrating to have mouse interactions or keyboard entry happen in the wrong window (sometimes it's even dangerous - e.g. when you think you're typing your password into a blanked field but are instead typing it into a word processor for everyone to see.)
cemery50
3 / 5 (1) Apr 24, 2015
Yay...a nod and a blink to multi-modal interfacing....combined with AI and training on the corpus of the user's interactions, hunching over a box may be over....I like google glass/immersive vr/augmented reality....I also wear glasses anyway....the more auto the better....
Pradipta
not rated yet Apr 29, 2015
antialias_physorg, I like your idea, although highlighting windows means changing their Z-order, which is do-able but needs a bit of hacking of OS code. However, we can activate on-screen items following eye-gaze for custom-made applications.

cemery50, we are presently exploring the use of eye glasses, though the present SDKs from commercial manufacturers are not great. The Tobii EyeX tracker works well with my -14 dioptre glasses.

More details on our research at www-edc.eng.cam.ac.uk/~pb400/Research.html
antigoracle
not rated yet Apr 29, 2015
This could revolutionize the porn industry.
