May 1, 2013 report
SideWays eye-tracking system shown at Paris conference (w/ video)
The eye-tracking system shown this week at the Conference on Human Factors in Computing Systems in Paris is called SideWays. It was developed by Andreas Bulling of the Max Planck Institute for Informatics in Saarbrücken, Germany, together with Yanxia Zhang and Hans Gellersen of Lancaster University in the UK.
The system can track what a passerby is looking at: it detects the faces of people walking by and calculates where the pupils sit relative to the eye corners. A store could, for example, reposition ads and displays so that the products drawing the most glances get the best placement.
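The core idea, locating the pupil between the two detected eye corners and classifying the gaze as left, center, or right, can be sketched as below. This is an illustrative reconstruction, not the authors' code; the normalization scheme and the `margin` threshold are assumptions.

```python
def classify_gaze(pupil_x, corner_a_x, corner_b_x, margin=0.15):
    """Classify horizontal gaze as 'left', 'center', or 'right' from the
    pupil's x-coordinate relative to the two eye-corner x-coordinates
    (all in image pixels).

    Illustrative sketch only: SideWays' actual feature extraction and
    thresholds are not reproduced here; margin=0.15 is an assumed value.
    Note that 'left'/'right' here are in image coordinates, which mirror
    the viewer's own left/right.
    """
    left_x, right_x = sorted((corner_a_x, corner_b_x))
    # Normalize the pupil position to [0, 1] across the eye's width.
    t = (pupil_x - left_x) / (right_x - left_x)
    if t < 0.5 - margin:
        return "left"
    if t > 0.5 + margin:
        return "right"
    return "center"
```

A pupil sitting near the midpoint of the eye corners maps to "center", the default attention state; a pupil displaced past the margin in either direction maps to a sidelong glance.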
The device uses a single camera positioned close to the screen. In their paper, "SideWays: A Gaze Interface for Spontaneous Interaction with Situated Displays," the researchers wrote, "SideWays robustly detects whether users attend to the center of the display or cast glances to the left or right."
The viewer can control the screen with eye movements alone, for example scrolling through items in a list. Attention to the center of the display is the default state, while sidelong glances trigger input or actions.
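The interaction model described above, center gaze as an idle default and sidelong glances as scroll commands, could be wired up roughly as follows. The class name, action behavior, and window size are hypothetical, chosen only to illustrate the mapping.

```python
from collections import deque


class GazeScroller:
    """Toy list scroller driven by coarse gaze input.

    Glances to the left or right rotate the visible window of items;
    looking at the center of the display does nothing (idle state).
    An illustrative assumption, not the SideWays implementation.
    """

    def __init__(self, items, visible=3):
        self.items = deque(items)
        self.visible = visible

    def on_gaze(self, direction):
        """Apply one coarse gaze reading ('left', 'right', or 'center')
        and return the items currently shown on the display."""
        if direction == "left":
            self.items.rotate(1)    # reveal earlier items
        elif direction == "right":
            self.items.rotate(-1)   # reveal later items
        # 'center' (or anything else) leaves the view unchanged.
        return list(self.items)[: self.visible]
```

In use, a loop would feed each frame's classified gaze direction into `on_gaze` and redraw the returned window, so sustained glances scroll continuously while central attention holds the view steady.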
The authors see their contribution as validating that "SideWays enables eye gaze as input for interactive displays, without the need of prior calibration or specialist hardware." The significance, they said, lies in achieving "robust gaze control, albeit coarse-grained, without need for calibration," which "means that our system is person-independent." Any user can walk up to a display fitted with their system and interact with it using their eyes alone.
"Person-independence and interaction without preparation are critical steps toward genuinely spontaneous interaction with displays we encounter in public environment."
© 2013 Phys.org