The EU DEEPVIEW project has shown that innovative displays with gaze-tracking devices can enhance a viewer's perception of depth, allowing greater exploration of an object of interest.
Researchers are creating innovative displays that let users explore focus, depth and colour, and open up new ways of presenting information, simply by tracking their gaze across the screen. The user's gaze alone is enough to direct focus, enhance depth perception and distinguish colours more clearly – without the need to point and click to call up data or to focus on the object of interest.
Innovative GAZER software
GAZER is the first application to come out of the DEEPVIEW project, carried out by SACHI, the Computer Human Interaction research group at the University of St Andrews in Scotland, UK. The software, developed following tests with over 50 users, works in conjunction with eye-tracking devices, allowing photographers who take pictures with light field cameras to explore images that refocus automatically on whatever object they are looking at.
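The core idea – refocusing a light-field image on whatever the viewer is looking at – can be sketched as follows. This is an illustrative example only, not the actual GAZER implementation: it assumes a precomputed depth map and a stack of images refocused at known depths, and the function and variable names are hypothetical.

```python
# Hypothetical sketch of gaze-driven refocusing (not the real GAZER code).
# Assumes: a depth map giving scene depth per pixel, and a list of focal
# depths at which refocused versions of the light-field image exist.

import numpy as np

def pick_focal_slice(gaze_xy, depth_map, focal_depths):
    """Return the index of the refocused slice whose focal depth is
    closest to the scene depth under the user's gaze point."""
    x, y = gaze_xy
    depth_at_gaze = depth_map[y, x]  # depth of the fixated object
    return int(np.argmin(np.abs(np.asarray(focal_depths) - depth_at_gaze)))

# Toy example: a 4x4 depth map and three available focal planes.
depth = np.array([[1.0, 1.0, 3.0, 3.0],
                  [1.0, 1.0, 3.0, 3.0],
                  [5.0, 5.0, 5.0, 5.0],
                  [5.0, 5.0, 5.0, 5.0]])
print(pick_focal_slice((2, 0), depth, [1.0, 3.0, 5.0]))  # → 1 (plane at depth 3.0)
```

In a real system the selected slice would then be blended into the display smoothly, so the refocus feels like natural accommodation rather than an abrupt switch.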
'We are exploring the potential uses of an exciting new area called gaze-based perceptual augmentation,' explained DEEPVIEW coordinator Dr Miguel Nacenta. 'Instead of moving a cursor around to focus, the gaze-contingent display (GCD) does it automatically through the position of the user's gaze. This creates a sensation of depth, 3D without the glasses if you like - a richer, more salient and natural way of seeing that is meant to enhance the viewer's experience.'
The use of GCDs
A GCD works by modifying what is displayed based on the information gathered from the eye tracker about the user's gaze – not just its location, but also metrics such as blinks, fixations and saccades. The aim is to do this in a way that does not let users perceive the system reacting to their gaze, but instead creates a holistically changed impression of the display.
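Fixations (the eye dwelling on a point) and saccades (rapid jumps between points) are typically separated by the speed of gaze movement. A minimal sketch of the standard velocity-threshold approach (I-VT) is shown below; the sampling rate and threshold are common illustrative values, not figures from the DEEPVIEW project.

```python
# Illustrative velocity-threshold (I-VT) classification of gaze samples.
# Assumption: samples are (x, y) positions in degrees of visual angle,
# taken at a fixed sampling rate; the 30 deg/s threshold is a common
# textbook default, not a project-specific value.

import math

def classify_samples(samples, hz=60.0, threshold_deg_per_s=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade'
    depending on whether gaze velocity exceeds the threshold."""
    labels = []
    for (x0, y0), (x1, y1) in zip(samples, samples[1:]):
        velocity = math.hypot(x1 - x0, y1 - y0) * hz  # deg/s between samples
        labels.append('saccade' if velocity > threshold_deg_per_s else 'fixation')
    return labels

# A still gaze followed by a rapid jump:
gaze = [(0.0, 0.0), (0.05, 0.0), (0.1, 0.0), (5.0, 0.0)]
print(classify_samples(gaze))  # → ['fixation', 'fixation', 'saccade']
```

A GCD can use such labels to time its changes: modifying the display during a saccade, when vision is briefly suppressed, helps keep the adaptation imperceptible.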
In the past, this technology has mostly been proposed for performance gains (i.e., to reduce rendering times) by selectively omitting detail in unattended parts of a display. DEEPVIEW's goal, however, is to find perceptual modifications that augment the displayed information and thus create an enhanced viewing experience for the user.
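The performance-oriented approach mentioned above – often called foveated rendering – reduces detail with angular distance from the gaze point. A minimal sketch, with illustrative thresholds and an assumed pixels-per-degree conversion (none of these values come from the project):

```python
# Hedged sketch of foveated level-of-detail selection: render full detail
# near the gaze point and progressively less in the periphery.
# The 5/15 degree bands and the ppd value are illustrative assumptions.

import math

def detail_level(region_center, gaze_point, ppd=40.0):
    """Map a region's angular distance from gaze to a level of detail:
    2 = full detail (fovea), 1 = reduced, 0 = coarse periphery."""
    dx = region_center[0] - gaze_point[0]
    dy = region_center[1] - gaze_point[1]
    eccentricity_deg = math.hypot(dx, dy) / ppd  # pixels -> degrees
    if eccentricity_deg < 5.0:
        return 2
    if eccentricity_deg < 15.0:
        return 1
    return 0

print([detail_level((x, 0), (0, 0)) for x in (0, 100, 300, 800)])  # → [2, 2, 1, 0]
```

DEEPVIEW inverts this logic: instead of taking detail away from the periphery to save computation, it adds perceptual richness around the gaze point.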
The project is investigating the use of GCDs in other ways, too, such as enhancing perception of colour and contrast, and for multimedia applications. Enhancing colour, so the user perceives a wider range than the monitor is capable of displaying, could in future become a highly useful tool – for example, in big data analysis.
In applying the technology to multimedia, users reading an article might unconsciously call up all sorts of complementary information – graphs, photos, videos, supplementary text – which offers itself as they scan the text paragraph by paragraph.
'The key will be to do this so that the reading experience is enriched, not disrupted,' Dr Nacenta pointed out. 'In fact, using gaze perception technology promises to be less disruptive than pointing and clicking on a cursor. You can take advantage of the natural behaviour of people looking at things, rather than asking them to interact explicitly with the system.'
Other possible uses for GCDs
The sky is literally the limit as far as gaze-based perceptual augmentation is concerned, as DEEPVIEW also turns its attention to astronomy.
'Astronomers sometimes need specific colour and depth in their work,' said Michael Mauderer, a doctoral student working on the project and the main developer of the GAZER application. 'They look at these very complex images they get from telescopes which go from infrared to ultraviolet, exceeding human vision bandwidths. We are working to help them view such data in a different way, which could eventually lead to new discoveries.'
The use of GCDs may eventually become widespread, according to Dr Nacenta – whether for enhancing the cinematic experience, helping doctors better interpret patterns in magnetic resonance images, or helping police solve crimes from sketchy CCTV footage.
More information: For more information please see the DEEPVIEW CORDIS project page: cordis.europa.eu/project/rcn/103570_en.html
Provided by CORDIS