Sound imaging: clever acoustics help blind people see the world (w/ Video)

(PhysOrg.com) -- Video from portable cameras is analysed to calculate the distance of obstacles and predict the movements of people and cars. This information is then transformed and relayed to a blind person as a three-dimensional ‘picture’ of sound.

The concept is deceptively simple, and two prototypes have been successfully tested. Laser and digital video cameras become the eyes of the blind wearer, seeing the objects and activity going on around them.

Researchers from the University of Bristol have developed powerful real-time image processing and some clever algorithms to then identify objects and obstacles, such as trees, street furniture, vehicles and people. The system uses the stereo images to create a “depth map” for calculating distances. The system can also analyse moving objects and predict where they are going.
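
The article does not give the project's actual algorithms, but the depth map it mentions rests on a standard stereo-vision relationship: an object's depth is inversely proportional to the disparity between its positions in the left and right images. The sketch below illustrates only that relationship; the focal length and camera baseline are assumed values, not CASBLiP parameters.

```python
import numpy as np

# Illustrative sketch only: depth from stereo disparity.
# depth = focal_length * baseline / disparity

FOCAL_LENGTH_PX = 700.0   # assumed focal length of the cameras, in pixels
BASELINE_M = 0.12         # assumed spacing between the two cameras, in metres

def depth_map_from_disparity(disparity_px: np.ndarray) -> np.ndarray:
    """Convert a per-pixel disparity map (pixels) into a depth map (metres)."""
    disparity = np.where(disparity_px > 0, disparity_px, np.nan)  # mask invalid pixels
    return FOCAL_LENGTH_PX * BASELINE_M / disparity

# Example: a feature shifted by 42 px between the two views lies roughly 2 m away
print(depth_map_from_disparity(np.array([[42.0]])))  # ~2.0
```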

So much for the image processing, but how do you present this visual information to a blind person? Technology developed at the University of Laguna in Spain makes it possible to transform spatial information into three-dimensional acoustic maps.

A blind person wears headphones and hears how sounds change as they move around. The stereo audio system makes it possible to place sounds so that the brain can interpret them as a point in space. Sounds get louder as you walk towards objects, quieter as you move away. Objects to your right are heard on your right, and if you move your head the sound moves too. And if something is heading right for you, you'll hear it coming, with a tone that tells you to get out of the way.
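
As a rough illustration of the cues described here, the sketch below maps an object's distance and bearing to left and right headphone gains: loudness rises as the object gets closer, and the balance shifts towards the side the object is on. The inverse-square gain law, constant-power panning and function names are assumptions for illustration, not the system's actual rendering.

```python
import math

def stereo_cue(distance_m: float, azimuth_deg: float) -> tuple[float, float]:
    """Return (left_gain, right_gain) for an object at a given distance and bearing.

    azimuth_deg: 0 = straight ahead, positive = to the wearer's right.
    """
    loudness = 1.0 / max(distance_m, 0.5) ** 2      # assumed inverse-square fall-off
    pan = math.sin(math.radians(azimuth_deg))       # -1 = hard left, +1 = hard right
    left = loudness * math.sqrt((1.0 - pan) / 2.0)  # constant-power panning
    right = loudness * math.sqrt((1.0 + pan) / 2.0)
    return left, right

# A tree 2 m away and 30 degrees to the right sounds louder in the right ear
print(stereo_cue(2.0, 30.0))
```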

The full picture

The EU-funded CASBLiP project was conceived to integrate the image processing and acoustic mapping technologies into a single, portable device that could be worn by blind people and help them to navigate outdoors.

The University of Laguna worked to adapt its acoustic mapping system and the University of Bristol refined its image processing algorithms. The device also incorporates a gyroscopic sensor developed by the University of Marche, Italy. This component, called the head-positioning sensor, detects how the wearer moves their head. It feeds back the position of the head and the direction it is facing, so that the relative positions of the sounds being played to the wearer also move as expected. For example, if you turn your head towards a sound on the right, the sound must move left towards the centre of the sound picture.
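
A minimal sketch of that compensation, assuming the sensor reports head yaw in degrees: the bearing at which a sound is rendered is the object's world bearing minus the head yaw, so turning towards a sound brings it to the centre of the sound picture. The function and variable names are illustrative, not taken from the project.

```python
def relative_bearing(object_bearing_deg: float, head_yaw_deg: float) -> float:
    """Bearing of the object relative to where the head faces, wrapped to (-180, 180]."""
    return (object_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0

# An object 40 degrees to the wearer's right...
print(relative_bearing(40.0, 0.0))   # 40.0 -> rendered on the right
# ...moves to the centre once the wearer turns their head 40 degrees to the right.
print(relative_bearing(40.0, 40.0))  # 0.0 -> rendered straight ahead
```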

Vision for the future

After three years, the consortium has produced two prototype devices mounted on a helmet. They have been tested successfully in trials by blind people in several real-world environments, including busy streets. Two organisations for the blind (the German Federation of the Blind and Partially Sighted and Italy's Francesco Cavazza Institute) were heavily involved in the testing programme.

The first design (M1) uses a laser sensor developed by Siemens and originally intended to detect passengers in cars. It can calculate the distance to objects from 0 to 5 m away within a 60º field of view. The sensor is mounted inside a pair of glasses and, because it works with infrared light, is invisible to others. The M1 has been extensively tested by blind users, who are able to recognise items such as chairs and trees from the sound picture they receive.
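
For illustration only, the following sketch encodes the two published limits, the 5 m range and the 60-degree field of view, as a simple check of whether a detected point would fall inside the M1 sensor's envelope; the implementation itself is assumed.

```python
# Only the two limits (5 m range, 60-degree field of view) come from the article.
MAX_RANGE_M = 5.0
FIELD_OF_VIEW_DEG = 60.0

def in_sensor_envelope(distance_m: float, azimuth_deg: float) -> bool:
    """True if a point at (distance, bearing from the optical axis) is detectable."""
    return 0.0 <= distance_m <= MAX_RANGE_M and abs(azimuth_deg) <= FIELD_OF_VIEW_DEG / 2.0

print(in_sensor_envelope(3.2, 20.0))  # True: within range and field of view
print(in_sensor_envelope(6.0, 10.0))  # False: beyond the 5 m range
```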

A second version (M2) adds two digital video cameras, one on either side of a helmet. It can detect moving objects and predict their paths.

The University of Marche has also worked closely with the Cavazza Institute to build a complementary GPS location system. This technology could be used to pinpoint the location of a blind person and integrate the device with additional data sources, such as mapping services. It could provide the wearer with verbal directions to their destination.
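
The article does not say how such directions would be generated. As a hedged sketch, the code below turns two GPS fixes into a distance, an initial bearing and a simple spoken-style instruction, using the standard haversine and bearing formulas; the phrasing and function names are illustrative assumptions.

```python
import math

EARTH_RADIUS_M = 6371000.0

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (m) and initial bearing (deg) from point 1 to point 2."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing

def verbal_direction(dist_m, bearing_deg, heading_deg):
    """Turn a destination bearing into a simple spoken-style instruction."""
    turn = (bearing_deg - heading_deg + 180.0) % 360.0 - 180.0
    side = "ahead" if abs(turn) < 20 else ("to your right" if turn > 0 else "to your left")
    return f"Destination {dist_m:.0f} metres {side}"

# Two nearby points (illustrative coordinates in Bristol)
d, b = distance_and_bearing(51.4545, -2.5879, 51.4560, -2.5860)
print(verbal_direction(d, b, heading_deg=0.0))
```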

“We know that the technology works,” says Guillermo Peris-Fajarnés, who coordinated the project from the Research Group on Graphic Technologies at the Universidad Politecnica de Valencia. “Our tests have been very successful and blind people have been able to navigate comfortably in controlled tests and even along a normal street.”

“There is still a lot of development work to do before this could go on the market, especially to prove that the system is 100% reliable,” Peris-Fajarnés notes. “You can't risk it going wrong while a user is crossing the road.”

He says the consortium has decided to continue work on this aspect beyond the end of the EU funding period.

Nevertheless, Peris-Fajarnés is confident that the device could be commercialised: “We are now looking for manufacturing partners to explore the possibilities for a commercially viable product. There's no other system like this available and it should complement existing aids, such as the white stick. But its commercial success will depend on miniaturising the system and mounting the cameras onto glasses.”

“In my personal opinion, anything that can offer some more autonomy to the blind person is positive,” said one tester.

CASBLiP received funding from the ICT strand of the EU’s Sixth Framework Programme for research.

More information: www.casblip.com/

Provided by ICT Results
