Sound imaging: clever acoustics help blind people see the world (w/ Video)

Jul 02, 2009

(PhysOrg.com) -- Video from portable cameras is analysed to calculate the distance of obstacles and predict the movements of people and cars. This information is then transformed and relayed to a blind person as a three-dimensional ‘picture’ of sound.

The concept is deceptively simple, and two prototypes have been successfully tested. Laser and digital video cameras act as eyes for the blind person, capturing the objects and activity going on around them.

Researchers from the University of Bristol have developed powerful real-time image processing and clever algorithms that identify objects and obstacles, such as trees, street furniture, vehicles and people. The system uses the stereo images to create a “depth map” for calculating distances, and it can also analyse moving objects and predict where they are going.
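The article does not describe the Bristol algorithms in detail, but the principle behind a stereo depth map is standard: an object that appears shifted between the left and right camera images lies at a depth inversely proportional to that shift. A minimal sketch (the function name and camera parameters here are illustrative, not from the project):

```python
# Illustrative sketch, not the CASBLiP code: for a rectified stereo pair,
# a feature shifted by `disparity` pixels between the left and right images
# lies at depth = focal_length * baseline / disparity. A dense depth map is
# just this formula applied to the disparity at every pixel.

def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Return distance in metres; larger disparity means a closer object."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a visible object")
    return focal_length_px * baseline_m / disparity_px

# Example: cameras 12 cm apart with a 700-pixel focal length.
# A feature shifted by 42 px lies 700 * 0.12 / 42 = 2.0 m away.
print(depth_from_disparity(42, 700, 0.12))  # 2.0
```

Nearby objects therefore yield large, easily measured disparities, while the accuracy degrades with distance, which suits a navigation aid focused on the few metres ahead.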

This video is in Spanish.

So much for the image processing, but how do you present this visual information to a blind person? Technology developed at the University of La Laguna in Spain makes it possible to transform spatial information into three-dimensional acoustic maps.

A blind person wears headphones and hears how sounds change as they move around. The stereo audio system makes it possible to place sounds so that the brain can interpret them as a point in space. Sounds get louder as you walk towards objects, quieter as you move away. Objects to your right are heard on your right, and if you move your head the sound moves too. And if something is heading right for you, you'll hear it coming, with a tone that tells you to get out of the way.
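The two cues described above, loudness falling off with distance and left/right balance following the object's bearing, can be sketched in a few lines. This is a simplified illustration of the idea, not the project's actual acoustic renderer (which builds full three-dimensional sound maps); the function and parameters are assumptions for the example:

```python
import math

# Illustrative sketch, not the La Laguna renderer: place a sound in a
# stereo field. Loudness follows an inverse-distance law (louder when
# closer) and the left/right balance follows the object's azimuth, so an
# obstacle on the right is heard mostly in the right ear.

def stereo_gains(azimuth_deg, distance_m):
    """Return (left_gain, right_gain) for an object at the given bearing.

    azimuth_deg: 0 = straight ahead, +90 = hard right, -90 = hard left.
    """
    distance_gain = 1.0 / max(distance_m, 0.1)   # louder as you approach
    pan = math.radians((azimuth_deg + 90) / 2)   # constant-power pan law
    return distance_gain * math.cos(pan), distance_gain * math.sin(pan)

left, right = stereo_gains(45.0, 2.0)  # object ahead-right, 2 m away
print(right > left)  # True: heard mostly in the right ear
```

A real renderer would add interaural time differences and head-related filtering so the brain perceives a genuine point in space rather than simple left/right balance, but the distance and direction cues work as above.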

The full picture

The EU-funded CASBLiP project was conceived to integrate the image processing and acoustic mapping technologies into a single, portable device that could be worn by blind people and help them to navigate outdoors.

The University of La Laguna worked to adapt its acoustic mapping system, and the University of Bristol refined its image-processing algorithms. The device also incorporates a gyroscopic sensor developed by the University of Marche, Italy. This component, called the head-positioning sensor, detects how the wearer moves his head. It feeds back the position of the head and the direction it is facing, so that the relative positions of the sounds played to the wearer move as expected. For example, if you turn your head towards a sound on your right, the sound must move left, towards the centre of the sound picture.
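The compensation the head-positioning sensor enables amounts to subtracting the head's yaw from each sound's world bearing, so sounds stay fixed in the world as the listener turns. A minimal sketch (the function name and angle convention are illustrative assumptions, not from the project):

```python
# Illustrative sketch, not the CASBLiP head-tracking code: keep a sound
# fixed in the world as the listener turns. The sound's azimuth relative
# to the head is its world azimuth minus the head's yaw, wrapped into
# (-180, 180]. Turning right towards a sound on the right therefore moves
# it towards the centre of the sound picture, exactly as described above.

def relative_azimuth(world_azimuth_deg, head_yaw_deg):
    """Bearing of the sound relative to the head, in (-180, 180] degrees."""
    rel = (world_azimuth_deg - head_yaw_deg) % 360
    return rel - 360 if rel > 180 else rel

# Object 40 degrees to the right; listener turns 40 degrees to the right:
print(relative_azimuth(40, 40))  # 0 -- the sound is now straight ahead
print(relative_azimuth(40, 60))  # -20 -- overshooting puts it to the left
```

Without this correction the whole sound picture would swing with the head, and the wearer could not use head movements to localise obstacles.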

Vision for the future

After three years, the consortium has produced two prototype devices mounted on a helmet. They have been tested successfully in trials by blind people in several real-world environments, including busy streets. Two blind institutions (the German Federation of the Blind and Partially Sighted and the Francesco Cavazza Institute, Italy) were heavily involved in the testing programme.

The first design (M1) uses a laser sensor developed by Siemens and originally intended to detect passengers in cars. It can calculate the distance to objects from 0 to 5 m within a 60° field of view. The sensor is mounted inside a pair of glasses and, because it uses infrared light, is invisible to others. The M1 has been extensively tested by blind users, who are able to recognise items such as chairs and trees from the sound picture they receive.

A second version (M2) adds two digital video cameras to either side of a helmet. It can detect moving objects and predict their path.

The University of Marche has also worked closely with the Cavazza Institute to build a complementary GPS location system. This technology could be used to pinpoint the location of a blind person and integrate the device with additional data sources, such as mapping services. It could provide the wearer with verbal directions to their destination.

“We know that the technology works,” says Guillermo Peris-Fajarnés, who coordinated the project from the Research Group on Graphic Technologies at the Universidad Politecnica de Valencia. “Our tests have been very successful and blind people have been able to navigate comfortably in controlled tests and even along a normal street.”

“There is still a lot of development work to do before this could go on the market, especially to prove that the system is 100% reliable,” Peris-Fajarnés notes. “You can't risk it going wrong while a user is crossing the road.”

He says the consortium has decided to continue work on this aspect beyond the end of the EU funding period.

Nevertheless, Peris-Fajarnés is confident that the device could be commercialised: “We are now looking for manufacturing partners to explore the possibilities for a commercially viable product. There's no other system like this available and it should complement existing aids, such as the white stick. But its commercial success will depend on miniaturising the system and mounting the cameras onto glasses.”

“In my personal opinion, anything that can offer some more autonomy to the blind person is positive,” said one tester.

CASBLiP received funding from the ICT strand of the EU’s Sixth Framework Programme for research.

More information: www.casblip.com/

Provided by ICT Results


User comments : 1


visual (Jul 03, 2009)
Something like this has existed for quite some time now - http://www.seeing...und.com/

It is not based on depth maps but on actual pixel colour (well, brightness anyway; it is essentially grey-scale), and it does not attempt any kind of motion analysis or other extras.

I wonder if at least the image-to-sound part of this new tech is based on, or similar to, the one I linked.