Research could lead to wearable sensors for the blind

Sep 28, 2011

Wearable sensors that allow the blind to "see" with their hands, bodies or faces could be on the horizon, thanks to a $2 million award from the National Science Foundation (NSF) to researchers at The City College of New York and Georgia Institute of Technology (Georgia Tech).

The grant, through the NSF "Emerging Frontiers in Research and Innovation" program, will fund a multidisciplinary team investigating devices for "alternative perception" and the principles underlying human-machine interaction. Alternative perception emulates vision by combining electronics and input from the other senses. In addition to aiding the visually impaired, the researchers expect the findings to lend themselves to other applications.

The grant is the first to result from a collaboration supported by CCNY's City SEED Grants program, an internal award of $50,000 in seed money to promote interdisciplinary faculty research partnerships. The program, initiated in the fall of 2010 by President Lisa S. Coico, requires grant recipients to include a plan to expand their projects and apply for further funding from other organizations.

The initial collaboration involved Dr. Zhigang Zhu, professor of computer science and computer engineering in City College's Grove School of Engineering and the principal investigator on the NSF grant; Dr. Tony Ro, professor of psychology; and Dr. Ying Li Tian, professor of electrical engineering. "The whole project needed something more interdisciplinary, so I looked for complementary research and found my neighbor Tony (Ro's laboratory) is right next door," said Professor Zhu.

"(This) was truly a good example of an interdisciplinary proposal and members with a complementary expertise -- not just similar overlapping expertise -- which is unusual," said CCNY Associate Provost for Research Larry Bank, who oversees the City SEED Grants program. "We must integrate input from the sciences, engineering and, often, art and humanities, to have a true understanding of phenomena."

The researchers joined forces to disentangle how humans learn to coordinate input from their senses -- e.g. vision, touch -- with movements, like reaching for a glass or moving through a crowded room. They will then map out how machines, such as robots and computers, learn similar tasks, in order to model devices that can assist humans.

The team, which combines expertise in engineering, computer science, neuroscience, motor control and biomechanics, envisions a multifunctional array of sensors on the body and has already developed prototypes for some of the devices. The full complement of wearable sensors would help a sightless person navigate by conveying information about his or her surroundings.

Professor Zhu works on navigation and obstacle detection by robots. For the project, he will focus on machine sensing and computer learning to understand the human-computer interaction.

He will also refine displays that would feed information from the electronic sensors to the human wearer of the device. His lab is already testing a sensor that can detect proximity to an object and convey its distance with vibration on the hand or other body part. As the wearer gets closer to a table, for example, the device gradually increases the intensity of the stimulation.
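The closer-means-stronger behavior described above amounts to a mapping from sensed distance to vibration intensity. The sketch below is purely illustrative; the linear ramp, the 2-meter range cutoff, and the function name are assumptions, not details of the lab's actual device:

```python
def vibration_intensity(distance_m, max_range_m=2.0):
    """Map a sensed distance (meters) to a vibration intensity in [0, 1].

    Closer objects produce stronger vibration; objects beyond
    max_range_m leave the motor off. A linear ramp is the simplest
    choice, though a real device might use a nonlinear curve.
    """
    if distance_m >= max_range_m:
        return 0.0          # out of range: no stimulation
    if distance_m <= 0.0:
        return 1.0          # touching or overlapping: full intensity
    return 1.0 - distance_m / max_range_m
```

A nonlinear (e.g. logarithmic) ramp might match perceived vibration magnitude better, which is exactly the kind of question the psychophysics side of the project addresses.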

Professor Ro, a neuroscientist, will provide a window into what is going on in the brain as sighted and visually impaired individuals navigate a room or virtual environment with and without devices to assist them. Using Professor Zhu's distance sensor, he is now testing how sensitive people are at discriminating vibrations to the hand that tell them how far the hand is from an object. He will determine whether they can make accurate judgments and whether they might be using the visual parts of the brain.
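Sensitivity of this kind is commonly estimated with an adaptive staircase, which narrows the intensity difference between two vibrations until the participant can no longer tell them apart. The simulation below is a generic sketch of that idea under a hypothetical deterministic observer; it is not the team's actual protocol, and all parameter values are assumptions:

```python
def run_staircase(can_discriminate, start_delta=0.50, step=0.02, trials=60):
    """Simulate a one-up/one-down staircase for a discrimination threshold.

    `can_discriminate(delta)` models the participant: it returns True if
    an intensity difference of `delta` is detected. After each correct
    response the difference shrinks by `step`; after each error it grows.
    The threshold is estimated from the last few reversal points, where
    responses flipped between correct and incorrect.
    """
    delta = start_delta
    reversals = []
    last_correct = None
    for _ in range(trials):
        correct = can_discriminate(delta)
        if last_correct is not None and correct != last_correct:
            reversals.append(delta)
        delta = max(step, delta - step if correct else delta + step)
        last_correct = correct
    tail = reversals[-6:] or [delta]
    return sum(tail) / len(tail)
```

With a hypothetical observer who detects any difference above 0.10, `run_staircase(lambda d: d > 0.10)` settles near 0.11, oscillating around the observer's true threshold.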

Professor Tian works on higher-level visual understanding by machines, such as detecting and identifying doors, exit signs, colors or stairs in a room. A system like this could audibly tell the wearer that an object on the floor was a cat or a footstool, for example.

Dr. Kok-Meng Lee, professor of mechanical engineering and director of the Advanced Intelligent Mechatronics Research Laboratory at Georgia Tech, who has expertise in mechatronics -- the combination of mechanics and the electronics of information systems -- works on machine vision and novel sensor designs. He will help develop the theory and methods for detecting objects thermally and magnetically and find out how this affects walking. This will help break down the essentials of orienting oneself in a new environment and navigating through it.

Dr. Boris Prilutsky, professor in the School of Applied Physiology at Georgia Tech, studies sensory feedback in motor control or how one learns and organizes movements -- like walking or reaching out -- using sensory information. He will look at how quickly people with normal and impaired vision can learn to use devices for alternative perception and help develop models for some of the findings.

The researchers hope their findings on perception, and the prototypes they develop, will spawn a raft of wearable electronic devices to help the blind to "see" their environment at a distance through touch, hearing and other senses. The technology would also benefit sighted individuals who must navigate in poor visibility, such as firefighters and pilots.

Such devices could outperform existing assistive technologies by providing more information and being lower in cost. There are even advantages over one of the best forms of assistance for the blind, the guide dog. "A service dog can't convey what is around you, it can just guide," notes Professor Ro. "This [device] can actually tell you how far things are or what things are in much more detail."


Provided by City College of New York



