Assistive technologies for the blind

December 15, 2004

Researchers at the University of California, Santa Cruz, are developing new assistive technologies for the blind based on advances in computer vision that have emerged from research in robotics. A "virtual white cane" is one of several prototype tools for the visually impaired developed by Roberto Manduchi, an assistant professor of computer engineering, and his students.

The traditional white cane is still the most common mobility device for the blind. It is a simple and effective tool that enables users to extend their sense of touch and "preview" the area ahead of them as they walk. But the long, rigid cane is not well-suited to all situations or all users.

Manduchi's high-tech alternative is a laser-based range-sensing device about the size of a flashlight. A laser, much like the one in an ordinary laser pointer, is combined with a digital camera and a computer processor that analyzes and integrates spatial information as the user moves the device back and forth over a scene. The user receives feedback about the scene in the form of audio signals, and an additional tactile interface is being developed for future prototypes.
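
The article does not detail how range is computed, but a laser-and-camera pair like this commonly measures distance by triangulation: the laser spot's apparent position in the image shifts according to how far away the surface it strikes is. Here is a minimal Python sketch of that idea, with the baseline and focal length as purely illustrative values:

    def spot_distance_m(pixel_offset: float,
                        baseline_m: float = 0.05,
                        focal_length_px: float = 800.0) -> float:
        """Estimate range to the laser spot by triangulation.

        pixel_offset    -- shift (in pixels) of the spot from where it
                           would appear at infinite range
        baseline_m      -- laser-to-camera separation (assumed value)
        focal_length_px -- camera focal length in pixels (assumed value)
        """
        if pixel_offset <= 0:
            return float("inf")  # spot at or beyond the usable range
        return baseline_m * focal_length_px / pixel_offset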

"In the audio signal, the pitch corresponds to distance, and there are also special sounds to indicate features such as a curb, step, or drop-off," Manduchi said.

Dan Yuan, a graduate student working with Manduchi on the virtual white cane project, built the initial prototype. The UCSC researchers are collaborating with the Smith-Kettlewell Eye Research Institute, a nonprofit research institute in San Francisco, on the virtual white cane and other projects.

"The people at Smith-Kettlewell are helping us to understand the real needs of the blind, and they have blind engineers who test the systems we develop," Manduchi said.

In another project, for example, Manduchi is working with Smith-Kettlewell scientist James Coughlan on a system that uses a compact device with a camera to detect and gather information from small labels or tags placed in key locations. For example, the tags might help a blind person locate a doctor's office in a medical building. The device would only work where tags have been placed in the environment, but the tags--small colored labels with bar codes on them--are very inexpensive and require no maintenance.
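
The detection pipeline is not described in the article, but a typical first step for spotting a distinctively colored label in a camera frame is a simple color threshold. The following toy sketch uses NumPy on an HSV image; the color bounds and pixel thresholds are illustrative assumptions:

    import numpy as np

    def find_colored_tag(hsv_frame: np.ndarray,
                         hue_range=(100, 130),  # e.g., a saturated blue label
                         min_sat=120, min_val=80,
                         min_pixels=50):
        """Return the (row, col) centroid of pixels matching the tag color,
        or None if too few pixels match."""
        h, s, v = hsv_frame[..., 0], hsv_frame[..., 1], hsv_frame[..., 2]
        mask = ((h >= hue_range[0]) & (h <= hue_range[1]) &
                (s >= min_sat) & (v >= min_val))
        ys, xs = np.nonzero(mask)
        if len(xs) < min_pixels:
            return None
        return float(ys.mean()), float(xs.mean())

Once a candidate tag region is located this way, the bar code printed on the label would be read to look up what the tag marks.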

"A blind person staying at a hotel could put a sticker on their door so they could easily find their way back to the room," Manduchi said. "Or I could put tags here in the Engineering 2 Building to help a blind visitor find my office."

The tags could be detected by a handheld computer with a simple camera, or even a camera phone, he said. Michi Mutsuzaki, a UCSC undergraduate working in Manduchi's lab, used a small handheld computer with a camera to develop a prototype device that can detect the colored targets.

A third collaboration with Smith-Kettlewell is a project Manduchi refers to as "MapQuest for the blind," in reference to the Internet map site.

"The problem is how to enable a blind person to explore a map," Manduchi said. "The current devices are braille maps, but those require a special printer. We want to create a feedback environment to enable a blind person to explore a map on the computer."

The feedback would be provided by a "force-feedback mouse," which vibrates to produce a variety of physical sensations the user can feel as the pointer moves across features on a computer screen. These devices are readily available, so the project involves creating software that will enable the blind to use a force-feedback mouse to "feel" their way through a map.
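
On the software side, this could amount to little more than a lookup from cursor position to map feature, with a haptic effect fired each time the pointer crosses into a new one. A minimal sketch follows; the feature data, effect names, and driver calls are all hypothetical stand-ins:

    # Map features as labeled rectangles: (name, effect, (x0, y0, x1, y1)).
    # Real map data would use street and region polygons, but the
    # hit-testing idea is the same.
    FEATURES = [
        ("Main Street", "ridge_buzz",   (100, 240, 540, 260)),
        ("City Park",   "soft_texture", (300,  50, 500, 200)),
    ]

    def play_haptic_effect(effect: str) -> None:
        print(f"[haptic] {effect}")  # stand-in for a force-feedback driver call

    def announce(text: str) -> None:
        print(f"[speech] {text}")    # stand-in for a text-to-speech call

    def feature_at(x: int, y: int):
        """Return (name, effect) for the feature under the pointer, if any."""
        for name, effect, (x0, y0, x1, y1) in FEATURES:
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name, effect
        return None

    _current = None  # feature the pointer was over on the last event

    def on_mouse_move(x: int, y: int) -> None:
        """Trigger feedback whenever the pointer enters a new feature."""
        global _current
        hit = feature_at(x, y)
        if hit != _current:
            _current = hit
            if hit is not None:
                name, effect = hit
                play_haptic_effect(effect)
                announce(name)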

Michele Clarke, an undergraduate at St. Mary's University of Minnesota, began working with Manduchi on this project last summer as a participant in UCSC's Summer Undergraduate Research Fellowship in Information Technology (SURF-IT) program, funded by the National Science Foundation. She is continuing to work on the project at St. Mary's during the current academic year.

Before coming to UC Santa Cruz in 2001, Manduchi worked for several years at NASA's Jet Propulsion Laboratory, applying computer vision technology to autonomous robotic systems.

"It is a natural evolution from helping a robot drive around to helping a blind person navigate their environment," he said.

Source: University of California, Santa Cruz
