Scientists design indoor navigation system for blind

May 18, 2012
Human-computer interaction researcher Eelke Folmer of the University of Nevada, Reno, watches as Dora Uchel, a university student, demonstrates the indoor navigation system for the visually impaired developed by Kostas Bekris and Folmer of the Computer Science Engineering Department. She was one of several visually impaired students and community members who helped test the low-cost accessible system that operates with a standard smartphone. Credit: Photo by Mike Wolterbeek, University of Nevada, Reno

University of Nevada, Reno computer science engineering team Kostas Bekris and Eelke Folmer presented their indoor navigation system for people with visual impairments at two national conferences in the past two weeks. The researchers explained how a combination of human-computer interaction and motion-planning research was used to build a low-cost accessible navigation system, called Navatar, which can run on a standard smartphone.

"Existing indoor navigation systems typically require the use of expensive and heavy sensors, or equipping rooms and hallways with radio-frequency tags that can be detected by a handheld reader and which are used to determine the user's location," Bekris, of the College of Engineering's Robotics Research Lab, said. "This has often made the implementation of such systems prohibitively expensive, with few systems having been deployed."

Instead, the University of Nevada, Reno navigation system uses digital 2D architectural maps that are already available for many buildings, along with low-cost sensors available in most smartphones, such as accelerometers and compasses, to help users with visual impairments navigate. The system locates and tracks the user inside the building, finds the most suitable path based on the user's special needs, and gives step-by-step instructions to the destination.
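The article does not describe Navatar's planner in detail, but route finding over an indoor map is commonly done with a shortest-path search on a graph of landmarks. The sketch below is illustrative only: the landmark names, distances, and the `avoid` preference (e.g. skipping stairs) are assumptions, not details from the actual system.

```python
import heapq

# Hypothetical indoor map: nodes are tactile landmarks, edges are
# walkable segments with distances in meters. Illustrative only.
GRAPH = {
    "entrance": {"hall_A": 12.0, "stairs_1": 8.0},
    "hall_A":   {"entrance": 12.0, "elevator": 6.0, "room_101": 9.0},
    "stairs_1": {"entrance": 8.0, "room_101": 5.0},
    "elevator": {"hall_A": 6.0, "room_101": 7.0},
    "room_101": {"hall_A": 9.0, "stairs_1": 5.0, "elevator": 7.0},
}

def shortest_path(graph, start, goal, avoid=()):
    """Dijkstra's algorithm; 'avoid' lists landmark-name prefixes the
    user prefers to skip (e.g. stairs), modeling 'special needs'."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for nbr, dist in graph[node].items():
            if any(nbr.startswith(a) for a in avoid):
                continue  # honor the user's route preference
            if nbr not in visited:
                heapq.heappush(queue, (cost + dist, nbr, path + [nbr]))
    return None

# Avoiding stairs forces the longer route via the main hall:
print(shortest_path(GRAPH, "entrance", "room_101", avoid=("stairs",)))
# → (21.0, ['entrance', 'hall_A', 'room_101'])
```

Each path segment can then be turned into a spoken instruction ("walk 12 meters to the hallway intersection"), with the landmarks doubling as the confirmation points described below.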

"Nevertheless, the smartphone's sensors, which are used to calculate how many steps the user has executed and her orientation, tend to pick up false signals," Folmer, who has developed exercise video games for the blind, said. "To synchronize the location, our system combines probabilistic algorithms and the natural capabilities of people with visual impairments to detect landmarks in their environment through touch, such as corridor intersections, doors, stairs and elevators."
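The idea of fusing noisy dead reckoning with user-confirmed landmarks is the core of particle-filter localization from robotics. The following one-dimensional sketch shows the principle only; the step length, noise values, and landmark position are made-up numbers, not parameters of the Navatar system.

```python
import math
import random

# Minimal 1-D sketch: particles track the user's position along a
# corridor from noisy step counts; confirming a landmark at a known
# map position (e.g. a door) collapses the uncertainty.
# All constants are illustrative, not from the actual system.
random.seed(1)

STEP_LEN = 0.7          # assumed average step length in meters
N = 500                 # number of particles
particles = [0.0] * N   # everyone starts at the entrance

def on_step():
    """Dead reckoning: advance each particle by one noisy step."""
    global particles
    particles = [p + random.gauss(STEP_LEN, 0.15) for p in particles]

def on_landmark(position, sigma=0.5):
    """User confirmed a landmark at a known position: weight each
    particle by its closeness to the landmark, then resample."""
    global particles
    weights = [math.exp(-((p - position) ** 2) / (2 * sigma ** 2))
               for p in particles]
    particles = random.choices(particles, weights=weights, k=N)

def estimate():
    """Current best position estimate: the particle mean."""
    return sum(particles) / len(particles)

for _ in range(10):     # the user walks about ten steps...
    on_step()
on_landmark(7.0)        # ...and confirms the door known to be 7 m in
print(round(estimate(), 1))   # estimate is pulled close to 7.0
```

This is why a cane user's ability to recognize doors, intersections, and stairs by touch is so valuable: each confirmation acts as an absolute position fix that cancels the drift accumulated from the phone's inertial sensors.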

Folmer explained that because touch screen devices are challenging for users with visual impairments to use, directions are provided using synthetic speech, and users confirm the presence of a landmark verbally or by pressing a button on the phone or on a Bluetooth headset. A benefit of this approach is that users can leave the phone in a pocket, keeping both hands free for using a cane and recognizing tactile landmarks.

"This is a very cool mix of disciplines, using the user as a sensor combined with sophisticated localization algorithms from the field of robotics," Folmer, of the University's Computer Science Engineering Human-Computer Interaction Lab, said.

The team is currently trying to implement their navigation system in other environments and integrate it into outdoor navigation systems that use GPS.

"My research is motivated by the belief that a disability can be turned into an innovation driver," Folmer said. "When we try to solve interaction design problems for the most extreme users, such as users with visual impairments, there is the potential to discover solutions that may benefit anyone. Though the navigation system was specifically developed for users with visual impairments, it can be used by sighted users as well."

For their work on the indoor navigation system for the blind, Bekris and Folmer recently won a PETA Proggy Award for Leadership in Ethical Science. PETA's Proggy Awards ("Proggy" is for "progress") recognize animal-friendly achievements. The system was deemed such an achievement because it could decrease the need to rely on guide dogs.

They presented and demonstrated their research at the IEEE International Conference on Robotics and Automation in St. Paul, Minn., on May 15, and on May 7 at the ACM SIGCHI Conference on Human Factors in Computing Systems, which is the premier international conference on human-computer interaction.

More information: For more information on the system, visit eelke.com/navatar
