EyeRing helps visually impaired point, press, and hear information

August 12, 2012 by Nancy Owano

Beyond canes and seeing-eye dogs, there is always room for new technology that helps the visually impaired with daily tasks beyond just walking and navigating sidewalks safely. MIT researchers have come up with a novel way for the visually impaired to independently identify objects and learn more about them. “EyeRing is a wearable intuitive interface that allows a person to point at an object to see or hear more information about it,” say the researchers. EyeRing is in fact a system made up of a ring, a smartphone, and an earpiece.

The user points to an object with a camera-equipped ring worn on the finger. The ring captures an image and sends it to a smartphone for processing: the wearer simply points the ring at a word or item, snaps a photo, and an app on the phone speaks the word or describes the item.

In detail, the researchers describe the design as a micro camera worn as a ring on the index finger, with a button on the side that can be pushed with the thumb to take a picture or a video, which is then sent wirelessly to a smartphone to be analyzed.

EyeRing - A finger-worn visual assistant from Fluid Interfaces on Vimeo.

A “computation element embodied as a mobile phone” is in turn accompanied by the earpiece for “information loopback.” The finger-worn device is autonomous and wireless. A single button initiates the interaction. Information transferred to the phone is processed, and the results are transmitted to the headset for the user to hear.
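
The loop the researchers describe is simple enough to sketch. The following minimal Java outline only illustrates that capture-analyze-speak cycle; the RingCamera, VisionEngine, and Earpiece names are hypothetical stand-ins, not part of the EyeRing implementation.

```java
// Hypothetical sketch of the EyeRing interaction loop described above.
// RingCamera, VisionEngine, and Earpiece are illustrative names, not the team's API.
public final class EyeRingLoop {

    interface RingCamera { byte[] awaitButtonPressAndCapture(); } // finger-worn device
    interface VisionEngine { String analyze(byte[] jpeg); }       // phone-side processing
    interface Earpiece { void speak(String text); }               // information loopback

    static void run(RingCamera ring, VisionEngine engine, Earpiece ear) {
        while (true) {
            byte[] image = ring.awaitButtonPressAndCapture(); // a single button starts the interaction
            String result = engine.analyze(image);            // image processed on the phone
            ear.speak(result);                                 // result read back through the headset
        }
    }
}
```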

Several videos about EyeRing have been made. One shows a person making his way through a retail clothing store, touching t-shirts on a rack as he tries to find his preferred color and size and to learn the price. He points his EyeRing finger at a shirt to hear that it is gray, then points at the price tag to find out how much the shirt costs.

The researchers note that a user needs to pair the finger-worn device with the mobile phone application only once. “Henceforth a Bluetooth connection will be automatically established when both are running.”
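
On Android, that pair-once, reconnect-automatically behavior typically looks like the sketch below, which finds the already-bonded ring among the phone's paired devices and opens a serial link over RFCOMM using the standard Bluetooth API. This is a generic illustration, not EyeRing's actual connection code; the device name and the SPP service UUID are assumptions.

```java
import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.util.UUID;

public final class RingConnector {
    // Standard Serial Port Profile UUID; the service UUID EyeRing actually uses is not published.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    /** Looks up the previously paired ring among bonded devices and opens an RFCOMM link. */
    static BluetoothSocket connectToRing(String ringName) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        for (BluetoothDevice device : adapter.getBondedDevices()) {
            if (ringName.equals(device.getName())) {
                BluetoothSocket socket = device.createRfcommSocketToServiceRecord(SPP_UUID);
                socket.connect(); // blocks until the serial link is established
                return socket;
            }
        }
        throw new IOException("Ring not paired: " + ringName);
    }
}
```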

The Android application on the mobile phone analyzes the image using the team’s computer vision engine. The type of analysis and response depends on the pre-set mode, for example, color, distance, or currency. “Upon analyzing the image data, the Android application uses a Text to Speech module to read out the information through a headset,” according to the researchers.
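
A minimal sketch of that mode-dependent readout, using Android's standard TextToSpeech API, might look as follows; the Mode enum and the announce method are hypothetical placeholders, since the team's vision engine interface is not public.

```java
import android.content.Context;
import android.speech.tts.TextToSpeech;

public final class EyeRingSpeaker implements TextToSpeech.OnInitListener {
    // Pre-set modes named in the article; the real engine's interface is not public.
    enum Mode { COLOR, DISTANCE, CURRENCY }

    private final TextToSpeech tts;

    EyeRingSpeaker(Context context) {
        tts = new TextToSpeech(context, this); // bind to the platform TTS engine
    }

    @Override
    public void onInit(int status) {
        // Engine ready; language and voice selection omitted for brevity.
    }

    /** Reads an analysis result aloud through the headset, e.g. "gray" or "five dollars". */
    void announce(Mode mode, String result) {
        String phrase = (mode == Mode.COLOR) ? "Color: " + result
                      : (mode == Mode.DISTANCE) ? "Distance: " + result
                      : result;
        tts.speak(phrase, TextToSpeech.QUEUE_FLUSH, null); // 2012-era speak() overload
    }
}
```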

The MIT group behind EyeRing consists of Suranga Nanayakkara, visiting faculty in the Fluid Interfaces group at the MIT Media Lab and also a professor at the Singapore University of Technology and Design; Roy Shilkrot, a first-year doctoral student in the group; and Pattie Maes, associate professor and founder of the Media Lab’s Fluid Interfaces group.

The EyeRing concept is promising, but the team expects the prototype to evolve through further iterations. They are now at the stage of proving it is a viable solution while continuing to improve it; the EyeRing creators say that their work “is still very much a work in progress.” The current implementation uses a TTL serial JPEG camera, a 16 MHz AVR processor, a Bluetooth module, a 3.7 V lithium-ion polymer battery, a 3.3 V regulator, and a push-button switch. For the next prototype, they are developing more advanced capabilities such as a real-time video feed from the camera, higher computational power, and additional sensors including gyroscopes and a microphone.

More information: fluid.media.mit.edu/publications/EyeRing-CHI2012-WIP_V12_cameraReady.pdf
