EyeRing helps visually impaired point, press, and hear information

Aug 12, 2012 by Nancy Owano

Beyond canes and seeing-eye dogs, there is always room for more technology ideas to help the visually impaired with daily tasks that go beyond just walking and navigating sidewalks safely. MIT researchers have come up with a novel way for the visually impaired to independently identify objects and learn more about them. “EyeRing is a wearable intuitive interface that allows a person to point at an object to see or hear more information about it,” say the researchers. Their EyeRing is actually a system made up of a ring, a smartphone, and an earpiece.

The user points at an object with a camera-equipped ring worn on the finger; the ring captures an image and sends it to a smartphone for processing. The idea is that the wearer simply points the ring at a word or item, snaps a photo, and an app on the phone speaks the word or describes the item.

In detail, the researchers describe the design as a micro camera worn as a ring on the index finger, with a button on the side that can be pushed with the thumb to take a picture or a video that is then sent wirelessly to a smartphone to be analyzed.

EyeRing - A finger-worn visual assistant from Fluid Interfaces on Vimeo.

A “computation element embodied as a mobile phone” is in turn accompanied by the earpiece for “information loopback.” The finger-worn device is autonomous and wireless. A single button initiates the interaction. Information transferred to the phone is processed, and the results are transmitted to the headset for the user to hear.
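To make that loop concrete, here is a minimal Java sketch of the phone-side receive step. The actual ring-to-phone wire format is not published, so the length-prefixed JPEG framing below is purely an assumption for illustration.

import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;

public class FrameReader {

    /**
     * Reads one captured image from the ring's wireless stream.
     * The framing here (a 4-byte big-endian length followed by the
     * JPEG bytes) is an assumed protocol for illustration only; the
     * real EyeRing wire format is not described in the source.
     */
    public static byte[] readFrame(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        int length = data.readInt();   // frame size announced up front
        byte[] jpeg = new byte[length];
        data.readFully(jpeg);          // block until the whole image arrives
        return jpeg;
    }
}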

Several videos about EyeRing have been made. One shows a person making his way through a retail clothing store, touching t-shirts on a rack as he tries to find his preferred color and size and to learn the price. He points his EyeRing finger at a shirt to hear that it is gray, then points at the price tag to find out how much the shirt costs.

The researchers note that a user needs to pair the finger-worn device with the mobile phone application only once. “Henceforth a Bluetooth connection will be automatically established when both are running.”
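On Android, this pair-once, reconnect-automatically behavior maps naturally onto the standard Bluetooth API. The sketch below shows one way it could work, assuming the ring exposes a conventional RFCOMM serial-port service; the service UUID choice and the device-name lookup are assumptions, not details from the EyeRing publication.

import android.bluetooth.BluetoothAdapter;
import android.bluetooth.BluetoothDevice;
import android.bluetooth.BluetoothSocket;
import java.io.IOException;
import java.util.UUID;

public class RingConnector {

    // The well-known Serial Port Profile UUID; whether the EyeRing
    // firmware actually advertises SPP is an assumption.
    private static final UUID SPP_UUID =
            UUID.fromString("00001101-0000-1000-8000-00805F9B34FB");

    /** Reconnects to an already-paired ring without user interaction. */
    public static BluetoothSocket connect(String ringName) throws IOException {
        BluetoothAdapter adapter = BluetoothAdapter.getDefaultAdapter();
        if (adapter == null) {
            throw new IOException("No Bluetooth adapter available");
        }
        // Scan the bonded (previously paired) devices rather than
        // running a fresh discovery, so no pairing dialog is needed.
        for (BluetoothDevice device : adapter.getBondedDevices()) {
            if (ringName.equals(device.getName())) {
                BluetoothSocket socket =
                        device.createRfcommSocketToServiceRecord(SPP_UUID);
                socket.connect();   // blocks until the RFCOMM link is up
                return socket;
            }
        }
        throw new IOException("EyeRing not paired: " + ringName);
    }
}

Because the operating system stores the bond after the one-time pairing, the lookup over bonded devices succeeds on every subsequent launch without prompting the user.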

The Android application on the mobile phone analyzes the image using the team’s computer vision engine. The type of analysis and response depends on the pre-set mode, for example, color, distance, or currency. “Upon analyzing the image data, the Android application uses a Text to Speech module to read out the information through a headset,” according to the researchers.
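Since the team’s computer vision engine is not public, the following sketch only illustrates the flow they describe: dispatch on the pre-set mode, then hand the resulting text to Android’s TextToSpeech module. The Mode enum and the analyzer stubs are hypothetical placeholders marking where the real engine would plug in.

import android.content.Context;
import android.graphics.Bitmap;
import android.speech.tts.TextToSpeech;

public class EyeRingSpeaker {

    /** Hypothetical pre-set modes, named after the examples in the article. */
    public enum Mode { COLOR, DISTANCE, CURRENCY }

    private final TextToSpeech tts;

    public EyeRingSpeaker(Context context) {
        // Bring up Android's built-in Text-to-Speech engine.
        tts = new TextToSpeech(context, status -> { /* engine ready */ });
    }

    /** Analyzes the captured frame per the pre-set mode, then speaks the result. */
    public void describe(Bitmap frame, Mode mode) {
        String result;
        switch (mode) {
            case COLOR:    result = analyzeColor(frame);    break;
            case DISTANCE: result = analyzeDistance(frame); break;
            case CURRENCY: result = analyzeCurrency(frame); break;
            default:       result = "Unsupported mode";     break;
        }
        // QUEUE_FLUSH interrupts any earlier utterance so the spoken
        // feedback always matches the latest snapshot.
        tts.speak(result, TextToSpeech.QUEUE_FLUSH, null);
    }

    // Placeholder analyzers: the team's actual computer vision engine
    // is not public, so these stubs only mark where it would plug in.
    private String analyzeColor(Bitmap frame)    { return "color gray"; }
    private String analyzeDistance(Bitmap frame) { return "about two feet"; }
    private String analyzeCurrency(Bitmap frame) { return "five dollars"; }
}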

The MIT group behind EyeRing comprises Suranga Nanayakkara, a visiting faculty member in the Fluid Interfaces group at the MIT Media Lab and a professor at the Singapore University of Technology and Design; Roy Shilkrot, a first-year doctoral student in the group; and Patricia Maes, associate professor and founder of the Media Lab’s Fluid Interfaces group.

The EyeRing concept is promising, but the team expects the prototype to evolve through further iterations. They are now at the stage of proving it is a viable solution while continuing to improve it; the EyeRing creators say their work “is still very much a work in progress.” The current implementation uses a TTL serial JPEG camera, a 16 MHz AVR processor, a Bluetooth module, a 3.7 V lithium-ion polymer battery, a 3.3 V regulator, and a push-button switch. They also look forward to a device with more advanced capabilities, such as a real-time video feed from the camera, higher computational power, and additional sensors such as gyroscopes and a microphone; these capabilities are in development for the next prototype of EyeRing.


More information: fluid.media.mit.edu/publicatio… _V12_cameraReady.pdf
fluid.media.mit.edu/people/sur… current/eyering.html
