EyeRing helps visually impaired point, press, and hear information

Aug 12, 2012 by Nancy Owano

Beyond canes and seeing-eye dogs, there is always room for new technology that helps the visually impaired with daily tasks beyond just walking and navigating sidewalks safely. MIT researchers have come up with a novel way for the visually impaired to independently identify objects and learn more about them. “EyeRing is a wearable intuitive interface that allows a person to point at an object to see or hear more information about it,” say the researchers. EyeRing is actually a system made up of a ring, a smartphone, and an earpiece.

The user points at an object with a camera-equipped ring worn on the finger, which captures an image and sends it to a smartphone for processing. The idea is that the wearer simply points the ring at a word or item, snaps a photo, and an app on the phone speaks the word or describes the item.

In detail, the researchers describe the design as a micro camera worn as a ring on the index finger, with a button on the side that can be pushed with the thumb to take a picture or a video, which is then sent wirelessly to a smartphone to be analyzed.

EyeRing - A finger-worn visual assistant from Fluid Interfaces on Vimeo.

A “computation element embodied as a mobile phone” is in turn accompanied by the earpiece for “information loopback.” The finger-worn device is autonomous and wireless. A single button initiates the interaction. Information transferred to the phone is processed, and the results are transmitted to the headset for the user to hear.
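The interaction loop described above, from button press through processing to spoken feedback, can be sketched as follows. This is an illustrative outline only; the function names and return strings are hypothetical stand-ins, not part of the EyeRing implementation.

```python
# Hypothetical sketch of EyeRing's single-button interaction loop:
# ring captures an image, phone analyzes it, earpiece speaks the result.

def analyze(image_bytes, mode):
    """Stand-in for the phone-side computer vision processing."""
    return f"{mode} result for a {len(image_bytes)}-byte image"

def speak(text):
    """Stand-in for the text-to-speech loopback to the earpiece."""
    return f"[spoken] {text}"

def on_button_press(capture, mode="color"):
    image = capture()                      # ring: micro camera takes a picture
    return speak(analyze(image, mode))     # phone processes, earpiece reads out

# Simulate one press with a fake 3-byte "image"
print(on_button_press(lambda: b"\x01\x02\x03"))
```

The single entry point mirrors the article's point: one button initiates the whole interaction, with no further input from the user.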

Several videos about EyeRing have been made. One shows a person making his way through a retail clothing store, touching t-shirts on a rack as he tries to find his preferred color and size and to learn the price. He points his EyeRing finger at a shirt to hear that it is gray, then points at the price tag to find out how much the shirt costs.

The researchers note that a user needs to pair the finger-worn device with the mobile phone application only once. “Henceforth a Bluetooth connection will be automatically established when both are running.”
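The pair-once, reconnect-automatically behavior can be sketched like this. The address format and storage scheme are invented for illustration; the real app would rely on Android's Bluetooth stack.

```python
# Sketch of one-time pairing followed by automatic reconnection.
# All details here are illustrative, not from the EyeRing app.

paired = {}

def pair_once(ring_address):
    paired["ring"] = ring_address       # one-time pairing step

def connect():
    address = paired.get("ring")
    if address is None:
        return "pairing required"
    return f"connected to {address}"    # automatic on later launches

pair_once("00:11:22:33:44:55")
print(connect())
```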

The Android application on the mobile phone analyzes the image using the team’s computer vision engine. The type of analysis and response depends on the pre-set mode, for example, color, distance, or currency. “Upon analyzing the image data, the Android application uses a Text to Speech module to read out the information through a headset,” according to the researchers.
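The mode-dependent analysis can be pictured as a simple dispatch: the pre-set mode selects which analysis runs, and the resulting string is what the Text to Speech module would read out. The mode names follow the article; the handlers are invented placeholders for the team's computer vision engine.

```python
# Illustrative mode dispatch for the phone-side app. Each handler stands
# in for a real analysis (color detection, distance estimation, currency
# recognition) and returns the phrase to be spoken.

MODES = {
    "color":    lambda image: "color is gray",
    "distance": lambda image: "about two feet away",
    "currency": lambda image: "a five dollar bill",
}

def describe(image, mode):
    handler = MODES.get(mode)
    return handler(image) if handler else "unsupported mode"

print(describe(b"jpeg-bytes", "color"))
```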

The MIT group behind EyeRing consists of Suranga Nanayakkara, visiting faculty in the Fluid Interfaces group at MIT Media Lab and also a professor at Singapore University of Technology and Design; Roy Shilkrot, a first-year doctoral student in the group; and Patricia Maes, associate professor and founder of the Media Lab’s Fluid Interfaces group.

The EyeRing concept is promising, but the team expects the prototype to evolve through further iterations. Having shown it is a viable solution, they now seek to make it better; the creators say their work “is still very much a work in progress.” The current implementation uses a TTL Serial JPEG Camera, a 16 MHz AVR processor, a Bluetooth module, a 3.7V lithium-polymer battery, a 3.3V regulator, and a push button switch. They also envision a device with advanced capabilities such as a real-time video feed from the camera, higher computational power, and additional sensors like gyroscopes and a microphone. These capabilities are in development for the next prototype of EyeRing.
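The parts list above suggests a very simple firmware control flow on the ring itself: wait for the button, grab a JPEG frame over the camera's TTL serial link, and stream it out over Bluetooth. The sketch below is Python pseudocode of that flow under those assumptions; the actual firmware runs on the 16 MHz AVR and is not published here.

```python
# Rough control-flow sketch of the ring firmware implied by the parts list
# (push button, TTL serial JPEG camera, Bluetooth module). Illustrative only.

def firmware_step(button_pressed, read_camera_frame, bluetooth_send):
    if not button_pressed:
        return "idle"                   # wait for the thumb button
    jpeg = read_camera_frame()          # grab a frame over TTL serial
    bluetooth_send(jpeg)                # stream the JPEG to the paired phone
    return "sent"

outbox = []
print(firmware_step(True, lambda: b"\xff\xd8", outbox.append))
```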


More information: fluid.media.mit.edu/publications/EyeRing-CHI2012-WIP_V12_cameraReady.pdf
