Researchers develop magnifying smartphone screen app for visually impaired

April 22, 2016, Massachusetts Eye and Ear Infirmary
Demonstration of the magnifying smartphone application using Google Glass for visually impaired users. Credit: Gang Luo, Ph.D.

Researchers from the Schepens Eye Research Institute of Massachusetts Eye and Ear/Harvard Medical School have developed a smartphone application that projects a magnified smartphone screen to Google Glass, which users can navigate using head movements to view a corresponding portion of the magnified screen. They have shown that the technology can potentially benefit low-vision users, many of whom find the smartphone's built-in zoom feature to be difficult to use due to the loss of context. Their results are published online in the journal IEEE Transactions on Neural Systems and Rehabilitation Engineering.

"When people with low visual acuity zoom in on their smartphones, they see only a small portion of the screen, and it's difficult for them to navigate around—they don't know whether the current position is in the center of the screen or in the corner of the screen," said senior author Gang Luo, Ph.D., associate scientist at Schepens Eye Research Institute of Mass. Eye and Ear and an associate professor of ophthalmology at Harvard Medical School. "This application transfers the image of smartphone screens to Google Glass and allows users to control the portion of the screen they see by moving their heads to scan, which gives them a very good sense of orientation."

An estimated 1.5 million Americans over the age of 45 suffer from low vision, severe visual impairment caused by a variety of conditions. People with low vision often have great difficulty reading and discerning fine details. Magnification is considered the most effective method of compensating for visual loss. The researchers developed the head-motion application to address the limitations of conventional smartphone screen zooming, which does not provide sufficient context and can be painstaking to navigate.

In an evaluation of the new technology, the researchers observed two groups of research subjects (one group that used the head-motion Google Glass application and the other that used the built-in zoom feature on a smartphone) and measured the time it took them to complete certain tasks. They showed that the head-based navigation method reduced the average trial time by about 28 percent compared with conventional manual scrolling.
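As an illustration of the arithmetic behind that figure, the snippet below computes a relative reduction in average trial time; the two timing values are made up for the example and are not the measurements reported in the paper.

```java
// Illustration only, with made-up numbers: how a "28 percent reduction" in
// average trial time is computed from two average times.
public class ReductionDemo {
    public static void main(String[] args) {
        double manualScrollSeconds = 50.0;   // hypothetical average with built-in zoom
        double headMotionSeconds   = 36.0;   // hypothetical average with the Glass app
        double reduction = (manualScrollSeconds - headMotionSeconds) / manualScrollSeconds;
        System.out.printf("Relative reduction: %.0f%%%n", reduction * 100); // prints 28%
    }
}
```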

As next steps for the project, the researchers would like to incorporate more gestures on the Google Glass to interact with smartphones. They would also like to study the effectiveness of head-motion-based navigation compared to other commonly used smartphone accessibility features, such as voice-based navigation.

"Given the current heightened interest in smart glasses, such as Microsoft's Hololens and Epson's Moverio, it is conceivable to think of a smart glass working independently without requiring a paired mobile device in near future." said first author Shrinivas Pundlik, Ph.D. "The concept of head-controlled screen navigation can be useful in such glasses even for people who are not visually impaired."

To see a demonstration of the technology, please watch the video created by the researchers.
