Beyond Google Glass: Researcher looks to the future

Aug 20, 2013 by Shelley Littin

(Phys.org) — A wearable display being developed by UA optical scientist Hong Hua could have capabilities even more advanced than those of the recently unveiled Google Glass, a pair of glasses with smartphone functions.

University of Arizona associate professor of optical sciences Hong Hua is developing technology that could make a wearable display that is lighter, easier to use and has finer and more varied capabilities than the recently rolled-out Google Glass.

Imagine strolling down the street wearing a new pair of glasses – but these are no ordinary shades. A minuscule computer lodged in the frame projects text onto the lenses before your eyes, reflecting the light so that the information appears to hang about arm's length in front of you, or a little farther, yet only you can read it. You can control the device by voice, call up a map giving you directions, read text messages, and take photographs and video.
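
How can a chip-sized display seem to sit at arm's length? A common trick in near-eye optics is to place the microdisplay just inside the focal point of an eyepiece, so the eye sees a magnified virtual image far in front of it. The sketch below works through the thin-lens arithmetic with illustrative numbers; the focal length and spacing are assumptions, not specifications of Hua's or Google's hardware.

```python
def virtual_image_distance_mm(focal_length_mm, display_distance_mm):
    """Thin-lens estimate of where the virtual image of a microdisplay appears.

    With the display just inside the eyepiece's focal point
    (display_distance < focal_length) the image is virtual and forms far in
    front of the eye, which is what makes the text seem to float in space.
    """
    # Gaussian lens equation, solved for the (virtual) image distance.
    return 1.0 / (1.0 / display_distance_mm - 1.0 / focal_length_mm)

# Illustrative numbers: a 25 mm eyepiece with the microdisplay 24 mm away
print(virtual_image_distance_mm(25.0, 24.0))  # ~600 mm, roughly arm's length
```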

Your glasses can do essentially everything a smartphone can do, all in a wearable, lightweight, transparent display.

The recently unveiled Google Glass product can do all this, but Hua is working to take the technology a step beyond even those capabilities.

"Google Glass is not intended for applications that require a large field of view and ," Hua said. She said the device she is developing will have a much larger field of view to create a sense of a large . Her system can be configured to display information in both eyes; it has the capability of creating the sense that you are seeing three-; and it will display true high-definition images.

"Google Glass is also not intended for 3-D capability and not optimal for applications in virtual and ," Hua added.

Hua works with augmented reality, or AR, technology that she said is related to virtual reality but still allows the user to interact with the real world.

"For you wear a display and you are totally immersed into a computer generated image so you don't see the outside reality. For augmented reality you are not going to generate the entire view digitally. You are going to use the surrounding environment and can pick and select what information you want to display."

"If you wear this device on the street, of course you want to be able to see the outside world," Hua said, which is where the augmented reality technology comes into play. "You could use this type of device to superpose digital information on top of your real world to assist your daily activity."

For example, let's say that at some point during your walk you feel a grumble in your stomach and begin to scan the storefronts in search of a place to grab lunch. Through your glasses, translucent lettering would pop up in front of you, overlaying the restaurant across the street with relevant information such as menu items and prices.
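
Geometrically, an overlay like that amounts to projecting the restaurant's position, expressed in the wearer's frame of reference, onto display coordinates and drawing the label there. The sketch below does this with a simple pinhole model; the field of view, resolution and coordinates are made-up values for illustration, not the actual rendering pipeline of Hua's system.

```python
import math

def project_to_display(x, y, z, fov_h_deg=40.0, width=1280, height=720):
    """Project a point (metres, in the wearer's head frame: x right, y up,
    z forward) onto display pixel coordinates with a simple pinhole model."""
    if z <= 0:
        return None  # behind the wearer: nothing to draw
    f = (width / 2) / math.tan(math.radians(fov_h_deg) / 2)  # focal length in pixels
    u = width / 2 + f * x / z
    v = height / 2 - f * y / z
    if 0 <= u < width and 0 <= v < height:
        return int(u), int(v)   # pixel where the label should be anchored
    return None                 # outside the field of view

# Restaurant sign 20 m ahead, 5 m to the left, 2 m above eye level
print(project_to_display(-5.0, 2.0, 20.0))  # -> (200, 184): draw the menu overlay here
```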

Developing a device such as this, Hua said, requires integrating a variety of technologies including microdisplays, optical technology, sensor technology, electronics and computing.

"You need to allow the person to see the outside world without any obstruction and the device needs to be as lightweight and convenient as possible," she said.

The microdisplay needs to be bright enough for people to use outside, and the image needs to be high-resolution enough to render detailed maps or text messages, she added.
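
Those two requirements pull against each other: spreading a fixed number of pixels over a wider field of view lowers the sharpness the eye perceives. A rough way to quantify this is pixels per degree, sketched below with assumed numbers (20/20 visual acuity corresponds to roughly 60 pixels per degree).

```python
def pixels_per_degree(horizontal_pixels, fov_horizontal_deg):
    """Average angular resolution the display delivers to the eye."""
    return horizontal_pixels / fov_horizontal_deg

def pixels_needed(fov_horizontal_deg, target_ppd=60):
    """Horizontal pixel count needed for a target sharpness
    (~60 pixels per degree roughly matches 20/20 acuity)."""
    return fov_horizontal_deg * target_ppd

# Illustrative numbers only: the same 1280-pixel-wide panel looks sharp over a
# narrow field of view but coarse when stretched across a wide one.
print(pixels_per_degree(1280, 15))  # ~85 px/deg
print(pixels_per_degree(1280, 60))  # ~21 px/deg
print(pixels_needed(60))            # 3600 pixels across for a sharp 60-degree view
```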

Hua is creating freeform optical surfaces, which differ from conventional optical surfaces in that their shape is highly variable, with some areas more curved than others. This emerging optical technology allows her to design an eyepiece for wearable displays with a very compact form factor, for example one shaped like sport sunglasses.

"Traditionally we wouldn't be able to make an eyepiece in such a compact form," Hua said. "This can be achieved with freeform surfaces."

Three-dimensional AR displays such as the one she is developing have a variety of promising applications, Hua said, from assisting users with directions to video gaming and military training.

"For example, we can build an augmented environment where the soldier could see the battlefield through this type of glass," Hua said. "It could be used for training."

The device has medical applications, too. Hua collaborates with UA assistant professor of neurology Katalin Scherer and Cass Faux, a UA clinical assistant professor of speech, language and hearing sciences, to adapt the glasses for patients with amyotrophic lateral sclerosis, or ALS, a neurodegenerative disease in which patients slowly lose control of their muscles.

"Eventually the only thing they can move are their eyes," Hua said. "The eye movements become their only means to communicate with the outside world."

"We are developing one of these systems which has the capability to monitor eye movements," she said. "So the patient can wear this device and see a virtual screen in front of her and then move her eyes to type information telling the caregiver what she wants or needs."

The device would be a leap beyond the current system used by ALS patients, in which a large computer screen and a mounted camera capture their eye movements. "Now we are trying to shrink that whole package down to something very portable," Hua said.

Hua believes the device can be used for other medical applications, for example in surgery, where surgeons could see information about the location of organs or other body parts through the glasses while operating.

"The vision in the future is that your phone is a pair of glasses that you wear and you can dial in using your voice and see your screen pop up in front of you, generated through the glass," Hong said. "And then you start to interact with it with your eye gaze, voice or gesture."

User comments: 5

TheGhostofOtto1923
1 / 5 (3) Aug 20, 2013
Here's a walkthrough of ghost recon future soldier which shows what this augmented reality and AI might look like in a combat environment
http://www.youtub...a_player
SolidRecovery
1 / 5 (11) Aug 20, 2013
From land lines to cell phones to smart phones to glasses to this? Although impressive, I don't think it goes far enough with what technology will be capable of in the NEAR future. It is a step in the right direction, but does it have to be glasses? Does it have to fit any kind of profile?
krundoloss
1 / 5 (7) Aug 20, 2013
Terminator-Vision Here we come!

I personally am excited and a little afraid of when we can interface computers directly into our brains. No more of this archaic "reading" or "watching" or "listening" it will all just become "knowing".
TheGhostofOtto1923
1 / 5 (3) Aug 20, 2013
Terminator-Vision Here we come!

I personally am excited and a little afraid of when we can interface computers directly into our brains. No more of this archaic "reading" or "watching" or "listening" it will all just become "knowing".
And once AI is let loose on the internet, it will flush out all the senseless bullshit from our accumulated store of knowledge. And THEN it will become very hard to lie wont it? Instant fact-checking and transmission to all involved.

Will humans have anything left to say at all? We'll have to wait and see-
dougie_fresh_007
3 / 5 (4) Aug 20, 2013
as a quadriplegic i find any handsfree tech cool although very few actually are truly handsfree, it will be nice when peripherals like these are widely available. it's a tech that if done properly could be a huge benefit to many people.
