Meta glasses to place virtual reality worlds at fingertips (w/ Video)

May 18, 2013 by Nancy Owano

(Phys.org) — Yawn. Two startup visionaries claim they have just the device to replace the keyboard and mouse forever and ever. Where have you heard that before? But maybe these two have something important. Meron Gribetz, the startup's founder and CEO, and Ben Sand, its co-pilot and evangelist, are behind the Meta wearable computer headset, which consists of stereoscopic glasses and a camera. It's the way computers always should have been: wearable, viewed through both eyes, and controlled directly with the hands and arms, according to Gribetz. The belief is that the future of computing lies in technology that can display information over the real world and let the user control objects with the fingers, Tony Stark-style, at low latency and high dexterity. Gribetz calls the technology the keyboard and mouse of the future.

Meta grew out of a Columbia University project where a team built advanced surface-tracking algorithms. The algorithms allow virtual content to be anchored to real-world surfaces without the use of fiducial markers.
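
For readers curious about what markerless surface tracking involves, here is a minimal, self-contained sketch of the general idea rather than Meta's actual algorithm: fit a dominant plane to a depth-camera point cloud with RANSAC, then pin virtual content to a point on that plane. The function name, thresholds, and toy point cloud below are all illustrative assumptions.

```python
# Sketch: markerless plane anchoring from a depth-camera point cloud.
# Illustrative only; not Meta's algorithm or API.
import numpy as np

def fit_plane_ransac(points, iters=200, threshold=0.01, rng=None):
    """Return (normal, d) for the plane n.x + d = 0 with the most inliers."""
    rng = rng or np.random.default_rng(0)
    best_inliers, best_plane = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample, try again
        n = n / norm
        d = -np.dot(n, sample[0])
        inliers = np.sum(np.abs(points @ n + d) < threshold)
        if inliers > best_inliers:
            best_inliers, best_plane = inliers, (n, d)
    return best_plane

# Toy point cloud: a noisy tabletop roughly 0.8 m in front of the camera.
pts = np.column_stack([
    np.random.uniform(-0.5, 0.5, 2000),
    np.random.uniform(-0.3, 0.3, 2000),
    0.8 + np.random.normal(0, 0.003, 2000),
])
normal, d = fit_plane_ransac(pts)
# Virtual content can now be "anchored" by pinning its origin to the plane.
anchor = np.array([0.0, 0.0, -d / normal[2]])  # point on the plane along the camera axis
print("plane normal:", normal, "anchor point:", anchor)
```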

The Meta effort in wearable computing eyewear is not to be treated as competition for Google Glass, an easy connection to make once the words wearable computing and spectacles are mentioned. The Meta team makes a point of calling the two different. As their promotional video puts it, "This is true augmented reality, not just popups from the corner of your eye." The Google Glass display is different, too: Meta's sits in the center of the field of view rather than above the eyeline. The roles of the two also differ, if one thinks of Glass as a helpful companion while navigating city streets, looking for landmarks, restaurants, or translations of signs in foreign languages. Meta's augmented-reality eyewear is immersive, applied, for example, to games played in front of your face or to working on a surface while you use your hands to interact with the content. The system includes 3-D glasses supplied by Epson and a 3-D camera to track the user's hand movements.

This is a technology project that dates back several years, to when the initial concept for a dual-screen 3-D interface took hold. Lab work on the software and hardware included building the first 3-D occlusion algorithms to mesh the real and virtual worlds together in real time. Supported platforms are Windows 32-bit and 64-bit; support for other platforms is still in development. Scenarios in which the Meta system might be applied include surgery, where floating 3-D models of CAT scans could assist surgeons.
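
The occlusion step described here is, in general terms, a per-pixel depth comparison: a virtual pixel is drawn only where it is nearer to the viewer than the real surface the depth camera measured. The sketch below illustrates that general technique with toy data; the compositing function and frames are invented for illustration, not Meta's implementation.

```python
# Sketch: depth-based occlusion of virtual content against a camera frame.
# Illustrative only; not Meta's implementation.
import numpy as np

def composite(color_frame, depth_frame, virtual_rgba, virtual_depth):
    """Overlay a virtual layer on a camera frame, hiding occluded pixels.

    color_frame   : (H, W, 3) uint8 camera image
    depth_frame   : (H, W) real-world depth in metres
    virtual_rgba  : (H, W, 4) rendered virtual layer (alpha = coverage)
    virtual_depth : (H, W) depth of the virtual layer, inf where empty
    """
    visible = virtual_depth < depth_frame            # virtual content in front of the real world
    alpha = (virtual_rgba[..., 3:] / 255.0) * visible[..., None]
    out = color_frame * (1 - alpha) + virtual_rgba[..., :3] * alpha
    return out.astype(np.uint8)

# Toy frames: a wall 1.0 m away, a virtual patch at 0.7 m (should show)
# and another at 1.5 m (should be hidden behind the wall).
H, W = 120, 160
color = np.full((H, W, 3), 80, np.uint8)
depth = np.full((H, W), 1.0)
v_rgba = np.zeros((H, W, 4), np.uint8)
v_depth = np.full((H, W), np.inf)
v_rgba[30:60, 40:80] = (255, 0, 0, 255);  v_depth[30:60, 40:80] = 0.7    # visible
v_rgba[70:100, 40:80] = (0, 255, 0, 255); v_depth[70:100, 40:80] = 1.5   # occluded
frame = composite(color, depth, v_rgba, v_depth)
```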

A number of sites commenting on the device noted its less-than-ideal bulky looks, calling attention to the large size of the glasses. Meta's team plans to improve on the looks, though, and intends to make the next iteration lighter and more fashionable. Meta is a Y Combinator-backed augmented-reality venture, and right now the focus, in the form of a Kickstarter campaign, is to lure more developers on board; the campaign seeks funds to build Meta systems for application developers. The project will be funded if at least $100,000 is pledged by June 16. At the time of this writing, $75,393 had been raised.

A "meta 1 Dev Kit" will be available for $750. The SDK provides gestures and finger tracking, general depth data that works with objects in range of the depth camera, RGB data and surface tracking (access planes and meshes of surfaces in the real world, so virtual objects can be anchored to them, or augmented in other ways.

"Our software stack is very open. Our depth camera exposes all depth data at the low level and at the higher levels give you access to object meshes, fingers/hands and gestures."

More information: www.kickstarter.com/projects/5… ed-reality-interface

User comments: 6

axemaster
3.5 / 5 (4) May 18, 2013
This project will die for the same reason every other alternative computer interface has died: it is more cumbersome, less precise, and slower than the mouse and keyboard. It might find a very small niche where it's worthwhile, such as for people who need to manipulate 3D objects, but they probably represent less than 1% of computer users. This is pretty much tacitly admitted in the video - we never see anyone performing any of the normal computer tasks with it; instead we see an array of glowy things floating around.

What I would much prefer to have is glasses that simply allow you to have an unlimited number of screens beyond the physical computer screen boundary. I would love to be able to have multiple datasheets and graphs up simultaneously at full resolution.
Mayday
3.7 / 5 (3) May 18, 2013
As much as I wish it weren't so, Axe, I completely agree. They may be able to price it really high and sell to a niche group, but for mass market uses, it seems to be a non-starter. For real, everyday uses it IS really all about resolution and we just aren't there yet. In fact, the sad truth is, we aren't even close.
Even Glass claims the equivalent of just a twenty-five inch screen at eight feet. That's like holding your smart phone at arm's length. Not very good. Without a breakthrough in resolution, the whole industry is hamstrung. So the best we have for quite a while will, unfortunately, remain a couple of twenty-five inch monitors on a desktop. But we can dream, right?
packrat
2 / 5 (4) May 18, 2013
It might not be for everybody and I totally agree with you guys on that but I can see a wonderful 3d cad interface possibly coming down the road with it. I could really have fun with that.
ForFreeMinds
1 / 5 (3) May 19, 2013
I think that as computers continue to improve in performance at the rate they have, the issues of "cumbersome, less precise and slower" will disappear. One cannot deny that lugging around a keyboard and mouse has some disadvantages as well.
sams2013
not rated yet May 19, 2013
This approach to computing is a natural evolution; it's not really a new idea, we all have dreamed about it, seen movies with it, computing needs to integrate on every level and beyond as time goes on. So why design a product if the limits of its technology have not yet made its application to the full market feasible? Yes, Google has its glass, and a whole lot more funding than $100k, you would think with all monies invested and researched they would have opted for this option if it were feasible. Google and others would not yet see it as marketable, but possibly buy a project like this to stop future competition of glass 2.0. HHmmm....
jackjump
1 / 5 (3) May 19, 2013
Sure it's not going to beat out keyboard and mouse or touch screen right now but wait until the glasses are just plain glasses and the speed is as fast as needed. Wait until it interfaces to a powerful computer the size of a pen or to the cloud. Then it will be a superior interface and likely will wipe out not just keyboard, mouse and touch but desktops, laptops, tablets and smartphones too.