(Phys.org) —Yawn. Two startup visionaries claim they have just the device to replace the keyboard and mouse forever and ever. Where have you heard that before? But maybe these two have something important. Meron Gribetz, founder and CEO, and Ben Sand, co-pilot and evangelist, are behind the Meta wearable computer headset, which consists of stereoscopic glasses and a camera. It's the way computers always should have been, according to Gribetz: wearable, viewed through both eyes, and directly controlled using the entire arms and hands. The belief is that the future of computing lies in technology that can display information in the real world and let users control objects with their fingers, Tony Stark-style, at low latency and with high dexterity. Gribetz calls the technology the keyboard and mouse of the future.
Meta grew out of a Columbia University project in which a team built advanced surface tracking algorithms. The algorithms allow virtual content to be anchored to the real world without the use of fiducial markers.
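To give a feel for what markerless anchoring involves, here is a minimal sketch of one common ingredient: instead of recognizing a printed fiducial marker, fit a plane to 3-D points sampled from the camera and anchor virtual content to that plane. The names and data here are illustrative assumptions, not Meta's actual algorithms.

```python
def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to a list of (x, y, z) points,
    solved via the 3x3 normal equations."""
    # Accumulate the normal-equation sums.
    sxx = sxy = syy = sx = sy = n = 0.0
    sxz = syz = sz = 0.0
    for x, y, z in points:
        sxx += x * x; sxy += x * y; syy += y * y
        sx += x; sy += y; n += 1
        sxz += x * z; syz += y * z; sz += z

    def det3(m):
        """Determinant of a 3x3 matrix, expanded along the first row."""
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
              - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
              + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    # Solve the 3x3 system A * (a, b, c) = rhs with Cramer's rule.
    A = [[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]]
    rhs = [sxz, syz, sz]
    d = det3(A)
    solution = []
    for i in range(3):
        m = [row[:] for row in A]
        for r in range(3):
            m[r][i] = rhs[r]
        solution.append(det3(m) / d)
    return tuple(solution)  # (a, b, c)

# A flat tabletop tilted slightly along x: z = 0.1*x + 0.5.
table = [(x * 0.1, y * 0.1, 0.1 * (x * 0.1) + 0.5)
         for x in range(5) for y in range(5)]
a, b, c = fit_plane(table)
print(a, b, c)  # recovers roughly a=0.1, b=0, c=0.5
```

Once a plane like this is tracked from frame to frame, a virtual object "pinned" to it appears to stay put on the real surface as the wearer moves.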
The Meta effort in wearable computing eyewear is not to be treated as competition for Google Glass, an easy connection to make once the words wearable computing and spectacles are mentioned. The Meta team makes a point of calling the two different. As their promotional video put it, "This is true augmented reality, not just popups from the corner of your eye." The displays differ, too: Meta's sits in the center of the field of view rather than above the eyeline. The roles of the two are also different, if one thinks of Google Glass as a helpful companion for navigating city streets, looking for landmarks, restaurants, or translations of signs in foreign languages. Meta's augmented reality eyewear is immersive, applied to games, for example, played in front of your face, or to working on a surface while you use your hands to interact with your virtual environment. The system includes 3-D glasses supplied by Epson and a 3-D camera to track the user's hand movements.
This is a technology project that dates back several years, to when the initial concept for a dual-screen 3-D interface took hold. Lab work on the software and hardware included building the first 3-D occlusion algorithms to mesh the real and virtual worlds together in real time. The supported platform is Windows (32-bit and 64-bit); support for other platforms is still in development. Scenarios in which the Meta system might be applied include surgeons assisted by floating 3-D models of CAT scans.
A number of sites commenting on the device made note of its less than ideal bulky looks, calling attention to the big size of the glasses. Meta's team is planning to improve on the looks, though, and intends the next iteration to be lighter and more fashionable. Meta is a Y Combinator-backed augmented reality venture, and right now the focus is a Kickstarter campaign to lure more developers on board, seeking funds to build Meta systems for application developers. The goal is $100,000; the project will be funded if at least that much is pledged by June 16. At the time of this writing, they had raised $75,393.
A "meta 1 Dev Kit" will be available for $750. The SDK provides gesture and finger tracking; general depth data for objects in range of the depth camera; RGB data; and surface tracking (access to planes and meshes of surfaces in the real world, so virtual objects can be anchored to them or augmented in other ways).
"Our software stack is very open. Our depth camera exposes all depth data at the low level and at the higher levels give you access to object meshes, fingers/hands and gestures."
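The layering the quote describes, raw depth data at the bottom, hands and fingers in the middle, named gestures on top, can be illustrated with a small sketch. This is a hypothetical example, not the real Meta SDK API; the function names and the pinch threshold are assumptions.

```python
import math

def fingertip_positions():
    """Stand-in for the mid-level layer: 3-D thumb and index fingertip
    positions (in metres) that a real SDK would derive from depth data."""
    return {"thumb": (0.10, 0.02, 0.40), "index": (0.11, 0.03, 0.40)}

def recognize_gesture(tips, pinch_threshold=0.03):
    """Top layer: label a 'pinch' when the thumb and index tips are close,
    the kind of event an app would use to grab a virtual object."""
    dist = math.dist(tips["thumb"], tips["index"])
    return "pinch" if dist < pinch_threshold else "open"

print(recognize_gesture(fingertip_positions()))  # prints "pinch"
```

An application could subscribe only to the top gesture layer, or drop down to raw depth frames when it needs something the built-in recognizers do not cover, which is the openness the quote emphasizes.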
More information: www.kickstarter.com/projects/5… ed-reality-interface