Laptop clip-on is on a mission to outdo mouse

Aug 15, 2013 by Nancy Owano weblog

(Phys.org) —Haptix is a newly announced gesture-based controller that launched this week on Kickstarter. Haptix looks like a sleek ice cream bar with its anodized, bead-blasted aluminum casing. The device is designed to transform tabletops and keyboards into surfaces you use to interact with your computer. Basically, it wants to give workers a reason to finally say goodbye to the mouse. The creators bill their device as "Multitouch Reinvented." They say it is as intuitive and natural as a multitouch screen, just without the actual screen. They have a point. This product is no Leap Motion me-too hiccup, but rather an attempt to create a practical alternative for people who work with spreadsheets, reports, and design projects, where the mouse and keyboard have traditionally been considered the most practical tools for getting the work done.

With Haptix, you can clip the device onto your laptop and turn your keyboard into a multitouch surface, letting you control your computer entirely from the keyboard. (It automatically turns off when you type.) The device uses twin cameras to see what the user's hands are doing and turns those actions into input signals. You can use your middle finger as the cursor and other fingers to left-click and right-click.
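A minimal sketch of how such a finger-to-mouse mapping might be dispatched in software. All of the names, the event model, and the assignment of click fingers below are assumptions for illustration only, not Haptix's actual driver or API:

```python
# Hypothetical sketch: route tracked-finger events to mouse-style actions.
# A camera/driver layer is assumed to report which finger moved or tapped;
# the finger-to-button assignments here are illustrative guesses.

def dispatch(finger: str, action: str, pos=(0, 0)):
    """Map a finger event to a mouse-style command string (or None)."""
    if action == "move" and finger == "middle":
        return f"cursor -> {pos}"   # the middle finger steers the cursor
    if action == "tap" and finger == "index":
        return "left-click"         # one finger taps for left-click (assumed)
    if action == "tap" and finger == "ring":
        return "right-click"        # another taps for right-click (assumed)
    return None                     # anything else (e.g. typing) is ignored
```

The `return None` branch mirrors the article's point that the controller must stand down during ordinary typing rather than misread keystrokes as gestures.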

People working with spreadsheets can use five-finger touch. Designers, artists and engineers can capture pen or brush strokes. A distinguishing feature of Haptix is hand comfort. The Haptix team is vocal about the fact that their product, in supporting 2D and 3D gestures, allows users to rest their hands while working, so they can use the controller for extended periods without wrist discomfort.

Haptix has two CMOS image sensors that capture the position of your hands at 640x360 resolution. The device works in any lighting condition, making use of infrared in the dark. Haptix connects to a computer through a USB 2.0 cable.
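Two sensors matter because a single camera cannot tell how far away a fingertip is. With a stereo pair, depth falls out of the horizontal shift (disparity) of the same point between the two views. A sketch of that triangulation, with made-up focal length and baseline values rather than Haptix's real specs:

```python
# Sketch: depth of a point from a two-camera (stereo) rig via triangulation.
# focal_px and baseline_mm are illustrative numbers, not Haptix hardware specs.

def depth_mm(x_left_px, x_right_px, focal_px=400.0, baseline_mm=60.0):
    """Estimate distance to a point from its pixel disparity between views."""
    disparity = x_left_px - x_right_px   # horizontal shift between the images
    if disparity <= 0:
        raise ValueError("point must appear shifted between the two cameras")
    # Classic pinhole-stereo relation: Z = f * B / d
    return focal_px * baseline_mm / disparity
```

With these toy numbers, a 40-pixel disparity puts a fingertip 600 mm from the sensors; nearer objects produce larger disparities and thus more precise depth estimates.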

It supports a number of gesture types, including pinch-to-zoom, on flat surfaces, but gestures need to be performed within its field of view. The Kickstarter prototype has a 120-degree field of view. The creators said they are in the middle of transitioning to better lenses with a 150-degree field of view.
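The field-of-view constraint can be pictured geometrically: a hand position is visible only if its angle off the camera's forward axis is within half the lens's field of view. A small sketch (coordinates and distances are illustrative, not measured from the device):

```python
import math

# Sketch: is a hand position inside the camera's horizontal field of view?
# x_mm is sideways offset, z_mm is distance in front of the sensor.
# All numbers here are illustrative, not Haptix measurements.

def in_view(x_mm, z_mm, fov_deg):
    """True if the point lies within the camera's horizontal viewing cone."""
    angle = math.degrees(math.atan2(abs(x_mm), z_mm))  # angle off the axis
    return angle <= fov_deg / 2.0

# A hand well off to the side close to the device can fall outside the
# 120-degree prototype lens yet inside the planned 150-degree one:
# in_view(300, 140, 120) -> False, while in_view(300, 140, 150) -> True
```

This is why the wider lenses matter in practice: the closer your hands sit to the clipped-on device, the more of the keyboard area falls outside a narrow cone.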

The Haptix Touch creators are Darren Lim (CEO) and Lai Xue (CTO). Haptix Touch is based in San Francisco. They seek $100,000 to help fund manufacturing costs; the project will only be funded if at least $100,000 is pledged by September 13. They said the funding will go toward "refinement of the electronics and additional tooling for mass production." Haptix currently works with Windows and Ubuntu; Android and OS X support are in the works.

The retail price will be $70, but early birds can get a Haptix for $59; after that, units cost $65. Haptix expects to ship to backers by February next year.

More information: www.kickstarter.com/projects/h… ultitouch-reinvented

User comments : 3

antialias_physorg
not rated yet Aug 15, 2013
It's an intriguing idea - to use the keyboard as a finger-mouse/pad. I wonder how it will distinguish the motion of a finger from one key to the next (in order to type) from a "mouse-move" gesture.

Recently got the Leap Motion... and it's a cute toy to play with (after I got it working).
NikFromNYC
1 / 5 (3) Aug 15, 2013
...and I just spent $9,000 for a Geomagic haptic pen with their Freeform software since alternatives so far lack an additional element of tactile feedback. But the real haptic pressure feedback often just *fights* what I want to do instead of helping, and at times feels like more of a boutique feature than something essential. It doesn't, as claimed, emulate working with clay by hand as if you can really feel a 3D model, because it's a mere tipped stylus that only has feedback in XYZ but no rotational resistance as you angle or rotate a shaped stylus tool tip. And all that tiring and glitch-prone haptic work slows the interface way down even with the best PC upgrades. But "organic" 3D modeling is just too awkward in finicky NURBS programs like Rhino.
alfie_null
not rated yet Aug 16, 2013
Figure out how to optimize the information flow from the user to the computer via this device. This is going to require a radically new way of inputting.

Emulating a keyboard and mouse is a tempting way to start, but suffers shortcomings. It's clumsy, inefficient, and not satisfying. Gestures and swipes as used on tablet surfaces don't provide a wide range of different data inputs. Something akin to ASL would be good but the interface (software, driver) would have to be skilled at disambiguating similar types of hand movements.

Then there's the issue of training. As with learning to type, it will take time and patience. Users would have to be convinced to make the investment.