Carnegie Mellon group shows iPad skeuomorphism

May 4, 2014 by Nancy Owano weblog

The Human Interfaces Group at Carnegie Mellon, led by the group's director Chris Harrison, an assistant professor of Human-Computer Interaction, has shown how traditional hand movements used to perform tasks such as measuring and erasing can be applied naturally to the digital screen, making interaction with computers more natural. The group has come up with TouchTools, a gesture design approach. With TouchTools, you manipulate tools on the screen just as you would in real life; the idea is to make software more natural to use. The team is now showing the world its TouchTools concept, which Gizmodo recently referred to as "skeuomorphism applied to interaction design."

Harrison said, "The core idea behind TouchTools is to draw upon user familiarity and motor skill with tools from the real world, and bring them to interactive use on computers."

The same hand gesture you would use, for example, to run a tape measure across an item in the real world becomes the gesture you use with a virtual, realistic-looking tape measure on the screen. Grab the virtual tape measure appearing on the screen and you can roll it out, move a marker to draw, or press a camera shutter button.

Using TouchTools, users replicate a tool's corresponding real-world grasp and press it to the screen as though the tool were physically present. The system recognizes this pose and instantiates the virtual tool as if it were being grasped at that position. Users can then rotate and manipulate the tool as they would its physical counterpart. But what's wrong with touchscreen interactions today? Why bother? The team's argument is that hand movements in today's interactive environments can be cumbersome, and that their approach is more natural.
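To make the idea concrete, grasp recognition could be sketched as mapping the set of touch contacts to a tool. The classifier below is purely illustrative, not the authors' implementation: the paper describes a trained recognizer, while the tool names, features (contact count and spread), and thresholds here are hypothetical.

```python
import math

def classify_grasp(contacts):
    """Toy grasp classifier: maps touchscreen contacts to a hypothetical tool.

    `contacts` is a list of (x, y) screen coordinates. The rules below are
    illustrative assumptions only -- TouchTools itself uses a recognizer
    trained on real grasp data, not hand-written thresholds.
    """
    n = len(contacts)
    if n == 0:
        return None
    xs = [p[0] for p in contacts]
    ys = [p[1] for p in contacts]
    # Spread of the contact points, used to distinguish pinched from flat grasps
    span = math.hypot(max(xs) - min(xs), max(ys) - min(ys))

    if n == 1:
        return "marker"        # single fingertip, pen-like grip
    if n == 2 and span < 60:
        return "tape measure"  # thumb and finger pinched close together
    if n >= 4 and span > 200:
        return "eraser"        # flat hand pressed wide on the screen
    return "unknown"

# Example: two close contacts resemble a pinch grasp
print(classify_grasp([(100, 100), (130, 120)]))  # -> tape measure
```

As the first commenter below notes, the hard part in practice is disambiguating similar poses; a real system would use richer features than count and spread.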


"Contemporary applications often expose a toolbar that allows users to toggle between modes (e.g., pointer, pen, eraser modes) or require use of a special physical implement, such as a stylus. TouchTools can utilize the natural modality of our hands, rendering these accessories superfluous."

Chording the fingers is not natural: we do not normally perform actions in the real world by touching things with a set number of fingers, as we do for screen tasks. Yet, as the group noted, the average person can skillfully manipulate a plethora of tools, from hammers to tweezers. Despite this remarkable natural dexterity, gestures on today's touch devices are simplistic, relying primarily on the chording of fingers: one-finger pan, two-finger pinch, four-finger swipe.

"We propose that touch gesture design be inspired by the manipulation of physical tools from the real world. In this way, we can leverage user familiarity and fluency with such tools to build a rich set of gestures for touch interaction." So the team said in its paper, "TouchTools: Leveraging Familiarity and Skill with Physical Tools to Augment Touch Interaction," prepared for CHI 2014, which took place in Toronto from April 26 to May 1. The researchers recruited participants for a study run on an iPad and gave them physical versions of various test tools to handle. "With only a few minutes of training on a proof-of-concept system, users were able to summon a variety of virtual tools by replicating their corresponding real-world grasps." The authors said they believe that "designing gestures around real-world tools improves discoverability, intelligibility and makes gestures memorable."


One might ask: why go to this effort when a tablet is already so easy to use? Would TouchTools be seen as a lot of unnecessary bother? Arguments for skeuomorphism in digital design center on ease of use: digitally emulating familiar objects, and the ways humans use them, provides instant familiarity.

Gizmodo commented that what the Carnegie Mellon team is trying to convey is, "the library of interactions we currently use is quite thin—and by looking at the world around us, and the 'natural modality' of our 10 fingers, designers might find unexpectedly smart solutions to digital problems."

To be sure, the authors of the TouchTools paper, Chris Harrison, Robert Xiao, Julia Schwarz and Scott E. Hudson, stated: "We hope this work offers a new lens through which the HCI community can craft novel touch experiences."

More information: Harrison, C., Xiao, R., Schwarz, J., and Hudson, S. TouchTools: Leveraging Familiarity and Skill with Physical Tools to Augment Touch Interaction. In Proceedings of the 32nd Annual SIGCHI Conference on Human Factors in Computing Systems (Toronto, Canada, April 26 - May 1, 2014). CHI '14. ACM, New York, NY.


not rated yet May 04, 2014
I wonder how their application can tell the difference between an erase gesture and a scribble gesture? Do they always assume that a scribble gesture means erase when there are annotations present in the area of the scribbling? How does it tell the difference between the gesture to activate the camera and the large eraser? It looked like many of the gestures were similar. Their demonstration works, but how often does it pick the wrong tool?

It's a neat idea, but the tools still looked cumbersome to me, and the tool menu at least has the advantage of correctly interpreting the user's intentions. Still, it's worth continued research.
not rated yet May 04, 2014
And it's got nothing to do with iPads other than they used one for the trial. When are people going to stop saying iPad when they really mean any tablet?
not rated yet May 05, 2014
Hahaha. Thank you "box" haha

They say ipad because they used an ipad, moron
