Pressing a button is more challenging than it appears—new theory improves button designs

March 20, 2018, Aalto University
Both physical and touch buttons provide clear tactile signals from the impact of the tip with the button floor. However, with the physical button this signal is more pronounced and longer. Credit: Aalto University

Pressing a button appears effortless, and it is easy to overlook how challenging it actually is. Researchers at Aalto University, Finland, and KAIST, South Korea, have created detailed simulations of button pressing with the goal of producing human-like presses.

"This research was triggered by admiration of our remarkable capability to adapt button pressing," says Professor Antti Oulasvirta at Aalto University. "We push a button on a remote controller differently than a piano key. The press of a skilled user is surprisingly elegant when looked at in terms of timing, reliability, and energy use. We successfully press buttons without ever knowing the inner workings of a button. It is essentially a black box to our motor system. On the other hand, we also fail to activate buttons, and some buttons are known to be worse than others."

Previous research has shown that touch buttons are worse than push-buttons, but there has not been adequate theoretical explanation.

"In the past, there has been very little attention to buttons, although we use them all the time," says Dr. Sunjun Kim. The new theory and simulations can be used to design better buttons.

"One exciting implication of the theory is that activating the button at the moment when the sensation is strongest will help users keep a better rhythm in their keypresses."

To test this hypothesis, the researchers created a new method for changing the way buttons are activated. The technique is called Impact Activation. Instead of registering the press at first contact, the button is activated when the button cap or finger hits the floor with maximum impact.

The technique was 94 percent more precise in rapid tapping than the regular activation method for a push-button (Cherry MX switch) and 37 percent more precise than a regular touchscreen button using a capacitive touch sensor. The technique can be easily deployed in touchscreens. However, regular physical keyboards do not offer the required sensing capability, although special products exist (e.g., the Wooting keyboard) on which it can be implemented.
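As a rough sketch of the idea (not the authors' implementation), assume the sensor provides a sampled force or impact signal for each press; the function names and the threshold below are illustrative only:

```python
def first_contact_index(signal, threshold=0.1):
    """Index where the signal first crosses the contact threshold
    (how a regular button or touch sensor would trigger)."""
    for i, value in enumerate(signal):
        if value >= threshold:
            return i
    return None

def impact_activation_index(signal, threshold=0.1):
    """Index of peak impact: follow the signal upward from first
    contact and trigger at its maximum, i.e. when the cap or finger
    hits the button floor hardest."""
    start = first_contact_index(signal, threshold)
    if start is None:
        return None
    i = start
    while i + 1 < len(signal) and signal[i + 1] >= signal[i]:
        i += 1
    return i

# Example: a press whose impact force ramps up, peaks, then releases.
press = [0.0, 0.05, 0.3, 0.7, 1.0, 0.6, 0.2]
print(first_contact_index(press))      # → 2 (first threshold crossing)
print(impact_activation_index(press))  # → 4 (moment of maximum impact)
```

The difference between the two indices is the point: ordinary activation fires as soon as contact is sensed, while Impact Activation waits for the strongest, most repeatable part of the signal.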

The technique could help gamers and musicians in tasks that require speed and rhythm.

The simulations shed new light on what happens during a button press. One problem the brain must overcome is that muscles do not activate perfectly. Instead, every press is slightly different. Moreover, a button press is very fast, occurring within 100 milliseconds, which is too fast for corrective movements. The key to understanding button pressing is therefore to understand how the brain adapts based on the limited sensations that are the residue of the brief button-pressing event.

The researchers argue that the key capability of the brain is a probabilistic model: The brain learns a model that allows it to predict a suitable motor command for a button. If a press fails, it can pick a very good alternative and try it out. "Without this ability, we would have to learn to use every button like it was new," says Professor Byungjoo Lee from KAIST. After successfully activating the button, the brain can tune the motor command to be more precise, to use less energy, and to avoid stress or pain. "These factors together, with practice, produce the fast, minimum-effort, elegant touch people are able to perform."

The brain also uses probabilistic models to extract information optimally from the sensations that arise when the finger moves and its tip touches the button. It "enriches" these ephemeral sensations based on prior experience to estimate the time at which the button was impacted. For example, tactile sensation from the tip of the finger is a better predictor of button activation than proprioception (joint-angle position) or visual feedback.

Best performance is achieved when all sensations are considered together. To adapt, the brain must fuse their information using prior experiences. Professor Lee explains: "We believe that the brain picks up these skills over repeated button pressings that start already as a child. What appears easy for us now has been acquired over years."
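A standard way to model this kind of cue fusion is inverse-variance weighting, in which more reliable senses get proportionally more weight. The sketch below illustrates the principle; the timing estimates and variances are hypothetical numbers, not data from the study:

```python
def fuse_cues(estimates, variances):
    """Fuse independent noisy estimates by inverse-variance weighting.

    Each cue contributes in proportion to its reliability (1/variance);
    the fused estimate is more precise than any single cue alone.
    """
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * x for w, x in zip(weights, estimates)) / total
    return fused, 1.0 / total

# Hypothetical estimates (in ms) of when the button was impacted,
# from three cues: tactile (most reliable), proprioception, vision.
estimates = [100.0, 106.0, 112.0]
variances = [4.0, 16.0, 36.0]
t, var = fuse_cues(estimates, variances)
# The fused estimate is pulled toward the reliable tactile cue, and
# its variance is smaller than that of any individual sense.
```

This mirrors the article's claim: tactile feedback dominates because it is the sharpest cue, yet combining all senses yields a better estimate than any one of them.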

The researchers also used the simulation to explain differences between physical and touchscreen-based button types. Both physical and touch buttons provide clear tactile signals from the impact of the tip with the button floor. However, with the physical button this signal is more pronounced and longer.

"Where the two button types also differ is the starting height of the finger, and this makes a difference," explains Prof. Lee. "When we pull up the finger from the touchscreen, it will end up at a different height every time. Its down-press cannot be as accurately controlled in time as with a push-button, where the finger can rest on top of the key cap."

Three scientific articles, "Neuromechanics of a Button Press," "Impact activation improves rapid button pressing," and "Moving target selection: A cue integration model," will be presented at the CHI Conference on Human Factors in Computing Systems in Montréal, Canada, in April 2018.

More information: Project web pages:

userinterfaces.aalto.fi/neuromechanics
userinterfaces.aalto.fi/impact_activation
kiml.org/Moving-Target-Selection
