(PhysOrg.com) -- Microsoft researchers have teamed up with the University of Washington and the University of Toronto to develop a muscle-controlled interface that allows hands-free, gesture-driven interaction with computers.
A band of electrodes attached to a person's forearm reads electrical activity from different arm muscles. Software then processes these signals and matches them to specific hand gestures.
The current model uses six electromyography (EMG) sensors and two ground electrodes placed in a ring around the upper right forearm to sense finger movement. Two additional sensors on the upper left forearm identify hand squeezes.
Because the raw sensor readings do not map directly to gestures, software must be trained to associate the electrical signals with specific gestures. Using standard machine-learning algorithms, the software learns to recognize the EMG signals produced when a user performs each gesture.
The algorithms use three aspects of the EMG signal: the magnitude of muscle activity, the rate of muscle activity, and the wave patterns occurring across several sensors at once.
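To make this concrete, here is a minimal sketch of how such features might be computed from a window of multi-channel EMG samples. The specific feature choices (RMS amplitude for magnitude, zero-crossing counts as a proxy for rate, and pairwise channel correlations for cross-sensor wave patterns) and all names are illustrative assumptions, not Microsoft's actual implementation:

```python
import numpy as np

def emg_features(window):
    """Compute the three feature types described above from one
    window of EMG samples shaped (n_samples, n_channels).
    Everything here is an illustrative sketch."""
    # Magnitude of muscle activity: root-mean-square amplitude per channel.
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    # Rate of muscle activity: zero-crossing counts per channel
    # approximate how rapidly the signal is changing.
    zero_crossings = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    # Cross-sensor wave patterns: pairwise correlations between channels.
    corr = np.corrcoef(window.T)
    cross = corr[np.triu_indices_from(corr, k=1)]
    # One flat feature vector per window, ready for a classifier.
    return np.concatenate([rms, zero_crossings, cross])
```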
Once properly trained, the software identified participants' gestures correctly 85 percent of the time. In the early stages of training, participants must perform their gestures carefully and consistently so that the machine-learning algorithms learn clean examples of each one.
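The article does not name the specific algorithm used, so as one plausible "standard machine-learning" choice, here is a hedged training sketch using a support vector machine from scikit-learn. The data files, labels, and the emg_features pipeline feeding them are hypothetical:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical dataset: one feature vector per gesture window
# (e.g., produced by emg_features above) plus integer gesture labels.
X = np.load("emg_feature_windows.npy")  # shape: (n_windows, n_features)
y = np.load("gesture_labels.npy")       # shape: (n_windows,)

clf = SVC(kernel="rbf")

# Cross-validated accuracy gives a rough analogue of the 85 percent
# recognition rate reported for carefully performed gestures.
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2%}")

clf.fit(X, y)  # final model used for live gesture recognition
```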
The goal of this research is a more seamless integration between user and computer. One day, advanced gesture control using muscle-based interfaces may become the norm, making current PC input devices such as the mouse obsolete.
More information: Visit Microsoft Research for details on muscle-computer interfaces (muCIs)
Via: Technology Review
© 2009 PhysOrg.com