Researchers using Kinect to allow deaf people to communicate via computer (w/ Video)

Jul 18, 2013 by Bob Yirka

Researchers from Microsoft Asia and the Institute of Computing Technology at the Chinese Academy of Sciences have been working together to develop a computer system able to translate gestures used in sign language to text. The combined team presented the results of their research at this year's Faculty Summit 2013—a conference held annually by Microsoft to promote information technology sharing among the academic community.

While teaching a computer to recognize sign language and translate it into text might seem unnecessary (people who are deaf or hard of hearing can simply type words and sentences on a keyboard and read what is typed back to them), those who are hearing impaired would like to converse in their own language through a computer just as much as hearing people do. Unfortunately, most such efforts to date have been less than successful: some require the user to wear gloves, others rely on simple webcams, and neither approach has proven practical. For that reason, the researchers in this latest effort turned to Microsoft's Kinect device.

Members of the team demonstrated their system at the DemoFest portion of the conference, showcasing software developed for the Kinect that successfully translates American Sign Language (ASL) into text. The system operates in two modes. The first, called simply Translation Mode, translates physical signing gestures into text or speech. The second, called Communication Mode, allows a person signing in ASL to converse with someone communicating in typed English: an avatar signs the text coming from the person typing on the keyboard, and the signer's response is converted to text and sent back to the other person. The demonstration showed that the system is capable of translating full sentences, not just individual words, which is a significant step forward.
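The article does not describe the recognition method the team used, but a rough sketch helps show how such a two-mode pipeline could be organized. The sketch below is purely illustrative: the names (GestureTemplate, trajectory_distance, translate, communication_turn) are hypothetical, the input is assumed to be hand-joint positions of the kind a depth sensor like the Kinect can provide, and simple nearest-neighbour trajectory matching stands in for whatever recognition model the researchers actually employed.

```python
# Illustrative sketch only (not the researchers' code): classify a signed
# gesture by comparing its joint trajectory against labelled templates, and
# wire the result into the two modes described in the article.

from dataclasses import dataclass
from typing import List, Tuple

# One frame = (x, y, z) positions for each tracked hand joint.
Frame = List[Tuple[float, float, float]]


@dataclass
class GestureTemplate:
    label: str             # the English word or phrase the sign maps to
    frames: List[Frame]    # recorded joint positions over time


def trajectory_distance(a: List[Frame], b: List[Frame]) -> float:
    """Average Euclidean distance between two trajectories, truncated to the
    shorter one. A real system would likely use dynamic time warping or a
    learned sequence model instead of this naive comparison."""
    n = min(len(a), len(b))
    total = 0.0
    for fa, fb in zip(a[:n], b[:n]):
        for (xa, ya, za), (xb, yb, zb) in zip(fa, fb):
            total += ((xa - xb) ** 2 + (ya - yb) ** 2 + (za - zb) ** 2) ** 0.5
    return total / max(n, 1)


def translate(frames: List[Frame], templates: List[GestureTemplate]) -> str:
    """Translation Mode: map an observed gesture to the closest known sign."""
    best = min(templates, key=lambda t: trajectory_distance(frames, t.frames))
    return best.label


def communication_turn(signed_frames: List[Frame],
                       templates: List[GestureTemplate],
                       typed_reply: str) -> Tuple[str, str]:
    """Communication Mode: the signer's gesture becomes text for the hearing
    party, and the typed reply is handed to an avatar renderer (stubbed)."""
    text_for_hearing_party = translate(signed_frames, templates)
    avatar_script = f"[avatar signs]: {typed_reply}"  # placeholder for animation
    return text_for_hearing_party, avatar_script
```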


The researchers stressed that their system is still a work in progress, but they hope to eventually create one that is fully functional and reasonably inexpensive: a Kinect-based communication system that operates entirely through hand gestures and spoken words, all in real time. It would also allow for conversion to other sign language dialects.
