Researchers using Kinect to allow deaf people to communicate via computer (w/ Video)

Jul 18, 2013 by Bob Yirka report

Researchers from Microsoft Asia and the Institute of Computing Technology at the Chinese Academy of Sciences have been working together to develop a computer system able to translate gestures used in sign language to text. The combined team presented the results of their research at this year's Faculty Summit 2013—a conference held annually by Microsoft to promote information technology sharing among the academic community.

While teaching a computer to recognize sign language and translate it to text might seem unnecessary—people who are deaf or hard of hearing can simply type words and sentences on a keyboard and read what is typed back to them—those who are hearing impaired would like to converse in their own language using a computer just as much as hearing people do. Unfortunately, to date, most such efforts have been less than successful—some require the user to wear gloves, others rely on simple webcams—and neither approach has proven practical. For that reason, the researchers in this latest effort turned to Microsoft's Kinect device.

Members of the team demonstrated their system at the DemoFest portion of the conference, showcasing software developed for the Kinect that successfully translates American Sign Language (ASL) into text. The system operates in two modes. The first, called simply Translation Mode, translates physical hand gestures and signs into text or speech. The second, called Communication Mode, allows a person signing in ASL to converse with someone who is communicating in typed English: an avatar signs the text typed on the keyboard to the ASL user, and the ASL user's signed response is converted to text and sent back. Their demonstration showed that the system is capable of translating full sentences, not just individual words, a significant step forward.

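The article does not describe how the recognition itself works, but one simple way to picture gesture-to-text translation from Kinect data is to match a recorded sequence of 3D hand-joint positions against labeled template trajectories. The sketch below is a minimal, hypothetical illustration of that idea in Python using nearest-neighbor matching; the function names, toy templates, and sign labels are invented for the example and do not represent the researchers' actual system.

```python
# Hypothetical sketch: classify a sign from a sequence of Kinect hand-joint
# positions by nearest-neighbor matching against labeled template trajectories.
# Illustration only -- not the Microsoft/CAS team's pipeline.

import numpy as np

def resample(trajectory, num_points=32):
    """Resample a (frames, 3) hand trajectory to a fixed number of points."""
    trajectory = np.asarray(trajectory, dtype=float)
    old_idx = np.linspace(0.0, 1.0, len(trajectory))
    new_idx = np.linspace(0.0, 1.0, num_points)
    return np.stack(
        [np.interp(new_idx, old_idx, trajectory[:, d]) for d in range(3)], axis=1
    )

def normalize(trajectory):
    """Center on the first frame and scale so overall hand-path size does not matter."""
    traj = trajectory - trajectory[0]
    scale = np.linalg.norm(traj, axis=1).max()
    return traj / scale if scale > 0 else traj

def classify(sample, templates):
    """Return the label of the template trajectory closest to the observed sample."""
    query = normalize(resample(sample)).ravel()
    best_label, best_dist = None, np.inf
    for label, template in templates:
        dist = np.linalg.norm(query - normalize(resample(template)).ravel())
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Toy templates: fabricated hand paths standing in for recorded signs.
templates = [
    ("hello",  [[0, 0, 0], [0.1, 0.2, 0], [0.2, 0.4, 0], [0.3, 0.5, 0]]),
    ("thanks", [[0, 0, 0], [0.0, -0.2, 0.1], [0.0, -0.4, 0.2], [0.0, -0.5, 0.3]]),
]

observed = [[0, 0, 0], [0.12, 0.18, 0], [0.22, 0.38, 0.01], [0.31, 0.52, 0]]
print(classify(observed, templates))  # -> "hello"
```

A real system would of course work with many joints, continuous signing, and far more robust models than this toy matcher, which is only meant to make the idea of translating tracked motion into words concrete.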

The researchers stressed that their system is still a work in progress, but they hope eventually to create a system that is fully functional and reasonably inexpensive. That would mean a Kinect-based communication system that operates entirely with hand gestures and spoken words—all in real time. It would also allow for conversion to other sign language dialects.
