Sign language over a mobile phone

Aug 22, 2008

A group at the University of Washington has developed software that for the first time enables deaf and hard-of-hearing Americans to use sign language over a mobile phone. UW engineers got the phones working together this spring, and recently received a National Science Foundation grant for a 20-person field project that will begin next year in Seattle.

This is the first time two-way real-time video communication has been demonstrated over cell phones in the United States. Since the team posted a video of the working prototype on YouTube, deaf people around the country have been writing to them daily.

"A lot of people are excited about this," said principal investigator Eve Riskin, a UW professor of electrical engineering. For mobile communication, deaf people now communicate by cell phone using text messages. "But the point is you want to be able to communicate in your native language," Riskin said. "For deaf people that's American Sign Language."

Video is much better than text messaging because it is faster and better at conveying emotion, said Jessica DeWitt, a UW undergraduate in psychology who is deaf and a collaborator on the MobileASL project. She says a large part of her communication happens through facial expressions, which are transmitted over the video phones.

Low data transmission rates on U.S. cellular networks, combined with the limited processing power of mobile devices, have so far prevented real-time video with enough frames per second to carry sign language. U.S. cellular networks support roughly one tenth of the data rates common in Europe and Asia, where sign language over cell phones is already possible in countries such as Sweden and Japan.
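To see why bitrate is the bottleneck, a back-of-the-envelope calculation helps. The numbers below are illustrative assumptions, not figures from the MobileASL project:

```python
# Rough frame budget for low-bitrate video. The bitrate and frame-rate
# values are made-up placeholders for a slow U.S. cellular link.

def bytes_per_frame(bitrate_kbps: float, fps: float) -> float:
    """Average compressed size available per frame, in bytes."""
    return bitrate_kbps * 1000 / 8 / fps

# Suppose the network sustains about 30 kbit/s, and intelligible
# signing needs on the order of 10 frames per second:
budget = bytes_per_frame(30, 10)
print(f"{budget:.0f} bytes per frame")  # prints "375 bytes per frame"
```

A few hundred bytes per frame leaves very little room for detail, which is why the project concentrates its bits where they matter most.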

Even as faster networks are becoming more common in the United States, there is still a need for phones that would operate on the slower systems.

"The faster networks are not available everywhere," said doctoral student Anna Cavender. "They also cost more. We don't think it's fair for someone who's deaf to have to pay more for his or her cell phone than someone who's hearing."

The team tried different ways to get comprehensible sign language on low-resolution video. They discovered that the most important part of the image to transmit in high resolution is around the face. This is not surprising, since eye-tracking studies have already shown that people spend the most time looking at a person's face while they are signing.

The current version of MobileASL uses a standard video compression tool to stay within the data transmission limit. Future versions will incorporate custom tools to get better quality. The team developed a scheme to transmit the person's face and hands in high resolution, and the background in lower resolution. Now they are working on another feature that identifies when people are moving their hands, to reduce battery consumption and processing power when the person is not signing.
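The two ideas in that paragraph can be sketched in a few lines. This is a minimal illustration, not the project's actual code; the region coordinates, quality weights, and motion threshold are invented placeholders:

```python
# Sketch of (1) region-of-interest encoding that spends more bits near
# the face, and (2) frame differencing to detect whether the signer's
# hands are moving. All parameter values are hypothetical.

import numpy as np

def roi_quality_map(height: int, width: int, face_box: tuple) -> np.ndarray:
    """Per-pixel quality weights: high inside the face region, low in the
    background. A real encoder would translate these into quantizer steps."""
    qmap = np.full((height, width), 0.2)   # low quality for background
    top, left, bottom, right = face_box
    qmap[top:bottom, left:right] = 1.0     # high quality around the face
    return qmap

def is_signing(prev_frame: np.ndarray, cur_frame: np.ndarray,
               threshold: float = 8.0) -> bool:
    """Crude activity detector: mean absolute pixel change between two
    consecutive grayscale frames. When the scene is still, the encoder
    could drop to a very low frame rate to save battery and CPU."""
    diff = np.abs(cur_frame.astype(float) - prev_frame.astype(float))
    return float(diff.mean()) > threshold
```

In this sketch the face box would come from a face detector running on each frame; the activity test is just a global motion measure, whereas the team's feature targets hand movement specifically.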

The team is currently using phones imported from Europe, which are the only ones they could find that would be compatible with the software and have a camera and video screen located on the same side of the phone so that people can film themselves while watching the screen.

Mobile video sign language won't be widely available until the service is provided through a commercial cell-phone manufacturer, Riskin said. The team has already been in discussion with a major cellular network provider that has expressed interest in the project.

Source: University of Washington
