Sign language speakers' hands, mouths operate separately

August 23, 2010

When people communicate in sign languages, they also move their mouths. But scientists have debated whether mouth movements resembling spoken words are part of the sign itself or are borrowed directly from English. In a new study of British Sign Language, signers made different mistakes with their hands and with their mouths—which suggests that hand and lip movements are processed separately in the signer's brain, not as parts of a single sign.

David P. Vinson, of University College London, and his colleagues Robin L. Thompson, Robert Skinner, Neil Fox, and Gabriella Vigliocco set out to do basic research on how signers process language. They recruited both deaf and hearing signers, all of whom grew up signing with deaf parents. Each person sat in front of a monitor with a video camera pointed at them. They were shown sets of pictures—for example, one set contained various fruits, another set contained modes of transportation—and were asked to sign the name of each item. In another session, they were shown those words in English and asked to translate them into British Sign Language. The idea was to show the pictures or words quickly enough that people tend to make mistakes—mistakes that help reveal how language is processed.

The researchers originally planned to look only at the signs, but the videos also captured the signers' mouths. "We noticed that there were quite a few cases where the hands and the mouth seemed to be doing something different," says Vinson. When people were looking at pictures, the hands and mouth would usually make the same mistakes—signing and mouthing "banana" when the picture was an apple, for example. But when they were translating English words, the hands made the same kinds of mistakes while the lips didn't. This suggests that the lip movement isn't part of the sign. "In essence, they're doing the same thing as reading an English word aloud without pronouncing it," says Vinson. "So they seem to be processing two languages at the same time." This study appears in Psychological Science, a journal of the Association for Psychological Science.

British Sign Language is a separate language from both English and American Sign Language; it developed naturally, and is mentioned in historical records as far back as 1576. Most British signers are bilingual in English. Vinson speculates that mouthing English words may help deaf people develop literacy in English.
