Words, gestures are translated by same brain regions, says new research

Nov 09, 2009

Your ability to make sense of Groucho's words and Harpo's pantomimes in an old Marx Brothers movie takes place in the same regions of your brain, says new research funded by the National Institute on Deafness and Other Communication Disorders (NIDCD), one of the National Institutes of Health.

In a study published in this week's Early Edition of the Proceedings of the National Academy of Sciences (PNAS), researchers have shown that the brain regions long recognized as centers in which spoken or written words are decoded are also important in interpreting wordless gestures. The findings suggest that these brain regions may play a much broader role in the interpretation of symbols than researchers have thought and, for this reason, could be the evolutionary starting point from which language originated.

"In babies, the ability to communicate through gestures precedes spoken language, and you can predict a child's language skills based on the repertoire of his or her gestures during those early months," said James F. Battey, Jr., M.D., Ph.D., director of the NIDCD. "These findings not only provide compelling evidence regarding where language may have come from, they help explain the interplay that exists between language and gesture as children develop their language skills."

Scientists have known that sign language is largely processed in the same regions of the brain as spoken language. These regions include the inferior frontal gyrus, or Broca's area, in the front left side of the brain, and the posterior temporal region, commonly referred to as Wernicke's area, toward the back left side of the brain. It isn't surprising that signed and spoken language activate the same brain regions, because sign language operates in the same way as spoken language does—with its own vocabulary and rules of grammar.

In this study, NIDCD researchers, in collaboration with scientists from Hofstra University School of Medicine, Hempstead, N.Y., and San Diego State University, wanted to find out if non-language-related gestures—the hand and body movements we use that convey meaning on their own, without having to be translated into specific words or phrases—are processed in the same regions of the brain as language is. Two types of gestures were considered for the study: pantomimes, which mimic objects or actions, such as unscrewing a jar or juggling balls, and emblems, which are commonly used in social interactions and which signify abstract, usually more emotionally charged concepts than pantomimes do. Examples include a hand sweeping across the forehead to indicate "it's hot in here!" or a finger to the lips to signify "be quiet."

While inside a functional MRI scanner, 20 healthy, English-speaking volunteers—nine males and 11 females—watched video clips of a person either acting out one of the two gesture types or voicing the phrases that the gestures represent. As controls, the volunteers also watched clips of the person making meaningless gestures or speaking pseudowords—speech that had been chopped up and randomly reorganized so the brain would not interpret it as language. Volunteers watched 60 video clips for each of the six stimulus types, with the clips presented in 45-second blocks at a rate of 15 clips per block. A mirror attached to the head coil allowed the volunteers to watch the video projected on the scanner room wall. The scientists then measured brain activity for each stimulus type and looked for similarities and differences, as well as for any communication occurring between individual parts of the brain.
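
For readers who want the arithmetic of this block design laid out, here is a minimal sketch. The clip counts and block timing come from the description above; the condition labels and the derived per-clip duration are illustrative assumptions, not details reported by the researchers.

```python
# Illustrative sketch of the block design described in the article.
# Condition names are assumptions; clip counts and timing are from the text.

CONDITIONS = [
    "pantomime_gesture",    # e.g., miming unscrewing a jar
    "emblem_gesture",       # e.g., finger to the lips for "be quiet"
    "pantomime_speech",     # spoken phrase the pantomime represents
    "emblem_speech",        # spoken phrase the emblem represents
    "meaningless_gesture",  # control condition
    "pseudoword_speech",    # control: scrambled, meaningless speech
]

CLIPS_PER_CONDITION = 60
CLIPS_PER_BLOCK = 15
BLOCK_SECONDS = 45

blocks_per_condition = CLIPS_PER_CONDITION // CLIPS_PER_BLOCK   # 4 blocks
seconds_per_clip = BLOCK_SECONDS / CLIPS_PER_BLOCK              # 3 s per clip
total_blocks = blocks_per_condition * len(CONDITIONS)           # 24 blocks
total_task_minutes = total_blocks * BLOCK_SECONDS / 60          # 18 minutes

print(f"{blocks_per_condition} blocks per condition, "
      f"{seconds_per_clip:.0f} s per clip, "
      f"{total_blocks} blocks (~{total_task_minutes:.0f} min of task).")
```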

The researchers found that for the gesture and spoken language stimuli, the brain was highly activated in the inferior frontal and posterior temporal areas, the long-recognized language regions of the brain.

"If gesture and language were not processed by the same system, you'd have spoken language activating the inferior frontal and posterior temporal areas, and gestures activating other parts of the brain," said Allen Braun, M.D., senior author on the paper, "But in fact we found virtual overlap."

Current thinking in the study of language is that, like a smart search engine that pops up the most suitable Web site at the top of its search results, the posterior temporal region serves as a storehouse of words from which the inferior frontal gyrus selects the most appropriate match. The researchers suggest that, rather than being limited to deciphering words alone, these regions may be able to apply meaning to any incoming symbols, be they words, gestures, images, sounds, or objects. According to Dr. Braun, these regions also may offer a clue to how language evolved.
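
The search-engine analogy can be made concrete with a toy example: a "storehouse" of stored meanings plus a selection step that picks whichever entry best matches an incoming symbol, whether that symbol arrives as a word or as a gesture. Everything in this sketch—the entries, the cues, and the similarity measure—is invented for illustration and has no connection to the study's actual methods.

```python
# Toy analogy for "storehouse + selection", not a neural model.
from difflib import SequenceMatcher

STOREHOUSE = {
    "be quiet": ["quiet", "shush", "finger to lips"],
    "it's hot in here": ["hot", "wipe brow", "hand across forehead"],
    "unscrew the jar": ["unscrew", "twist lid", "open jar"],
}

def best_match(incoming_symbol: str) -> str:
    """Return the stored meaning whose cues best match the incoming symbol."""
    def score(meaning: str) -> float:
        cues = STOREHOUSE[meaning]
        return max(SequenceMatcher(None, incoming_symbol, c).ratio() for c in cues)
    return max(STOREHOUSE, key=score)

# The same lookup handles a spoken word and a description of a wordless gesture.
print(best_match("shush"))                  # -> "be quiet"
print(best_match("hand across forehead"))   # -> "it's hot in here"
```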

"Our results fit a longstanding theory which says that the common ancestor of humans and apes communicated through meaningful gestures and, over time, the brain regions that processed gestures became adapted for using words," he said. "If the theory is correct, our language areas may actually be the remnant of this ancient communication system, one that continues to process as well as language in the human brain."

Dr. Braun adds that developing a better understanding of the systems that support gestures and words may help in the treatment of some patients with aphasia, a disorder that hinders a person's ability to produce or understand language.

Source: NIH/National Institute on Deafness and Other Communication Disorders
