Our brains are wired so we can better hear ourselves speak, study shows

Dec 08, 2010 By Yasmin Anwar
Activity in the auditory cortex when we speak and listen is amplified in some regions of the brain and muted in others. In this image, the black line represents muting activity when we speak. (Courtesy of Adeen Flinker)

(PhysOrg.com) -- Like the mute button on the TV remote control, our brains filter out unwanted noise so we can focus on what we're listening to. But when following our own speech, a new brain study from UC Berkeley shows that instead of one mute button, we have a network of volume settings that can selectively silence and amplify the sounds we make and hear.

Neuroscientists from UC Berkeley, UCSF and Johns Hopkins University tracked the electrical signals emitted from the brains of hospitalized patients. They discovered that neurons in one part of the patients' hearing mechanism were dimmed when they talked, while neurons in other parts lit up.

Their findings, published today (Dec. 8, 2010) in the Journal of Neuroscience, offer new clues about how we hear ourselves above the noise of our surroundings and monitor what we say. Previous studies have shown a selective auditory system in monkeys that can amplify their self-produced mating, food and danger alert calls, but until this latest study, it was not clear how the human auditory system is wired.

"We used to think that the human auditory system is mostly suppressed during speech, but we found closely knit patches of with very different sensitivities to our own speech that paint a more complicated picture," said Adeen Flinker, a doctoral student in neuroscience at UC Berkeley and lead author of the study.

"We found evidence of millions of neurons firing together every time you hear a sound right next to millions of ignoring external sounds but firing together every time you speak," Flinker added. "Such a mosaic of responses could play an important role in how we are able to distinguish our own speech from that of others."

While the study doesn't specifically address why humans need to track their own speech so closely, Flinker theorizes that, among other things, tracking our own speech is important for language development, monitoring what we say and adjusting to various noise environments.

"Whether it's learning a new language or talking to friends in a noisy bar, we need to hear what we say and change our speech dynamically according to our needs and environment," Flinker said.

He noted that people with schizophrenia have trouble distinguishing their own internal voices from the voices of others, suggesting that they may lack this selective auditory mechanism. The findings may be helpful in better understanding some aspects of auditory hallucinations, he said.

Moreover, with the finding of sub-regions of cells each tasked with a different volume control job – and located just a few millimeters apart – the results pave the way for a more detailed mapping of the auditory cortex to guide brain surgery.

In addition to Flinker, the study's authors are Robert Knight, director of the Helen Wills Neuroscience Institute at UC Berkeley; neurosurgeons Edward Chang and Nicholas Barbaro and neurologist Heidi Kirsch of the University of California, San Francisco; and Nathan Crone, a neurologist at Johns Hopkins University in Maryland.

The auditory cortex is a region of the brain's temporal lobe that deals with sound. In hearing, the human ear converts vibrations into electrical signals that are sent to relay stations in the brain's auditory cortex where they are refined and processed. Language is mostly processed in the left hemisphere of the brain.

In the study, researchers examined the electrical activity in the healthy brain tissue of patients who were being treated for seizures. The patients had volunteered to help out in the experiment during lulls in their treatment, as electrodes had already been implanted over their auditory cortices to track the focal points of their seizures.

Researchers instructed the patients to perform such tasks as repeating words and vowels they heard, and recorded the resulting brain activity. Comparing the electrical signals recorded during speaking and listening, they found that some regions of the auditory cortex showed less activity during speech, while others showed the same or higher levels.
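To make that comparison concrete, here is a minimal Python sketch, using synthetic data rather than the study's recordings, of the kind of per-electrode contrast the researchers describe: average signal power while a patient speaks a word versus while the patient listens to one, with each electrode then labeled as suppressed, unchanged, or amplified during speech. The thresholds, data, and labels are illustrative assumptions, not values or methods from the paper.

```python
# Toy per-electrode speak-vs-listen comparison (synthetic data only).
import numpy as np

rng = np.random.default_rng(0)

n_electrodes = 8   # hypothetical electrodes over the auditory cortex
n_trials = 50      # hypothetical trials per condition

# Trial-by-trial signal power per electrode (arbitrary units):
# rows = electrodes, columns = trials.
listen_power = rng.normal(loc=1.0, scale=0.1, size=(n_electrodes, n_trials))

# During speech, each electrode's power is scaled by its own factor,
# mimicking a mosaic of suppressed and amplified patches.
speech_gain = rng.normal(loc=1.0, scale=0.3, size=(n_electrodes, 1))
speak_power = listen_power * speech_gain

for e in range(n_electrodes):
    ratio = speak_power[e].mean() / listen_power[e].mean()
    if ratio < 0.9:
        label = "suppressed during speech"
    elif ratio > 1.1:
        label = "amplified during speech"
    else:
        label = "no clear change"
    print(f"electrode {e}: speak/listen power ratio = {ratio:.2f} ({label})")
```

Run as-is, the script prints a mix of suppressed, amplified, and unchanged electrodes, echoing the "network of volume settings" picture described above.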

"This shows that our brain has a complex sensitivity to our own speech that helps us distinguish between our vocalizations and those of others, and makes sure that what we say is actually what we meant to say," Flinker said.
