Sound adds speed to visual perception

August 12, 2008

The traditional view of individual brain areas involved in perception of different sensory stimuli—i.e., one brain region involved in hearing and another involved in seeing—has been thrown into doubt in recent years. A new study, published in the online open access journal BMC Neuroscience, shows that, in monkeys, the region involved in hearing can directly improve perception in the visual region, without the involvement of other structures to integrate the senses.

Integration of sensory stimuli has traditionally been thought of as hierarchical, involving brain areas that receive signals from distinct areas of the brain layer known as the cortex that recognise different stimuli. But the recent finding of nerve cells projecting from the auditory cortex (associated with the perception of sound) directly into the visual cortex (associated with sight) suggests that perception of one sense might affect that of another without the involvement of higher brain areas.

"Auditory or visual–auditory responses in the primary visual cortex are highly probable given the presence of direct projections from the primary auditory cortex", explain P. Barone and colleagues from the Centre for Brain and Cognition Research, Toulouse, France. "We looked for modulation of the neuronal visual responses in the primary visual cortex by auditory stimuli in an awake monkey."

The researchers recorded the neuronal responses with microelectrodes inserted directly into the primary visual cortex of a rhesus macaque. The monkey was then required to orient its gaze towards a visual stimulus. The time taken for the neurons in the visual cortex to respond to the stimulus, or latency, was recorded. Barone and colleagues then measured the latency when the visual stimulus was accompanied by a sound emanating from the same spot. When the visual signal was strong—i.e., high contrast—the auditory stimulus did not affect latency; however, if the visual signal was weaker—i.e., low contrast—latency decreased by 5-10%, suggesting that in some way the auditory stimulus speeds up the response to the visual stimulus.
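The latency comparison described above can be sketched with a small calculation. The sketch below uses invented latency values purely for illustration (the article reports only a 5-10% reduction, not the underlying numbers), and the function name is hypothetical:

```python
# Illustrative sketch: comparing neuronal response latencies with and
# without an accompanying sound, as in the experiment described above.
# All latency values below are hypothetical, chosen only to demonstrate
# the arithmetic behind a percent-reduction figure.

def percent_latency_reduction(visual_only_ms: float, audiovisual_ms: float) -> float:
    """Percent by which the audiovisual latency undercuts the visual-only latency."""
    return 100.0 * (visual_only_ms - audiovisual_ms) / visual_only_ms

# Hypothetical mean latencies (ms) for a low-contrast visual stimulus
visual_only = 80.0   # visual stimulus alone
audiovisual = 74.0   # same stimulus paired with a co-located sound

reduction = percent_latency_reduction(visual_only, audiovisual)
print(f"Latency reduction: {reduction:.1f}%")  # 7.5%, within the reported 5-10% range
```

With these made-up values the sound would shave 7.5% off the response latency, consistent in magnitude with the range the authors report for low-contrast stimuli.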

"Our findings show that single neurons from one primary sensory cortex can integrate information from another sensory modality", the researchers claim. They propose that the auditory cue is processed more quickly than the visual stimulus, and because the monkeys have learned to associate that sound and sight, the visual cortex is primed to perceive the weaker signal. "Our results argue against a strict hierarchical model of sensory integration in the brain and that integration of multiple senses should be added to the list of functions of the primary visual cortex."

Source: BioMed Central
