Sound adds speed to visual perception

August 12, 2008

The traditional view that separate brain areas handle the perception of different senses (one region for hearing, another for seeing) has been thrown into doubt in recent years. A new study published in the online open-access journal BMC Neuroscience shows that, in monkeys, the region involved in hearing can directly improve perception in the visual region, without other brain structures integrating the senses.

Integration of sensory stimuli has traditionally been thought of as hierarchical, carried out by brain areas that receive signals from distinct regions of the cortex (the brain's outer layer), each responding to different stimuli. But the recent finding of nerve cells projecting directly from the auditory cortex (associated with hearing) into the visual cortex (associated with sight) suggests that perception in one sense might affect another without the involvement of higher brain areas.

"Auditory or visual–auditory responses in the primary visual cortex are highly probable given the presence of direct projections from the primary auditory cortex", explain P. Barone and colleagues from the Centre for Brain and Cognition Research, Toulouse, France. "We looked for modulation of the neuronal visual responses in the primary visual cortex by auditory stimuli in an awake monkey."

The researchers recorded neuronal responses with microelectrodes inserted directly into the primary visual cortex of a rhesus macaque. The monkey was required to orient its gaze towards a visual stimulus, and the time taken for neurons in the visual cortex to respond to the stimulus, known as the latency, was recorded. Barone and colleagues then measured the latency when the visual stimulus was accompanied by a sound emanating from the same spot. When the visual signal was strong (high contrast), the auditory stimulus did not affect latency; when the visual signal was weaker (low contrast), latency decreased by 5-10%, suggesting that the auditory stimulus in some way speeds up the response to the visual stimulus.
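The comparison described above reduces to a percent change in response latency between the visual-only and audiovisual conditions. A minimal sketch, using purely hypothetical latency values for illustration (the article reports only the 5-10% range, not raw timings):

```python
def latency_reduction(visual_only_ms: float, audiovisual_ms: float) -> float:
    """Percent decrease in response latency when a sound accompanies the visual stimulus."""
    return 100.0 * (visual_only_ms - audiovisual_ms) / visual_only_ms

# Hypothetical low-contrast condition: latency drops from 80 ms to 74 ms.
print(round(latency_reduction(80.0, 74.0), 1))  # 7.5, within the reported 5-10% range
```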

"Our findings show that single neurons from one primary sensory cortex can integrate information from another sensory modality", the researchers claim. They propose that the auditory cue is processed more quickly than the visual stimulus and, because the monkey has learned to associate the sound with the sight, it primes the visual cortex to perceive the weaker signal. "Our results argue against a strict hierarchical model of sensory integration in the brain and that integration of multiple senses should be added to the list of functions of the primary visual cortex."

Source: BioMed Central
