Virtual reality study shows echolocation in humans not just about the ears

November 12, 2014 by Bob Yirka, report
Illustration of the virtual corridor. Echo-acoustic orientation performance was tested at two positions on the midline of the corridor (positions M1 and M2 at rear wall distances of 75 and 700 cm, respectively) and two positions 75 cm from the left lateral wall (positions L1 and L2 at rear wall distances of 75 and 700 cm, respectively). Credit: Royal Society Open Science, DOI: 10.1098/rsos.140185

A pair of researchers at Ludwig-Maximilians-Universität München in Germany has found that echolocation in humans involves more than just the ears. In their paper published in the journal Royal Society Open Science, Ludwig Wallmeier and Lutz Wiegrebe describe how echolocation is thought to work in humans compared with other animals, and the results of a study they conducted using volunteers and a virtual reality system.

Echolocation is a means of determining the location of nearby objects by emitting sounds and then listening to the echoes that bounce back off them. Bats are perhaps most famous for their echolocation abilities, but many other animals have some degree of the ability as well, including humans. Wallmeier and Wiegrebe note that several studies have recently examined just how well humans can use sound to navigate terrain when they are unable to see. They also note, however, that none of those studies has been able to quantify the ability, which muddies their results. In their study, the pair set out to do just that.
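The underlying geometry is simple: an echo that returns after a delay t places the reflecting surface at roughly c·t/2, since the sound travels out and back. A minimal sketch of that relationship (the numbers here are illustrative, not taken from the study):

```python
# Illustrative sketch of the basic echolocation principle:
# distance to a reflector from the click-to-echo round-trip delay.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def wall_distance(echo_delay_s: float) -> float:
    """Distance to a reflecting surface, in metres.

    The sound travels to the wall and back, so the one-way
    distance is half the total path.
    """
    return SPEED_OF_SOUND * echo_delay_s / 2.0

# An echo arriving ~43.7 ms after the click corresponds to a wall
# about 7.5 m away -- the corridor's larger rear-wall distance.
print(wall_distance(0.0437))
```

A delay difference of about 4.4 ms per metre is well within what trained listeners can resolve, which is why the clicking strategy works at corridor scales.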

To find out how good people are at echolocation and what parts of the body are involved, they enlisted the assistance of eight sighted students—each was asked to wear a blindfold and to make clicking noises as they made their way through a long corridor. Over several weeks' time, each learned to differentiate between sounds that were echoed back to them, which allowed them to gauge wall distance and eventually to walk easily through the corridor with no other assistance.

Once they'd mastered the real corridor, each volunteer was asked to sit at a workstation that simulated a walk through the same corridor, using the same clicks as before. In the simulation, the researchers varied the experience: they tested how well the volunteers performed when they could alter the orientation of the corridor, and how well they could continue their virtual walk when their head or body was held steady, preventing them from sampling the echoes from different angles.

In analyzing all the data they'd collected, the two researchers found that the volunteers lost most of their echolocation abilities when they were restricted from movement—they ran into walls that were easily avoided when allowed to move freely. By moving echolocation to a simulated environment, the researchers believe that they have finally found a way to quantify ability in humans.


More information: Self-motion facilitates echo-acoustic orientation in humans, Royal Society Open Science, DOI: 10.1098/rsos.140185

The ability of blind humans to navigate complex environments through echolocation has received rapidly increasing scientific interest. However, technical limitations have precluded a formal quantification of the interplay between echolocation and self-motion. Here, we use a novel virtual echo-acoustic space technique to formally quantify the influence of self-motion on echo-acoustic orientation. We show that both the vestibular and proprioceptive components of self-motion contribute significantly to successful echo-acoustic orientation in humans: specifically, our results show that vestibular input induced by whole-body self-motion resolves orientation-dependent biases in echo-acoustic cues. Fast head motions, relative to the body, provide additional proprioceptive cues which allow subjects to effectively assess echo-acoustic space referenced against the body orientation. These psychophysical findings clearly demonstrate that human echolocation is well suited to drive precise locomotor adjustments. Our data shed new light on the sensory–motor interactions, and on possible optimization strategies underlying echolocation in humans.


1 comment

So it's motion that allows us to tell front from back. Who'd a thunk it? Me, that's who. A lot of work with binaural recording, where that distinction is largely lost, left me with no other reasonable possibility.

When I was doing that, there wasn't the readily available and cheap head-tracking hardware, but I did work out an HRTF interpolation scheme that might well have worked to prove it. Nor was there readily available processing performance then to do the interpolation in real time, but I think there is now. Were that incorporated into this research, I think their thesis could be more firmly established.
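The interpolation scheme the commenter describes could, in its simplest form, crossfade between two measured head-related transfer function (HRTF) impulse responses as the tracked head angle changes. A purely hypothetical sketch (real systems interpolate more carefully, e.g. in the frequency domain or with aligned onset delays):

```python
# Hypothetical sketch: linear interpolation between two HRTF impulse
# responses measured at known head angles, to approximate the response
# at an intermediate angle reported by a head tracker.
import numpy as np

def interpolate_hrtf(hrtf_a, hrtf_b, angle_a, angle_b, angle):
    """Crossfade two impulse responses measured at angle_a and angle_b.

    Returns a weighted mix approximating the response at `angle`,
    where `angle` lies between angle_a and angle_b.
    """
    w = (angle - angle_a) / (angle_b - angle_a)
    return (1.0 - w) * np.asarray(hrtf_a) + w * np.asarray(hrtf_b)

# Halfway between responses measured at 0 and 10 degrees:
h0 = np.array([1.0, 0.5, 0.25])
h10 = np.array([0.8, 0.6, 0.30])
print(interpolate_hrtf(h0, h10, 0.0, 10.0, 5.0))
```

With cheap head trackers and modern CPUs, updating such an interpolation at audio rate is comfortably within reach, which is the commenter's point.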
