Monkey's Thoughts Make Robot Walk from Across the Globe

January 17, 2008

In a first-of-its-kind experiment, the brain activity of a monkey has been used to control the real-time walking patterns of a robot halfway around the world, according to researchers at Duke University Medical Center.

The Duke team is working with the Computational Brain Project of the Japan Science and Technology Agency (JST) on technology they hope will one day help those with paralysis regain the ability to walk.


"We believe this research could have significant implications for severely paralyzed patients," said senior study investigator Miguel Nicolelis, M.D., Ph.D., the Anne W. Deane Professor of Neuroscience at Duke. "This is a breakthrough in our understanding of how the brain controls the movement of our legs, which is vital information needed to ultimately develop robotic prosthesis."

Researchers used some of the most sophisticated methods available to capture activity from hundreds of brain cells in multiple areas of the brain. To collect this information, two rhesus monkeys were implanted with electrodes that recorded the activity of cells in the brain's motor and sensory cortices. These recordings captured how the cells responded as the monkeys walked on a treadmill at a variety of speeds, both forward and backward. At the same time, sensors on the monkeys' legs tracked the legs' actual movement patterns.
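The article does not describe the recording pipeline itself, but the basic step it implies is aligning spike trains from many neurons with the leg-sensor samples in time. The sketch below is only an illustration of that idea: the spike timestamps, bin width, and stand-in gait signal are all made up for the example.

```python
import numpy as np

def bin_spikes(spike_times_per_neuron, bin_edges):
    """Count spikes from each neuron in fixed time bins (e.g., 100 ms)."""
    counts = np.stack([
        np.histogram(times, bins=bin_edges)[0]
        for times in spike_times_per_neuron
    ])
    return counts.T  # shape: (n_bins, n_neurons)

# Made-up example: 3 neurons, 10 seconds of treadmill walking, 100 ms bins.
rng = np.random.default_rng(0)
spikes = [np.sort(rng.uniform(0, 10, size=int(rng.integers(50, 200))))
          for _ in range(3)]
edges = np.arange(0, 10.05, 0.1)                  # 101 edges -> 100 bins
neural_bins = bin_spikes(spikes, edges)           # (100, 3) spike counts
leg_position = np.sin(2 * np.pi * edges[:-1])     # stand-in leg-sensor signal
```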

Using mathematical models, the researchers analyzed the relationship between the leg movements and the brain-cell activity to determine how well the information gathered from the brain cells could predict the speed and stride length of the legs.
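As a minimal sketch of this kind of decoding, one common approach is a linear model fit from lagged firing rates to a kinematic variable. This is an assumption for illustration, not the Duke team's actual model; it reuses the hypothetical `neural_bins` and `leg_position` arrays from the recording sketch above.

```python
import numpy as np

def build_lagged_design(rates, n_lags=5):
    """Stack the current bin plus the previous n_lags bins of firing rates."""
    n_bins, _ = rates.shape
    return np.array([rates[t - n_lags:t + 1].ravel()
                     for t in range(n_lags, n_bins)])

X = build_lagged_design(neural_bins, n_lags=5)   # (n_bins - 5, 6 * n_neurons)
X = np.hstack([X, np.ones((len(X), 1))])         # add a bias column
y = leg_position[5:]                             # align target with the lags
weights, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares fit
predicted = X @ weights                          # decoded leg trajectory
```

How closely `predicted` tracks the sensor signal is one way to quantify how much of the walking pattern the recorded neurons can account for.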

"We found that certain neurons in multiple areas of the brain fire at different phases and at varying frequency, depending on their role in controlling the complex, multi-muscle process of motion. Each neuron provides us with a small piece of the puzzle that we compile to predict the walking pattern of the monkeys with high accuracy," Nicolelis explained.

"In this experiment, we were able to record brain activity, predict what the pattern of locomotion will be and send the signal from the motor commands of the animal to the robot," Nicolelis said. "We also created a real-time transmission of information that allowed the brain activity of the monkey in North Carolina to control the commands of a robot in Japan. As a result, they can walk in complete synchronization."

"We are delighted with the remarkable outcome of this collaboration between Duke University and JST, as now we can further advance our research to better understand how the brain processes information," said Mitsuo Kawato, M.E., Ph.D., director of ATR Computational Neuroscience Laboratories and research director of the Computational Brain Project of JST.

The experiment built on earlier work conducted by Nicolelis' laboratory in which monkeys were able to control the reaching and grasping movements of a robotic arm with only their brain signals.

"We are also exploring how the brain processes feedback sensations - both visual and electrical - from the robot. This feedback plays a critical role in completing the act of walking. In essence, we are seeking to capture the information that the foot sends to your brain when it touches the ground as you walk," Nicolelis said.

He added, "The most stunning finding is that when we stopped the treadmill and the monkey ceased to move its legs, it was able to sustain the locomotion of the robot for a few minutes - just by thinking - using only the visual feedback of the robot in Japan."

The researchers estimate that work to develop prototypes of robotic leg braces for potential use in humans will begin within the next year.

Source: Duke University
