Researchers develop a robot that can learn to navigate through its environment guided by external stimuli (w/ Video)

Feb 05, 2014

Researchers at Freie Universität Berlin, the Bernstein Fokus Neuronal Basis of Learning, and the Bernstein Center Berlin have developed a robot that perceives environmental stimuli and learns to react to them. The scientists used the relatively simple nervous system of the honeybee as a model for its working principles. To this end, they installed a camera on a small robotic vehicle and connected it to a computer. The computer program replicated, in a simplified way, the sensorimotor network of the insect brain. The input data came from the camera, which, akin to an eye, received and projected visual information. The neural network, in turn, operated the motors of the robot's wheels and could thus control its direction of motion.
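The sensorimotor loop described above, camera input feeding a network that drives the wheel motors, can be sketched in a few lines. This is a hypothetical, heavily simplified rate-based illustration; the function names (`color_channels`, `steer`) and the reduction of a frame to red/blue activations are assumptions for the example, not details of the actual robot software, which used a spiking network.

```python
import numpy as np

def color_channels(frame):
    """Reduce a camera frame (H x W x 3, RGB) to mean red/blue activation."""
    red = frame[:, :, 0].mean() / 255.0
    blue = frame[:, :, 2].mean() / 255.0
    return np.array([red, blue])

def steer(frame, weights):
    """Map visual input to a forward/backward motor command.

    weights: 2-vector coupling the red and blue channels to the motors;
    a positive net drive moves the robot forward, a negative one backward.
    """
    drive = float(color_channels(frame) @ weights)
    return drive  # >0: approach, <0: retreat

# Example: a mostly-red frame with weights favoring red yields forward drive.
red_frame = np.zeros((4, 4, 3))
red_frame[:, :, 0] = 200
print(steer(red_frame, np.array([1.0, -1.0])) > 0)  # True
```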

The outstanding feature of this artificial mini brain is its ability to learn by simple principles. "The network-controlled robot is able to link certain external stimuli with behavioral rules," says Professor Martin Paul Nawrot, head of the research team and professor of neuroscience at Freie Universität Berlin. "Much like honeybees learn to associate certain flower colors with tasty nectar, the robot learns to approach certain colored objects and to avoid others."

In the learning experiment, the scientists placed the network-controlled robot in the center of a small arena. Red and blue objects were installed on the walls. Once the robot's camera focused on an object of the desired color, red for instance, the scientists triggered a light flash. This signal activated a so-called reward sensor nerve cell in the artificial network. The simultaneous processing of the red color and the reward then led to specific changes in those parts of the network that exercised control over the robot's wheels. As a consequence, when the robot "saw" another red object, it started to move toward it. Blue items, in contrast, made it move backwards. "Just within seconds, the robot accomplishes the task of finding an object of the desired color and approaching it," explains Nawrot. "Only a single learning trial is needed, similar to experimental observations in honeybees."
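The one-trial reward learning described above can be sketched as a reward-gated Hebbian weight update: when the reward signal (the light flash) coincides with activity in a color channel, the connection from that channel to the approach command is strengthened in a single step. The rule, the gain value, and the two-channel encoding are illustrative assumptions for this sketch; the actual robot used a spiking neural network with its own plasticity mechanism.

```python
import numpy as np

def reward_update(weights, stimulus, reward, gain=2.0):
    """Hebbian-style update gated by reward: dw = gain * reward * stimulus."""
    return weights + gain * reward * stimulus

weights = np.zeros(2)                 # [red->approach, blue->approach], untrained
red_stimulus = np.array([1.0, 0.0])
blue_stimulus = np.array([0.0, 1.0])

# A single rewarded pairing of red with the light flash (reward = +1) ...
weights = reward_update(weights, red_stimulus, reward=+1.0)
# ... and a single negative pairing of blue (reward = -1) suffice:
weights = reward_update(weights, blue_stimulus, reward=-1.0)

print(weights)  # [ 2. -2.] -> red now drives approach, blue drives retreat
```

Because the update is applied in full on each pairing rather than accumulated over many small steps, a single trial is enough to flip the behavior, mirroring the one-trial learning reported for the robot and for honeybees.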

The current study was carried out at Freie Universität Berlin as an interdisciplinary collaboration between the research groups "Neuroinformatics" (Institute of Biology), led by Professor Martin Paul Nawrot, and "Artificial Intelligence" (Institute of Computer Science), led by Professor Raúl Rojas. The scientists are now planning to expand their network by adding more learning principles, making the mini brain even more powerful and more autonomous.

The Bernstein Focus "Neuronal Basis of Learning," with its project "Insect inspired robots: towards an understanding of memory in decision making," and the Bernstein Center Berlin are part of the National Bernstein Network Computational Neuroscience in Germany. Through this funding initiative, the German Federal Ministry of Education and Research has supported the new discipline of computational neuroscience since 2004 with more than 170 million euros. The network is named after the German physiologist Julius Bernstein (1835–1917).

More information: L. I. Helgadóttir, J. Haenicke, T. Landgraf, R. Rojas & M. P. Nawrot (2013): "Conditioned behavior in a robot controlled by a spiking neural network." 6th International IEEE/EMBS Conference on Neural Engineering (NER), 891-894, dx.doi.org/10.1109/NER.2013.6696078
