'Simplified' brain lets the iCub robot learn language (w/ Video)

(Phys.org)—The iCub humanoid robot can now learn and understand language, a technological feat made possible by the development of a "simplified artificial brain" that reproduces certain types of so-called "recurrent" connections observed in the human brain.

The artificial brain enables the robot to learn, and subsequently understand, new sentences containing a new grammatical structure. It can link two sentences together and even predict how a sentence will end before it is uttered. This research has been published in the journal PLoS ONE.

Researchers from INSERM, CNRS and Université Lyon 1 have succeeded in developing an "artificial brain" constructed on the basis of a fundamental principle of the workings of the human brain, namely its ability to learn a new language. The model was developed after years of research in INSERM Unit 846 at the Institut de recherche sur les cellules souches et cerveau, through studying the structure of the human brain and understanding the mechanisms used for learning.

One of the most remarkable aspects of language processing is the speed at which it is performed. For example, the human brain processes the first words of a sentence in real time and anticipates what follows, thus improving the speed with which humans process information. Still in real time, the brain continually revises its predictions through interaction between new information and a previously created context. A specific region inside the brain plays a crucial role in this process.

Based on this research, Peter Ford Dominey and his team have developed an "artificial brain" that uses a "neuronal construction" similar to that used by the human brain. Thanks to this so-called recurrent construction (connections that create locally recurring loops), the artificial brain system can understand new sentences having a new grammatical structure. It is capable of linking two sentences and can even predict the end of a sentence before it is provided.
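To make the idea of "locally recurring loops" concrete, here is a minimal sketch of a recurrently connected network of the kind the article describes: each word of a sentence is fed in one at a time, and the recurrent loop lets the network's state carry a trace of the words heard so far, which is what a trained readout could use to anticipate the sentence's end. All names, sizes, and parameters below are illustrative assumptions, not the authors' actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res = 5, 100  # vocabulary size and number of recurrent units (assumed)
W_in = rng.uniform(-1, 1, (n_res, n_in))    # fixed random input weights
W = rng.uniform(-0.5, 0.5, (n_res, n_res))  # fixed random recurrent weights
# Scale the recurrent weights so activity neither explodes nor dies out
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_network(inputs, leak=0.3):
    """Feed a sequence of input vectors through the recurrent loop,
    returning the network state after each step."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        # Leaky update: the W @ x term is the recurrent loop that keeps
        # a fading trace of earlier words in the sentence.
        x = (1 - leak) * x + leak * np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# Each "word" is a one-hot vector; a four-word sentence yields four states.
sentence = np.eye(n_in)[[0, 2, 1, 4]]
states = run_network(sentence)
print(states.shape)  # one network state per word
```

Because the final state depends on the whole word sequence, a simple linear readout trained on such states can, in principle, map a partial sentence to a prediction of what follows; the actual model learns grammatical structure this way rather than through hand-programmed rules.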

To put this advance into a real-life situation, the INSERM researchers incorporated this new brain into the iCub humanoid robot.

In a video demonstration, a researcher asks the iCub robot to point to a guitar (represented by a blue object) and then asks it to move a violin (represented by a red object) to the left. Before performing the task, the robot repeats the sentence and explains that it has fully understood what it has been asked to do.

For the researchers, the contribution that this work makes to research into certain diseases is of major importance. The system can be used to better understand the way in which the brain processes language. "We know that when an unexpected word occurs in a sentence, the brain reacts in a particular way. These reactions can be recorded by sensors placed on the scalp," explains Peter Ford Dominey. The model developed by Dr Xavier Hinaut and Dr Peter Ford Dominey makes it possible to identify the source of these responses in the brain. If this model, based on the organisation of the cerebral cortex, is accurate, it could contribute to understanding possible linguistic malfunctions in Parkinson's disease.

This research has another important implication, that of contributing to the ability of robots to learn a language one day. "At present, engineers are simply unable to program all of the knowledge required in a robot. We now know that the way in which robots acquire their knowledge of the world could be partially achieved through a learning process – in the same way as children", explains Peter Ford Dominey.


Journal information: PLoS ONE

Citation: 'Simplified' brain lets the iCub robot learn language (w/ Video) (2013, February 19) retrieved 17 July 2019 from https://phys.org/news/2013-02-brain-icub-robot-language-video.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.


User comments

Feb 19, 2013
So to make a computer more human, first it has to be simplified.
That makes sense.
Evolution would be parsimonious with its resources.

Feb 19, 2013
I had trouble understanding what he wanted.

Feb 20, 2013
I had trouble understanding what he wanted.
It's just a variation of grammar; quite a few European languages have this to a degree, and many can accommodate both, e.g.:

"This lamp post is blue" is equivalent to:-

"It is blue, this lamp post"

The sentence the guy spoke to the robot didn't have a comma. Humans recognise this easily, with a background of having heard other languages, whereas the robot is programmed to listen for the slight pause (if any) where the comma would have been had it been typed.

Feb 20, 2013
This is the clearest case of real, reasonably free-form verbal communication between a machine and a person that I have seen.


In 20 years this will be commonplace: faster, more comprehensive in terms of knowledge and vocabulary, and more fluid, to the point of approaching human-to-human communication.

The interesting thing is that once you train a machine, its knowledge is available to equivalently structured machines.

Train one, and all can know.

Still, there is the problem of sensory modeling that will be required if machines are to grok the world around them rather than simply churn through a database of factoids.

This will take significantly longer.

Feb 20, 2013
@vendicarE, within ten years it will be common. The more breakthroughs we achieve, the more breakthroughs we can achieve (unless somebody holds the technology to themselves for whatever reason).

Feb 20, 2013
Another test: make it aware of 20 sentences each in 10 languages, then ask it WHAT LANGUAGE IT HAS HEARD!

Feb 21, 2013
Failed: both actions occurred on the same object when two objects were specified in the instructions, but it still demonstrates an advance in speech recognition and AI.
