Researchers investigate early language acquisition in robots

Aug 24, 2012

(Phys.org)—Research into robotics continues to grow in Europe, and the introduction of humanoid robots has compelled scientists to investigate language acquisition. A case in point is a team of researchers in the United Kingdom that studied how robots could acquire linguistic skills. Presented in the journal PLoS ONE, the study focused on an early developmental stage analogous to that of a human child between 6 and 14 months of age: the transition from babbling to first word forms. The results, which shed light on the potential of human-robot interaction systems for studies of early language acquisition, are an outcome of the ITALK ('Integration and transfer of action and language knowledge in robots') project, which received EUR 6.3 million under the 'Information and communication technologies' (ICT) Theme of the EU's Seventh Framework Programme (FP7).

Researchers from the Adaptive Systems Research Group at the University of Hertfordshire in the United Kingdom have discovered that a robot analogous to a child between 6 and 14 months old has the ability to develop rudimentary linguistic skills. The robot, called DeeChee, moved from syllabic babble to recognisable word forms, including the names of colours and shapes, after it 'conversed' with human volunteers, who were told to speak to the robot as if it were a small child.

'It is known that infants are sensitive to the frequency of sounds in speech, and these experiments show how this sensitivity can be modelled and contribute to the learning of word forms by a robot,' said lead author Caroline Lyon of the University of Hertfordshire.

In their paper, the authors wrote: 'We wanted to explore human-robot interaction and were deliberately not prescriptive. However, leaving participants to talk naturally opened up possibilities of a wide range of behaviour, possibilities that were certainly realised. Some participants were better teachers than others: some of the less good produced very sparse utterances, while other talkative participants praised DeeChee whatever it did, which skewed the learning process towards non-words.'

The researchers said one of the reasons the robot learnt the words is that the teachers said them repeatedly, a response the team had anticipated. The second reason is that the non-salient word strings were variable, so their frequencies were spread thinly. According to the team, the same phenomenon is the basis of a number of automated plagiarism detectors, in which precise matches of short lexical strings indicate copying. Lastly, they noted that the phonemic representation of the teacher's speech to the robot is not a uniformly stable mapping of sounds.
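The frequency principle the researchers describe can be illustrated with a toy sketch (an illustrative assumption, not the ITALK project's actual code): word forms the teacher repeats accumulate high counts in a simple tally, while variable filler strings spread their frequency thinly and never stand out.

```python
# Toy illustration (not the ITALK project's code) of frequency-based
# word-form learning: tokens the teacher repeats often dominate the
# count distribution, while variable strings stay near the noise floor.
from collections import Counter

def token_counts(utterances):
    """Count each whitespace-separated token across all utterances."""
    counts = Counter()
    for utterance in utterances:
        counts.update(utterance.lower().split())
    return counts

# Hypothetical teacher speech, stripped to content words for clarity.
teacher_speech = [
    "red square",
    "square yes good square",
    "clever robot well done",
    "see red square",
]

counts = token_counts(teacher_speech)
print(counts.most_common(1))  # [('square', 4)] - the repeated word dominates
```

In this toy tally the repeated word form 'square' rises above the variable praise and filler, mirroring how DeeChee's salient word forms emerged from the teachers' repetition.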

'The frequencies of syllables in words with variable phonemic forms may be attenuated compared with those in salient content words, or parts of such words,' they wrote. 'It has long been realised that there is in practice a great deal of variation in spontaneous speech. This work shows the potential of human-robot interaction systems to be used in studies of early language acquisition, and the iterative development methodology highlights how the embodied nature of interaction may bring to light important factors in the dynamics of language acquisition that would otherwise not occur to modellers.'


More information: Lyon, C., et al. 'Interactive Language Learning by Robots: The Transition from Babbling to Word Forms'. PLoS ONE 7(6): e38236. doi:10.1371/journal.pone.0038236

