The Next Level in Robots: Monkey See, Monkey Do, Monkey Create

Oct 29, 2007 by Mary Anne Simpson
Ancient Japanese Three Wise Monkeys - Photo Credit: Wikipedia

The next level of robot is currently in the research and development stage at Japan's National Institute of Information and Communications Technology. Untethered from constant human supervision, this robot takes cues from gestures and makes immediate, appropriate responses.

The National Institute of Information and Communications Technology is working on a project in which machines learn and teach themselves what to do. At present, robots are either tethered to human commands or guided in real time by programs written in advance. The new level of robot will take its cues from gestures and operate more autonomously through a learning process.

The Institute's Spoken Language Group is developing a robot, measuring 155 cm and weighing 85 kg, that learns through gestures and thereby becomes more autonomous. The group's main focus is an information communication system that correctly understands what people say and automatically takes appropriate action toward people and other machines, based on the knowledge it gains from the speech of the people around it.

According to the Institute, the current research aims at stress-free, unambiguous communication: a machine that understands immediately and immediately conveys its understanding to a person or another machine. The primary goal is to establish a technology for delivering messages to network terminals through people's natural expressions, such as gestures, hand signals and body language, which transcend language differences and allow for approximation.
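The Institute has not published how such gesture-derived messages would be represented or transmitted, so what follows is only a rough sketch, in Python, of the general idea: a recognized gesture is mapped to a language-independent message and relayed to a networked terminal. The gesture labels, message format, host name and port are hypothetical and not drawn from NICT's work.

# Illustrative sketch only: NICT has not published an API for this system.
# Gesture labels, message format, host and port below are all hypothetical.

import json
import socket

# Hypothetical mapping from recognized gestures to language-independent messages.
GESTURE_MESSAGES = {
    "point": {"intent": "indicate", "detail": "object_of_interest"},
    "bow": {"intent": "greet", "detail": "respectful"},
    "wave": {"intent": "attention", "detail": "summon"},
}

def relay_gesture(gesture, host="terminal.local", port=9000):
    """Translate a recognized gesture into a message and send it to a terminal."""
    message = GESTURE_MESSAGES.get(gesture)
    if message is None:
        return  # Unknown gesture: take no action rather than guess.
    payload = json.dumps(message).encode("utf-8")
    with socket.create_connection((host, port), timeout=2.0) as conn:
        conn.sendall(payload)

if __name__ == "__main__":
    relay_gesture("point")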

As of this writing, the prototype of this next-level robot has not made its public debut, but there are reports on its development. According to Digital World Tokyo, the work-in-progress robot can understand the gesture of pointing a finger at an object, and it may also understand the traditional Japanese bow as a respectful greeting.

In addition, the new robot can repeat the same gestures in the appropriate circumstances; specifically, it can point out a direction and then move in that direction. This indicates the robot has formed its own learning process without being programmed to do so or given a formal teaching command.
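The reports do not describe the learning method itself, so the sketch below is only a hypothetical illustration, in Python, of one simple way observed gesture-and-action pairs could be accumulated and later reproduced. The class name, gesture labels and actions are invented for the example.

# Illustrative sketch only: the learning algorithm is not described in the
# reports. This just shows an observe-then-imitate loop with invented names.

from collections import defaultdict
from typing import Dict, List, Optional

class GestureImitator:
    """Accumulates gesture demonstrations and replays the most common response."""

    def __init__(self) -> None:
        # Gesture label -> list of actions observed alongside that gesture.
        self._demonstrations: Dict[str, List[str]] = defaultdict(list)

    def observe(self, gesture: str, action: str) -> None:
        """Record that a human paired this gesture with this action."""
        self._demonstrations[gesture].append(action)

    def respond(self, gesture: str) -> Optional[str]:
        """Return the action most often demonstrated for this gesture, if any."""
        actions = self._demonstrations.get(gesture)
        if not actions:
            return None
        return max(set(actions), key=actions.count)

if __name__ == "__main__":
    robot = GestureImitator()
    robot.observe("point_left", "move_left")
    robot.observe("point_left", "move_left")
    robot.observe("bow", "bow_in_return")
    print(robot.respond("point_left"))  # prints: move_left
    print(robot.respond("bow"))         # prints: bow_in_return

Here the "learning" is nothing more than counting demonstrations; the actual system presumably does far more, but the sketch captures the observe-and-imitate behavior the article describes.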
