Inventor Demonstrates Humanoid Robot's Latest AI Abilities (w/ Video)
August 25th, 2009 in Technology / Computer Sciences
Aiko has the ability to identify objects, learn what new objects are, understand more than 13,000 sentences, and more. Image credit: Le Trung.
(PhysOrg.com) -- In August 2007, Le Trung invented Aiko, a Yumecom, or "Dream Computer Robot." Although it took only a month and a half to build Aiko's exterior, the artificial intelligence software has been a work in progress ever since. Recently, Le Trung has demonstrated his most recent improvements to the software, called BRAINS (Bio Robot Artificial Intelligence Neural System).
In the video below, Le Trung demonstrates Aiko's internal operating system, which gives the robot many abilities, including the ability to speak two languages (English and Japanese), solve high school math problems, communicate the weather forecast, understand more than 13,000 sentences, sing songs, identify objects, focus on objects or people of importance, read newspapers and other materials, and mimic human physical touch.
As Le Trung explains, in some ways the BRAINS software is even more powerful than a human brain because it can link to a virtually unlimited number of data sources. Like a human brain, the software is designed to interact with the surrounding environment, process what it perceives, and record the information in its internal memory. Once the internal memory reaches full capacity, the information is transferred to a server database, where it can be shared with current and future robots.
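The memory scheme described above — fill a local store, flush it to a shared server when full — can be illustrated with a minimal sketch. Everything here (class names, the list standing in for the server database, the capacity of three) is hypothetical; the article does not describe BRAINS internals.

```python
class RobotMemory:
    """Illustrative sketch of the described memory-offloading scheme:
    observations fill a fixed-size internal memory; when it is full,
    the contents are flushed to a shared server store that other
    robots could read from. Names and sizes are invented."""

    def __init__(self, capacity, server_store):
        self.capacity = capacity
        self.local = []                   # internal memory
        self.server_store = server_store  # shared database (here: a list)

    def record(self, observation):
        self.local.append(observation)
        if len(self.local) >= self.capacity:
            self.flush()

    def flush(self):
        # Transfer the internal memory to the shared server database.
        self.server_store.extend(self.local)
        self.local.clear()


shared_db = []  # stands in for the server database
robot = RobotMemory(capacity=3, server_store=shared_db)
for obs in ["cup", "face:happy", "weather:rain", "newspaper"]:
    robot.record(obs)

print(shared_db)    # the first three observations were flushed to the server
print(robot.local)  # the fourth is still in internal memory
```

With the server store shared between instances, a second robot constructed with the same `server_store` would see everything the first one has flushed, which is the "shared with current and future robots" idea in miniature.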
With the BRAINS software, Aiko (whose name means "beloved one") has the potential for many applications. For example, in the home, Aiko could help elderly people by reminding them when to take their medicine and helping them read the newspaper. It could also help kids with their math homework. In work and public environments, the robot could be used at information desks, where it could give directions and inform people when and where events take place. Le Trung also suggests that, with Aiko's ability to detect 250 faces per second, it could be useful in airports to quickly scan and filter faces, as well as answer questions regarding flight times and gate locations. In addition, Aiko's sensitivity sensors and humanlike appearance offer the potential for its use as a companion robot.
"The most recent improvement with Aiko is the BRAINS software," Le Trung said. "I have just finished re-architecting the BRAINS software to have triple threads, which will make the software run a bit smoother and process about 15% faster for 3D recognition. As a result, Aiko can distinguish the difference between a $20 Canadian bill and $20 American bill. Aiko also has new improved facial expressions with 21 recognition points. Aiko will know when you are angry, happy, etc. Finally, the BRAINS can now process newspaper reading much faster and more accurate."
Le Trung, whose background is in microbiology and chemistry, was originally inspired to build Aiko after watching "Chobits," a Japanese manga that explores the relationships between humans and personal computers. While he hopes to continue improving Aiko's software, he currently faces a hardware limitation: the robot's CPU is running at 99% capacity. Le Trung hopes to raise funds to upgrade it.
In the future, Le Trung hopes to enable Aiko to achieve further skills, such as making tea, coffee, and a breakfast of eggs and bacon; cleaning a human's ears with a Q-tip; giving a neck massage; writing; and cleaning windows, shelves, and bathrooms. He also hopes that, one day, he will be able to mass produce sister copies of Aiko for an estimated cost of about $17,000 to $20,000.
"Future improvements include making the voice with more emotions and feelings when speaking, improving the silicone material on her face so that she can do facial expressions like humans, and redesigning the body and arm system to move more naturally and carry heavier things," Le Trung said.
© 2009 PhysOrg.com
"Inventor Demonstrates Humanoid Robot's Latest AI Abilities (w/ Video)." August 25th, 2009. http://phys.org/news170419268.html