NRL artificial intelligence team wins 2 video awards

September 11, 2009
Robot "Octavia," featured in "Robotic Secrets Revealed," demonstrates cognitive robotics being developed at NRL. Credit: NRL Navy Center for Applied Research in Artificial Intelligence.

Researchers at NRL's Navy Center for Applied Research in Artificial Intelligence (NCARAI), within the laboratory's Information Technology Division (ITD), received two top awards at the 21st International Joint Conference on Artificial Intelligence (IJCAI) held in California. Selected from a field of 39 competing videos, the NRL films received top honors in the categories of "Best Overall" and "Most Informative."

Since 2006, the research community has held this prestigious competition for videos that document exciting advances in artificial intelligence research, education, and application and that are accessible to a broad online audience.

"We are very excited to have won not just one, but two awards in our first year entering this competition," said Dr. Greg Trafton, section head, NRL Intelligent Systems Section. "Both videos are extremely entertaining and display the top-notch research currently occurring in artificial intelligence and robotics at NRL."

In the category of "Best Overall," the award went to "Casey's Quest: Transfer Learning for Adversarial Environments" by Kalyan Gupta, Matthew Molineaux and Philip Moore. The video describes recent research that ITD has conducted with members of Knexus Research Corporation and the University of Central Florida on the topic of transfer learning—the ability to leverage experience gained from one task to improve performance on a different task. In this work, the software first learns the intent of an adversary in a multi-agent simulation game. It then uses this experience to help control friendly agents, and was shown to significantly increase scores on this task compared to a non-transfer agent that was not provided with this experience. This has practical application to the Navy by providing more realistic training scenarios for a variety of mission tasks and creating a more intelligent adversary in training simulators involving semi-automated forces.
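The article does not detail the researchers' algorithms, but the transfer-learning idea it describes can be sketched in miniature (all names, actions, and data here are hypothetical): a model of adversary intent is learned in one task, then reused to guide a friendly agent in a second task, while a non-transfer baseline must act without that experience.

```python
import random

def observe_adversary(moves):
    """Task 1: learn a simple model of adversary intent as
    the observed frequency of each of its actions."""
    counts = {}
    for m in moves:
        counts[m] = counts.get(m, 0) + 1
    total = len(moves)
    return {m: c / total for m, c in counts.items()}

def best_response(model, counters):
    """Task 2: transfer the learned model by countering the
    adversary's most likely action."""
    likely = max(model, key=model.get)
    return counters[likely]

# Watch the adversary in a simulation and learn its tendencies.
adversary_moves = ["flank", "flank", "rush", "flank", "hold"]
model = observe_adversary(adversary_moves)

# Use that experience to control a friendly agent in a new task.
counters = {"flank": "fortify", "rush": "ambush", "hold": "advance"}
transfer_action = best_response(model, counters)

# A non-transfer agent has no model of the adversary and must guess.
baseline_action = random.choice(list(counters.values()))

print(transfer_action)  # counters the adversary's most frequent move
```

Here the transfer agent reliably fortifies against the adversary's favored flanking move, while the baseline picks an action at random; the real system learned far richer models of intent, but the asymmetry in experience is the same.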

In the category of "Most Informative," the award went to "Robotic Secrets Revealed, Episode 001" by Anthony Harrison, Ben Fransen, Magdalena Bugajska and Greg Trafton. This video highlighted recent gesture recognition work and NRL's novel cognitive architecture, ACT-R/E. While set in a popular game of skill, the video illustrates several Navy-relevant issues, including: a computational cognitive architecture that allows autonomous function and integrates perceptual information with higher-level cognitive reasoning; gesture recognition for shoulder-to-shoulder human-robot interaction; and anticipation and learning on a robotic system. Such abilities will be critical for future naval autonomous systems for persistent surveillance, tactical mobile robots and other autonomous platforms.
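The article does not describe ACT-R/E's internals, but the "anticipation and learning" theme can be illustrated with a toy sketch (the class, game, and moves below are hypothetical, not the NRL system): an agent that tracks an opponent's move-to-move transitions and predicts what comes next.

```python
from collections import Counter

class AnticipatingAgent:
    """Toy illustration of anticipation: learn an opponent's
    move-transition statistics and predict the next move."""

    def __init__(self):
        self.transitions = Counter()  # (previous_move, next_move) -> count
        self.last = None

    def observe(self, move):
        """Record the transition from the opponent's last move to this one."""
        if self.last is not None:
            self.transitions[(self.last, move)] += 1
        self.last = move

    def predict(self):
        """Return the move most often seen to follow the opponent's last move."""
        candidates = {nxt: n for (prev, nxt), n in self.transitions.items()
                      if prev == self.last}
        if not candidates:
            return None
        return max(candidates, key=candidates.get)

agent = AnticipatingAgent()
for move in ["rock", "paper", "rock", "paper", "rock"]:
    agent.observe(move)

print(agent.predict())  # this opponent has always followed "rock" with "paper"
```

A robot that can make even this kind of prediction can act before its partner or opponent finishes moving, which is the essence of the anticipatory behavior the video demonstrates.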

NCARAI is engaged in research and development efforts designed to address the application of artificial intelligence technology and techniques to critical Navy and national problems. Research is directed toward understanding the design and operation of systems capable of improving performance based on experience; efficient and effective interaction with other systems and with humans; sensor-based control of autonomous activity; and the integration of varieties of reasoning as necessary to support complex decision-making.

Both award-winning videos may be found online via the links below.

Casey's Quest:

Robotic Secrets Revealed:

Source: Naval Research Laboratory



Comment, Sep 11, 2009:

Let's stop pretending not to be wasting money. The current model for A.I., if not currently useless in applications mimicking human behavior, has a very limited amount of time left before it is obvious that the way forward is a fully new paradigm embracing neural hardware/software packages. That paradigm is being developed in many places; one notable effort is the Brains in Silicon project at Stanford, alongside DARPA's SyNAPSE program.

For now, A.I. is best staying away from the goal of mimicking human mental behavior—a waste of money. Stick with robotics, data mining, pattern recognition, and the like; high-level cognition is just totally out of reach for decades.
Comment, Sep 13, 2009:

"Let's stop pretending not to be wasting money. The current model for A.I., if not currently useless in applications mimicking human behavior..."

That is a bit of a mischaracterization. It does not matter whether you use static dedicated gate structures to "mimic" human intelligence, when virtually all of our thoughts are mimicry filtered through human perception. A programmable CPU can do just as good a job and still has a lot of room for further improvement. Our neurons only have a bit rate of about 55 bps, and the 200 trillion of them we have are not at all needed for cognitive emulation in robotics. Are you saying an android must have subroutines to tell it when it has a belly ache, for example?
