Locate and guide function for the human symbiotic robot 'EMIEW2'

Feb 28, 2012

Hitachi announced the addition of a new feature to the human symbiotic robot EMIEW2: when asked the location of an object, the robot can now identify where the object is, using network cameras installed within a facility as its "eyes", and then smoothly guide the enquirer to it.

The human symbiotic robot EMIEW2, developed by Hitachi in 2007, is a compact, lightweight robot capable of autonomous movement on a two-wheel mechanism at a speed of 6 km/h, about the same pace as a human walking briskly, and was designed with a view to providing guidance services in offices and public facilities. Technology development continued to enhance its functionality: features announced in 2010 enabled it to accurately recognize and respond to voices even amidst noise, and to travel over cables and uneven floor surfaces for smooth movement.

In order to provide even smoother guidance services in offices and public facilities, a "locate and guide" function was newly developed. With this function, when EMIEW2 is asked to locate an object, it automatically "recognizes" the object using a database created from information on the Web, identifies the object's location from images captured by multiple network cameras installed in the facility, and then quickly and smoothly guides the person to that location without dropping speed at corners or in narrow aisles. Details of the technologies developed are described below.

Web-based "object recognition" technology

"" technology was developed whereby when EMIEW2 is presented with an object and asked its name, similar images are retrieved from a Web-based image database using high-speed similar image retrieval technology and the text data attached to the retrieved images is statistically analyzed to infer the name of the object, which it then returns as a reply.

1. Web-based image database: A database built specifically for object recognition. It stores images collected from the Web as image features, high-dimensional numerical data expressing image characteristics such as color distribution and shape, and links each image feature to the text data originally attached to the image on the Web.

2. High-speed similar image retrieval technology: Technology which retrieves visually similar images based on the image features, which are arranged in the database for high-speed search.
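As a rough illustration of how these pieces could fit together, the Python sketch below pairs feature vectors with their attached Web text and infers an object's name by a majority vote over the most similar entries. The feature dimensionality, the toy labels, and the brute-force nearest-neighbor search are all illustrative assumptions; Hitachi's actual feature design and high-speed retrieval index are not described in the announcement.

```python
import numpy as np
from collections import Counter

# Toy stand-in for the Web-based image database: each row is a
# high-dimensional feature vector (color distribution, shape, etc.),
# and each entry in `texts` is the text that accompanied that image
# on the Web. Both are randomly generated placeholders.
rng = np.random.default_rng(0)
features = rng.random((1000, 128))                  # 1000 images, 128-D features
texts = [rng.choice(["stapler", "mug", "keyboard"]) for _ in range(1000)]

def recognize(query_feature, k=10):
    """Infer an object's name: retrieve the k visually most similar
    images, then statistically analyze their attached text."""
    # High-speed retrieval is approximated here by a brute-force
    # nearest-neighbor search; the real system would use an index
    # arranged for high-speed similar-image search.
    distances = np.linalg.norm(features - query_feature, axis=1)
    nearest = np.argsort(distances)[:k]
    # Majority vote over the text attached to the retrieved images.
    votes = Counter(texts[i] for i in nearest)
    name, _ = votes.most_common(1)[0]
    return name

print(recognize(features[0]))  # name inferred for the query image
```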

"Object search" technology using network cameras to locate objects

"Object search" technology was developed whereby when EMIEW2 is given a name of an object and asked to locate it, the object is matched against an Object-found database compiled from the images collected by multiple network cameras set up in the facility, and answers after identifying its location.

3. Object-found database: A database which records the position and time at which each image was captured by a camera, together with the name of the object inferred by the Web-based object recognition technology described above.
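A minimal sketch of what such a lookup might look like is shown below. The record layout, the coordinate scheme, and the "most recent sighting wins" rule are assumptions made for illustration, not Hitachi's actual design.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical record layout for the Object-found database: the position
# where the image was captured, the capture time, and the object name
# inferred by the Web-based object recognition step.
@dataclass
class Sighting:
    name: str
    position: tuple          # assumed (x, y) coordinates in the facility
    captured_at: datetime

sightings = [
    Sighting("umbrella", (4.2, 7.5), datetime(2012, 2, 28, 9, 15)),
    Sighting("umbrella", (1.0, 3.3), datetime(2012, 2, 28, 10, 40)),
]

def locate(name):
    """Return the most recently observed position of the named object."""
    matches = [s for s in sightings if s.name == name]
    if not matches:
        return None
    return max(matches, key=lambda s: s.captured_at).position

print(locate("umbrella"))  # -> (1.0, 3.3), the latest sighting
```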

"Model-predictive posture-control technology" to achieve smooth cornering while guiding persons

With its previous autonomous movement technology, EMIEW2 experienced a strong change in centrifugal force when maneuvering a corner, making it necessary to drop speed and change direction before continuing. In order to cope smoothly with this change in centrifugal force and take corners without dropping speed, "Model-predictive posture-control technology" was developed to calculate in real time the optimal posture to counter the centrifugal force. By applying this technology in combination with EMIEW2's "Active suspension" feature, smooth movement without loss of speed was achieved even on sudden or continuous curves along the route.
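To make the idea concrete, the sketch below computes the inward lean angle that balances centrifugal force against gravity (tan θ = v²κ/g) over a short look-ahead of the planned route's curvature, which captures the essence of predicting posture before a corner is reached. The speed, corner radius, and single-angle posture model are illustrative assumptions; the actual controller and suspension commands are not public.

```python
import math

G = 9.81          # gravitational acceleration, m/s^2
SPEED = 6 / 3.6   # EMIEW2's 6 km/h top speed, converted to m/s

def lean_angle(curvature, speed=SPEED):
    """Inward lean angle (rad) that balances centrifugal force against
    gravity when cornering: tan(theta) = v^2 * kappa / g."""
    return math.atan(speed ** 2 * curvature / G)

def predicted_postures(curvature_horizon):
    """Model-predictive sketch: given the curvature of the planned route
    over a short look-ahead horizon, precompute the lean angle to command
    at each future point, so the active suspension can start leaning into
    a corner before the centrifugal force actually builds up."""
    return [lean_angle(kappa) for kappa in curvature_horizon]

# Approaching a corner of 1.5 m radius (curvature = 1/1.5 per metre)
# at full speed; curvature values along the look-ahead horizon:
horizon = [0.0, 0.2, 0.45, 1 / 1.5, 1 / 1.5]
print([round(math.degrees(a), 1) for a in predicted_postures(horizon)])
# -> gradually increasing lean, peaking at roughly 10.7 degrees
```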

EMIEW2 builds on technology developed for EMIEW in 2005 as part of the New Energy and Industrial Technology Development Organization (NEDO) commissioned project on the practical application of next-generation robots (prototype development support program).

Details of the "Web-based object recognition" technology will be presented at the 181st Computer Vision and Image Media workshop of the Information Processing Society of Japan, to be held from 15 to 16 March 2012 at the Tokyo Institute of Technology, Tokyo, Japan.
