Tests check out rescue robots' life-saving vision

Jun 12, 2008

To save lives, search and rescue robots crawling through the rubble of a collapsed building or surveying a chemical spill area must be able to beam back clear, easily interpretable images of what they "see" to operators and emergency planners working away from the immediate disaster site.

A new ASTM International standard, developed under a National Institute of Standards and Technology (NIST)-coordinated program with first responders and manufacturers, offers a systematic way to evaluate the robot's visual capability that humans need to drive the device, search for victims and assess general hazard conditions.

Emergency personnel will be able to use the test data to select the best systems for their specific needs. Industry adoption of the standard is expected to accelerate innovation, development and deployment of the life-saving robots.

In science fiction, images relayed from robots are readily interpretable by remote operators. Reality can be different. Real-time color video images from urban search-and-rescue robots reflect the type of sensor or camera lens used. A zoom lens, for instance, can be like looking through a soda straw, yet it can be useful for zeroing in on important objects.

Similarly, images from a lens offering a wide field of view, such as 120 to 150 degrees, provide little depth perception and are of little use for navigating in tight quarters, but in the case of aerial robots and ground vehicles they can provide useful survey data. Both far-vision and near-vision acuity can be important in such instances for surveying HAZMAT disaster sites, with far-vision cameras providing the overall picture and near-vision cameras playing a critical role in reading chemical labels. (Near-vision acuity is also critical for small robots that must operate in confined spaces.) Finally, the amount of available light can affect the images seen on the monitor.

The standard's test methods measure the camera's field of view; the system's visual acuity at far distances, with both ambient lighting and lighting onboard the robot; visual acuity at near distances, again in both light and dark environments; and, if a zoom capability is provided, visual acuity with the zoom lens in both light and dark environments. Results are useful for writing procurement specifications and for acceptance testing of robots for urban search and rescue applications.

Further information on NIST's urban search and rescue robot performance standards project can be found at www.isd.mel.nist.gov/US&R_Robot_Standards.

Source: National Institute of Standards and Technology
