Engineers Develop Undetectable Means Of Measuring Speed, Motion

March 30, 2005

Research aimed at teaching robots to "see" may soon make it possible to bag speeding motorists, track enemy planes, and automatically safeguard the nation’s borders and resources without any chance of detection.

A University of Florida engineering researcher is developing a method to determine the speed and other characteristics of a moving object using computer algorithms, or instructions, that rely on data from standard visual cameras rather than radar or lasers. The technique has the potential to render current detection systems in so-called "fuzz busters" and some military technologies useless. That’s because, instead of painting a target with radar waves or laser beams, a camera merely needs to capture an image or series of images of the target.

“If it can view the object moving, that’s all it needs. The computer figures out everything else,” said Warren Dixon, a UF assistant professor of mechanical and aerospace engineering. “We’re trying to use both regular and infrared cameras, so night or adverse weather conditions don’t present a problem.”

Dixon’s most recent co-authored article on the research appears in the March issue of the journal Automatica. Related articles, also co-authored by Dixon, are scheduled to appear shortly in the journal Transactions on Robotics and Automation.

Achieving computerized speed and motion detection requires overcoming several challenges. One is figuring out how to get a computer to understand the surrounding environment by interpreting images recorded by a video or still camera.

“The information from a camera is just a flat-screen, two-dimensional image,” Dixon said. “The challenge is figuring out the mathematics of how do you take two images and understand how things are moving in our three-dimensional world.”

People and animals perceive depth because their brains fuse the slightly different snapshots from each eye. Two cameras can achieve the same stereo effect, but a computer can make sense of the pair only if it knows the exact position of each camera; that knowledge lets it triangulate the target and work out its position relative to the cameras. Part of Dixon’s achievement is developing the underlying mathematics and software to circumvent this requirement.
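
To make that conventional requirement concrete, here is a minimal sketch of the textbook stereo approach that Dixon’s method aims to sidestep: if each camera’s projection matrix (its calibration and exact pose) is already known, a feature seen in both images can be triangulated by linear least squares. The function and variable names below are illustrative and are not taken from the UF work.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Classic linear (DLT) triangulation of one 3-D point.

    P1, P2 -- 3x4 projection matrices (intrinsics times [R | t]); i.e.
              each camera's exact pose must already be known.
    x1, x2 -- (u, v) pixel coordinates of the same feature in each image.
    Returns the feature's estimated position in world coordinates.
    """
    u1, v1 = x1
    u2, v2 = x2
    # Each observation contributes two linear constraints on the
    # homogeneous world point X (u * P[2]@X - P[0]@X = 0, and so on).
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Least-squares solution: the right singular vector with the smallest
    # singular value, then de-homogenize.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

The projection matrices P1 and P2 encode exactly the "specific location information" that, as Dixon explains next, the new mathematics does without.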

“With my work, you don’t need to know that specific location information,” he said. “You could have one camera taking an image from an airplane and another mounted on a car taking a picture of the same image -- and not know how the airplane and car are related to each other -- and through this new mathematics you can understand how they’re related to the target.”
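
The article does not give Dixon’s mathematics. As a rough illustration of the general idea of recovering two views’ relative geometry from the images alone, standard computer-vision libraries can estimate an essential matrix from matched feature points and decompose it into a relative rotation and translation direction, with no prior knowledge of where either camera sat. The sketch below uses OpenCV with hypothetical inputs; it stands in for, and is not, the published method.

```python
import numpy as np
import cv2

def relative_pose(pts_a, pts_b, K):
    """Estimate the rotation and translation direction between two views
    from matched image points alone -- no prior knowledge of where either
    camera was placed. pts_a, pts_b are Nx2 pixel coordinates of the same
    scene features seen in each view; K is the 3x3 intrinsic matrix.
    All inputs here are hypothetical.
    """
    pts_a = np.asarray(pts_a, dtype=np.float64)
    pts_b = np.asarray(pts_b, dtype=np.float64)
    # The essential matrix encodes the two views' relative geometry and is
    # fitted robustly to the point matches.
    E, inliers = cv2.findEssentialMat(pts_a, pts_b, K,
                                      method=cv2.RANSAC, threshold=1.0)
    # Decompose it into a rotation R and a unit-length translation t.
    _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
    return R, t
```

Note that two views fix the translation only up to an unknown scale, so turning image measurements into real-world distances and speeds requires extra information such as a known object size or camera motion; the article does not say how Dixon’s method handles this.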

The technology has law enforcement and military applications.

Police in moving or parked squad cars could use the computer-camera systems much as they do radar and laser guns to track and ticket suspected speeders. The target would have to be within the line of sight, with the range varying according to the power of the lenses in the camera. Dixon said the UF engineers have not built such a system, but “any camera with the right software could be used,” and a prototype could be built within a year.
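
The article also does not describe how a speed reading would actually be produced. One purely illustrative assumption: if the software yields a time-stamped track of the target’s estimated 3-D position, the reported speed is simply distance travelled over elapsed time.

```python
import numpy as np

def average_speed(positions, timestamps):
    """Average speed of a tracked target, in the units of the inputs
    (e.g. metres and seconds). An illustrative placeholder, not the
    UF software's actual computation.
    """
    positions = np.asarray(positions, dtype=float)    # shape (N, 3)
    timestamps = np.asarray(timestamps, dtype=float)  # shape (N,)
    step_lengths = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return step_lengths.sum() / (timestamps[-1] - timestamps[0])
```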

Soldiers, meanwhile, could mount the cameras on airborne drones or truck convoys and set them to look for and automatically report potentially hostile objects moving toward the convoys – again, without any fear of giving away the convoys’ locations.

Robotic drones or remote camera-based monitoring posts outfitted with the technology also could be used for applications ranging from private security in warehouses and shopping centers to continuous remote monitoring of borders to protecting water supply reservoirs.

In addition to the robotic applications, the technique is being refined for a project led by Andy Kurdila, a UF professor of mechanical and aerospace engineering, to provide vision systems for tiny airborne surveillance drones called micro air vehicles.

The five-year project, which involves several UF faculty members and is jointly funded by a $5 million grant from Eglin Air Force Base in Florida and the Air Force Office of Scientific Research, aims to give the drones the ability to fly without the assistance of a remote human operator. Instead, they would base navigational decisions solely on what their onboard cameras see in the terrain as they fly, mimicking a human pilot.
