Engineers Develop Undetectable Means Of Measuring Speed, Motion

Mar 30, 2005

Research aimed at teaching robots to "see" may soon make it possible to bag speeding motorists, track enemy planes, and automatically safeguard the nation’s borders and resources without any chance of detection.
A University of Florida engineering researcher is developing a method to determine speed and other characteristics of a moving object using computer algorithms, or instructions, that rely on data from standard visual cameras rather than radar or lasers. The technique has the potential to render current detection systems in so-called "fuzz busters" and some military technologies useless. That’s because, instead of painting a target with radar waves or laser beams, a camera merely needs to capture an image or series of images from the target.

“If it can view the object moving, that’s all it needs. The computer figures out everything else,” said Warren Dixon, a UF assistant professor of mechanical and aerospace engineering. “We’re trying to use both regular and infrared cameras, so night or adverse weather conditions don’t present a problem.”

Dixon’s most recent co-authored article on the research appears in the March issue of the journal Automatica. Related articles, also co-authored by Dixon, are scheduled to appear shortly in the journal IEEE Transactions on Robotics and Automation.

Achieving computerized speed and motion detection requires overcoming several challenges. One is figuring out how to get a computer to understand the surrounding environment by interpreting images recorded by a video or still camera.

“The information from a camera is just a flat-screen, two-dimensional image,” Dixon said. “The challenge is figuring out the mathematics of how do you take two images and understand how things are moving in our three-dimensional world.”

People and animals perceive depth because their brains fuse the slightly different images from each eye. Two cameras can likewise achieve stereo vision, but a computer can make sense of the pair only if it knows the exact position of each camera; that knowledge lets it triangulate the target and compute its position relative to the cameras. Part of Dixon’s achievement is developing the underlying mathematics and software to circumvent this requirement.
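The classic triangulation step described above — the one Dixon's method is said to avoid — can be sketched in a few lines. This is a simplified two-dimensional illustration, not the UF group's algorithm: two cameras at known positions each measure a bearing angle to the target, and the target lies where the two sight rays intersect.

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Locate a target by intersecting sight rays from two cameras.
    p1, p2: (x, y) positions of the cameras (must be known in advance).
    theta1, theta2: bearing angles (radians) from each camera to the target.
    Returns the (x, y) position of the target."""
    # Each camera defines a ray p + t * (cos theta, sin theta).
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via Cramer's rule.
    denom = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
    rx, ry = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (rx * (-d2[1]) - ry * (-d2[0])) / denom
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Cameras at (0, 0) and (10, 0), both sighting a target at (5, 5):
x, y = triangulate((0.0, 0.0), math.atan2(5, 5), (10.0, 0.0), math.atan2(5, -5))
```

The key limitation the article points to is visible in the function signature: `p1` and `p2` must be known exactly, which is precisely the requirement Dixon's mathematics removes.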

“With my work, you don’t need to know that specific location information,” he said. “You could have one camera taking an image from an airplane and another mounted on a car taking a picture of the same image -- and not know how the airplane and car are related to each other -- and through this new mathematics you can understand how they’re related to the target.”

The technology has law enforcement and military applications.

Police in moving or parked squad cars could use the computer-camera systems much as they do radar and laser guns to track and ticket suspected speeders. The target would have to be within the line of sight, with the range varying according to the power of the lenses in the camera. Dixon said the UF engineers have not built such a system, but “any camera with the right software could be used,” and a prototype could be built within a year.
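The article does not describe the actual algorithm, but the basic relationship a camera-based speed gun would exploit can be illustrated with a deliberately simplified sketch: track the target's pixel position across frames, then convert pixel displacement to distance using a known scale at the target's range. (The metres-per-pixel scale is assumed given here; recovering such geometry from the images themselves is the hard part the UF research addresses.)

```python
def speed_from_frames(px_positions, fps, metres_per_pixel):
    """Estimate a target's average speed from successive video frames.
    Simplified illustration: assumes motion parallel to the image plane
    and a known metres-per-pixel scale at the target's range.
    px_positions: list of (x, y) pixel coordinates, one per frame.
    fps: camera frame rate in frames per second.
    Returns average speed in metres per second."""
    total_px = 0.0
    for (x0, y0), (x1, y1) in zip(px_positions, px_positions[1:]):
        total_px += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    elapsed = (len(px_positions) - 1) / fps
    return total_px * metres_per_pixel / elapsed

# Three frames at 10 fps, target moving 10 px/frame, 0.1 m per pixel:
speed = speed_from_frames([(0, 0), (10, 0), (20, 0)], 10.0, 0.1)  # 10 m/s
```

At 0.1 m per pixel, 10 pixels per frame at 10 frames per second is 10 m/s, or 36 km/h — the kind of figure a patrol-car system would compare against the posted limit.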

Soldiers, meanwhile, could mount the cameras on airborne drones or truck convoys and set them to look for and automatically report potentially hostile objects moving toward the convoys – again, without any fear of giving away the convoys’ locations.

Robotic drones or remote camera-based monitoring posts outfitted with the technology also could be used for applications ranging from private security in warehouses and shopping centers to continuous remote monitoring of borders to protecting water supply reservoirs.

In addition to the robotic applications, the technique is being refined for a project led by Andy Kurdila, a UF professor of mechanical and aerospace engineering, to provide vision systems for tiny airborne surveillance drones called micro air vehicles.

The five-year project, which is jointly funded by a $5 million grant from Eglin Air Force Base in Florida and by the Air Force Office of Scientific Research, involves several UF faculty members. Its goal is to give the drones the ability to fly without the assistance of a remote human operator, basing navigational decisions solely on what their onboard cameras see in the terrain as they fly — mimicking a human pilot.
