Improved sensor technology could someday keep tabs on terrorists by remote control

February 12, 2009

Scientists at Rochester Institute of Technology are designing a new kind of optical sensor to fly aboard unmanned aerial vehicles, or surveillance drones, tracking suspects who are on foot or traveling in vehicles identified as a threat.

"The Air Force has clearly recognized the change in the threat that we have," says John Kerekes, associate professor in RIT's Chester F. Carlson Center for Imaging Science. "I think we all understand that our military has a paradigm shift. We're no longer fighting tanks in the open desert; we're fighting terrorists in small groups, asymmetric threats."

Kerekes won a $1 million Discovery Challenge Thrust grant from the Air Force Office of Scientific Research to design efficient sensors using multiple imaging techniques to track an individual or a vehicle.

The sensor will collect only the data it needs. It will assess a situation and choose the best sensing mode for the task: black-and-white imaging, hyperspectral imaging or polarization. Developing two strands of information, one about the target and the other about the background environment, will be key to maintaining the track and to seeing through camouflage.
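
The article does not spell out how that choice would be made. As a minimal sketch only, with invented scene metrics, thresholds and names standing in for whatever the actual system measures, the adaptive mode selection might look something like this:

```python
# Hypothetical sketch of adaptive mode selection; the metric names and
# thresholds are illustrative assumptions, not details of the RIT project.
from dataclasses import dataclass
from enum import Enum, auto


class Mode(Enum):
    PANCHROMATIC = auto()   # black-and-white imaging: shape and motion
    HYPERSPECTRAL = auto()  # color signature across many wavelengths
    POLARIZATION = auto()   # glare rejection and surface-roughness cues


@dataclass
class SceneState:
    target_contrast: float   # how well the target's shape stands out
    spectral_confusers: int  # nearby objects with a similar apparent color
    glare_level: float       # 0.0 (none) to 1.0 (severe)


def choose_mode(scene: SceneState) -> Mode:
    """Pick the sensing mode expected to keep the track alive."""
    if scene.glare_level > 0.5:
        return Mode.POLARIZATION      # cut through glare, use material cues
    if scene.spectral_confusers > 0:
        return Mode.HYPERSPECTRAL     # separate look-alike colors
    if scene.target_contrast > 0.3:
        return Mode.PANCHROMATIC      # the cheapest mode still works
    return Mode.HYPERSPECTRAL         # otherwise fall back to richer data


# Example: heavy glare pushes the sensor toward polarization imaging.
print(choose_mode(SceneState(target_contrast=0.2, spectral_confusers=2, glare_level=0.7)))
```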

This is how it will work: The sensor will collect a black-and-white image of a target, say a car, and record the object's shape. A hyperspectral image will plot the object's color as it appears in multiple wavelengths, from visible light into the near- and shortwave-infrared parts of the spectrum, beyond what the eye can see. (This mode can tell the difference between two blue cars passing.) The third imaging mode, polarization, cuts through glare and gives information about surface roughness, providing details that distinguish between objects of similar color and shape. (This mode can lock onto the unique material properties of the blue car in question.)
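
For illustration only, the three measurements can be thought of as one combined target signature that each new detection is scored against; the field names, weights and similarity measure below are assumptions, not details of the RIT design:

```python
# Illustrative only: a combined target signature built from the three modes.
# The vectors, weights and cosine-similarity measure are assumptions.
import numpy as np


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


class TargetSignature:
    def __init__(self, shape_vec, spectrum_vec, polarization_vec):
        self.shape = np.asarray(shape_vec, dtype=float)        # from the black-and-white image
        self.spectrum = np.asarray(spectrum_vec, dtype=float)  # visible through shortwave-IR bands
        self.polar = np.asarray(polarization_vec, dtype=float) # surface-roughness cues

    def match(self, other: "TargetSignature") -> float:
        """Weighted agreement across modes; higher means more likely the same object."""
        return (0.3 * cosine(self.shape, other.shape)
                + 0.4 * cosine(self.spectrum, other.spectrum)
                + 0.3 * cosine(self.polar, other.polar))


# Two blue cars can share a shape yet differ in spectral and polarimetric detail.
car_a = TargetSignature([1, 0.8, 0.6], [0.2, 0.5, 0.9, 0.4], [0.1, 0.7])
car_b = TargetSignature([1, 0.8, 0.6], [0.3, 0.4, 0.7, 0.6], [0.5, 0.2])
print(f"match score: {car_a.match(car_b):.2f}")
```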

"These are all complementary pieces of information and the idea is that if the object you are tracking goes into an area where you lose one piece of information, the other information might help," Kerekes says.

As the lead scientist on the project, Kerekes assembled a comprehensive team with RIT collaborators and other scientists to envision the system from end to end: all the way from the design of the optical and microelectronic devices to the synchronizing algorithms that tie everything together.

Zoran Ninkov, professor of imaging science at RIT, is working on the overall optical system. Ninkov is modifying one of his own astronomical optical sensors for this downward-looking purpose. Alan Raisanen, associate director of RIT's Semiconductor and Microsystems Fabrication Laboratory, is designing tunable microelectronics devices to collect specific wavelengths. Ohio-based Numerica Inc., a large subcontractor on the project, is creating the advanced algorithms necessary for tracking a target and picking the right imaging mode based on the scenario.

According to Kerekes, motivation for this project came from Paul McManamon, former chief scientist at the Air Force Research Laboratory's Sensors Directorate in Dayton, Ohio, partly as a means of eliminating data overload.

"The idea is to lead to more efficient sensing both from the point of view of collecting the data necessary and being able to adapt to these different modalities based on the conditions in the scene or the task at hand," says Kerekes.

"The catch phrase is 'performance-driven sensing,'" he continues. "The idea behind that is you let the task at hand and the desire to optimize the performance drive what information is collected."
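
One way to read that, as a hedged sketch rather than a description of the actual system: estimate how much each imaging mode is expected to help the task, weigh that against its data cost, and collect only what fits the available budget. All of the numbers and function names below are invented for illustration:

```python
# Hedged sketch of "performance-driven sensing": collect only the data whose
# expected contribution to the task justifies its cost. Values are invented.

def select_collection(expected_gain: dict[str, float],
                      data_cost: dict[str, float],
                      budget: float) -> list[str]:
    """Greedily pick modes by expected tracking gain per unit of data collected."""
    ranked = sorted(expected_gain,
                    key=lambda m: expected_gain[m] / data_cost[m],
                    reverse=True)
    chosen, spent = [], 0.0
    for mode in ranked:
        if spent + data_cost[mode] <= budget:
            chosen.append(mode)
            spent += data_cost[mode]
    return chosen


# Example: under a tight data budget, a cheap panchromatic frame plus a
# polarization cut may beat collecting a full hyperspectral cube.
gain = {"panchromatic": 0.4, "hyperspectral": 0.7, "polarization": 0.5}
cost = {"panchromatic": 1.0, "hyperspectral": 8.0, "polarization": 2.0}
print(select_collection(gain, cost, budget=4.0))
```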

Kerekes and his team are testing their preliminary models using generic scenarios played out in a simulated world akin to Second Life. The computer program, known as the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model (dirsig.cis.rit.edu/), uses computer graphics code to predict the data a sensor would record, providing a platform for testing scenarios built around imaging problems such as Kerekes' new sensor technology.
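
DIRSIG's actual interface is not reproduced here; the sketch below only illustrates the kind of evaluation such a simulation enables, with a synthetic scenario generator standing in for the simulated sensor data and ground truth:

```python
# Hypothetical evaluation harness for simulated tracking scenarios. It does not
# use DIRSIG's real interface; simulated_scenario() is an invented stand-in for
# whatever truth tracks and detections the simulation would export.
import random


def simulated_scenario(n_frames: int = 50):
    """Stand-in data feed: yields (frame_index, true_position, detected_position)."""
    x = 0.0
    for i in range(n_frames):
        x += 1.0                               # target drives in a straight line
        detected = x + random.gauss(0.0, 0.5)  # sensor measurement with noise
        yield i, x, detected


def track_continuity(scenario, max_error: float = 1.5) -> float:
    """Fraction of frames on which the tracker stayed within max_error of truth."""
    hits = total = 0
    for _, truth, detected in scenario:
        total += 1
        hits += abs(detected - truth) <= max_error
    return hits / total


print(f"track maintained on {track_continuity(simulated_scenario()):.0%} of frames")
```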

Source: Rochester Institute of Technology
