Improved sensor technology could someday keep tabs on terrorists by remote control

Feb 12, 2009

Scientists at Rochester Institute of Technology are designing a new kind of optical sensor to fly in unmanned air vehicles, or surveillance drones, tracking suspects traveling on foot or in vehicles identified as a threat.

"The Air Force has clearly recognized the change in the threat that we have," says John Kerekes, associate professor in RIT's Chester F. Carlson Center for Imaging Science. "I think we all understand that our military has a paradigm shift. We're no longer fighting tanks in the open desert; we're fighting terrorists in small groups, asymmetric threats."

Kerekes won a $1 million Discovery Challenge Thrust grant from the Air Force Office of Scientific Research to design efficient sensors using multiple imaging techniques to track an individual or a vehicle.

The sensor will collect only the data it needs. It will assess a situation and choose the best sensing mode (black and white imaging, hyperspectral or polarization) for the purpose. Developing two strands of information—one about the target, the other about the background environment—will be key to maintaining a connection and for piercing through camouflage effects.

This is how it will work: The sensor will collect a black and white image of a target, say a car, and will record the shape of the object. A hyperspectral image will plot the object's color as it appears in multiple wavelengths, from visible light into the near- and shortwave-infrared parts of the spectrum, beyond what the eye can see. (This mode can tell the difference between two blue cars passing.) The third imaging mode, polarization, cuts through glare and gives information about surface roughness. It provides details that distinguish between objects of similar color and shape. (This mode can lock onto the unique material properties of the blue car in question.)

"These are all complementary pieces of information and the idea is that if the object you are tracking goes into an area where you lose one piece of information, the other information might help," Kerekes says.
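The idea of fusing whichever cues survive the current scene can be sketched in a few lines. This is an illustrative toy, not the RIT system: the feature names, the similarity measure, and the equal weighting are all assumptions made for the example.

```python
# Illustrative sketch (not the RIT algorithm): score how well a new
# detection matches a tracked target using whichever of the three
# modalities both observations still carry. Feature names are invented.

def match_score(track, detection):
    """Average similarity over the cues both observations share.

    Each observation is a dict that may hold any subset of:
      'shape'    - shape descriptor from the black and white image
      'spectrum' - hyperspectral signature
      'polar'    - polarization-derived surface descriptor
    If one modality is lost (e.g. glare washes out the spectrum),
    the remaining cues still support the match.
    """
    def similarity(a, b):
        # Normalized inverse Euclidean distance, in (0, 1].
        dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
        return 1.0 / (1.0 + dist)

    shared = [k for k in ('shape', 'spectrum', 'polar')
              if k in track and k in detection]
    if not shared:
        return 0.0  # no common cue: the match cannot be confirmed
    return sum(similarity(track[k], detection[k]) for k in shared) / len(shared)


# Two blue cars with the same shape: the spectral cue separates them.
track = {'shape': [1.0, 2.0], 'spectrum': [0.45, 0.60, 0.30]}
same_car = {'shape': [1.0, 2.0], 'spectrum': [0.46, 0.59, 0.31]}
other_car = {'shape': [1.0, 2.0], 'spectrum': [0.30, 0.70, 0.50]}

assert match_score(track, same_car) > match_score(track, other_car)
```

A real tracker would weight the cues by their measured reliability rather than averaging them equally; the point here is only that a shared subset of modalities is enough to keep the connection.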

As the lead scientist on the project, Kerekes assembled a comprehensive team with RIT collaborators and other scientists to envision the system from end to end: all the way from the design of the optical and microelectronic devices to the synchronizing algorithms that tie everything together.

Zoran Ninkov, professor of imaging science at RIT, is working on the overall optical system. Ninkov is modifying one of his own astronomical optical sensors for this downward-looking purpose. Alan Raisanen, associate director of RIT's Semiconductor and Microsystems Fabrication Laboratory, is designing tunable microelectronics devices to collect specific wavelengths. Ohio-based Numerica Inc., a large subcontractor on the project, is creating the advanced algorithms necessary for tracking a target and picking the right imaging mode based on the scenario.

According to Kerekes, motivation for this project came from Paul McManamon, former chief scientist at the Air Force Research Laboratory's Sensors Directorate in Dayton, Ohio, partly as a means of eliminating data overload.

"The idea is to lead to more efficient sensing both from the point of view of collecting the data necessary and being able to adapt to these different modalities based on the conditions in the scene or the task at hand," says Kerekes.

"The catch phrase is 'performance-driven sensing,'" he continues. "The idea behind that is you let the task at hand and the desire to optimize the performance drive what information is collected."
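Performance-driven mode selection can be caricatured as picking the modality with the best expected payoff under current scene conditions. The condition flags and scores below are invented for illustration; a real system would use a model-based performance prediction rather than hand-set heuristics.

```python
# Hedged sketch of "performance-driven sensing": pick the imaging mode
# whose expected discrimination (target vs. background) is highest for
# the scene at hand. All flags and scores are hypothetical.

def choose_mode(scene):
    """Return the mode expected to perform best for this scene.

    scene is a dict of boolean conditions, e.g.
      {'low_light': False, 'glare': True, 'similar_colors': False}
    """
    scores = {'panchromatic': 1.0, 'hyperspectral': 1.0, 'polarization': 1.0}
    if scene.get('glare'):
        scores['polarization'] += 2.0   # polarization cuts through glare
        scores['hyperspectral'] -= 0.5
    if scene.get('similar_colors'):
        scores['polarization'] += 1.0   # surface properties separate look-alikes
        scores['hyperspectral'] += 0.5
    if scene.get('low_light'):
        scores['hyperspectral'] -= 1.0  # narrow spectral bands starve for photons
    return max(scores, key=scores.get)

print(choose_mode({'glare': True}))      # polarization
print(choose_mode({'low_light': True}))  # panchromatic
```

The design point the quote makes is that the decision runs the other way from conventional sensing: instead of collecting everything and filtering later, the task's performance goal decides what gets collected at all.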

Kerekes and his team are testing their preliminary models using generic scenarios played out in a simulated world akin to Second Life. The computer program, the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model (dirsig.cis.rit.edu/), uses computer graphics code to predict simulated sensor data and provides a platform for testing scenarios built around imaging problems, such as Kerekes' new sensor technology.

Source: Rochester Institute of Technology
