New theory could reduce number of sensors required for terahertz imaging systems

May 5, 2014 by Larry Hardesty
Credit: Jose-Luis Olivares/MIT

Terahertz imaging, which is already familiar from airport security checkpoints, has a number of other promising applications—from explosives detection to collision avoidance in cars. Like sonar or radar, terahertz imaging produces an image by comparing measurements across an array of sensors. Those arrays have to be very dense, since the distance between sensors is proportional to wavelength.

In the latest issue of IEEE Transactions on Antennas and Propagation, researchers in MIT's Research Laboratory of Electronics describe a new technique that could reduce the number of sensors required for terahertz or millimeter-wave imaging by a factor of 10, or even 100, making such systems more practical. The technique could also have implications for the design of new, high-resolution radar and sonar systems.

In a digital camera, the lens focuses the incoming light so that light reflected by a small patch of the visual scene strikes a correspondingly small patch of the sensor. In lower-frequency imaging systems, by contrast, an incoming wave—whether electromagnetic or, in the case of sonar, acoustic—strikes all of the sensors in the array. The system determines the origin and intensity of the wave by comparing its phase—the alignment of its troughs and crests—when it arrives at each of the sensors.

As long as the distance between sensors is no more than half the wavelength of the incoming wave, that calculation is fairly straightforward, a matter of inverting the sensors' measurements. But if the sensors are spaced farther than half a wavelength apart, the inversion will yield more than one possible solution. Those solutions will be spaced at regular angles around the sensor array, a phenomenon known as "spatial aliasing."
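The ambiguity can be seen numerically. The following sketch (an illustration of the general principle, not code from the paper) shows that when two sensors sit a full wavelength apart—twice the half-wavelength limit—plane waves arriving from two different angles produce identical phase measurements, so inverting the measurements cannot distinguish them:

```python
import numpy as np

wavelength = 1.0

def phases(angle_deg, spacing, wavelength, n_sensors=2):
    """Phase (radians, mod 2*pi) of an incoming plane wave at each
    sensor of a uniform line array with the given spacing."""
    theta = np.radians(angle_deg)
    positions = np.arange(n_sensors) * spacing
    return np.mod(2 * np.pi * positions * np.sin(theta) / wavelength, 2 * np.pi)

# Spacing of one full wavelength violates the lambda/2 rule:
# waves from +30 and -30 degrees yield the same phases (spatial aliasing).
print(np.allclose(phases(30.0, 1.0, wavelength),
                  phases(-30.0, 1.0, wavelength)))   # True -- aliased

# At half-wavelength spacing the two angles are distinguishable.
print(np.allclose(phases(30.0, 0.5, wavelength),
                  phases(-30.0, 0.5, wavelength)))   # False -- no ambiguity
```

Here the aliasing arises because the phase differences between sensors only matter modulo a full cycle: a full-wavelength spacing maps sin(θ) and sin(θ) ± 1 to the same measurements.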

Narrowing the field

In most applications of lower-frequency imaging, however, any given circumference around the detector is usually sparsely populated. That's the phenomenon that the new system exploits.

"Think about a range around you, like five feet," says Gregory Wornell, the Sumitomo Electric Industries Professor in Engineering in MIT's Department of Electrical Engineering and Computer Science and a co-author on the new paper. "There's actually not that much at five feet around you. Or at 10 feet. Different parts of the scene are occupied at those different ranges, but at any given range, it's pretty sparse. Roughly speaking, the theory goes like this: If, say, 10 percent of the scene at a given range is occupied with objects, then you need only 10 percent of the full array to still be able to achieve full resolution."

The trick is to determine which 10 percent of the array to keep. Keeping every tenth sensor won't work: it's the regularity of the distances between sensors that leads to aliasing. Arbitrarily varying the distances between sensors would solve that problem, but it would also make inverting the sensors' measurements—calculating the wave's source and intensity—prohibitively complicated.

Regular irregularity

So Wornell and his co-authors—James Krieger, a former student of Wornell's who is now at MIT's Lincoln Laboratory, and Yuval Kochman, a former postdoc who is now an assistant professor at the Hebrew University of Jerusalem—instead prescribe a detector along which the sensors are distributed in pairs. The regular spacing between pairs of sensors ensures that the scene reconstruction can be calculated efficiently, but the distance from each sensor to the next remains irregular.

The researchers also developed an algorithm that determines the optimal pattern for the sensors' distribution. In essence, the algorithm maximizes the number of different distances between arbitrary pairs of sensors.
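That criterion can be illustrated with a toy sketch (not the authors' actual algorithm): count the distinct pairwise separations each layout produces. A "keep every tenth sensor" layout reuses the same few spacings over and over, while an irregular layout of the same size—here a known optimal Golomb ruler of order 10, chosen for illustration—makes every pair's separation unique:

```python
from itertools import combinations

def distinct_distances(positions):
    """Number of distinct separations among all sensor pairs."""
    return len({abs(a - b) for a, b in combinations(positions, 2)})

# Regular decimation: 10 sensors, but every distance is a multiple of 10.
uniform = list(range(0, 100, 10))
print(distinct_distances(uniform))     # 9 distinct spacings from 45 pairs

# Irregular 10-sensor layout (the optimal order-10 Golomb ruler):
# all 45 pairwise separations are distinct.
irregular = [0, 1, 6, 10, 23, 26, 34, 41, 53, 55]
print(distinct_distances(irregular))   # 45 distinct spacings
```

More distinct spacings means fewer repeated baselines of the kind that cause aliasing, which is the intuition behind maximizing the number of different inter-sensor distances.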

With his new colleagues at Lincoln Lab, Krieger has performed experiments at radar frequencies using a one-dimensional array of sensors deployed in a parking lot, which verified the predictions of the theory. Moreover, Wornell's description of the sparsity assumptions of the theory—10 percent occupation at a given distance means one-tenth the sensors—applies to one-dimensional arrays. Many applications—such as submarines' sonar systems—instead use two-dimensional arrays, and in that case, the savings compound: one-tenth the sensors in each of two dimensions translates to one-hundredth the sensors in the complete array.

James Preisig, a researcher at the Woods Hole Oceanographic Institution and principal at JP Analytics, says that he's most interested in the new technique's ability to reduce the computational burden of high-resolution sonar imaging. "This technique helps significantly with the computational complexity of using signals from very large arrays," Preisig says. "I can imagine it being deployed in situations where you are using very, very large arrays to get good spatial resolution, but you're processing signals on a vehicle or something where you did not have significant computational power."

In those contexts, Preisig says, the new technique's sparsity assumptions make perfect sense. "In this context, the field of view is divided into sectors," Preisig says. "The field has to be sector-sparse—only a subset of sectors have objects in them. That is realistic."


More information: J. D. Krieger et al., "Multi-Coset Sparse Imaging Arrays," IEEE Transactions on Antennas and Propagation, Vol. 62, Issue 4, April 2014. DOI: 10.1109/TAP.2014.2299819

