SAVI camera ditches long lens for distant images

April 14, 2017, Rice University
Details from a fingerprint image taken from a distance of 1 meter by the SAVI prototype developed at Rice and Northwestern universities. At top is one of many speckle patterns captured from a laser reflecting off the original image. At bottom, a clear print is the result of combining dozens of images of the fingerprint taken from slightly different angles and processed by a "synthetic aperture" program. Credit: Jason Holloway/Rice University

A unique camera that can capture a detailed micron-resolution image from a distance uses a laser and techniques that borrow from holography, microscopy and "Matrix"-style bullet time.

A prototype built and tested by engineers at Rice and Northwestern universities reads a spot illuminated by a laser and captures the "speckle" pattern with a camera sensor. Raw data from dozens of camera positions is fed to a computer program that interprets it and constructs a high-resolution image.

The system, known as SAVI for "Synthetic Apertures for long-range, subdiffraction-limited Visible Imaging," doesn't need a long lens to take a picture of a faraway object. The prototype only works with coherent illumination sources such as lasers, but Ashok Veeraraghavan, a Rice assistant professor of electrical and computer engineering, said it's a step toward a SAVI camera array for use in visible light.

"Today, the technology can be applied only to coherent (laser) light," he said. "That means you cannot apply these techniques to take pictures outdoors and improve resolution for sunlit images - as yet. Our hope is that one day, maybe a decade from now, we will have that ability."

The technology is the subject of an open-access paper in Science Advances.

Labs led by Veeraraghavan at Rice and Oliver Cossairt at Northwestern's McCormick School of Engineering built and tested the device, which compares multiple speckled images. As in the technique used to achieve the "Matrix" special effect, the images are taken from slightly different angles, but with a single camera moved between shots instead of many cameras fired in sequence.

Rice University graduate student Yicheng Wu demonstrates the SAVI prototype, which is able to capture fine details of an object from a distance, effectively replacing a large telephoto lens. The prototype camera is on a motorized track in the foreground at left, while a laser at right creates a speckle pattern on the target, a fingerprint. Credit: Jeff Fitlow/Rice University

Veeraraghavan explained that the speckles serve as reference beams, essentially replacing one of the two beams used to create holograms. When a laser illuminates a rough surface, the viewer sees grain-like speckles in the dot. That's because some of the returning light scattered from points on the surface has farther to travel and throws the collective wave out of phase. The texture of a piece of paper - or even a fingerprint - is enough to cause the effect.
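The out-of-phase summation described above is easy to simulate: assign each surface point a random phase and look at the far-field intensity. The following is a minimal numpy toy model of speckle formation (an illustration, not the team's code):

```python
import numpy as np

def simulate_speckle(n=256, seed=0):
    """Toy speckle simulation: a rough surface imparts a random phase to
    coherent light; the far-field intensity is the squared magnitude of
    its Fourier transform, producing the grainy speckle pattern."""
    rng = np.random.default_rng(seed)
    # Random phase in [0, 2*pi) models roughness on the order of a
    # wavelength -- paper texture or a fingerprint ridge is enough.
    rough_surface = np.exp(1j * rng.uniform(0, 2 * np.pi, (n, n)))
    far_field = np.fft.fft2(rough_surface)
    return np.abs(far_field) ** 2

speckle = simulate_speckle()
# Fully developed speckle has high contrast: std/mean is close to 1.
contrast = speckle.std() / speckle.mean()
print(f"speckle contrast: {contrast:.2f}")
```

The near-unity contrast is the signature of fully developed speckle: the summed random phasors yield an exponentially distributed intensity, which is why the illuminated dot looks so grainy.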

The researchers use these phase irregularities to their advantage.

"The problem we're solving is that no matter what wavelength of light you use, the resolution of the image - the smallest feature you can resolve in a scene - depends upon this fundamental quantity called the diffraction limit, which scales linearly with the size of your aperture," Veeraraghavan said.

"With a traditional camera, the larger the physical size of the aperture, the better the resolution," he said. "If you want an aperture that's half a foot, you may need 30 glass surfaces to remove aberrations and create a focused spot. This makes your lens very big and bulky."

SAVI's "synthetic aperture" sidesteps the problem by replacing a long lens with a computer program the resolves the speckle data into an image. "You can capture interference patterns from a fair distance," Veeraraghavan said. "How far depends on how strong the laser is and how far away you can illuminate."

"By moving aberration estimation and correction out to computation, we can create a compact device that gives us the same surface area as the lens we want without the size, weight, volume and cost," said Cossairt, an assistant professor of electrical engineering and computer science at Northwestern.

A schematic shows the single-beam SAVI system developed at Rice and Northwestern universities. The system employs a single beam, multiple images and sophisticated software to capture detailed images from a distance. Credit: Jason Holloway/Rice University

Lead author Jason Holloway, a Rice alumnus who is now a postdoctoral researcher at Columbia University, suggested an array of inexpensive sensors and plastic lenses that cost a few dollars each may someday replace traditional telephoto lenses that cost more than $100,000. "We should be able to capture that exact same performance but at orders-of-magnitude lower cost," he said.

Such an array would eliminate the need for a moving camera and capture all the data at once, "or as close to that as possible," Cossairt said. "We want to push this to where we can do things dynamically. That's what is really unique: There's an avenue toward real-time, high-resolution capture using this synthetic aperture approach."

Cossairt started thinking about the idea when applying for his National Science Foundation (NSF) CAREER Award. "Later on, Ashok and I got interested in techniques through some colleagues of ours in California who were using them in microscopy."

Veeraraghavan said SAVI leans on work by the California Institute of Technology and the University of California, Berkeley, which developed the Fourier ptychography technique that allows microscopes to resolve images beyond the physical limitations of their optics.
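In schematic form, Fourier ptychography records many amplitude-only images through shifted sub-apertures and iteratively recovers the missing phase, stitching the sub-spectra into one wide passband. Below is a simplified, transmission-style toy simulation (not the SAVI reflective-geometry algorithm); the grid sizes, aperture layout, and alternating-projection recovery loop are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
n, r = 64, 12  # high-res grid size and sub-aperture (pupil) radius

# Ground-truth object: positive random amplitude, a stand-in for a scene.
obj = rng.uniform(0.5, 1.5, (n, n))
spectrum = np.fft.fftshift(np.fft.fft2(obj))

def pupil(cx, cy):
    y, x = np.ogrid[:n, :n]
    return (x - cx) ** 2 + (y - cy) ** 2 <= r ** 2

# Forward model: each camera position sees the field through one shifted
# disc of the spectrum; the sensor records amplitude only (phase is lost).
centers = [(n // 2 + dx, n // 2 + dy)
           for dx in range(-18, 19, 9) for dy in range(-18, 19, 9)]
measured = [np.abs(np.fft.ifft2(np.fft.ifftshift(spectrum * pupil(cx, cy))))
            for cx, cy in centers]

# Recovery by alternating projections: enforce the measured amplitude in
# image space and the pupil's support in Fourier space, sweeping over all
# captures. Overlap between neighboring pupils makes the phase recoverable.
est = np.fft.fftshift(np.fft.fft2(np.ones((n, n), dtype=complex)))
for _ in range(20):
    for (cx, cy), mag in zip(centers, measured):
        m = pupil(cx, cy)
        field = np.fft.ifft2(np.fft.ifftshift(np.where(m, est, 0)))
        field = mag * np.exp(1j * np.angle(field))
        patch = np.fft.fftshift(np.fft.fft2(field))
        est = np.where(m, patch, est)

recon = np.abs(np.fft.ifft2(np.fft.ifftshift(est)))
err0 = np.linalg.norm(np.ones((n, n)) - obj)  # error of a flat guess
err = np.linalg.norm(recon - obj)
print(f"flat-guess error {err0:.1f} -> reconstruction error {err:.1f}")
```

Each low-resolution capture constrains only its own disc of the spectrum, but because the discs overlap, the sweep forces them into mutual consistency and the estimate converges toward the high-resolution object.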

The SAVI team's breakthrough was the discovery that it could put the light source on the same side as the camera rather than behind the target, as in transmission microscopy, Cossairt said. He spent three months at Rice to develop the system with Holloway and others in Veeraraghavan's lab.

"We started by making a larger version of their microscope, but SAVI has additional technical challenges. Solving those is what this paper is about," Veeraraghavan said.

Explore further: No lens? No problem for FlatCam

More information: Jason Holloway et al., "SAVI: Synthetic apertures for long-range, subdiffraction-limited visible imaging using Fourier ptychography," Science Advances, Vol. 3, No. 4, e1602564 (14 April 2017). DOI: 10.1126/sciadv.1602564





Da Schneib
not rated yet Apr 14, 2017
Fascinating; the application of speckle interferometry to photography in real time. If they can do it with natural light, this will be a major advance in optics.
1 / 5 (2) Apr 15, 2017
worthless. They tried speckle interferometry on seeing the surfaces of stars in the 1970s. It was a bust. They produced some fanciful images. Had it not been a flop, it would have been refined today to the point where the largest, closest star surfaces would be somewhat detailed. Also, why, if this technology is remotely refined would it take a decade to migrate into the visible region? Also, the most expensive telephoto I'm aware of today is Canon's 800mm at about $18,000. A Fuji pro video telephoto might run $90k. I don't know any common lenses that costs $100k plus.
5 / 5 (3) Apr 15, 2017
Had it not been a flop, it would have been refined today to the point where the largest, closest star surfaces would be somewhat detailed

No. Speckle interferometry can only get you up to the diffraction limit of the telescope. It works but very few stars are resolved at that low resolution. Optical interferometers can achieve resolutions up to 30 times higher and even they don't produce very detailed images of the surfaces of stars, again because stars are small. It's been widely eclipsed by adaptive optics. It's a very limited technique in astronomy but that doesn't mean it doesn't work.
Da Schneib
5 / 5 (1) Apr 15, 2017
Actually, amateur astronomers have re-discovered one technique for speckle imaging: shift-and-stack, and made some improvements on it by not only selecting out the best images, but even doing so automatically rather than by eye. A lot of planetary enthusiasts use this, because planets are nice bright targets amenable to very short exposures. There are people out there getting some amazing images with webcams and fairly cheap small telescopes using this technique.
not rated yet Apr 16, 2017
Seems like a very high speed "fly's Eye" lens/detector array with very narrow visible band sensitivity would work in visible light as a telephoto lens. The computer power would probably be about the same as a new smart phone.
1 / 5 (1) Apr 16, 2017
I only contributed creatively to the Matrix script, my viewpoint is therefore limited. This technique could utilize ultra large surface receptors to improve distant out of solar system planet images. An inflatable or a series of inflatables, but of course they're probably already doing that and have been with radar for a long time now.
The same could be accomplished through time with multiple exposures of the same object, too, no?
