Researchers use multiple photos to estimate lighting conditions of outdoor scenes

December 10, 2014, Disney Research

Techniques now used to reconstruct 3D models based on multiple photos of a building, object or scene can also be leveraged to automatically estimate illumination conditions depicted in a collection of photographs, scientists at Disney Research and Université Laval report.

Everyone knows that objects can look markedly different depending on the lighting conditions, the physical characteristics of the objects and the angle at which they are viewed. That makes it difficult for photo editors to insert 3D objects into imagery and make them appear as if they are reflecting light or casting shadows naturally. But knowledge of the lighting conditions in an image could greatly simplify such efforts, according to Iain Matthews, principal research scientist at Disney Research in Pittsburgh.

Matthews and Jean-François Lalonde, an assistant professor of electrical and computer engineering at Université Laval, found that structure-from-motion (SfM) algorithms, which are now widely used to create 3D models based on multiple photographs, could be a key to estimating those illumination conditions. They first used SfM techniques to create 3D models based on collections of photos that all focused on the same landmark; they then used an inverse rendering approach they developed to recover the lighting conditions for each of the photos.
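The idea of solving for illumination once the geometry is known can be illustrated with a minimal sketch. Assuming Lambertian shading and a simple directional-sun-plus-ambient lighting model (the paper's actual inverse-rendering formulation is richer), the per-photo lighting reduces to a linear least-squares fit over the reconstructed surface points; the function name and setup below are illustrative, not from the paper.

```python
import numpy as np

def estimate_sun_and_ambient(normals, intensities):
    """Least-squares fit of I ~= ambient + sun . n over surface points.

    normals:     (N, 3) unit surface normals from the SfM/MVS model
    intensities: (N,)   observed pixel intensities at those points
    Returns (sun_vector, ambient); sun_vector's direction is the sun
    direction and its norm the sun strength.
    """
    A = np.hstack([normals, np.ones((normals.shape[0], 1))])
    x, *_ = np.linalg.lstsq(A, intensities, rcond=None)
    return x[:3], x[3]

# Synthetic check: points lit by a known sun direction.
rng = np.random.default_rng(0)
n = rng.normal(size=(500, 3))
n /= np.linalg.norm(n, axis=1, keepdims=True)
true_sun = np.array([0.3, 0.9, 0.3])
true_sun /= np.linalg.norm(true_sun)
obs = 0.2 + np.clip(n @ true_sun, 0.0, None)  # ambient + clamped diffuse

# Fit only on front-lit points, where the clamp is inactive and the
# model is exactly linear.
lit = n @ true_sun > 0
sun, ambient = estimate_sun_and_ambient(n[lit], obs[lit])
sun_dir = sun / np.linalg.norm(sun)
```

Restricting the fit to front-lit points sidesteps the nonlinearity introduced by the shadowed (clamped) hemisphere; the recovered direction then matches the true sun direction.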

They will present their findings at the International Conference on 3-D Vision, Dec. 8-11, in Tokyo.

Lalonde noted that the knowledge of lighting conditions gleaned from this method would permit editors to realistically insert objects not only into one of the photos in a collection but, almost magically, into all of them.

"If one adds a virtual statue in front of a building in one of the photographs from the collection, the same statue can now be inserted in all the other photos with the correct illumination for each image," he said.
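The forward side of that capability can be sketched as well: once each photo has an estimated lighting model, the same virtual object is simply re-shaded under each photo's light. The per-photo lighting values and file names below are illustrative placeholders, and the shading model is a bare Lambertian term, not the paper's renderer.

```python
import numpy as np

def shade(normals, albedo, sun_dir, sun_int, ambient):
    """Lambertian shading of an object's surface normals under one photo's light."""
    diffuse = np.clip(normals @ sun_dir, 0.0, None)
    return albedo * (ambient + sun_int * diffuse)

# One virtual object (a few surface normals), several photos' lighting
# estimates: (sun direction, sun intensity, ambient intensity).
obj_normals = np.array([[0.0, 1.0, 0.0],    # top-facing
                        [1.0, 0.0, 0.0],    # side-facing
                        [0.0, -1.0, 0.0]])  # bottom-facing
per_photo_light = {
    "morning.jpg": (np.array([0.7, 0.7, 0.0]) / np.sqrt(0.98), 0.8, 0.15),
    "noon.jpg":    (np.array([0.0, 1.0, 0.0]), 1.0, 0.25),
}
renders = {name: shade(obj_normals, 0.9, *light)
           for name, light in per_photo_light.items()}
```

Each entry of `renders` holds the object's shading consistent with that photo's estimated illumination, which is what makes the "insert once, composite everywhere" workflow possible.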

To develop the lighting estimation technique, Lalonde and Matthews used a novel database that included collections of photos of 22 different landmarks for which the actual conditions - brightness, position of the sun, sky conditions - were recorded for each photo. Knowledge of the actual conditions provided a check on their ability to estimate those conditions.

Matthews noted that they were able to obtain high-dynamic range (HDR) lighting environment maps even when using input images of low-dynamic range.
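Why an HDR result can come out of LDR inputs: the fitted lighting is parametric, so the sun's radiance is inferred from its effect on the scene rather than read from (saturated) sky pixels, and the fitted model can then be evaluated into an environment map whose values far exceed the [0, 1] range of the input photos. A minimal sketch, with placeholder parameters standing in for values an inverse-rendering fit would produce:

```python
import numpy as np

def sky_env_map(sun_dir, sun_intensity=500.0, sky_intensity=1.0,
                height=64, width=128):
    """Latitude-longitude HDR map: a narrow bright sun lobe plus uniform sky."""
    theta_1d = (np.arange(height) + 0.5) / height * np.pi      # polar angle
    phi_1d = (np.arange(width) + 0.5) / width * 2 * np.pi      # azimuth
    theta, phi = np.meshgrid(theta_1d, phi_1d, indexing="ij")
    dirs = np.stack([np.sin(theta) * np.cos(phi),
                     np.cos(theta),
                     np.sin(theta) * np.sin(phi)], axis=-1)
    cos_angle = dirs @ sun_dir
    sun_lobe = np.exp((cos_angle - 1.0) * 400.0)               # sharply peaked at sun
    return sky_intensity + sun_intensity * sun_lobe

# Sun overhead; the map's peak radiance is hundreds of times the LDR ceiling of 1.0.
env = sky_env_map(np.array([0.0, 1.0, 0.0]))
```

An environment map like this is exactly what image-based lighting pipelines in visual effects consume, which is the connection Matthews draws in the quote below.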

"You can edit existing outdoor photos by inserting any 3D object you want, and it will look believable - without the additional lighting and HDR capture equipment usually required to do this in visual effects," he added.

One limitation, they noted, is that the estimation technique only works in outdoor images with natural illumination.


More information: … r-image-collections/
