(Phys.org) —Panasonic's new color filtering technology is in the news this week after a video from DigInfo TV showed what the company's imaging experts have been up to: "micro color splitters," which achieve twice the brightness previously possible. These micro color splitters replace the traditional filter array over the image sensor. The new approach is especially relevant to low-light photography: scenes with less than full daylight outdoors, or any indoor shooting without much ambient light. The researchers found their approach could nearly double the brightness of photos taken in low-light environments. Rejecting traditional color filters, the researchers wanted a technique in which light is captured without any loss.
The problem has been that image sensors have produced color pictures by using red, green, and blue filters for each pixel, but with that system, 50 percent to 70 percent of the light is lost. The micro color splitters control the diffraction of light at a microscopic level. Panasonic's imaging experts said that they achieved approximately double the color sensitivity in comparison with conventional sensors that use color filters.
"Conventional color image sensors use a Bayer array [the arrangement of color filters used in imaging sensors in digital cameras, camcorders, and scanners to create a color image]. The filter pattern is 50 percent green, 25 percent red and 25 percent blue in which a red, green, or blue light-transmitting filter is placed above each sensor. These filters block 50 to 70 percent of the incoming light before it even reaches the sensor," according to a Panasonic release.
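The 50/25/25 split quoted above follows from the repeating 2×2 tile of the Bayer mosaic. A minimal sketch of that tile (illustrative only, not drawn from Panasonic's materials):

```python
from collections import Counter

# One 2x2 tile of the standard RGGB Bayer mosaic; tiling it covers the sensor.
bayer_tile = [
    ["R", "G"],
    ["G", "B"],
]

counts = Counter(c for row in bayer_tile for c in row)
total = sum(counts.values())
for color in "RGB":
    print(color, counts[color] / total)  # G -> 0.5, R and B -> 0.25 each
```

Because each sensor site sees only one of the three colors, the other components of the incoming light are absorbed by the filter, which is where the quoted 50 to 70 percent loss comes from.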
Seeing demand for higher-sensitivity cameras on the rise, Panasonic sought a new solution to enable sensors to capture "uniquely vivid" color images.
In the video, Seiji Nishiwaki commented further: "Here, color filters aren't used. So light can be captured without loss, which enables us to achieve approximately double the sensitivity."
Nishiwaki said Panasonic's technology can be used on different types of sensors, whether CCD, CMOS, or BSI, and is compatible with current semiconductor fabrication processes. He said the new approach would not require any special materials or processes.
According to DigInfo TV: "The image sensor uses two types of color splitters: red deflectors and blue deflectors. The red and blue deflectors are arranged diagonally, with one of each for every four pixels. RGB values can be obtained by determining the intensity of light reaching each of the four pixels. For example, if white light enters each pixel, pixels where it doesn't pass through a deflector receive unmodified white light. But in pixels with a red deflector, the light is split into red diffracted light and cyan non-diffracted light. And when white light passes through a blue deflector, it's split into blue diffracted light and yellow non-diffracted light. As a result, the pixel arrangement is cyan, white + red, white + blue, and yellow. The RGB values are then calculated using a processing technique designed specifically for mixed color signals."
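The description above implies a simple linear mixing model: each group of four pixels reads cyan, white + red, white + blue, and yellow, so the incident R, G, B values can be recovered by inverting that mixing. The per-pixel signal compositions below are assumptions inferred from the article's wording; the actual reconstruction algorithm is Panasonic's and is not public:

```python
import numpy as np

# Assumed signal model for one 2x2 pixel group (inferred, not Panasonic's spec):
#   cyan pixel         reads G + B         (red deflected away to a neighbor)
#   white + red pixel  reads R + G + B + R (own white light plus diffracted red)
#   white + blue pixel reads R + G + B + B (own white light plus diffracted blue)
#   yellow pixel       reads R + G         (blue deflected away to a neighbor)
MIXING = np.array([
    [0, 1, 1],   # cyan
    [2, 1, 1],   # white + red
    [1, 1, 2],   # white + blue
    [1, 1, 0],   # yellow
], dtype=float)

def recover_rgb(signals):
    """Recover (R, G, B) from the four mixed pixel signals by least squares."""
    rgb, *_ = np.linalg.lstsq(MIXING, np.asarray(signals, dtype=float), rcond=None)
    return rgb

# A pure-red scene (R=1, G=0, B=0) would produce signals (0, 2, 1, 1).
print(recover_rgb([0, 2, 1, 1]))  # -> approximately [1, 0, 0]
```

With four measurements and three unknowns the system is overdetermined, so a least-squares solve is a natural (hypothetical) choice; under this model R and B also fall out directly as half the difference of the paired signals.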
Nishiwaki made special note of something called Babinet-BPM: "We've developed a completely new analysis method, called Babinet-BPM. Compared with the usual FDTD method, the computation speed is 325 times higher, but it only consumes 1/16 of the memory. This is the result of a three-hour calculation by the FDTD method. We achieved the same result in just 36.9 seconds."
FDTD stands for finite-difference time-domain and BPM stands for beam propagation method. Both are numerical analysis techniques.
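For context, an FDTD solver steps Maxwell's equations forward in time on a spatial grid, which is why it is memory- and compute-hungry; that cost is what Babinet-BPM reportedly cuts. A minimal one-dimensional FDTD sketch in normalized units (purely illustrative, unrelated to Panasonic's implementation):

```python
import numpy as np

# Minimal 1D FDTD (Yee scheme), normalized units, Courant number 0.5.
n, steps = 200, 300
ez = np.zeros(n)   # electric field samples
hy = np.zeros(n)   # magnetic field samples (staggered half a cell)

for t in range(steps):
    hy[:-1] += 0.5 * (ez[1:] - ez[:-1])              # H update from curl of E
    ez[1:] += 0.5 * (hy[1:] - hy[:-1])               # E update from curl of H
    ez[n // 2] += np.exp(-((t - 30.0) / 10.0) ** 2)  # soft Gaussian source
```

Each time step touches every grid cell, and a full 3D simulation of a sensor structure multiplies that work by the grid volume and vector field components, which makes the reported 325x speedup and 1/16 memory footprint of Babinet-BPM significant for device design.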
Panasonic's work is also described in Nature Photonics, in a study called "Efficient colour splitters for high-pixel-density image sensors." The authors said, "We experimentally demonstrate that this principle of colour splitting based on near-field deflection can generate colour images with minimal signal loss."
More information: Nature Photonics paper: www.nature.com/nphoton/journal/v7/n3/full/nphoton.2012.345.html