Computer scientists to develop smart vision machines

September 20, 2010, University of Southern California
An artist's conception of the neurovision sensor with a custom-machined gimbal head, custom printed circuit boards, and a precision-machined unibody enclosure. The gimbal head is crucial: it allows the "eye" to move and actively scan the scene based on feedback from the software system. Credit: USC Viterbi School of Engineering

Five years ago, Laurent Itti of the University of Southern California presented groundbreaking research on how humans see the world. Now, he is heading a $16-million Defense Advanced Research Projects Agency effort to build machines that see the world in the way he discovered humans see.

But to do this, Itti and fellow scientists from eight other institutions first plan to learn more about how humans see, and then immediately use this more detailed knowledge to build new machines.

The "neuromorphic visual system for intelligent unmanned sensors" project builds on a previous effort called Neovision, which attempted the same goal but relied on existing technology that was not completely compatible with the neural mechanisms Itti had discovered were crucial.

The new project will instead build its eyes from the bottom up: researchers will first determine more precisely how these neural systems function, then embed that new understanding in purpose-built software and hardware, tying research directly to development in a continuing reciprocal process.

The goal is clear: "The modern battlefield requires that soldiers rapidly integrate information from a multitude of sensors," says Itti's project description. "A bottleneck with existing sensors is that they lack intelligence to parse and summarize the data. This results in information overload … Our goal is to create intelligent general-purpose cognitive vision sensors inspired from the primate brain, to alleviate the limitations of such human-based analysis."

To do this will require applying existing understandings as well as new fundamental research: "At the core of the present project is the belief that new basic science is crucial," the project description continues.

This is because living creatures' vision involves much more than just images. Rather, it was honed by evolution to seek out the specific visual signals critical to a seeing creature's survival. This involves complex circuitry in the retina, where the outputs from light-detector cells are processed to give rise to 12 different types of visual "images" of the world (as opposed to the red, green, and blue images of a standard camera). These images are further processed by complex neural circuits in the visual cortex, and in deep-brain areas including the superior colliculus, with feedback driving eye movements.
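The idea of decomposing one scene into several parallel "images" can be illustrated with a minimal sketch. The split below, into an intensity channel and two color-opponent channels, is loosely modeled on the decomposition used in Itti's earlier saliency work; it is illustrative only, and far simpler than the retina's actual 12 channels.

```python
def opponent_channels(pixel):
    """Split one RGB pixel into an intensity signal and two
    color-opponent signals, in the spirit of the Itti-Koch
    saliency decomposition (illustrative only)."""
    r, g, b = pixel
    intensity = (r + g + b) / 3.0
    red_green = r - g                # red vs. green opponency
    blue_yellow = b - (r + g) / 2.0  # blue vs. yellow opponency
    return intensity, red_green, blue_yellow

# A pure-red pixel: moderate intensity, strong red-green signal,
# negative blue-yellow signal
print(opponent_channels((1.0, 0.0, 0.0)))
```

Each channel can then be processed separately, so that downstream stages respond to contrast in color or brightness rather than to raw pixel values.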

Itti's 2006 paper "Bayesian Surprise Attracts Human Attention," co-authored by Pierre Baldi of UC Irvine, argued that a mathematical algorithm analyzing incoming visual data for a precisely characterized quantity the authors called "surprise" seemed to fit experimental eye-movement data. The theory was the basis of the earlier Neovision work, and has since been extensively revised and developed.
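In the paper, surprise is defined as the KL divergence between an observer's belief before and after each observation. The sketch below illustrates that idea with a deliberately simple Gaussian model of a one-dimensional signal (the model choice and parameter values are this sketch's assumptions, not the paper's, which used richer models over video data).

```python
import math

def kl_gauss(m1, v1, m2, v2):
    """KL divergence KL( N(m1, v1) || N(m2, v2) ) between two
    univariate Gaussians given by mean and variance."""
    return 0.5 * (math.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def surprise_stream(xs, obs_var=1.0, prior_mean=0.0, prior_var=1.0):
    """Bayesian surprise per observation: KL(posterior || prior)
    under a Gaussian model with known observation variance."""
    m, v = prior_mean, prior_var
    out = []
    for x in xs:
        # Conjugate update: precisions add, means combine by precision
        v_post = v * obs_var / (v + obs_var)
        m_post = v_post * (m / v + x / obs_var)
        out.append(kl_gauss(m_post, v_post, m, v))
        m, v = m_post, v_post
    return out

# A steady signal yields little surprise; a sudden jump yields a spike
scores = surprise_stream([0.0] * 20 + [5.0] + [0.0] * 5)
print(scores.index(max(scores)))  # the jump at index 20 is most surprising
```

The key property is that surprise measures belief change, not raw signal magnitude: once the model has adapted to the new level, subsequent identical samples are no longer surprising.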

The new plan is to go much farther than Neovision. The researchers will model the entire complex interactive system from the ground up to understand the exact messages transmitted from the retina to the cortex and on to the colliculus, and how brain cells interpret them. Then they will attempt to embed parallel computations, using the same perception algorithms, in working silicon systems.

The execution of this strategy will be unusual. Typically, such projects aim at a finished system at the end of the project, reporting progress toward such a system at regular intervals.

Instead, Itti's plan is to create a whole series of prototypes, complete with breadboard hardware, at a rate of one every six months, as understandings emerge from the project's basic-science side.

Researchers will regularly convene at USC for intensive workshops, to learn the findings of a "core team of engineers at USC [interacting] with Ph.D. students and postdocs, performing the basic science directly in an academic setting. Because the engineering core will work directly with the researchers, we expect that the technology transition will be swift and efficient," said Itti.

Itti is an associate professor in the Viterbi School Department of Computer Science. The other institutions that will be part of the effort he is leading include UC Berkeley, Caltech, MIT, Queen's University (Canada), Brown, Arizona State, and Penn State, along with a company, Imagize, that specializes in the field.
