Researcher sees new angles in visual search

October 26, 2011 by Beth Kwon, Columbia University
Engineering professor Shih-Fu Chang is developing technology to make visual search as effortless as typing keywords into a search engine. Image credit: Eileen Barroso/Columbia University

Engineering professor Shih-Fu Chang is trying to make visual search technology as effortless as typing a keyword like “Morningside restaurants” into Google.

In Chang’s ideal world, you could sift through a season’s worth of Major League Baseball games to find every double play or sort through your digital archives to find all the pictures you’ve taken of your kid blowing out her birthday candles.

Chang, the Richard Dicker Professor of Telecommunications, who has a joint appointment in electrical engineering and computer science, develops algorithms to identify and index data, as well as new techniques and software systems to help users manage large amounts of multimedia information. The director of Columbia Engineering’s Digital Video and Multimedia Lab, Chang was recently honored with a lifetime achievement award from the Association for Computing Machinery’s Special Interest Group on Multimedia.

Chang credits his associates for the prestigious award. “It really should be attributed to all the wonderful students and collaborators I have had the good fortune to work with during my career,” he said.

A native of Taiwan, he was fascinated by technology from a young age. He remembers getting his first computer, an Apple II, as a high school student in 1981 and envying classmates who were fortunate enough to have an Atari computer system.

“I grew up in a time when the information technology industry was rapidly expanding, and it caught the imagination of young people,” Chang recalls. “My parents didn’t know anything about computers, but we as students caught the wave and jumped on it.” He received his Ph.D. from the University of California, Berkeley, and joined Columbia in 1993.

In 1998, he developed one of the first video search systems, VideoQ, and during the 1990s he pioneered search-by-sketch technology, in which users draw what they are looking for.

His work has been broadly funded by government and industry, including Eastman Kodak, and many video indexing technologies developed by his group have been licensed to companies. With the support of the National Science Foundation, he is working on technology to determine whether images and videos have been tampered with.

Recently, he’s been working on technology that lets users adjust the importance of multiple characteristics in a search. If you’re looking for an image of a sun setting over a mountain near a body of water, for instance, you would enter the search words “sunset,” “mountain” and “water” into a grid next to a database of images. Then you could home in on a specific feature—the mountain, for instance—by using your mouse to nudge the cursor closer to that word. All the images in the database would be indexed automatically without time-consuming human tagging.
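The adjustable-importance search described above can be sketched in a few lines. This is an illustrative toy, not Chang's actual system: it assumes each image has already been auto-indexed with per-concept confidence scores, and that nudging the cursor toward a word simply raises that concept's weight. All names and values here are hypothetical.

```python
# Toy sketch of weighted multi-concept image ranking. Assumes each image has
# been auto-indexed with per-concept confidence scores (no manual tagging).

def rank_images(index, weights):
    """Rank images by the weighted sum of their concept scores.

    index   -- dict mapping image name -> {concept: score in [0, 1]}
    weights -- dict mapping concept -> user-adjusted importance
    """
    def relevance(scores):
        return sum(weights.get(c, 0.0) * s for c, s in scores.items())
    return sorted(index, key=lambda img: relevance(index[img]), reverse=True)

# Auto-indexed concept scores for three images (illustrative values).
index = {
    "alps.jpg":  {"sunset": 0.2, "mountain": 0.9, "water": 0.1},
    "beach.jpg": {"sunset": 0.8, "mountain": 0.1, "water": 0.9},
    "fjord.jpg": {"sunset": 0.6, "mountain": 0.7, "water": 0.8},
}

# Nudging the cursor toward "mountain" raises that concept's weight.
print(rank_images(index, {"sunset": 1.0, "mountain": 3.0, "water": 1.0}))
# → ['fjord.jpg', 'alps.jpg', 'beach.jpg']
```

With all weights equal, the sunny beach photo would score higher; tripling the "mountain" weight pulls the mountain scenes to the top of the ranking.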

Other research projects let users upload a specific image and find matches for it across the Internet. Chang is working on refining technology that allows people to specify the exact portion of an image to search for—a certain sculpture or building, for example—so that searching is more accurate and efficient, even on mobile devices like iPhones. His research could also be applied in the medical field, allowing technicians to find a moment in an ultrasound film that could help make a diagnosis.
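One simple way region-restricted matching can work is to crop the user's bounding box and compare compact descriptors of the cropped region, such as intensity histograms. The sketch below is a deliberately minimal illustration of that idea, not Chang's method; the grayscale 2D-list representation and all helper names are assumptions.

```python
# Toy sketch of region-restricted image matching: crop the user's bounding
# box, summarize it as a coarse intensity histogram, and compare histograms.
# Images are grayscale 2D lists with pixel values in [0, 256).

def crop(image, top, left, height, width):
    """Extract the user-specified bounding box from a 2D image."""
    return [row[left:left + width] for row in image[top:top + height]]

def histogram(region, bins=4):
    """Normalized coarse intensity histogram of a region."""
    counts = [0] * bins
    for row in region:
        for v in row:
            counts[min(v * bins // 256, bins - 1)] += 1
    total = sum(counts)
    return [c / total for c in counts]

def similarity(h1, h2):
    """Histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# A tiny image whose left half is dark and right half is bright.
image = [[0, 0, 255, 255],
         [0, 0, 255, 255]]

query = histogram(crop(image, 0, 0, 2, 2))   # describe only the dark region
print(similarity(query, histogram([[10, 5], [0, 20]])))
# → 1.0 (another all-dark patch matches the selected region perfectly)
```

Because only the cropped region is described, the bright half of the image cannot dilute the match, which is the point of letting users select the exact portion to search for.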

Another aspect of his research involves using the human brain as a search tool. In collaboration with biomedical engineering professor Paul Sajda, Chang designed a device that monitors brain activity as a subject looks at pictures. First, an EEG machine records the “aha” moments of recognition, then a computer analyzes the recognition patterns to identify similarities in other photographs in large databases that elicited the same strong reaction.
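The two-stage pipeline described above—detect "aha" responses first, then propagate them through the image database—can be outlined as follows. This is a hypothetical sketch, not the Sajda–Chang system: the EEG classifier is stubbed out as a per-image score, and the feature vectors, threshold, and function names are all assumptions.

```python
# Illustrative sketch of brain-assisted image triage. Stage 1: keep images
# whose (precomputed, hypothetical) EEG recognition score crossed a threshold.
# Stage 2: find visually similar images in a larger database by feature distance.
import math

def aha_images(eeg_scores, threshold=0.7):
    """Images whose EEG response signaled an 'aha' moment of recognition."""
    return [img for img, score in eeg_scores.items() if score >= threshold]

def similar(query_features, database, top_k=2):
    """Nearest neighbors in feature space (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return sorted(database, key=lambda img: dist(query_features, database[img]))[:top_k]

# Hypothetical EEG scores from a viewing session, and toy feature vectors.
eeg_scores = {"tank.jpg": 0.9, "tree.jpg": 0.2}
database = {
    "tank.jpg":  [1.0, 0.0],
    "truck.jpg": [0.9, 0.1],
    "lake.jpg":  [0.0, 1.0],
}

for hit in aha_images(eeg_scores):           # the human's contribution
    print(similar(database[hit], database))  # the machine's contribution
# → ['tank.jpg', 'truck.jpg']
```

The division of labor matches the quote that follows: the brain supplies fast, effortless recognition, and the computer supplies exhaustive search over databases far too large for a person to scan.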

“The machine does what it’s best at, and the human does what he or she is best at in the most natural way,” Chang says.
