(PhysOrg.com) -- Not every child can dream up a smartphone application and see it come to life. But that's what happened when 8-year-old William Belhumeur suggested his father make an app that identifies plants using visual recognition technology.
As a professor of computer science at the engineering school and director of Columbia's Laboratory for the Study of Visual Appearance, Peter Belhumeur has worked on face recognition software since the mid-1990s. He quickly saw that the same algorithms that can process the curve of an eyebrow or the angle of a cheekbone could be applied to the shape of a leaf.
"The idea of building classifiers that say, 'Is this person in the photo a man or a woman?' or 'Is that leaf a sugar maple or a silver maple?' uses a lot of the same sort of math and technology," says Belhumeur.
With the help of computer scientist David Jacobs at the University of Maryland and John Kress, research botanist and curator at the Smithsonian Institution, Belhumeur developed LeafSnap, an electronic field guide that is now available for the iPhone and iPad, with an Android version due later this year. It is easy enough for a child to use, but goes well beyond the basics for botanists.
The team started by photographing leaves from the Smithsonian's vast library. But they soon realized a viable application would have to be able to recognize leaves in the wild, not just museum specimens. So Belhumeur's student volunteers collected thousands of leaves from Central Park (up to 50 samples each from the park's 145 species) and photographed them with their iPhones.
A leaf's shape is its least variable feature and the easiest to capture in a photo, so the team focused on characteristics like smooth versus jagged edges and many-lobed versus single-lobed outlines. They then programmed the computer to perform a sort of process of elimination. "The computer basically ranks images from most similar to least similar," says Neeraj Kumar, a Ph.D. candidate in computer science who manages LeafSnap's software coding and is in charge of the volunteer leaf-identifying team.
Back in a Schapiro Hall lab, the team trained the computer to distinguish one species from another. "We pick one feature we extract from the leaf, and using that we can say, 'This looks more like all of these maples I've seen and less like something else,'" says Kumar.
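The "process of elimination" Kumar describes can be sketched as a simple nearest-neighbor ranking. The species names, feature values, and feature choices below are purely illustrative assumptions, not LeafSnap's actual descriptors or algorithm:

```python
import math

# Hypothetical shape descriptors per species, e.g. (lobe count,
# edge jaggedness, aspect ratio). Values are invented for illustration.
DATABASE = {
    "sugar maple":  [5.0, 0.20, 1.10],
    "silver maple": [5.0, 0.60, 1.30],
    "white oak":    [7.0, 0.10, 1.80],
    "ginkgo":       [1.0, 0.00, 0.90],
}

def rank_matches(query, database):
    """Rank species from most to least similar to the query leaf,
    using Euclidean distance between shape-feature vectors."""
    def distance(vec):
        return math.sqrt(sum((q - v) ** 2 for q, v in zip(query, vec)))
    return sorted(database, key=lambda name: distance(database[name]))

# A photographed leaf reduced to the same feature space.
query_leaf = [5.0, 0.55, 1.25]
print(rank_matches(query_leaf, DATABASE))  # closest match listed first
```

Here the query leaf sits nearest the "silver maple" vector, so that species tops the ranked list the user would see.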
The app, which is free, allows a user to photograph a leaf, upload it and see a list of possible matches within seconds. There is also a complementary website (http://leafsnap.com) with profiles of each species. Initial interest is high; the app has been installed 150,000 times.
In addition to the Central Park trees, LeafSnap's database covers the 160 species in Washington, D.C.'s Rock Creek Park; between the two parks, most native species in the Northeast are represented. Belhumeur's team hopes eventually to map species across the United States and use a crowd-sourcing element to let users add their own images to the database. "This is the sort of system we need, because species are disappearing off the planet at an alarming rate and the process of identification is very slow," explains Belhumeur.
Belhumeur went to Brown as an undergraduate and received his Ph.D. in engineering sciences from Harvard. He came to Columbia in 2002 after eight years as an electrical engineering professor at Yale. For him, LeafSnap bridges his high-tech background and love of nature. As a child in Providence, R.I., he remembers looking up at trees with his parents and trying to identify leaves with a field guide. Now his family has a farm in Cornwall, Conn., where they raise cattle, sheep, pigs, geese and chickens. "It was fun to take that visual recognition technology and drop it into this domain, working with biologists and doing something I cared about as a kid," he says.
His son is already thinking of new apps. "I think it is time for Fishsnap and Bugsnap," says William. So there is still a lot of work to do.