Self-directed robot can identify objects

"That is a ball." "I do believe that is a cone." "Seems like a wonderful book." The voice is mechanical and flat, and anyone offering such banal commentary and sounding so bored would surely bomb in a job interview. But in this case, the observations are impressive. They're made by what looks like a two-foot-tall stack of hors d'oeuvre trays on wheels, careening around the floor and proclaiming its discoveries as its "eye," an attached camera, falls on them.

This robot has learned to recognize these specific objects—and to steer around obstacles, albeit clumsily—without human guidance. Its camera sends information about what it sees to a laptop sitting atop the robot; the laptop in turn communicates with a laboratory desktop, whose monitor flashes whatever the robot's camera catches.

"It's almost self-thinking" in its ability to get around roadblocks, says Emily Fitzgerald (ENG'16), who bestowed the 'bot with a brain as her summer 2015 project with Boston University's Undergraduate Research Opportunities Program (UROP), which provides funding for faculty-mentored research by undergrad students. More important than the robot's autonomous navigation, she says, is its ability to recognize specific objects.

Such self-guiding, object-spotting robots are a holy grail for scientists, with potential applications that include exploring distant planets' landscapes. Fitzgerald used a deep neural network, a form of artificial intelligence that loosely simulates the behavior of brain neurons. Deep neural networks process huge amounts of data to solve problems such as recognizing a ball or a cone.

"There's an algorithm that will take a ton of pictures of one object and will put it in and compile it all," says Fitzgerald. "Then we basically assign a number to it." The robot "will come upon an object and it will say, 'Oh, there's an object in front of me, let me think about it.' It will…find a picture that corresponds with the object, pick that number, and then it will be able to use that as a reference, so it can exclaim, 'Oh, it's a ball,' 'It's a cone,' or whatever object I had decided to teach it."

Massimiliano Versace (GRS'07), a BU College of Arts & Sciences research assistant professor and director of BU's Neuromorphics Lab, oversaw Fitzgerald's UROP project, and she had help from Lucas Neves (ENG'16), a volunteer in Versace's lab, and Matthew Luciw, a visiting researcher at BU's Center for Computational Neuroscience & Neural Technology.

Asked how hard it was to train their metallic pupil in object recognition, the team members laugh. "There were quite a few times where we did despair a little bit that, you know, this wasn't going to work," says Fitzgerald, who first had to master an unfamiliar programming language. Then the team needed to make sure that the array of different software in the project would work together "without crashing the system," she says.

Often, the software wasn't compatible, resulting in a somewhat ditsy robot. "Most of the time, it just didn't start," Neves says, ruefully recalling those tough moments. It also could get lost: sensors in its wheels tell the robot how far it's traveled. But "the wheels weren't moving at a constant rate, so whenever the robot would shoot off, it would think it had gone farther than it had because the wheels spun faster," says Fitzgerald.
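The lost-robot problem comes down to wheel odometry: the robot converts wheel-sensor ticks into distance traveled, so wheels that spin faster than the robot actually moves inflate the estimate. The short sketch below illustrates the effect; the encoder resolution, wheel size, and slip factor are made-up values, not the project's.

```python
# Sketch of wheel-encoder odometry: encoder ticks are converted to distance,
# so a wheel that spins faster than the robot moves (slip) makes the robot
# think it has traveled farther than it really has.
import math

TICKS_PER_REV = 360       # assumed encoder resolution (ticks per wheel turn)
WHEEL_DIAMETER_M = 0.10   # assumed wheel diameter in meters


def distance_from_ticks(ticks: int) -> float:
    """Convert encoder ticks into meters of assumed forward travel."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * math.pi * WHEEL_DIAMETER_M


# If the wheels slip and spin 1.5x faster than the robot actually moves,
# the odometry estimate overshoots -- the failure Fitzgerald describes.
true_travel_m = distance_from_ticks(720)             # ticks matching real motion
estimated_m = distance_from_ticks(int(720 * 1.5))    # inflated ticks from slip
print(f"actual ~ {true_travel_m:.2f} m, odometry estimate ~ {estimated_m:.2f} m")
```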

So the Terminator it isn't. Whether Fitzgerald's project will yield a commercial application someday remains an open question, says Versace, but he has no doubt about the viability of this type of work. Versace heads Neurala, a BU spin-off company, and members of his lab met recently with NASA to discuss related research.

As for Fitzgerald, who was turned on to engineering after excelling at physics and math in high school, she says the project persuaded her to pursue a career in bioimaging. Someday, she says, robotic surgical devices running off neural networks will detect objects in human patients.

"I've actually taken this project and I've said, OK, what else can I do with it in the biomedical setting as well?" she says. "It's really shaped how I've thought about my future going forward."

Provided by Boston University

