Students demo autonomous robotic systems

August 22, 2012 By Anne Ju, Cornell University
Chuck Yang, M.Eng. '12, models a pair of glasses that mirror a computer screen, providing the user with a heads-up display as he communicates with a robot on a search-and-rescue mission. Image: Lindsay France

Pop into Cornell's Autonomous Systems Lab in Rhodes Hall any given day, and a mechanical arthropod might be negotiating a steep ramp, or a Roomba-like rover could be cleaning up a cluttered room.

Students led by Mark Campbell, professor of mechanical and aerospace engineering, and Hadas Kress-Gazit, assistant professor in the same department, are helping to bring robotics out of the rigid, hard-wired programming systems of yore into more sophisticated, integrated and automated functions for a variety of robot platforms. Several student researchers showed off their latest contributions at the end of last semester with a project demo.

"What we want to do is create machines that can do things autonomously," explained Kress-Gazit, whose research interests include a high-level software toolkit called Linear Temporal Logic Mission Planning (LTLMoP). "That means from the simplest things -- 'Don't collide with something,' to the more complex."

For instance, Robert Villalba '15 used LTLMoP to create high-level commands for a spiderlike robot that's smart enough to traverse different terrains without being programmed with every move; instead, it reacts to an environment based on broad specifications.

"The idea is to use English and tell the robot what you want it to do," Villalba said. The robot, for example, walks with a relaxed gait on a flat surface; when it encounters an incline, it adjusts its gait with more exaggerated movements to aid its climb.
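The behavior Villalba describes can be pictured as a reactive rule that maps sensed terrain to a gait. The sketch below is an illustration of that idea only, not the actual LTLMoP toolkit; the threshold value and gait names are invented for the example.

```python
def choose_gait(incline_deg):
    """Map a sensed incline (in degrees) to a named gait.

    The 5-degree threshold and the gait names are illustrative
    assumptions, not values from the Cornell project.
    """
    if incline_deg < 5:
        return "relaxed"      # flat ground: low-effort walking gait
    return "exaggerated"      # incline: larger movements to aid the climb

# Simulated sensor readings as the robot walks from flat floor onto a ramp
readings = [0, 1, 2, 12, 15]
gaits = [choose_gait(r) for r in readings]
print(gaits)  # the gait switches once the incline is detected
```

The point of the toolkit is that the user states the rule at this high level ("walk relaxed on flat ground, climb carefully on ramps") and the system synthesizes the low-level control, rather than the user scripting every footfall.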

Annie Dai '12 demonstrates her project, a universal robotic gripper integrated with a platform that patrols an "apartment" to find trash. Image: Lindsay France

Another group might someday put campus tour guides out of a job. Ahmed Elsamadisi '14 and his group demonstrated their design of a robot that can give a prerecorded campus tour -- and knows where it's going with minimal human supervision.

They used a rolling Segway platform and QR code-like tags along the walls of Rhodes Hall as a "vision" system so the robot could orient itself. Preloaded with specifications, the robot can negotiate hallways and corners on its own, playing audio recordings along the way. It can adjust its behavior depending on what it encounters -- such as a cluster of people walking by.
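The tag-based orientation scheme can be sketched as a lookup: each wall tag's ID maps to a known hallway position, so seeing a tag localizes the robot, which then heads for the next stop on its preloaded route. All tag IDs, positions and waypoints below are invented for illustration; they are not from the Cornell project.

```python
# Hypothetical map: tag ID -> hallway position (x, y), plus a tour route.
TAG_MAP = {"RH-101": (0, 0), "RH-102": (10, 0), "RH-103": (10, 8)}
ROUTE = [(0, 0), (10, 0), (10, 8)]

def localize(tag_id):
    """Return the robot's position from the last tag it saw."""
    return TAG_MAP[tag_id]

def next_waypoint(position, route):
    """Waypoint after the current position on the route (None at tour's end)."""
    i = route.index(position)
    return route[i + 1] if i + 1 < len(route) else None

pos = localize("RH-102")          # robot spots tag RH-102 on the wall
print(pos, "->", next_waypoint(pos, ROUTE))
```

A real system would fuse tag sightings with odometry and handle obstacles such as passing pedestrians, but the core loop is the same: sense a landmark, update the position estimate, steer toward the next preloaded waypoint.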

"The robot has to constantly inform itself and update its knowledge based on what it sees," Elsamadisi said. "The full goal would be [for it] to learn habits … and respond with verbal communication."

Annie Dai '12 demonstrated a robotic universal "gripper" -- a balloon filled with coffee grounds that hardens around objects to pick them up -- integrated with a platform that patrols the "bedrooms" of a mock apartment to find trash, pick it up and drop it off in receptacles. Like the arthropod and the tour-guide robot, this robot operates with a high-level understanding of its environment, reacting to situations as they arise, guided by simple English commands.

Yet another group, aiming to improve search-and-rescue missions, designed a software interface with an interactive map of an area (in this case, the engineering quad) into which a person could input information using a touchpad -- such as, "go here, not there" or "there may be something interesting to look at here, but not there." In turn, the robot could relay information back to the human.
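One simple way to picture those map annotations is as regions that filter the robot's candidate search points: keep points inside areas the operator marked "go," drop points inside areas marked "not there." The rectangle regions and coordinates below are invented for illustration and are not how the Husion interface is actually implemented.

```python
def in_region(point, region):
    """Axis-aligned rectangle test: region = (xmin, ymin, xmax, ymax)."""
    x, y = point
    xmin, ymin, xmax, ymax = region
    return xmin <= x <= xmax and ymin <= y <= ymax

def filter_points(points, go_regions, avoid_regions):
    """Keep search points the operator allowed and did not forbid."""
    return [p for p in points
            if any(in_region(p, r) for r in go_regions)
            and not any(in_region(p, r) for r in avoid_regions)]

go = [(0, 0, 20, 20)]        # "go here": the area to search
avoid = [(5, 5, 10, 10)]     # "not there": a region to skip
candidates = [(2, 2), (7, 7), (15, 3)]
print(filter_points(candidates, go, avoid))  # drops the forbidden point
```

The division of labor Ahmed describes shows up cleanly here: the human supplies judgment (which regions matter), and the robot mechanically applies it across every candidate point.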

And to make things truly futuristic, the students loaded their interface into a pair of monitor goggles that mirror the computer screen. This gives the wearer a heads-up display -- more convenient than looking down at a screen, and better suited to a search-and-rescue environment.

"The idea is to leverage things robots are good at, like doing long, monotonous or dangerous missions, with the things humans are good at, which is interpreting scenes and doing high-level decision making and pattern recognition," said Nisar Ahmed, a postdoctoral associate who works on the project, which is called Husion.

These projects and more all demonstrate the lab's unifying theme: increasing the autonomy of robots, Kress-Gazit said.

"Robotics has different flavors, but here we focus on machines that can do things autonomously with minimum human intervention, and still be safe while doing something interesting," she said.


