Professors teach robot to 'play ball'

Sep 26, 2008
Ph.D. student researcher Michael Rush and professors Thomas Sugar and Michael McBeath work with their ground-ball-fielding robot, Catchbot. Photo by John C. Phillips

(PhysOrg.com) -- Baseball is elegant in its simplicity. Pitch a ball, hit the ball. Score more runs than your opponent and you win the game.

But baseball is also complex, and just how complex depends on the level at which the game is played.

At the major league level, baseball becomes a game of inches and of finely tuned instinct. The shortstop knows the hitter well enough to shade him a little to his left. He also knows the pitcher likes to pitch away to this particular batter.

As the pitcher winds up and makes his pitch, the shortstop instinctively leans to his left. The ball is hit hard right up the middle past the pitcher and nearly over second base. That is where the shortstop has raced to intercept the ground ball. The shortstop then uses his lateral momentum and throws to first base. The throw barely beats the runner racing down the baseline.

Derek Jeter and Torii Hunter are just two of many all-star major league baseball players. They routinely make the great plays. They also make those plays look easy. Players like Hunter and Jeter combine speed, agility, and quick decision-making with anticipation. As a result, they can chase down a fly ball hit over their heads or snare a line drive or hard-hit ground ball scorched to their right or left.

Now, try and teach a robot those types of skills.

That is the challenge confronting Michael McBeath and Thomas Sugar. McBeath is a psychology professor at Arizona State University. His specialty is perception. He has studied how people visually track and catch a baseball and how dogs catch Frisbees. Sugar is an engineering professor and robotics researcher at ASU’s Polytechnic campus.

Their unlikely collaboration has gone on for more than seven years. In that time, the ASU researchers have completed several groundbreaking studies on perception and tracking. They have detailed how humans and other animals react and move to intercept objects.

Catchbot is a result of that work: a four-wheeled, low-slung, ground-ball-fielding robot.

Catchbot can intercept ground balls with a .750 fielding percentage. That’s not bad…for a Little Leaguer. It is still far below what is expected of even an average major league infielder.

In its present form, McBeath and Sugar’s quicksilver infielder is pretty good. Catchbot can in an instant detect the motion of an object, calculate its relative speed, and determine how it needs to move to intercept that object.

But Catchbot doesn’t have an eye, a glove, or even a hat for that matter. Instead, the squat, remote-controlled, car-like robot uses tires for legs, has a gimbaled camera for eyes, a bumper for a glove, and a microchip for a brain. The sharp grounder hit back through the box may never be the same.
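
The story does not describe Catchbot’s software, but the loop it sketches (spot the ball, gauge how fast its image is moving, then decide how to move) can be illustrated with a short Python sketch. The frame rate, the fake detector, and every name below are assumptions made purely for illustration, not anything taken from the actual robot.

```python
# Illustrative only: a toy version of the per-frame loop described above --
# find the ball in the camera image and estimate its optical drift.  The
# detector is faked so the sketch runs stand-alone; none of these names or
# numbers come from the real Catchbot software.

from dataclasses import dataclass

DT = 1.0 / 30.0          # assumed camera frame rate of 30 Hz

@dataclass
class Detection:
    x: float             # ball's horizontal position in the image (-1..1)
    y: float             # ball's vertical position in the image (-1..1)

def fake_detector(frame_index: int) -> Detection:
    """Stand-in for the vision system: a ball drifting across the image."""
    return Detection(x=-0.9 + 0.02 * frame_index, y=0.1)

def perception_loop(frames: int = 10) -> None:
    prev = fake_detector(0)
    for i in range(1, frames):
        det = fake_detector(i)
        # Optical velocity: how fast the ball's image moves between frames.
        drift_x = (det.x - prev.x) / DT
        prev = det
        print(f"frame {i:2d}: ball at x={det.x:+.2f}, drift {drift_x:+.2f}/s")

if __name__ == "__main__":
    perception_loop()
```

The sketches further below show how an estimate like this image drift can feed an actual movement decision.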

Catchbot is a novel blend of robotics and basic principles that underlie perception of a moving target. It brought together two ASU professors from seemingly wildly different disciplines. It also caught the fancy of the East Coast press. Catchbot appeared twice in The New York Times—once in its sports pages in June 2006 and again as an “Idea of the Year” in The New York Times Magazine in December 2006.

“We were on the same page as Vladimir Putin,” Sugar says of the Ideas piece.

“This is a lot bigger than something that just interests two nerdy scientists,” McBeath adds.

Birds and bees do it

Perception really is everything when it comes to catching a baseball. To catch a fly ball, the McBeath-Sugar team found, a human outfielder first sees the ball as it moves up in the air. He then instinctively moves relative to the ball to keep the ball’s image moving at a constant rate. Maintaining a constant change in visual angle keeps the fielder on track to catch the ball.

When a baseball is hit up into the air, the outfielder will instinctively move to be under where it is headed simply by keeping the image of the ball continually rising, even while the ball physically descends. If the image of the ball starts to curve toward the ground, the fielder runs so as to straighten the curved trajectory back up.

“As the ball moves, the fielder moves to the spot that keeps the ball angle rising,” McBeath says.

“We found that insects, birds, bats, and dogs all appear to behave the same way,” McBeath explains. “It seems to be a generic strategy. No matter where a target is coming from, you try to control the relationship between you and the target. If you can see and keep the target moving at a constant speed and pretty much in a straight direction relative to yourself, then interception is guaranteed.”

McBeath and Sugar have piles of data from years of experiments. Using that information, they’ve created detailed diagrams showing the trajectory of a ball and the movement of a fielder in relation to the ball. Those diagrams come complete with mathematical formulas and geometrical progressions. They also have hours of videotaped experiments supporting their theory.

“You don’t need complex calculations to solve these problems,” says McBeath, who has undergraduate and graduate degrees in electrical engineering, as well as psychology. “It’s simple control theory. The idea is to keep the ball image moving up and over at a constant rate.”
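
That “simple control theory” can be played out in a toy, one-dimensional simulation: a fly ball hit straight at an outfielder who runs in or backs up so the tangent of the ball’s elevation angle keeps rising at a steady rate. The gains, speeds and update rate below are assumptions chosen for illustration, not values from the ASU studies.

```python
import math

G = 9.81          # gravity, m/s^2
DT = 0.02         # assumed visual update period, s
GAIN = 30.0       # assumed controller gain, purely illustrative
MAX_SPEED = 8.0   # assumed sprint-speed cap for the fielder, m/s

def simulate(launch_speed=30.0, launch_angle_deg=45.0, fielder_start=100.0):
    """Fielder runs in or back so the ball's image keeps rising at a steady rate."""
    ang = math.radians(launch_angle_deg)
    vx, vz = launch_speed * math.cos(ang), launch_speed * math.sin(ang)

    fielder, speed = fielder_start, 0.0   # position and running speed along the hit
    t, prev_tan, target_rate = 0.0, 0.0, None

    while True:
        t += DT
        bx = vx * t                        # ball's position over the ground
        bz = vz * t - 0.5 * G * t * t      # ball's height
        if bz <= 0.0:                      # ball has come down
            break

        # Horizontal fielder-to-ball distance, clamped so the angle stays
        # defined when the ball is nearly overhead.
        gap = max(fielder - bx, 0.5)
        tan_alpha = bz / gap               # tangent of the elevation angle
        rate = (tan_alpha - prev_tan) / DT # how fast the image is rising
        prev_tan = tan_alpha

        if t > 0.3:                        # watch briefly, then lock a target rate
            if target_rate is None:
                target_rate = rate
            # Image rising slower than planned: ball falling short, so run in.
            # Image rising faster than planned: ball carrying, so back up.
            speed += GAIN * (rate - target_rate) * DT
            speed = max(-MAX_SPEED, min(MAX_SPEED, speed))
            fielder += speed * DT

    print(f"ball lands near {bx:5.1f} m; fielder finishes near {fielder:5.1f} m")

if __name__ == "__main__":
    simulate()
```

The point of the sketch is the rule itself rather than the numbers: the only feedback the simulated fielder uses is whether the ball’s image is rising faster or slower than planned.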

That same principle can be applied to ground balls. The only difference is that the geometrical progressions are flipped. For their field trials, Sugar and McBeath employ the “Jose Canseco Principle” of catching a baseball. Canseco is a former major league ballplayer of limited fielding ability who once had a fly ball bounce off his head and over the fence for a home run.

“That would be a catch for us,” McBeath says wryly. “Our catches are really just a ball bouncing off the front bumper of the robot.”
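
For ground balls, the flipped geometry plays out in the plane of the field. One minimal way to capture the “keep the target moving in pretty much a straight direction relative to yourself” idea is a constant-bearing rule: steer so the ball’s bearing stops drifting, which puts robot and ball on a collision course. The speeds, gains and catch radius below are assumptions, and the rule itself is a simplified stand-in rather than Catchbot’s published algorithm.

```python
import math

DT = 0.05            # assumed control period, s
ROBOT_SPEED = 5.0    # assumed Catchbot-like ground speed, m/s
CATCH_RADIUS = 0.5   # assumed "bumper" radius for scoring a catch, m
LEAD = 2.0           # assumed lead gain (seconds of anticipation)

def intercept_ground_ball():
    """Hold the ball's bearing steady: if the bearing is not drifting,
    robot and ball are on a collision course."""
    # Ball rolling in a straight line across the infield (toy model).
    ball_x, ball_y, ball_vx, ball_vy = -15.0, 20.0, 3.0, -2.5
    bot_x, bot_y = 0.0, 0.0
    prev_bearing = math.atan2(ball_y - bot_y, ball_x - bot_x)

    for step in range(600):
        ball_x += ball_vx * DT
        ball_y += ball_vy * DT

        bearing = math.atan2(ball_y - bot_y, ball_x - bot_x)
        # Wrapped bearing change per second (how fast the ball's image drifts).
        drift_rate = math.atan2(math.sin(bearing - prev_bearing),
                                math.cos(bearing - prev_bearing)) / DT
        prev_bearing = bearing

        # Steer into the drift: lead the ball until its bearing stops moving.
        heading = bearing + LEAD * drift_rate
        bot_x += ROBOT_SPEED * math.cos(heading) * DT
        bot_y += ROBOT_SPEED * math.sin(heading) * DT

        if math.hypot(ball_x - bot_x, ball_y - bot_y) < CATCH_RADIUS:
            print(f"bumper 'catch' after {step * DT:.2f} s")
            return
    print("missed")

if __name__ == "__main__":
    intercept_ground_ball()
```

Laid flat like this, it is the same trick as the fly-ball rule: control the relationship between yourself and the target until interception is all but guaranteed.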

Sugar has another reason for his interest in Catchbot: he wants to develop more capable mobile robots.

I, Robot

“We want to develop mobile robots that can navigate naturally around environments,” Sugar says. “To do that, the robot needs a means to develop maps and store them to be referenced later for navigation. Or, it can use the perceptual information at hand, like what we are doing with Catchbot. I wanted to know just how far we can take robots that rely only on these visual, perception-based algorithms.”

Mobile navigating robots are not new. But one of the continual stumbling blocks to expanding their capabilities is their lack of independence. NASA has very sophisticated robot rovers still working on the surface of Mars.

“But what many people do not realize is that behind those robots is a large support group of humans,” Sugar explains. “The humans keep the robots running. They give them direction in what to explore and what to avoid,” he says.

Another example is provided by “The Terminator,” a classic Arnold Schwarzenegger science fiction movie. The central character is a humanlike robot with bad intentions. “What you didn’t see on the screen were the 30 people operating joysticks to operate the Terminator robot,” Sugar says.

The next step is to miniaturize Catchbot and give it more on-board intelligence. McBeath and Sugar believe the ultimate end to their collaboration would be a baseball game played solely by autonomous robots. But the present generation Catchbot has already made important contributions to robotics and perception sciences.

“The goal of robotics folks is to build the best robots possible. The goal of perception people is to model what humans do,” McBeath explains. “This project helps both fields. In perception, we have all kinds of perception-action models. But it’s difficult to prove if the models really do work as advertised in the real world. We are able to do that kind of testing with Catchbot.”

Sugar says their work with Catchbot has uncovered delays in its system. A robot’s performance is limited when the robot has to act solely on a stream of visual data. Improvements for Catchbot are in the works.
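
The story does not say how those delays will be addressed. A common, generic remedy in vision-based control, offered here purely as an illustration and not as what the ASU team is building, is to extrapolate the most recent measurement forward by the known processing latency before acting on it.

```python
# Generic latency compensation: predict where the ball is *now* from two
# slightly stale measurements.  The latency value and sample spacing are
# assumptions for illustration only.

LATENCY = 0.10   # assumed camera + processing delay, in seconds

def compensate(pos: float, prev_pos: float, dt: float) -> float:
    """Push the last measurement forward by the pipeline delay."""
    velocity = (pos - prev_pos) / dt          # estimated ball speed
    return pos + velocity * LATENCY           # predicted current position

if __name__ == "__main__":
    # Two measurements of the ball's position along one axis, 1/30 s apart.
    stale, staler = 4.80, 4.70
    print(f"measured {stale:.2f} m, acting on predicted "
          f"{compensate(stale, staler, 1 / 30):.2f} m")
```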

The ASU researchers are hoping that the result will be a new Catchbot that can field a ball more like Jeter and less like Canseco.

Provided by ASU
