A novel engineering capstone project enables a robot to be controlled using a brain-computer interface. Photo by Mary Knox Merrill

(PhysOrg.com) -- A squat, circular robot scurries along the floor of a laboratory, moving left, then right, then left again, before coming to a stop. A Northeastern University student researcher commands the gadget through a brain-computer interface that controls the movement of the robot using signals produced by his visual cortex.

The innovative technology was developed for a senior capstone project under the direction of electrical and computer engineering professors Deniz Erdogmus and Bahram Shafai. The team members, who won first prize in the electrical and computer engineering capstone project competition, included Saumitro Dasgupta, Mike Fanton, Jonathan Pham and Mike Willard.

Erdogmus says that the technology could serve a variety of practical uses, from assisting or enhancing cognitive or sensory functions in disabled or neurologically impaired users, to controlling military vehicles, light switches or TVs.

Scientific researchers have long experimented with brain-computer interfaces, but only in the last 10 years has technology advanced far enough to make the impossible possible.

“People with disabilities will soon be able to communicate through the computer to operate wheelchairs or other vehicles or devices,” Erdogmus says.

A user’s ability to command the robot to move in any direction is based on a neurological principle: When the retina experiences a flickering visual stimulus, ranging in frequency from 3.5 to more than 100 Hz, the brain generates electrical activity at the same frequency.

Team members exploited this phenomenon. They divided a computer screen into four checkerboard patterns that flash at different frequencies, each representing a different control command for the robot, including moving left, right and forward.
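As a concrete illustration, the frequency-to-command mapping might look like the Python sketch below. The article does not report the frequencies the team actually used, so the values here are hypothetical picks from the 3.5 to 100 Hz range cited above, and the function name is invented for illustration.

    # Hypothetical flicker frequencies for the four checkerboard patterns.
    # The article does not give the team's actual values; these are
    # illustrative picks within the 3.5-100 Hz range cited above.
    COMMAND_FREQUENCIES_HZ = {
        7.5: "LEFT",
        10.0: "RIGHT",
        12.0: "FORWARD",
        15.0: "STOP",
    }

    def command_for_frequency(detected_hz: float, tolerance_hz: float = 0.5) -> str | None:
        """Map a frequency detected in the EEG to the nearest command,
        or return None if no stimulus frequency is within tolerance."""
        nearest = min(COMMAND_FREQUENCIES_HZ, key=lambda f: abs(f - detected_hz))
        if abs(nearest - detected_hz) <= tolerance_hz:
            return COMMAND_FREQUENCIES_HZ[nearest]
        return None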

When a user stares at one of the checkerboard patterns, the resulting brain signals are picked up by electrodes attached to the back of the head and sent to a computer software program. The program wirelessly transmits the control command to a laptop mounted on the robot, which moves according to the user’s command.
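A minimal sketch of that detection step is shown below, assuming a single EEG channel sampled at 256 Hz. The article specifies neither the sampling rate nor the algorithm; practical SSVEP systems typically add band-pass filtering, averaging across electrodes, and analysis of harmonics, so this is only the simplest spectral version of the idea.

    import numpy as np

    SAMPLE_RATE_HZ = 256                         # assumed EEG sampling rate
    STIMULUS_FREQS_HZ = [7.5, 10.0, 12.0, 15.0]  # hypothetical, as above

    def dominant_stimulus_freq(eeg_window: np.ndarray) -> float:
        """Return the stimulus frequency with the most spectral power in a
        window of samples from electrodes over the visual cortex."""
        window = eeg_window - eeg_window.mean()   # remove the DC offset
        power = np.abs(np.fft.rfft(window)) ** 2  # one-sided power spectrum
        freqs = np.fft.rfftfreq(window.size, d=1.0 / SAMPLE_RATE_HZ)

        def band_power(f: float, half_width: float = 0.25) -> float:
            band = (freqs >= f - half_width) & (freqs <= f + half_width)
            return float(power[band].sum())

        return max(STIMULUS_FREQS_HZ, key=band_power)

The winning frequency would then be translated into a command, as in the mapping sketched earlier, and sent over the wireless link to the laptop riding on the robot.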

The user can track the robot’s whereabouts through real-time video feedback using Skype.

Noting the “good accuracy and speed” of his team’s system, Dasgupta says, “Navigating through obstacles requires a degree of fine control. You can use this system to control the robot in different types of environments.”

Next year, a new team of electrical and computer engineering students will advance the project a step further by designing a brain-computer interface for a wheelchair, Dasgupta says.