Controlling robots with your thoughts

May 06, 2013
Student Angel Garcia uses his eyes, eyebrows and other parts of his face to make the robot move. Credit: Thor Nielsen

This is Angel Perez Garcia. He can make a robot move exactly as he wants via the electrodes attached to his head.

"I use the movements of my eyes, eyebrows and other parts of my face", he says. "With my I can select which of the robot's I want to move" smiles Angel, who is a Master's student at NTNU.

Facial grimaces generate major electrical activity (EEG signals) across our heads, and the same happens when Angel concentrates on a symbol, such as a flashing light, on a screen. In both cases the electrodes read the activity in the brain. The signals are then interpreted by a processor, which in turn sends a message to the robot to make it move in a pre-defined way.

"I can focus on a selection of lights on the screen. The robot's movements depend on which light I select and the type of activity generated in my brain", says Angel. "The idea of controlling a robot simply by using our thoughts (EEG brainwave activity), is fascinating and futuristic", he says.

A school for robots

Angel Garcia is not alone in developing new ways of manoeuvring robots. Today, robot training dominates the work of the cybernetics community at NTNU/SINTEF.

In the robotics hall, fellow student Signe Moe is guiding a robot by moving her arms, while SINTEF researcher and supervisor Ingrid Schjølberg is using a new training programme to try to get her three-fingered robot to grasp objects in new ways.

"Why all this enthusiasm for training?"

"Well, everyone knows about used on production lines to pick up and assemble parts", says Schjølberg. "They are pre-programmed and relatively inflexible, and carry out repeated and identical movements of specialised graspers adapted to the parts in question", she says.

"So you are developing something new?"

"We can see that industries encounter major problems every time a new part is brought in and has to be handled on the production line", she says. "The replacement of graspers and the robot's guidance programme is a complex process, and we want to make this simpler. We want to be able to programme robots more intuitively and not just in the traditional way using a panel with buttons pressed by an operator.


"We want you to move over here"

Signe Moe's task has thus been to find out how a robot can be trained to imitate human movements. She has solved this with a system in which she guides the robot via a Kinect camera of the type used in games technology.

"Now it's possible for anyone to guide the robot", says Moe. "Not long ago some 6th grade pupils visited us here at the robotics hall. They were all used to playing video games, so they had no problems in guiding the robot", she says.

To demonstrate, she stands about a metre and a half in front of the camera. "Firstly, I hold up my right hand and make a click in the air. This causes the camera to register me and trace the movements of my hand", says Moe. "Now, when I move my hand up and to the right, you can see that the robot imitates my movements", she says.
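The imitation step Moe demonstrates can be sketched as a simple coordinate mapping: the tracked hand position is scaled into the robot's workspace and used as a position reference. This is an illustrative assumption about how such a mapping might look; the scaling factor and workspace limits are invented, not SINTEF's actual values.

```python
# Hypothetical mapping from a Kinect-tracked hand position (camera frame,
# metres) to a clamped robot position reference. Scale and limits are assumed.
def hand_to_robot_reference(hand_xyz, scale=0.5,
                            workspace=((-0.4, 0.4), (-0.4, 0.4), (0.1, 0.6))):
    """Scale a hand position into the robot workspace and clamp each axis."""
    target = []
    for value, (lo, hi) in zip(hand_xyz, workspace):
        scaled = value * scale
        target.append(min(max(scaled, lo), hi))   # keep inside the workspace
    return tuple(target)

print(hand_to_robot_reference((1.0, -0.2, 0.8)))  # → (0.4, -0.1, 0.4)
```

Clamping matters in practice: a visitor waving an arm wide, like the 6th-graders Moe mentions, should never be able to drive the robot outside its safe working volume.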

SINTEF scientist Ingrid Schjølberg demonstrates her three-fingered robotic grasper. Teaching robots new ways of grasping will greatly benefit the manufacturing industry, she says. Credit: Thor Nielsen

"It looks simple enough, but what happens if....?

"The Kinect camera has built-in algorithms which can trace the movements of my hand", she says. "All we have to do is to transpose these data to define the position we want the robot to assume, and set up a communications system between the sensors in the camera and the robot", she explains. "In this way the robot receives a reference along the lines of 'we want you to move over here', and an in-built regulator then computes how it can achieve the movement and how much electricity the motor requires to carry the movement out" says Moe.

New learning using camera images and sensors

Ingrid Schjølberg is demonstrating her three-fingered robotic grasper. Teaching robots new ways of grasping will greatly benefit the manufacturing industry, and this is why researchers are testing out new approaches.

"We are combining sensors in the robotic hand with Kinect images to identify the part which has to be picked up and handled", says Schjølberg. "In this way the robot can teach itself the best ways of adapting its grasping action", she says. "It is trying out different grips in just the same way as we humans do when picking up an unfamiliar object. We've developed some pre-determined criteria for what defines a good and bad grip", she explains. "The robot is testing out several different grips, and is praised or scolded for each attempt by means of a points score", smiles Schjølberg.

"Has this method been put into practice?"

"This system has not yet been industrialised, but we have it up and running here at the robotics hall", says Schjølberg. "The next step might be to install the on the premises of one of the project's industry partners", she says.

The project "Next Generation Robotics for Norwegian Industry" involves its industry partners (Statoil, Hydro, Glen Dimplex and Haag) in all relevant aspects of the work, and all of them take an active part.

"They are funding the work, but also involve themselves actively by providing case examples and problems", says Schjølberg. "During the remainder of the project term, we will be working to make both the software and the system more robust", she says.



