Taking brain-computer interfaces to the next phase (w/ Video)

Feb 17, 2011
Robotino, developed at EPFL, is a robot controlled at a distance via a brain-computer interface (BCI) to help disabled people interact with their surroundings. Credit: Alain Herzog / EPFL

You may have heard of virtual keyboards controlled by thought, brain-powered wheelchairs, and neuro-prosthetic limbs. But powering these machines can be downright tiring, a fact that prevents the technology from being of much use to people with disabilities, among others. Professor José del R. Millán and his team at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have a solution: engineer the system so that it learns about its user, allows for periods of rest, and even permits multitasking.

In a typical brain-computer interface (BCI) set-up, users can send one of three commands: left, right, or no-command. No-command is the static state between left and right and is necessary for a brain-powered wheelchair to continue going straight, for example, or to stay put in front of a specific target. But it turns out that no-command is very taxing to maintain and requires extreme concentration. After about an hour, most users are spent. Not much help if you need to maneuver that wheelchair through an airport.

In an ongoing study demonstrated by Millán and doctoral student Michele Tavella at the AAAS 2011 Annual Meeting in Washington, D.C., the scientists hook volunteers up to a BCI and ask them to read, speak, or read aloud while delivering as many left and right commands as possible, or while delivering a no-command. Using statistical analysis programmed by the scientists, Millán's BCI can distinguish between left and right commands and learn when each subject is sending one of these versus a no-command. In other words, the machine learns to read the subject's mental intention. The result is that users can mentally relax and even execute secondary tasks while controlling the BCI.
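The core idea of distinguishing intentional commands from the resting state can be sketched in code. The following is a hypothetical illustration, not EPFL's actual implementation: it assumes each EEG sample has already been reduced to a feature vector, classifies it against per-command templates, and rejects low-confidence samples as no-command, so the user does not have to actively "hold" the resting state. The `decode` function, the centroid templates, and the threshold value are all assumptions for the sketch.

```python
import numpy as np

def softmax_from_distances(dists):
    """Convert distances to pseudo-probabilities (smaller distance = higher score)."""
    e = np.exp(-dists)
    return e / e.sum()

def decode(features, centroids, threshold=0.7):
    """Return 'left', 'right', or 'no-command' for one feature vector.

    centroids: dict mapping 'left'/'right' to template feature vectors.
    A sample counts as an intentional command only when the winning class
    probability exceeds `threshold`; otherwise it is rejected as the
    resting no-command state.
    """
    labels = list(centroids)
    dists = np.array([np.linalg.norm(features - centroids[k]) for k in labels])
    probs = softmax_from_distances(dists)
    best = int(np.argmax(probs))
    if probs[best] < threshold:
        return "no-command"
    return labels[best]

# Toy usage with made-up 2-D "features":
centroids = {"left": np.array([0.0, 1.0]), "right": np.array([1.0, 0.0])}
print(decode(np.array([0.1, 0.9]), centroids))  # near the left template -> "left"
print(decode(np.array([0.5, 0.5]), centroids))  # ambiguous -> "no-command"
```

The rejection threshold is what spares the user: ambiguous brain activity is simply ignored rather than requiring a deliberately maintained neutral state.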

José del R. Millán explains how "shared control" and probability theory are used to reduce fatigue, allowing BCI users to control the devices over longer periods of time.

The so-called Shared Control approach to facilitating human-robot interaction employs image sensors and image processing to avoid obstacles. According to Millán, however, Shared Control isn't enough to let an operator rest or concentrate on more than one task at once, limiting long-term use.

Millán's new work complements research on Shared Control, making multitasking a reality while at the same time allowing users to catch a break. His trick is in decoding the signals coming from EEG readings on the scalp—readings that represent the activity of millions of neurons and have notoriously low resolution. By incorporating statistical analysis, or probability theory, his BCI allows for both targeted control—maneuvering around an obstacle—and more precise tasks, such as staying on a target. It also makes it easier to give simple commands like "go straight" that need to be executed over longer periods of time (think back to that airport) without having to focus on giving the same command over and over again.
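One way to make a command like "go straight" persist without constant mental repetition is to accumulate evidence over time. The sketch below is a hypothetical illustration of that principle, not Millán's actual algorithm: it exponentially smooths the decoder's per-sample probabilities and only switches the output state once the accumulated belief clearly favors a new command, so a single noisy sample cannot divert the device and a brief rest does not cancel an ongoing command. The class names, `alpha`, and `switch_threshold` are assumptions.

```python
class SmoothedController:
    """Maintain a stable command by accumulating probabilistic evidence."""

    def __init__(self, alpha=0.2, switch_threshold=0.8):
        self.alpha = alpha                    # smoothing factor per EEG sample
        self.switch_threshold = switch_threshold
        self.belief = {"left": 0.0, "right": 0.0, "straight": 1.0}
        self.state = "straight"

    def update(self, probs):
        """probs: dict of instantaneous class probabilities from the decoder."""
        for k in self.belief:
            self.belief[k] = (1 - self.alpha) * self.belief[k] + self.alpha * probs[k]
        # Switch state only once accumulated belief clearly favors a new command.
        best = max(self.belief, key=self.belief.get)
        if best != self.state and self.belief[best] > self.switch_threshold:
            self.state = best
        return self.state

ctrl = SmoothedController()
# A single noisy "left" sample does not divert the wheelchair...
print(ctrl.update({"left": 0.9, "right": 0.05, "straight": 0.05}))  # straight
# ...but sustained evidence eventually does.
for _ in range(20):
    state = ctrl.update({"left": 0.9, "right": 0.05, "straight": 0.05})
print(state)  # left
```

The payoff is exactly the usability gain the article describes: the operator issues a command once, stops concentrating, and the smoothed belief keeps the device on course until genuine contrary evidence builds up.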

Robotino uses both "shared control" sensors and a recently developed system based on probability theory that helps users operate the BCI for longer periods of time. Credit: Alain Herzog / EPFL

It will be a while before this cutting-edge technology makes the move from lab to production line, but Millán's prototypes are the first working models of their kind to use probability theory to make BCIs easier to use over time. His next step is to combine this new level of sophistication with Shared Control in an ongoing effort to take BCI to the next level, necessary for widespread use. Further advancements, such as finer-grained interpretation of cognitive information, are being developed in collaboration with the European project Tools for Brain-Computer Interaction (TOBI). The multinational project is headed by Professor Millán and has moved into the clinical testing phase for several BCIs.


Provided by Ecole Polytechnique Federale de Lausanne




User comments (2)

Eikka
5 / 5 (1) Feb 17, 2011
The problem is similar to bashing random keys on a keyboard, but occasionally the user means something when they press a particular key.

Why not encode the commands into action sequences that are more easily distinguished from the noise because of their lower probability to occur?

If you see "alskdjaohdaforwardsödaklpdo", it's pretty easy to see what the intention is.
pauljpease
5 / 5 (2) Feb 17, 2011
I'm glad these BCI researchers have finally figured this out. After reading about the first generation of BCIs, I realized that they would be much more efficient if the computer uses its computational ability to read intentions, thus simplifying the commands that the user needs to give. In other words, an AI makes decisions and the person controlling it just needs to do nothing if the AI is doing the right thing and indicate "NO!" if the AI guesses wrong. I think this is much more how our own brains operate. Some sub-region of my brain (below conscious level) is making decisions and guiding me, so doing normal tasks doesn't take any focus. As computers get smarter, they can figure out the right thing to do with less oversight from the user...
