New algorithm improves the way computers interpret readings of the brain’s electrical signals

May 30, 2011 By Lee Swee Heng

Electroencephalography (EEG) records the electrical signals produced by the brain using an array of electrodes placed on the scalp. Computers use an algorithm called common spatial pattern (CSP) to translate these signals into commands for the control of various devices.

Haiping Lu at the A*STAR Institute for Infocomm Research and co-workers have now developed an improved version of CSP for classifying signals. The new algorithm will facilitate the development of advanced brain–computer interfaces that may one day enable paralyzed patients to control devices such as computers and robotic arms.

CSP distinguishes and interprets commands by estimating how the EEG signals vary for each type of command, and its accuracy depends strongly on how many sample signals are provided. As a result, CSP may make an incorrect interpretation when the number of EEG signals is small.
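
To illustrate where that weakness comes from, here is a minimal sketch in Python/NumPy (not the authors' code): a conventional CSP implementation averages a spatial covariance matrix over the trials of each class, solves a generalized eigenvalue problem to obtain spatial filters, and uses the log-variance of the filtered signals as features. With only a few trials, the averaged covariances are noisy and the filters degrade.

import numpy as np
from scipy.linalg import eigh

def trial_cov(x):
    # x: one EEG trial, shape (n_channels, n_samples); normalized spatial covariance
    c = x @ x.T
    return c / np.trace(c)

def csp_filters(cov_a, cov_b, n_filters=3):
    # Solve cov_a w = lambda (cov_a + cov_b) w; filters at both ends of the
    # eigenvalue spectrum separate the two classes best
    vals, vecs = eigh(cov_a, cov_a + cov_b)
    order = np.argsort(vals)
    picks = np.concatenate([order[:n_filters], order[-n_filters:]])
    return vecs[:, picks]

def csp_features(trial, filters):
    # Log-variance of the spatially filtered trial, the usual CSP feature
    z = filters.T @ trial
    var = np.var(z, axis=1)
    return np.log(var / var.sum())

In the conventional algorithm, cov_a and cov_b are simply the averages of trial_cov over the trials available for each class, which is exactly where a small sample hurts.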

The new CSP algorithm developed by Lu and his colleagues uses two parameters to regularize the estimation of EEG signal variations. One parameter reduces the variance of the estimates, while the other counteracts the bias that arises when the estimates are computed from only a small number of samples.
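
A hedged sketch of that idea is shown below; the exact formulation is in the paper cited at the end of this article, and the parameter names beta and gamma, their default values, and the use of generic trials recorded from other subjects are illustrative assumptions rather than a definitive implementation.

import numpy as np

def regularized_cov(subject_trials, generic_trials, beta=0.1, gamma=0.1):
    # Each trials argument: sequence of trials of shape (n_channels, n_samples).
    # Reuses trial_cov from the CSP sketch above.
    def cov_sum(trials):
        return sum(trial_cov(x) for x in trials), len(trials)

    s_subj, n_subj = cov_sum(subject_trials)
    s_gen, n_gen = cov_sum(generic_trials)

    # Step 1: blend subject-specific and generic covariance (bias reduction)
    blended = ((1 - beta) * s_subj + beta * s_gen) / \
              ((1 - beta) * n_subj + beta * n_gen)

    # Step 2: shrink towards a scaled identity matrix (variance reduction)
    n_ch = blended.shape[0]
    return (1 - gamma) * blended + gamma * (np.trace(blended) / n_ch) * np.eye(n_ch)

Setting both parameters to zero recovers the conventional covariance estimate, so the regularization only changes the behavior when the extra stability is needed.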

Together, these parameters significantly improve the accuracy of CSP for classifying EEG signals. The researchers optimized the new algorithm even further by aggregating a number of different regularizations.
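
One way to picture the aggregation (again a sketch rather than the published implementation: the parameter grid, the linear discriminant classifier and the majority vote are illustrative choices) is to train one CSP pipeline per regularization setting, reusing regularized_cov, csp_filters and csp_features from the sketches above, and then combine their decisions.

import itertools
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_aggregate(trials_a, trials_b, generic_a, generic_b,
                    betas=(0.0, 0.1, 0.2), gammas=(0.0, 0.1, 0.2)):
    # One CSP-plus-classifier pipeline per regularization setting
    ensemble = []
    all_trials = list(trials_a) + list(trials_b)
    labels = [0] * len(trials_a) + [1] * len(trials_b)
    for beta, gamma in itertools.product(betas, gammas):
        cov_a = regularized_cov(trials_a, generic_a, beta, gamma)
        cov_b = regularized_cov(trials_b, generic_b, beta, gamma)
        filters = csp_filters(cov_a, cov_b)
        feats = [csp_features(x, filters) for x in all_trials]
        clf = LinearDiscriminantAnalysis().fit(feats, labels)
        ensemble.append((filters, clf))
    return ensemble

def predict_aggregate(trial, ensemble):
    # Simple majority vote over the ensemble members
    votes = [clf.predict([csp_features(trial, f)])[0] for f, clf in ensemble]
    return int(np.mean(votes) >= 0.5)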

They evaluated their algorithm using publicly available data sets from the third Brain–Computer Interface Competition, held in 2005. The competition allows researchers developing advanced brain–computer interface technologies to test their methods for processing and classifying EEG signals on common benchmark data.

The algorithm developed by Lu's group significantly outperformed the methods of four other groups in three sets of experiments with different testing scenarios, and its superiority was particularly evident when the number of sample EEG signals was small.

Conventional algorithms require 20–90 EEG signals, but the algorithm of Lu and co-workers needs only ten. This significantly reduces the effort required to collect data for brain–computer interfaces, the memory requirements of EEG signal processing applications, and the time needed to process the signals.

“This is a method to improve the accuracy of current brain–computer interfaces,” says Lu. “Our method applies ensemble-based learning in the feature-extraction stage of an EEG-based brain–computer interface, which could be integrated with training data ensembles in the data pre-processing stage. It would be among many other improvements to be tested and used in existing brain–computer interface systems.”

More information: Lu, H. et al. Regularized common spatial pattern with aggregation for EEG classification in small-sample setting. IEEE Transactions on Biomedical Engineering 57, 2936–2946 (2010). dx.doi.org/10.1109/TBME.2010.2082540

Provided by Agency for Science, Technology and Research (A*STAR)
