Brainput system takes some brain strain off multi-taskers

May 16, 2012 by Bob Yirka report
Brainput provides a passive, implicit input channel to interactive systems, with little effort from the user. Image: MIT

(Phys.org) -- A research team with members from Indiana University, Tufts and MIT, led by Erin Treacy Solovey, has built a brain-monitoring system that offloads some of the computer-related tasks a person is performing when multitasking begins to cause stress. Called Brainput, the system uses functional near-infrared spectroscopy (fNIRS) to monitor brain activity, while a computer interprets the readings and adjusts the ongoing tasks to reduce cognitive overload. When stress levels reach a certain point, the computer switches some activities to run autonomously, in the hope of reducing the load on the user.

In practice, the fNIRS system consists of a headset worn across the forehead that captures signals from the brain in a small, relatively low-power package. The headset connects to a computer running software that analyzes the incoming signals and, when certain patterns emerge, takes actions that lessen the load on the person wearing the headset.

To test their system, the researchers set up an experiment in which volunteers directed virtual robots through a maze, with the goal of finding a spot where the Wi-Fi signal was strong enough to send messages. To raise the stress level, each volunteer had to direct two robots at once, which was doubly difficult: both robots kept moving regardless of the commands given, and the volunteers were also asked to do their best to keep the robots from running into walls. The setup thus forced the volunteers to multi-task under continuous pressure.

As the experiments unfolded, the Brainput system constantly monitored each participant's brain activity, and when certain thresholds were reached it sent commands to the robots, turning on their sensors and directing them to perform some actions autonomously, in essence helping the volunteers out. Perhaps most interesting, most of the volunteers never even noticed that the robots had started helping them. For their part, the researchers noted that overall performance improved once the robots started doing their fair share. Also notable: when the robots' autonomous behavior was initiated before a volunteer reached overload, the volunteers tended to notice what was going on, and performance dropped off as a result.
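The control logic described above can be pictured as a simple feedback loop: read a workload estimate from the fNIRS headset, smooth it to avoid reacting to noise, and toggle robot autonomy when a threshold is crossed. The sketch below is purely illustrative, not the authors' code; the threshold value, window size, and class names are all assumptions made for the example.

```python
# Illustrative sketch (not Brainput's actual implementation): toggle
# robot autonomy when a smoothed, normalized cognitive-workload
# estimate crosses a threshold.

from collections import deque

WORKLOAD_THRESHOLD = 0.7   # hypothetical normalized overload level
WINDOW = 5                 # smooth over the last N readings

def smoothed(readings):
    """Mean of recent workload estimates, to damp sensor noise."""
    return sum(readings) / len(readings)

class Robot:
    """Stand-in for a maze robot that can run autonomously."""
    def __init__(self):
        self.autonomous = False

def update_autonomy(robots, reading, history):
    """Feed in one new workload reading; switch autonomy on or off."""
    history.append(reading)
    overloaded = smoothed(history) >= WORKLOAD_THRESHOLD
    for r in robots:
        r.autonomous = overloaded
    return overloaded

robots = [Robot(), Robot()]
history = deque(maxlen=WINDOW)

# Simulated stream of workload estimates rising past the threshold,
# as might happen when a volunteer starts juggling both robots.
for reading in [0.3, 0.4, 0.5, 0.8, 0.9, 0.95]:
    update_autonomy(robots, reading, history)

print(robots[0].autonomous)  # prints True: autonomy engaged after sustained high workload
```

Smoothing over a window matters here: triggering on a single noisy spike would flip autonomy on and off, which, as the experiments showed, is exactly the kind of visible intervention that distracts the user.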

Solovey and her team next plan to add other cognitive-recognition algorithms to the system to detect additional mental states, allowing it to assist in a wider variety of applications. Possible real-world uses include driver- or pilot-assist systems that step in when the person in control temporarily loses focus or becomes fatigued.


More information: web.mit.edu/erinsol/www/fNIRS.html
Research paper (PDF): web.mit.edu/erinsol/www/papers … y.CHI.2012.Final.pdf

via TechnologyReview

