Brainput system takes some brain strain off multi-taskers

May 16, 2012
Brainput provides a passive, implicit input channel to interactive systems, with little effort from the user. Image: MIT

(Phys.org) -- A research team made up of members from Indiana University, Tufts and MIT, and led by Erin Treacy Solovey, has built a brain-monitoring system that offloads some of the computer-related activities a person is performing when multi-tasking begins to cause stress. Called Brainput, the system uses functional near-infrared spectroscopy (fNIRS) to monitor brain activity and a computer to interpret the results and then adjust tasks to reduce brain overload: when stress levels reach a certain point, the computer switches on autonomous computer activities intended to reduce the amount of stress the person is experiencing.

In practice, the fNIRS system consists of a headset worn across the forehead that captures signals from the brain in one small, relatively low-powered package. The headset is connected to a computer running software that analyzes the signals it receives and, when certain patterns emerge, takes actions that lessen the load on the person wearing the headset.
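
To illustrate the kind of closed loop the article describes, here is a minimal, hypothetical sketch: read fNIRS samples, smooth them over a window, classify the wearer's workload, and hand the decision to the task software. The sensor interface, window size, threshold and classifier below are assumptions for illustration, not the researchers' actual implementation (which is detailed in the linked paper).

```python
# Hypothetical sketch of a Brainput-style monitoring loop.
# The sensor callback, window size, and threshold are illustrative only.
from collections import deque
from statistics import mean

WINDOW = 30           # number of recent fNIRS samples to average (assumed)
OVERLOAD_LEVEL = 0.7  # normalized workload score that triggers adaptation (assumed)

def classify_workload(samples):
    """Collapse a window of normalized fNIRS readings into a workload score.
    A real system would run a trained classifier over multichannel data."""
    return mean(samples)

def monitoring_loop(read_fnirs_sample, on_overload, on_normal):
    """read_fnirs_sample() -> float in [0, 1]; the callbacks adapt the task."""
    window = deque(maxlen=WINDOW)
    while True:
        window.append(read_fnirs_sample())
        if len(window) < WINDOW:
            continue  # wait until the smoothing window is full
        if classify_workload(window) >= OVERLOAD_LEVEL:
            on_overload()   # e.g. switch robots to autonomous mode
        else:
            on_normal()     # leave the user in full control
```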

To test their system, the researchers set up an experiment in which volunteers were asked to direct virtual robots through a maze, with the goal of finding a spot where a Wi-Fi signal was strong enough to allow messages to be sent. To raise the stress level, the volunteers were asked to direct two robots at the same time, which was doubly difficult because both robots kept moving regardless of the commands given and because the volunteers were also asked to do their best to keep the robots from running into walls. The setup thus required the volunteers to multi-task in a stressful environment.

As the experiments unfolded, the Brainput system continuously monitored the brain activity of each participant, and when certain thresholds were reached, commands were sent to the robots turning on their sensors and directing them to perform some actions autonomously, in essence helping out the volunteers. What is perhaps most interesting about the experiments is that most of the volunteers never even noticed that the robots had started helping them. For their part, the researchers noted that overall performance increased when the robots started doing their fair share. Also interesting: when the robots' autonomous behavior was triggered before the volunteers reached overload, the volunteers tended to notice what was going on, and performance dropped off as a result.
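
A hedged sketch of how the experiment's adaptation rule might look in code: the robots switch on their sensors and act autonomously only once the classified workload crosses the overload threshold, and not earlier, since premature autonomy was noticed by the volunteers and hurt performance. The robot interface and threshold below are invented for illustration; the paper describes the actual experimental setup.

```python
# Illustrative only: a made-up robot interface showing the adaptation rule
# from the maze experiment. Class, method names and threshold are assumptions.
class MazeRobot:
    def __init__(self, name):
        self.name = name
        self.autonomous = False

    def enable_autonomy(self):
        """Turn on onboard sensors and autonomous wall avoidance."""
        self.autonomous = True

    def disable_autonomy(self):
        """Return full control to the human operator."""
        self.autonomous = False

def adapt_robots(workload_score, robots, overload_level=0.7):
    """Grant autonomy only once the operator is actually overloaded;
    triggering it earlier was noticed by users and reduced performance."""
    for robot in robots:
        if workload_score >= overload_level and not robot.autonomous:
            robot.enable_autonomy()
        elif workload_score < overload_level and robot.autonomous:
            robot.disable_autonomy()
```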

Solovey and her team next plan to add other cognitive-state recognition algorithms to the system to detect other mental states, allowing it to provide assistance in a wider variety of applications. Possible real-world uses for such a system include driver- or pilot-assist applications that help out when the person in control temporarily loses focus or becomes fatigued.


More information: web.mit.edu/erinsol/www/fNIRS.html
Research paper (PDF): web.mit.edu/erinsol/www/papers/Solovey.CHI.2012.Final.pdf

via TechnologyReview
