How the brain handles big data

October 1st, 2013
These days the term "big data" evokes thoughts of very large datasets. But for an August SFI working group, "Big Data In the Brain," the focus was how the brain itself manages large amounts of data – along with the mapping of the brain, which poses another kind of big data challenge.

"Basically, everything in the brain involves doing calculations on large amounts of data," says neurobiologist and SFI External Professor Charles Stevens of the Salk Institute, the meeting's host.

There are two classes of such calculations: those hard-wired by evolution and those that have to be learned. The calculations involved in making sense of visual data from the eyes, for example, are hard-wired.

Olfactory data from the nose, on the other hand, is more complex than building a picture in three dimensions and can't be hard-wired, Stevens explains. There are so many odors and combinations of odors that, instead of three visual dimensions, olfactory perception has to juggle something like a thousand dimensions to identify a single point.

"That's big data," Stevens says. "So, for example, until a few hundred years ago nobody had smelled coffee. Today everybody can tell the odor of coffee, good from bad coffee, and even whether it's from Starbucks. Evolution could not have wired it in."

The working group's participants – Stevens, Venkatesh Murthy of Harvard, and Stephen Smith of Stanford – also discussed mapping of the brain, cell by cell, which poses a different problem: How do you handle the massive amount of data that would then describe a brain?

The working group took place August 11-24 at SFI.

Provided by Santa Fe Institute

