The human brain has evolved over millions of years to become a vast network of billions of neurons and synaptic connections. Understanding it is one of humankind's greatest pursuits.
But to understand how the brain processes information, researchers must first understand the very basics of neurons — even down to how proteins inside the neurons act to change the neuron's voltage.
To do so requires a balance of experimentation and computer modeling — a partnership across disciplines traversed by Bill Kath, professor of engineering sciences and applied mathematics in the McCormick School of Engineering and Applied Science, and Nelson Spruston, professor of neurobiology and physiology in the Weinberg College of Arts and Sciences.
The two have worked together for more than a decade, with Spruston designing experiments and Kath developing computer models that explain the results that Spruston found. (It also works the other way: Kath's models have provided Spruston with ideas to test experimentally.)
Spruston has been studying ion channels of neurons that change their shape when activated, allowing sodium to enter from outside the neuron. This changes the voltage of the neuron, causing the neuron to fire and send off a chain of neural activity within the brain. The difficulty in modeling such behavior lies in the wide range of time scales involved, from fractions of a millisecond to several seconds.
So the two, along with graduate student Vilas Menon, took a cue from nature and used the process of evolution to study one of evolution's greatest achievements.
Evolutionary algorithms work like this: rather than making one model, researchers make 100 models, each with different parameters. They then run those models on high-speed computers and compare the results with the experimental data to see how well they match. The best traits of different models are kept and mixed and matched, much like breeding, to produce 100 more models. Thousands of generations later, the result is a model that matches the characteristics of the real thing. Researchers have used this technique in modeling before, but Kath and colleagues introduced a new twist: they allowed the structure of the model, not just its parameters, to be "mutated" during the "breeding."
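The loop described above can be sketched in a few lines. This is a minimal, generic illustration, not the group's actual software: the population size and selection scheme follow the article, but the fitness target is a made-up stand-in for experimental data, and real channel models would be far more complex.

```python
import random

POP_SIZE = 100   # the article's population of 100 candidate models
N_PARAMS = 4     # number of free parameters per model (illustrative)
TARGET = [0.5, -1.2, 3.0, 0.8]  # hypothetical stand-in for experimental data

def fitness(params):
    # Lower is better: squared error against the "experimental" target.
    return sum((p - t) ** 2 for p, t in zip(params, TARGET))

def breed(parent_a, parent_b, mutation_rate=0.1):
    # Mix and match parameters from two parents, occasionally mutating one.
    child = [random.choice(pair) for pair in zip(parent_a, parent_b)]
    if random.random() < mutation_rate:
        i = random.randrange(len(child))
        child[i] += random.gauss(0, 0.5)
    return child

def evolve(generations=200):
    # Start from random candidate models.
    pop = [[random.uniform(-5, 5) for _ in range(N_PARAMS)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:POP_SIZE // 5]  # keep the best-matching models
        pop = survivors + [breed(random.choice(survivors),
                                 random.choice(survivors))
                           for _ in range(POP_SIZE - len(survivors))]
    return min(pop, key=fitness)

best = evolve()
```

The twist Kath's group added, mutating the model's structure and not just its parameters, would mean the chromosome here could also grow or lose states, rather than holding a fixed-length parameter list.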
"In the end, the computer found a quite simple state-dependent model for the sodium channels that provides a very accurate behavior on short time scales and out to several seconds, as well," Kath says. Their results were recently published in the Proceedings of the National Academy of Sciences.
Modeling even a process this small is important, Spruston says, because it helps scientists understand key details of how the brain works.
"We want to make sure we truly understand how these channels work by building a model that can recapitulate all the features we've observed," he says. "Making computer models is a way of identifying both what you understand and also where the gaps in your knowledge need to be filled. The cool thing is you're taking a page from a part of biology — evolution — and applying it to another part of biology — neurobiology — and using the computer in the middle."
The neurons the group studied are in the hippocampal region of the brain, which researchers have identified as being important for memory.
"If you want to understand how this neural circuit is processing information and memory, you have to understand how these neurons behave in different situations," Kath says. "If you leave out key details, you might miss something important."
Source: Northwestern University