Synapses need only a few bits

September 22, 2015

Deep learning is possibly the most exciting branch of contemporary machine learning. Complex image analysis, speech recognition and self-driving cars are just a few popular examples of the multitude of new applications where machine learning, and deep learning in particular, shows its amazing capabilities.

Deep neural networks are made up of many layers of artificial neurons with hundreds of millions of connections between them. The structure of such deep networks is reminiscent of the brain, where billions of neurons are connected through thousands of synaptic contacts each. These types of networks can be trained to perform hard classification tasks over huge datasets, with the remarkable property of extracting information from examples and generalizing them to unseen items.

The way neural networks learn is by tuning their multitude of connections, or synaptic weights, following the signal provided by a learning algorithm that reacts to the input data. This process is in some respects similar to what happens throughout the nervous system, in which plastic modifications of synapses are considered to be responsible for the formation and stabilization of memories. The problem of devising efficient and scalable learning algorithms for realistic synapses is crucial for both technological and biological applications.
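As a concrete, highly simplified illustration of this error-driven tuning, here is a toy perceptron trained with the classic perceptron rule. The dataset, sizes and "teacher" vector below are invented for the example and are not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dataset: +-1 patterns labeled by a hidden "teacher" vector, so a
# perfect set of weights is guaranteed to exist (n is odd to avoid ties).
n_inputs, n_patterns = 51, 200
teacher = rng.choice([-1.0, 1.0], size=n_inputs)
X = rng.choice([-1.0, 1.0], size=(n_patterns, n_inputs))
y = np.sign(X @ teacher)

# Classic perceptron rule: nudge the weights only when a pattern is
# misclassified, moving them toward the correct answer.
w = np.zeros(n_inputs)
for _ in range(3000):                     # generous epoch budget
    mistakes = 0
    for x, label in zip(X, y):
        if np.sign(x @ w) != label:
            w += label * x
            mistakes += 1
    if mistakes == 0:                     # clean pass: training set learned
        break

errors = int(np.sum(np.sign(X @ w) != y))
```

Because the labels are generated by a linear "teacher", the perceptron convergence theorem guarantees that `errors` reaches zero on the training set.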

In a recent study, published in Physical Review Letters, researchers from Politecnico di Torino and the Human Genetics Foundation (Italy) showed that extremely simple synaptic contacts, even one-bit switch-like synapses, can be efficiently used for learning in large-scale networks, and can lead to unanticipated computational performance. The study was conducted by a research group led by Riccardo Zecchina and composed of Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello and Luca Saglietti.

Until now, theoretical analysis suggested that learning with simple discretized synaptic connections was exceedingly difficult and thus impractical. Applying principles from the statistical physics of disordered systems, the researchers found that the problem actually becomes extremely simple. The authors give an in-depth theoretical explanation for why this is the case and propose concrete learning strategies.
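The paper's own algorithms are more sophisticated, but the flavor of learning with one-bit synapses can be sketched with a standard trick: keep a hidden integer counter per synapse and expose only its sign as the stored weight. Everything below (sizes, data, update schedule) is an illustrative assumption, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny task: patterns labeled by a hidden +-1 "teacher", so a one-bit
# solution is guaranteed to exist (n is odd to avoid zero labels).
n, p = 51, 15
teacher = rng.choice([-1, 1], size=n)
X = rng.choice([-1, 1], size=(p, n))
y = np.sign(X @ teacher)

# One-bit synapses: the visible weight is just the sign of a hidden integer
# counter h; the weight actually stored only ever takes the values +-1.
h = np.zeros(n, dtype=int)
for _ in range(10000):
    w = np.where(h >= 0, 1, -1)               # the actual one-bit weights
    wrong = np.sign(X @ w) != y
    if not wrong.any():                       # all patterns classified: done
        break
    i = int(rng.choice(np.flatnonzero(wrong)))  # a misclassified pattern
    h += y[i] * X[i]                          # perceptron-style step on h

errors = int(np.sum(np.sign(X @ np.where(h >= 0, 1, -1)) != y))
```

The design point is that the learning machinery (the counters) can live elsewhere, while the synapse itself remains a one-bit switch.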

These new results are consistent with biological considerations and with recent experimental evidence suggesting that synaptic weights are not arbitrarily graded, but store only a few bits each. Still, the most immediate follow-ups will be of a technological nature: hardware implementations relying on extremely simple synapses can overcome many of the computational bottlenecks (memory and speed) that the next generation of algorithms will have to face.
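To make the memory-and-speed argument concrete: a one-bit synapse can be stored as a literal bit, and the dot products needed at inference time reduce to XOR plus popcount. A back-of-the-envelope sketch in NumPy, with invented sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024

# A layer of one-bit synapses: both weights and inputs are +-1 here.
w = rng.choice([-1, 1], size=n).astype(np.float32)
x = rng.choice([-1, 1], size=n).astype(np.float32)

# Store each weight as a single bit (1 for +1, 0 for -1):
# 128 bytes instead of 4096 bytes of float32.
w_bits = np.packbits(w > 0)
x_bits = np.packbits(x > 0)
saving = w.nbytes // w_bits.nbytes        # memory reduction factor

# Dot product of +-1 vectors via XOR + popcount: matching bits contribute
# +1 and differing bits -1, so dot = n - 2 * (number of differing bits).
differing = np.unpackbits(np.bitwise_xor(w_bits, x_bits)).sum()
dot_bits = n - 2 * int(differing)
```

Here `saving` is 32, and `dot_bits` matches the ordinary floating-point dot product exactly; on dedicated hardware the XOR/popcount path replaces thousands of multiply-accumulates.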

More information: "Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses." Phys. Rev. Lett. 115, 128101 – Published 18 September 2015. dx.doi.org/10.1103/PhysRevLett.115.128101
