Synapses need only a few bits

September 22, 2015

Deep learning is possibly the most exciting branch of contemporary machine learning. Complex image analysis, speech recognition and self-driving cars are just a few of the many new applications in which machine learning, and deep learning in particular, shows its remarkable capabilities.

Deep neural networks are made up of many layers of artificial neurons with hundreds of millions of connections between them. The structure of such deep networks is reminiscent of the brain, where billions of neurons are each connected through thousands of synaptic contacts. These networks can be trained to perform hard classification tasks over huge datasets, with the remarkable property of extracting information from examples and generalizing it to unseen items.
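
As a rough illustration (not taken from the study), a deep feedforward network can be sketched in a few lines: each layer multiplies its input by a weight matrix, the artificial synapses, and applies a nonlinearity. The layer sizes below are purely hypothetical.

```python
# Hypothetical sketch of a deep feedforward network: each layer applies a
# weight matrix (the artificial synapses) followed by a nonlinearity.
import numpy as np

rng = np.random.default_rng(0)

layer_sizes = [784, 256, 128, 10]            # e.g. an image classifier
weights = [rng.standard_normal((m, n)) / np.sqrt(n)
           for n, m in zip(layer_sizes[:-1], layer_sizes[1:])]

def forward(x):
    """Propagate an input through every layer of the network."""
    for W in weights:
        x = np.tanh(W @ x)                   # weighted sum + activation
    return x

scores = forward(rng.standard_normal(layer_sizes[0]))
```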

The way neural networks learn is by tuning their multitude of connections, or synaptic weights, following the signal provided by a learning algorithm that reacts to the input data. This process is in some respects similar to what happens throughout the nervous system, in which plastic modifications of synapses are considered to be responsible for the formation and stabilization of memories. The problem of devising efficient and scalable learning algorithms for realistic synapses is crucial for both technological and biological applications.
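
A minimal sketch of this idea is the classic perceptron rule (a hypothetical example, far simpler than the algorithms used in deep learning): whenever the network misclassifies an example, its weights are nudged in the direction that corrects the error.

```python
# Learning by synaptic adjustment with the classic perceptron rule on
# random data; sizes and data are illustrative, not from the study.
import numpy as np

rng = np.random.default_rng(1)

N, P = 100, 40                               # inputs, training examples
X = rng.choice([-1.0, 1.0], size=(P, N))     # random input patterns
y = rng.choice([-1.0, 1.0], size=P)          # desired binary labels

w = np.zeros(N)
for _ in range(100):                         # sweeps over the dataset
    errors = 0
    for xi, yi in zip(X, y):
        if np.sign(w @ xi) != yi:            # wrong (or zero) output
            w += yi * xi                     # perceptron weight update
            errors += 1
    if errors == 0:
        break                                # every example stored
```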

In a recent study, published in Physical Review Letters, researchers from Politecnico di Torino and the Human Genetics Foundation (Italy) showed that extremely simple synaptic contacts, even one-bit switch-like synapses, can be used efficiently for learning in large-scale networks, and can lead to unanticipated computational performance. The study was conducted by a research group led by Riccardo Zecchina and composed of Carlo Baldassi, Alessandro Ingrosso, Carlo Lucibello and Luca Saglietti.

Until now, theoretical analyses suggested that learning with simple, discretized synaptic connections was exceedingly difficult and thus impractical. Applying principles from the statistical physics of disordered systems, the researchers found that the problem can in fact become extremely simple. The authors give an in-depth theoretical explanation of why this happens and propose concrete learning strategies.
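
The learning strategies proposed in the paper are not reproduced here; as an illustrative stand-in, the sketch below uses a simple "clipped perceptron" approach to one-bit synapses: hidden integer counters are updated on errors, while only their signs act as the visible ±1 weights. All names and sizes are hypothetical.

```python
# Illustrative stand-in for learning with one-bit synapses (not the
# paper's algorithm): hidden integer counters h are nudged on errors,
# and only their signs are exposed as the +/-1 synaptic weights.
import numpy as np

rng = np.random.default_rng(2)

N, P = 1001, 250                             # odd N avoids tied votes
X = rng.choice([-1, 1], size=(P, N))
y = rng.choice([-1, 1], size=P)

h = np.zeros(N, dtype=int)                   # hidden integer states
for _ in range(1000):                        # sweeps over the dataset
    errors = 0
    for xi, yi in zip(X, y):
        w = np.where(h >= 0, 1, -1)          # effective one-bit synapses
        if np.sign(w @ xi) != yi:            # misclassified example
            h += yi * xi                     # update hidden states only
            errors += 1
    if errors == 0:
        break                                # learned with one-bit weights
```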

These new results are consistent with biological considerations and with recent experimental evidence suggesting that synaptic weights are not arbitrarily graded but store just a few bits each. Still, the most immediate follow-ups will be of a technological nature: hardware implementations relying on extremely simple synapses could overcome many of the computational bottlenecks (memory and speed) that the next generation of algorithms will have to face.

More information: "Subdominant Dense Clusters Allow for Simple Learning and High Computational Performance in Neural Networks with Discrete Synapses," Phys. Rev. Lett. 115, 128101 (published 18 September 2015).
