New memristor boosts accuracy and efficiency for neural networks on an atomic scale

Conceptual schematic of the 3D implementation of compound synapses constructed with boron nitride oxide (BNOx) binary memristors, and of the crossbar array with compound BNOx synapses for neuromorphic computing applications. Credit: Ivan Sanchez Esqueda

Just like their biological counterparts, hardware that mimics the neural circuitry of the brain requires building blocks that can adjust how they synapse, with some connections strengthening at the expense of others. One such approach, called memristors, uses current resistance to store this information. New work looks to overcome reliability issues in these devices by scaling memristors to the atomic level.

A group of researchers demonstrated a new type of compound synapse that can achieve synaptic weight programming and conduct vector-matrix multiplication with significant advances over the current state of the art. Publishing its work in the Journal of Applied Physics, the group's compound synapse is constructed with atomically thin boron nitride memristors running in parallel to ensure efficiency and accuracy.
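
The crossbar operation mentioned above can be sketched in a few lines. This is a minimal illustration of the general principle, not the paper's implementation: each cross-point conductance stores a weight, input voltages drive the rows, and by Ohm's and Kirchhoff's laws the column currents sum to the vector-matrix product in a single analog step. All numeric values here are illustrative.

```python
import numpy as np

# Sketch of analog vector-matrix multiplication in a memristor crossbar:
# conductance G[i, j] is the stored weight at row i, column j; driving the
# rows with voltages V yields column currents I = V @ G in one read.
rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 3))   # cross-point conductances (siemens), illustrative
V = np.array([0.10, 0.20, 0.00, 0.30])     # row input voltages (volts)

I = V @ G   # one analog "read" computes the whole vector-matrix product
print(I.shape)   # one summed current per output column
```

The multiply-accumulate happens in the device physics rather than in logic gates, which is why crossbars are attractive for the matrix-heavy workloads of neural networks.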

The article appears in a special topic section of the journal devoted to "New Physics and Materials for Neuromorphic Computation," which highlights new developments in physical and materials science research that hold promise for developing the very large-scale, integrated "neuromorphic" systems of tomorrow that will carry computation beyond the limitations of current semiconductors today.

"There's a lot of interest in using new types of materials for memristors," said Ivan Sanchez Esqueda, an author on the paper. "What we're showing is that filamentary devices can work well for neuromorphic computing applications, when constructed in new clever ways."

Current memristor technology suffers from wide variation in how signals are stored and read, both across different types of memristors and across different runs of the same device. To overcome this, the researchers ran several memristors in parallel. The combined output can achieve accuracies up to five times those of conventional devices, an advantage that grows as devices become more complex.
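
A quick statistical sketch shows why parallelism helps. This is an illustrative model, not the paper's device physics: parallel conductances simply add, so if each device's conductance varies from run to run, the relative spread of the combined weight shrinks roughly as 1/sqrt(N). The nominal conductance and 30% variation used below are assumptions for illustration.

```python
import numpy as np

# Why N parallel memristors tame device variation: the sum of N noisy
# conductances has a relative spread ~1/sqrt(N) of a single device's.
rng = np.random.default_rng(1)

def relative_spread(n_parallel, trials=20000):
    # each device: nominal 1 uS conductance with 30% run-to-run variation (assumed)
    g = rng.normal(1.0e-6, 0.3e-6, size=(trials, n_parallel))
    total = g.sum(axis=1)              # parallel conductances add
    return total.std() / total.mean()  # coefficient of variation

print(relative_spread(1))    # ~0.30 for a single device
print(relative_spread(16))   # ~0.075: 16 parallel devices cut the spread ~4x
```

The same averaging effect is what lets the compound synapse read out weights more accurately than any single filamentary device could.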

The choice to go to the subnanometer level, Sanchez said, was born out of an interest in keeping all of these parallel memristors energy-efficient. An array of the group's memristors was found to be 10,000 times more energy-efficient than memristors currently available.

"It turns out if you start to increase the number of devices in parallel, you can see large benefits in accuracy while still conserving power," Sanchez said. Sanchez said the team next looks to further showcase the potential of the compound synapses by demonstrating their use completing increasingly complex tasks, such as image and pattern recognition.



More information: Efficient learning and crossbar operations with atomically-thin 2-D material compound synapses, Journal of Applied Physics (2018). DOI: 10.1063/1.5042468
Journal information: Journal of Applied Physics

Citation: New memristor boosts accuracy and efficiency for neural networks on an atomic scale (2018, October 16) retrieved 18 July 2019 from https://phys.org/news/2018-10-memristor-boosts-accuracy-efficiency-neural.html


User comments

Oct 16, 2018
This is where computing is heading after the end of Moore's Law in the next 10 years or so. By 2030, almost all deployed computing will be NN, and von Neumann architecture a minor player

Oct 16, 2018
This is where computing is heading after the end of Moore's Law in the next 10 years or so. By 2030, almost all deployed computing will be NN, and von Neumann architecture a minor player


And that, ladies and gentlemen, is what the starting pistol of the next dot-com bubble sounds like. Prepare your checkbooks, because only the early investors win!

Oct 16, 2018
So where is Ray Kurzweil's hand in all this? Or is this a different level of a future sentience in machines?

Oct 16, 2018
So where is Ray Kurzweil's hand in all this? Or is this a different level of a future sentience in machines?

This is hardware. Kurzweil is more of a software guy...

Oct 17, 2018
So where is Ray Kurzweil's hand in all this? Or is this a different level of a future sentience in machines?


According to his predictions, a human-brain-equivalent supercomputer should cost $4,000 this year. We should also be wearing computer-display contact lenses, computers should have only speech recognition and no keyboards, and they should be running on "three-dimensional nanotube lattices".

In the year 2019, computers do most of the vehicle driving; humans are in fact prohibited from driving on highways unassisted. Prototype personal flying vehicles using microflaps exist. The basic needs of the underclass are met, and the average lifespan is 100 years.

Can't wait for the new year.

Oct 17, 2018
Also, in 2005 he made the prediction that 10 terabytes of RAM will cost $1000 this year.

Last time I checked, it was still about $150 for 16 GB so... roughly $100,000 for that amount. Not bad, just a factor of 100x off the mark.

That's the problem with predicting with exponential functions. The difference between two functions growing exponentially is also exponential, so getting your prediction coefficients wrong gives you an error that grows exponentially over time - especially if the reality you're trying to predict ISN'T growing exponentially in the first place.
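
The commenter's point can be made concrete with a quick numeric sketch. The 12-month and 24-month halving times below are assumptions chosen for illustration, not Kurzweil's actual figures: even a modest misjudgment of the rate compounds into an enormous miss.

```python
# Two exponential trends whose rates differ only modestly diverge
# exponentially. Suppose RAM prices were predicted to halve every 12
# months but actually halved every 24 months; over the 13 years from
# 2005 to 2018 the forecast error compounds.
predicted_per_year = 0.5 ** (12 / 12)   # yearly price multiplier, predicted
actual_per_year = 0.5 ** (12 / 24)      # yearly price multiplier, actual

years = 2018 - 2005
error_factor = (actual_per_year / predicted_per_year) ** years
print(error_factor)   # ~90x: the same order as the ~100x miss noted above
```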

You can buy 10 TB of "memory" for less than $1000, but that's shifting the goalposts. Buying a big pile of the cheapest CD-Rs would get you there.

Oct 17, 2018
Also, in 2005 he made the prediction that 10 terabytes of RAM will cost $1000 this year.

Last time I checked, it was still about $150 for 16 GB so... roughly $100,000 for that amount. Not bad, just a factor of 100x off the mark.

That's the problem with predicting with exponential functions. The difference between two functions growing exponentially is also exponential, so getting your prediction coefficients wrong gives you an error that grows exponentially over time - especially if the reality you're trying to predict ISN'T growing exponentially in the first place.

You can buy 10 TB of "memory" for less than $1000, but that's shifting the goalposts. Buying a big pile of the cheapest CD-Rs would get you there.

You can get a 12 TB network cloud drive for about $550 at Best Buy
