Tiny chip mimics brain, delivers supercomputer speed

Aug 07, 2014 by Rob Lever
IBM's new neurosynaptic processor integrates 1 million neurons and 256 million synapses on a single chip. Credit: IBM

Researchers Thursday unveiled a powerful new postage-stamp-sized chip delivering supercomputer performance using a process that mimics the human brain.

The so-called "neurosynaptic" chip is a breakthrough that opens a wide new range of computing possibilities, from self-driving cars to systems that can be installed on a smartphone, the scientists say.

The researchers from IBM, Cornell Tech and collaborators from around the world said they took an entirely new approach in design compared with previous computer architecture, moving toward a system called "cognitive computing."

"We have taken inspiration from the cerebral cortex to design this chip," said IBM chief scientist for brain-inspired computing, Dharmendra Modha, referring to the command center of the brain.

He said existing computers trace their lineage back to machines from the 1940s which are essentially "sequential number-crunching calculators" that perform mathematical or "left brain" tasks but little else.

The new chip dubbed "TrueNorth" works to mimic the "right brain" functions of sensory processing—responding to sights, smells and information from the environment to "learn" to respond in different situations, Modha said.

It accomplishes this task by using a huge network of "neurons" and "synapses," similar to how the human brain functions by using information gathered from the body's sensory organs.

The researchers designed TrueNorth with one million programmable neurons and 256 million programmable synapses, on a chip with 4,096 cores and 5.4 billion transistors.

A key to the performance is the extremely low energy use on the new chip, which runs on the equivalent energy of a hearing-aid battery.
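The article does not describe how such a chip is programmed, but the general style of computation it points to, a network of simple "neurons" exchanging spikes across weighted "synapses", can be sketched in a few lines. The Python snippet below is a minimal, purely illustrative leaky integrate-and-fire simulation; it is not TrueNorth's actual architecture or toolchain, and every size and parameter in it is made up.

import numpy as np

# Illustrative leaky integrate-and-fire network -- not TrueNorth's real
# programming model, just the general "neurons and synapses" idea.
rng = np.random.default_rng(0)

n_neurons = 64
weights = rng.normal(0.0, 0.5, size=(n_neurons, n_neurons))  # "synapses"
potential = np.zeros(n_neurons)  # membrane potential of each "neuron"
threshold = 1.0                  # a neuron fires when it crosses this
leak = 0.9                       # potential decays a little each step

spikes = np.zeros(n_neurons)
for step in range(100):
    sensory = rng.random(n_neurons) < 0.05           # random external input
    potential = leak * potential + weights @ spikes + sensory
    spikes = (potential >= threshold).astype(float)  # which neurons fire now
    potential[spikes == 1] = 0.0                     # reset neurons that fired
    print(step, int(spikes.sum()), "neurons fired")

On a conventional processor this loop runs step by step; the chip described in the article is instead built so that the neurons and synapses are the hardware itself.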

Sensor becomes the computer

Infographic: A brain-inspired chip to transform mobility and Internet of Things through sensory perception. Credit: IBM

This can allow a chip installed in a car or smartphone to perform supercomputer calculations in real time, without connecting to the cloud or another network.

"The sensor becomes the computer," Modha told AFP in a phone interview.

"You could have better sensory processors without the connection to Wi-Fi or the cloud.

This would allow a self-driving vehicle, for example, to detect problems and deal with them even if its data connection is broken.

"It can see an accident about to happen," Modha said.

Similarly, a mobile phone can take smells or visual information and interpret them in real time, without the need for a network connection.

"After years of collaboration with IBM, we are now a step closer to building a computer similar to our brain," said Rajit Manohar, a researcher at Cornell Tech, a graduate school of Cornell University.

The project, funded by the US Defense Advanced Research Projects Agency (DARPA), published its research as the cover article in the August 8 edition of the journal Science.

The researchers say TrueNorth in some ways outperforms today's supercomputers, although a direct comparison is not possible because they operate differently.

But they wrote that TrueNorth can deliver from 46 billion to 400 billion "synaptic" calculations per second per watt of energy. That compares with the most energy-efficient supercomputer, which delivers 4.5 billion "floating point" calculations per second per watt.
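Since "synaptic" events and floating-point operations are different units, the comparison is only an order-of-magnitude one, but the figures quoted above work out to roughly a 10- to 90-fold gap in per-watt efficiency:

# Back-of-the-envelope check of the per-watt figures quoted above.
# The units differ (synaptic events vs. floating-point operations),
# so this is an order-of-magnitude comparison, not a benchmark.
truenorth_low  = 46e9    # synaptic calculations per second per watt
truenorth_high = 400e9
supercomputer  = 4.5e9   # FLOPS per watt, most efficient system cited

print(truenorth_low / supercomputer)   # about 10x
print(truenorth_high / supercomputer)  # about 89x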

The chip was fabricated using Samsung's 28-nanometer process technology.

"It is an astonishing achievement to leverage a process traditionally used for commercially available, low-power mobile devices to deliver a chip that emulates the by processing extreme amounts of sensory information with very little power," said Shawn Han of Samsung Electronics, in a statement.

"This is a huge architectural breakthrough that is essential as the industry moves toward the next-generation cloud and big-data processing."

Modha said the researchers have produced only the chip and that it could be years before commercial applications become available.

But he said it "has the potential to transform society" with a new generation of computing technology. And he noted that hybrid computers may be able to one day combine the "left brain" machines with the new "right brain" devices for even better performance.


More information: P. A. Merolla et al., "A million spiking-neuron integrated circuit with a scalable communication network and interface," Science, 2014. www.sciencemag.org/lookup/doi/… 1126/science.1254642



User comments: 32


shavera
5 / 5 (7) Aug 07, 2014
This seems amazing for self-driving cars. Being able to process input faster and recognize patterns faster (presumably the techniques this style of chip is good for). I can imagine more than a few groups itching to start over from scratch and see what this new tool can provide for them.
Arties
3 / 5 (2) Aug 07, 2014
It would be great for devices like these ones (Arduino Visual)
George_Rajna
Aug 07, 2014
This comment has been removed by a moderator.
grondilu
3.8 / 5 (5) Aug 07, 2014
Between IBM's cognitive computing and HP's "Machine", it's hard to decide where to put my money.

I do have a slight preference for The Machine, though. Neural networks are great but they can only perform a fairly limited range of tasks. The Machine is still a general-purpose computing device and thus is less limited in what it can do.
youngmaester
4 / 5 (4) Aug 07, 2014
That's incredible.... @grondilu - I don't think you have to pick. They're both pretty amazing. HP's "The Machine" sounds like it is further along, but regardless, I find this soooo impressive.
Whydening Gyre
5 / 5 (1) Aug 07, 2014
It would be great for devices like https://www.youtu...hXvuP4Vo (Arduino Visual)

Now we know the truth about Arties - He's got stock in Clearplex and Samsung...:-)
russell_russell
1 / 5 (3) Aug 07, 2014
Damage and repair is normal for all learning and memory.
http://medicalxpr...ain.html
Where's the mimicry?
DARPA'S unparalleled contribution is the world wide web.
Child's play in comparison to the sciences of the brain and mind.
24volts
2.2 / 5 (5) Aug 07, 2014
I know it sounds kind of silly but my first thought after reading the article was possible tricorder processor...I have no idea why.
antonima
4 / 5 (2) Aug 07, 2014
I'm no computer scientist, but supercomputers aren't necessarily designed to be efficient / watt. I'm pretty sure they are designed to be efficient / $ .

Another question I don't have an answer for : can this performance be scaled up for supercomputers in the future?

I can imagine that a chip which runs on very little power also doesn't heat up very much, so maybe it can run at a faster speed without frying itself.
TheKnowItAll
4.5 / 5 (2) Aug 08, 2014
What a bunch of teasers. I want one now!
Birger
not rated yet Aug 08, 2014
Are we talking about parallel processing? Neural networks?
a_boeglin67
4.8 / 5 (5) Aug 08, 2014
I'm no computer scientist, but supercomputers aren't necessarily designed to be efficient / watt. I'm pretty sure they are designed to be efficient / $ .

Another question I don't have an answer for : can this performance be scaled up for supercomputers in the future?

I can imagine that a chip which runs on very little power also doesn't heat up very much, so maybe it can run at a faster speed without frying itself.


Actually, when dealing with neurons, speed doesn't matter that much, since they achieve a huge degree of parallelism. Most human brain neurons work at the scale of 1-200 Hz. Auditory areas might go up to approximately a kHz. I'm not saying we shouldn't aim for better speed than the brain if we can, but it isn't the main goal.
alfie_null
4.5 / 5 (2) Aug 08, 2014
Next step is what? Developing the tools used to program it? Will it be something like programming FPGAs? Some way to efficiently translate our thoughts on what the device should do into a working configuration.
Dug
4 / 5 (2) Aug 08, 2014
Trying to emulate natural architectures like the brain may be counterproductive in the long run. Comparing the computer to the evolution of aircraft - one of the primary impediments to practical flight development early on was trying to emulate the bird. If we hadn't gotten past this, we would still be trying to keep the feathers glued on all our jets.
EyeNStein
5 / 5 (2) Aug 08, 2014
This will be a good chip to learn the necessary programming skills on.
If 'memristors' show their expected promise as both memory and neuron simulator cells, then 10Gb synapses could soon look like a small device. Complex data analytics would go domestic mainstream: you could have cheap access to a personal cloud-based 'Watson v2.0'.
Automated data structures, relationships and organisation will be the key. Unless you really want to "Sell shares" because "It's raining".
antialias_physorg
4.2 / 5 (5) Aug 08, 2014
Calm down everyone. This thing does in hardware what people have been doing in software for quite some time.
Advantage:
- hardware is (way) faster
- now you have a dedicated way to program neural networks instead of everyone "rolling their own"
Disadvantage: The way you can parametrize the neurons is very limited compared to what you can do in software (where there is basically no limit to the complexity you can go to: activation functions, learning functions, modelling biological effects like hysteresis or refractory periods, ... )

It's something that will be very good for data mining. But don't expect this to bring about thinking machines.
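To make that flexibility concrete: in software the activation function, the learning rule and everything else about the neuron model are just arguments you can swap at will, whereas a fabricated chip exposes only whatever parameters its designers wired in. A minimal, purely hypothetical Python sketch (nothing here comes from IBM's toolchain):

import numpy as np

# A software "neuron" parametrized by an arbitrary activation function.
def neuron(inputs, weights, activation):
    return activation(np.dot(inputs, weights))

x = np.array([0.2, -0.5, 0.9])
w = np.array([0.7, 0.1, -0.3])

# Swap activation functions freely -- something hard-wired silicon
# generally cannot do after fabrication.
print(neuron(x, w, np.tanh))
print(neuron(x, w, lambda v: max(0.0, v)))               # ReLU
print(neuron(x, w, lambda v: 1.0 / (1.0 + np.exp(-v))))  # sigmoid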
a_boeglin67
3.7 / 5 (3) Aug 08, 2014
Trying to emulate natural architectures like the brain may be counter productive in the long run.


I disagree. We did get inspired by birds at first. It helped us understand how the geometry of bird wings could allow them to glide without falling straight down. It got us up there. It was neither practical nor a good design, but it gave us a path, a direction to follow and investigate. As far as I know, planes still use wings, don't they?

Now consider intelligence. We know close to nothing about it at the moment. And the only one we know of is the one that happens in the neocortex of mammals. I agree with you on the point that in the long run, intelligent systems might move away from that biological analogy. But we first need to understand what intelligence is and try to replicate it by any means. We have a working system, we study it, we reproduce its functions, we improve it.

Finally, we did try a lot of non-biological approaches already. Where did that lead us?
TechnoCreed
5 / 5 (3) Aug 08, 2014
@russell_russell
DARPA'S unparalleled contribution is the world wide web.
Allow me to correct the above statement. Darpa's contribution to the WWW was Arpanet which later became the internet with the adoption of the TCP/IP protocol. The WWW itself was created by Tim Berners-Lee, who developed the hypertext transfer protocol or HTTP; this allowed information exchange between any computers.
Da Schneib
2.3 / 5 (3) Aug 09, 2014
What's one cost? I don't mean the research, I mean how much are they selling them for?

What programming language will they support? Are they just writing assembly language right now?

Interesting stuff.
antigoracle
2 / 5 (4) Aug 09, 2014
I hope these guys are watching over their backs for Sarah Connor.
cabhanlistis
5 / 5 (1) Aug 09, 2014
What's one cost?

It's brand new. They haven't figured that out yet. They might not even sell this particular version.
What programming language will they support? Are they just writing assembly language right now?

I don't know, but I think it might be Red Hat Enterprise Linux with Lustre. That's what IBM did with Sequoia, so that makes sense.
flashgordon
3.7 / 5 (3) Aug 09, 2014
I hope these guys are watching over their backs for Sarah Connor.


I hope immature anti-science Nazis are not looking over their shoulders for Robocop.
antigoracle
2.3 / 5 (3) Aug 10, 2014
Whoa... anti-science Nazis eh!
Well, heil idiot.
russell_russell
5 / 5 (1) Aug 10, 2014
Technocreed correctly points out that

"Many people use the terms Internet and World Wide Web (aka. the Web) interchangeably, but in fact the two terms are not synonymous. The Internet and the Web are two separate but related things."
http://www.webope...rnet.asp

With this in mind, the original incorrect sentence:
DARPA'S unparalleled contribution is the world wide web,
has to correctly read:
DARPA'S unparalleled contribution is the internet.

Don't ever let yourself be caught using these terms synonymously.
Simply unforgivable, never to be forgotten.

antigoracle
1 / 5 (1) Aug 10, 2014
Howz about we call it the interweb?
russell, DARPA's contribution was really ARPAnet.
Eikka
4 / 5 (1) Aug 10, 2014
Trying to emulate natural architectures like the brain may be counterproductive in the long run. Comparing the computer to the evolution of aircraft - one of the primary impediments to practical flight development early on was trying to emulate the bird. If we hadn't gotten past this, we would still be trying to keep the feathers glued on all our jets.


And yet we started out by arguing that we don't need to build a brain to get functions like a brain (like intelligence/AI) back in the 70's and 80's, because everyone thought it was just too difficult to do. So instead they tried to reduce intelligence and language etc. into syntactic algorithms and math equations.

And look how well it has worked so far. There's something about the brain and how it works that makes it work well, and ignoring it is like building airplanes without wings by simply flying around on a sufficiently powerful engine.

antigoracle
not rated yet Aug 10, 2014
Eikka, current computer technology has worked well so far; without it, our modern world would grind to a complete halt in an instant. It just hasn't worked well for AI.
Valentiinro
not rated yet Aug 10, 2014
...ignoring it is like building airplanes without wings by simply flying around on a sufficiently powerful engine.



Technically the things we've flown the farthest/fastest are ones without wings but with really powerful engines.
Eikka
not rated yet Aug 10, 2014
Technically the things we've flown the farthest/fastest are ones without wings but with really powerful engines.


Yeah, but that's not the entire point of flying.

You don't go to Thailand for your vacation in an X-15, and even if they did build a hypersonic jet plane for doing that, it would still have proper wings for takeoff and landing.

Similarly, we've got really powerful machines like Watson that consist of a roomful of computers drawing kilowatts of power (human brain: 25W) to merely fake intelligence, so that you can ask them a simple question and probably get the right answer.

Eikka
not rated yet Aug 10, 2014
Eikka, current computer technology has worked well so far, without it our modern world would grind to a complete halt in an instant, it just hasn't worked well for AI.


And without AI the great promise of computing can be summed up as mostly an elaborate version of the telegraph combined with a scientific calculator. With great effort you can make a car drive itself, provided that you give it a pre-defined GPS route and hope that nothing unexpected and unforeseen (unprogrammed) passes its way.

We're not really doing a whole lot with the vast amounts of processing power we have because there's not a whole lot we -can- or know how to do with it. You're probably browsing this website with a computer that is capable of processing billions of operations a second, but you're using about 5% of its power because the way it's built is dumber than a bag of hammers and you don't really need that much raw linear calculating power. It's mostly just wasted on inefficient programming.

krundoloss
not rated yet Aug 11, 2014
You ever wonder how an AI might perceive time? Thinking so quickly, it would probably be frustrated by waiting a virtual eternity on the next word to come out of our mouth. They could think a whole book in the time we think of one word.
russell_russell
not rated yet Aug 12, 2014
Eikka is correct.
Damage and repair is normal for all neuronal learning and memory.
http://medicalxpr...ain.html

No approach in the computer sciences uses damage and repair to process and store information for learning and memory. Learning and memory is intelligence.

No approach in the neurosciences uses computer architecture or software to process (learn) and store (memory) information in the brain.

An approach has been taken to close the ever-widening-gap between the computer sciences and the neurosciences.

The approach is one of scale. NOT the approach of ever faster processing. Once the computer sciences reaches the atomic and molecular scales nature uses for all information processing and storage the sooner the computer sciences will recognize that the damage that nature provides for all life has nothing to do with information loss and the sequence repair has everything to do with intelligence and the evolution of intelligence and life.
russell_russell
not rated yet Aug 12, 2014
Typo correction in the second-to-last sentence in the comment posted above:
Sequence=subsequent

@krundoloss
Yours and everyone's "sense" of "time" comes from damage. All physical damage is sequential.
That used to be a hallmark of physics (cause and effect).

Normally a repair follows after damage. Which makes repair sequential too.
That is where and how all humans "sense" a "passage" of time - by recalling the sequential stored repairs.

When repairs are delayed, faulty, or do not even occur humans are eventually labeled with a pathological condition.

The rest is hype for IBM.