Computer chip that computes probabilities and not logic

Aug 19, 2010 by John Messina
Lyric's Error Correction Chip for Flash Memory. Credit: Lyric Semiconductor

(PhysOrg.com) -- Lyric Semiconductor has unveiled a new type of chip that uses probability inputs and outputs instead of the conventional 1's and 0's used in today's logic chips. Crunching probabilities is far better suited to many of today's computing tasks than binary logic is.

Ben Vigoda, CEO and founder of Lyric Semiconductor, has been working aggressively on this technology since 2006, funded in part by the U.S. Defense Advanced Research Projects Agency (DARPA). DARPA is interested in using the technology in defense applications that involve information that is not clear-cut, where probability calculations can help reach a conclusion.

Because probability calculations are used in so many products, there are many potential applications. Ben Vigoda stated: "To take one example, Amazon's recommendations to you are based on probability. Any time you buy from them, the fraud check on your credit card is also probability based, and when they e-mail your confirmation, it passes through a spam filter that also uses probability."

Conventional chips have transistors arranged in digital NAND gates, which are used to implement digital functions using 1's and 0's. In a probability processor, transistors are used to build Bayesian NAND gates. Bayesian probability is a field of mathematics named after the eighteenth-century English statistician Thomas Bayes.
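As a rough illustration (a sketch, not Lyric's actual, unpublished design): a probabilistic NAND gate can be thought of as operating on input probabilities rather than fixed bits. If the two inputs are 1 with independent probabilities pA and pB, the output is 1 with probability 1 - pA·pB, which reduces to ordinary Boolean NAND when the inputs are certain.

```python
def prob_nand(p_a: float, p_b: float) -> float:
    """Probability that NAND(A, B) = 1, assuming A and B are
    independent bits that equal 1 with probability p_a and p_b."""
    return 1.0 - p_a * p_b

# With certain inputs the gate reduces to ordinary Boolean NAND:
print(prob_nand(1.0, 1.0))  # 0.0  (NAND(1, 1) = 0)
print(prob_nand(1.0, 0.0))  # 1.0  (NAND(1, 0) = 1)
# With uncertain inputs it propagates the uncertainty instead:
print(prob_nand(0.9, 0.8))  # ~0.28
```

Chaining such gates propagates uncertainty through a whole circuit, which is the kind of inference that conventional logic chips can only emulate in software.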

Lyric Semiconductor plans to have prototypes of its all-purpose probability chips operational within three years. A smaller flash-memory error-correction chip based on the technology is available for license this week, and the company plans to have such chips in portable devices like tablets and smartphones within two years.


More information: Lyric Semiconductor
Via: Technology Review



User comments: 19


ClickHere
5 / 5 (4) Aug 19, 2010
So, looks like the Infinite Improbability Drive will reach market before warp technology.
balde
5 / 5 (2) Aug 19, 2010
As one nerd to another I say, well played sir!
gunslingor1
5 / 5 (1) Aug 19, 2010
Lol... see you at DragonCon in two weeks!
d44x
5 / 5 (1) Aug 19, 2010
Agggggghhhhhhhhhhhh

wrong, wrong, wrong, wrong, wrong

It's a different type of processor architecture, that's all. It'll still use 0s and 1s. However, it may lead to new types of CPU architecture which incorporate a PPU (probability processor unit). This assumes the design could be optimised so that the inevitable slowdown from the extra gates needed to incorporate such a processor wouldn't outweigh the positive effects of having a very minimised design. Most often in CPU design, less is more - just look at RISC vs CISC, for example.

Also, not all chips (in fact most minimised chips aren't) are made up of NAND gates - they are made up of whatever types of gate the engineer thought best to make them from!
DamienS
5 / 5 (1) Aug 19, 2010
This sounds interesting in principle, in that it may find use in hardware based neural AI situations (as opposed to software based simulations), but I don't know how useful it will be in general computing (such as the Amazon example cited), as conventional chips are perfectly well capable of doing probabilistic/Bayesian computations.
plasticpower
5 / 5 (1) Aug 20, 2010
It could very well be a fuzzy logic circuit of some sort. It doesn't need to do boolean algebra, therefore doesn't need conventional transistors arranged in gates.

I think the idea is to make them really fast. Your brain predicts objects before fully recognizing them. It will either confirm the prediction or correct itself, but a significant amount of time passes between when your brain predicts an object to be something and when it confirms that prediction. Ever stare at something for a long time thinking it's one thing or another, not completely sure until it clicks? Computers would hang, freeze, or give up when presented with such a task. With this chip it can be done much faster. The military needs it to automate UAVs and all kinds of equipment.
dangiankit
5 / 5 (1) Aug 20, 2010
The application example of Amazon doesn't relate well. I am curious to understand the advantages of such a chip. Anyone?
marcin_szczurowski
not rated yet Aug 20, 2010
It's all mystification. Inside there's a token or an RNG ;]
CSharpner
not rated yet Aug 21, 2010
Sounds like an analog computer. I haven't heard any chatter about those in a while.
TabulaMentis
not rated yet Aug 22, 2010
D44X:

The article says the probability chips will not use conventional 1's and 0's.

CSharpner is right when they say it sounds like an analog computer.

Anyone know what the human brain uses besides logic and probabilities, even if it is a really far-out theory?
GrayMatter
not rated yet Aug 22, 2010
A chip based on probability tech? Maybe we can use it to predict future lottery results. :)
tpq
not rated yet Aug 23, 2010
So where are the comparisons to quantum computing..? To me this sounds like an imitation of quantum computers but still done with 0's and 1's (thus slow), or am I totally wrong?

Granted, we might need a while for quantum computing to really appear
Skeptic_Heretic
not rated yet Aug 23, 2010
Looks like we're simply moving towards a chip design that allows for qualitative processing in addition to quantitative.

Very interesting.
TabulaMentis
not rated yet Aug 23, 2010
So where are the comparisons to quantum computing..? To me this sounds like an imitation of quantum computers but still done with 0's and 1's (thus slow), or am I totally wrong?
Granted, we might need a while for quantum computing to really appear

A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.
Javinator
not rated yet Aug 23, 2010
A chip based on probability tech? Maybe we can use it to predict future lottery results. :)


Unfortunately all lottery picks are equally probable (or improbable depending on how you look at it)
Skeptic_Heretic
not rated yet Aug 24, 2010
A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.

Not really. A quantum computer would use superpositions. So a bit would be either 0, 1, or a superposition of both 0 and 1.
TabulaMentis
not rated yet Aug 24, 2010
A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.

Not really. A quantum computer would use superpositions. So a bit would be either 0, 1, or a superposition of both 0 and 1.

Yes, but we are both right.
Maybe I should have said a quantum computer uses 0's and 1's and superpositions in between, but I have heard it said other ways.
Skeptic_Heretic
not rated yet Aug 24, 2010
Yes, but we are both right.
Maybe I should have said a quantum computer uses 0's and 1's and superpositions in between, but I have heard it said other ways.
Superpositions are not "in between", they are both at once. It's a little different, but different enough to be more than a semantics argument. For example, a superposition isn't 0.25 or 0.5; it is both 1 AND 0. If you could liken it to a light switch you'd have off, on, and this weird position where the light was both off and on at the same time.
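The distinction the comments are circling can be sketched numerically (an illustration with hypothetical helper names, not from the article): a qubit is described by two complex amplitudes, one for |0> and one for |1>, and measurement yields each outcome with probability equal to the squared magnitude of its amplitude.

```python
import math

def measure_probs(a: complex, b: complex) -> tuple[float, float]:
    """For a qubit state a|0> + b|1>, return the probabilities of
    measuring 0 and 1 (squared magnitudes, normalized to sum to 1)."""
    norm = abs(a) ** 2 + abs(b) ** 2  # 1 for a properly normalized state
    return abs(a) ** 2 / norm, abs(b) ** 2 / norm

# A definite bit: amplitude entirely on |0>.
print(measure_probs(1, 0))  # (1.0, 0.0)

# An equal superposition: "both 0 and 1 at once", but each outcome
# appears with probability 1/2 when measured.
p0, p1 = measure_probs(1 / math.sqrt(2), 1 / math.sqrt(2))
print(p0, p1)  # 0.5 0.5
```

So a superposition carries more information than a single probability: the amplitudes also have phases, which is what lets quantum algorithms interfere outcomes, and what a classical probability chip does not do.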
TabulaMentis
not rated yet Aug 24, 2010
You are correct. I do not memorize/retain all of that stuff. I was trying to answer TPQ's question/statement above.
I am glad you straightened that one out.
Thanks.