Computer Chip That Computes Probabilities and Not Logic
Lyric's Error Correction Chip for Flash Memory. Credit: Lyric Semiconductor
(PhysOrg.com) -- Lyric Semiconductor has unveiled a new type of chip that uses probability inputs and outputs instead of the conventional 1's and 0's used in today's logic chips. Crunching probabilities directly is a better fit for many of the computing tasks performed today than binary logic.

Ben Vigoda, CEO and founder of Lyric Semiconductor, has been working aggressively on this technology since 2006, with partial funding from the U.S. Defense Advanced Research Projects Agency (DARPA). DARPA is interested in using this technology in defense applications that involve information that is not clear cut, where probability calculations can be used to reach a conclusion.

Because probability calculations are used in so many products, there are many potential applications. Ben Vigoda stated: "To take one example, Amazon's recommendations to you are based on probability. Any time you buy from them, the fraud check on your credit card is also probability based, and when they e-mail your confirmation, it passes through a spam filter that also uses probability."

Conventional chips have transistors arranged in digital NAND gates, which are used to implement digital functions using 1's and 0's. In a probability processor, transistors are instead used to build Bayesian NAND gates. Bayesian probability is a branch of mathematics named after the eighteenth-century English statistician Thomas Bayes.
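The kind of Bayesian update such gates would compute natively can be sketched in ordinary software, where a conventional CPU does the same arithmetic serially. A minimal sketch, using the article's spam-filter example; all numbers here are invented for illustration, not Lyric's actual method:

```python
# Bayes' rule: P(H|E) = P(E|H) * P(H) / P(E)
# A probability processor computes updates like this in hardware;
# here we do the same arithmetic in plain Python.

def bayes_update(prior, likelihood, evidence_rate):
    """Posterior probability of a hypothesis given observed evidence."""
    return likelihood * prior / evidence_rate

# Illustrative spam-filter numbers (assumptions, not measured data):
p_spam = 0.2              # prior: 20% of mail is spam
p_word_given_spam = 0.6   # the word "free" appears in 60% of spam
p_word_given_ham = 0.05   # ...and in 5% of legitimate mail

# Total probability of seeing the word at all:
p_word = p_spam * p_word_given_spam + (1 - p_spam) * p_word_given_ham

posterior = bayes_update(p_spam, p_word_given_spam, p_word)
print(round(posterior, 3))  # -> 0.75: the word triples the spam odds
```

The point of the chip is that chaining many such updates through a network of probabilistic gates happens in parallel in the hardware itself, rather than one multiply-and-divide instruction at a time.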

Lyric Semiconductor plans to have prototypes of its general-purpose probability chips operational within three years. A smaller flash-memory error-correction chip based on the technology is available for license this week, and the company plans to have such flash memory chips in portable devices like tablets and smartphones within two years.



More information: Lyric Semiconductor
Via: Technology Review

© 2010 PhysOrg.com

Citation: Computer chip that computes probabilities and not logic (2010, August 19) retrieved 22 April 2019 from https://phys.org/news/2010-08-chip-probabilities-logic.html
This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without the written permission. The content is provided for information purposes only.

User comments

Aug 19, 2010
So, looks like the Infinite Improbability Drive will reach market before warp technology.

Aug 19, 2010
As one nerd to another I say, well played sir!

Aug 19, 2010
Lol... see you at DragonCon in two weeks!

Aug 19, 2010
Agggggghhhhhhhhhhhh

wrong, wrong, wrong, wrong, wrong

It's a different type of processor architecture, that's all. It'll still use 0s and 1s. However, it may lead to new types of CPU architecture which incorporate a PPU (probability processor unit) - assuming the design could be optimised so that the inevitable slowdown from the extra gates needed to incorporate such a processor wouldn't outweigh the positive effects of having a very minimised design. Most often in CPU design, less is more - just look at RISC vs CISC, for example.

Also, not all minimised chips are made up of NAND gates (in fact most aren't) - they are made up of whatever types of gate the engineer thought best to use!

Aug 19, 2010
This sounds interesting in principle, in that it may find use in hardware based neural AI situations (as opposed to software based simulations), but I don't know how useful it will be in general computing (such as the Amazon example cited), as conventional chips are perfectly well capable of doing probabilistic/Bayesian computations.

Aug 20, 2010
It could very well be a fuzzy logic circuit of some sort. It doesn't need to do boolean algebra, therefore doesn't need conventional transistors arranged in gates.

I think the idea is to make them really fast. Your brain predicts objects before fully recognizing them. It will either confirm the prediction or correct itself, but a significant amount of time passes between when your brain predicts an object to be something and when it confirms that prediction. Ever stare at something for a long time, thinking it's one thing or the other, but not completely sure until it clicks? Computers would hang, freeze or give up when presented with such a task. With this chip it can be done much faster. The military needs it to automate UAVs and all kinds of equipment.

Aug 20, 2010
The Amazon application example doesn't relate well. I am curious to understand the advantages of such a chip. Anyone?

Aug 20, 2010
It's all mystification. Inside there's a token or an RNG ;]

Aug 21, 2010
Sounds like an analog computer. I haven't heard any chatter about those in a while.

Aug 22, 2010
D44X:

The article says the probability chips will not use conventional 1's and 0's.

C Sharperner is right when they say it sounds like an analog computer.

Anyone know what the human brain uses besides logic and probabilities, even if it is a really far-out theory?

Aug 22, 2010
A chip based on probability tech? Maybe we can use it to predict future lottery results. :)

tpq
Aug 23, 2010
So where are the comparisons to quantum computing? To me this sounds like an imitation of quantum computers, but still done with 0's and 1's (thus slow) - or am I totally wrong?

Granted, we might need to wait a while for quantum computing to really appear

Aug 23, 2010
Looks like we're simply moving towards a chip design that allows for qualitative processing in addition to quantitative.

Very interesting.

Aug 23, 2010
So where are the comparisons to quantum computing? To me this sounds like an imitation of quantum computers, but still done with 0's and 1's (thus slow) - or am I totally wrong?
Granted, we might need to wait a while for quantum computing to really appear

A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.

Aug 23, 2010
A chip based on probability tech? Maybe we can use it to predict future lottery results. :)


Unfortunately all lottery picks are equally probable (or improbable depending on how you look at it)

Aug 24, 2010
A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.

Not really. A quantum computer would use superpositions. So a bit would be either 0, 1, or a superposition of both 0 and 1.

Aug 24, 2010
A quantum computer uses 0's and 1's, and values in between, or something like that. Eventually, quantum computers will use 0's, 1's and 2's.

Not really. A quantum computer would use superpositions. So a bit would be either 0, 1, or a superposition of both 0 and 1.

Yes, but we are both right.
Maybe I should have said a quantum computer uses 0's and 1's and superpositions in between, but I have heard it said other ways.

Aug 24, 2010
Yes, but we are both right.
Maybe I should have said a quantum computer uses 0's and 1's and superpositions in between, but I have heard it said other ways.
Superpositions are not "in between"; they are both at once. It's a subtle difference, but different enough to make a semantics argument out of it. For example, a superposition isn't 0.25 or 0.5 - it is both 1 AND 0. If you could liken it to a light switch, you'd have off, on, and this weird position where the light was both off and on at the same time.

Aug 24, 2010
You are correct. I do not memorize/retain all of that stuff. I was trying to answer TPQ's question/statement above.
I am glad you straightened that one out.
Thanks.
