A team led by Yale University researchers has created the first rudimentary solid-state quantum processor, taking another step toward the ultimate dream of building a quantum computer.

They also used the two-qubit superconducting chip to successfully run elementary algorithms, such as a simple search, demonstrating quantum information processing with a solid-state device for the first time. Their findings will appear in *Nature*'s advance online publication June 28.

"Our processor can perform only a few very simple quantum tasks, which have been demonstrated before with single nuclei, atoms and photons," said Robert Schoelkopf, the William A. Norton Professor of Applied Physics & Physics at Yale. "But this is the first time they've been possible in an all-electronic device that looks and feels much more like a regular microprocessor."

Working with a group of theoretical physicists led by Steven Girvin, the Eugene Higgins Professor of Physics & Applied Physics, the team manufactured two artificial atoms, or qubits ("quantum bits"). While each qubit is actually made up of a billion aluminum atoms, it acts like a single atom that can occupy two different energy states. These states are akin to the "1" and "0" or "on" and "off" states of regular bits employed by conventional computers. Because of the counterintuitive laws of quantum mechanics, however, scientists can effectively place qubits in a "superposition" of multiple states at the same time, allowing for greater information storage and processing power.

For example, imagine having four phone numbers, including one for a friend, but not knowing which number belonged to that friend. You would typically have to try two to three numbers before you dialed the right one. A quantum processor, on the other hand, can find the right number in only one try.

"Instead of having to place a phone call to one number, then another number, you use quantum mechanics to speed up the process," Schoelkopf said. "It's like being able to place one phone call that simultaneously tests all four numbers, but only goes through to the right one."

These sorts of computations, though simple, have not been possible using solid-state qubits until now in part because scientists could not get the qubits to last long enough. While the first qubits of a decade ago were able to maintain specific quantum states for about a nanosecond, Schoelkopf and his team are now able to maintain theirs for a microsecond—a thousand times longer, which is enough to run the simple algorithms. To perform their operations, the qubits communicate with one another using a "quantum bus"—photons that transmit information through wires connecting the qubits—previously developed by the Yale group.

The key that made the two-qubit processor possible was getting the qubits to switch "on" and "off" abruptly, so that they exchanged information quickly and only when the researchers wanted them to, said Leonardo DiCarlo, a postdoctoral associate in applied physics at Yale's School of Engineering & Applied Science and lead author of the paper.

Next, the team will work to increase the amount of time the qubits maintain their quantum states so they can run more complex algorithms. They will also work to connect more qubits to the quantum bus. The processing power increases exponentially with each qubit added, Schoelkopf said, so the potential for more advanced quantum computing is enormous. But he cautions it will still be some time before quantum computers are being used to solve complex problems.

"We're still far away from building a practical quantum computer, but this is a major step forward."

__More information:__ www.nature.com/nature/journal/vaop/ncurrent/full/nature08121.html

Source: Yale University

## plasticpower

A non-deterministic state machine? I thought that wasn't possible.

## Damon_Hastings

According to most of the dominant interpretations of QM, the entire universe is a non-deterministic state machine. But Einstein disputed this ("God does not play dice with the universe!"), and the jury is still out. No one really knows yet what superposition really means, or what the quantum wavefunction collapse really is (is it a real physical event, or merely a change in how humans conceptualize the system?) The dominant interpretation is that the wavefunction collapses when it is "observed". But does the universe actually recognize a physical separation between the observer and the observed? And what is an observer, anyway?

There are many conflicting interpretations of QM, some of which are deterministic, and some of which are not. And you'd need a PhD to really understand how these interpretations differ. QM itself provides relatively little guidance -- QM is, at its core, merely a set of equations that tell you what possible outcomes an experiment might have, and the probability of each. What's actually happening inside the experiment is anybody's guess, and there's no shortage of guesses. ;-)

I think of QM as being sort of like a set of statistical equations that tell you how often a coin toss will give you heads or tails under different circumstances. These equations regard the coin toss as a "random" event, but of course it's not really random -- in theory you could predict the outcome of each toss if you knew the coin's starting position, velocity, spin, etc. So it's only random until you discover and understand the underlying dynamics. And this leads me to question whether true "randomness" really exists. Will we perhaps one day discover a deeper set of equations which can perfectly predict supposedly "random" quantum events? No one knows. But the dominant interpretation models the universe as being inherently non-deterministic (i.e. random). You can compute the probability of any given outcome (and the probability curve itself evolves deterministically over time), but that's as much determinism as you get. As I understand it. :-)

## flashgordon

Talk about tip of the iceberg! Talk about the calm before the storm! This is like astronomically beyond the comprehension of the standard human being; or even some mathematical genius!

## Damon_Hastings

Yep, it would be really cool to have atom-sized computing elements! Eventually we might even use subatomic particles. And some say that even the fundamental particles are just the tips of an underlying spacetime "froth", sort of like whitecaps appearing on a lake. They say that the 99.9999999% of "empty space" between particles is composed of this froth. Can you imagine if we were able to craft that froth directly into some sort of computing array, without even involving anything that physicists of today would call a "particle"? It would be like hijacking whatever "machine" runs the universe itself, and making it run our own computations! Harnessing the raw computing power of the universe itself. The possibilities boggle the mind.

## Damon_Hastings

Yeah, I've always had a soft spot for the multiverse interpretation of QM, too. But it seems to have fallen into disfavor lately, what with Bohm and decoherence and such. Oh, well. String theory will probably blow them all away anyway, eh? ;-)

## probes

I am always forgetting people's phone numbers.

## Traveler

Quantum Computing Crackpottery:

http://rebelscien...ery.html

## superhuman

String theory is a dead end.

## Quantum_Conundrum

The analogy of making a single phone call to guarantee the correct number among 4 random recipients is simply hogwash.

Basically, in logical "computer terms" this is like searching an un-sorted, un-referenced array (the 4 un-labelled phone numbers) for an un-identified value (the correct friend) and getting it right on the first try as anything other than a fluke.

This is simply logically impossible, as there is no mechanism presented to explain this behaviour.

The following section assumes a one-molecule transistor (qubit) for simplicity.

1) One can conceptualize how an electron might conceivably have two states at once, but there is no novel mechanism for detecting the presence of both states in a single comparison. If the spin of an electron is both up and down, it would seem to require at least two separate comparisons to prove that.

2) Even though a 3-state transistor (my take on "up", "down" and "both") could theoretically perform multiple tests in a single step for certain very cleverly designed data objects, in general this would not apply to everything, and would only work with some very inventive data object and database design by the programmer. The quantum processor could never "magically" perform scores of calculations in a single step and output the results immediately the way quantum computing promoters often claim. For example, it simply cannot add 2+3=5 and 1+6=7 simultaneously on a one-processor, one-word machine, even if it is "quantum", as there is no way to know which "up," "down," or "both" applies to which data object from which algorithm.

3) Now in terms of data storage, power consumption, and processor time, theoretically, a quantum computer would be much better than a binary computer. This assumes a non-volatile RAM (spintronic/qubit transistors) and a non-volatile spintronic secondary storage(data stick w/ non-volatile qubit transistors.)

in binary

8 bits = 1 byte = 256 values 0-255

32 bits = 4 bytes = 4294967296 values 0 upward

in qubits

6 qubits = 729 values from 0-728

24 qubits = 282429536481 values 0 upward
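The counts quoted above (which treat a "qubit" as a classical three-valued digit, the commenter's own framing rather than standard usage) check out arithmetically:

```python
# Sanity-check of the base-2 vs base-3 value counts quoted above.
print(2 ** 8)    # 256 values for 8 bits
print(2 ** 32)   # 4294967296 values for 32 bits
print(3 ** 6)    # 729 values for 6 three-valued digits
print(3 ** 24)   # 282429536481 values for 24 three-valued digits
```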

Applications:

Power:

Because far fewer transistors are needed to store the same size number, or to handle the same number of symbols, data sets, and command sets, you use far less power. A 24-qubit quantum processor uses only 2/3 as many transistors as a 32-bit binary processor, so right away the energy from "flipping switches" in the processor is cut by 1/3. Not counting that qubits are non-volatile if based on "spintronics," and thus use less energy anyway... potentially hundredths, thousandths, or even less of the energy of an electronic computer. This means less power used, and less wear and tear on the components, i.e. motherboards, processors, and RAM never burn out or overheat.

Math:

Allows for handling absurdly large (or small) numbers in far fewer steps of the processor, because very large numbers take up far fewer transistors in the base 3 allowed by quantum transistors than in the base 2 of electronic transistors (see figures above). While this may not appear hugely significant for small numbers, astronomically large or small numbers are represented by "multi-word" data objects and require very many cycles of the processor to add, subtract, multiply, or divide a single number. When done with data that fits into fewer transistors, and therefore fewer "words", this saves a tremendous number of operations.

Example:

3^48 > 2^64

3^128 >>> 2^192*

* this saves 64 transistors and stores a far larger number, and can then be managed in far fewer operations of the processor since it involves fewer "words" and so on.

Basically, the higher the "base," the fewer the operations you need; equal or fewer, never more.
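The two starred inequalities above can be verified directly:

```python
# Verifying the inequalities quoted above: 48 base-3 digits out-count
# 64 bits, and 128 base-3 digits out-count 192 bits.
print(3 ** 48 > 2 ** 64)    # True
print(3 ** 128 > 2 ** 192)  # True
```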

Text documents (including HTML and scripting languages):

It's a bit lengthy to explain, but I have concocted a potential data storage and compression algorithm which would work WONDERS in qubits and which is physically impossible to do in binary.

Imagine that "qubyte" above, which was only 6 qubits. Now in a text document, it can store almost 3 times as many possible symbols as a byte, and is yet 2 transistors smaller!

A typical algorithm in data compression is to make a single bit 0 or 1 which indicates whether the following byte is a "real text" character or whether it is a symbol for a "compressed text" found in the symbol table. Well, since in base three the preceding qubit can have 3 possibilities interpreted as 0, 1, or 2, we can literally have twice as large a symbol table (column 1 or column 2). Perhaps "column 1" holds "standard" compression items, such as common phonetics, words, and syllables ("Th", "The", "to", "an", "es", "ing", etc.). Basically, this becomes a "codebook" that is common to all computers. Meanwhile, some or all of column 2 might be "custom" compression symbols, such as words that are especially common or unique to the text in mind (proper names, rare scientific words or acronyms which are used repeatedly in the text, etc.). In reality, you would only need a few dozen such entries, as it would be better to use all of the possibilities for the most common words, prefixes, and suffixes in the relevant language.

Thus, uncompressed text documents would literally be at least 25% smaller in "qubytes" to begin with as compared to text in bits. Then the EXACT SAME basic data compression concept could allow most text documents to be compressed to perhaps as little as 10-20% or less in some cases.

In terms of data compression, the reason this is superior to the same basic concept in binary is this (hopefully, this is understandable to the reader):

1) 6 qubits fits on 25% fewer transistors than 1 byte, and yet has almost 3 times the possible symbols.

2) 1 qubit "flag" allows us to identify not only whether a symbol is compressed or whether it is plain text, but also, in the same step, tell the processor which table to use to decompress if it indeed is compressed. (In binary, since you need extra bits as part of the "flag", this would have more and more overhead to perform the same algorithm. Base 3 has literally half the overhead in this case.)

## ler177

Well, can this device actually perform the operation stated, or can't it?

Any theory dealing with such high energies can't in the foreseeable future be tested, and so will always be accused by some of just being 'philosophy.'

## Quantum_Conundrum

Anyway, there are certain physical benefits to a quantum computer, but I have never been convinced on the "do multiple calculations simultaneously" part. It just doesn't hold water.

## Quantum_Conundrum

===

The answer is "no, it can't. No computer ever could."

Just because a transistor can be in two states simultaneously does not mean it can actually store two separate data simultaneously, nor does it mean it can be used to perform two separate tests or operations simultaneously (as stated, with a very few very specialized exceptions.)

The classic notion of a quantum computer performing millions of calculations simultaneously, or solving all possible solutions to an equation simultaneously, or even "dialing all 4 numbers simultaneously and only the correct one answers," are all science fiction.*

There is no basis whatsoever for the claims made in that article, as there is no mechanism, real or imaginary, which could actually facilitate this with any degree of reliability.

* even the programming language to attempt to handle any such claims is ridiculous.

variable initializations:

$N = 1;

$A = 0;

$M = 1 and 0;

So what is $N + $M? What is $A + $M?

What happens on a classical "if" statement which is checking "M"? Do you literally split "threads" and handle both possibilities as legitimate outcomes, even though this may become nonsensical and conflicting?

## Quantum_Conundrum

It CAN DO the following things (mostly due to being able to store larger numbers or more symbols on the same number of transistors.)

1) CAN: Perform the same mathematical operation POTENTIALLY in fewer steps, depending on how large or small the numbers are which you are working with.

For small, ordinary numbers that most computer software uses (small loops, few entries in a database, small-number math, etc.), this would be insignificant or even entirely non-beneficial.

2) CAN: Store the same text document in at least 25% less space while uncompressed, and possibly 90% less space while compressed. Almost anyone who uses the internet would benefit from this.

3) CAN: Reduce the size of all "one word" instruction sets by 25%, and also potentially reduce SOME multi-word instruction sets to fewer words than they currently are (highly data and/or application specific, but works much like the "very large or small numbers math" idea above).

Things it CANNOT do (because there is no logical mechanism to facilitate them.)

A) CANT: perform multiple operations on the same processor simultaneously. This is pure science fiction. To demonstrate why, just imagine the base 3 number system.

0=0

1=1

2="both"

Now imagine our quantum processor trying to do two math problems simultaneously, as science fiction (and crackpottery) claims. To keep this very simple we consider the math operations:

1 plus 2 = 3

3 plus 4 = 7

For simplicity, I'll just look at the first 4 bits.

usual binary: (p is short for the "plus" sign.)

equation A: 0001 p 0010 = 0011

equation B: 0011 p 0100 = 0111

The problem with the "superposition" interpretation appears here.

"quantum" (assuming binary, to try to demonstrate the paradox of alleged simultaneous operations.)

The problem comes in because: is zero really zero?

0 plus 0 = 0

but in binary, 1 plus 1 = 10

Is the "2" ("both" state) that way because equation A has a 1 and B has a 0, or is it because A has a 0 and B has a 1? It makes a WORLD of difference, and it is not possible to detect which is the case without additional operations.(in fact, more operations than two seperate simple binary additions.)

To further illustrate this paradox and its absurdities: (Q tells us which qubit we are looking at, A and B are data results of the respective equations, "2" represents "both", C is the value in the processor outputs)

Q 123456

A 000011

B 000111

C 000211

Now the problem is, C cannot actually tell us whether A's 4th qubit is 0 or 1, only that between A and B both 0 and 1 are represented. Thus the processor has a wrong result for BOTH numbers 50% of the time (actually, it has a wrong result for both numbers 63 out of 64 times, which makes it worse than guessing).
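The loss of information the comment describes is easy to reproduce in the commenter's own classical three-valued encoding, where "2" only records that the two streams differ (`merge` is an invented name; this models the classical strawman, not actual quantum mechanics):

```python
# In this classical encoding, "2" marks positions where A and B differ,
# which discards the information of *which* stream held the 1.

def merge(a_bits, b_bits):
    return [a if a == b else 2 for a, b in zip(a_bits, b_bits)]

A = [0, 0, 0, 0, 1, 1]
B = [0, 0, 0, 1, 1, 1]
print(merge(A, B))  # [0, 0, 0, 2, 1, 1]
print(merge(B, A))  # [0, 0, 0, 2, 1, 1] -- identical, so A and B
                    # cannot both be recovered from the merged digits
```

This is a fair point about the three-valued-digit model itself; an actual qubit register stores complex amplitudes, not a per-position "both" marker, which is where the model and quantum mechanics part ways.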

B) CANT: search an un-sorted, non-referenced array in one step with a guaranteed correct outcome, in spite of what this article implies with the "phone number" example. That is just nonsensical hogwash. This would not be possible EVEN IF "A" above were possible. This would not be possible even if the uncertainty principle were not true. With the uncertainty principle, even if you COULD perform this operation successfully just one time, you will have destroyed all four data objects (phone numbers) in the process, which is bad for future use of any of the data in question.

## pfau

To the best of my knowledge, what D-Wave did was build a regular analog computer using the coherent state of electrons in a superconductor, which behave classically even though they can only exist due to quantum mechanical effects. So they couldn't implement any true quantum algorithms, but they marketed it as "quantum computing" to get press. This sounds like the real deal however.

## Quantum_Conundrum

This article doesn't even explain what its alleged "algorithms" were. What sort of algorithm could actually be run on a 2 qubit machine which actually does something complicated enough to test any of the science(fiction) quantum computing concepts I've addressed? In the base 3 interpretation, a 2 qubit processor can only handle numbers 0 through 8 as a single "word", so if by "algorithm" they mean adding 3 plus 5 equals 8, that is very unimpressive.

If they mean sorting a randomized array of 100 terms in verifiably fewer operations than is physically possible with a 2 bit binary computer, now that would be an "algorithm," but I doubt anything that useful was actually done here, else they would have presented it.

Are they actually claiming that this 2 qubit processor has already run an "algorithm" which someone invented to search an un-sorted array of 4 terms, making only one comparison, and getting it right every time on the first try?

Let's see the algorithm, and let's actually see some sort of physical proof of this absurd claim.

## flashgordon

Quantum Computing Crackpottery:

http://rebelscien...ery.html"

Wow, one could give up after seeing somebody who's so far down their path of thought like this . . . trying to explain anything to anybody about science. I mean this guy sounds like the people who couldn't look through the Galileo telescope because "why doesn't the air get left behind", "why doesn't the earth fall apart", "why do I stay firmly planted to the solid earth if it is moving like this"?

I've tried to explain things like this to "CRN/FORESIGHT" people . . . that the only way for humanity to grow up and get past . . . the past . . . is to be free; to be able to get together with those who see and understand things, and to be able to get away from those who don't and will not because they are wrapped up in their world; I mean I'm not even talking about fear mongers here; I'm just talking about people who think but cannot, and will always refuse to think about new ideas. But no, we've got to bind everybody up here on earth because damn if some humans see that 1) humans are the technologically dependent species, and 2) we are about becoming transhuman. But Eric Drexler, Chris Phoenix, Bill Joy, Mike Treder, Ralph Merkle, and all the rest I can only suppose, since nobody criticizes them!

Like I've said before: you don't want to think about things scientifically, I don't want to hear your problems; just shut up; and when push comes to shove in the future (it will, with all those irrationalists out there, to the tune of practically six billion people all socially bound up with anti-science as a social grace . . . oh yeah, push will come to shove) . . . I don't want to hear about it.

## flashgordon

I know my family had the book of the author who came up with the famous quote "those who don't learn their history are doomed to repeat it," but it seems quite clearly to not be around anymore; it got lost, grew wings and flew away apparently. Bottom line here is what happened to Greek mathematical science due to Plato's restrictions of no experiment and compass-and-straightedge only will happen again. When the future comes around, we'll be living in a kind of Logan's Run; and, because the technology is so powerful and incomprehensible, we'll never leave the earth; and, because no knowledge is the final knowledge (CRN/Foresight guys/gals hate Gödel's theorems), problems will ensue both social/psychologically and 'technologically.'

## superhuman

This is just a bad excuse for a failure.

Low-energy physics has to be derivable from high-energy physics, just as Newton's laws are derivable from relativistic laws. There are plenty of low-energy phenomena which need to be explained by a proper theory of everything (which string theory aspires to be) and which should therefore allow for its testing. Things like the origin of mass, why masses and mixing angles are what they are, where the fine-structure constant comes from, why the Koide formula holds, why there are 3 generations of particles, where gauge symmetries come from, how entanglement works, what causes decoherence, how many forces there are, what constitutes dark matter and dark energy, what the nature of neutrinos is, whether gravitons really exist and in what form, and so on.

And as for theories which are genuinely beyond experimental testing, they should never be funded from public funds. Every successful human model has been based on experimental evidence; without such evidence to guide the way, it's pretty much guaranteed that the model will be wrong and the funding which went into it wasted. Therefore development of such theories has to be postponed until the time when proper experiments are within reach.

String theory is a cancer on the body of theoretical physics. It's a manifestation of a more general problem concerning science: increasing the number of scientists is not always a good thing. There are only so many bright and honest people with a genuine passion for a particular field. Lowering the barrier to entry to raise the number of scientists means allowing mediocre ones into the field, which results in lower overall quality and dilution of both talent and knowledge. Mediocre scientists have to produce publications, but being unable to tackle the real problems of the field they invent replacement problems: string theory, the multiverse, extra dimensions, the first 10^-40 seconds after the big bang, and other such speculative nonsense. Normally such "research" should not pass peer review, but with mediocre scientists far outnumbering the real ones it not only passes, it even becomes a hot and trendy area of "research"! What's more, this crap is then packed up in hype and sold to the ignorant public. A whole industry built on pseudoscience.

## Quantum_Conundrum

I would like few things more than for quantum computing as portrayed in this article and in science fiction to be true. However, it simply is neither plausible nor possible in this universe, nor any universe, which has any degree of causality whatsoever.

I have attempted (admittedly poorly,) to show above the physical and logical reasons why no processor could actually perform multiple calculations simultaneously, even if the principle of superposition is true. The most basic reason is that the input and output data from one calculation would necessarily corrupt the input and output data of the other simultaneous calculation, and as I demonstrated, there is no way to recover even one of the results, much less both of them.

Just think about this for a moment, can anyone reading this actually concoct a procedure or a logical algorithm which a quantum computer could execute to supposedly solve an equation for all roots in a single step, or make a single comparison and guarantee the correct result while searching an array?

## Quantum_Conundrum

Also, the Grover algorithm is still probabilistic and carries a percentage chance of error! This is therefore absurd, and amounts to rolling a set of fixed dice. You may have an increased chance of getting the right number, but you still have a chance of getting the wrong number too, which is unacceptable for real-world applications.

To me, the efficiency of an algorithm to solve the "worst case scenario" search or sort must be defined by the first iteration that is guaranteed to always have the right result, NOT the first iteration that "probably" has the right result.

That is simply an absurdly unacceptable standard, because in reality a "worst case scenario" in Grover's algorithm actually ends up requiring more iterations than even a linear search, because you have to check and re-check the results to be sure you haven't made a mistake; and even then, since future checks also have a probability of being wrong, your results can never be "guaranteed correct", which is functionally useless.

In a binary search on a regular computer, the WORST case scenario when searching an array of 2^N objects requires only N iterations.
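That worst-case count is the standard logarithmic bound; it can be checked by counting loop iterations in an ordinary binary search (the exact figure for this implementation is floor(log2 n) + 1 comparisons; `binary_search_steps` is an invented helper name):

```python
# Count comparisons made by a plain binary search over a sorted array.

def binary_search_steps(items, target):
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return steps
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return steps

items = list(range(2 ** 10))  # 1024 sorted items
worst = max(binary_search_steps(items, t) for t in items)
print(worst)  # 11, i.e. about log2(1024) comparisons
```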

Obviously, the best case scenario possible for any algorithm would be finding the right result always on the first iteration, but I still do not believe that has been proven possible for anything other than an array of one element, unless you already know the correct answer ahead of time(but then why search, simply de-reference and move on...).

But the worst case in the Grover algorithm (which was glossed over and ignored) is actually an infinite loop of always getting a different answer on each subsequent check, which CAN and WILL happen from time to time, which makes it useless.

Any probability less than unity is functionally useless in an algorithm.
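For reference, the success probability being argued about has a closed form: with one marked item out of N, Grover's amplitude after k iterations is sin((2k+1)·arcsin(1/√N)), so the error rate can be computed classically for small cases (`grover_success_prob` is an illustrative helper, not anything from the paper):

```python
import math

def grover_success_prob(n_items, iterations):
    # Closed-form success probability of Grover's search with one
    # marked item, simulated classically.
    theta = math.asin(1 / math.sqrt(n_items))
    return math.sin((2 * iterations + 1) * theta) ** 2

print(grover_success_prob(4, 1))  # 1.0 -- exact for N = 4, one iteration
print(grover_success_prob(8, 2))  # ~0.945 for N = 8, two iterations
```

For N = 4 one iteration succeeds with certainty, which is the case a two-qubit processor can demonstrate; for larger N the result is indeed probabilistic, and the usual remedy is to verify a candidate answer (one cheap oracle check) and rerun on failure.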

## Quantum_Conundrum

It's science fiction.

## lomed

However, the probability amplitudes of any given processes in the universe (as far as I know) change deterministically/predictably. So, given a large enough set of occurrences of a process, the proportion of each possible outcome can be predicted very precisely (as with most statistical observations, the more instances the smaller the uncertainty in the proportions.) So, uncertainty produces unpredictability on small scales, but does not preclude a large measure of determinism on large scales.

## KBK

It was publicly announced and seen back in 1946 that for every year that went by, black ops scientific research moved 50 years ahead of what the public sees. The money, effort and manpower in operations that were going black at the end of the war were huge. Especially when 100,000 TONS of data (paperwork, etc.) came to the US from the black ops of Germany, along with the scientists. Start digging; the evidence is starting to come out.

Much of it has been found since the end of the cold war and the dropping of the wall. With that, much evidence that was suspected was expanded tremendously by data found in Eastern European archives, archives that were not as secret as the US would have wished, as their 'stories' about how the war ended and what was done, who was in glory, who created what, etc., all those stories are being undone by this data. And the anomalies in science, like the transistor, the laser, etc., all that 'history' is being undone. What really went on is coming to the fore, and all the anomalous stories that were considered to be coming from nutjobs and conspiracy freaks are turning out to be based in facts. Maybe not as seen before the data came out, but more true than the fabricated US history that was created for public consumption. Most of it is very hair-raising, in the least.

One simple point among hundreds was that the Nazis were about 1 week away from sending rockets into orbit. One point out of hundreds. They had reduced circuitry to the point that a television transceiver was designed that was as small as a shoebox, and was used for video and for guided missiles of extreme accuracy. 3000-mile-range ballistic missiles, TV-guided. 1 week away. And there's more. Much more.

The point being, that I -JUST- read about a scientist that recently came forward and was involved in the modern version of these works... the place where all those missing $trillions went that Donald Rumsfeld reported as missing from the Pentagon on Sept. 10, 2001. Yes, he reported that on the day before. $2.2 Trillion that they could not account for. Utter bullshit. Tip of the iceberg, it is.

And, to add: whatever hit the Pentagon killed the accountants and the entire independent accounting party that was investigating the missing funds and exactly where they were located.

Killing a few people to hide the biggest secret enterprise on the planet would be a simple choice, would it not? So it was done. That's why no plane, but a penetrating missile, as the evidence shows. As the ex-head of the US military intelligence services recently said flat out, and that is Stubblebine. The video is on Youtube; it just went up.

The people killed at the Pentagon were the accountants looking into the Pentagon records that would expose over $2.2 trillion in deep black ops.

Look it up, if you don't believe it.

And that was merely the stuff that was on the books, never mind the stuff that was and is off the books.

The point being, is that this scientist works for... wait for it... Los Alamos. And he came forth with an easy-to-manufacture quantum chip that was more powerful than 10,000 of the fastest PCs that exist today. All on one sample quantum chip. Only part of what he showed.

The story goes that the speed that the black ops has been moving forward with is now down to 44 years ahead, for every real year in public life.

This is due to being unfettered and dedicated, excited by discovery, etc., and using their own discoveries to leapfrog ahead; not an uncommon situation, and real scientists and researchers will understand this point that I speak of. A held secret of an advance in science can enable one to seemingly leapfrog well ahead of others, AS LONG AS THE SECRET IS HELD!

So yes, if one goes looking for this data, they will find it.

This sort of thing is the reason that NASA is such a failure, as there are black ops that have been well beyond NASA for a very long time.

NASA gets $10-20 billion to do a public 'song and dance', and the black ops get $100-200 Billion to move ahead with their designs, which are already far beyond what is seen publicly.

Start looking. It's out there.

## KBK

I just noticed that.

That's the public scholastic arm of the -same people- who brought the 3000 Nazi scientists to the US to work in the black ops programs.

So a quantum chip shows up in the same area as they are doing all the black ops in? Utter bullshit, like I said; they've had much, much more than that... for a very, very long time.

You are being given, being 'gifted' a -- toy. It's being inched out to you.

Ask yourself why.

## bmcghie

Also, nice post superhuman.
