Stars as random number generators could test foundations of physics

May 16, 2017 by Lisa Zyga, feature

The proposed Bell test uses stars and quasars as random number generators to address the freedom-of-choice loophole and show that the quantum world does not obey local realism. Credit: Wu et al. ©2017 American Physical Society
Stars, quasars, and other celestial objects emit photons at random times, and scientists have now taken advantage of this randomness to generate random numbers at rates of more than one million numbers per second. Generating random numbers at very high rates has a variety of applications, such as in cryptography and computer simulations.

But the researchers in the new study are also interested in using these cosmic random number generators for another purpose: to test the foundations of physics by progressively addressing another loophole in the Bell tests. While Bell tests show that quantum particles are correlated in ways that cannot be explained by classical physics, the results may not be reliable if parts of these tests manage to take advantage of any kind of loophole.

The researchers, led by Jian-Wei Pan at the University of Science and Technology of China in Shanghai, have published a paper on using cosmic sources to generate random numbers in a recent issue of Physical Review Letters.

"We presented an experimental realization of cosmic random number generators (RNGs) and a realistic design of an event-ready Bell test experiment with these RNGs to address the freedom-of-choice loophole while closing the locality and efficiency loopholes simultaneously," coauthor Jingyun Fan said. "It will be of high interest to implement the proposed experiment in the near future."

In their work, the researchers used an optical telescope located at the Astronomy Observatory in Xinglong, China, to collect light from a variety of very bright and distant cosmic radiation sources. Some of these objects are more than a trillion times brighter than our Sun and located hundreds of millions of light-years away.

Since the time interval between photon emission events is random, the photons are detected by the telescope at random time intervals. The device has a time resolution of 25 picoseconds (a picosecond is one trillionth of a second). On average, a photon is detected about once every 100 nanoseconds, corresponding to more than a million photons detected per second. This rate is competitive with today's best random number generators, which use lasers as the photon source.
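The paper's exact bit-extraction scheme is not described in the article, but as a rough sketch, one standard way to turn a Poisson stream of photon detections into unbiased bits is to compare consecutive inter-arrival intervals (which interval is longer is a fair coin flip, even if the detection rate drifts slowly). The toy model below simulates such a stream with the article's figures: a 100-nanosecond mean interval quantized to 25-picosecond resolution.

```python
import random

def photon_bits(n_bits, mean_interval_ns=100.0, resolution_ps=25):
    """Toy model: exponential inter-arrival times (a Poisson photon stream),
    quantized to the detector's 25 ps resolution and turned into bits by
    comparing consecutive intervals (robust against slow rate drift)."""
    bits = []
    while len(bits) < n_bits:
        # two consecutive inter-arrival intervals, in picoseconds
        t1 = random.expovariate(1.0 / (mean_interval_ns * 1000))
        t2 = random.expovariate(1.0 / (mean_interval_ns * 1000))
        # quantize to the detector resolution; discard ties (von Neumann style)
        q1, q2 = int(t1 // resolution_ps), int(t2 // resolution_ps)
        if q1 != q2:
            bits.append(1 if q1 > q2 else 0)
    return bits

bits = photon_bits(10000)
print(sum(bits) / len(bits))  # should hover near 0.5
```

Because each bit consumes two detection events, this particular scheme would halve the raw rate; the actual extraction method used in the experiment may differ.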

In the second part of their study, the physicists proposed that this cosmic random number generator could be used to improve Bell tests. These tests aim to show that, unlike our observations of the classical world, the quantum world does not obey local realism—a concept that refers to a combination of locality (that objects cannot influence each other across large distances) and realism (that objects exist even before any measurement is made). Violating a Bell inequality shows that, at the quantum level, nature violates either locality or realism, or both.
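To make "violating a Bell inequality" concrete: in the common CHSH form of the test, each side chooses between two measurement angles, and a correlation statistic S is computed. Local realism bounds |S| at 2, while quantum mechanics predicts up to 2√2 for an entangled singlet state. The sketch below uses the textbook quantum correlation function at the standard optimal angles, not the event-ready scheme proposed in the paper.

```python
import math

def E(a, b):
    """Quantum correlation E(a, b) = -cos(a - b) for a singlet state
    measured at analyzer angles a and b."""
    return -math.cos(a - b)

# standard CHSH angle choices (radians)
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))  # 2*sqrt(2) ~ 2.828, exceeding the local-realist bound of 2
```

Any value of |S| above 2 rules out local hidden-variable models, provided the loopholes discussed below are closed.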

However, Bell tests have several loopholes. Typically, loopholes are ways for the objects being measured to secretly share information in a classical way in order to make it appear that local realism is violated when it is not. Although physicists have recently closed two of these loopholes (the locality loophole and detection loophole), there may always be some loopholes that can conceivably circumvent the restrictions of the Bell tests.

One such possibility is called the freedom-of-choice (or randomness) loophole. This loophole suggests that the detector settings—which are determined using random number generators—could have somehow been correlated even before the experiment began. Before now, it has been thought that these correlations could have occurred just a fraction of a second before the start of the experiment.

By using random number generators based on cosmic sources, the researchers showed that these correlations must have occurred before the photons left the stars, which is at least 3000 or so years before the experiment began—an improvement of more than 16 orders of magnitude. (A couple of months ago, a paper was independently published that restricted the correlations to at least 600 years in the past, using similar methods based on cosmic sources of light.)
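The "more than 16 orders of magnitude" figure can be sanity-checked with simple arithmetic, assuming the previous bound was on the order of a microsecond (the article says only "a fraction of a second"; the microsecond figure is an assumption, not from the paper):

```python
import math

# Back-of-envelope check of the "more than 16 orders of magnitude" claim:
# the correlation bound moves from a fraction of a second before the run
# to the photons' emission time, roughly 3000 years earlier.
seconds_per_year = 365.25 * 24 * 3600
lookback_s = 3000 * seconds_per_year   # ~9.5e10 seconds
previous_bound_s = 1e-6                # assumed microsecond-scale lab-RNG latency

orders = math.log10(lookback_s / previous_bound_s)
print(round(orders, 1))  # ~17.0, consistent with "more than 16 orders of magnitude"
```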

In addition, a third independent group of researchers has recently suggested that the time constraint for the freedom-of-choice loophole could be pushed back by billions of years by using very distant quasars as random number generators.

To further pursue this possibility, the researchers in the new study suggest that a satellite-based cosmic Bell experiment may achieve better results than Earth-based experiments because, for one thing, it would avoid atmospheric disturbances. They hope to further pursue such improvements in the future.

Explore further: Physicists demonstrate new way to violate local causality

More information: Cheng Wu et al. "Random Number Generation with Cosmic Photons." Physical Review Letters. DOI: 10.1103/PhysRevLett.118.140402



5 / 5 (2) May 16, 2017
Since the time interval between photon emission events is random, the photons are detected by the telescope at random time intervals.

Just a niggle: shouldn't expansion lead to a slight, steady increase in the average time between photon arrivals (assuming the average output of the source is constant, which is by no means a given)? While this effect is likely tiny—more pronounced the farther away the source—it would skew the numbers somewhat over time. Changes in the relative motion of the Earth and the source (during its orbit) should also lead to variability in the incoming numbers.
1 / 5 (2) May 16, 2017
Can't we just simulate the response using only an infinite set of diametrical parabolic spherical fields, within an infinite space, based upon Maxwell? Well, maybe define the matter distribution; the magnetic fields give a lot of information about a superimposed charge flow, the frequency distribution across the entire body, i.e. frequency digitization, polarity, for an idea of atomic composition. Anyway, if it can be modeled, why don't we examine the difference between the model and actuality, sort of a feedback loop? I would use a binary search along the whole variable space for abs[simulation - actual] = 0. If you don't get convergence to a well-defined minimum, you've got the wrong physics. juz say'n: got it, only 2 fields, simple truth.
3.7 / 5 (3) May 16, 2017
Using starlight to generate random, unrelated choices in a Bell test is certainly dramatic, but is it fundamentally more meaningful than a comparable and more easily implemented experiment based on Earth-based technology? For example, two independent random generators synchronized precisely to a global time standard and separated by relativistic effects, i.e., in each other's future light cones. See for example HyperGen on the Comscire website. The difference seems very subtle, with no theoretical basis.
5 / 5 (5) May 16, 2017
, but is it fundamentally more meaningful than a comparable and more easily implementable experiment based on earth-based technology?

Yes it is, because the entire point of one of the loopholes is that there could be some hidden variable that is influencing both sources of an experiment here on Earth (a common cause if you so will). But with two stellar sources they are so far apart that the generation of the photons that arrive here on Earth cannot have had a common cause, because it would take superluminal velocity to connect their generation processes.
not rated yet May 16, 2017
Couldn't there be some not yet discovered physics that render these less than perfectly random? Like maybe some other very very weak symmetry breaking, or an effect caused by random molecules or dust through space?
not rated yet May 17, 2017
Couldn't there be some not yet discovered physics that render these less than perfectly random?

There are various tests for randomness (there are even online sites where you can upload numbers from your own random number generator to test how good the randomness is)

an effect caused by random molecules or dust through space?

That would still fall under the speed-of-light rule, so that avenue is out. In any case if these molecules are random and the source is random then you still get something random as a result.

Like maybe some other very very weak symmetry breaking

A symmetry break is always a possibility since they aren't (as far as I know - please someone correct me if I'm mistaken) bound to the speed of light limit.
Da Schneib
not rated yet May 17, 2017
This looks like it has similarities to the Wheeler Delayed Choice experiment. I'll check out the paper.
Da Schneib
not rated yet May 17, 2017
@antialias, watch out for randomness tests. They're quite bad at detecting real randomness, though pretty good at detecting real non-randomness. In other words they are prone to false negatives, marking really random sources as non-random, but not very prone to false positives, marking non-random sources as random. We don't have a very good set of algorithms in this area. People should be careful assuming a source is really non-random based on failing these tests.
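To make the "tests for randomness" discussed above concrete, here is a sketch of the simplest one, the frequency (monobit) test from the NIST SP 800-22 suite. It also illustrates the caution about what a single test can and cannot show: a perfectly alternating sequence, which is clearly not random, still passes this particular test, while a biased sequence fails it.

```python
import math

def monobit_p_value(bits):
    """NIST SP 800-22 frequency (monobit) test: p-value for the null
    hypothesis that 0s and 1s occur with equal probability."""
    n = len(bits)
    s = abs(sum(1 if b else -1 for b in bits))
    return math.erfc(s / math.sqrt(2 * n))

balanced = [i % 2 for i in range(10000)]  # alternating 0101...: predictable, but balanced
biased = [1] * 6000 + [0] * 4000          # 60/40 bias

print(monobit_p_value(balanced))  # 1.0 -- passes despite being trivially predictable
print(monobit_p_value(biased))    # ~0.0 -- the bias is detected
```

A full assessment runs a battery of such tests (runs, serial correlation, entropy estimates, and so on), and even passing all of them only fails to find non-randomness rather than proving randomness.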
