(Phys.org) -- In the model astrophysicists and astronomers use to describe the history of the universe, the ultimate starting point is the Big Bang, believed to have occurred some 13.7 billion years ago. After that the picture grows murkier: at some point atoms formed, then stars, and then entire galaxies. The timeline for these formations has been difficult to pin down because there is so little evidence for researchers to examine, though most agree that the first stars likely appeared somewhere in the neighborhood of a hundred million years after the Big Bang. Now a team of researchers, using a computer simulation described in their paper published in the journal Nature, may have found a way to detect the signature of the very first stars to form.
In building the simulation, the team drew on theories suggesting that during the early stages of the universe's development, dark matter and ordinary (baryonic) matter began moving at different speeds because of how each interacts with light: dark matter is believed to be largely unaffected by radiation, whereas baryonic matter gets pushed when struck. Because this velocity difference arose during the era when the first stars were forming (as clouds of gas clumped together under their collective gravity and the gravitational pull of dark matter), it would have suppressed star formation in some regions, leading to a "lumpier" distribution of early stars than previously thought.
The researchers used the relative speeds of dark and baryonic matter, together with the way they believe radiation emitted by the earliest stars would have affected those that came after, to calculate that the first stars likely appeared some 180 million years after the Big Bang. Their simulation then showed that the resulting signal from those first stars, carried in the 21-centimetre line of the hydrogen gas around them and stretched by cosmic expansion, should be detectable on Earth today in the 50 to 100 megahertz range. Unfortunately, no radio telescope currently operates across this range, so the team suggests that one such as the Murchison Wide-field Array in Australia be modified to detect such signals. Doing so, they say, would let researchers listen for the clues left behind by the lumpiness of the early star distribution and its gas interactions, allowing them for the first time to detect the presence of those first stars.
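As a rough check on why the signal falls in that band: the 21-centimetre hydrogen line has a rest frequency of about 1420.4 MHz, and light emitted at redshift z arrives stretched by a factor of (1 + z). A minimal sketch of that arithmetic (the constant and function names here are illustrative, not from the paper):

```python
# Rest frequency of the neutral-hydrogen 21-cm spectral line, in MHz.
F_REST_MHZ = 1420.405751


def observed_frequency_mhz(z: float) -> float:
    """Frequency at which 21-cm emission from redshift z is observed today."""
    return F_REST_MHZ / (1.0 + z)


def redshift_of_frequency(f_obs_mhz: float) -> float:
    """Redshift whose 21-cm emission arrives at the given observed frequency."""
    return F_REST_MHZ / f_obs_mhz - 1.0


# Emission from redshift 20 (cosmic age ~180 million years) arrives near 67.6 MHz,
# comfortably inside the proposed 50-100 MHz observing band.
print(f"{observed_frequency_mhz(20):.1f} MHz")

# Conversely, a 50-100 MHz band spans roughly redshifts 27 down to 13.
print(f"z = {redshift_of_frequency(50):.1f} to {redshift_of_frequency(100):.1f}")
```

This is why the paper's proposed instrument must operate below the roughly 80 MHz lower limit of many existing arrays: the era of the first stars sits at redshifts where the 21-cm line lands in this low-frequency window.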
More information: The signature of the first stars in atomic hydrogen at redshift 20, Nature (2012), doi:10.1038/nature11177. http://www.nature.com/nature/journal/vaop/ncurrent/full/nature11177.html
Dark and baryonic matter moved at different velocities in the early Universe, which strongly suppressed star formation in some regions [1]. This was estimated [2] to imprint a large-scale fluctuation signal of about two millikelvin in the 21-centimetre spectral line of atomic hydrogen associated with stars at a redshift of 20, although this estimate ignored the critical contribution of gas heating due to X-rays [3, 4] and major enhancements of the suppression. A large velocity difference reduces the abundance of haloes [1, 5, 6] and requires the first stars to form in haloes of about a million solar masses [7, 8], substantially greater than previously expected [9, 10]. Here we report a simulation of the distribution of the first stars at redshift 20 (cosmic age of around 180 million years), incorporating all these ingredients within a 400-megaparsec box. We find that the 21-centimetre hydrogen signature of these stars is an enhanced (ten millikelvin) fluctuation signal on the hundred-megaparsec scale, characterized [2] by a flat power spectrum with prominent baryon acoustic oscillations. The required sensitivity to see this signal is achievable with an integration time of a thousand hours with an instrument like the Murchison Wide-field Array [11] or the Low Frequency Array [12] but designed to operate in the range of 50–100 megahertz.