Data discrepancies may affect understanding of the universe

June 6, 2018, University of Texas at Dallas
Why the expansion of the universe appears to be accelerating remains a mystery, but new research from UT Dallas may help shed light on it. Credit: NASA, ESA and the LEGUS team

One of the unsolved mysteries in modern science is why the expansion of the universe appears to be accelerating. Some scientists argue it is due to a theoretical dark energy that counteracts the pull of gravity, while others think Albert Einstein's long-accepted theory of gravity itself may need to be modified.

As astrophysicists look for answers in the mountains of data gathered from astronomical observations, they are finding that inconsistencies in that data might ultimately lead to the truth.

"This is like a detective story, where inconsistent evidence or testimony could lead to solving the puzzle," said Dr. Mustapha Ishak-Boushaki, professor of astrophysics in the School of Natural Sciences and Mathematics at The University of Texas at Dallas.

Ishak-Boushaki and his doctoral student Weikang Lin have developed a new mathematical tool that identifies and quantifies inconsistencies in cosmological data gathered by various scientific missions and experiments. Their findings could shed light on the cosmic acceleration conundrum and have a significant impact on our understanding of the universe.

Their most recent research, published last October in the journal Physical Review D, was presented June 4 at a meeting of the American Astronomical Society in Denver.

"The inconsistencies we have found need to be resolved as we move toward more precise and accurate cosmology," Ishak-Boushaki said. "The implications of these discrepancies are that either some of our current data sets have systematic errors that need to be identified and removed, or that the underlying cosmological model we are using is incomplete or has problems."

A Model Universe

Astrophysicists use a standard model of cosmology to describe the history, evolution and structure of the universe. From this model, they can calculate the age of the universe or how fast it is expanding. The model includes equations that describe the ultimate fate of the universe—whether it will continue expanding, or eventually slow down its expansion due to gravity and collapse on itself in a big crunch.

There are several variables—called cosmological parameters—embedded in the model's equations. Numerical values for the parameters are determined from observations and include factors such as how fast galaxies move away from each other and the densities of matter, energy and radiation in the universe.
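As an illustration of how such parameters feed the model's equations, the age of a flat ΛCDM universe can be obtained by integrating the Friedmann expansion rate over the scale factor. The sketch below assumes flatness and uses parameter values close to, but not exactly, published fits:

```python
from math import sqrt

# Illustrative flat-LCDM parameters (near Planck-era fits; exact
# published numbers differ slightly between analyses).
H0 = 67.8                # Hubble parameter today, km/s/Mpc
omega_m = 0.31           # matter density fraction
omega_l = 1.0 - omega_m  # dark-energy fraction (flatness assumed)

# 1/H0 in gigayears: 1 Mpc = 3.0857e19 km, 1 Gyr = 3.156e16 s
hubble_time_gyr = (3.0857e19 / H0) / 3.156e16

def age_of_universe_gyr():
    """Integrate dt = da / (a * H(a)) from a = 0 to a = 1 (today)."""
    n = 100_000
    total = 0.0
    for i in range(1, n + 1):
        a = (i - 0.5) / n                    # midpoint rule in scale factor
        e = sqrt(omega_m / a**3 + omega_l)   # H(a)/H0 for flat LCDM
        total += 1.0 / (a * e)
    return hubble_time_gyr * total / n

print(age_of_universe_gyr())  # ~13.8 Gyr
```

With these inputs the integral reproduces the familiar ~13.8-billion-year age, showing how directly the measured parameters translate into the quantities the article describes.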

But there is a problem with those parameters. Their values are calculated using data sets from many different experiments, and sometimes the values do not agree. Such disagreements point to either systematic errors in the data sets or shortcomings in the standard model itself.

"Our research is looking at the value of these parameters, how they are determined from various experiments, and whether there is agreement on the values," Ishak-Boushaki said.

New Tool Finds Inconsistencies

The UT Dallas team developed a new measure, called the index of inconsistency, or IOI, that gives a numerical value to the degree of discordance between two or more data sets. Comparisons with an IOI greater than 1 are considered inconsistent. Those with an IOI over 5 are ranked as strongly inconsistent.
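For two data sets with Gaussian parameter constraints, the papers define the two-experiment IOI as ½ δᵀ(C₁+C₂)⁻¹δ, where δ is the difference of the parameter means and C₁, C₂ are the covariance matrices. A minimal NumPy sketch of that measure, with entirely hypothetical means and covariances for illustration:

```python
import numpy as np

def ioi(mean1, cov1, mean2, cov2):
    """Two-experiment index of inconsistency for Gaussian constraints:
    IOI = 0.5 * delta^T (C1 + C2)^-1 delta, with delta = mu1 - mu2."""
    delta = np.asarray(mean1, float) - np.asarray(mean2, float)
    csum = np.asarray(cov1, float) + np.asarray(cov2, float)
    return 0.5 * float(delta @ np.linalg.solve(csum, delta))

# Hypothetical two-parameter constraints (matter density, H0) from two
# imagined experiments -- illustrative numbers, not real survey fits.
m1, c1 = [0.30, 68.0], [[0.01**2, 0.0], [0.0, 1.0**2]]
m2, c2 = [0.32, 67.0], [[0.02**2, 0.0], [0.0, 0.8**2]]

print(round(ioi(m1, c1, m2, c2), 3))  # 0.705 -> below 1, no flagged tension
```

Because the measure accounts for the full covariances, it can flag tensions between multi-parameter fits that one-parameter-at-a-time comparisons would miss.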

For example, the researchers used their IOI to compare five different techniques for determining the Hubble parameter, which is related to the rate at which the universe is expanding. One of those techniques—referred to as the local measurement—relies on measuring the distances to relatively nearby exploding stars called supernovae. The other techniques rely on observations of different phenomena at much greater distances.

"We found that there is an agreement between four out of five of these methods, but the Hubble parameter from local measurement of supernovae is not in agreement. It's like an outlier," Ishak-Boushaki said. "In particular, there is a clear tension between the local measurement and that from the Planck science mission, which characterized the cosmic microwave background radiation."
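In one dimension the IOI reduces to δ²/2(σ₁²+σ₂²). A sketch of the local-versus-Planck comparison using H0 values merely in the ballpark of the 2017-2018 results (not the paper's exact inputs):

```python
# One-dimensional IOI between two Gaussian measurements of one parameter:
# IOI = delta^2 / (2 * (sigma1^2 + sigma2^2)).
def ioi_1d(mu1, sigma1, mu2, sigma2):
    return (mu1 - mu2) ** 2 / (2.0 * (sigma1 ** 2 + sigma2 ** 2))

# H0 values roughly in line with contemporaneous results, km/s/Mpc --
# illustrative only.
local = (73.2, 1.7)   # local, supernova-based measurement
planck = (67.8, 0.9)  # Planck CMB-based inference

print(round(ioi_1d(*local, *planck), 2))  # prints 3.94
```

An IOI near 4 sits well above the threshold of 1 for inconsistency, though below the "strongly inconsistent" mark of 5, which matches the article's description of a clear but debated tension.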

To complicate matters, multiple methods have been used to determine that local measurement, and they all produced a similar Hubble value, still in disagreement with Planck and other results.

"Why does this local measurement of the Hubble parameter stand out in significant disagreement with Planck?" Ishak-Boushaki asked.

He and Lin also applied their IOI tool to five sets of observational data related to the large-scale structure of the universe. The cosmological parameters calculated using those five data sets were in strong disagreement, both individually and collectively, with parameters determined by observations from Planck.

"This is very intriguing. This is telling us that the universe at the largest observable scales may behave differently from the universe at intermediate or local scales," Ishak-Boushaki said. "This leads us to question whether Albert Einstein's theory of gravity is valid all the way from small scales to very large scales in the universe."

The UT Dallas researchers have made their IOI tool available for other scientists to use. Ishak-Boushaki said the Dark Energy Science Collaboration, part of the Large Synoptic Survey Telescope project, will use the tool to look for inconsistencies among data sets.

"These inconsistencies are starting to show up more now because our observations have progressed to a level of precision where we can see them," said Ishak-Boushaki, who published his first paper about the inconsistencies in 2005. "We need the right values for these parameters because it has important implications for our understanding of the universe."


More information: Weikang Lin et al, Cosmological discordances. II. Hubble constant, Planck and large-scale-structure data sets, Physical Review D (2017). DOI: 10.1103/PhysRevD.96.083532


23 comments


Gigel
4 / 5 (2) Jun 06, 2018
One possible solution:
https://arxiv.org...11.07058

It remains to be tested thoroughly.
theredpill
5 / 5 (1) Jun 06, 2018
""These inconsistencies are starting to show up more now because our observations have progressed to a level of precision where we can see them,"

This is a good thing, better to learn than to assume you already know.
ZoeBell
2.1 / 5 (11) Jun 06, 2018
Data discrepancies may affect understanding of the universe
I understand, what the author had on mind there - but data cannot affect something, which doesn't actually exist. Which understanding of Big Bang, inflation or metric expansion we actually have? Completely zero, no one can explain how such things should actually work, nobody can explain their physical mechanism. This is always warning sign in science.

The observation of discrepancies just reveals the LACK of understanding and ad-hoced epicycle character of contemporary cosmology. We continue in recognizing of universe in exactly the way, which medieval astronomers did: "if something moves around us, it just means, we are standing at place and Universe around us moves". There is zero introspection from Galileo case in science.
ZoeBell
1.4 / 5 (10) Jun 06, 2018
In dense aether model Universe is steady state and the red shift is explained by scattering of EM waves by quantum fluctuations of vacuum. This model is naturally wavelength dependent because the longer wavelengths tend to scatter less - so there is no problem with different speed of scattering of visible light from distant objects and with speed of scattering of microwave background. Which would lead into different values of red shift and Hubble constant observed at different wavelengths. Of course, if the space-time would expand, then indeed it would affect all wavelengths at the same way.
billpress11
1 / 5 (5) Jun 06, 2018
I agree with you ZoeBell that the observed red shift probably has another explanation. Below is a link to another explanation, simply light from opposite directions refracting light.
http://www.scribd...of-Physics
rrwillsj
3 / 5 (4) Jun 06, 2018
When anyone manages to construct an aether bomb and blow up a city? Or for that matter any working aether device? You all would get a lot more respect than your idle chattering has managed.

Heck, a working prototype of an aether-powered beer bottle opener! That meets the requirements of applications to the US Patent Office.

I really begrudge accepting SR/GR as reality. The reality of nuclear weapons at least gets them a respectful consider. I prefer my hypothesis of Chaotic Gravity with a Temporary (few trillion earth-years or so) Local (a hundred billion parsecs or so) Cosmos the BB as an accidental break with reality. Explaining why our observations make less & less sense the more we refine the technology.

Vector Gravity hypothesis may help to reorganize this mess into an coherent POV. If it is ever proven reasonably correct.

Until the next theorist comes along to prove we may not be wrong but in all empirical certainty, failing to grasp the new obvious.
ZoeBell
1.8 / 5 (5) Jun 06, 2018
simply light from opposite directions refracting light
This doesn't look like feasible theory for me, simple the less.
I prefer my hypothesis of Chaotic Gravity
How it does explain the red shift and its discrepancies? If in no way, why to bother with it right here?
cantdrive85
2.3 / 5 (10) Jun 06, 2018
"The implications of these discrepancies are that either some of our current data sets have systematic errors that need to be identified and removed, or that the underlying cosmological model we are using is incomplete or has problems."

If history is any indication, some ad hoc "dark" explanation will be conjured up to explain away the failures of theory.
ZoeBell
2.1 / 5 (7) Jun 06, 2018
@cantdrive85 Undoubtedly, because the astronomers already noted that young galaxies are less rich of dark matter (they don't know why), whereas older areas of Universe look more rich of dark matter (they don't know why) - so that dark matter lensing around more distant objects tends to compensate mutually, which introduces a systematical bias into red shift of remote objects. The scientists are getting wet once they can combine math from effects, which have no explanation yet, because they're getting a feeling, they're important and useful - while they still didn't lost the evasion for further research. The annoying thing about all real explanations is that it also ends the further research in their matter.
julianpenrod
1 / 5 (7) Jun 06, 2018
Again, the "acceleration" is non existent.
Consider Perlmutter's "experiment". They used the Hubble Constant to determine the speed of a galaxy 5 billion light years away. To begin with, if the constant was constant, that means there can't have been any acceleration in the past 5 billion years or further galaxies, seen in past eras, would be moving more slowly than a Hubble Constant would allow. Using the observed speed of the galaxy by the Doppler Shift, Perlmutter calculated the distance as 5 billion light years by the Hubble Constant. But, then, Perlmutter says the light from a Type Ia supernova in the galaxy is too faint, so the galaxy must be further away than 5 billion light years, so it must be traveling faster. Perlmutter is saying it must be traveling faster than its Doppler Shift indicates.
ZoeBell
1.7 / 5 (6) Jun 06, 2018
@JulianPenrod: this is just the example confusion which follows from assumption of expanding space-time between galaxies and supernovas, which actually stay at place - just the light traveling from them dilates in its wavelength. But it was Edwin Hubble - the founder of the red shift himself - who first pointed to it. He was thus smarter and less biased than the whole crowd of his blind parrots and followers - which is also typical for many frontiers, because being first in something suggests additional qualities: like the ability to think independently.
danR
5 / 5 (4) Jun 06, 2018
It's not helpful to attempt to replace problematic but otherwise robust theories with outlier conjectures that aren't even wrong.
ZoeBell
1.8 / 5 (4) Jun 06, 2018
It's not helpful to attempt to replace problematic but otherwise robust theories with outlier conjectures that aren't even wrong
This is standard approach in science - the wrong theories will get always replaced by something, which doesn't look very convincing at the first sight. This is just the inherent property of all breakthroughs, that they don't look very convincing at their beginning - the ideas which look self-evident from scratch belong into trivial gradualist epicycle based progress, which often hits its own limit soon.
Merrit
1 / 5 (2) Jun 06, 2018
I am glad someone is on the right track. When you find an anomaly the first thing you should check is your tools. But, this is only the tip of the iceberg. Really all aspects of cosmology need to be scrutinized. There are countless assumptions being made that might not be completely valid at all levels. This is where, in my opinion, AI algorithms could really come in use. We are just about at the level of machine learning ability or past it to do an exhaustive analysis of all our cosomolgical data. Not that we imperfect humans necessarily made a mistake somewhere, but it would at least be able to narrow down on where the dark matter and energy issues are coming from and rule out possible DM theories,
fourinfinities
1 / 5 (2) Jun 06, 2018
The standard model is incomplete. Star and galaxy formation constitute "action," and there is no "reaction" term(s) in the model. My guess is that lambda, or 'dark energy," represents the missing reaction terms.
TheGhostofOtto1923
5 / 5 (1) Jun 06, 2018
I prefer my hypothesis of Chaotic Gravity

How it does explain the red shift and its discrepancies?
Well of course it only explains the psychopathic compulsion to deceive. Willis wants to see what he can get away with. A sick game we should all be familiar with by now.
If in no way, why to bother with it right here?
You really think he knows anything about cosmology? Shame on you.
Anonym642864
1 / 5 (4) Jun 07, 2018
To my mind if we consider world as universe then death of people can be compared as death of stars. As several lands are created on earth so as milky way in the universe. Therefore to my mind exploration is needed in human 's constitution of the body will reflect the formation of universe.
Ojorf
4 / 5 (4) Jun 07, 2018
Glad I don't have your mind.
Anonym262722
1 / 5 (3) Jun 08, 2018
A solution to the discussed problems is to open, read and understand any book, paper, presentation and recent comments/blogs about Suntola Dynamic Universe (DU) rethinking of the physics foundations. DU corrected the very basic starting mistake of GRT and subsequent building blocks in cosmology and quantum theories of particle physics with his team. Key e-books and papers since1995 are collected for public view at the web site of Physics Foundations Society. The applications of array (unified matrix and tensor) calculus in photogrammetry (such as metric astronomical imaging for 4/5-D reconstruction and triangulation of cosmic object locations) and geodesy resolved the general inverse or estimation problem of 'biased' epicycle parameters such DE/DM and possible continued mistakes in interpretation of Gravitational Waves.
HenryE
1 / 5 (1) Jun 10, 2018
How do we know that the expansion is accelerating and not actually decelerating?

If the expansion of the universe was now decelerating, the most distant parts of the universe would LOOK as though they were accelerating away from us. But it would be from a decrease in our velocity, not because of an accelerating expansion.

The talk of the expansion accelerating, is primarily based on measurements of the most distant objects we can see. From our perspective, they are accelerating away from us. But only from our perspective.

Frankly, the expansion could even have stopped by now and yet the outer fringes of the universe would continue to appear as if they were still accelerating away from us for billions of years.

We need much more data before we can really determine how the expansion is proceeding. It is even possible that the inconsistencies in our current data exist because all thoughts are on acceleration instead of deceleration.
Anonym262722
1 / 5 (1) Jun 11, 2018
The energy balance law of DU connects the expansion speed C4 of 3-D space (surface of 4-sphere) to Riemann 4-radius R4 such that decreasing present C4=300,000 km/s by a factor 1/k increases R4 (presently at distance of 13.8 B l.y.) by the factor of k^2 at the absolute Newtonian time T4= k^3 9.2 B years since the latest bounce. The ticking rate of atomic clocks (with all other atomic processes like decay rate) slows down such that the locally observable speed of light IN the space is observed constant but the true variable value of C is close to C4, typically within 1 ppm near mass centers, see Suntola's DU book and papers about the practical proofs and several GRT/QM based mistakes.
Anonym262722
not rated yet 8 hours ago
The universe at the largest observable 'old' global (vs local and present 'young') scales in terms of constant C postulate of GRT behaves differently from the universe at intermediate or more recent local scales. Albert Einstein's theory of gravity is valid only in the local scales of nested energy frames in 'Suntola Dynamic Universe'. This caused the mistake of 2011 awarded Nobel to interpret 1998 observed intergalactic SN1a data in terms of Dark Energy/Matter densities, in addition by 5-10 other GRT/QM mistakes, including the starting point of quantum theory in Planck energy equation with a constant that actually includes the variable speed of C. This caused the 'DE confirmation' due to ,e.g., Planck dilution of EM wave from its emitting to receival times in the decelerated expansion speed C4 of Riemann 4-radius R4 with scalar absolute or global Newtonian time T4. Same mistakes appear to be shared by the 2017 awarded interpretations of recent GW mass wave detection events.
