Numbers limit how accurately digital computers model chaos

"The Great Floating Point Wave" in homage to Hokusai's "The Great Wave Off Kanagawa" Credit: P V Coveney, H S C Martin & Charu G

The study, published today in Advanced Theory and Simulations, shows that digital computers cannot reliably reproduce the behaviour of 'chaotic systems' which are widespread. This fundamental limitation could have implications for high performance computation (HPC) and for applications of machine learning to HPC.

Professor Peter Coveney, Director of the UCL Centre for Computational Science and study co-author, said: "Our work shows that the behaviour of the chaotic dynamical systems is richer than any digital computer can capture. Chaos is more commonplace than many people may realise and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact. Ultimately, computers can't simulate everything."

The team investigated the impact of using floating-point arithmetic—a method standardised by the IEEE and used since the 1950s to approximate real numbers on digital computers.

Digital computers can represent only rational numbers, ones that can be expressed as fractions. Moreover, in floating-point arithmetic the denominator of these fractions must be a power of two, such as 2, 4, 8 or 16. There are infinitely many more real numbers that cannot be expressed this way.
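This power-of-two property can be checked directly in Python (an illustration, not part of the study): `float.as_integer_ratio` exposes the exact fraction a machine stores for any floating-point value, showing that even a familiar decimal like 0.1 is actually held as a dyadic fraction.

```python
# Every binary floating-point number is exactly a fraction whose
# denominator is a power of two; as_integer_ratio() reveals it.
num, den = (0.1).as_integer_ratio()
print(num, den)
print(den == 2**55)            # True: the denominator is 2 to the 55th power
print((den & (den - 1)) == 0)  # True: power-of-two check via a bit trick
```

The printed fraction is the value the machine actually uses in place of 0.1, which is why the stored number only approximates the decimal one.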

In the present work, the scientists used all four billion of these single-precision floating-point numbers, which range from minus to plus infinity. The fact that the numbers are not distributed uniformly may also contribute to some of the inaccuracies.

First author, Professor Bruce Boghosian (Tufts University), said: "The four billion single-precision floating-point numbers that digital computers use are spread unevenly, so there are as many such numbers between 0.125 and 0.25, as there are between 0.25 and 0.5, as there are between 0.5 and 1.0. It is amazing that they are able to simulate real-world chaotic events as well as they do. But even so, we are now aware that this simplification does not accurately represent the complexity of chaotic dynamical systems, and this is a problem for such simulations on all current and future digital computers."
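Professor Boghosian's equal-count claim can be verified with a short sketch (illustrative, not from the paper). Consecutive representable single-precision values have consecutive bit patterns, so reinterpreting each float32 as a 32-bit integer and subtracting patterns counts exactly how many representable values each interval contains.

```python
import struct

def bit_pattern(x):
    # reinterpret a positive float32 as its unsigned 32-bit integer pattern
    return struct.unpack('<I', struct.pack('<f', x))[0]

# adjacent float32 values have adjacent bit patterns, so the number of
# representable values in [a, b) is simply bit_pattern(b) - bit_pattern(a)
for a, b in [(0.125, 0.25), (0.25, 0.5), (0.5, 1.0)]:
    print(f"[{a}, {b}): {bit_pattern(b) - bit_pattern(a)} values")
```

Each interval contains the same 2^23 = 8,388,608 values, even though the intervals themselves double in width: the floats cluster densely near zero and thin out toward infinity.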

The study builds on the work of Edward Lorenz of MIT whose weather simulations using a simple computer model in the 1960s showed that tiny rounding errors in the numbers fed into his computer led to quite different forecasts, which is now known as the 'butterfly effect'.
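Lorenz's observation is easy to reproduce with any chaotic system. The sketch below substitutes the logistic map for his weather model (an illustrative stand-in, not his original equations): two orbits that start one part in a trillion apart separate to order one within a few dozen steps.

```python
# Sensitivity to initial conditions: two logistic-map orbits starting
# 1e-12 apart diverge to macroscopic separation within ~80 iterations.
r = 3.9                    # map parameter in the chaotic regime
x, y = 0.4, 0.4 + 1e-12    # nearly identical starting points
for n in range(80):
    x = r * x * (1.0 - x)
    y = r * y * (1.0 - y)
    if abs(x - y) > 0.1:
        print(f"orbits separated by {abs(x - y):.3f} after {n + 1} steps")
        break
```

The tiny initial difference plays the role of Lorenz's rounding error: it is amplified exponentially until the two "forecasts" bear no resemblance to each other.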

The team compared the known mathematical reality of a simple one-parameter chaotic system called the 'generalised Bernoulli map' to what digital computers would predict if every one of the available single-precision floating-point numbers were used.

They found that, for some values of the parameter, the computer predictions are totally wrong, whilst for other choices the calculations may appear correct, but deviate by up to 15%.
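The flavour of this pathology can be seen with the simplest member of the family, the doubling map (the Bernoulli map with parameter 2) — a sketch of the idea, not the authors' full analysis. In exact arithmetic almost every orbit is chaotic, but in binary floating point multiplying by two merely shifts the significand's bits, so every computed orbit collapses to zero.

```python
import random

# Doubling map x -> 2x mod 1: chaotic in exact arithmetic, but a double
# has only 53 significand bits, and multiplying by 2 shifts them left,
# so every floating-point orbit hits exactly 0 within about 53 steps.
x = random.random()
for n in range(100):
    x = (2.0 * x) % 1.0
print(x)   # 0.0 for every starting point
```

A simulation that should wander chaotically forever instead dies after a few dozen steps — a qualitative failure, not just a small numerical error.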

The authors say these pathological results would persist even if double-precision floating-point numbers were used, of which there are vastly more to draw on.

"We use the generalised Bernoulli map as a mathematical representation for many other systems that change chaotically over time, such as those seen across physics, biology and chemistry," explained Professor Coveney. "These are being used to predict important scenarios in , in and in nuclear reactors, for example, so it's imperative that computer-based simulations are now carefully scrutinised."

The team say that their discovery has implications for the field of artificial intelligence, when machine learning is applied to data derived from simulations of chaotic dynamical systems, and for those trying to model all kinds of natural processes.

More research is needed to examine the extent to which the use of floating-point arithmetic is causing problems in everyday computing and modelling and, if errors are found, how to correct them.

Professor Bruce Boghosian and Dr. Hongyan Wang are at Tufts University, Medford, Massachusetts, United States (Dr. Wang now works at Facebook in Seattle). Professor Peter Coveney of UCL is speaking at an event tomorrow in the Science Museum on the future of quantum computing.


More information: 'A new pathology in the simulation of chaotic dynamical systems on digital computers' Advanced Theory and Simulations, DOI: 10.1002/adts.201900125
Citation: Numbers limit how accurately digital computers model chaos (2019, September 23) retrieved 16 October 2019


User comments

Sep 23, 2019
What they are saying without saying it is, the climate models are bupkis just as Freeman Dyson always said they were.

Sep 23, 2019
What they are saying without saying it is, the climate models are bupkis just as Freeman Dyson always said they were.

I'm sure you have heard of the GIGO (Garbage In, Garbage Out) principle in computer science. Well, what they are confirming is the LIGO principle that governs climate models; Lies In, Gospel Out.
Hmm... who are masters of the old LIGO. Ah yes, Cults and Religion

Sep 24, 2019
Climate models are not entirely dependent on floating-point calculations for the projections that really count, and what the findings show would be equally applicable to weather forecasting and hurricane alerts, and a myriad of models that are predicated on chaos-driven causality.
This doesn't mean everyone stops telling you next week's weather, or sanctions brain-damaged leaders taking a sharpie to a storm map.

Sep 24, 2019
What they are saying without saying it is, the climate models are bupkis just as Freeman Dyson always said they were.

What you are saying is you didn't understand the article or the models.
Thanks for that.

Sep 24, 2019
@danR I don't think they know the difference between an int() and a float().

As for this article, yes, we knew that. Everyone always has. Even high precision float()s necessarily have gaps between modifications of the last digit in the value. Once you get into these kinds of numbers you can't avoid that. It's why float()s aren't used for mathematics; they're only estimates.
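The gaps this commenter describes are easy to measure. A minimal sketch using `math.nextafter` (available since Python 3.9): the spacing between adjacent doubles grows with magnitude, until above 2^53 not even every integer can be represented.

```python
import math

# math.nextafter(x, y) returns the adjacent representable double in the
# direction of y, so the difference from x is the local gap size.
print(math.nextafter(1.0, 2.0) - 1.0)      # 2**-52, about 2.22e-16
print(math.nextafter(1e16, 2e16) - 1e16)   # 2.0: gaps exceed 1 above 2**53
print(1e16 + 1 == 1e16)                    # True: adding 1 is lost in the gap
```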

Sep 24, 2019
Ultimately, even quantum computers won't be able to deal with chaos. They're still digital. Only an analog computer can deal with it, and eventually we'll have questions that can't be answered that way. We may even find that in the end analog computers won't be precise enough.

Sep 24, 2019
Chaos is infinite, so you can never invent a finite number type that has sufficient precision to deal with it.

That is basically what they are saying, and yeah, who'd have thought?

Same if you convert floats to integers. None of them provide infinite precision. Integers just have same precision everywhere, while floats vary.

Everyone using floats for scientific calculations is aware of their limitations, and of the error accumulated in each step of a calculation. Or should be.
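The accumulated error this commenter mentions shows up even in trivial sums — a minimal illustration: 0.1 has no exact binary representation, so each addition rounds, and the errors pile up.

```python
# 0.1 is not exactly representable in binary, so ten additions
# accumulate rounding error and the sum misses 1.0.
total = 0.0
for _ in range(10):
    total += 0.1
print(total == 1.0)   # False
print(total)          # 0.9999999999999999
```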

Sep 24, 2019
Well, I wouldn't say chaos is infinite, but I would say it's infinitely complex. That's its nature.

And there are ways to finesse the matter in many cases. But essentially, yes, that's the problem.

Sep 24, 2019
It's an infinite feedback loop, self-modulated by other infinite feedback loops. It's happening all around us, and we have yet to witness it devolve into runaway chaos. Just because we haven't figured it out and so our models do run away, that should not translate into it being infinitely complex.

Sep 25, 2019
as digital telephones began to replace analog telephones
when talking to people on the different systems

i noticed that digital phones had less background noise
a lot less static interference
a more precise voice than analog

it was also easier to misunderstand the conversation

the hypothesis i (& others, i'm sure) came up with then
was that our brains are analog devices
& what we hear is processed to filter out extraneous noise but we still expect the missing noise

the monkey that couldn't separate macaws screeching from leopard growls?
did not become our ancestor!

there are a number of people who insist that they much prefer the sound of vinyl recordings over digital

now, there is the speculation that our brain's "personality? cerebral functions? soul?"
are quantum results
however our brains are still analog devices

so the research in this article
is one more step toward encompassing all these guesses

Sep 25, 2019
dj, insisting that Freeman Dyson said this or that?
you are taking those statements out of historical context to make a current ideological rant

to bolster the weak denier whinge on behalf of alt-right fairytale political correctness
you are deliberately, with malice aforethought
misinterpreting Dyson's criticisms
for use as propaganda

Dyson was correct, at that time
the climate models were crude
lacking enough datapoints to make accurate predictions

you are deliberately
& with full intent to defraud
ignoring all the data collected since
& all the improvements to new technology for collecting data
as well as a steady improvement in methodology to process the increasing flood of data

you may as well be claiming that Galvani or Franklin, Archimedes or Imhotep were critical of the modern ideas about climate change

but then you would actually have to read them
instead of parroting fascist slogans

Sep 25, 2019
the hypothesis i (& others, i'm sure) came up with then
was that our brains are analog devices
& what we hear is processed to filter out extraneous noise but we still expect the missing noise

Wrong again willis, your hypothesis is just yours and it couldn't be wronger. But, we have certainly come to expect your posts to be all noise and you don't disappoint.

Oct 06, 2019
So the conclusion of this article is that weather models are not perfect, computers have memory and computing-bandwidth limitations that make it impossible to perfectly simulate a chaotic phenomenon, and computer floating-point arithmetic has flaws.

Doesn't this paper just repeat what everybody has known to be true since the invention of modern computers?
