# Numbers limit how accurately digital computers model chaos

The study, published today in *Advanced Theory and Simulations*, shows that digital computers cannot reliably reproduce the behaviour of 'chaotic systems', which are widespread. This fundamental limitation could have implications for high-performance computing (HPC) and for applications of machine learning to HPC.

Professor Peter Coveney, Director of the UCL Centre for Computational Science and study co-author, said: "Our work shows that the behaviour of the chaotic dynamical systems is richer than any digital computer can capture. Chaos is more commonplace than many people may realise and even for very simple chaotic systems, numbers used by digital computers can lead to errors that are not obvious but can have a big impact. Ultimately, computers can't simulate everything."

The team investigated the impact of using floating-point arithmetic, the method used since the 1950s, and standardised by the IEEE in 1985, to approximate real numbers on digital computers.

Digital computers can represent only rational numbers, ones that can be expressed as fractions. Moreover, the denominator of these fractions must be a power of two, such as 2, 4, 8 or 16. There are infinitely many more real numbers that cannot be expressed this way.
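This power-of-two constraint is easy to see directly. The sketch below uses Python purely as an illustration (the paper is not tied to any particular language): `float.as_integer_ratio()` returns the exact fraction a float actually stores, and its denominator is always a power of two, so a value like 0.1 is silently replaced by the nearest representable fraction.

```python
# Every finite float is stored exactly as numerator / 2**k.
# A power of two has a single set bit, so den & (den - 1) == 0 checks this.
for x in [0.1, 0.125, 1/3]:
    num, den = x.as_integer_ratio()
    assert den & (den - 1) == 0          # denominator is a power of two
    print(f"{x!r} is stored exactly as {num}/{den}")
```

For 0.125 the fraction is exactly 1/8, but for 0.1 and 1/3 the stored fraction only approximates the intended value.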

In the present work, the scientists used all four billion of these single-precision floating-point numbers that range from plus to minus infinity. The fact that the numbers are not distributed uniformly may also contribute to some of the inaccuracies.

First author, Professor Bruce Boghosian (Tufts University), said: "The four billion single-precision floating-point numbers that digital computers use are spread unevenly, so there are as many such numbers between 0.125 and 0.25, as there are between 0.25 and 0.5, as there are between 0.5 and 1.0. It is amazing that they are able to simulate real-world chaotic events as well as they do. But even so, we are now aware that this simplification does not accurately represent the complexity of chaotic dynamical systems, and this is a problem for such simulations on all current and future digital computers."
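Boghosian's equal-count claim can be checked with a short sketch (again in Python, as an illustration rather than anything from the paper): for positive IEEE-754 single-precision values, consecutive representable numbers have consecutive bit patterns, so counting the floats in an interval reduces to subtracting two bit patterns.

```python
import struct

def float32_bits(x: float) -> int:
    """Bit pattern of x rounded to IEEE-754 single precision."""
    return struct.unpack("<I", struct.pack("<f", x))[0]

# Each of these intervals, though of very different width, contains the
# same number of single-precision floats: one full binade, 2**23 values.
for a, b in [(0.125, 0.25), (0.25, 0.5), (0.5, 1.0)]:
    print(f"[{a}, {b}): {float32_bits(b) - float32_bits(a)} floats")
# each interval holds 8388608 floats
```

The spacing between adjacent floats therefore doubles with each binade, which is the uneven distribution the quote describes.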

The study builds on the work of Edward Lorenz of MIT, whose weather simulations using a simple computer model in the 1960s showed that tiny rounding errors in the numbers fed into his computer led to quite different forecasts, a sensitivity now known as the 'butterfly effect'.
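Lorenz's original model is a system of differential equations, but the same sensitivity shows up in the logistic map, a standard one-line stand-in for chaos. The sketch below (an illustration, not the paper's experiment) starts two trajectories a hair's breadth apart; the gap roughly doubles each step until the two "forecasts" bear no resemblance to each other.

```python
# Chaotic logistic map x -> 4x(1-x): an initial difference of 1e-12
# is amplified exponentially until it is as large as the signal itself.
x, y = 0.2, 0.2 + 1e-12
max_gap = 0.0
for step in range(60):
    x, y = 4*x*(1-x), 4*y*(1-y)
    max_gap = max(max_gap, abs(x - y))
print(max_gap)   # of order 1: the 1e-12 perturbation has swamped the signal
```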

The team compared the known mathematical reality of a simple one-parameter chaotic system called the 'generalised Bernoulli map' to what digital computers would predict if every one of the available single-precision floating-point numbers were used.

They found that, for some values of the parameter, the computer predictions are totally wrong, whilst for other choices the calculations may appear correct, but deviate by up to 15%.

The authors say these pathological results would persist even if double-precision floating-point numbers were used, of which there are vastly more to draw on.
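The starkest member of the Bernoulli family illustrates why extra precision does not rescue the simulation. For the doubling case, x → 2x mod 1, the true map is chaotic for almost every real starting point, yet in floating point each doubling shifts the mantissa one place left, so every orbit runs out of bits. The sketch below (an illustration of this one special case, not the paper's full analysis of general parameter values) uses double precision and still collapses to zero within about 53 steps, the width of a double's mantissa.

```python
import random

# Doubling map x -> (2x) mod 1 in double precision: each step discards
# one bit of the 53-bit mantissa, so the orbit hits exactly 0.0 and stays.
random.seed(0)
x = random.random()
for step in range(60):
    x = (2.0 * x) % 1.0
print(x)   # 0.0 exactly; the true dynamics never settle like this
```

Other parameter values do not collapse so totally, which is where the subtler deviations of up to 15% reported in the study arise.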

"We use the generalised Bernoulli map as a mathematical representation for many other systems that change chaotically over time, such as those seen across physics, biology and chemistry," explained Professor Coveney. "These are being used to predict important scenarios in climate change, in chemical reactions and in nuclear reactors, for example, so it's imperative that computer-based simulations are now carefully scrutinised."

The team say that their discovery has implications for the field of artificial intelligence, when machine learning is applied to data derived from computer simulations of chaotic dynamical systems, and for those trying to model all kinds of natural processes.

More research is needed to examine the extent to which the use of floating-point arithmetic is causing problems in everyday computational science and modelling and, if errors are found, how to correct them.

Professor Bruce Boghosian and Dr. Hongyan Wang are at Tufts University, Medford, Massachusetts, United States (Dr. Wang now works at Facebook in Seattle). Professor Peter Coveney of UCL is speaking at an event tomorrow in the Science Museum on the future of quantum computing.


**More information:** 'A new pathology in the simulation of chaotic dynamical systems on digital computers', *Advanced Theory and Simulations*, DOI: 10.1002/adts.201900125

**Citation**: Numbers limit how accurately digital computers model chaos (2019, September 23) retrieved 16 October 2019 from https://phys.org/news/2019-09-limit-accurately-digital-chaos.html

## User comments

**Shootist / antigoracle:** I'm sure you have heard of GIGO, the Garbage In, Garbage Out principle in computer science. Well, what they are confirming is the LIGO principle that governs climate models: Lies In, Gospel Out.

Hmm... who are the masters of the old LIGO? Ah yes, cults and religion.

**danR:** This doesn't mean everyone stops forecasting next week's weather, or sanctions brain-damaged leaders taking a Sharpie to a storm map.

**Ojorf:** What you are saying is you didn't understand the article or the models.

Thanks for that.

**Da Schneib:** As for this article, yes, we knew that. Everyone always has. Even high-precision float()s necessarily have gaps between modifications of the last digit in the value. Once you get into these kinds of numbers you can't avoid that. It's why float()s aren't used for mathematics; they're only estimates.

**Da Schneib / simonl:** That is basically what they are saying, and yeah, who'd have thought?

Same if you convert floats to integers. None of them provide infinite precision. Integers just have same precision everywhere, while floats vary.

Everyone using floats for scientific calculations is aware of their limitations, and of the error that accumulates at each step of a calculation. Or should be.

**Da Schneib:** And there are ways to finesse the matter in many cases. But essentially, yes, that's the problem.

**antigoracle / rrwillsj:** When talking to people on the different systems, I noticed that digital phones had a lot less background noise and static interference, and a more precise voice than analog; it was also easier to misunderstand the conversation. The hypothesis I (and others, I'm sure) came up with then was that our brains are analog devices, and what we hear is processed to filter out extraneous noise, but we still expect the missing noise. The monkey that couldn't separate macaws screeching from leopard growls did not become our ancestor!

There are a number of people who insist that they much prefer the sound of vinyl recordings over digital. Now, there is the speculation that our brain's "personality? cerebral functions? soul?" are quantum results; however, our brains are still analog devices. So the research in this article is one more step toward encompassing all these guesses.

**rrwillsj:** You are taking those statements out of historical context to make a current ideological rant, to bolster the weak denier whinge on behalf of alt-right fairytale political correctness. You are deliberately, with malice aforethought, misinterpreting Dyson's criticisms for use as propaganda.

Dyson was correct, at that time: the climate models were crude, lacking enough datapoints to make accurate predictions. You are deliberately, and with full intent to defraud, ignoring all the data collected since, all the improvements to new technology for collecting data, as well as a steady improvement in methodology to process the increasing flood of data.

You may as well be claiming that Galvani or Franklin, Archimedes or Imhotep were critical of the modern ideas about climate change, but then you would actually have to read them instead of parroting fascist slogans.

**antigoracle:** Wrong again willis, your hypothesis is just yours and it couldn't be wronger. But, we have certainly come to expect your posts to be all noise and you don't disappoint.

**SkyLy:** Doesn't this paper just repeat what everybody has known to be true since the invention of modern computers?
