(PhysOrg.com) -- "A perplexing philosophical issue in science is the question of anticipation, or prediction, versus causality," Shawn Pethel tells PhysOrg.com. "Can you tell the difference between something predicting an event and something actually causing an event?"
Pethel is a scientist working at the Redstone Arsenal in Alabama. Along with Daniel Hahs, he set out to identify a method of distinguishing anticipation from causality using tools from information theory. "Any process that has to react in real time can improve its performance through anticipation," Pethel says, "and in studying such processes it is important to find new ways to quantify causality."
The question of anticipation versus causality is one that has real-world application in a number of areas. Pethel points out that this issue has implications in covert operations, as well as in financial areas, especially with regard to the development of bubbles. Is there a way, from passive measurements, to tell what's driving what? Pethel and Hahs try to answer this question in Physical Review Letters: "Distinguishing Anticipation from Causality: Anticipatory Bias in the Estimation of Information Flow."
"If a system is generating information, it's like a fingerprint that can be used to figure out where the information is coming from and where it is going," Pethel explains. A quantity called transfer entropy has been used since 2000 to measure information flow in experimental data from neuroscience, finance, and even music.
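The article does not spell out how transfer entropy is computed, and the paper's own estimator is not reproduced here. As an illustrative sketch (the binning scheme and plug-in probability estimates are my own choices, not the authors'), transfer entropy T(X → Y) measures how much x[t] tells you about y[t+1] beyond what y[t] already tells you:

```python
# Illustrative sketch, not the authors' code: a plug-in estimate of
# transfer entropy T(X -> Y) from two time series, after coarse-graining
# each series into a small number of equal-width bins.
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=2):
    """T(X -> Y) in bits: information x[t] adds about y[t+1] beyond y[t]."""
    def sym(s):
        # Coarse-grain a series into integer symbols 0..bins-1.
        edges = np.linspace(s.min(), s.max(), bins + 1)[1:-1]
        return np.digitize(s, edges)
    a, b, c = sym(y)[1:], sym(y)[:-1], sym(x)[:-1]   # y_{t+1}, y_t, x_t
    n = len(a)
    p_abc = Counter(zip(a, b, c))
    p_ab = Counter(zip(a, b))
    p_bc = Counter(zip(b, c))
    p_b = Counter(b)
    # T = sum over (a,b,c) of p(a,b,c) * log2[ p(a|b,c) / p(a|b) ]
    return sum(k / n * np.log2(k * p_b[bb] / (p_ab[aa, bb] * p_bc[bb, cc]))
               for (aa, bb, cc), k in p_abc.items())
```

On a toy pair where x drives y outright (say y simply copies x one step later), T(X → Y) comes out near one bit per step while T(Y → X) stays near zero, which is the "fingerprint" reading the quote describes.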
"What we were interested in," he continues, "is the anticipation issue. We wanted to look at that using transfer entropy, and see if we could discover what's predicting and what's causing."
In order to set up the situation, Pethel and Hahs used chaotic systems, which produce information. They were careful that their model was coupled in only one direction. "We set it up so that we know the causality: x is causing y, but y cannot influence x. However, we designed the y system to be able to predict x to some degree. We then collected data and used transfer entropy to tell us which was the causal system."
When analyzing the results, Pethel and Hahs found something rather interesting: even though the response system, y, wasn't the cause, it looked very much like it was under many different test conditions. "There is an anticipatory bias. It's a very strong effect; the more anticipation there is, the stronger it will be. There is a huge bias going on in some cases, and it is giving the exact wrong answer. The system that was only predictive was indistinguishable from the cause."
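The paper's actual systems and parameters are not given in the article, but the bias is easy to reproduce in miniature. In this hypothetical illustration (the logistic map, the perfect one-step predictor, and the binned estimator are all my own stand-ins for the authors' setup), x drives y and y cannot influence x, yet the estimated transfer entropy is larger from y to x:

```python
# Hypothetical illustration, not the authors' model: x is a chaotic
# logistic map and y = f(x) is a perfect one-step predictor of x.
# The coupling is strictly one-way (x -> y), yet transfer entropy
# estimated from the data points the wrong way: the anticipatory bias.
import numpy as np
from collections import Counter

def te(src, dst, bins=4):
    """Plug-in transfer entropy T(src -> dst) in bits, equal-width bins."""
    def sym(s):
        return np.digitize(s, np.linspace(s.min(), s.max(), bins + 1)[1:-1])
    a, b, c = sym(dst)[1:], sym(dst)[:-1], sym(src)[:-1]
    n = len(a)
    p_abc, p_ab = Counter(zip(a, b, c)), Counter(zip(a, b))
    p_bc, p_b = Counter(zip(b, c)), Counter(b)
    return sum(k / n * np.log2(k * p_b[bb] / (p_ab[aa, bb] * p_bc[bb, cc]))
               for (aa, bb, cc), k in p_abc.items())

f = lambda u: 3.99 * u * (1.0 - u)      # chaotic logistic map
x = np.empty(20000)
x[0] = 0.3
for t in range(len(x) - 1):
    x[t + 1] = f(x[t])                  # the driver evolves on its own
y = f(x)                                # y[t] = x[t+1]: y anticipates x
                                        # but has no influence on it

print(f"T(x->y) = {te(x, y):.3f} bits, T(y->x) = {te(y, x):.3f} bits")
```

Because y's present value coincides with x's future value, the estimator credits y with "information flow" into x even though the causal arrow runs the other way, which is the exact wrong answer the quote describes.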
The good news is that Pethel and Hahs also noticed signs that indicate the presence of a bias. "Even if you can't pinpoint what the causal source is, you can see some behaviors that provide clues that you need to be on the alert for anticipatory dynamics."
Pethel wants to take this further, though. "We've studied the one-way system, and now that we know there is a bias, we can account for that and study systems that have mutual, back-and-forth coupling."
Hopefully, the work will provide helpful clues for understanding how causality can be identified using information theory. "There's a level of broad interest," Pethel says, "since you can deduce causal mechanisms from data. Particularly interesting are financial markets and biological systems, where anticipation plays a large role in allowing systems to adapt to a changing environment."
More information: Daniel W. Hahs and Shawn D. Pethel, "Distinguishing Anticipation from Causality: Anticipatory Bias in the Estimation of Information Flow," Physical Review Letters 107, 128701 (2011). Available online: link.aps.org/doi/10.1103/PhysRevLett.107.128701