(Phys.org)—In the world of probability and statistics, almost anything can be assigned a likelihood of occurring; statistics based on real-world data give rise to probabilistic estimates that may be very accurate, not so accurate, or impossible to verify one way or another. With that in mind, two statisticians, Aaron Clauset and Ryan Woodard, have trained their sights on terrorist incidents and the likelihood of them occurring, specifically the big kind, like 9/11. As they describe in a paper uploaded to the preprint server *arXiv*, they have found, using tried-and-true statistical models, that another attack as big as 9/11, or even bigger, is about as likely as not.

The basis of any statistical model is data, and in this case the data was the record of terrorist acts committed between 1969 and 2007, which of course includes 9/11. The attack on the twin towers in New York stands out, as the number of people killed that day was nearly six times greater than in any other terrorist attack. To gauge how accurate any given model might be, the researchers first calculated the likelihood of 9/11 actually happening, based on the prior data, using three standard types of model: power-law, exponential, and log-normal distributions. After crunching the numbers, they put that historical probability at between eleven and thirty-five percent, which they say is reasonable and shows that what happened on 9/11 was not, statistically speaking, unlikely.
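The flavor of this kind of tail estimate can be sketched in a few lines of plain Python. Everything below is illustrative: the fatality counts are made up (not the database the researchers used), the event count is an assumed round number, and only the exponential and log-normal fits are shown, via simple maximum-likelihood estimates.

```python
import math

# Hypothetical fatality counts per severe event -- illustrative only,
# NOT the actual terrorism database used in the paper.
fatalities = [12, 7, 45, 3, 90, 22, 5, 130, 18, 60, 9, 250, 14, 33, 75]

x0 = 2996        # approximate 9/11 death toll: the "catastrophic" threshold
n_events = 13000 # assumed number of severe events over the historical period

# Exponential model: the MLE rate is 1/mean, so P(X >= x) = exp(-x/mean).
mean = sum(fatalities) / len(fatalities)
p_exp = math.exp(-x0 / mean)

# Log-normal model: fit mu, sigma to log-fatalities, then take the
# complementary normal CDF of the standardized log-threshold (via erfc).
logs = [math.log(v) for v in fatalities]
mu = sum(logs) / len(logs)
sigma = math.sqrt(sum((l - mu) ** 2 for l in logs) / len(logs))
p_lognorm = 0.5 * math.erfc((math.log(x0) - mu) / (sigma * math.sqrt(2)))

# Probability that at least one of n_events draws reaches the threshold.
p_hist_exp = 1.0 - (1.0 - p_exp) ** n_events
p_hist_lognorm = 1.0 - (1.0 - p_lognorm) ** n_events
```

With toy data like this, the heavier-tailed log-normal fit assigns a far larger probability to a 9/11-sized event than the exponential fit does, which is exactly why the choice of tail model drives the spread in the researchers' 11–35% range.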

The two then applied the same models looking forward and came up with a likelihood of another 9/11-type attack of between twenty and fifty percent, depending on which model was used and assuming that things remain the same, i.e. that the average number of attacks per year (approximately 2,000) stays the same. But, recognizing that the odds of things holding steady aren't themselves very realistic, they also tried factoring in destabilizing scenarios such as rising food prices, or conditions calming or growing worse in two of the current hot spots for terrorism, Iraq and Afghanistan. In such cases the models became truly alarming, indicating that in the worst-case scenarios the likelihood of another event as deadly as 9/11 rises to nearly ninety-five percent.
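The sensitivity to the attack rate follows from simple probability arithmetic: if each event independently has a small chance q of reaching a 9/11-sized toll, then the chance of at least one such event among n future events is 1 − (1 − q)^n. A minimal sketch, with an assumed (not fitted) value of q and assumed scenario rates:

```python
q = 5e-5   # assumed per-event probability of a 9/11-sized toll (illustrative)
decade = 10

def decade_risk(attacks_per_year):
    """P(at least one catastrophic event) over a decade at a given rate."""
    n = attacks_per_year * decade
    return 1.0 - (1.0 - q) ** n

risk_status_quo = decade_risk(2000)  # roughly the steady-state rate cited
risk_calmer     = decade_risk(1000)  # hot spots cool down (assumed rate)
risk_worse      = decade_risk(3000)  # destabilized world (assumed rate)
```

Because the risk compounds over thousands of events, even modest changes in the yearly rate move the decade-long probability substantially, which is why the destabilizing scenarios push the models toward such alarming figures.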

**Explore further:**
Terrorism and the Olympics by-the-numbers: Analysis from UMD-based START

**More information:**
Estimating the historical and future probabilities of large terrorist events, arXiv:1209.0089v1 [physics.data-an] arxiv.org/abs/1209.0089

**Abstract**

Quantities with right-skewed distributions are ubiquitous in complex social systems, including political conflict, economics and social networks, and these systems sometimes produce extremely large events. For instance, the 9/11 terrorist events produced nearly 3000 fatalities, nearly six times more than the next largest event. But, was this enormous loss of life statistically unlikely given modern terrorism's historical record? Accurately estimating the probability of such an event is complicated by the large fluctuations in the empirical distribution's upper tail. We present a generic statistical algorithm for making such estimates, which combines semi-parametric models of tail behavior and a non-parametric bootstrap. Applied to a global database of terrorist events, we estimate the worldwide historical probability of observing at least one 9/11-sized or larger event since 1968 to be 11-35%. These results are robust to conditioning on global variations in economic development, domestic versus international events, the type of weapon used and a truncated history that stops at 1998. We then use this procedure to make a data-driven statistical forecast of at least one similar event over the next decade.
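The abstract's recipe of a fitted tail model plus a non-parametric bootstrap can be sketched as follows. This is a toy version under stated assumptions: invented fatality counts, a log-normal tail fit standing in for the paper's family of semi-parametric models, and a percentile interval from resampled refits.

```python
import math
import random

random.seed(1)

# Illustrative severity data -- not the paper's event database.
fatalities = [12, 7, 45, 3, 90, 22, 5, 130, 18, 60, 9, 250, 14, 33, 75]
x0 = 2996  # approximate 9/11 death toll

def lognormal_tail(sample, x):
    """MLE log-normal fit to the sample; return P(X >= x)."""
    logs = [math.log(v) for v in sample]
    mu = sum(logs) / len(logs)
    sigma = sum((l - mu) ** 2 for l in logs) / len(logs)
    sigma = max(math.sqrt(sigma), 1e-12)  # guard a degenerate resample
    return 0.5 * math.erfc((math.log(x) - mu) / (sigma * math.sqrt(2)))

# Non-parametric bootstrap: resample with replacement, refit the tail
# model each time, and collect the re-estimated tail probabilities.
boots = []
for _ in range(2000):
    resample = [random.choice(fatalities) for _ in fatalities]
    boots.append(lognormal_tail(resample, x0))

# A 90% percentile interval for the per-event tail probability.
boots.sort()
lo = boots[int(0.05 * len(boots))]
hi = boots[int(0.95 * len(boots))]
```

The width of the interval (lo, hi) is the point of the exercise: with so few extreme observations, the upper tail is poorly pinned down, and the bootstrap makes that uncertainty explicit rather than reporting a single point estimate.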

## antialias_physorg

It also really depends on what type of events you add to your simulation what type of distribution you get.

No statistical model is 'true'. There are statistical models that fit better than others. But if any statistical model were really 'true' then it would completely and accurately predict future outcomes - and then it wouldn't be a STATISTICAL model anymore but a deterministic one.

## Doug_Huffman

A Bayesian analysis will be more effective than mere frequentist dreaming. The conspiracy of ignorance masquerades as common sense.

## Tausch

What are the chances?

## julianpenrod

Statistics had meaning, much less usefulness, when describing a static situation! When conditions are not changing in specific ways to reflect each other, statistical likelihood is meaningless. Brownian motion in water below boiling is different from that in an environment of water vapor.

In a way, Clauset and Woodard gave a clue to the New World Order. They looked only at "terrorist" events between 1969 and 2007. 1969 was the beginning of the Nixon Era, when the gloves of the NWO started to come off; 2007 preceded Obama, also funded by the NWO, who was intended to suggest that bloody fist methods, ostensibly, are not necessary.