© The Financial Times Ltd 2016. FT and 'Financial Times' are trademarks of The Financial Times Ltd.
April 26, 2013 6:21 pm
What is the risk of a war on the Korean peninsula or South China Sea? Or, for that matter, of another terrorist attack on American soil? These are questions that western diplomats and security experts are asking themselves this spring. And as speculation grows, those officials have been duly scouring satellite feeds, intelligence reports and history books.
Over in Colorado, Aaron Clauset, a computational scientist, is pondering the dangers from a different perspective. Clauset, who teaches at the University of Colorado Boulder and is affiliated with the Santa Fe Institute, has spent the past decade on the frontier of computing and statistical research. But he has not focused on areas normally beloved by geeks, such as engineering, physics or biology.
Instead, Clauset and other statisticians, such as Ryan Woodard of the Swiss Federal Institute of Technology in Zurich, have analysed the past 200 years of military conflicts. And this has produced a thought-provoking conclusion: if you look at the global pattern of war and terrorism, human violence has moved in surprisingly stable cycles.
Indeed, it is so stable that Clauset sees strong parallels between human conflict and earthquakes – at least in statistical terms. He and other researchers are now borrowing models developed from seismology and physics to forecast future patterns of violence. The aim of this “terror physics” (as some dub it) is not to predict exactly where and when a terrorist attack may occur – doing that is as hard as pinpointing the next quake. Instead, these statisticians are working out the likely rate of attacks and wars – to tell when one seems statistically overdue.
“The frequency and severity of wars has been pretty constant for 200 years despite all the massive changes in geopolitics, technology and population,” Clauset explains. On average the world sees one new international war every two years and a new civil war about every 1.5 years. And while terrorist attacks typically occur in clusters, with a few “mega” attacks accounting for large numbers of deaths, there are clear statistical rhythms there too. So much so that Clauset and Woodard argue that seemingly “rare” events, such as 9/11, are not actually that extraordinary after all. As they write in a 2012 paper: “Patterns observed in the frequency of severe terrorist events suggests that some aspects of this phenomenon, and possibly of other complex social phenomena, are not nearly as contingent or unpredictable as is often assumed.”
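To make the clustering claim concrete, here is a minimal sketch, not the researchers' actual model: if event severities follow a heavy-tailed power-law distribution (the kind of pattern Clauset's work describes), then most events are small but a tiny handful of "mega" events account for a disproportionate share of total deaths. The exponent and minimum severity below are illustrative assumptions, not fitted values from the paper.

```python
import random

random.seed(42)

# Illustrative assumptions (not the paper's fitted parameters):
alpha = 2.5   # power-law tail exponent
x_min = 10.0  # minimum event severity

def pareto_sample():
    # Inverse-CDF sampling from p(x) ~ x^(-alpha) for x >= x_min:
    # X = x_min * U^(-1 / (alpha - 1)), with U uniform on (0, 1)
    u = random.random()
    return x_min * u ** (-1.0 / (alpha - 1.0))

# Simulate many events and measure how much of the total toll
# comes from the largest 1 per cent of them
events = sorted((pareto_sample() for _ in range(10_000)), reverse=True)
top_1pct = sum(events[:100])
total = sum(events)
print(f"Share of deaths from the largest 1% of events: {top_1pct / total:.0%}")
```

Under these assumptions the largest 1 per cent of simulated events typically carries around a fifth of all deaths, which is the statistical sense in which a few "mega" attacks dominate the record.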
I daresay that some people would consider this analysis to be ridiculous or offensive. After all, we tend to think that the 21st century is a time of great flux, when we are reshaping the world. However, “terror physics” can only predict the future if you think that humans are doomed always to behave in consistent ways, without the capacity for change or progress. That is not a popular idea among governments. Some academics might question it too: the psychologist Steven Pinker, for example, argues that human violence is steadily declining in the world today, at least when measured in terms of violence per capita, as opposed to gross military casualties.
In any case, diplomats usually study conflicts in terms of idiosyncratic social and historical factors, not cold data points. Or as Clauset says: “The conflict studies community usually wants to look at the motives of terrorists or their tactics, not the bigger pattern ... it’s like asking a weather forecaster to worry about climate change.”
But while military experts might be ambivalent about the value of terror physics, Clauset and Woodard’s research is now causing a buzz in the statistical world. It is also attracting serious interest from insurance companies and bankers, who are keen to work out the risks of terrorist attacks. Clauset and his fellow number-crunchers are hoping that the wider policy community starts to pay more attention too.
If the number-crunchers can persuade governments to recognise that there is a statistical rhythm to violence, their argument goes, countries might be able to mobilise resources in preparation. And if policy makers acknowledge these cycles, they might also start to reflect on a fundamental question: what exactly drives those outbreaks of war or terrorism? Can we always blame violence on idiosyncratic personalities (be that the North Korean leaders, Osama bin Laden or anyone else)? Or is there something about the human condition – or our interaction with the environment – which dooms us to terrorism and war with such regularity?
These are, of course, big philosophical issues. I don’t expect that any government will rush to discuss them publicly soon – not when politicians are busy fighting a “war on terror”, with the unspoken assumption that it is possible for humans to eradicate the scourge. But if nothing else, Clauset’s numbers put the recent past in perspective (by historical standards the Boston attack, for example, looks pretty small). And they should make us think about the future too. Clauset reckons that the chance of seeing another war this century on the same scale as the second world war (with 60m deaths) is 41 per cent. Meanwhile, the chance of a 9/11-size event this decade is between 19 per cent and 46 per cent. This is, of course, still irritatingly vague; but as predictions go, it seems too large to entirely ignore. Least of all in a place such as Boston, London – or even Korea.
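A back-of-envelope sketch shows what a decade-level probability implies about yearly risk. Assuming a constant annual hazard rate (a simple Poisson model, which is this sketch's assumption rather than necessarily the authors' method), a decade probability p satisfies p = 1 − exp(−10λ), so the implied annual rate λ can be read off directly from the figures quoted above.

```python
import math

def annual_rate(p_decade, years=10):
    # Invert p = 1 - exp(-years * lam) under a constant-hazard (Poisson) model
    return -math.log(1.0 - p_decade) / years

# The 19-46 per cent range quoted for a 9/11-size event this decade
for p in (0.19, 0.46):
    lam = annual_rate(p)
    print(f"decade probability {p:.0%} -> implied annual rate {lam:.3f} "
          f"(about one such event every {1 / lam:.0f} years)")
```

On these assumptions, the quoted range corresponds to roughly one 9/11-size event every 16 to 48 years, which is why the prediction, however vague, is hard to dismiss.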