The world is a complicated place. When the design student Thomas Thwaites decided to reverse-engineer a toaster, he discovered that it comprised about 400 components; when Eric Beinhocker, then of the McKinsey Global Institute, tried to count the number of products and services available in a big urban economy such as London, he put the total in the tens of billions; and the Bank for International Settlements reported that at the start of the credit crunch in 2007, the face value of outstanding derivative contracts was more than one quadrillion dollars.
This complexity is a symptom of economic success. But it can pose serious risks, especially when dealing with “tightly coupled” systems, from a web of financial contracts to a nuclear reactor.
We rely on regulators to keep us safe in the face of this complexity, but regulations have themselves become more complex. The original US constitution was less than 5,000 words long; the Acquis Communautaire, the body of EU law to which new countries must sign up, is about 35 million words in English. This difference surely reflects the gap between the 18th century and the 21st more than any peculiarly Eurocratic love of the baroque. After all, the famous Glass-Steagall Act of 1933 ran to just 37 pages, but the recent Dodd-Frank Act, also designed as a response to a great financial crisis, runs to 848 pages despite delegating many details to regulators. Andy Haldane of the Bank of England estimated that the eventual Dodd-Frank rules would top out at about 30,000 pages.
This looks like bureaucracy gone mad, yet it also looks inevitable given the complexity of the economy. But perhaps it is not.
First, economic complexity may not be causing the regulatory complexity: in the financial industry, the causation often runs the other way. The first credit default swap, for example, was designed in response to regulations on minimum levels of bank risk capital. The boom in elaborately repackaged sub-prime mortgages was fuelled by the fact that the repackaged products were risky – and thus promised higher returns – but ticked the regulatory box that said “safe”. Complex rules invite complex rule-bending.
Second, complex rules may be a very poor response to complex situations. This is an argument made both by Haldane and by Andrew Zolli and Ann Marie Healy in their recent book, Resilience, though the two draw on different traditions.
Haldane cites work on decision-making by the psychologist Gerd Gigerenzer. Gigerenzer has found that in many different settings, simple rules of thumb outperform complex statistical rules unless huge amounts of data are available. With limited data, the clever number-crunching "over-fits", finding seemingly meaningful patterns in what is, in fact, random noise.
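The over-fitting point can be seen in a toy simulation. The sketch below is purely illustrative and assumes NumPy; the data, the degree-6 polynomial and the "predict the average" rule are my own stand-ins, not anything from Gigerenzer's or Haldane's work. A flexible model is fitted to a small sample of pure noise, where there is no real pattern to find, and its out-of-sample error is compared with that of the simplest possible rule of thumb.

```python
import numpy as np

rng = np.random.default_rng(0)

# 15 "observations" of pure noise: there is no genuine pattern here.
x_train = np.linspace(0, 1, 15)
y_train = rng.normal(0, 1, 15)

# Fresh noise from the same process, standing in for the future.
x_test = np.linspace(0.01, 0.99, 200)
y_test = rng.normal(0, 1, 200)

# Simple rule of thumb: always predict the historical average.
simple_pred = np.full_like(x_test, y_train.mean())

# "Clever" model: a degree-6 polynomial chasing every wiggle in the sample.
coeffs = np.polyfit(x_train, y_train, deg=6)
clever_pred = np.polyval(coeffs, x_test)

def mse(pred, truth):
    return float(np.mean((pred - truth) ** 2))

print(f"simple rule, out-of-sample error:  {mse(simple_pred, y_test):.2f}")
print(f"complex fit, out-of-sample error:  {mse(clever_pred, y_test):.2f}")
```

By construction the polynomial fits the training sample at least as closely as the flat rule; the interesting comparison is out of sample, where the patterns it has "found" are just noise and typically hurt rather than help.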
This may be true in financial regulation, too. Haldane looks at the largest 100 or so banks before the crisis and finds that, with hindsight, simple rules such as "highly leveraged firms are in danger" were 10 times better at predicting a future bailout than the sophisticated risk-weighting systems the regulators were actually using – although if simpler regulations were introduced, this pattern might disappear.
Zolli, meanwhile, turns to systems theory to argue that thickets of regulations and safety measures nudge us towards “robust-yet-fragile” systems, which are extremely safe in the face of predictable risks, yet crumble completely when something unexpected comes along.
Haldane and Zolli both argue that regulators should treat a simpler financial system as a goal worth pursuing in its own right. And both recognise that if we want simpler banks, simpler bank regulations would be a promising place to start.
Tim Harford is the presenter of Radio 4’s “More or Less”