November 5, 2011 1:30 am

Thinking, Fast and Slow

Why even experts must rely on intuition and often get it wrong
[Illustration: a turtle and a rabbit thinking]

Thinking, Fast and Slow, by Daniel Kahneman, Allen Lane, RRP£25, 512 pages

There have been many good books on human rationality and irrationality, but only one masterpiece. That masterpiece is Daniel Kahneman’s Thinking, Fast and Slow.

Kahneman, a winner of the Nobel Prize for economics, distils a lifetime of research into an encyclopedic coverage of both the surprising miracles and the equally surprising mistakes of our conscious and unconscious thinking. He achieves an even greater miracle by weaving his insights into an engaging narrative that is compulsively readable from beginning to end. My main problem in doing this review was preventing family members and friends from stealing my copy of the book to read it for themselves.

Kahneman presents our thinking process as consisting of two systems. System 1 (Thinking Fast) is unconscious, intuitive and effort-free. System 2 (Thinking Slow) is conscious, uses deductive reasoning and is an awful lot of work. System 2 likes to think it is in charge but it’s really the irrepressible System 1 that runs the show. There is simply too much going on in our lives for System 2 to analyse everything. System 2 has to pick its moments with care; it is “lazy” out of necessity.

Books on this subject tend to emphasise the failings of System 1 intuition, creating an impression of vast human irrationality. Kahneman dislikes the word “irrationality” and one of the signal strengths of Thinking, Fast and Slow is to combine the positive and negative views of intuition into one coherent story. In Kahneman’s words, System 1 is “indeed the origin of much that we do wrong” but it is critical to understand that “it is also the origin of most of what we do right – which is most of what we do”.

The “marvels” of System 1 include an ability to recognise patterns in a fraction of a second, so that it will “automatically produce adequate solutions to challenges”. An even more remarkable accomplishment is “expert intuition”, in which after much practice a trained expert, such as a doctor or a firefighter, can unconsciously produce the right response to complex emergencies. The classic example is the firefighting captain who correctly anticipates that a house on fire is about to explode and gets his team out in time yet cannot articulate why he knew that.

Of course, Kahneman is one of the fathers of the field of cognitive biases, and most of the book is indeed spent on the mistakes made by System 1. We get probability and uncertainty terribly wrong, usually leading to overconfidence and mistaken decisions. We react to identical situations differently depending on what is already on our minds. Even worse, we don’t know what we don’t know. In one experiment, chief financial officers of corporations were asked to forecast the return on the Standard & Poor’s index over the following year, giving one number they were 90 per cent sure was too high and another they were 90 per cent sure was too low. The true number was outside their intervals 67 per cent of the time.

The related “planning fallacy” is excess optimism on projects. Planners estimated that the new Scottish parliament building in Edinburgh would cost up to £40m in 1997; the final cost on completion in 2004 was £431m. We also fail to cut our losses as we realise how mistaken our expectations were. We ignore the possibility of rare events, except when such an event has occurred recently, and then we vastly overstate the likelihood of it happening again. The list goes on and on.

A transitional moment linking the positive and the negative aspects of thinking fast illustrates why the author’s personality – and thus the book – is so engaging. Kahneman regards even the experts as prone to the mistakes of System 1 listed above, and cheerfully admits that he is no exception. But he wants to know whether this view can be reconciled with cases such as that of the firefighting captain. So he engages one of his vehement critics on this issue and they debate their way to a joint paper. Their answer is that expertise can be learnt by prolonged exposure to situations that are “sufficiently regular to be predictable”, and in which the expert gets quick and decisive feedback on whether he did the right or the wrong thing. Experts can thus train their unconscious “pattern recognition” mechanism to produce the right answer quickly. So this certainly applies to chess, and it certainly does not apply to predicting the course of Middle East politics.

Another classic bias is called the “halo effect”, when somebody very good at some things is falsely assumed to be good at everything. This book itself could benefit from something similar, as amid its general excellence a few stumbles are easily overlooked. The main flaw comes predictably in the final section in which, according to some mysterious universal law, all authors in the social sciences are required to produce a public policy fix for the problems they have identified.

Kahneman’s endorsement of “libertarian paternalism” contains many good ideas for nudging people in the right direction, such as default savings plans or organ donations. But his case here is much too sweeping, because it overlooks everything the rest of the book says about how the experts are as prone to cognitive biases as the rest of us. Those at the top will be overly confident in their ability to predict the system-wide effects of paternalistic policy-making – and the combination of democratic politics and market economics is precisely the kind of complex and spontaneous order that does not lend itself to expert intuition.

But I hope that one quibble does not deter readers because this is one of the greatest and most engaging collections of insights into the human mind I have read. Kahneman’s book will help you Think Slow about what Thinking Fast gets very wrong, and what it gets very right.

William Easterly is a professor of economics at New York University
