Thinking, Fast and Slow
- Penguin Books Ltd
- Publication Date: 10 May 2012
- Cognition & Cognitive Psychology
I first encountered the work of Daniel Kahneman and his collaborator Amos Tversky when I was in graduate school. A student of artificial intelligence, I was keenly interested in the question of how to design systems that can behave “rationally” despite their computational limits. I was thinking about computer systems, but of course human beings are also systems that have limits on the amount of “computation”—or at least thinking—that they can do. The adage of economists that rational behavior consists in doing the actions that do the most to further one’s goals given one’s beliefs simply didn’t seem to make a lot of sense when considered from a computational perspective: weighing one’s options in light of one’s goals and beliefs takes time, and if you try to weigh all your options, the world will change before you’ve finished. As various people have said, human beings are not “frictionless deliberators.”

Enter Kahneman and Tversky, whose work showed that people deviate in systematic ways from the standard model of rational thinking. The key insight is that the deviations are systematic—and they’re systematic in ways that accord with having finite reasoning capabilities. It’s not that people are irrational—Kahneman bristles at that description of his results. Rather, it’s that the model of rational thinking developed by the economists may be fine as a normative model, but it can’t work as a descriptive one.

Kahneman and Tversky collaborated on psychological studies of reasoning for more than a decade, writing hugely influential papers, including their 1974 Science article “Judgment under Uncertainty: Heuristics and Biases,” which is among the most highly cited papers in economics and which is reproduced in full in “Thinking, Fast and Slow.” Tversky died in 1996 at the age of 59, but Kahneman went on to win the Nobel Prize in Economics for their joint work in 2002. Indeed, “Thinking, Fast and Slow” is a joy to read not only for the science it contains, but also, secondarily, for the touching accounts he provides of his long collaboration with Tversky.

This is Kahneman’s magnum opus: an overarching review of 50 years of research that he conducted, first with Tversky, and subsequently with other leading social scientists. It’s remarkably accessible, even when he’s describing somewhat arcane points of psychology. Amongst the many cognitive biases he describes are these:

• Seemingly little things that make it easier to process information—the size or darkness of the font in text, for example—have a powerful effect on the likelihood that you will believe or approve of the information.
• We readily adopt a principle of WYSIATI—what you see is all there is. We neglect the possibility that our conclusions may be biased by our having only incomplete information.
• We tend to apply a halo effect in assessing other people’s character or capabilities: we make specific assessments based on an overall impression of them, and the errors that this induces are compounded by WYSIATI.
• When we’re trying to answer a challenging question, we often substitute an easier one, without even realizing it.
• Our numeric judgments are biased by “anchoring effects,” in which we are swayed by sometimes irrelevant numbers that we encounter during our reasoning process.
• We often assess the quality of an experience by what happens at the end of it.

These are just six of the dozens of biases that Kahneman describes. For each bias, he presents the sometimes astonishing psychological studies that support it.
For example, the last bias was uncovered in an experiment with a group of patients who underwent colonoscopies in the early 1990s, when anesthesia was not well administered for these procedures. One group of patients had a few extra minutes added to the end of the procedure, during which they experienced a relatively reduced level of pain. The patients in that group felt that the experience was overall less unpleasant, even though they’d experienced all the pain of the other group plus some more!

There’s an overarching coherence to Kahneman’s work: he explains human decision-making, and the cognitive biases it includes, in terms of two systems: a quick-and-dirty Malcolm-Gladwell-like System 1, which, when needed, provides information and tentative conclusions to a more effortful, deliberative System 2. However, while you might be tempted to think that the errors produced by the types of cognitive biases listed above are solely the result of System 1, that would be incorrect. In fact, the connection between “accurate” reasoning and the two systems is more complex, and even when humans rely on System 2, they are still not able to behave in the “fully rational” ways specified by classical economics. It takes some careful reading of Kahneman’s book to sort this out, but on careful consideration one realizes that even with System 2 the economist’s rationality can’t be fully attained. How could it be, by creatures with finite brains?
Two systems drive how we think. System 1 is fast, intuitive, and emotional; it is prone to faults and biases and is swayed by intuitive impressions. System 2 is slower, more deliberative, and more logical. Corporate strategies are shaped by loss aversion and overconfidence, by difficulties in predicting what will make us happy in the future, by difficulties in framing risks at home and in the workplace, and by cognitive biases more generally. Slower, more deliberative thinking can help us make better choices in both business and personal life.
This is a wonderful book. The basic premise is that the human brain has evolved from an animal brain by adding the cerebral cortex (the 'rational' bit) to the pre-existing brain. We now have two 'systems': the quick, intuitive one from the underlying brain, and the slower, reasoning capacity from the human part of the brain. As Kahneman amply and elegantly demonstrates, we use the intuitive part of our brain much more than we realise, and the reasoning part is lazy and used much less than we would otherwise imagine. The result is usually fine - when a leopard is leaping out of a tree at you, you don't want to be carefully, and slowly, analysing the prospects of the threat being an optical illusion. However, and this is where Kahneman is so good in his original thinking, his experiments, and his written explanations, there are many instances in modern life, and as homo economicus, where the quick and dirty response may not be so good. And, probably more importantly, the economic theory based on rational economic choices is thus baseless.

He has much more in this book - which I will go back and re-read.

A wonderful book, by a gifted writer and original thinker.

Read July 2012
Daniel Kahneman's engaging and well-documented treatise on how we make accurate and inaccurate decisions is, in some ways, reminiscent of what Daniel Ariely ("Predictably Irrational"), Malcolm Gladwell ("Blink"), and others have explored in terms of how irrational our decisions can sometimes be. It also takes us much deeper into understanding, by experiencing what he is describing, the ways we trick ourselves into thinking we are better at decision-making than we actually are. At the heart of Kahneman's work is what he calls System 1 and System 2 thinking--shorthand terms for the ways we approach decision-making (sometimes quickly, sometimes after engaging in intellectual efforts requiring plenty of work). In example after example, we see how fear and inaccurate perceptions we assume are true govern our decisions. And if we apply these lessons to workplace learning and performance (staff training), we can easily see not only the need to identify and correct misperceptions early in the learning process, but also the need to instill in ourselves and our learners an awareness of how easily we can be affected and influenced by stressful, emotional situations and by sources of information we accept with little regard for their veracity.
Reviews provided by LibraryThing.