This is perhaps the most influential book of the past couple of decades on the whole subject of human mental biases: how the ways we think entail errors of perception and rely on heuristics that are often, but not always, reliable. The book has been a consistent bestseller, on extended charts like Amazon’s, ever since it was published in 2011. The author won a Nobel Prize (in economics, since there is no Nobel Prize in psychology) and would have shared it with his very influential collaborator Amos Tversky, had Tversky not died in 1996, before the prize was awarded in 2002.
It’s one of those books that have influenced the great ‘thought leaders’ of our time, like Bill Gates (https://www.gatesnotes.com/About-Bill-Gates/Great-Summer-Reading). Kahneman has TED talks (https://www.youtube.com/results?search_query=kahneman), which I’ve not watched; nor have I looked at Wikipedia summaries of this book. I’m reading it for myself first.
This subject of human mental biases has fascinated me for some time now; it overturns the assumption that humans are rational beings who evaluate evidence, make decisions by logical means, and perceive the world as it actually is. No; we are all subject to biases and heuristics built into human cognition by evolution, because they serve human survival, or because they work most of the time and do relatively little harm when they fail.
My favorite books on this topic are the two by David McRaney, You Are Not So Smart (review here: http://www.markrkelly.com/Blog/2013/10/14/review-of-david-mcraneys-you-are-not-so-smart/), and You Are Now Less Dumb (http://www.markrkelly.com/Blog/2013/10/14/review-of-david-mcraneys-you-are-now-less-dumb-part-1/ and http://www.markrkelly.com/Blog/2013/11/19/david-mcraney-2-gravity-haiyan-grr-martin/). McRaney focuses on identifying these various biases, with many examples. Kahneman takes a more academic approach, identifying the aspects of thinking that generate the biases McRaney describes. I’m not sure, this far into Kahneman, whether he goes on to ask why, evolutionarily, these biases came into existence; perhaps he will, but he hasn’t in the first 100 pages, of which I summarize just the first 30 here.
Author wants to inform the average water-cooler conversation with understanding about judgments, the choices of others, how people make decisions. This entails understanding the distinctive patterns of errors people make—biases. We are generally unaware of how our beliefs are formed.
These ideas go back to 1969, when the author met Amos Tversky at the Hebrew University of Jerusalem and began many years of fruitful collaboration. They researched whether people are good intuitive statisticians. (Answer: no.) They devised and ran many experiments, e.g. one asking whether a shy, meek person is more likely a librarian or a farmer, and one about the letter K (do more words begin with K, or have K as their third letter?).
In 1974 they published a highly influential article in Science magazine that challenged the assumptions current at the time: that people are generally rational, and that departures from rationality are caused by emotion. (Article is in Appendix 1.)
Author notes that one reason for their success was that they included full examples of questions and answers from their experiments—an example of how “luck plays a large role in every story of success; it is always easy to identify a small change in the story that would have turned a remarkable achievement into a mediocre outcome. Our story was no exception.” (p9b [i.e. page 9 bottom]; the theme of Robert H. Frank’s recent book: https://www.amazon.com/Success-Luck-Good-Fortune-Meritocracy/dp/0691167400/)
There were some criticisms, but these ideas are now generally accepted. The two then shifted attention to decision making under uncertainty and published a second paper. For these two papers the author won a Nobel Prize in 2002, which Amos would have shared had he not died in 1996 [of metastatic melanoma, at age 59; https://en.wikipedia.org/wiki/Amos_Tversky].
This book is a summary of recent developments in cognitive and social psychology: stories of intuition, especially expert intuition, and how intuition can go wrong. Thinking occurs in two modes: intuition supplies quick answers (fast thinking); when no immediate solution presents itself, we switch to a slower, more deliberate mode of thought. The author describes these as System 1 and System 2, fast and slow thinking. These are not literal mechanisms in the brain, but broad descriptions of different kinds of thinking.
(Then follows outline of the book’s five major sections.)
Part 1: Two Systems
1. The Characters of the Story
A photo of an angry woman; the problem of solving 17 x 24. These involve different systems of processing in the brain. You grasp the first immediately; solving the second takes effort.
Description of System 1 and System 2: the first automatic, quick, and involuntary; the second requiring effortful mental attention.
Examples of activities addressed by each system: responding to sounds, faces, driving a car down an empty road; vs. searching memory for a particular sound, telling someone your phone number, checking the validity of a complex logical argument.
‘Paying attention’ is how you invoke System 2 to override the automatic responses of System 1.
When the driver of a car is overtaking a truck on a narrow road, for example, adult passengers quite sensibly stop talking. They know that distracting the driver is not a good idea, and they also suspect that he is temporarily deaf and will not hear what they say.
[Though from personal experience I know that not all adult passengers are attentive or perceptive in this way.]
Famous experiment: the Invisible Gorilla. When told to pay attention to a particular team in a basketball game, viewers *don’t see* a person in a gorilla suit walking across the court. We can be blind to the obvious, and blind to our own blindness.
The theme of this book is the interaction of these two systems. “Most of what you (your System 2) think and do originates in your System 1, but System 2 takes over when things get difficult, and it normally has the last word.” (p25.2)
Examples of conflict: reading words for size or meaning; optical illusions like the famous Müller-Lyer illusion. There are also cognitive illusions; can they be overcome?
The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high. The premise of this book is that it is easier to recognize other people’s mistakes than our own.
(( This recalls Matthew Hutson’s book about learning to recognize such biases and trying to *take advantage* of them… https://www.amazon.com/Laws-Magical-Thinking-Irrational-Beliefs/dp/0452298903/ ))
Author stresses these two Systems are ‘useful fictions,’ useful because “The mind — especially System 1 — appears to have a special aptitude for the construction and interpretation of stories about active agents, who have personalities, habits, and abilities.” p29.8
They could as well be called the “automatic system” and the “effortful system.”
(this is to page 30, of a 499 page book, of which 418 pages are text, not counting the articles in the appendices.)