Gilovich, 1

I just finished a new book co-written by Thomas Gilovich, author of the 1991 volume HOW WE KNOW WHAT ISN’T SO: The Fallibility of Human Reason in Everyday Life — the earliest volume in my library on the theme of that subtitle. From my perspective as a casual reader, that theme seems to have matured substantially over the past two decades, judging from the increasing frequency of books such as those I’ve read in the last couple of years by David McRaney, Jonathan Haidt, Jesse Bering, and Chris Mooney (all of which liberally cite many more recent psychological studies performed over those past couple of decades).

The new book, co-written with Lee Ross, is THE WISEST ONE IN THE ROOM: How You Can Benefit From Social Psychology’s Most Powerful Insights. This is the second recent book — the other is Matthew Hutson’s THE 7 LAWS OF MAGICAL THINKING: How Irrational Beliefs Keep Us Happy, Healthy, and Sane (2012) — that takes the lessons of all these psychological insights and presents them to the reader as ideas to *use* to one’s advantage. This parallels one of the key themes of this blog: the way these discoveries, along with other discoveries in the past couple of centuries about the extent of the natural world in time and space, and in a different sense the ideas of science fiction, can be used to suggest that one’s personal experiences and social circumstances are only the tiniest sliver of reality, and that reality is in fact deeply weird and unintuitive — but that by using these tools and heuristics, it’s possible to *think around the problem* and, at least intellectually, perceive a larger truth.

But first, a summary of Gilovich’s 1991 book. Compared to more recent books on its general theme, Gilovich focuses not so much on human mental biases as on real-world considerations of how people do not experience balanced, objective data. Thus early chapters consider how people draw conclusions from what is actually random data, incomplete or unrepresentative data, ambiguous or inconsistent data, and so on. (Some of his examples overlap with those of mathematician Jordan Ellenberg.) He then moves on to psychological issues — ideas currently described as various biases (confirmation bias, self-enhancement bias, and so on), though without those exact labels. He does mention the “Lake Wobegon effect”; the notion of reacting to confirming or challenging evidence as a matter of “can I believe this” vs. “must I believe this”; and the problem of second-hand information: how a famous psychological case about “Little Albert” (from 1920) got recounted in many psychology textbooks, but with details omitted or exaggerated… to make it a better ‘story’.

This book had some ideas I had not heard about before, including considerations of what makes a ‘good story’, in the sense of a story being told by one person to another. Such accounts are subject to ‘sharpening’ and ‘leveling’, i.e. highlighting key points and leaving out extraneous ones. Informative stories often omit qualifications, especially in service of promoting an agenda, e.g. in service of a ‘greater truth’. [[ Which I suppose must explain the many examples of conservative politicians and evangelical historians who seem to have no trouble bearing false witness on matters of fact. ]] And the book discusses the ‘imagined agreement of others’: how people project their own ideas and values onto others, especially these days via the filters of chosen news sources.

Finally, as in the current book, the 1991 book ends with several examples of “Questionable and Erroneous Beliefs” that were current back then: “alternative” health practices, interpersonal strategies (why people boast or name-drop or self-handicap), and especially ESP, with emphasis on how news coverage of claims sells while rebuttals of those claims do not, and on the ‘will to believe’ in some kind of ‘transcendental temptation’.

Concluding chapters suggest that we can at least compensate for what we can’t cure, p186t:

The underlying causes of faulty reasoning and erroneous beliefs will never be eliminated. People will always prefer black-and-white over shades of grey, and so there will always be the temptation to hold overly-simplified beliefs and to hold them with excessive confidence. People will always be tempted by the idea that everything that happens to them is controllable. Likewise, the tendency to impute structure and coherence to purely random patterns is wired deep into our cognitive machinery, and it is unlikely to ever be completely eliminated. …

And so we should develop habits of mind:

  1. First, be wary of drawing conclusions from incomplete or unrepresentative evidence;
  2. Second, be aware of how often our role, status, or position in society can cut us off from certain classes of informative data;
  3. Third, be aware of how we are inclined to interpret data to conform to our pre-existing beliefs. E.g., consider how you would react if the opposite result occurred — would that also support your beliefs?
  4. Finally — and this is my most important take-away from this book — appreciate the value of a science education: the concepts of control groups, regression, doubt, and uncertainty. For these reasons it may be more useful for students to be exposed to the ‘probabilistic’ sciences — psychology, economics, even medicine. Social scientists may have ‘physics envy’, but the social sciences have developed methodological innovations to deal with messy, real-world situations — and have an obligation to pass these lessons on to students.

What social sciences might best offer both their students and the general public is their methodological sophistication, their way of looking at the world, the habits of mind that they promote—process more than content. … An awareness of how and when to question and a recognition of what it takes to truly know something are among the most important elements of what constitutes an educated person. Social scientists, I believe, may be in the best position to instill them.

p193b

In short: require every college student to take a course in a social (or medical) science, so that they understand the ideas of messy data, control groups, and uncertainty. I would endorse that.

Notes and comments about the new Gilovich and Ross book in next post.

This entry was posted in Book Notes.