Nonfiction Notes: Bobby Duffy, WHY WE’RE WRONG ABOUT NEARLY EVERYTHING

Bobby Duffy, Why We’re Wrong About Nearly Everything: A Theory of Human Misunderstanding (2018) (US edition Nov 2019)

Here's another book on a seemingly familiar theme: why people so frequently misunderstand the world, and what we can do to correct our thinking. However, it's aligned less with Shermer's WEIRD THINGS or Shtulman's SCIENCEBLIND than with Rosling's FACTFULNESS, in its focus on contemporary issues. It actually complements Rosling: Rosling arranges his book by "instinct" and shows how each one affects views of the world; Duffy arranges his book by people's views of the world, and shows which biases or social effects cause them.

The author is a public policy researcher in London who did a survey about how people's beliefs match reality. See https://perils.ipsos.com/, which has a short quiz you can take. (The average guesses invariably run well above or below the correct answers.)

For this book it didn't work to distill six or eight take-away points. The subbullets below give the general principles, but most of the book discusses many, many examples of things people are wrong about, which are hard to summarize except along the same themes as the book's chapters.

Summary:

  • People have been wrong about commonplace issues for decades. This isn't so much about ignorance as about what people are afraid of, in denial about, or ashamed of. The book explores various issues along five themes:
    • Many of us got lots of basic facts very wrong.
    • This is due both to faulty thinking and the media and politicians.
    • Our delusions provide clues to our own emotional responses.
    • Our delusions can have serious consequences.
    • Realizing the complexity of the problem is our only chance to deal with our delusions.
  • Health issues: people greatly underestimate rates of obesity; resistance to vaccines results from a lack of understanding of risk, and from sensational media and internet stories; people think they're better than average, and think other people are less happy than they are.
  • Sexual matters: people overestimate the number of partners men and women have had, and the number of teen pregnancies; we can understand this through motivated reasoning and confirmation bias, e.g. we're biased to find vivid anecdotes more memorable than boring statistics, and the media focuses on exceptional cases.
  • Money matters: most people understand percentages, but have little idea of how to make big financial decisions. We underestimate how much it will take to raise a child, how much we'll need in retirement, and degrees of inequality.
  • Immigration and religion: People overestimate the number of immigrants in their countries, and the people who disapprove of immigrants overestimate more greatly. Telling people the truth has a mixed effect on their opinions; many double down (the backfire effect). There's little or no relationship between immigration and crime; actually, immigrants commit fewer crimes. People also overestimate the number of Muslims in the population.
  • Safe and Secure. People hear lurid reports of individual crimes and so ignore statistics about overall decreases in crime (true in most countries); people think things are getting worse, that the past was better than the present. Journalists focus on the negative; conservatives politicize crime. Similar perceptions and trends for terrorism.
  • Politics. Politicians may not know the price of milk, but most people don't know how many people vote, or which party is in charge of the Senate. Trump plays on feelings over facts, e.g. about unemployment rates. People think others go by their gut while they themselves take evidence into account.
  • Brexit and Trump. Both are case studies for understanding how our delusions are driven by preexisting beliefs and wishful thinking. Politicians mischaracterize facts for their own advantage. People believe utterly fabricated news stories—and claim opinions about made-up claims (e.g. the “North Dakota crash”).
  • Filtering. Filter bubbles occur when, e.g., Google returns different results depending on your past history; that leads to echo chambers in which we hear only what we want to hear. We choose news sources, even friends, that support our already held views. People think their own behaviors are relatively common; thus everyone thinks there are more Facebook users than there are. The accelerating pace of change has created problems with fake videos, and the need for fact checkers and news literacy programs (i.e. critical thinking to overcome our evolutionary biases).
  • Worldwide Worry. People greatly overestimate how much the US spends on foreign aid, guessing 31% of the budget, when the reality is under 1%. (Here are explicit references to Rosling's work.) Negativity dominates opinions on global poverty and wealth: very few know global poverty has halved in the last 20 years. Similar reactions to vaccines. Trends are positive for terrorist attacks, murder rates, etc. And we're usually unaware of how standards change, how we're now outraged by things taken for granted in the past.
  • Who's Most Wrong? That is, which is the most wrong country? The winner: Italy. Second place: the US. Emotional expressiveness correlates with delusions; educational levels negatively correlate. Those preferring authoritarian leaders are more often wrong about realities: the confused and angry want simple solutions. Both Republicans and Democrats are wrong, to different extents, but the former are more wrong. Dunning-Kruger: the more confident countries are the poorer performers.
  • Dealing with Our Delusions. In the broadest terms, sources of our delusions range from weak math and critical literacy, to biases and heuristics, to rational ignorance; and then to media, social media, politics, and our own experience. We can try to be aware of these causes and correct for them. Ten ideas for forming a more accurate view of the world:
    1. Things are not as bad as we think—and most things are getting better. This is emotional innumeracy; beware letting your concerns bias your perceptions. Beware the ‘rosy retrospection’ that the past was good because we’ve forgotten the bad.
    2. Accept the emotion, but challenge the thought. Don’t deny you have emotional reactions, but try to understand them.
    3. Cultivate skepticism but not cynicism. Avoid the extremes. Beware the journalistic practice of “first simplify, then exaggerate”.
    4. Other people are not as like us as we think. Beware personal experience.
    5. Our focus on extreme examples also leads us astray.
    6. Figure out what’s real. Learn to recognize fake news but beware government programs to declare what’s true. There are apps to make you see what other folks are saying or reading. Some papers do that; red feed, blue feed.
    7. Critical, statistical, and news literacy are going to be difficult to shift, but we can do more. Ideally early on, though it’s difficult to change school curricula.
    8. Facts aren’t cure-alls, but they still matter. They work sometimes and not others. Some people just don’t care; but some are affected by fact-checking.
    9. We also need to tell the story. But be careful about which stories you tell.
    10. Better and deeper engagement is possible. For instance, ‘deliberation days’ to publicly discuss issues.

Complete notes taken while reading. In my notes I sometimes put down my own guess about a rate, in [[ brackets ]], before the author revealed the correct number.

Intro: Perils Everywhere

Author hated his psych classes at college. Didn’t like being tricked by biases in thinking. Author then spent 20 years at an opinion research firm. www.perils.ipsos.com. (quiz). Across many countries people get the answers mostly wrong. Why? How about, is the Great Wall of China visible from space? No, but a lot of people think so. Various reasons why. The book isn’t about such trivial questions, or even about cranks and conspiracy mongers, but about widespread delusions. How about, what percentage of the population is over 65? Most people guess about twice the actual rate in their country.

This doesn’t have anything to do with current misinformation programs. People have gotten commonplace questions like this wrong for decades. So this isn’t about ignorance so much as delusions. They’re more about what we’re afraid of, or in denial of. Or what we’re ashamed of.

How We Think, p9. How we grapple with numbers, mathematics, and statistical concepts. Not very well. One, two, three, and more. Wells quote. Many don’t understand basic concepts. Most people aren’t concerned. Studies of biases and heuristics reveal our focus on the negative, that we stereotype, that we imitate the majority. Emotional innumeracy creates a feedback loop; we overestimate crime, say, because we’re concerned about it, and vice versa. There are now studies of responses to physical stimuli; psychophysics.

What We’re Told, p13. Everyone cites fake news. But it’s of limited relevance here. More to the point are filter bubbles and echo chambers; unseen algorithms; selection bias. Direct communication from politicians.

A System of Human Delusion, p16. Of course these various effects interact. Books on one subject or the other tend to ignore the other. Actually they reinforce each other. And they’ve been around forever. We underestimate how baked-in they are. Still, people have a responsibility for resisting them, for the social good.

The following chapters will involve five points:

  • Many of us got lots of basic facts very wrong
  • This is due both to faulty thinking and the media and politicians
  • Our delusions provide clues to our own emotional responses
  • Our delusions can have serious consequences
  • Realizing the complexity of the problem is our only chance to deal with our delusions

Author recalls Isaiah Berlin's distinction between foxes and hedgehogs. This book is for foxes. Our aim is a fact-based understanding of the world.

Ch1, A Healthy Mind, p23

There’s lots of advice, from tabloids to government guidelines, about how to be healthy. The issues are complex, much of the data flawed, there are vested interests. Same for happiness.

Food for Thought, p24. In their studies, across 33 countries, an average of 57% of adults were overweight or obese. But people greatly underestimate these rates. Why? Definitions, different standards. Availability heuristic. People don’t assess even themselves correctly. And we tend to hang out with people like ourselves; the group’s behavior sets the norm.

Shame and Sugar, p29. People were asked about sugar consumption and exercise. Their own, and others’. People think others’ behaviors are worse than their own. Guesses are on p30; actuals are better, though people are actually bad at recording what they do.

The Dangers of Our Herding Instinct, p32. Does spreading information about health benefits work? The worse a problem is the more it seems the norm and so people don’t react. Experiment about group pressure, p33, with three lines. People overestimate rates of diabetes. Things that work are generally individual actions.

Inoculating Against Ignorance, p36. In 1955 the results of the polio vaccine were announced. Celebrations. Now, opponents of vaccines send hate mail and death threats. Even around the world, significant percentages believe in a connection between vaccines and autism. How did this happen? The issue is emotive; it involves understanding risk, which people struggle with; the media keeps the story alive; ‘balanced’ stories only polarize. Bogus websites have respectable sounding names. Case study narratives stick with us. These delusions have serious consequences: herd immunity; distraction from direct study of autism.

On Top of the World, p41. Well-being studies were common for a while. Most people are happy most of the time, and whatever governments do doesn’t affect happiness much. Kahneman discusses the experiencing self and remembering self. How an experience ends affects how the whole event is remembered. [[ how a story ends is most important – a good point for discussing narrative, even individual examples ]] In most countries, people guess that other people are less happy than they are, p44. One reason is the ‘illusory superiority bias’—we think we’re better than the average person when considering positive traits. Also, people answer differently in person or anonymously over the internet, p46. And, people aren’t always truthful—people want to look good. And so they deny illicit or embarrassing behavior.

So: the factors here involve biases, that we’re better than average; but also that we can be misled about vaccines or what makes us healthy. We can use tools that work with our biases rather than against them. We can avoid the trap of balanced reporting of discredited ideas.

Ch2, Sexual Fantasies

We’re wired for sex but don’t like to talk about it. So there’s little reliable information. Thus delusions breed. No, there’s no correlation between hand size and penis size. The idea that men think about sex every seven seconds is implausible. Sex education programs that contain no facts, only abstinence advice.

What’s Your Number? P51. How many partners have you had in your lifetime? Or that of others? [[ for straight people? How about, a dozen or so for men, fewer for women. ]] Actually in the US it’s 19 for men, and people guess 20; it’s 12 for women, and people guess 20. People’s guesses for women are all much higher than actuals. Yet the actual numbers should be about the same, right? Perhaps men subconsciously exaggerate. Another question: how many times have young people had sex in the past four weeks? [[ 18-29? 10, two or three times a week… OTOH a lot are single, so fewer ]] Actual: once or twice a week. But the guesses are much higher. As in the first question, some men guess way too high for women. Take-away: everyone guesses way too high compared to actuals.

What Were You Expecting, p57. It's easy to think of media focusing on the occasional pregnant teenage girl. Gottschall discussion, 57b. People greatly overestimate numbers of teen pregnancies. 23% each year?! The reality is 2%, in the US. All guesses are high, across countries, p59. Well, we're storytelling animals who find vivid anecdotes more memorable than boring statistics. The media have a responsibility to reflect reality, but today's media plays to consonance, to confirmation bias. Our personal experience with pregnant teens is low, so media reports exaggerate our ideas about them. Similar with dog bites and pit bulls. Actually, rates of teen pregnancies have fallen. But such trends don't make the news. Sticky ideas are simple, unexpected, concrete, credible, emotional, and tell a story, 61b. We're also resistant to changing our opinions. Study of this goes back to Festinger's 'cognitive dissonance' in the 50s, and an Illinois apocalyptic cult. Also, how heavy smokers were reluctant to accept the link between smoking and cancer. These days we talk about motivated reasoning, confirmation bias, etc. Darwin anticipated these. Changing someone's opinion needs vivid stories alongside facts. An exercise program for women in the UK used such stories as motivation.

A Moral Compass, p65. There were also studies of what people think other people think—social norms. Pluralistic ignorance can lead us to do things based on what we think others think or do, without knowing if we’re right or wrong. Example of how students at Princeton thought others loved the drinking culture far more than they themselves did. Similarly, questions about whether homosexuality is morally unacceptable. [[ in the US, a third? Or less ]] Actually it’s 37% in the US. Actual rates vary widely, p69. The trend is that people overestimate others’ prejudices. This might be because of our tendency to think we’re better than average, more tolerant than others. Or perhaps lingering stereotypes taken as most available.

Again, the delusions here are likely due to media and advertising… The key is to talk more about the reality of sex and sexuality.

Ch3, On the Money?

We're wildly overconfident in our financial prowess. Most people, but not all, understand percentages. Thaler & Sunstein's Nudge was about how the financial industry tried to help people overcome their biases and heuristics to save more. But on the big financial decisions of our lives, we're often clueless.

The Bank of Mum and Dad, p74. How much does it cost to raise a child? In the US it’s $235,000. The US guess was $150k. We don’t think about the whole cost. Furniture. Holidays. We tend to focus on the short term.

Stuck in the Nest, p77. How the millennials have stagnant wages and high debt, and so are stuck at home. Yet people overestimate the number of 25-35 year olds who still live at home; in the US the guess is 34%, actual 12%. Answers to such questions involve the need for accuracy, and confirmation of our worldview.

Our Golden Years? P79. How long will we live in retirement? People underestimate this. Estimates of life expectancy were better. But those who are already 65 will live longer, because they haven't already died. And that means we'll need more retirement savings. How much pension savings would you need to provide an income of 25k per year? Something like 300k. Careful planning is required. Thaler and others have done work here. Lack of willpower, and inertia, are involved. We tend to think in the short term, and to stick to the status quo. Thaler's work led to 'Save More Tomorrow', with things like auto-enrollment in pension plans. Most people think it's common not to have enough money to retire on; actually it's about half in the US. In contrast, most people know average real estate prices very well.

Unequal Measures, p85. Estimates of how much of a country's wealth the top 1% holds are also too high. In the US it's 37%, while estimates are 57%. And so on about inequality. These delusions reflect real concerns about these issues.

Ch4, Inside and Out: Immigration and Religion, p93

Immigration was a key issue in Brexit and in various elections, with appeals to give preference to natives, and attendant concerns about religion, esp Islam.

Imagined Immigration, p94. People overestimate the number of immigrants in their countries—but it's the people who disapprove of immigrants who overestimate more greatly. In the US it's 14% of the population; guesses are 33%. Again this reflects concerns, but also is caused by media coverage and political discussion. [[ wouldn't it be nice if only those who had the best grasp of reality were allowed to make decisions about matters like these? i.e. filter out the ones who think the problem is much larger than it actually is? or that it's a problem at all? ]] The overestimates are from those who think immigration is bad, 96b. Our mental image of a typical immigrant is also very wrong—people think most of them are refugees. What happens when you inform people of the true values? People defend their estimates. They think illegals weren't counted, or whatever. In general, telling people the truth has a mixed effect on their opinions. Facts matter, but other factors affect how people change their minds. There is the backfire effect, where people double down. There are also issues about whether immigrants take jobs from natives—economists disagree; it's not a zero sum game, though some lower-skilled jobs may indeed be affected. And concerns are affected by increasing rates of immigration, and where people get their news—Daily Mail vs. Guardian.

Immigrant Prisoners, p102. There's little or no relationship between immigration and crime. In fact the evidence is that immigrants commit fewer crimes. Yet people think the opposite, and thus overestimate the proportion of prisoners who are immigrants. In the US, with 14% immigrant population, the immigrant share of the prison population is under 5%.

Islamic States of Mind, p106. And people overestimate the Muslim population. In the US, it's 1%; guesses are 17%. When people were asked to project four years into the future, estimates were even higher. Anchoring has an effect here. And we're more responsive to negative information than positive. How the same question is framed—e.g. surviving an operation—makes a difference in the response.

All in our heads? P112. There's also psychophysics, how we perceive things like light and heat. We detect small increases from small sensations, but need larger increases to notice against larger sensations. When guessing proportions, we tend to hedge toward 50%.
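
// An aside of mine, not from the book: the pattern described here is Weber's law from psychophysics. The just-noticeable difference ΔI is a roughly constant fraction of the baseline intensity I:

    ΔI / I = k    (with k = 0.1, say, a baseline of 10 needs a change of about 1 to be noticed, while a baseline of 100 needs a change of about 10)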

So, we understand where some of these misperceptions come from, but we also know just putting the facts out there won’t bring people a ‘sensible’ view. It’s better to use narrative along with facts…

// one of the themes here, though he doesn't highlight it, is that the people who most fear things like immigrants are also the least well informed on the subject. But we knew that.

Ch5, Safe and Secure

Cites the quote "facts speak louder than statistics", from 1950, about how lurid reports of individual crimes trump, for some people, statistics on overall decreases in crime. Happens over and over. Tony Blair. It's a common area for delusions. People think things are getting worse.

More Murder? P119. When asked, people think murder rates are rising. In fact, murder rates have dropped in most countries. Again, such delusions reflect our actual concerns. And each new event or crime reinforces the current state; and we’re biased to think the past was better than the present. Examples of vacations, grades at school. Journalists tend to focus on the negative. And gossip serves a social function. Conservatives, e.g. Donald Trump, politicize crime. Beware the word ‘amid.’ Repeating false information leads to the ‘illusory truth effect’.

More Terror? P125. Incidents of terror cause people to lose all sense of proportion. The Global Terrorism Database provides reliable data from 1970. The general trend: terrorism deaths are down in most countries. But most people think they've gone up. The reasons are similar to those for murders.

The danger in these delusions is that they leave people vulnerable to those who offer easy solutions—like, the system is broken and we need to tear it up. Cf Pinker, Enlightenment Now. Obama quote: if you had to choose…

Ch6, Political Misdirection and Disengagement, p131

How politicians don't know the price of milk. The idea is that they're out of touch with the lives of 'ordinary people'. But what do ordinary people really know?

Democratic Deficit, p133. People underestimate the portion of the population that votes. Perhaps due to selective news reporting. In the US it's 58%. But maybe it's rational not to vote if you feel you have no impact on the results. Polls over the decades show that only 55% or so of Americans know which party is in control of the Senate. Some people just don't care; political ignorance can be rational. Higher voter turnout implies overall less-informed voters. Plato, Aristotle, Mill all had ideas about this.

A Man’s World? P137. People think gender equality will be attained much more quickly than actual estimates suggest. How snow was cleared in Sweden at one time greatly advantaged men. The percentage of women in governments varies widely; in the US, 19%.

The Left Behind, p142. Trump claimed very high unemployment rates, implying that many people had just 'given up.' He plays on feelings. In fact, people everywhere grossly overestimate the unemployment rate: 6% in the US, but the guess is 32%. People in some communities don't think they get their fair share of decision-making power, of resources, or of respect; such perceptions involve motivated reasoning, with confirmation and disconfirmation biases. People think other people go by their gut, while they themselves try to take evidence into account. Yet the percentage of people who strongly align with a particular party is lower for younger people.

Ch7, Brexit and Trump: Wishful and Wrongful Thinking, p151

What’s different in this post-truth era isn’t the mendacity of politicians, but the public’s indifference to it. Brexit and the Trump election are case studies for understanding how our delusions are driven by preexisting beliefs and wishful thinking.

The EU Referen-Dumb, p152. Of several nations, most people thought the UK paid the most into the EU, when actually it paid the least. But people aren't swayed by facts so much as by emotions. We believe things that aren't true largely through wishful thinking. We misinterpret data if the data imply something we don't want to believe. (Or some people do, not all.) There were also errors in estimating how much investment came from China, and in thinking Britain less dependent on the EU than it really is.

Bendy Bananas, p156. There was even controversy about whether the EU banned certain bananas. Boris mischaracterized abstruse regulations about banana imports. Boris also misstated the amount Britain sends to the EU each week by ignoring rebates. The Remain campaign was criticized for its overuse of facts.

Real Fake News, p161. In a survey for Buzzfeed, one in five Americans saw three utterly fabricated news stories, p161b. And many people believed them. Placebo misperceptions are when people claim opinions about claims that have been made up; examples include the "North Dakota Crash" and "choramine". Long before 'fake news' there was Colbert's 'truthiness', see quote 164t, reacting to a quote from the Bush administration about creating their own reality. Kurt Andersen made a similar point, p165.

The Wisdom of the Crowds and Wishful Thinking, p165. Long history of how average guesses from crowds are better than experts'. But it's difficult to apply the idea to political polling. Most countries thought Hillary would win. Some of this was wishful thinking; people blend what they think will happen with what they want to happen. Thus Russia thought Trump would win; Mexico most strongly thought Clinton would.

Our apparent disregard for evidence isn’t new. We need to be aware of how our emotional stance on issues affects wrongful and wishful thinking.

Ch8, Filtering Our Worlds, p171

How the chief economist at Google thinks statistician will be the sexy job of the next ten years. Or at least a very important one, to handle the huge amount of data. It was originally assumed the internet would enable the truth to emerge—but that was before we took into account all our biases and heuristics. And so the opposite happened.

Our Online Echo Chambers, p172. Filter bubbles occur when, e.g., Google returns different results depending on your past history. Tracking beacons left on our devices enable advertisers to target us. Filters lead to echo chambers in which we only hear what we want to hear; no more shared facts. Biased search engine rankings can affect voter preferences. Recall Cambridge Analytica. A real concern is disinformation campaigns designed to confuse and sow divisions. One method is to send out conflicting messages to convince people there are many versions of the truth. These ideas go back to Hannah Arendt (note quote p176). The consistency of what we see, regardless of its truth, determines what we believe.

We Choose Our Friends Too Carefully, p177. We choose news sources, and even friends, that support our already held views. Confirmation bias. The exchange of information is ritualistic. Thus different people react to messages in different ways. We engage in impression management.

Online, All the Time, p178. Out of every 100 people in your country, how many have access to the internet? [[ in the US? 95% or higher ]] Actual: 87%. Some results reveal a false consensus effect; people think their own behaviors are relatively common. Thus the relatively few internet users in India, who took the survey online, thought India’s rate was much higher than it was.

Bringing the World Closer Together? P182. Facebook is a behemoth, reaching 30% of the world's population. Yet people overestimate how many people in their country are Fb users. [[ in US, 50%? Less? ]] The reality is 58%, but guesses are 75%. Everyone thinks more of their fellow citizens have Facebook accounts than actually do; again, due to our tendency to generalize from our own experience.

Bursting Our Bubble, p186. The real challenge is the accelerating pace of change. Images are processed more quickly than words, and they can be manipulated too. Videos can be faked—of Obama, of porn. And voices can be faked. Facebook and Google are working to counter bubble effects. Fact-checking operations are increasing. The goal is third-generation checking, done in real time rather than countering misinformation later. Some call for regulation. Or implement news literacy programs, 190t, i.e. critical thinking to overcome our evolutionary biases. Professional fact checkers are better at detecting fake websites than undergrads or PhDs.

Ch9, Worldwide Worry, p193

International development, i.e. foreign aid, is always controversial. And so people greatly overestimate the amount being spent, guessing e.g. 31% of the US budget, when the reality is under 1%. Other global issues show similar trends. They're the focus of Gapminder, set up by the Roslings in 2005. Hans Rosling is famous for his TED talk.

Global Poverty and Health, p194. Negativity dominates these topics. How has poverty changed in the last 20 years? Very few know that it has halved. Actual answers are worse than random; they're chosen because of our biases. However much poverty may change, we still hear stories about it, about tragedies; we're aware of the negative info and never hear the positive. Similar reactions to vaccines. Negative info is attention grabbing; progress is gradual and not as easily noticed. Our negativity about the present results in rosy thinking about the past; Pinker quote 198b. And we're unaware of how standards change—we're outraged by some things that were taken for granted in years past.

It’s All Going Wrong, p199. Is the world getting better or worse? Two thirds thought it was getting worse. But many trends are positive: terrorist attacks, murder rates, poverty rates, etc. Thus the rise of a ‘new optimism’ movement—which is criticized as inviting complacency. But the opposite risk is to feel overwhelmed that nothing can be done.

One Versus Many, p201. Psychic numbing is the result of being confronted with large-scale tragedies. Thus asking for donations to help one little girl works better than asking on behalf of a large population of starving children. Even presenting two children reduces donations. We're motivated by emotion, not facts.

Feel the Fear, p204. A sense of fear isn’t always a bad thing; consider the reaction to the David Wallace-Wells article in 2017. Some said stoking fear was unhelpful. But it’s not so simple; people react differently, and change over time. And people still aren’t clued in about climate change.

So the new optimism is a good thing; it’s best to have a fact-based view of how the world has improved.

Ch10, Who’s Most Wrong? P207

At the Ipsos conference in London an award is given to the most wrong country. The winner is…Italy. The US is second. Sweden and Germany are best. How to explain the differences? Keep in mind our biases: we look for causation, for stories; we confuse correlation with causation. Note the Spurious Correlations website. We need to guard against those. And there are only 13 data points (countries). And hard to drill down for detailed data. But they can identify some broad indications.

1, Emotional Expressiveness. The more expressive the country, the more delusions—Italy and France, vs Sweden and Japan.

2, Education Levels. At an individual level there’s a clear pattern between education level and accurate perceptions.

3, Media and Politics. Among many possible correlations, one strong correlation was between agreement with the statement "I wish my country was run by a strong leader instead of the current government" (asked in 2016) and being wrong about realities. The confused and angry want simple solutions. Only a few specific issues correlated with political persuasion: questions about guns, immigrants, and terrorists. I.e., Republicans are more wrong than Democrats, p215. (Though both parties were often still wrong, just to different extents.)

Then there's the Dunning-Kruger effect: this illusory superiority bias is about how people with low abilities overestimate their own competence. Famous case of the bank robber who thought lemon juice would make him invisible. And it applies to countries: the more confident are the poorer performers. Chart p218. The US is in the middle.

Ch11, Dealing with Our Delusions, p221

The reasons we're often wrong are not just the media and politicians. Fake news isn't new. Trust in government has never been high. If anything, trust in numerous professions has increased in UK polls since 1983. Except clergy. Politicians and journalists are still the least trusted. See chart p223.

In the broadest terms, sources of our delusions range from weak math and critical literacy, to biases and heuristics, to rational ignorance; and then to media, social media, politics, and our own experience, p224b.

What Can We Do? P225. Even Daniel Kahneman admits biases are hard to overcome and his book won’t help. But you can try to bring System 2 into the picture, to alert you when not to trust System 1. Example: we underestimate distances on clear days. We can be aware of that. [ Note reference to Rolf Dobelli, author of book about 99 ways to think clearly. ] Keep in mind that not everyone is equally swayed by their biases.

Why Are We Not Becoming More Deluded? P227. If our information environment is changing so quickly, are we not getting more and more wrong? Evidence shows similar numbers on issues going back decades. Perhaps we’re just at the start of a new age of disinformation. Or maybe it just increases polarization. [ cf Ezra Klein? ] Author suggests 10 ideas for forming a more accurate view of the world.

1, Things are not as bad as we think—and most things are getting better. This is emotional innumeracy; beware letting your concerns bias your perceptions. Beware the ‘rosy retrospection’ that the past was good because we’ve forgotten the bad.

2, Accept the emotion, but challenge the thought. Don’t deny you have emotional reactions, but try to understand them.

3, Cultivate skepticism but not cynicism. Avoid the extremes. Beware the journalistic practice of “first simplify, then exaggerate”.

4, Other people are not as like us as we think. Beware personal experience…

5, Our focus on extreme examples also leads us astray.

6, Figure out what’s real. Learn to recognize fake news but beware government programs to declare what’s true. There are apps to make you see what other folks are saying or reading. Some papers do that; red feed, blue feed.

7, Critical, statistical, and news literacy are going to be difficult to shift, but we can do more. Ideally early on, though it’s difficult to change school curricula.

8, Facts aren't cure-alls, but they still matter. They work sometimes and not others. Some people just don't care; but some are affected by fact-checking.

9, We also need to tell the story. But be careful about which stories you tell.

10, Better and deeper engagement is possible. For instance, ‘deliberation days’ to publicly discuss issues.

We’ll never manage away our delusions, and we shouldn’t ignore them. We learn a lot by understanding why we’re often wrong.

\\
