Links and Comments: Fiction and Truth, Ignorance and Knowledge, Science Denial and the Scientific Attitude

Three interesting essays this week.

In the New York Times Sunday Review section, an essay by Yuval Noah Harari: Why Fiction Trumps Truth, subtitled “We humans know more truths than any species on earth. Yet we also believe the most falsehoods.”

We are both the smartest and the most gullible inhabitants of planet Earth. Rabbits don’t know that E = mc², that the universe is about 13.8 billion years old and that DNA is made of cytosine, guanine, adenine and thymine. On the other hand, rabbits don’t believe in the mythological fantasies and ideological absurdities that have mesmerized countless humans for thousands of years. No rabbit would have been willing to crash an airplane into the World Trade Center in the hope of being rewarded with 72 virgin rabbits in the afterlife.

Fiction has three advantages over truth, Harari says.

First, whereas the truth is universal, fictions tend to be local. Consequently, if we want to distinguish our tribe from foreigners, a fictional story will serve as a far better identity marker than a true story.


The second huge advantage of fiction over truth has to do with the handicap principle, which says that reliable signals must be costly to the signaler. … If political loyalty is signaled by believing a true story, anyone can fake it. But believing ridiculous and outlandish stories exacts greater cost, and is therefore a better signal of loyalty. If you believe your leader only when he or she tells the truth, what does that prove? In contrast, if you believe your leader even when he or she builds castles in the air, that’s loyalty!


Third, and most important, the truth is often painful and disturbing. Hence if you stick to unalloyed reality, few people will follow you.

Furthermore, people compartmentalize; they can be rational about some things, tribal and irrational about others, reflecting the way our brains work. He concludes,

Even if we need to pay some price for deactivating our rational faculties, the advantages of increased social cohesion are often so big that fictional stories routinely triumph over the truth in human history. Scholars have known this for thousands of years, which is why scholars often had to decide whether they served the truth or social harmony. Should they aim to unite people by making sure everyone believes in the same fiction, or should they let people know the truth even at the price of disunity? Socrates chose the truth and was executed. The most powerful scholarly establishments in history — whether of Christian priests, Confucian mandarins or Communist ideologues — placed unity above truth. That’s why they were so powerful.

\\

A complement to Harari’s recognition that people are irrational in order to bolster social cohesion is the realization that it’s possible – perhaps only at a personal level – to discern what is actually true.

From Forbes (via a Fb post I saw): Ethan Siegel, Your Glorified Ignorance Wasn’t Cool Then, And Your Scientific Illiteracy Isn’t Cool Now, subtitled, “The Universe is out there, waiting for you to discover it.”

There are so many remarkable things that we — as a species — have figured out about existence. We know what life is: how to identify it, how it evolves, what the mechanisms and molecules are that underpin it, and how it came to survive and flourish on Earth. We know what reality is made of on a fundamental level, from the smallest subatomic particles to the nature of space and time that encompasses the entire Universe. We know how matter behaves under extreme conditions, from the vacuum of space to the centers of stars to the ultra-cold conditions we can only achieve in laboratories here on Earth.

Our most valuable explorations of the world and Universe around us have been scientific ones: where we learn about reality by asking it the right questions about itself, and listen to the answers that it reveals.

Of course, not everyone knows all (or even most) of these answers. It’s impossible, in this day and age, to be an expert in all possible things. Most of us learn this at an early age: that most of what is known to humanity is not known to us as individuals, and that we can either study to gain that expertise and learn it, or go find the appropriate expert to learn what the answer is from them.

At least, that’s how you behave if you’re genuinely interested in learning the actual answer. You’ll either undertake the research yourself to reach expert-level competence, where you’ll learn how to perform critical tests and experiments that determine the answer, or you can learn to discern whose expertise is worth listening to and why, and then to take that expert advice. That’s how you gain meaningful knowledge.

But many of us don’t choose that route for a number of reasons. First off, it requires making a series of admissions to ourselves that are very difficult to accept. These include:

  1. admitting that we don’t know everything,
  2. admitting that we might be wrong about something that we’ve thought or even publicly espoused,
  3. admitting that we might have been swindled or conned by a charlatan,
  4. requiring us to do additional research, work, and mental labor,
  5. and to admit to ourselves that our heroes — the people we admire most — are often flawed or incorrect.

This is not an easy situation to be in, regardless of your education or background. It’s human nature to want to save face and appear like we knew the right answer all along. But, if we’re being honest with ourselves, that isn’t a real solution.

Some of the ways this manifests in society might seem like low-stakes affairs that aren’t worth much effort. Maybe you laugh when you hear about a rapper claiming the Earth is flat. Or a basketball player saying we never landed on the Moon. Or at the 25% of the population who thinks the Sun orbits the Earth. But it’s not laughable; it’s something we should all be ashamed of.

When we trust our own non-expertise over the genuine expertise of bona fide experts, terrible things happen. We wind up with cities without fluoridated drinking water, increasing cavities by 40% in the most low-income populations. We get vaccine-preventable diseases causing outbreaks and epidemics. We continue to pollute the Earth with greenhouse gases even as we’re experiencing the early consequences of global climate change.


Glorified underachieving, proclaiming falsehoods as truths, and the derision of actual knowledge are banes on our society. The world is made objectively worse by every anti-science element present within it. Nobody likes to hear that sometimes, they’re the problem. But sometimes, it really is on each of us to do better. The next time you find yourself on the opposite side of an issue from the consensus of experts in a particular field, remember to be humble. Remember to listen and be open to learning. The future of our civilization may hang in the balance.

\\

This essay in turn links to another in Newsweek by Lee McIntyre, author of the just-published nonfiction book The Scientific Attitude: Defending Science from Denial, Fraud, and Pseudoscience. The essay is Flat Earthers, and the Rise of Science Denial in America.

Every day in the media we see once-unthinkable science headlines. More than seven hundred cases of measles across 22 states in the U.S., largely due to vaccine deniers. Climate change legislation stalled in the U.S. Senate—due mainly to partisan politicians who routinely confuse climate and weather—even as scientists tell us that we have only until 2030 to cut worldwide carbon emissions by half, then drop them to zero by 2050. And, in one of the most incredible developments of my lifetime, the Flat Earth movement is on the rise.

There’s a growing realization that you can’t convince people with evidence; people dig in to their prior beliefs, the so-called “backfire effect.” (Haidt characterized this by saying people aren’t characteristically rational, they’re characteristically lawyerly, defending positions they’ve arrived at on non-rational grounds.)

The author visits a Flat Earth conference in Denver, talks to people, and tries to understand how they rationalize their beliefs.

The arguments were absurd, but intricate and not easily run to ground, especially if one buys into the Flat Earthers’ insistence on first-person proof. And the social reinforcement that participants seemed to feel in finally being “among their own” was palpable. Psychologists have long known that there is a social aspect to belief.

This last is a key point: people don’t form beliefs by independently examining evidence and reaching conclusions; they take on the beliefs of groups they belong to, or want to belong to, and rationalize them.

After numerous conversations, I came away with the conclusion that Flat Earth is a curious mixture of fundamentalist Christianity and conspiracy theory, where outsiders are distrusted and belief in Flat Earth is (for some) tantamount to religious faith. This is not to say that most Christians believe in Flat Earth, but almost all of the Flat Earthers I met (with a few notable exceptions) were Christians.


Virtually all of the standards of good empirical reasoning were violated. Cherry-picking evidence? Check. Fitting beliefs to ideology? Check. Confirmation bias? Check. How to convince anyone in this sort of environment? You don’t convince someone who has already rejected thousands of years of scientific evidence by showing them more evidence. No matter what I presented, there was always some excuse: NASA had faked the pictures from space. Airline pilots were in on the conspiracy. Water can’t adhere to a spinning ball.

So the author went after their reasoning.

There is a rampant double standard for evidence: no evidence is good enough to convince them of something they do NOT want to believe, yet only the flimsiest evidence is required to get them to accept something they DO want to believe.

(This recalls a point in the first Gilovich book (review) about reacting to confirming or challenging evidence in terms of “can I believe this?” vs. “must I believe this?”)

And so,

Instead of saying “show me your evidence” (which they were only too happy to do) or “here’s my evidence” (which they wouldn’t believe anyway), I asked “what would it take to convince you that you were wrong?” They seemed unprepared for this question.

This is a key point of the writer’s new book: the scientific attitude is about always being willing to change one’s mind, and knowing what evidence it would take to require one to do so.

…all science deniers use roughly the same reasoning strategy. Belief in conspiracy theories, cherry picking evidence, championing their own experts. These are also the tactics used by deniers of evolution, climate change, and the recent spate of anti-vaxx. How many more years before the Flat Earthers are running for school board, asking physics teachers to “teach the controversy,” just as Intelligent Designers did not too many years back?


In scientific reasoning there’s always a chance that your theory is wrong. What separates science deniers from actual scientists is how rigorously they pursue that possibility.
