Subtitled: “How to Fight For Truth and Protect Democracy.” The MIT Press, 2023.

I’m behind on writing up my recent reading here on this blog, so let me resume with this very short book, small in size and just 133 pages of text, published in August and read in September.

This post recalls earlier comments and quotes I’ve posted here about the author and the book; offers my own summary and take on key points; and then presents 2000 words of notes and summary I wrote as I read the book.

I’ve mentioned the author, and this book, a couple of times, e.g. here, which summarizes his thesis thus:

The most frustrating thing by far is when I see members of the media continue to use the word “misinformation” when they really mean “disinformation.” Misinformation is an accident or a mistake. It’s when you believe a falsehood but there was no intention behind it (so maybe you will change your mind when you get better information later). But disinformation isn’t like that. Disinformation is a lie. It is when someone intentionally shares a falsehood because it is in their best interest to create an army of deniers (who usually end up as victims of the disinformer), so they can get what they want (money, political power, etc).

There’s a similar statement in an interview linked to this post.

I’ve sorta suspected this, and have alluded to it, without having any authoritative source for its actual extent. That is, like many of us, I suppose, I have naively believed that the promulgators of conspiracy theories are to some degree sincere. They’ve discovered something that doesn’t make sense to them, so they look for some explanation other than the ‘official’ one. The earth is flat; the moon landing was a hoax. The attraction of being secretly ‘in the know’ draws them on. They’re not duplicitous; they’re just misinformed, or simply uneducated, and don’t understand why the claims they make are implausible or absurd.

But the gist here, according to McIntyre, is that many of these conspiracy theories, and other outlandish claims, are deliberate, promoted for nefarious purposes. Some may come from trolls just out to see how many people they can fool; but many are deliberate lies meant to advance an agenda, or protect corporate profits (as in the example of the cigarette companies), or pursue political power. McIntyre in particular focuses on MAGA denialism about the election, and broader efforts by conservatives to undermine the very notion of objective reality.

Brief summary of McIntyre’s chapters:

Ch1, January 6th was the result of 70 years of lies about tobacco, evolution, global warming, and vaccines. The world of reality denial.

Ch2, It began in the 1950s as the tobacco companies fought against a scientific study linking cigarette smoking to lung cancer. Their strategy was later applied to climate science, whose findings threatened the profits of oil companies.

Ch3, A denialist campaign requires creators, amplifiers, and believers. The strategy by creators runs like this:

  1. Cherry-pick evidence
  2. Believe in conspiracy theories
  3. Engage in illogical reasoning
  4. Rely on fake experts (and denigrate real experts)
  5. Have impossible expectations for what the other side must achieve

Ch4, The rise of social media has made amplification easier than ever before. So has Fox News.

Ch5, Our cognitive biases result in many people believing things unsupported by evidence. People don’t form beliefs based on facts; they’re more concerned about values, and community, and feelings.

Ch6, Finally the author offers advice on how to win the ‘war on truth’: it’s not about better education or critical thinking; it’s more about exposing and naming the truth killers, revealing their tactics and financial ties.


As I read the book I boiled the situation to this:

  1. There are bad actors who spread disinformation for their own purposes – to gain power or maintain their profits.
  2. Most people are not well-educated about the world and do not recognize when such disinformation is obviously false.
  3. Most people will believe what others in their tribe or community believe.
  4. Social media has made it easy for small numbers of people to spread disinformation.
  5. People want to believe in order to conform with their neighbors, and because it’s difficult to admit that one was wrong.

The basic lesson is: disinformation is out there. It’s bad people deliberately trying to fool you. Not very many of them, but social media amplifies their reach. It’s not a fair debate between two equally credible sides. Most people who believe in disinformation aren’t fools; they’re merely victims.


Full Notes/Summary (2000 words):

1, Truth Killers, p1

The storming of the Capitol on Jan 6th was the result of 70 years of lies about tobacco, evolution, global warming, and vaccines…. The world of reality denial, p2-3. The post-truth playbook, p3. Hannah Arendt, and Timothy Snyder. Now the GOP is poised to install whomever they like as president, and we may soon wake up in an electoral dictatorship.

2, The History of Strategic Denialism, p7

Denialism isn’t a mistake, it’s a lie. It’s a coordinated campaign, not some kind of accident. It’s about deliberately raising doubts where there were none.

Modern science denial began with four big tobacco companies, in 1953, deciding what to do about a scientific study linking cigarette smoking to lung cancer. Their answer: fight the science; take out ads; insist that ‘both sides’ of the controversy must be told, that nothing has been ‘proved.’ It worked. For 40 years. In 1998 ‘big tobacco’ was busted and charged a $200 billion civil fine—but they could keep selling cigarettes. Later, a leaked memo from 1969 showed they had known smoking was deadly all along. Discussion of the Oreskes/Conway book Merchants of Doubt, which showed how this strategy later worked for acid rain, the ozone hole, and global warming, so that companies could continue to profit. Likewise, the oil companies had known about global warming since 1977. The strategy worked; they realized they could lie about anything. Not just science, but reality itself.

3, The Creators, p15

MAGA is a denialist campaign, following the same flawed reasoning strategy as earlier ones:

1, cherry-pick evidence
2, believe in conspiracy theories
3, engage in illogical reasoning
4, rely on fake experts (and denigrate real experts)
5, have impossible expectations for what the other side must achieve

These explain why the deniers keep quibbling over small details while ignoring the vast amount of evidence against them; why they resort to absurd conspiracy theories; why they think that if the experts can’t *prove* their case (to some implausible standard), their own denialist beliefs are just as credible.

Now this has become reality denial, with claims of microchips in vaccines, or that the 2020 election was stolen and all the evidence destroyed.

Among Trump’s 30,000 lies while in office were examples of all five points, above, p18-20 (with sources in the endnotes).

There was a purposeful path leading down the rabbit hole, and it starts with disinformation. Example of Trump speech on Jan 6, 2021.

For the denialist strategy to work, it requires that disinformation be created, amplified, and believed. Disinformation is not based on facts; it is rooted in identity. And people tend to conform to their ‘tribe’. Our beliefs are molded by community, trust, values—for evolutionary reasons; even if an iconoclast is actually right he may not get along with others. But this allows for manipulation by propaganda and smooth-talking liars. [[ this is actually a key point just recently read about elsewhere. ]] The goal is to make you distrust anyone who doesn’t believe in the falsehood. It works with all those same issues. Thus polarization, and silos. Us against them. The other side is the enemy. Perhaps they should be thrown in jail. The author discussed these matters in his 2018 book Post-Truth. Politics trumps reality; beginning with science denial. Now it’s come to MAGA.

Jan 6th didn’t succeed, but 66% of Republicans still think the 2020 election was stolen. Trump wants to undermine the idea that fair elections are even possible. Since so many of his lies are so easily refuted, why do so many believe him?

He’d learned from the Russians, who beginning in the 1920s waged information warfare not only against foreign enemies, but against their own citizens. [[ This is of course what 1984 is all about. ]] Trump learned from the Putin administration. Two common tactics: “firehose of lies” and “whataboutism.” The former can offer multiple incompatible explanations to leave people confused. The latter is simply changing the subject. Trump does both all the time. Multiple excuses for Jan. 6th. Documents at Mar-a-Lago. False equivalence of Putin and other killers. It’s easy to imagine that Russia inspired much of Trump’s strategy. Russia has fueled much science denial in the US—about vaccines, climate change, and so on. Putin specifically wanted to discredit the Pfizer and Moderna vaccines, in order to promote his own. Thus the microchips-in-vaccines story, which came from Russia. Some 44% of Republicans believed it. Was Russia behind election denial? Certainly it welcomes chaos in the US, but Trump is the more likely villain. He’s managed to control the agenda of the Republican party, and to convince his followers to distrust elections and change the laws instead. Thus voter suppression laws targeting Democratic votes. And other tactics… The believers don’t care that over 60 courts have dismissed Trump’s claims of election fraud. It’s shocking how many Republican politicians follow his lead. Putin is presumably pleased.

4, The Amplifiers, p45

Disinformation would be useless unless it was propagated. It only takes a few. Much of the anti-vax propaganda was spread by 12 people on Twitter, 111 on Facebook. Or even fewer: Rupert Murdoch and his empire. Fox News leaves viewers less informed than if they hadn’t watched any news at all. And increasingly Fox is promoting stories from Russian-funded news sources, i.e. propaganda. Examples. Conservative interest groups (examples) now do what the tobacco and fossil fuel industries did for science denial. And the mainstream media tends to cherry-pick the most sensational parts of stories in order to increase audience engagement. Even major news outlets like CNN and MSNBC can succumb to confirmation bias—choosing stories that fit a preconceived narrative. Even if they’re not as bad as Fox, every news outlet should be expected to tell the truth.

And ‘telling both sides of the story’ doesn’t work, because the truth is *not* somewhere in between. To see if it’s raining don’t debate; look out the window. Trump and his allies should be treated like a hostile foreign autocratic regime. And yet MSM keeps having Trump surrogates on TV to spout their disinformation. Or pretending to be balanced by over-emphasizing failures of the Biden administration. E.g. the withdrawal from Afghanistan. High gas prices, though Biden had nothing to do with those. And of course everyone is driven by ratings.

And social media. They do very little to police for misinformation. They cite numbers, not rates. Social media is very good at policing some kinds of content: porn, suicides, terrorism. But not disinformation. It’s not in their financial interest to do so. True, they’re not *intentionally* promoting disinformation. What happens is driven by algorithms. They feed us more of what we look at. And play whack-a-mole as necessary. They won’t reveal the algorithms, or how many people are working on them. Of course many people who spread false information may believe it to be true. But that doesn’t help solve the problem. It just makes some people useful idiots.

Yet how do we decrease amplification? Social media has little incentive to do it. What then? Maybe restore the ‘Fairness Doctrine.’ Equal time to opposing viewpoints. It was repealed in 1987. Which led to Rush Limbaugh and Fox News. Isn’t that a problem with freedom of speech? Wouldn’t truth win out? No; the problem is disinformation within particular media sources, since most people are siloed.

But how would the Fairness Doctrine work in today’s media? It originally applied only to opinion, not to fact. But who decides what’s factual anymore? Another idea is to make web platforms liable for the damage they cause. If they can get sued, they might change their ways. A third idea is to pursue known individuals who amplify disinformation. The top 1%, say. It’s been done with crime. And a fourth idea might be to focus on the companies that run the internet. None of this violates the First Amendment if it’s not the government doing the deplatforming. Wikipedia recovered from trolls in 2006.

A final idea is transparency—make the social media algorithms available to academic researchers. And tweak them so they don’t pull people down rabbit holes. Why would they do that? If the alternative was regulation and lawsuits. …

5, The Believers, p91

These are the audiences that will either buy the propaganda or not. Our many cognitive biases lead people to believe things unsupported by evidence. Some 50% of Americans believe at least one conspiracy theory; examples p92-93. Is there any hope for these people? It’s like trying to treat people who are already infected. There are some techniques to make science deniers give up their beliefs, but they are time-consuming, and don’t always work. Some people just won’t give up. The author wrote How to Talk to a Science Denier with ideas, e.g. face-to-face conversation with someone they trust. But since they didn’t form their beliefs based on facts, how would facts change their minds? They’re more about values, and community, and feelings. And identity. Examples of personal encounters. “Street epistemology” and other techniques. [[ We read about this in the latest McRaney book. ]]

But these are not scalable; they’re no match for propaganda campaigns. Still, the author would get this message out to all of them: You have been lied to. Twain: “It is easier to fool someone than to convince them that they have been fooled.” Simple education might help, but there are too many people, and it takes too long. So what else?

6, How to Win the War on Truth, p103

First: admit we’re in a war. The military teaches about information warfare. But only from foreign enemies. And our own open way of life makes us more vulnerable to such attacks. But there are measures we can take. 1, increase the number of messengers for truth. 2, match the messengers for truth to the people we’re trying to reach, e.g. direct community outreach. 3, repeat the truth more often. And treat disinformation as an act of war. Mark Milley took lessons from Donald Trump, treating Jan 6, 2021 like a dress rehearsal. The Atlantic ran a story in Jan 2022 about how Trump’s next coup has already begun. It’s all part of an ongoing civil war. Now there are truth killers *in* Congress. How GOP legislatures are restricting votes. …

What can ordinary citizens do? 1, confront the liars. 2, heed history, how autocracies lock up dissidents and censor the internet. 3, resist polarization; don’t treat people who disagree with you as the enemy. Try to be trustworthy. 4, recognize that in some sense deniers are victims. 5, tune out the bullshit; insist that your favorite media outlet stop feeding the “both sides” beast that got us here. 6, don’t think this can all be solved by ‘better education’ or ‘critical thinking’. 7, stop looking for facile solutions; if they existed they would have been done by now. Push back against Facebook and Twitter. 8, engage in political activism to get Congress to regulate social media—make the algorithms transparent. 9, take solace in the fact that there are many others doing this. 10, continue to learn – Snyder’s book [[ which I reviewed here ]], another called Network Propaganda. And others. There’s another election coming up in 2024. Even if it goes well, disinformation is still running rampant.

Expose and name the truth killers. Reveal their tactics and financial ties. Boycott those who enable them. Recall Navalny.

The author recalls reading the encyclopedia as a boy. Why did progress take so long? He’d thought all truth had been discovered. And yet now we need to fight for the same things he thought were settled when he was a boy. Forces against truth are reborn in every age.
