This page summarizes principles and guidelines for understanding the world, in particular how to evaluate claims made in politics, by advertising and the news media, and by religion, science, and pseudo-science.
These principles fall into roughly three groups: logical fallacies, often used in politics and in casual arguments, recognized for centuries; cognitive biases, identified and understood mostly in recent decades by modern psychology; and perceptual illusions, ways in which what we perceive is not accurate, some venerable, others recognized only recently.
Logical fallacies are familiar rhetorical gambits like ad hominem, straw man, false dilemma, appeals to fear or ignorance, the genetic fallacy (i.e. judging a claim by its source: if so-and-so said it, it can't possibly be true), slippery slope, circular reasoning, and tu quoque (i.e. "you too," lately in the form "but what about [e.g. Hillary's emails]?").
Resources with details and examples:
- Wikipedia: https://en.wikipedia.org/wiki/Formal_fallacy
- Dedicated website https://yourlogicalfallacyis.com/
- Book and website https://bookofbadarguments.com/
- College text Logic and Contemporary Rhetoric, 13th edition (Amazon: https://www.amazon.com/Logic-Contemporary-Rhetoric-Reason-Everyday/dp/1305956028/), which also covers psychological biases and shows how to apply its principles to writing and to understanding advertising, politics, and the news media
Personal examples to follow, but all of these are routinely observed, especially in political debates. When you detect them, they are usually evidence that there is no legitimate case behind the argument. (E.g., if your rhetoric consists merely of insulting people via schoolboy nicknames, you haven't made any kind of argument about the issue at hand, and you're likely admitting you don't have a legitimate one.)
Cognitive biases were abstruse just a couple of decades ago, but more and more of them are working their way into popular culture. They include confirmation bias, anchoring, the Dunning-Kruger effect, in-group bias, the sunk-cost fallacy, and many others.
- Wikipedia: https://en.wikipedia.org/wiki/List_of_cognitive_biases
- Dedicated website https://yourbias.is/
Examples to follow.
Perceptual illusions include pareidolia, the tendency to see kittens in clouds and the Virgin Mary in a tortilla (https://en.wikipedia.org/wiki/Pareidolia), and broader issues stemming from the basic principle that our senses are contextual: we perceive reality not as it "truly is" (whatever that means), but in the context of what surrounds the things we perceive. A famous recent example is the blue dress (https://en.wikipedia.org/wiki/The_dress), whose colors varied from viewer to viewer; there are also many optical illusions (https://en.wikipedia.org/wiki/Optical_illusion) involving color, shape, and size.
Another is the inability of humans to appreciate enormous numbers, including long spans of time.
Heuristics are 'rules of thumb' that often work but are not guaranteed to (https://en.wikipedia.org/wiki/Heuristics_in_judgment_and_decision-making). Some of these overlap with the cognitive biases, e.g. the availability heuristic (https://en.wikipedia.org/wiki/Availability_heuristic), the tendency to draw conclusions from the examples closest to hand. (The notorious application of this is to conclude the world is going to pieces because of all the crises seen on the evening news. The corrective is to understand why the evening news shows you only the crises, and how those crises are infinitesimal samples of everything that occurs around the world each day.)
The most famous heuristic is Occam's razor (https://en.wikipedia.org/wiki/Occam%27s_razor), the notion that of all possible explanations, the simplest is the most likely to be true. But the rule doesn't *guarantee* that the simplest explanation is true.
- Understand how and why the media works.
- Understand how and why advertising works.
- Understand methods for drawing conclusions and testing them and correcting when necessary.
- Understand why "good people" can disagree about politics and religion, and why these disagreements are not matters of right and wrong or good and evil.
- Understand what sources in the media, or from among experts, can be trusted, and why others might be biased — e.g. understand the Media Bias Chart and beware relying on extreme sources on either side.
- "Follow the money" — who benefits? who loses? This explains much of political debate, which often isn't about principles but about who's trying to gain personal or financial advantage.
- "It is difficult to get a man to understand something when his salary depends upon his not understanding it" (Upton Sinclair), which entirely explains the organized opposition, by special interest groups, to the recognition of climate change.
Other topics to follow:
- Moral Foundations Theory — the idea that people respond to several dimensions of morality (care, fairness, loyalty, authority, sanctity, and liberty), and that differing sensitivities to these dimensions explain the differences between progressives and conservatives.
- Stages of moral development — it's not true that for people to be moral you need to post a list of ancient tribal rules on courthouse walls. First seen in Wilson 1980: Kohlberg's six stages of ethical development (Biblical fundamentalists never get past the first):
  1. Obedience and punishment orientation [e.g. Biblical threats of hell for disobedience to The Rules]
  2. Self-interest orientation (what's in it for me?)
  3. Interpersonal accord and conformity; social norms
  4. Authority and social-order-maintaining orientation; law-and-order morality
  5. Social contract orientation; laws are recognized as social contracts rather than universal rules
  6. Universal ethical principles; principled conscience [e.g. Kant's categorical imperative]
- Why virtually all conspiracy theories are wrong.
- Rules for reviewing, beyond liking something or not:
  - What was the author/filmmaker/etc. trying to do?
  - How well did they do it?
  - Was it worth doing?
Motto: Be skeptical, but not cynical. Be savvy.
— In work beginning Dec 2019 —