I plan to post various kinds of responses to books I’ve read or am reading; some will be reviews, some will be notes with comments. This entry is one of the latter. (Note: this covers only the first 60 pages or so of a 400+ page book.)
Daniel Dennett is a well-known philosopher (Wikipedia) who’s applied his thinking to cutting edge issues of science – evolution, consciousness, free will — and has written numerous tomes on these subjects and others. This book is something of a hodgepodge, but a useful one. It starts by describing various thinking tools, including ‘intuition pumps’, and then applies those tools to issues of meaning, mind, and free will. Some of the ‘tools’ are negative ones, to be avoided, since they are most often used to mislead and deflect critical thinking. Thus this book is of interest to me not just for its application to those studies, but as a compilation of thinking tools that can be applied in any area of daily life.
Following is a summary, with occasional comments by me in [brackets].
Just as blacksmiths make their own tools, so have philosophers: Cartesian coordinates, probability theory, Bayes’ theorem. Basic tools include labels, examples, analogies and metaphors, staging (working on parts of a problem one at a time), and thought experiments the author calls ‘intuition pumps’. These can be rigorous arguments (reductio ad absurdum) or little stories, like Plato’s cave. Example: the ‘whimsical jailer’, who unlocks the doors every night while the prisoners are asleep. Are the prisoners free? A way to think about the problem is to ‘turn the knobs’ – examine every aspect of the premise and consider variations. [One technique I picked up somewhere is to take a question about a trend and, instead of asking what a difference of 10%, say, would mean, ask what would happen if the increase were to 100%, or infinity – take it to the extreme. Then the direction of the answer is usually apparent.]
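Since Bayes’ theorem is named above as one of the tools philosophers have made for themselves, here is its standard form for reference (the notation is the conventional one, not anything specific to Dennett’s book):

```latex
P(H \mid E) = \frac{P(E \mid H)\, P(H)}{P(E)}
```

That is, the probability of a hypothesis H given the evidence E equals the likelihood of that evidence under the hypothesis, times the prior probability of the hypothesis, divided by the overall probability of the evidence. It is the formal version of updating one’s beliefs as evidence comes in.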
Doug Hofstadter (with whom Dennett cowrote a book a while back) had favorite tools that included wild goose chases, dirty tricks, sour grapes, etc. (p9). Two of his that are included here are ‘jootsing’ and ‘sphexishness’.
The author’s style is to try to write for bright undergraduates.
A dozen general tools
1, Making mistakes. The history of philosophy is of making mistakes; it’s the key to making progress. Just as evolution works by mistakes. [Dennett is notable for applying the evolutionary principles, e.g. natural selection, to many branches of knowledge, not just biological evolution. Nice quote page 21b: “If you attempt to make sense of the world of ideas and meanings, free will and mortality, art and science and even philosophy itself without a sound and quite detailed knowledge of evolution, you have one hand tied behind your back.”]
Lesson: don’t hide from mistakes; savor them, examine them.
In some cases you can only make a guess and then correct from it, the way early seagoing navigation (dead reckoning) was done.
Card magicians proceed by a sort of trial and error: trying a trick that rarely works and, when it fails, moving on to tricks that are more likely to work.
Science means making mistakes in public; that’s how science advances. Making mistakes, actually, doesn’t break your career.
2, Reductio ad absurdum
Take an assertion or premise, and find a contradiction or preposterous implication. (Author gives examples from his own talks.) [This of course is a well-known technique.]
3, Rapoport’s Rules
Beware easy parodies of your opponent’s ideas. Acknowledge those ideas, find points of agreement, acknowledge anything you’ve learned, and *then* criticize.
4, Sturgeon’s Law
From 1953; Sturgeon famously said maybe 90% of science fiction is crap, but then 90% of everything is crap.
Maybe an exaggeration. Point is: don’t waste time attacking the bad stuff. Spend your attention addressing the best examples, according to the leaders in that field. (Also, avoid caricature; it’s easy, but does you discredit.)
5, Occam’s Razor
The familiar dictum “do not multiply entities beyond necessity”. That is, the simplest explanation is usually right. Of course, this is just a useful heuristic; it’s not *necessarily* true.
6, Occam’s Broom
This is the practice of sweeping inconvenient facts under the rug, used by intellectually dishonest champions of one theory or another – that is, it is an anti-thinking tool, a ‘boom crutch’, and one should beware employing it. Conspiracy theorists and creationists are experts at using this tool.
7, Using Lay Audiences as Decoys
Experts have a way of *under*-explaining themselves to fellow experts, so as not to risk offending; thus experts sometimes talk past one another. A tool to help cure this is for an expert to explain his idea to a lay audience (e.g. undergraduates), and ask fellow experts to stand at the back and listen, without participating. This way the listeners may understand the idea in a way they had not appreciated.
8, Jootsing
This means ‘jumping out of the system’, and it applies to the arts as well. That is, learn the rules and current standards before trying to create something novel. Creativity violates rules, but you have to know what the rules are first; you must know the tradition to subvert it.
9, Goulding
These are ‘boom crutches’ attributed to Stephen Jay Gould (the respected and prolific but contentious evolutionary biologist), who used rhetorical techniques like these in his arguments.
Rathering: suggesting a false dichotomy, e.g. “It is not the case that A; rather it is B, which is completely different.” This deflects the possibility that both are true, to some degree, or that there are other alternatives.
Piling on: implying more than the examples support [I’m paraphrasing], e.g. “We talk about A and B; nothing could be further from reality.” Which is further from reality – A? B? Everything?
Gould two-step: first, refute a straw man; second, suggest that in doing so, your opponents have grudgingly conceded your attack. A detailed example is given.
10, The ‘surely’ operator
When an author writes ‘surely’, it’s usually a ‘boom crutch’ indicating that he is sliding past a weak point in his argument.
11, Rhetorical questions
A rhetorical question usually indicates the author doesn’t want to talk about it, and wants you to think the answer is obvious, without defending why it should be so.
12, Deepities
These are statements that are supposed to sound profound but in fact don’t mean much of anything, or could mean lots of things, but the author hasn’t said which. Example: “Love is just a word.”
I think of *science fiction* as an intuition pump, to the extent that it is “suppose for the sake of argument” speculation. A turning of the knobs on reality, or at least on the particulars of one’s culture, background, and experience. (As history is, in another way.) Knowing that these variations exist, or can be imagined, helps defuse the tendency toward fundamentalist belief systems, or cultural presumptions.
Jootsing is one particular way in which SF does this.
If I have a theme to this blog, it is how science fiction is a bridge, a tool, to thinking about the most important questions in life.