Links and Comments from Sunday’s New York Times

Front page article: Indian Writers Return Awards to Protest Government Silence on Violence

File this under Conservative [religious] Resistance:

…growing activism from conservative Hindu nationalists who seek to suppress forms of expression they view as offensive to their religion. They have pressured publishers to withdraw books, pushed universities to remove texts from syllabuses and filed criminal complaints against those they deem to have insulted Hinduism.

Few writers have drawn more criticism than M.M. Kalburgi, a noted rationalist scholar who enraged far-right Hindu nationalists through his criticism of idol worship and superstition. Mr. Kalburgi said he received multiple death threats, and on Aug. 30 was shot dead at point-blank range in his home in Karnataka, a state in southern India. No arrests have been made.

\\

And in NYT’s Sunday Review, an essay by the venerable science writer George Johnson: Gamblers, Scientists and the Mysterious Hot Hand.

This concerns new studies of the “gambler’s fallacy”: the idea that if a coin comes up heads several times in a row, a tail is somehow more likely than usual on the next toss. As the essay explores, even scientists who study this can be tricked into detecting illusory patterns. Bottom line, from its final lines:

We’re all in the same boat. We evolved with this uncanny ability to find patterns. The difficulty lies in separating what really exists from what is only in our minds.
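A quick way to see the subtlety, sketched in Python (a toy of my own, not anything from the essay): pooled over one long run of fair coin flips, heads follows heads half the time, just as it should. But compute that same proportion separately inside many short sequences and then average the proportions, and the answer comes out well below one half, which, as I read the essay, is roughly the selection trap that tripped up the hot-hand researchers.

```python
import random

def heads_after_heads(flips):
    """Count flips that immediately follow a heads, and how many of those are heads."""
    hits = total = 0
    for prev, nxt in zip(flips, flips[1:]):
        if prev == 1:           # 1 = heads, 0 = tails
            total += 1
            hits += nxt
    return hits, total

random.seed(1)
n_flips, n_sequences = 4, 200_000

pooled_hits = pooled_total = 0  # tally over all flips pooled together
within_seq = []                 # proportion computed inside each short sequence

for _ in range(n_sequences):
    flips = [random.randint(0, 1) for _ in range(n_flips)]
    hits, total = heads_after_heads(flips)
    pooled_hits += hits
    pooled_total += total
    if total:                   # skip sequences with no heads before the last flip
        within_seq.append(hits / total)

# Pooled over everything, heads follows heads about 50% of the time, as it should.
print("pooled proportion:       %.3f" % (pooled_hits / pooled_total))
# Averaging the within-sequence proportions gives roughly 0.40 for four-flip sequences.
print("averaged per sequence:   %.3f" % (sum(within_seq) / len(within_seq)))
```

Every toss is a fair 50/50; the illusory pattern comes entirely from how the streaks are counted.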

I still have Johnson’s book Fire in the Mind: Science, Faith and the Search for Order on my to-read shelf.

\\

And in the NYT Book Review, a “By the Book” Q&A with science writer Matt Ridley.

What are the best books ever written about science?

“The Double Helix,” by James Watson, and “The Selfish Gene,” by Richard Dawkins. Both books gave me the important message — which my teachers had somehow mostly missed telling me — that science is not a catalog of facts, but the search for new and bigger mysteries.

And this (!):

The last book that made you furious?

Al Gore’s “An Inconvenient Truth.” It uses all the tricks of a fire-and-brimstone preacher to sell a message of despair and pessimism based on a really shaky, selective and biased understanding of the science of climate change.

And then this:

Disappointing, overrated, just not good: What book did you feel you were supposed to like, and didn’t?

Easy. The Bible. Not even the fine translations of William Tyndale, largely adopted by King James’s committee without sufficient acknowledgment, can conceal the grim tedium of this messy compilation of second-rate tribal legends and outrageous bigotry.

\\

Among the reviews themselves, this one by Leonard Mlodinow covers two books about “making sense of the world and trying to predict the future”: Richard E. Nisbett’s Mindware: Tools for Smart Thinking and Philip E. Tetlock and Dan Gardner’s Superforecasting: The Art and Science of Prediction. Mlodinow reacts with a

reinforced conviction that books on how to think should be required reading in high schools across the country.

(I agree: see The One Book I’d Have Every College Student Read.) He likes the former book, but says,

My verdict is mixed. If you are looking for a survey of the topics covered in the book’s six sections, this is a good one. You’ll learn about our overzealousness to see patterns, our hindsight bias, our loss aversion, the illusions of randomness and the importance of the scientific method, all in under 300 pages of text. But there isn’t much in “Mindware” that is new, and if you’ve read some of the many recent books on the unconscious, randomness, decision making and pop economics, then the material covered here will be familiar to you.

These are all topics of interest to me, as reflected in one of my Provisional Conclusions…

The second book is somewhat different, but the themes overlap: both are about drawing sound conclusions by trying to overcome the biases discussed in the first book. Mlodinow’s summary:

The central lessons of “Superforecasting” can be distilled into a handful of directives. Base predictions on data and logic, and try to eliminate personal bias. Keep track of records so that you know how accurate you (and others) are. Think in terms of probabilities and recognize that everything is uncertain. Unpack a question into its component parts, distinguishing between what is known and unknown, and scrutinizing your assumptions.

Of course, astoundingly few people have the discipline to do this; most voters in, I imagine, every country on earth do not.
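One concrete way to follow the “keep track of records” directive, and, as I understand it, the scoring used in Tetlock’s forecasting tournaments, is the Brier score: the mean squared gap between the probability you stated and what actually happened. A minimal sketch in Python, with invented forecasts purely for illustration:

```python
def brier_score(forecasts, outcomes):
    """Mean squared gap between stated probabilities and outcomes (1 or 0); lower is better."""
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Invented track record, purely for illustration: probability given to "it will happen",
# followed by what actually happened (1 = happened, 0 = did not).
careful_forecasts   = [0.9, 0.7, 0.2, 0.6, 0.95]
overconfident_calls = [1.0, 1.0, 0.0, 1.0, 1.0]   # always certain, wrong on one
what_happened       = [1,   1,   0,   0,   1]

print("careful forecaster:       %.3f" % brier_score(careful_forecasts, what_happened))
print("overconfident forecaster: %.3f" % brier_score(overconfident_calls, what_happened))
```

The always-certain forecaster gets punished badly for a single miss, while the one who hedges honestly scores better overall, which is the point of keeping score at all.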

This entry was posted in Conservative Resistance, Mind, Provisional Conclusions, Religion, Science.