The one-sentence summary
Our cultural landscape is riddled with intellectual black holes, but by understanding the techniques people use to create them, we can avoid being sucked into these intellectual flytraps.
Can’t be bothered to read it? Too much screen time lately? Listen to the 5-minute podcast in 2 parts.
WHAT THE BOOK SAYS
- Significant numbers of people believe that aliens built the pyramids, that the Holocaust never happened, and that the World Trade Center was brought down by the US government.
- These self-sealing bubbles of belief can trap even the most intelligent of us and can pose significant dangers, touching on scientific reasoning, religion, relativism, brainwashing, button pushing, bias, and the power of anecdote.
- In order to immunize ourselves against the wiles of cultists, religious and political zealots, conspiracy theorists, promoters of flaky alternative medicines and proponents of crackpot theories, we need to understand 8 mechanisms that they use to create and maintain these insidious belief systems.
- Playing the mystery card. This involves making unjustified appeals to mystery, such as suggesting that a question is “beyond the ability of science to decide.” In fact, few scientists believe in scientism – the idea that science can answer every legitimate question – so the charge usually attacks a straw man.
- “But it fits!” This is coming up with ways of making evidence and theory ‘fit’ after all. Any theory, no matter how absurd, can be made consistent with the evidence. It is often combined with an approach the author calls the blunderbuss – firing off endless salvos of apparent problems with your opponent’s theory, many of them irrelevant or simply invented.
- Going nuclear. This is exploding a sceptical or relativist philosophical argument that appears to bring all beliefs down to the same level, forcing a draw in any debate.
- Moving the semantic goalposts. This dodges possible refutations by switching back and forth between meanings to confuse the issue.
- “I just know!” This claim suggests that the truth of someone’s belief has somehow been revealed to them, for example by some sort of psychic or god-sensing faculty.
- Pseudoprofundity. This is the art of making the trite, false or nonsensical appear both true and deep. These are often plays on words or statements that create the illusion of a profound insight into the human condition. A deepity involves saying something with two meanings – one trivially true and the other profound-sounding but false, such as “Love is just a word.”
- Piling up the anecdotes. Anecdotes are in most cases almost entirely worthless as evidence, but they can be highly persuasive when offered in great quantity.
- Pressing your buttons. This relies on certain kinds of non-truth-sensitive techniques for shaping belief, including isolation, control, uncertainty, repetition, and emotional manipulation. Religions, for example, deploy meditation and prayer, isolation, collective singing and chanting, architecture (imposing buildings), giving, and ritual.
WHAT’S GOOD ABOUT IT
- The HADD hypothesis states that we have evolved to be hyperactive agency detectors, which helps explain why many people believe in the existence of invisible agents such as religious or supernatural characters.
- A nonspatial mountain is an analogy for a claim that sounds meaningful but collapses into nonsense on inspection. By definition, a mountain must have a summit that is higher than the rest of it and valleys that are lower. Strip that spatial framework away, and we end up talking nonsense.
- In order for a theory to be strongly confirmed by the evidence, at least three conditions must be met: the theory must make predictions that are clear and precise, surprising, and true.
- Claiming that something is ineffable means saying that it is beyond our comprehension. Sophisticated theists use this approach to suggest that you have not refuted their sort of belief in a higher power. Some even hold the apophatic view that we cannot say what god is, only what he is not.
WHAT YOU HAVE TO WATCH
- A lot of the critique concerns religion, specifically belief in (a) god, so those with faith may not enjoy these perspectives.