We often assume that the human mind is designed to be rational, and failures of rationality are seen as defects in human thought. In my long study of cognitive biases and decision fallacies, that’s been a continuing theme: here’s why your mind isn’t working right.
But as it turns out, that may be the wrong way to look at it. Steven Pinker, a Harvard professor specializing in evolutionary psychology and the computational theory of mind, argues that natural selection is not concerned with truth per se, and in many cases actually disfavors a truth-seeking mind. In an emergency, factual truth-seeking may be far too slow; a fast approximation, even one of questionable accuracy, can promote survival.
Even more importantly, non-factual or even counter-factual beliefs play an important social role. Believing that your own social group is better than other social groups helps you and your group be more successful. Believing that your romantic partner is unique, amazing, and special — even if objective evidence argues otherwise — contributes to a successful relationship.
In fact, some evolutionary psychologists argue that cognitive biases and decision fallacies exist precisely to protect your mind against challenges that would weaken your non-rational beliefs. You want the truth? You can't handle the truth, and thanks to cognitive biases and decision fallacies, you don't have to. Beliefs are deeply rooted in the human psyche, and it's no accident that they are so resistant to reason.
If a belief increases the survival potential of you and your group, on some level it's irrelevant whether it's actually correct. Beliefs that are clearly and immediately contra-survival ("Look, ma, I can fly!") are evolutionarily self-correcting. It's not logical argument and rational thinking that change your mind, but rather the impact of your belief when it collides with the cold, hard ground. When cause and effect are less clear and less immediate, the lesson tends not to sink in.
We are not rational animals. We are thinking animals, but much of our thought isn’t rational at all. We eagerly consume self-admitted lies whenever we read fiction, and as legions of media fans can attest, the imaginary worlds we enter are often more satisfying and fulfilling than the one in which we officially live.
Hokum can expand our universe or contract it. From David Copperfield, who cheerfully admits that what he’s doing is trickery, it’s one small step to Uri Geller, who turns an otherwise unremarkable spoon-bending trick into claims of telekinesis, and not a giant leap to conclude that Area 51 is hiding a world of alien powers right under our very noses. Conspiracy theories and supernatural beliefs of all sorts give us a simple, fulfilling way to eliminate complexity and ambiguity in our lives.
Rationality, enshrined above all in the scientific method, has been under continuing and unremitting attack ever since the scientific revolutions of the 16th and 17th centuries. As soon as Copernicus dethroned the earth as the center of the cosmos, the backlash began. In fighting the battle for truth, justice, and the scientific way, we've tended to assume that we shared a common goal with our opposition: a desire for truth. But that, as we've seen, is not a good assumption.
Which brings up the obvious question, echoing Pilate: what is truth? Gandhi argues that truth is self-evident; Churchill argues that it is incontrovertible. But Mark Twain says it best:
“Truth is mighty and will prevail. There is nothing the matter with this, except that it ain’t so.”