We return to our series on cognitive biases this week with some new ways our thinking tends to be distorted. The series begins here.
Actor-observer bias — This cognitive bias makes us assume other people act the way they do because of their personality and not because of their situation. Do people steal food because they are immoral, or because they are hungry? The real answer may vary; the bias is to assume the former.
Of course, when it comes to ourselves, the bias is reversed. We excuse our own behavior by citing our circumstances. Fight this bias in judging other people by focusing extra attention on their circumstances; fight this bias in yourself by being aware of your own ethical choices.
Anchoring effect — When people were asked the percentage of African nations that are members of the UN, people who were first asked “Was it more or less than 45%?” gave lower estimates than those who were first asked “Was it more or less than 65%?”
The numbers don’t even have to be related. When an audience is first asked to write the last two digits of their Social Security numbers, and then to submit mock bids in an auction, the half with the higher two-digit numbers submitted bids between 60% and 120% higher than those of the other half!
You can use the anchoring effect to your advantage in negotiation or sales situations. To combat it, be aware of any numbers mentioned, and consciously try to disconnect them from your decision process.
Attentional bias — If someone with cancer drinks green tea, and the cancer goes away, attentional bias might make someone conclude that drinking green tea cures cancer. A little research turns up many cases in which someone who drank green tea had a remission of cancer.
But that leaves out three other possibilities that need to be tested: Have there been green tea drinkers whose cancer wasn't cured? Have there been people who didn't drink green tea whose cancer went into remission anyway? Do people who never drink green tea always suffer fatal cancers?
Attentional bias happens when you focus on one piece of evidence and fail to examine different possible outcomes. To fight attentional bias, consciously list the various possibilities and make sure you analyze each one.
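Those four possibilities form a 2×2 table: tea or no tea, remission or no remission. As a sketch of the full comparison (the counts below are invented purely to illustrate the method, not real medical data):

```python
# Hypothetical counts, invented for illustration only.
# Rows: drank green tea or not; columns: remission or not.
table = {
    ("tea", "remission"): 40,
    ("tea", "no_remission"): 160,
    ("no_tea", "remission"): 200,
    ("no_tea", "no_remission"): 800,
}

def remission_rate(group):
    """Fraction of a group whose cancer went into remission."""
    yes = table[(group, "remission")]
    no = table[(group, "no_remission")]
    return yes / (yes + no)

# Attentional bias looks only at the 40 tea-drinking remissions.
# Comparing all four cells shows identical rates in this made-up
# example (20% each), so tea adds no information here.
print(remission_rate("tea"))     # 0.2
print(remission_rate("no_tea"))  # 0.2
```

The bias is, in effect, reasoning from the first cell alone; the cure is to fill in and compare all four.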
Availability cascade — “Repeat something long enough and it will become true.” Political operatives of all stripes take advantage of the availability cascade. Start with an idea that summarizes a complex situation in a simple, straightforward manner, and you can start a chain reaction. The availability cascade is one of the processes that make up groupthink.
A variation on the availability cascade is to accuse others of falling victim to it to give the illusion that a minority position is in fact true. Both those who agree with the consensus on global warming and those who disagree with it accuse the other side of influencing the debate through this technique. However, it’s important to distinguish between a consensus of popular opinion, which is heavily influenced by repetition, and a consensus of scientific opinion, which rests on a body of evidence. (One can challenge the evidence, of course, but that’s a different kind of debate altogether.)
Availability heuristic — If something’s accessible in your memory, this cognitive bias causes you to think it’s also more probable. In surveys, people think dying in a plane crash is more common than dying in a car crash, when it’s the other way around. Plane crashes, of course, get more publicity.
A lot of racial or cultural stereotyping relies on the availability heuristic. “[Fill in the blanks] steal a lot. I know, because a [fill in the blank] robbed my neighbor.” Because a single close example stands out in memory, it seems probable that the characteristic is widespread, when of course a single case proves nothing one way or another.
Belief bias — Why is it so hard for our logical, well-reasoned arguments to penetrate other people's thick skulls? And, of course, why is it that people so seldom give logical, well-reasoned arguments to support their idiot ideas? Belief bias is the tendency for all of us to evaluate the logical strength of someone's argument based on whether we believe in the truth or falsity of the conclusion. We're all subject to this one; susceptibility to belief bias is independent of reasoning ability.
The White Queen in Through The Looking Glass practiced believing six impossible things before breakfast, and it's not a bad exercise. Make sure you look at a diversity of information, and spend effort imagining how a reasonable person could reach a conclusion so different from your own. This isn't an argument that you should necessarily change your beliefs, of course. But make sure your beliefs don't suffer from hardening of the mental arteries.
Next week’s installment will be brought to you by the letter “C.”