It’s been a few weeks since the last installment of our survey of popular distortions in thought, perception, and decision-making. This installment is brought to you by The Story of O.
Observer-expectancy effect
In September 1969, Tim Harper, a student at Drake University in Des Moines, Iowa, published a tongue-in-cheek article in the campus newspaper titled “Is Paul McCartney Dead?” The article listed a number of supposed clues, including the claim that the surviving Beatles had planted backward messages in various songs.
About a month later, a caller to Detroit radio station WKNR-FM asked DJ Russ Gibb about the rumor and requested that he play “Revolution 9” backwards. Gibb did, and heard the phrase “Turn me on, dead man.”
Or so he thought.
The “Paul is dead” story quickly got out of control, and any number of people (some not even stoned) started to pick up clues. Even statements from Paul himself were not enough to stop the story. There are still claims today that photographs of Paul pre-1966 and post-1966 show significant differences in facial structure.
We see what we expect to see. If we’re looking for a particular answer, the cognitive bias known as the observer-expectancy effect results in unconscious manipulation of experiments and data so that yes, indeed, we find what we were looking for.
The use of double-blind methodology in performing experiments is one way to control for the observer-expectancy effect. Try this thought experiment: if you are wrong, what would you expect to see differently?
Omission bias
You know an opponent of yours is allergic to a certain food. Before a big competition, you have an opportunity to do one of two things. Which, in your judgment, is less immoral?
1. Slip some of the allergen in his or her food.
2. Notice that the opponent has accidentally ordered food containing the allergen, and choose to say nothing.
A clear majority say the harmful action (1) is worse than the harmful inaction (2). The net result for the opponent is the same, of course. The reason is omission bias, the belief that harmful inaction is ethically superior to harmful action.
Part of what reinforces the bias is that motive is harder to judge in cases of omission. “I didn’t know he was allergic!” you might argue, and there’s a good chance you’ll get away with it. Every employee knows the technique of “malicious compliance,” whether or not they personally use it: carrying out an order or directive with such appalling literal-mindedness that a disastrous result is guaranteed.
Even if no one else can judge your intent, you can. Don’t let the omission bias lead you into ethical choices you’ll later regret.
Optimism bias
Optimism bias is the tendency for people to be over-optimistic about the outcomes of planned actions. Excessive optimism can result in cost overruns, benefit shortfalls, and delays when plans are implemented or expensive projects are built. In extreme cases it can contribute to military defeats, the outright failure of a project, or economic bubbles and the market crashes that follow them.
A number of studies have found optimism bias in different kinds of judgment. These include:
- Second-year MBA students overestimated the number of job offers they would receive and their starting salary.
- Students overestimated the scores they would achieve on exams.
- Almost all newlyweds in a US study expected their marriage to last a lifetime, even while aware of the divorce statistics.
- Most smokers believe they are less at risk of developing smoking-related diseases than others who smoke.
Optimism bias can induce people to underinvest in primary and preventive care and other risk-reducing behaviors. It also affects criminals, who tend to underestimate the likelihood of facing legal consequences.
Optimism bias causes many people to grossly underestimate their odds of making a payment late. Companies have exploited this bias by raising interest rates to punitive levels after any late payment, even a payment to another creditor. People subject to optimism bias think this won’t happen to them, but eventually it happens to almost everybody.
Optimism bias also causes many people to substantially underestimate the probability of having serious financial or liquidity problems, such as from a sudden job loss or severe illness. This can cause them to take on excessive debt under the expectation that they will do better than average in the future and be readily able to pay it off.
There’s a good side to optimism bias as well. Depressives tend to be more accurate and less overconfident in assessing the probabilities of good and bad events occurring to others, but they overestimate the probability of bad events happening to themselves, making them risk-averse in self-destructive ways. A little unearned optimism, it seems, helps keep us functioning.
Ostrich effect
The optimism bias is linked to the ostrich effect, the common strategy of dealing with risk (especially financial risk) by pretending it doesn’t exist. Research has demonstrated that people look up the value of their investments 50-80% less often during bad markets.
Outcome bias
At the end of World War II, Montgomery Ward chairman Sewell Avery made a fateful decision. The United States, he was sure, would experience major difficulties moving from a wartime to a peacetime economy. Millions of troops would return, all seeking jobs. At the same time, factories geared for the production of tanks, bombers, and fighting ships would grind to a halt with no further need for their production.
Let Sears and JCPenney expand; Montgomery Ward would stand pat on its massive cash reserves (one Ward vice president famously said, “Wards is one of the finest banks with a storefront in the US today”), and when the inevitable collapse came, Montgomery Ward would swallow its rivals at pennies on the dollar.
As we know, it didn’t turn out that way. Instead of falling back into depression, the United States in the postwar years saw unprecedented economic growth.
Sewell Avery was wrong. But was he stupid?
Outcome bias describes our tendency to judge the quality of a decision by its outcome: Sewell Avery was stupid. But that’s not fair. The outcome of a decision doesn’t by itself prove whether the decision was good or bad. Lottery tickets aren’t a good investment strategy; the expected net return is negative. On the other hand, occasionally someone wins, and that doesn’t make them a genius. Wearing your seatbelt is a good idea, even though there are, alas, certain rare accidents in which a seatbelt could hamper your escape.
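To make the lottery arithmetic concrete, here’s a minimal sketch of the expected-value calculation; the ticket price, jackpot, and odds are hypothetical round numbers, not any real lottery’s figures.

```python
# Expected value of a hypothetical lottery ticket.
# All numbers are illustrative round figures, not any real lottery's odds.
ticket_price = 2.00        # dollars paid for one ticket
jackpot = 100_000_000      # dollars paid out to a winning ticket
p_win = 1 / 300_000_000    # assumed probability of winning

# Expected net return: probability-weighted winnings minus the sure cost.
expected_net = p_win * jackpot - ticket_price
print(f"Expected net return per ticket: ${expected_net:.2f}")  # about -$1.67
```

A negative expected value means buying the ticket is a bad decision on average, even though any individual ticket might win; that is exactly the distinction outcome bias blurs.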
As it happens, Avery was stupid — not because he made a decision that turned out to be wrong, but because he stuck to it in the face of increasing evidence to the contrary, even firing people who brought him bad news. But that’s a different bias.
Outgroup homogeneity bias
In response to the claim that all black people look alike, comedian Redd Foxx performed a monologue that listed some thirty or forty different shades of black, set against the single color of white. “No, dear white friends,” Foxx said, “it is you who all look alike.”
The proper name for this perception (in all directions) is “outgroup homogeneity bias,” the tendency to see members of our own group as more varied than members of other groups. Interestingly, this turns out to be unrelated to the number of members of the other group we happen to know. The bias has been found even when groups interact frequently.
Overconfidence effect
One of the most solidly demonstrated cognitive biases is the “overconfidence effect,” the tendency for your confidence in the quality and accuracy of your own judgment to exceed its actual quality and accuracy. In one experiment, people answered questions and rated how certain they were of each answer. People who rated their answers as 99% certain turned out to be wrong about 40% of the time.
The overconfidence gap is greatest when people are answering hard questions about unfamiliar topics. What’s your guess as to the total egg production of the United States? How confident are you in the guess you just made? (The average person expects an error rate of 2%, but the real error rate averages about 46%.)
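One way to see the gap is to compare average stated confidence with actual accuracy over many answers. Here’s a minimal sketch; the (confidence, correct) pairs are invented for illustration, not data from the studies above.

```python
# Minimal calibration check: stated confidence vs. actual accuracy.
# The (confidence, correct) pairs are invented for illustration.
answers = [
    (0.99, True), (0.99, False), (0.95, True), (0.90, True),
    (0.99, True), (0.80, False), (0.99, False), (0.95, True),
]

mean_confidence = sum(conf for conf, _ in answers) / len(answers)
accuracy = sum(correct for _, correct in answers) / len(answers)

# A positive gap is overconfidence: claiming more accuracy than you deliver.
print(f"Mean stated confidence: {mean_confidence:.0%}")
print(f"Actual accuracy:        {accuracy:.0%}")
print(f"Overconfidence gap:     {mean_confidence - accuracy:+.0%}")
```

A well-calibrated judge shows a gap near zero: answers tagged 99% certain should be wrong about 1% of the time, not 40%.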
Clinical psychologists turn out to have a high margin of overconfidence. Weather forecasters, by contrast, have none.
Previous Installments
Part 1 — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect
Part 2 — Base rate fallacy, congruence bias, experimenter’s bias
Part 3 — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect
Part 4 — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias
Part 5 — Clustering illusion, conjunction fallacy, cryptomnesia
Part 6 — Disposition effect, egocentric bias, endowment effect, extraordinarity bias
Part 7 — False consensus effect, false memory, Forer effect, framing, fundamental attribution error
Part 8 — Gambler’s fallacy, halo effect
Part 9 — Hawthorne effect, herd instinct, hindsight bias, hyperbolic discounting
Part 10 — Illusion of asymmetric insight, illusion of control, illusory superiority, impact bias, information bias, ingroup bias, irrational escalation
Part 11 — Just-world phenomenon, loss aversion, ludic fallacy, mere exposure effect, money illusion
Part 12 — Need for closure, neglect of probability, “not-invented-here” (NIH) syndrome, notational bias