Tuesday, January 18, 2011

A Credit To His Race (The final installment of Cognitive Biases)

At long last, we reach the end of our series on Cognitive Biases. In this installment, we'll study the ultimate attribution error, the valence effect, the von Restorff effect, wishful thinking, and the zero-risk bias.


Ultimate Attribution Error

A phrase I used to hear from time to time in my Alabama days was, “He’s a credit to his race.” It was never used to refer to a white person, of course, but only to blacks. On the surface, it appears to be a compliment, but it’s an example of the ultimate attribution error.

In the ultimate attribution error, people view negative behavior by members of an outgroup as an expression of a basic trait, and positive behavior as an exception to the norm. It relates to the fundamental attribution error, in which we explain our own behavior as a reaction to circumstances and other people's behavior as a matter of basic character, and it clearly relates to stereotyping as well. The ultimate attribution error is one of the basic mechanisms of prejudice.

Valence Effect

In psychology, valence refers to the positive or negative emotional charge of a given event or circumstance. The valence effect is a probability bias in which people overestimate the likelihood that something good will happen rather than something bad: it's the basic mechanism that stimulates the sale of lottery tickets.

There are numerous studies that demonstrate the valence effect. In one study, people assigned a higher probability to drawing a card with a smiling face from a shuffled deck than to drawing one with a frowning face.

The valence effect can be considered a form of wishful thinking, but it's been shown in some cases that belief in a positive outcome can increase the odds of achieving it, since you may work harder or refuse to give up as quickly.

Von Restorff Effect

First identified by Dr. Hedwig von Restorff in 1933, this bias (also called the isolation effect) predicts that an item that "stands out like a sore thumb" (called distinctive encoding) is more likely to be remembered than other items. For instance, if a person examines a shopping list with one item highlighted in bright green, he or she will be more likely to remember the highlighted item than any of the others.

Wishful Thinking

This popular cognitive bias involves forming beliefs and making decisions based on your imagination rather than evidence, rationality, or reality. All else being equal, the valence effect holds: people predict positive outcomes are more likely than negative ones.

There is also reverse wishful thinking, in which someone assumes that because it’s bad it’s more likely to happen: Murphy’s Law as cognitive bias.

Wishful thinking isn’t just a cognitive bias, but a logical fallacy: I wish that P would be true/false; therefore, P is true/false. It’s related to two other fallacies that are reciprocals of one another: negative proof and argument from ignorance.

In negative proof, the absence of absolute certainty about a claim is taken as proof of its opposite: climate scientists cannot say with 100% certainty that their claims about global warming are true; therefore, they must be false. The reciprocal fallacy is the argument from ignorance: no one can be sure that there is no God; therefore, there is a God.

Zero-Risk Bias

Since 2000, terrorist attacks against the United States or Americans abroad have killed about 3,250 people, the vast majority of them on 9/11. Spread across the U.S. population, that works out to odds of roughly one in a million per year of being a victim.

The Transportation Security Administration consumes about $5.6 billion a year. Its job is to reduce the chance of terrorist attacks on transportation infrastructure, primarily aviation, to zero. Let's assume that it is completely effective in that mission. Even so, a decade of spending at that rate comes to roughly $56 billion, or about $17 million per life saved.

Perhaps that's a completely reasonable price to pay to save a human life. From a logical point of view, however, you have to consider what else $5.6 billion a year might accomplish. Over a ten-year period, about 420,000 people die in car accidents in the United States. If that money could eliminate 100% of the risk of death from aviation terrorism, or 10% of the risk of death in a car accident, which risk would you choose to attack?

Common sense argues for the 10% reduction in car-accident deaths, but the zero-risk bias pushes the other way: it's the preference for completely eliminating one risk, however small, over making a larger dent in a bigger one. It values certainty over living with residual risk.
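To make the comparison concrete, here is the back-of-the-envelope arithmetic as a short Python sketch. The dollar and casualty figures are the rough estimates quoted above, not precise data, and both options are assumed to deliver exactly the risk reductions described.

```python
# Back-of-the-envelope comparison of the two options discussed above.
# All figures are the rough estimates quoted in this post.

decade_spending = 5.6e9 * 10   # ~$5.6 billion per year of TSA spending, over ten years
terror_deaths   = 3_250        # U.S. terrorism deaths since 2000 (mostly 9/11)
car_deaths      = 420_000      # U.S. car-accident deaths over a ten-year period

# Option A: the money eliminates 100% of the aviation-terrorism risk.
cost_per_life_a = decade_spending / terror_deaths

# Option B: the same money eliminates 10% of the car-accident risk.
lives_saved_b   = 0.10 * car_deaths
cost_per_life_b = decade_spending / lives_saved_b

print(f"Option A: {terror_deaths:,} lives, about ${cost_per_life_a:,.0f} per life saved")
print(f"Option B: {lives_saved_b:,.0f} lives, about ${cost_per_life_b:,.0f} per life saved")
```

On those numbers, the 10% option saves roughly a dozen times as many lives for the same money, which is exactly the comparison the zero-risk bias tends to obscure.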

There are other arguments that can be made in support of anti-terrorist activities, but the zero-risk bias is also operational here, and it leads to faulty decisions.


______________________________________________



With this installment, our long march through the wilds of cognitive bias comes to an end. I deeply appreciate the many insightful comments you’ve provided.


And now for something completely different…

2 comments:

  1. Great, reasoned thinking on the TSA. If only they cared about great, reasoned thinking...

  2. Vashti -

    Thanks. The problem is that TSA is in fact protecting against a very important risk: the reputational and career risk to its leadership and to the political class. This drives the decision far more than the risk to the public. Their decision is rational — wrong, in my opinion — but still rational.

    Regards,

    Michael
