Tuesday, April 27, 2010

Not Invented Here! (Part 12 of Cognitive Biases)

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them! In this week’s installment, brought to you by the letter “N,” we discuss the need for closure, the neglect of probability, the “Not Invented Here” (NIH) Syndrome, and notational bias.

Need for Closure

On a scale of 1 (strongly disagree) to 6 (strongly agree), how would you rate yourself on the following statements?
  1. I don't like to go into a situation without knowing what I can expect from it.
  2. I think that having clear rules and order at work is essential for success.
  3. I'd rather know bad news than stay in a state of uncertainty.
  4. I usually make important decisions quickly and confidently.
  5. I do not usually consult many different opinions before forming my own view.
These questions are part of the 42-item Need for Closure Scale (NFCS), which measures the strength of your need for cognitive closure: your desire for a definite answer that settles the matter, even if it isn’t the correct or best one.

The NFCS measures five facets of the need for closure. The first statement above tests your desire for predictability. In order, the others test your preference for order and structure, your discomfort with ambiguity, your decisiveness, and your closed-mindedness.
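Scoring the questionnaire is simply a matter of averaging the item ratings. The sketch below is purely illustrative and uses only the five sample items above; the real NFCS has 42 items, some of them reverse-scored, and published validation norms.

```python
# Illustrative scoring for the five sample NFCS items above (hypothetical:
# the full scale has 42 items, reverse-scored items, and validated cutoffs).
FACETS = ["predictability", "order and structure", "discomfort with ambiguity",
          "decisiveness", "closed-mindedness"]

def score(responses):
    """Average the 1-6 ratings; higher means a higher need for closure."""
    if any(not 1 <= r <= 6 for r in responses):
        raise ValueError("ratings must be on the 1-6 scale")
    return sum(responses) / len(responses)

answers = [5, 6, 4, 5, 3]          # one rating per facet, in order
for facet, rating in zip(FACETS, answers):
    print(f"{facet}: {rating}")
print(f"overall need-for-closure score: {score(answers):.1f}")  # prints 4.6
```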

If you have a high need for closure, you tend to rely more on information received earlier, and prefer the first workable answer you come across. You tend to search for information more narrowly, and apply rules and shortcuts to aid quick decision-making. A low need for closure is, unsurprisingly, associated with creativity, especially the process of coming up with a large number of potential solutions.

Need for closure is affected by outside circumstances as well as by basic temperament. Time pressure, in particular, plays a significant role. The need for closure is attributed not only to individuals but also to entire cultures, as when commentators argued that the “need for national closure” justified halting the Florida vote recount in the 2000 presidential election.

Neglect of probability

Several of our cognitive biases involve misapplying or misunderstanding probability in a given situation. So far, we’ve covered the base rate fallacy, the gambler’s fallacy, the hindsight bias, and the ludic fallacy.

Neglect of probability is something different. It’s the complete disregard of probability rather than its incorrect use. Children are particularly subject to this bias. In a 1993 study, children were asked the following question:

Susan and Jennifer are arguing about whether they should wear seat belts when they ride in a car. Susan says that you should. Jennifer says you shouldn’t. . . . Jennifer says that she heard of an accident where a car fell into a lake and a woman was kept from getting out in time because of wearing her seat belt, and another accident where a seat belt kept someone from getting out of the car in time when there was a fire. What do you think about this?

Here’s how one subject responded:

A: Well, in that case I don’t think you should wear a seat belt.
Q (interviewer): How do you know when that’s gonna happen?
A: Like, just hope it doesn’t!
Q: So, should you or shouldn’t you wear seat belts?
A: Well, tell-you-the-truth we should wear seat belts.
Q: How come?
A: Just in case of an accident. You won’t get hurt as much as you will if you didn’t wear a seat belt.
Q: OK, well what about these kinds of things, when people get trapped?
A: I don’t think you should, in that case.

Another subject replied, “If you have a long trip, you wear seat belts half way.” Notice that the comparative probability of the two events doesn’t come into the discussion at all.

For adults, a 2001 study found that a typical subject was willing to pay $7 to avoid a 1% chance of a painful electric shock, but only $10 to avoid a 99% chance of the same shock. A rational weighting of the probabilities would make the second payment nearly a hundred times the first; the finding suggests that probability is especially likely to be neglected when the outcomes produce anxiety.
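A quick back-of-the-envelope calculation shows how large the mismatch is. The sketch below is a hypothetical illustration, not part of the study: it treats the $10 payment for the 99% chance as a “fair” expected-value price, then asks what a 1% chance should cost by the same logic.

```python
# Probability neglect: compare rational (expected-value) pricing with the
# payments reported in the 2001 shock study. Dollar figures are the study's;
# the "fair" price assumes willingness to pay scales linearly with probability.

def fair_price(probability, cost_of_outcome):
    """Expected cost of facing the outcome with the given probability."""
    return probability * cost_of_outcome

# Calibrate the shock's subjective cost from the 99% payment: if $10 is a
# fair price for a 99% chance, the shock is "worth" about $10.10 of pain.
shock_cost = 10 / 0.99

rational_1pct = fair_price(0.01, shock_cost)   # ~ $0.10
reported_1pct = 7.00                           # what subjects actually paid

print(f"rational payment for a 1% chance: ${rational_1pct:.2f}")
print(f"reported payment for a 1% chance: ${reported_1pct:.2f}")
print(f"overpayment factor: {reported_1pct / rational_1pct:.0f}x")
```

By this admittedly crude yardstick, subjects overpaid for the low-probability shock by roughly a factor of seventy, which is the signature of treating a dreaded outcome as a near-certainty regardless of its odds.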

“Not Invented Here” (NIH) Syndrome

It doesn’t take a lot of experience in the world of work before you begin to encounter the “Not Invented Here” syndrome. Although mostly invoked as a somewhat cynical joke, the behavior is quite real and has a significant effect on organizations. Interestingly, it’s not always negative, and not always antithetical to creativity and innovation. As with many cognitive biases, the trick is to be conscious of how it works in your life and in your organization.

Numerous factors can trigger an NIH response. Personal and organizational egotism play a large role: we are inherently superior or unique, therefore what works elsewhere is inferior or inapplicable. Loyalty matters too. In the early days of personal computing, the British-designed Sinclair computers were hugely popular in Britain but hardly known in the United States, and the Japanese/Dutch MSX standard was successful in Japan and much of Europe, but not in Britain or the United States.

There can be economic advantages to NIH behavior. Television networks more commonly buy programs from suppliers in which they have a financial interest; such shows are more profitable to the network even when a show from a non-affiliated supplier draws higher ratings. Economic advantages can also accrue to individuals while penalizing their organizations. In one case, a department refused to help another group in the same company because doing so would, perversely, have lowered the bonuses of those doing the helping.

NIH can also form the basis of corporate strategy, and as such can be a vehicle to promote innovation rather than retard it. Apple, for example, commonly ignores or actively denigrates trends in the computer industry and invents its own. “Netbooks aren’t better than anything,” argued Steve Jobs. “They’re just cheap laptops.” Accordingly, Apple ignored the netbook model and invented its own: the iPad.

Notational bias

BRITANNUS (shocked).
Caesar: this is not proper.

THEODOTUS (outraged).
How!

CAESAR (recovering his self-possession).
Pardon him, Theodotus: he is a barbarian, and thinks that the customs of his tribe and island are the laws of nature.

This famous moment from George Bernard Shaw’s Caesar and Cleopatra illustrates notational bias, the assumption that conventions of one’s own society are equivalent to laws of logic or of nature. Examples abound. If you read most European languages, you read from left to right. It’s “natural.” But if you read Hebrew, it’s the other way around. It’s “natural” for Americans to drive on the right. But of course these are ultimately arbitrary choices that become the norm for a particular culture.

When you fall into notational bias, it’s not about whether you prefer your culture’s choice, or even whether your culture’s choice is arguably better. Instead, notational bias blinds you to the idea that there’s even a choice to be made.

Previous Installments

Part 1 — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect.

Part 2 — Base rate fallacy, congruence bias, experimenter’s bias

Part 3 — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect

Part 4 — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias

Part 5 — Clustering illusion, conjunction fallacy, cryptomnesia

Part 6 — Disposition effect, egocentric bias, endowment effect, extraordinarity bias

Part 7 — False consensus effect, false memory, Forer effect, framing, fundamental attribution error

Part 8 — Gambler’s fallacy, halo effect

Part 9 — Hawthorne effect, herd instinct, hindsight bias, hyperbolic discounting

Part 10 — Illusion of asymmetric insight, illusion of control, illusory superiority, impact bias, information bias, ingroup bias, irrational escalation

Part 11 — Just-world phenomenon, loss aversion, ludic fallacy, mere exposure effect, money illusion
