Monday, July 5, 2010

Martian, Martian, Martian! (Part 14 of Cognitive Biases)

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them!

Here’s another installment of Cognitive Biases, this one brought to you by the range of Pa-Po. Pr-Pu will follow shortly.

(Marcia and Jan have only an indirect connection to what follows.)




Pareidolia

On July 25, 1976, a camera aboard Viking 1 took a series of pictures of the Cydonia region of the planet Mars. Above, you see a photograph of a 1.2-mile-long Cydonian mesa at 40.75° north latitude and 9.46° west longitude. Nothing special, right?

How about the picture below?

This is the famous “Face on Mars,” an example of the cognitive bias known as pareidolia, the tendency of the human brain to turn vague or random stimuli into objects of significance. Watching for patterns in clouds is an exercise in voluntary pareidolia. Some people overrate the significance of these patterns, especially when they see apparent religious imagery, like the infamous Virgin Mary grilled cheese sandwich or the Jesus tortilla.

When you look at a Rorschach inkblot, the images you see are the result of “directed pareidolia.” The blots are deliberately designed not to resemble any particular object, so that whatever you see is something you project onto them. Pareidolia works on sound as well: people tend to hear apparently meaningful words and phrases in recordings played backward. And the resemblance between “Martian” and “Marcia,” at least to my ear, suggested the Brady Bunch-inspired title of this installment.


Planning Fallacy

In a 1994 study, 37 psychology students were asked to estimate how long it would take to finish their senior theses. The average estimate was 33.9 days. They also estimated how long it would take "if everything went as well as it possibly could" (averaging 27.4 days) and "if everything went as poorly as it possibly could" (averaging 48.6 days). The average actual completion time was 55.5 days, and only about 30% of the students finished within the time they had predicted.

The researchers also asked students to estimate when they thought they would complete their personal academic projects, with 50%, 75%, and 99% confidence. (You can run the same calibration check on your own track record; a short sketch follows the list.) The results:

Only 13% of subjects finished by the date to which they had assigned a 50% probability;
19% finished by the date they had given a 75% probability;
and just 45% (fewer than half) finished by the date they had given a 99% probability.
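
Here’s a minimal sketch of that calibration check in Python. The project dates below are purely hypothetical, invented for illustration; the only idea being demonstrated is comparing the finish dates you predicted at a stated confidence level against the dates you actually finished.

```python
from datetime import date

# Hypothetical history: (finish date predicted at 75% confidence, actual finish).
projects = [
    (date(2010, 3, 1), date(2010, 3, 20)),
    (date(2010, 4, 15), date(2010, 4, 10)),
    (date(2010, 5, 1), date(2010, 6, 7)),
    (date(2010, 6, 30), date(2010, 7, 25)),
]

# Count the projects that actually finished on or before the predicted date.
on_time = sum(1 for predicted, actual in projects if actual <= predicted)
hit_rate = on_time / len(projects)

# Well-calibrated 75%-confidence predictions should come true about 75% of
# the time; in the study above, the actual hit rates fell far short of that.
print(f"Finished by the 75%-confidence date: {hit_rate:.0%} of projects")
```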

In project management, this is sometimes referred to as Hofstadter’s Law: It always takes longer than you expect, even when you take into account Hofstadter’s Law. (Douglas Hofstadter is the author of the 1979 work Gödel, Escher, Bach, where the law first appeared.) There are a number of theories about why this is so often true. To my mind, the best explanation comes from Eliyahu Goldratt in his 1997 business novel Critical Chain, which applies his Theory of Constraints to project management.

Goldratt argued that when people are asked to estimate how long a task will take, they tend to quote a “safe” number with plenty of protection built in. Knowing that safety is there, they then procrastinate or attend to other problems until the time actually remaining is no longer enough to get the job done. A close cousin is Parkinson’s Law, the tendency of work to expand to fill the time available for its completion.
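
One way to see Goldratt’s point in numbers: individually padded “safe” estimates add up to far more schedule than the project as a whole needs for the same level of protection, and that excess is exactly the safety that procrastination and Parkinson’s Law then consume. Below is a rough simulation, not anything from Goldratt’s book, assuming ten independent tasks with long-tailed (lognormal) durations purely for illustration.

```python
import random

random.seed(42)
N_TASKS, N_RUNS = 10, 10_000

def task_duration():
    # Skewed, long-tailed duration (median about 5 days): the shape that
    # tempts every task owner to quote a heavily padded "safe" estimate.
    return random.lognormvariate(1.6, 0.5)  # mu, sigma

# Simulate many projects, each made of N_TASKS sequential tasks.
totals = []
per_task = [[] for _ in range(N_TASKS)]
for _ in range(N_RUNS):
    durations = [task_duration() for _ in range(N_TASKS)]
    totals.append(sum(durations))
    for i, d in enumerate(durations):
        per_task[i].append(d)

def p90(samples):
    # 90th percentile of a list of samples.
    return sorted(samples)[int(0.9 * (len(samples) - 1))]

# Plan A: every task owner quotes a personal 90%-safe estimate.
padded_plan = sum(p90(samples) for samples in per_task)

# Plan B: the schedule just long enough to protect the whole project to 90%,
# i.e. aggressive per-task estimates plus one pooled buffer.
pooled_plan = p90(totals)

print(f"Sum of individually padded estimates:         {padded_plan:5.1f} days")
print(f"Schedule protecting the whole project to 90%: {pooled_plan:5.1f} days")
```

The gap between the two numbers is the hidden safety that, in Goldratt’s telling, quietly disappears into delayed starts and expanded work.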

Numerous books (including some of mine) try to point out solutions, but the problem persists.


Post-Purchase Rationalization

There’s the infamous story about the guy who accidentally dropped a quarter in an outhouse, so he pulled out a $20 bill and threw it in afterward. When asked why, he said, “If I’ve got to go down there, it had better be worth my while.”

Post-purchase rationalization is the tendency, once you’ve invested significant time, money, or energy in something, to convince yourself it was all worthwhile. In his brilliant 1984 book Influence, Dr. Robert Cialdini cites several examples. Just after placing a bet at the racetrack, people are much more confident that their horse will win than they were before they placed the bet. Researchers staged thefts on a New York City beach to see whether onlookers would risk themselves to intervene: only four in twenty bystanders gave chase. When the theft was staged again, with the supposed victim first asking the onlooker, “Would you watch my things?”, nineteen out of twenty tried to stop the theft or catch the thief.

Most interestingly, when an attendee at a sales meeting for Transcendental Meditation raised a series of embarrassing questions that undermined the presenter’s claims, enrollments went up, not down! One person who signed up explained that he agreed with the criticisms, but needed help so badly that he wanted to commit on the spot, before he had time to think them over and talk himself out of joining.

There’s value in consistency. But a foolish consistency, as Emerson reminded us, is the hobgoblin of little minds.


More to come…


Previous Installments

You can find the bias you’re interested in by clicking in the tag cloud on the right. To find all posts concerning cognitive biases, click the very big phrase.

Part 1 — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect.

Part 2 — Base rate fallacy, congruence bias, experimenter’s bias

Part 3 — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect

Part 4 — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias

Part 5 — Clustering illusion, conjunction fallacy, cryptomnesia

Part 6 — Disposition effect, egocentric bias, endowment effect, extraordinarity bias

Part 7 — False consensus effect, false memory, Forer effect, framing, fundamental attribution error

Part 8 — Gambler’s fallacy, halo effect

Part 9 — Hawthorne effect, herd instinct, hindsight bias, hyperbolic discounting

Part 10 — Illusion of asymmetric insight, illusion of control, illusory superiority, impact bias, information bias, ingroup bias, irrational escalation

Part 11 — Just-world phenomenon, loss aversion, ludic fallacy, mere exposure effect, money illusion

Part 12 — Need for closure, neglect of probability, “not-invented-here” (NIH) syndrome, notational bias

Part 13 — Observer-expectancy effect, omission bias, optimism bias, ostrich effect, outgroup homogeneity bias, overconfidence effect