
Tuesday, December 28, 2010

When 1+1=3 (Part 19 of Cognitive Biases)

Our 19th installment of Cognitive Biases covers the status quo bias, stereotyping, and the subadditivity effect.


Status Quo Bias

Sigmund Freud suggested that there were only two reasons people changed: pain and pressure. Evidence for the status quo bias, a preference not to change established behavior (even if negative) unless the incentive to change is overwhelming, comes from many fields, including political science and economics.

Another way to look at the status quo bias is inertia: the tendency of objects at rest to remain at rest until acted upon by an outside force. The corollary, that objects once in motion tend to stay in motion until acted upon by an outside force, gives hope for change. Unfortunately, one of those outside forces is friction, which is as easy to see in human affairs as it is in the rest of the material universe.

Daniel Kahneman (this time without Amos Tversky) has designed experiments that produce status quo bias effects reliably. The bias seems to be a combination of loss aversion and the endowment effect, both described elsewhere.

The status quo bias should be distinguished from a rational preference for the status quo in any particular instance. Change is not always good in itself.

Stereotyping

A stereotype, strictly speaking, is a commonly held popular belief about a specific social group or type of individual. It’s not identical to prejudice:


  • Prejudices are abstract, general preconceptions or attitudes toward any type of situation, object, or person.
  • Stereotypes are generalizations of existing characteristics that reduce complexity.


The word stereotype originally comes from printing: a duplicate impression of an original typographic element used for printing instead of the original. (A cliché, interestingly, is the technical term for the printing surface of a stereotype.) It was journalist Walter Lippmann who first used the word in its modern interpersonal sense. A stereotype is a “picture in our heads,” he wrote, “whether right or wrong.”

Mental categorizing and labeling are both necessary and inescapable. Automatic stereotyping is natural; the necessary (but often omitted) follow-up is a conscious check to adjust the impression.

A number of theories have been derived from sociological studies of stereotyping and prejudicial thinking. In early studies it was believed that stereotypes were only used by rigid, repressed, and authoritarian people. Sociologists concluded that this was a result of conflict, poor parenting, and inadequate mental and emotional development. This idea has been overturned; more recent studies have concluded that stereotypes are commonplace.

One theory as to why people stereotype is that it is too difficult to take in all of the complexities of other people as individuals. Even though stereotyping is inexact, it is an efficient way to mentally organize large blocks of information. Categorization is an essential human capability because it enables us to simplify, predict, and organize our world. Once one has sorted and organized everyone into tidy categories, there is a human tendency to avoid processing new or unexpected information about each individual. Assigning general group characteristics to members of that group saves time and satisfies the need to predict the social world in a general sense.

Another theory is that people stereotype out of a need to feel good about themselves. Stereotypes protect against anxiety and enhance self-esteem. By designating one's own group as the standard or normal and assigning others to groups considered inferior or abnormal, stereotyping provides a sense of worth; in that sense, it is related to the ingroup bias.

Subadditivity Effect

The subadditivity effect is the tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts.

For instance, subjects in one experiment judged the probability of death from cancer in the United States to be 18%, the probability of death from heart attack to be 22%, and the probability of death from "other natural causes" to be 33%. Other participants judged the probability of death from any natural cause to be 58%. Natural causes consist of precisely cancer, heart attack, and "other natural causes," yet the sum of those three probabilities is 73%, well above the 58% judged for the whole. According to Tversky and Koehler's 1994 study, this kind of result shows up consistently.
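To make the arithmetic concrete, here is a minimal sketch in Python using the figures above; the variable names are ours, purely for illustration.

# Judged probabilities from the experiment described above (Tversky & Koehler, 1994)
component_judgments = {
    "cancer": 0.18,
    "heart attack": 0.22,
    "other natural causes": 0.33,
}

whole_judgment = 0.58  # a separate group's judgment for death from any natural cause

sum_of_parts = sum(component_judgments.values())

print(f"Sum of the parts:      {sum_of_parts:.2f}")   # 0.73
print(f"Judgment of the whole: {whole_judgment:.2f}")  # 0.58

# Coherent probabilities would make the whole equal the sum of its disjoint
# parts; here the whole is judged well below the sum, which is the
# signature of the subadditivity effect.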

The subadditivity effect is related to other math-oriented cognitive biases, including the denomination effect, the base rate fallacy, and especially the conjunction fallacy.


More next week.

To read the whole series, click "Cognitive bias" in the tag cloud to your right, or search for any individual bias the same way.

Saturday, October 24, 2009

Unknown Knowns — A Survey of Assumptions, Biases, and Bigotry

“The bigot is not he who knows he is right; every sane man knows he is right. The bigot is he whose emotions and imagination are too weak to feel how it is that other men go wrong.”

- G. K. Chesterton, Alarms and Discursions, 1910

Last week, we explored Donald Rumsfeld’s observation about “unknown unknowns.” Unknown unknowns aren't just about what you don't know; they're about what you don't even know that you don't know. The other categories, of course, are known knowns (things you know and know you know) and known unknowns (things you know that you don't know).

But there is one missing combination: unknown knowns, the things you don't know that you really do know. How could you not know something that you actually do know? The answer involves cognitive biases, the ways in which your mind deceives you. Cognitive biases can blind you to what is in fact right in front of you, and also can make you see things that really aren't there.

In looking at cognitive bias, the essential first step is to realize that no one is immune. It's easier to see the mote of self-deception in someone else's eye than it is to see the big heavy curtains that are draped over our own perceptions. None of us can completely escape the trap, but we can (and must) stay aware that what we think isn't necessarily the whole or complete picture. As once was famously said of Vietnam, "Anybody who knows what's going on clearly doesn't understand the situation."

Project managers are taught how important it is to document the range of assumptions on a project, but the PMBOK® Guide doesn't go into much detail about how to discover them or what to do about them. And it's wrong to assume (*ahem*) that all assumptions are bad for your project. They don't always make an "ass + u + me."

Some assumptions are, of course, clearly bad. Common project assumptions include the idea that everybody's on board; that people will always play nice; and that the proposed project will actually solve the underlying problem. "Bad" in this context doesn't mean these assumptions are necessarily or always wrong; it means it's dangerous to take for granted that they're right.

Other assumptions are more useful: if you see a gun, it's wise to assume it's loaded and act accordingly, even if you have good reason to believe it probably isn't. The consequences of an error in one direction don't have the same impact as the consequences of an error in the other. Still other assumptions may change over time. Assume the gun is loaded unless you need to use it; in the latter case, it might be safer to assume it isn't loaded and check to make sure there's a round in the chamber.

The big problem comes from assumptions held so deeply in the subconscious mind that we (or other stakeholders) aren't even aware they exist -- the “unknown knowns” of our title.

Prejudices and biases are a normal part of the makeup of human beings. They have a certain utility; they permit us to filter and organize and simplify the complex flood of data we get from everyday existence. The danger comes when prejudices are confused with facts. A good general assumption turns into an iron-clad rule; “some” is equated with “all,” and it’s one short step to the idea that if someone sees it differently, they must be either stupid or venal. That, as G. K. Chesterton points out, is the essence of bigotry.

It is both humbling and fascinating to read the extensive and exhaustive lists of biases and cognitive distortions that have been identified over the years. There are far too many for a single blog post, so we'll have fun with these for the next few weeks. If you'd like to jump into discussion, please feel free. SideWise thinkers know they have to battle their own biases as well as those of others, and understanding the list is the essential first step.

Let’s start with five common biases.

Decision-Making and Behavioral Biases

Bias Blind Spot — "Bias blind spot" is a recursive bias, the bias of failing to compensate for one's own cognitive biases. Some 80% of drivers think they are substantially better than the average driver. That's called the "better than average effect." Here, the vast majority of people think they are less subject to bias than the average person.

Confirmation Bias — Evidence is seldom completely clean and clear. If a mass of facts argue against our position and one fact supports it, guess which fact we focus on? When confronted by a mass of data, we tend to be selective in the evidence we collect; we tend to interpret the evidence in a biased way; and when we recall evidence, we often do so selectively. This is why a search for facts isn't as persuasive as logic might suggest.

Déformation professionnelle — Your training as a professional carries with it an intrinsic bias that's often expressed by the phrase "When the only tool you have is a hammer, all problems look like nails." We probably know IT professionals who think every problem can be best solved with software, HR professionals who think every problem yields to training and human capital development, and project managers who think all problems lie inside the confines of the triple constraints. Each profession, of course, provides enormous value, but no single profession has all the answers.

Denomination Effect — One way to limit your daily spending is to carry only large denomination bills. Research shows that people are less likely to spend larger bills than their equivalent value in smaller ones. (This could also be called the Starbucks Effect.)

Moral Credential Effect — If you develop a track record as a moral and ethical person, you can actually increase your likelihood of making less ethical decisions in the future, as if you had given yourself a "Get out of jail free" card. For example, in a 2001 study, individuals who had the opportunity to recruit a woman or an African-American in one setting were more likely to say later that a different job would be better suited for a man or a Caucasian.

More next week...

[Illustration © 2009 Mark Hill, used with permission.]