Here's an index to all the installments of Cognitive Biases. Click on any "Part" name to go directly to that installment. You can also find the bias you’re interested in by clicking in the tag cloud on the right. To find all posts concerning cognitive biases, click the very big phrase.
Part 1 — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect
Part 2 — Base rate fallacy, congruence bias, experimenter’s bias
Part 3 — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect
Part 4 — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias
Part 5 — Clustering illusion, conjunction fallacy, cryptomnesia
Part 6 — Disposition effect, egocentric bias, endowment effect, extraordinarity bias
Part 7 — False consensus effect, false memory, Forer effect, framing, fundamental attribution error
Part 8 — Gambler’s fallacy, halo effect
Part 9 — Hawthorne effect, herd instinct, hindsight bias, hyperbolic discounting
Part 10 — Illusion of asymmetric insight, illusion of control, illusory superiority, impact bias, information bias, ingroup bias, irrational escalation
Part 11 — Just-world phenomenon, loss aversion, ludic fallacy, mere exposure effect, money illusion
Part 12 — Need for closure, neglect of probability, “not-invented-here” (NIH) syndrome, notational bias
Part 13 — Observer-expectancy effect, omission bias, optimism bias, ostrich effect, outgroup homogeneity bias, overconfidence effect
Part 14 — Pareidolia, planning fallacy, post-purchase rationalization
Part 15 — Projection bias, pseudocertainty effect, publication bias
Part 16 — Reactance, reminiscence bump, restraint bias, rosy retrospection
Part 17 — Selection bias, selective perception, self-fulfilling prophecy
Part 18 — Self-serving bias, Semmelweis reflex, serial position effect
Part 19 — Status quo bias, stereotyping, subadditivity effect
Part 20 — Subjective validation, suggestibility, system justification theory
Part 21 — Telescoping effect, Texas sharpshooter fallacy, trait ascription bias
Part 22 — Ultimate attribution error, valence effect, von Restorff effect, wishful thinking, zero-risk bias
Tuesday, January 18, 2011
A Credit To His Race (The final installment of Cognitive Biases)
At long last, we reach the end of our series on Cognitive Biases. In this installment, we'll study the ultimate attribution error, the valence effect, the von Restorff effect, wishful thinking, and the zero-risk bias.
Ultimate Attribution Error
A phrase I used to hear from time to time in my Alabama days was, “He’s a credit to his race.” It was never used to refer to a white person, of course, but only to blacks. On the surface, it appears to be a compliment, but it’s an example of the ultimate attribution error.
In the ultimate attribution error, people view negative behavior by members of an outgroup as part of their basic nature, and positive behavior as an exception to the norm. It relates to the fundamental attribution error, in which we explain our own behavior as a reaction to situations but other people’s behavior as a matter of basic character, and it clearly relates to stereotyping. The ultimate attribution error is one of the basic mechanisms of prejudice.
Valence Effect
In psychology, valence refers to the positive or negative emotional charge of a given event or circumstance. The valence effect is a probability bias in which people overestimate the likelihood of good outcomes relative to bad ones: it’s the basic mechanism that keeps lottery tickets selling.
There are numerous studies that demonstrate the valence effect. In one study, people assigned a higher probability to drawing a card with a smiling face than to drawing one with a frowning face from a randomly shuffled deck.
The valence effect can be considered a form of wishful thinking, but it’s been shown in some cases that belief in a positive outcome can increase the odds of achieving it: you may work harder or refuse to give up as early.
Von Restorff Effect
First identified by Dr. Hedwig von Restorff in 1933, this bias (also called the isolation effect) predicts that an item that "stands out like a sore thumb" is more likely to be remembered than the others, a result of what memory researchers call distinctive encoding. For instance, a person examining a shopping list with one item highlighted in bright green will be more likely to remember the highlighted item than any of the others.
Wishful Thinking
This popular cognitive bias involves forming beliefs and making decisions according to what is pleasant to imagine rather than according to evidence, rationality, or reality. All else being equal, the valence effect holds: people predict that positive outcomes are more likely than negative ones.
There is also reverse wishful thinking, in which someone assumes that because an outcome is bad, it’s more likely to happen: Murphy’s Law as a cognitive bias.
Wishful thinking isn’t just a cognitive bias, but also a logical fallacy: I wish that P were true (or false); therefore, P is true (or false). It’s related to two other fallacies that are reciprocals of one another: negative proof and the argument from ignorance.
In negative proof, the absence of certainty on one end of the argument is taken as proof of the opposite end: climate scientists cannot say with 100% certainty that their claims about global warming are true; therefore, the claims must be false. The reciprocal fallacy is the argument from ignorance: no one can be sure that there is no God; therefore, there is a God.
Zero-Risk Bias
Since 2000, terrorist attacks against the United States or Americans abroad have killed about 3,250 people, the vast majority of them on 9/11. Your odds of being a victim are about one in ten million.
The Transportation Security Administration consumes $5.6 billion a year. Its job is to reduce the chance of terrorist attacks on transportation infrastructure, primarily aviation, to zero. Let’s assume it is completely effective in that mission. If one year’s budget is credited with preventing all 3,250 deaths, the cost per life saved works out to about $1.7 million.
Perhaps that’s a completely reasonable price to pay to save a human life. From a logical point of view, however, you have to consider what else $5.6 billion might accomplish. Over a ten-year period, about 420,000 people die in car accidents. If $5.6 billion could eliminate 100% of the risk of aviation terrorism deaths, or 10% of the risk of car accident deaths, which risk would you choose to attack?
Common sense argues for the 10% reduction in car accidents, but the zero-risk bias argues the opposite: it’s the preference for completely eliminating a small risk over making a bigger dent in a larger risk. It values the certainty of zero over any option that leaves residual risk.
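The arithmetic behind that common-sense answer is worth making explicit. Here’s a minimal sketch, using only the figures above plus one assumption of mine: that a single year’s TSA budget gets credit for preventing all 3,250 deaths, the same assumption behind the $1.7 million figure.

```python
# Back-of-the-envelope comparison using the figures quoted above.
# Assumption (mine, for illustration): one year's TSA budget is credited
# with preventing all 3,250 terrorism deaths.

tsa_budget = 5.6e9            # dollars per year
terror_deaths = 3_250         # terrorism deaths since 2000
car_deaths_decade = 420_000   # US car-accident deaths over ten years

cost_per_life_terror = tsa_budget / terror_deaths      # ~$1.7 million
lives_saved_cars = 0.10 * car_deaths_decade            # 42,000 lives
cost_per_life_cars = tsa_budget / lives_saved_cars     # ~$133,000

print(f"Terrorism:           ${cost_per_life_terror:,.0f} per life saved")
print(f"Car accidents (10%): ${cost_per_life_cars:,.0f} per life saved")
```

On these numbers, the same money saves roughly thirteen times as many lives when aimed at car accidents, which is precisely the comparison the zero-risk bias encourages us not to make.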
There are other arguments that can be made in support of anti-terrorist activities, but the zero-risk bias is also operational here, and it leads to faulty decisions.
______________________________________________
With this installment, our long march through the wilds of cognitive bias comes to an end. I deeply appreciate the many insightful comments you’ve provided.
And now for something completely different…
Tuesday, January 11, 2011
Did You Hear the One About the Texas Sharpshooter? (Part 21 of Cognitive Biases)
In this installment of Cognitive Biases, we'll learn why your memories get unstuck in time, why establishing hypotheses backward is a fallacy, and why we think other people always behave the same way.
Telescoping Effect
The telescoping effect is a memory bias, first documented in a 1964 article in the Journal of the American Statistical Association. People tend to perceive recent events as more remote in time than they are (backward telescoping) and remote events as more recent than they are (forward telescoping). The Galton-Crovitz test measures the effect; you can take the test here.
Texas Sharpshooter Fallacy
A sales manager I once knew had an infallible sense of what was going to sell. Because he didn’t want to waste his time, he put all his emphasis on selling what he knew would sell, and didn’t bother pushing the stuff that wouldn’t sell anyway.
This is an example of the Texas sharpshooter fallacy. The Texas sharpshooter, you see, fired a bunch of shots at the side of the barn, went over and found a cluster of hits, and drew a bullseye over them. When you don’t establish your hypothesis first and test it second, your conclusion is suspect.
This fallacy was first described in the field of epidemiology. For example: the number of cases of disease D in city C is greater than would be expected by chance. City C has a factory that has released chemical agent A into the environment. Therefore, agent A causes disease D.
Not so fast.
The cluster may be the result of chance, or there may be another cause. Now, if you conclude that agent A should be tested as a possible trigger of disease D, that’s a reasonable inference.
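A quick way to convince yourself that clusters need no cause at all is to simulate the barn wall. The sketch below is my own illustration, not from the original post: it scatters shots uniformly at random across a 10×10 grid and reports how crowded the fullest cell typically is.

```python
import random

def fullest_cell(n_shots=100, trials=1_000, grid=10):
    """Scatter n_shots uniformly over a grid x grid wall, many times,
    and return the average count in the most crowded cell."""
    total = 0
    for _ in range(trials):
        counts = {}
        for _ in range(n_shots):
            cell = (random.randrange(grid), random.randrange(grid))
            counts[cell] = counts.get(cell, 0) + 1
        total += max(counts.values())
    return total / trials

# With 100 random shots over 100 cells (one expected hit per cell), the
# fullest cell typically holds four or five hits: a ready-made bullseye
# produced by nothing but chance.
print(fullest_cell())
```

Draw your target first, then shoot; or in epidemiological terms, state the hypothesis before collecting the data that will test it.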
Finding a Nostradamus prophecy that could arguably relate to a big event in history is another example. Here’s a famous prophecy that appears to predict Hitler:
Beasts wild with hunger will cross the rivers,
The greater part of the battle will be against Hister.
He will cause great men to be dragged in a cage of iron,
When the son of Germany obeys no law.
But out of a thousand prophecies, what are the odds that none of them will relate to a real event?
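As a rough, hedged calculation (the 1% figure is my assumption, purely for illustration): even if each individual prophecy were vague enough to match some major event only one time in a hundred, the chance that all thousand would miss is almost nil.

```python
# If each of 1,000 vague prophecies independently has a 1% chance of
# loosely matching *some* historical event (an assumed figure), then:
p_all_miss = 0.99 ** 1000
print(p_all_miss)  # ~4.3e-05, i.e. roughly 1 chance in 23,000
```

With odds like those, a few apparent "hits" among a thousand quatrains tell us nothing at all.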
Trait Ascription Bias
Trait ascription bias is the tendency for people to view themselves as relatively variable in terms of personality, behavior and mood while viewing others as much more predictable in their personal traits across different situations. This may be because our own internal states are much more observable and available to us than those of others. A similar bias on the group level is called the outgroup homogeneity bias.
The degree to which we fall into this bias often depends on how well we know the other person, but not entirely. “You always behave like that” is an accusation most of us have leveled at a loved one at some time in our lives.
More next week.
Tuesday, January 4, 2011
That Psychic Was So *Accurate*! (Part 20 of Cognitive Biases)
In our 20th installment of Cognitive Biases, we cover subjective validation, the tendency to think a statement is true if it means something to us; suggestibility, the extent to which we accept or act on the suggestions of others; and system justification theory, the cognitive bias of patriotism.
Subjective Validation
Subjective validation, also known as the personal validation effect, is the tendency to consider a statement correct if it’s meaningful to the listener. It’s related to the Forer effect, is reinforced by confirmation bias, and is the basic technique behind belief in paranormal phenomena. The listener focuses on and remembers the accurate statements and forgets or ignores the inaccurate ones, forming a wildly inflated impression of the psychic’s success.
Say anything, and it’s possible to find meaning in it. “I sense a father figure trying to contact you from the spirit world,” becomes validated if there’s anyone in the subject’s life that can be made to qualify. “I hear the phrase ‘broken wheel,’” the psychic says, and of all the thousands of possible associations, the subject finds one with personal meaning, and the psychic is validated.
What if the phrase ‘broken wheel’ evokes no associations? Then the psychic says, “I hear the name ‘Charles,’” and so forth until there’s a winner. Selective memory comes into play as well: the subject forgets the ‘broken wheel’ miss but remembers the ‘Charles’ hit vividly.
The strength of the effect depends less on the skill of the psychic, of course, and much more on the level of desire of the subject. If we want to believe, we’ll find the evidence we need.
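To put a number on how inflated that impression can be, here’s a minimal sketch. All the figures in it are assumptions of mine, chosen purely for illustration: the psychic makes many vague statements, only a fifth of them land, but the sitter remembers nearly every hit and few of the misses.

```python
# Assumed figures, for illustration only: 40 statements, each with a 20%
# chance of a subjective "hit"; the sitter recalls 95% of hits but only
# 10% of misses.
statements = 40
p_hit, recall_hit, recall_miss = 0.20, 0.95, 0.10

hits = statements * p_hit                   # 8 hits on average
misses = statements * (1 - p_hit)           # 32 misses
remembered_hits = hits * recall_hit         # ~7.6
remembered_misses = misses * recall_miss    # ~3.2

true_accuracy = hits / statements
felt_accuracy = remembered_hits / (remembered_hits + remembered_misses)
print(f"true: {true_accuracy:.0%}, as remembered: {felt_accuracy:.0%}")
```

On these made-up but plausible numbers, a 20% hit rate is remembered as roughly 70% accuracy, without any dishonesty on anyone’s part.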
Suggestibility
You are suggestible to the extent that you’re inclined to accept or act on the suggestions of others. Some people are naturally more suggestible than others, but suggestibility also varies within a person: intense emotions, current levels of self-esteem or assertiveness, and age all play a role.
Suggestibility plays a big role in hypnosis. Dr. John Kappas identified three types of suggestibility:
- Emotional Suggestibility. A suggestible behavior characterized by a high degree of responsiveness to inferred suggestions that affect emotions and restrict physical body responses; usually associated with hypnoidal depth. Thus the emotional suggestible learns more by inference than by direct, literal suggestions.
- Physical Suggestibility. A suggestible behavior characterized by a high degree of responsiveness to literal suggestions affecting the body, and restriction of emotional responses; usually associated with cataleptic stages or deeper.
- Intellectual Suggestibility. The type of hypnotic suggestibility in which a subject fears being controlled by the operator and is constantly trying to analyze, reject or rationalize everything the operator says. With this type of subject the operator must give logical explanations for every suggestion and must allow the subject to feel that he is doing the hypnotizing himself.
With all of that, there’s surprisingly little consensus on what suggestibility is and how it works. Is it a function of character, a learned habit, a function of language acquisition and empathy, a biased term used to provoke people to greater resistance, or something else?
Common examples of suggestible behavior in everyday life include "contagious yawning" (multiple people begin to yawn after observing a person yawning) and the medical student syndrome (a person begins to experience symptoms of an illness after reading or hearing about it).
The placebo response may also rest, at least in part, on individual differences in suggestibility. Suggestible persons may be more responsive to alternative health practices that rely on the patient’s belief in the intervention. Highly suggestible people may also be prone to poor judgment, because they don’t process suggestions critically, and to falling prey to emotion-based advertising.
System Justification Theory
System justification theory (SJT) is a theory in social psychology which proposes that people have a motivation to defend and bolster the status quo, that is, to see it as good, legitimate, and desirable.
According to system justification theory, people not only want to hold favorable attitudes about themselves (ego-justification) and their own groups (group-justification), but they also want to hold favorable attitudes about the overarching social order (system-justification). A consequence of this tendency is that existing social, economic, and political arrangements tend to be preferred, and alternatives to the status quo are disparaged.
Early SJT research focused on compensatory stereotypes. Experiments suggested that widespread endorsement of stereotypes such as "poor but happy" or "rich but miserable" exists to balance out the gap between those of low and high socioeconomic status. Later work suggested that these compensatory stereotypes are preferred by those on the political left, while people on the right prefer non-complementary stereotypes such as "poor and dishonest" or "rich and honest", which rationalize inequality rather than compensate for it.
According to system justification theory, this motive is not unique to members of dominant groups, who benefit the most from the current regime; it also affects the thoughts and behaviors of members of groups who are seemingly incurring disadvantages by it (e.g., poor people, racial/ethnic minorities). System justification theory therefore accounts for counter-intuitive evidence that members of disadvantaged groups often support the societal status quo (at least to some degree), often at considerable cost to themselves and to fellow group members.
System justification theory differs from the status quo bias in that it is predominantly motivational rather than cognitive. Generally, the status quo bias refers to a tendency to prefer the default or established option when making choices. In contrast, system justification posits that people need and want to see prevailing social systems as fair and just. The motivational component of system justification means that its effects are exacerbated when people are under psychological threat or when they feel their outcomes are especially dependent on the system being justified.
More next week.
To read the whole series, click "Cognitive bias" in the tag cloud to your right, or search for any individual bias the same way.