
Monday, January 4, 2010

Risk Management, Cognitive Bias, and the Global Warming Debate

The debate on global warming tends to revolve completely around the science. Is it good? Is it bad? Is it meaningful? Is it corrupt? Everyone has an opinion on the quality of the science, and once those opinions are formed, they’re almost impossible to shake.

A wide variety of potential cognitive biases complicate the picture. Notice there’s enough here for everybody — no one’s being singled out.

Base rate fallacy — ignoring statistical data in favor of particulars
Confirmation bias — interpreting information to confirm your preconceptions
Experimenter’s bias — with about sixty subsets
Focusing effect — putting too much emphasis on a single aspect of a situation or event
Framing — viewing a situation through too narrow a perspective or approach
Hyperbolic discounting — the preference for immediate payoffs over larger payoffs further in the future
Irrational escalation — making irrational decisions based on rational decisions in the past, or to justify actions already taken
Information bias — seeking more information even when it cannot affect action or decision

…the list goes on. Recognize some of these biases? If you’re like most of us, you recognize them in the other side more than you see them in yourself or those who agree with you.

Part of the reason why cognitive bias is at work is that the question isn’t really clear. We’re all arguing about the science, though few of us are truly entitled to an educated opinion on the subject.

But what’s the question?

It’s not about whether a scientific opinion is correct or incorrect. That sort of thing only interests specialists. No, the question has to do with what (if anything) we should do about it, based on the potential cost and consequences.

In other words, it’s a question of risk management. And to the extent that it’s a question of risk management, it’s phrased wrong.

A risk, as you’ll remember, is a future event with some probability of happening that if it happens will have a meaningful impact on your situation. If the impact is negative, it’s a threat. If the impact is positive, it’s an opportunity.

Risks, like Gaul, can be divided into three parts. The first part is probability. How likely is it that the risk will happen?

The second part is impact. If the risk should happen, what would be its effects?

Those two parts combine in the formula R = P x I to calculate a risk score, the value of the risk.

We care about the value of the risk because that helps us make a rational decision about the third element: the cost of reducing or eliminating the negative risk, or the cost of obtaining or exploiting the positive risk.
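
In code, the idea might look like the minimal sketch below (Python, since the post itself has no code; the function name and figures are illustrative, not anything official):

def risk_value(probability: float, impact: float) -> float:
    """Risk score: the probability of the event times its impact (here, in dollars)."""
    return probability * impact

# A threat judged 50% likely with a $1,000,000 impact carries a $500,000 risk score.
print(risk_value(0.5, 1_000_000))  # 500000.0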

Probability

The argument about the science of climate change is at root an argument about probability. The process of science involves collecting data, discovering patterns in that data, and developing and testing hypotheses and theories about that data. Over time, the process of peer review creates a consensus in the scientific community, and at any moment in time, that’s the state of scientific knowledge.

Let’s sidestep the discussion about whether the consensus of current scientific knowledge is accurate or inaccurate, and merely assess how our own feelings and opinions influence our judgment of probability. Taking the legal standards of proof as a guide, we might fall somewhere on the following spectrum. For a rough calculation, I’ve put in some percentages.

Degree of Belief (Probability You Think It's True)
True beyond any doubt (99+%)
True beyond a reasonable doubt (95%)
True by the preponderance of the evidence (75%)
Unable to tell (50%)
False by the preponderance of the evidence (25%)
False beyond reasonable doubt (5%)
False beyond any doubt (<1%)

Each choice is a statement of what you believe about the science, with a corresponding figure for how likely you think it is that the threat is real. If you don't like the choices, add one of your own and choose your own probability number.

Notice that the evidence won't stand still. Over time, science will inevitably get better, regardless of your perspective. Either the evidence of catastrophic global climate change will mount so high no sane person can deny it, or global warming will become the Comet Kohoutek of crises, a non-event. Or maybe something in between.

The problem is that by the time the facts become incontrovertible, the moment for decision will have passed. If we guess wrong, there are two possibilities: (a) we will be in a significantly worse position to deal with the resultant impact, or (b) we will have wasted significant resources.

Impact

This leads us to the second item, the question of impact. Impact is the effect of the threat or opportunity if it happens — even if you believe the chance is remote at best. So we have to set probability aside temporarily. We’ll come back to it in a moment.

In addition to arguments about how likely it is that the scientific consensus on global warming is in fact correct, there is a range of opinion as to what that would mean in practical terms: a range of impact. I've specified a set of potential impact levels and assigned a cost to each. Remember, the issue isn't whether these are going to happen. They're simply descriptions of the potential level of impact that different parties suggest are possible.

So choose from the list below. What, in your opinion, is the worst possible potential outcome if global warming happens?

  • Catastrophic. Global warming effects will kill tens or hundreds of millions of people directly and indirectly, wipe out tens of thousands of species, and be an economic and social catastrophe for those who survive. Repair or rebuilding may or may not be possible. (Cost = $Quadrillions)
  • Serious. Major weather events, such as hurricanes and tsunamis, will be more prevalent; tens to hundreds of thousands will die; economies will suffer. (Cost = $Trillions)
  • Moderate. Managing environmental issues will be a consuming concern, but better management and improved technology will make this a background cost. (Cost = $Billions)
  • Minor. Insignificant costs. (Cost < $Millions)
Notice the impact could also be positive.

Value of the Risk

Just because you aren't convinced the evidence in favor of a risk is certain doesn't mean you don't act on it. We take everyday precautions to avoid low probability or highly uncertain risks with potentially high impact all the time — every time we drive on a freeway, for example. But there's a limit. How does the value of the risk compare to the cost of mitigation?

The value of the risk, as we’ve noted, is the probability times the impact. From our earlier work, we can construct this table. The risk score in each case is what you should reasonably be willing to spend if necessary to mitigate the degree of risk you personally believe is present.

Catastrophic
95% confident: quadrillions of dollars
75% confident: low quadrillions of dollars
50% confident: $1 quadrillion
25% confident: high trillions of dollars
5% confident: low trillions of dollars

Serious
95% confident: up to $1 quadrillion
75% confident: $750 trillion
50% confident: $500 trillion
25% confident: $250 trillion
5% confident: $50 trillion

Moderate
95% confident: up to $1 trillion
75% confident: $750 billion
50% confident: $500 billion
25% confident: $250 billion
5% confident: $50 billion

Minor
95% confident: possibly a few billion dollars
75% confident: less than $1 billion
50% confident: $500 million
25% confident: $250 million
5% confident: low millions of dollars
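
For the curious, here's a rough sketch of how a table like the one above could be generated. The dollar base assigned to each impact level is an illustrative assumption; the post gives ranges like "$Trillions" rather than exact figures.

impact_cost = {            # assumed worst-case cost per impact level, in dollars
    "Catastrophic": 2e15,  # quadrillions
    "Serious":      1e15,  # up to $1 quadrillion
    "Moderate":     1e12,  # up to $1 trillion
    "Minor":        1e9,   # possibly a few billion
}

confidence_levels = [0.95, 0.75, 0.50, 0.25, 0.05]

for level, base_cost in impact_cost.items():
    print(level)
    for p in confidence_levels:
        print(f"  {p:.0%} confident: ${p * base_cost:,.0f}")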

Cost of Mitigation

The value of the risk is what you’re willing to spend if necessary. Depending on how you assessed probability and impact, you ended up with some amount of money (perhaps $0) that's appropriate as a maximum to spend on the risk.

Of course, you need to compare that to the cost of mitigating or eliminating the risk. Sometimes, it’s not worth it. If I offered to save you from a $1,000 risk in exchange for $2,000, it’s not much of a deal. In general, if the cost of getting rid of the risk exceeds the cost of living with it, you’re better off living with it.

On the other hand, if I can save you from a $1,000 risk (say, a 25% chance of losing $4,000) for only $500, that's a pretty good deal. If the risk happens, you've saved $3,500. But if the risk doesn't happen, you're still out $500.

It's true that not all costs of a risk (or costs of a risk mitigation) can be easily translated into dollar terms — or even should be. That doesn’t change the basic principle, though: the cost of dealing with the risk has to be less than the cost of living with the risk.
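
If it helps, here's the same arithmetic as a short sketch, using the figures from the example above (a 25% chance of losing $4,000, and a $500 mitigation cost):

probability, potential_loss = 0.25, 4_000
risk = probability * potential_loss    # $1,000 risk value
mitigation_cost = 500                  # what it costs to remove the risk

print(f"Risk value:      ${risk:,.0f}")
print(f"Mitigation cost: ${mitigation_cost:,.0f}")
print("Worth mitigating" if mitigation_cost < risk else "Live with the risk")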

There’s an important qualification when it comes to risk mitigation. Some risks you can get rid of altogether if you’re willing to pay the price. Other risks you can reduce, but not eliminate. You can lower the probability of the event occurring, or you can lower the impact if it should occur.

That’s not a bad thing, mind you, but you have to take into account the residual risk when deciding if the strategy is worth it. The value of that risk reduction is the difference between the value of the original risk and the value of the residual risk.
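
And a quick sketch of that residual-risk point; the 5% residual figure here is a made-up illustration, not anything from the examples above:

original_risk = 0.25 * 4_000     # $1,000 risk value before mitigation
residual_risk = 0.05 * 4_000     # $200 risk value still remaining afterwards
mitigation_cost = 500

value_of_mitigation = original_risk - residual_risk   # $800 of risk reduction
if mitigation_cost < value_of_mitigation:
    print(f"Spending ${mitigation_cost} to buy ${value_of_mitigation:,.0f} of risk reduction makes sense.")
else:
    print(f"${mitigation_cost} is too much to pay for ${value_of_mitigation:,.0f} of risk reduction.")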

The Right Question

To have a reasoned discussion on the subject of global warming, you have to figure out where you are on five issues, not merely one.

1. How correct is the scientific consensus on global warming?
2. What is the impact of global warming if it should occur?
3. What is the value of the risk (probability times impact)?
4. What is the cost of mitigating or eliminating the risk, and how much residual risk would remain?
5. On balance, what level of action on global warming (if any) is warranted?

To change someone’s opinion, you have to change that person’s evaluation of at least one of these issues.

As people on all sides have found, it’s nearly impossible to change anyone’s evaluation of the quality of the science, which is our probability benchmark. There’s often more consensus about what global warming might mean if it happens, which is why it’s so important to separate the discussion of probability from the discussion of impact.

But the real opportunity has to do with the issue of cost. The best current framing of the debate comes from the argument that dealing with global warming and environmental issues can be relatively low in cost, or even profitable.

If the cost to deal with global warming is low enough, it's a good idea even for those who think the probability is low.

Saturday, October 24, 2009

Unknown Knowns — A Survey of Assumptions, Biases, and Bigotry

“The bigot is not he who knows he is right; every sane man knows he is right. The bigot is he whose emotions and imagination are too weak to feel how it is that other men go wrong.”

- G. K. Chesterton, Alarms and Discursions, 1910

Last week, we explored Donald Rumsfeld’s observation about “unknown unknowns.” Unknown unknowns aren't just about what you don't know, they're about what you don't even know that you don't know. The other categories, of course, are known knowns (things you know and know you know) and known unknowns (things you know that you don't know).

But there is one missing combination: unknown knowns, the things you don't know that you really do know. How could you not know something that you actually do know? The answer involves cognitive biases, the ways in which your mind deceives you. Cognitive biases can blind you to what is in fact right in front of you, and also can make you see things that really aren't there.

In looking at cognitive bias, the essential first step is to realize that no one is immune. It's easier to see the mote of self-deception in someone else's eye than it is to see the big heavy curtains that are draped over our own perceptions. None of us can completely escape the trap, but we can (and must) stay aware that what we think isn't necessarily the whole or complete picture. As once was famously said of Vietnam, "Anybody who knows what's going on clearly doesn't understand the situation."

Project managers are taught how important it is to document the range of assumptions on a project, but the PMBOK® Guide doesn't go into much detail about how to discover them or what to do about them. And it's wrong to assume (*ahem*) that all assumptions are bad for your project. They don't always make an "ass + u + me."

Some assumptions are, of course, clearly bad. Common project assumptions include the idea that everybody's on board; that people will always play nice; and that the proposed project will actually solve the underlying problem. "Bad" in this context doesn't mean these assumptions are necessarily or always wrong; it means it's dangerous to take for granted that they're right.

Other assumptions are more useful: if you see a gun, it's wise to assume it's loaded and act accordingly, even if you have good reason to believe it probably isn't. The consequences of an error in one direction don't have the same impact as the consequences of an error in the other. Still other assumptions may change over time. Assume the gun is loaded unless you need to use it; in the latter case, it might be safer to assume it isn't loaded and check to make sure there's a round in the chamber.

The big problem comes from assumptions held so deeply in the subconscious mind that we (or other stakeholders) aren't even aware they exist -- the “unknown knowns” of our title.

Prejudices and biases are a normal part of the makeup of human beings. They have a certain utility; they permit us to filter and organize and simplify the complex flood of data we get from everyday existence. The danger comes when prejudices are confused with facts. A good general assumption turns into an iron-clad rule; “some” is equated with “all,” and it’s one short step to the idea that if someone sees it differently, they must be either stupid or venal. That, as G. K. Chesterton points out, is the essence of bigotry.

It is both humbling and fascinating to read the extensive and exhaustive lists of biases and cognitive distortions that have been identified over the years. There are far too many for a single blog post, so we'll have fun with these for the next few weeks. If you'd like to jump into discussion, please feel free. SideWise thinkers know they have to battle their own biases as well as those of others, and understanding the list is the essential first step.

Let’s start with five common biases.

Decision-Making and Behavioral Biases

Bias Blind Spot — "Bias blind spot" is a recursive bias, the bias of failing to compensate for one's own cognitive biases. Some 80% of drivers think they are substantially better than the average driver. That's called the "better than average effect." Here, the vast majority of people think they are less subject to bias than the average person.

Confirmation Bias — Evidence is seldom completely clean and clear. If a mass of facts argues against our position and one fact supports it, guess which fact we focus on? When confronted by a mass of data, we tend to be selective in the evidence we collect; we tend to interpret the evidence in a biased way; and when we recall evidence, we often do so selectively. This is why a search for facts isn't as persuasive as logic might suggest.

Déformation professionnelle — Your training as a professional carries with it an intrinsic bias that's often expressed by the phrase "When the only tool you have is a hammer, all problems look like nails." We probably know IT professionals who think every problem can be best solved with software, HR professionals who think every problem yields to training and human capital development, and project managers who think all problems lie inside the confines of the triple constraints. Each profession, of course, provides enormous value, but no single profession has all the answers.

Denomination Effect — One way to limit your daily spending is to carry only large denomination bills. Research shows that people are less likely to spend larger bills than their equivalent value in smaller ones. (This could also be called the Starbucks Effect.)

Moral Credential Effect — If you develop a track record as a moral and ethical person, you can actually increase your likelihood of making less ethical decisions in the future, as if you had given yourself a "Get out of jail free" card. For example, in a 2001 study, individuals who had the opportunity to recruit a woman or an African-American in one setting were more likely to say later that a different job would be better suited for a man or a Caucasian.

More next week...

[Illustration © 2009 Mark Hill, used with permission.]