“The bigot is not he who knows he is right; every sane man knows he is right. The bigot is he whose emotions and imagination are too weak to feel how it is that other men go wrong.”
- G. K. Chesterton, Alarms and Discursions, 1910
Last week, we explored Donald Rumsfeld’s observation about “unknown unknowns.” Unknown unknowns aren't just about what you don't know; they're the things you don't even know that you don't know. The other categories, of course, are known knowns (things you know and know you know) and known unknowns (things you know that you don't know).
But there is one missing combination: unknown knowns, the things you don't know that you really do know. How could you not know something you actually do know? The answer lies in cognitive biases: the ways in which your mind deceives you. Cognitive biases can blind you to what is right in front of you, and they can also make you see things that aren't there.
In looking at cognitive bias, the essential first step is to realize that no one is immune. It's easier to see the mote of self-deception in someone else's eye than the heavy curtains draped over our own perceptions. None of us can completely escape the trap, but we can (and must) stay aware that what we think isn't necessarily the whole picture. As was once famously said of Vietnam, "Anybody who knows what's going on clearly doesn't understand the situation."
Project managers are taught how important it is to document the range of assumptions on a project, but the PMBOK® Guide doesn't go into much detail about how to discover them or what to do about them. And it's wrong to assume (*ahem*) that all assumptions are bad for your project. They don't always make an "ass" out of "u" and "me."
Some assumptions are, of course, clearly bad. Common project assumptions include the idea that everybody's on board; that people will always play nice; and that the proposed project will actually solve the underlying problem. "Bad" in this context doesn't mean these assumptions are necessarily or always wrong; it means it's dangerous to take for granted that they're right.
Other assumptions are more useful: if you see a gun, it's wise to assume it's loaded and act accordingly, even if you have good reason to believe it probably isn't. An error in one direction carries far graver consequences than an error in the other. Still other assumptions may need to change with circumstances: assume the gun is loaded unless you actually have to use it; in that case, it may be safer to assume it isn't loaded and check for yourself that there's a round in the chamber.
The big problem comes from assumptions held so deeply in the subconscious mind that we (or other stakeholders) aren't even aware they exist: the "unknown knowns" of our title.
Prejudices and biases are a normal part of the makeup of human beings. They have a certain utility; they permit us to filter and organize and simplify the complex flood of data we get from everyday existence. The danger comes when prejudices are confused with facts. A good general assumption turns into an iron-clad rule; “some” is equated with “all,” and it’s one short step to the idea that if someone sees it differently, they must be either stupid or venal. That, as G. K. Chesterton points out, is the essence of bigotry.
It is both humbling and fascinating to read the extensive lists of biases and cognitive distortions that have been identified over the years. There are far too many for a single blog post, so we'll have fun with these over the next few weeks. If you'd like to jump into the discussion, please feel free. SideWise thinkers know they have to battle their own biases as well as those of others, and understanding the list is where that battle begins.
Let’s start with five common biases.
Decision-Making and Behavioral Biases
Bias Blind Spot — The bias blind spot is a recursive bias: the bias of failing to compensate for one's own cognitive biases. Some 80% of drivers think they are substantially better than the average driver, an instance of what's called the "better than average effect." The bias blind spot works the same way: the vast majority of people believe they are less subject to bias than the average person.
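How can 80% of drivers sit above average? It takes surprisingly little: if everyone gives their own skill the same modest flattering bump, most people will rate themselves above the true mean. Here's a minimal Python sketch of that mechanism (the skill distribution and the size of the optimism bump are illustrative assumptions, not figures from any study):

```python
import random

# Toy model: drivers differ in true skill, but everyone adds the same
# flattering bump when judging themselves. The numbers below are
# illustrative assumptions, not data from the driving studies.
random.seed(42)

N = 10_000
true_skill = [random.gauss(50, 10) for _ in range(N)]  # assumed distribution
avg_skill = sum(true_skill) / N

OPTIMISM = 9  # hypothetical self-flattery bump, identical for everyone
self_rating = [skill + OPTIMISM for skill in true_skill]

above_avg = sum(rating > avg_skill for rating in self_rating) / N
print(f"Share who rate themselves above the true average: {above_avg:.0%}")
# With these assumed numbers, roughly 80% land "above average."
```

No one in this toy model is lying; each person is just applying a small, uniform distortion to their self-assessment, and the arithmetic does the rest.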
Confirmation Bias — Evidence is seldom completely clean and clear. If a mass of facts argues against our position and a single fact supports it, guess which fact we focus on? When confronted by a mass of data, we tend to be selective in the evidence we collect, to interpret that evidence in a biased way, and to recall it selectively afterward. This is why a search for facts isn't as persuasive as logic might suggest.
Déformation professionnelle — Your training as a professional carries with it an intrinsic bias, often expressed by the phrase "When the only tool you have is a hammer, all problems look like nails." We've all known IT professionals who think every problem can best be solved with software, HR professionals who think every problem yields to training and human capital development, and project managers who think all problems lie inside the confines of the triple constraints. Each profession, of course, provides enormous value, but no single profession has all the answers.
Denomination Effect — One way to limit your daily spending is to carry only large-denomination bills. Research shows that people are less likely to spend a large bill than the equivalent value in smaller ones. (This could also be called the Starbucks Effect.)
Moral Credential Effect — If you develop a track record as a moral and ethical person, you can actually increase the likelihood that you'll make less ethical decisions in the future, as if you'd given yourself a "Get out of jail free" card. For example, in a 2001 study, individuals who had been given the opportunity to recruit a woman or an African-American in one setting were more likely to say later that a different job would be better suited to a man or a Caucasian.
More next week...
[Illustration © 2009 Mark Hill, used with permission.]