The twin challenges of risk and uncertainty complicate the process. In classical risk, we know the probability and impact of the various outcomes. Risk management involves (a) an assessment of the risk, including the probability of its occurrence and the consequence if it does occur, and (b) a decision on what action to take in response to that risk (avoid, mitigate, transfer, or accept for threat risks; exploit, enhance, share, or accept for opportunity risks). In project management, the fundamental characteristics of “temporary and unique” [PMBOK® Guide] mean that we don’t necessarily know the range of possible outcomes, and almost never have reliable probability information.
Critical decision-making information is often subjective and values-driven. An engineering evaluation might tell us that there is a 42% probability of an event happening, and that the consequence involves the loss of $21.6 million and three lives. What such an evaluation does not tell us is whether the risk is worth running. Values—organizational values, mission-related values, ethical values—address the consideration of worth. Some of these values are assumed and implicit. Some can be quantified, and others—usually political considerations—can’t even be discussed on the record.
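The engineering evaluation described above reduces to a simple expected-value calculation. A minimal sketch (the function name and structure are illustrative assumptions, not a standard API) shows what such an evaluation yields and, by omission, what it cannot:

```python
# Minimal sketch: quantifying risk exposure as expected monetary value (EMV).
# The 42% probability and $21.6 million consequence come from the text;
# everything else here is an illustrative assumption.

def expected_monetary_value(probability: float, consequence: float) -> float:
    """Return the probability-weighted cost of a risk event."""
    return probability * consequence

emv = expected_monetary_value(0.42, 21_600_000)
print(f"Risk exposure: ${emv:,.0f}")
# The calculation produces a number. What it does not produce is an answer
# to the values question: whether the risk is worth running.
```

The arithmetic is the easy part; assigning worth to the outcome, including the three lives at stake, is precisely what such a calculation leaves out.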
Decisions often require tradeoffs. A perfect solution may not exist. Each potential choice may have a downside, or may be fraught with risk or uncertainty. In some ways, making the least bad choice out of a set of poor alternatives takes greater skill and courage than making a conventionally “good” decision.
The outcome of a decision, whether positive or negative, is not in itself proof of the decision’s quality, especially where probability is concerned. The odds may be dramatically in favor of a positive outcome, yet low probability events can occur. Equally, if someone makes a stupid decision but gets lucky, the good outcome does not make the decision any less stupid. A good decision process improves our odds, producing the desired outcome the majority of the time.
In addition, a decision-making process must often be open and auditable. We must know not only the decision we make, but also the process that led us to that decision. If the outcome is bad, someone else (a boss, a customer, a Congressional committee) determines—with the benefit of “20-20 hindsight”—whether the decision was reasonable and appropriate.
This leads to the rational (if not always appropriate) strategy known as “CYA” (cover your assets), where the decision is made not necessarily from a mission perspective, but in a way that ensures blame and punishment will fall elsewhere in the event of a bad outcome. “A decision,” wrote author Fletcher Knebel, “is what a man makes when he can’t find anybody to serve on a committee.”
If enough time and resources are available, even the most complex problems can be structured and studied in a way that leads to an optimal decision. When time and resources are not available, however, a decision still needs to be made, and your accountability remains unchanged. There’s the story of the person who called an attorney, who listened to the situation and said, “Don’t worry, they can’t put you in jail for that.” The person replied, “But counselor, I’m calling from the jail!”
No formal process or methodology can remove all risk from decision-making. Many tools and techniques can improve decision-making, but ultimately successful decision processes require good judgment. Good judgment comes from experience combined with wisdom. Experience can come from bad judgment. While wisdom is the product of experience, experience does not automatically confer wisdom.
Adapted from "Chapter 18, Decision-Making and Analysis," by Michael S. Dobson, Paul Componation, and Ted Leemann, in Applied Project Management for Space Systems (Space Technology Series), edited by Julie Chelsey, Wiley Larson, Marilyn McQuade, and Robert Menrad, McGraw-Hill/US Air Force Academy, 2008.