Sunday, December 27, 2009

What is Risk?


Most people think a risk is something bad, which is often, though not always, true. More specifically, a risk is a future event that would have a significant impact on you or something you care about if it should happen. The effect may be bad (a threat) or sometimes good (an opportunity).

You and I – every single one of us – live in a world of risk. Risk is uncertainty. It’s future tense, as opposed to “problem,” which is present tense.

We often manage risk by denial, declaring ourselves helpless in the face of implacable destiny, or checking our horoscope to find a propitious day to ask for that promotion. We deny the existence of randomness by chanting self-help mantras – “I’m good enough, I’m smart enough, and doggone it, people like me!” – declaring ourselves solely responsible for all that befalls us.

There is, however, a science of risk. We tend to notice the work of risk professionals only when they fail, when the patchwork of credit default swaps backing subprime mortgages unravels due to faulty pricing. But it’s not only the economy, stupid. Risk managers keep airplanes in the air and buildings standing, and they secure every piece of modern infrastructure.

Fundamental Concepts of Risk

A risk event may be likely, or it may be unlikely. You have to take into account both the severity of the impact and its likelihood.

How likely is it that this risk will occur? Sometimes we know with mathematical certainty. More often, the best we can do is guess: pretty low, or almost certain. Impact can sometimes be turned into a number: $1000, or €50. Other times, it’s about as specific as that scene in Ghostbusters when Egon explains what will happen if they cross the streams: “It would be bad.”

The standard risk formula is R = P x I, or the value of a risk is its probability times its impact. That’s particularly helpful in pricing financial risk. If there’s a 10% chance of something happening that would cost you $1,000, then the value of the risk is $100. That means if you can get rid of the risk for under $100, it’s clearly profitable to do so. If it will cost more than $100 to get rid of the risk, you have to consider whether other factors justify the additional cost.
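
(If you like seeing the arithmetic spelled out, here’s a minimal sketch of that calculation in Python. The 10% probability and $1,000 impact are the numbers from the example above; the $80 mitigation cost is purely hypothetical, made up to show the comparison.)

    # R = P x I: a risk is worth its probability times its impact.
    def risk_value(probability, impact):
        """Expected value of a risk: probability times impact."""
        return probability * impact

    loss_risk = risk_value(0.10, 1_000)    # 10% chance of a $1,000 loss
    print(loss_risk)                       # about 100: the risk is "worth" roughly $100

    mitigation_cost = 80                   # hypothetical price to remove the risk entirely
    if mitigation_cost < loss_risk:
        print("Removing the risk pays for itself on expected value alone.")
    else:
        print("Removal costs more than the expected loss; weigh other factors.")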

Because risks can be threats or opportunities, you have to know the difference between a “pure risk” and a “business risk.” A pure risk is all downside. If you didn’t get in an accident today, you’re not better off — you avoided becoming worse off.

A business risk — a stock market investment, for example — has an upside and a downside. There is a possibility you will make money, and a possibility you will lose money. The risk formula’s P x I has to be determined for the upside and for the downside, so you can see how the risks balance. If the result is favorable, that’s an argument for the investment; if it’s unfavorable, perhaps not.

For example, let’s say you’re offered an investment for $5,000. In return, you have a 70% chance of receiving $50,000 and a 30% chance of losing your investment. That works out to an expected monetary value of $33,500 (the value of the opportunity risk [$35,000] plus the value of the potential loss [-$1,500]).

But the real outcome isn’t $33,500. You’ll receive either $50,000 or lose your $5,000. There are three possible strategies — bet on the upside (go for the $50,000), hedge the downside (avoid losing the $5,000 by not betting), or bet the expected value (take the investment because an expected $33,500 means more than keeping an actual $5,000).
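
(Here’s the same expected-monetary-value arithmetic as a short Python sketch, using only the numbers from the example above; it just makes the $35,000 and -$1,500 pieces explicit.)

    # Expected monetary value (EMV) of the investment described above.
    stake = 5_000
    p_win, payoff = 0.70, 50_000    # 70% chance of receiving $50,000
    p_lose = 0.30                   # 30% chance of losing the stake

    upside = p_win * payoff         # about 35,000
    downside = p_lose * -stake      # about -1,500
    emv = upside + downside
    print(emv)                      # about 33,500, though no single outcome ever equals the EMV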

Your personal choice may be influenced by how much is in your bank account.

Ways to Manage Threat Risk

  • Avoidance (change the environment so the risk no longer exists)
  • Transference (give or sell the risk to someone else; buying insurance is a common risk transfer strategy)
  • Mitigation (do things to reduce likelihood or impact of the risk, even if you can’t get rid of it completely)
  • Contingency planning (come up with a plan to follow if the threat should start to develop or seem likely)
  • Acceptance (ignore it for now and deal with it if it happens)

Ways to Manage Opportunity Risk

  • Exploitation (take the value and cash it in)
  • Enhancement (try to grow the value to a higher level)
  • Sharing (give, trade, or sell the value to someone else)
  • Contingency planning (come up with a plan to follow if the opportunity should start to develop or seem likely)
  • Acceptance (ignore it for now and deal with it if it happens)

Residual and Secondary Risk

Most risk strategies can’t get rid of every bit of risk. You start with an initial level of risk, and whatever’s left after you apply your strategy is the residual risk. You can come up with additional ways to reduce the residual risk further, but at some point you normally accept some level of residual risk and move on. We take our lives in our hands every day we get behind the wheel of a car, but really, what choice do most of us have?

Secondary risk is new risk created by your proposed solution to the original risk. If you spend time and energy to get rid of Problem A, and that means less attention to Problem B, it’s possible you’ve made things worse overall. Make sure you inspect any proposed solutions for secondary risk before rushing to implement them.

* * *

While some risk tools are enormously sophisticated, and their use only appropriate for specialists, you and I can use many of the techniques of experts to make our lives safer, more prosperous, and more successful.

We don’t want to sweat the small stuff, but the truth is, it’s not all small stuff. Some risks we can, and should, accept.

Others, we can't — though sometimes we have to.

Saturday, December 19, 2009

How Much Will You Pay for this Cheese Sandwich? (Part 6 of Cognitive Biases)

A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them! Links to the first five parts follow below.

In this week’s installment we’ll look at biases that begin with the letters “D” and “E.”

Disposition effect

Markets are supposed to be rational, but investors aren’t, according to the discipline of behavioral finance. Investors have a tendency to sell shares whose price has increased but hold onto assets that have dropped in value, because the pain of recognizing losses exceeds the potential pleasure of holding assets that may yet grow. The disposition effect is a measure of that tendency.

Egocentric bias

There are two different types of egocentric bias — social and memory.

The social egocentric bias makes people tend to take more credit for their own part of a joint action than an outside observer would give them. What’s interesting about the egocentric bias is that people not only claim more credit for positive outcomes (which would make this the same as the “self-serving bias”) but also claim more responsibility for negative outcomes.

The memory egocentric bias is a self-serving tendency to remember our own past in a way that makes us look better. Like most memory biases, this isn’t the same thing as lying about our past; it’s a form of self-deception in which we really do recall things that way, facts notwithstanding.

Endowment effect

The endowment effect comes from behavioral economics, where it’s also called “divestiture aversion.” In one test, people demanded a much higher price to sell a coffee mug they’d been given than they were willing to pay for a coffee mug they didn’t yet own. This contradicts a standard principle of economic theory: that a person’s willingness to pay (WTP) should equal their willingness to accept payment (WTA).

There are arguments about why this is so. One possibility is that emotional attachments to things you already own may make them seem more valuable to you. It’s also been linked to a form of “status quo bias,” a general dislike of change. Some other experiments have not detected this effect.

Extraordinarity bias

A cheese sandwich that appears to have the image of the Virgin Mary on it isn’t tastier than one without, but a normal cheese sandwich costs a couple of bucks while the one with the Virgin sold for $28,000. A guitar once owned by Elvis Presley might not play better than (or possibly even as well as) a new one, but people are willing to pay much more for it.

That's not wrong; it's simply a bias. The extraordinarity bias is the measure of your willingness to pay more (sometimes much more) for an "extraordinarity" of an object that doesn't in itself change the object's intrinsic value. The extraordinarity can be personal as well as external: a present from a loved one, for example, could have far more value to you than the object is intrinsically worth.

There's no reason to avoid the extraordinarity bias; the only thing you need to do is to be conscious of it.


Previous Installments

Part 1 "Unknown Unknowns — A Survey of Assumptions, Biases, and Bigotry" — Bias blind spot, confirmation bias, déformation professionnelle, denomination effect, moral credential effect.
Part 2 "Looking for the Pony" — Base rate fallacy, congruence bias, experimenter’s bias
Part 3 "Women Drivers and Balls" — Ambiguity aversion effect (Ellsberg paradox), choice-supportive bias, distinction bias, contrast effect
Part 4 "If I Already Know I Don't Like Your Conclusion, Why Should I Listen to Your Argument?" — Actor-observer bias, anchoring effect, attentional bias, availability cascade, belief bias
Part 5 "Patterns, Probability, and Plagiarism" — Clustering illusion, conjunction fallacy, cryptomnesia

See you next week!

Saturday, December 12, 2009

The Godzilla Principle and the Most Dangerous Word

We'll take another break from Cognitive Biases, with three short pieces on project management.

The Godzilla Principle

In Japanese monster movies, there’s frequently a scene in which the monster du jour (Godzilla, Mothra, Gamera, etc.) is still a cute little baby monster. People say, “Oh, what a cute little monster!” Obviously, there’s no urgency, so they ignore it.

They wait until the monster is full grown and busily stomping downtown Tokyo, and then they’re running through the streets shouting “What are we going to do?”

By the time Godzilla is rampaging through downtown Tokyo, most of the good options are long since gone. The project management answer is to stomp the baby monster while it’s still little.

Options for solving a problem tend to reduce over time. Effective risk management and risk response planning are a lot more powerful than even the most creative on-the-job problem solving.

Are you doing enough to find and kill those pesky baby monsters on your projects before they grow up?

The Most Dangerous Word

The most dangerous word in project management is “Yes”—said before you know what it is you’re saying “yes” to. Once you say yes, you’ve bought the whole package, and your negotiating leverage with your customers, internal and external, loses a lot of potential power.

Of course, saying “No” might not be a good idea either. Enthusiasm and a positive attitude are important qualities. But you can be enthusiastic and positive and still ask probing questions. “Let me see if I understand what you’re asking for.”

Here’s a tip about working with customers: they need to hear their words in your mouth before they believe you’ve listened to them. First tell them what they said, and then start your probing. You’ll get less pushback that way.

Plus Ça Change, Plus C'est La Même Chose

With the debut of the World Wide Web, some people predicted the death of retail in two years — five tops. After all the hype dissipated, we saw that purchasing patterns weren’t that much different. Twenty years later, however, as we routinely buy our holiday gifts online and have rare items shipped to us from all over the world, we can see that some types of retail establishments are indeed dying. In other words, don’t expect the world to change in a year, but be prepared for it to be unrecognizable in 20.

The first calculator I ever saw belonged to Arthur C. Clarke. When I was in college, I got to be his driver for a couple of days. Between appointments, we talked for a while and he pulled out a Hewlett-Packard calculator, for which he had paid $700 in 1971. It had four functions and no memory.

“Do you see this?” He paused dramatically. “It means that the slide rule is obsolete!”

My reaction was, “This is cool, but I will never own one in my life. I will never have enough need to pay $700 for a calculator. I have a slide rule and that’s all I’ll ever need.” And that’s how most of the world felt, except for a handful of hard-core engineers.

Ten years later, banks were giving calculators away when you opened a checking account.

Short-term and long-term outcomes are very different.


Sunday, December 6, 2009

Patterns, Probability, and Plagiarism (Part 5 of "Cognitive Biases")

This week's installment of Cognitive Biases, the ways in which our brains distort our thinking, is brought to you by the letter “C.” The series begins here.

Clustering illusion

Is the sequence below random or non-random?

OXXXOXXXOXXOOOXOOXXOO

If you think the sequence looks non-random, you’re with the majority…but you’re wrong. The sequence has several characteristics of a random stream: a roughly equal number of each symbol, and an equal number of repetitions and alternations between adjacent symbols. But people seem to expect a “random” sequence to have a greater number of alternations (O to X or vice versa) than statistics would predict. The chance of an alternation in a sequence of independent random binary events (flips of heads or tails) is 50%, but people seem to expect an alternation rate of about 70%.
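
(If you’d like to check that 50% figure yourself, here’s a small Monte Carlo sketch in Python. The sequence length of 21 matches the example above; everything else is just simulation plumbing.)

    import random

    # How often do adjacent characters differ in a truly random O/X sequence?
    def alternation_rate(sequence):
        changes = sum(1 for a, b in zip(sequence, sequence[1:]) if a != b)
        return changes / (len(sequence) - 1)

    random.seed(0)
    trials = [[random.choice("OX") for _ in range(21)] for _ in range(10_000)]
    average = sum(alternation_rate(t) for t in trials) / len(trials)
    print(f"average alternation rate: {average:.2f}")   # about 0.50, not the 0.70 we intuitively expect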

The clustering illusion is a cognitive bias that creates a tendency to see patterns where none actually exist. This is why most people believe in “streaks.” When you expect greater variation in a sequence than randomness actually delivers, you tend to assume that there’s a trend. But that isn’t necessarily the truth.

Conjunction fallacy

Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which statement is more probable?

1. Linda is a bank teller.
2. Linda is a bank teller and is active in the feminist movement.

In a 1982 study by Amos Tversky and Daniel Kahneman, 85% of respondents thought statement 2 was more probable than statement 1, but that’s wrong. The probability of two events occurring together is always less than or equal to the probability of either one occurring alone. Even if there’s a very low probability that Linda is a bank teller (let’s make it 5%) and a very high probability that Linda is active in the feminist movement (95%), the chance that Linda is a bank teller AND active in the feminist movement is 5% x 95%, or 4.75%, lower than the probability of the first statement alone.
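
(For the numerically inclined, here’s the same arithmetic as a tiny Python sketch. The 5% and 95% figures are the illustrative ones from the paragraph above, and, like the example, it treats the two traits as independent.)

    # The conjunction can never be more probable than either event alone.
    p_teller = 0.05      # chance Linda is a bank teller (illustrative)
    p_feminist = 0.95    # chance Linda is active in the feminist movement (illustrative)

    p_both = p_teller * p_feminist   # treats the two as independent, as the example does
    print(p_both)                    # about 0.0475
    assert p_both <= p_teller and p_both <= p_feminist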

The conjunction fallacy happens when you assume that specific conditions are more probable than a single general one, which is a violation of basic logic. Now, one possibility is that because most people aren’t familiar with the rules of formal logic, they may assume that statement 1 (Linda is a bank teller) implies that she isn’t active in the feminist movement.

But the fallacy has been demonstrated with very educated audiences.

Another Tversky/Kahneman experiment in the early 1980s surveyed a group of foreign policy experts to determine the probability that the Soviet Union would invade Poland and the US would break off diplomatic relations in the following year. The consensus estimate was about a 4% chance. Next, another group of experts was asked the probability that the United States would break off relations with the Soviet Union in the following year. They estimated only a 1% chance. In other words, the vivid, detailed first scenario seemed more likely than the broader event all by itself, even though logically it can’t be.

Cryptomnesia

Robert Louis Stevenson refers to an incident of cryptomnesia that took place during the writing of Treasure Island, and that he discovered to his embarrassment several years afterward:


I am now upon a painful chapter. No doubt the parrot once belonged to Robinson Crusoe. No doubt the skeleton is conveyed from Poe. I think little of these, they are trifles and details; and no man can hope to have a monopoly of skeletons or make a corner in talking birds. The stockade, I am told, is from Masterman Ready. It may be, I care not a jot. These useful writers had fulfilled the poet's saying: departing, they had left behind them Footprints on the sands of time, Footprints which perhaps another — and I was the other! It is my debt to Washington Irving that exercises my conscience, and justly so, for I believe plagiarism was rarely carried farther. I chanced to pick up the Tales of a Traveller some years ago with a view to an anthology of prose narrative, and the book flew up and struck me: Billy Bones, his chest, the company in the parlour, the whole inner spirit, and a good deal of the material detail of my first chapters — all were there, all were the property of Washington Irving. But I had no guess of it then as I sat writing by the fireside, in what seemed the spring-tides of a somewhat pedestrian inspiration; nor yet day by day, after lunch, as I read aloud my morning's work to the family. It seemed to me original as sin; it seemed to belong to me like my right eye.

Sometimes what seems like inspiration turns out to be memory, and you’ve committed inadvertent plagiarism, or cryptomnesia. In a 1989 study, people generated examples (such as kinds of birds) and later were asked to create new examples and to recall which answers they had personally given earlier. Between 3% and 9% of the time, people either listed examples that had already been given or recalled someone else’s thought as their own.

Few writers would risk committing deliberate plagiarism, but the dangers of cryptomnesia are real. It’s most likely to occur when you don’t have the ability to monitor your sources properly, when you’re away from the original source of the idea, or when the idea was originally suggested by a person of the same sex (!). It’s also likely to happen in a brainstorming session, in which you recall as yours an idea that came up immediately before your own.

Of course, not all claims of cryptomnesia are necessarily valid; sometimes the plagiarism was all too deliberate. But nothing else explains certain situations in which people with an awful lot to lose commit what appears to be blatant plagiarism with no upside whatsoever.

The courts have ruled that the unconsciousness of the plagiarism doesn’t excuse it; the classic (rock) case is Bright Tunes Music v. Harrisongs Music involving the similarities between “He’s So Fine” and “My Sweet Lord.”

That cost George Harrison $587,000.

Cognitive biases can be expensive.

More next week.