A cognitive bias is a pattern of deviation in judgment that occurs in particular situations, and boy howdy, are there a lot of them!
This week, meet the Hawthorne effect (being watched makes you work harder), the herd instinct (the tendency to follow the crowd), hindsight bias ("I knew it all along"), and hyperbolic discounting ("I would gladly pay you Tuesday for a hamburger today").
The Hawthorne effect is often portrayed as a sort of Heisenberg uncertainty principle for the social sciences: the observer interacts with the observed through the process of observation. In practice, almost any sort of internal improvement effort will have a short-term positive effect on performance, a placebo effect that benefits all of us in the management consulting world.
The original experiments on which the Hawthorne effect is based took place from 1924 to 1932 at the Hawthorne Works, a Western Electric plant outside of Chicago. A group of six women worked in a special room assembling telephone relays and dropping them down a chute. The most famous and oft-cited of those experiments studied how illumination levels affected the rate at which the women dropped finished relays down the chute. Over a five-year period the researchers also changed pay rules, varied break frequency and duration, and shortened and lengthened the workday, all to the tune of the drip-drip-drip of falling relays.
There was, interestingly, no blinding in the experiments. The women were fully aware they were being studied, and even suggested some of the experiments themselves. The lack of control over the numerous variables has led to a wide range of interpretations of what — if, indeed, anything — the studies really mean.
Herd behavior was well known to exist in animals, but Friedrich Nietzsche was the first to use the concept of "herd instinct" as one more reason to have contempt for the human species. There's nothing inherently wrong, however, with acting as part of a group. In many circumstances, the natural tendency of a group to move in the same direction can increase safety. Of course, sometimes herds head over the edge of the cliff.
As noted earlier, calling something a cognitive bias isn’t the same as calling a biased decision wrong or stupid. If a crowd is fleeing in a particular direction, it may be a false alarm, but then again, they may know something you don’t. If danger doesn’t appear imminent, taking a few minutes to look around is a better way to balance your risks.
Once you know how it turned out, a certain sense of inevitability creeps in. The signs were always there, and the people in charge should have known the truth all along.
The frequently repeated libel that FDR, for example, knew in advance about the impending Pearl Harbor attack and remained silent for political reasons is a case in point. (I won’t rehash the argument in detail, but I’m always appreciative of the Straight Dope’s accuracy and balance on almost any topic.) The argument relies on the idea that in the mass of raw data, decision-makers could have recognized in advance exactly which bits of information were salient. This is nonsense. Reading the future forward is orders of magnitude more difficult than reading it backward.
This particular bias is aided by our own tendency to believe, when we turn out to have been right, that we "knew it all along." Before-and-after measures of certainty tend to vary a lot.
"I would gladly pay you Tuesday for a hamburger today." Wimpy, the hamburger-loving pal of Popeye the Sailor Man, liked his rewards up front and his penalties delayed. People in general tend to prefer the bird in the hand to a flock in the bush. That's a fairly well-known cognitive bias.
What’s not so well known is the amount of the discount — how much will you give up in the future to receive the benefit today? Behavioral economists believe the relationship is hyperbolic. We’ll take a dollar today in preference to three dollars tomorrow.
But given a choice between a dollar 365 days from now and three dollars 366 days from now, we’ll gladly wait the same extra day for three times the payoff. Our choices are inconsistent over time: we’ll commit our future self to a course of action (waiting a day) that we aren’t willing to follow today.
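The preference reversal above can be sketched in a few lines of code. A common form of the hyperbolic discount curve is V = A / (1 + kD), where A is the reward amount, D the delay in days, and k a discount rate; the value k = 2.5 below is a hypothetical choice picked purely to make the dollar-versus-three-dollars example come out as described, not a figure from any study.

```python
def discounted_value(amount, delay_days, k=2.5):
    """Present value of a delayed reward under hyperbolic
    discounting: V = A / (1 + k * D). The rate k is illustrative."""
    return amount / (1 + k * delay_days)

# Choice 1: a dollar today vs. three dollars tomorrow.
# The steep near-term discount makes the immediate dollar win.
take_now = discounted_value(1, 0) > discounted_value(3, 1)

# Choice 2: the same pair of rewards, pushed 365 days out.
# Now the extra day barely matters, so waiting for $3 wins.
wait_later = discounted_value(1, 365) < discounted_value(3, 366)

print(take_now, wait_later)  # True True: preferences reverse
```

The reversal happens because a hyperbolic curve falls very steeply near zero delay and flattens out far in the future, so the same one-day wait looms large today and shrinks to nothing a year from now.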
This is often irrational, but not always. Depending on the uncertainty of the reward, a definite dollar today may be preferable to the possibility of three dollars tomorrow.
This particular cognitive bias shows up in studies of how people save for retirement, borrow on their credit cards, procrastinate on important tasks, and deal with the consequences of addiction. Especially where hamburgers are concerned.