Why do we need decision aids? Can't people make good choices on their own? Like many decision analysts, I was first attracted to the science of decision making as a result of reading about the errors and biases that affect people's judgments.
Remarkably, it appears that our brains have been hard-wired to make certain kinds of errors. Hundreds of different biases have been identified and categorized, including biases that distort our judgments, that introduce errors into the estimates and forecasts that we produce, and that cause us to make the wrong choices.
If you're not already familiar with the major results from this fascinating field of research, this introduction should help you to appreciate the value of formal decision-aiding tools. Without the assistance of such tools, the decisions made within organizations, including choices of which projects to conduct, will be systematically biased. Error and bias in judgment is the first of my 5 reasons why organizations choose the wrong projects.
Heuristics and Judgmental Biases
The fact that people's intuitive decisions are often strongly and systematically biased has been firmly established over the past 50 years by literally hundreds of empirical studies. Psychologist Daniel Kahneman received the 2002 Nobel Prize in Economics for his work in this area. The conclusion reached by Kahneman and his colleagues is that people use unconscious shortcuts, termed heuristics, to help them make decisions. "In general, these heuristics are useful, but sometimes they lead to severe and systematic errors."
Understanding heuristics and the errors they cause is important because it can help us find ways to counteract them. For example, when judging distance people use a heuristic that equates clarity with proximity. The clearer an object, the closer we perceive it to be (Figure 1). Although this heuristic is usually correct, it allows haze to trick us into thinking that objects are more distant than they are. The effect can be dangerous. Studies show people often drive faster in fog because reduced clarity and contrast make their speed seem slower than it is. Airline pilots are similarly tricked, so pilots are trained to rely more on instruments than on what they think they see out the cockpit window.
Figure 1: Haze tricks us into thinking objects are further away.
Some of the dozens of well-documented heuristics and related errors and biases are summarized below. The summary, including ideas for countering each bias, draws on several excellent papers on the subject, especially the popular 1998 Harvard Business Review article by Hammond, Keeney, and Raiffa.
Status Quo Bias
Status quo bias refers to the tendency people have to prefer alternatives that perpetuate the status quo. Psychologists call this a "comfort zone" bias based on research suggesting that breaking from the status quo is, for most people, emotionally uncomfortable. It requires increased responsibility and opening oneself up to criticism. For example, if you are a company executive considering introducing a lower-cost/lower-quality version of an existing product to your product line, you face the risk of adversely affecting the perceptions of customers who choose your high-quality products. If your company's reputation for quality declines, you could be accused of making a bad choice. Just considering the change forces you to confront the trade-off between increased profits and the risk of damaging your brand image. Sticking to the status quo is easier because it is familiar; it creates less internal tension.
Would you trade?
Admittedly, there are often good reasons for leaving things unchanged. But studies show that people overvalue the status quo. A famous experiment involved randomly giving students a gift consisting of either a coffee mug or a candy bar. When offered the chance to trade, few wanted to exchange for the alternative gift. Since the gifts were assigned at random, it is unlikely that most students just happened to receive the one they naturally preferred. Apparently, "owning" what they had been given made it appear more valuable.
The power of this bias was quantified in a related experiment. Students were randomly chosen to receive mugs. Those with mugs were asked to name the minimum price at which they would sell their mugs. Those without were asked to name the maximum price at which they would buy. The median selling price was more than twice the median offer price. Again, ownership increased perceived value. Sometimes referred to as the "endowment effect," this bias may help explain why investors are often slow to sell stocks that have lost value. Likewise, it may help explain why executives have trouble terminating failing projects.
Social norms tend to reinforce preference for the status quo. For example, courts (and many organizations) view a sin of commission (doing something wrong) as more serious than a sin of omission (failing to prevent a wrong). As another example, government decision makers are often reluctant to adopt efficiency-enhancing reforms if there are "losers" as well as "gainers." Any change is seen as unfair. The burden of proof is on the side of changing the status quo.
Lack of information, uncertainty, and too many alternatives promote holding to the status quo. In the absence of an unequivocal case for changing course, why face the unpleasant prospect of change? Thus, many organizations continue to support failing projects due to lack of solid evidence that they've failed. Killing a project may be a good business decision, but changing the status quo is typically uncomfortable for the people involved.
What causes status quo bias? According to psychologists, when people face the opportunity of changing their status quo, the loss aspects of the change loom larger than the gain aspects. Losses represent the certain elimination of visible, existing benefits. Gains, in contrast, are prospective and speculative. We know what we have; who knows what we will get? We fear regret, and this fear is amplified by our desire to maintain the respect and approval of others. In business, the key to success is often bold action, but for many CEOs, the only thing worse than making a strategic mistake is being the only one in the industry to make it. Sticking with the status quo is safer.
The best advice for countering the bias toward the status quo is to consider carefully whether the status quo is the best choice or only the most comfortable one:
Sunk Cost Bias
We know rationally that sunk costs—past investments that are now irrecoverable—are irrelevant to current decisions. Sunk costs are the same regardless of the course of action that we choose next. If we evaluate alternatives based solely on their merits, we should ignore sunk costs. Only incremental costs and benefits should influence future choices.
Yet, the more we invest in something (financially, emotionally, or otherwise), the harder it is to give up that investment. For example, when making a telephone call, being on hold and hearing the recording, "Your call is important to us...Please stay on the line," often means that you've got a lot longer to wait. Still, having already invested the effort to make the call, it's hard to hang up and call another time.
There is a great deal of research demonstrating the influence of sunk costs. In one study, students were shown to be more likely to eat identical TV dinners if they paid more for them. Another study arranged to have similar tickets for a theater performance sold at different prices—people with the more expensive tickets were less likely to miss the performance. A third study found that the higher an NBA basketball player is picked in the draft, the more playing time he gets, even after adjusting for differences in performance.
An example of sunk cost bias?
The Concorde supersonic airplane is often cited as an example of sunk cost bias. It became obvious early on that the Concorde was very costly to produce and, with limited seating, was unlikely to generate adequate revenue. Few orders for planes were coming in. Still, even though it was clear that the plane would not make money, France and Britain continued to invest.
Sunk cost reasoning shows up frequently in business. For example, you might be reluctant to fire a poor performer you hired in part because you may feel to do so would be an admission of earlier poor judgment. You might be inclined to give more weight to information you paid for than to information that was free. You might find it harder to terminate a project if you've already spent a lot on it.
Why is it so difficult to free oneself from sunk cost reasoning? Many of us appear to be born with strong feelings about wasting resources. We feel obligated to keep investing because, otherwise, the sunk cost will have been "wasted." We would then need to admit (at least to ourselves if not to others) that we made a mistake. It has even been suggested that sunk cost reasoning may be a kind of self-punishment. We may unconsciously force ourselves to follow through on commitments that no longer seem desirable in order to instruct ourselves to be more careful next time.
Techniques for countering sunk cost bias include:
Supporting Evidence Bias
Supporting evidence bias is our tendency to want to confirm what we already suspect and look for facts to support it. This bias not only affects where we go to collect information, but also how we interpret the evidence that we receive. We avoid asking tough questions and discount new information that might challenge our preconceptions.
Suppose, for example, you are considering an investment to automate some business function. Your inclination is to call an acquaintance who has been boasting about the good results his organization obtained from doing the same. Isn't it obvious that he will confirm your view that, "It's the right choice"? What may be behind your desire to make the call is the likelihood of receiving emotional comfort, not the likelihood of obtaining useful information.
Supporting evidence bias influences the way we listen to others. It causes us to pay too much attention to supporting evidence and too little to conflicting evidence. Psychologists believe the bias derives from two fundamental tendencies. The first is our nature to subconsciously decide what we want to do before figuring out why we want to do it. The second is our inclination to be more attracted to experiences that make us feel good than experiences that make us feel uncomfortable.
What rule generated this sequence?
Despite our inclination to look for supporting evidence, it is usually much more informative to seek contradictory evidence. Confirming evidence often fails to discriminate among possibilities. To illustrate, in one study students were given the sequence of numbers 2, 4, 6 and told to determine the rule that generated the numbers. To check hypotheses, they could choose a possible next number and ask whether that number was consistent with the rule. Most students asked whether a next number "8" would be consistent with the rule. When told it was, they expressed confidence that the rule was, "The numbers increase by 2." Actually, the rule was, "Any increasing sequence." A better test would have been to check whether a next number incompatible with the hypothesis (e.g., "7") was consistent with the unknown rule.
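The logic of the 2, 4, 6 study can be sketched in a few lines of Python. The function names and the probe values are mine, for illustration only: the point is that a confirming probe is accepted by both the student's hypothesis and the actual rule, so it teaches nothing, while a disconfirming probe separates them.

```python
def increase_by_two(seq):
    """The student's hypothesis: each number is 2 more than the last."""
    return all(b - a == 2 for a, b in zip(seq, seq[1:]))

def any_increasing(seq):
    """The actual rule in the study: any strictly increasing sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirming probe "8": both rules accept it, so it cannot discriminate.
print(increase_by_two([2, 4, 6, 8]), any_increasing([2, 4, 6, 8]))  # True True

# Disconfirming probe "7": only the actual rule accepts it, so it is informative.
print(increase_by_two([2, 4, 6, 7]), any_increasing([2, 4, 6, 7]))  # False True
```

A probe that could only confirm the hypothesis leaves both candidate rules standing; a probe designed to refute it is the one that actually narrows the possibilities.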
Supporting evidence bias can cause us to perpetuate our pet beliefs. For example, if a manager believes people are basically untrustworthy, that manager will closely monitor their behavior. Every questionable act will increase suspicions. Meanwhile, employees will notice that their actions are being scrutinized. Closely watching employees will make it impossible to develop trust. Studies show that when people are placed in situations where authority figures expect them to cheat, more of them do, in fact, cheat. The behavior pattern reinforces itself to everyone's detriment.
Changing what we believe takes effort. When first encountered, data that conflicts with our preconceptions is often attributed to error or to some other external factor. Only after repeated exposure to the conflicting information are we willing to make the effort to change our beliefs.
Some advice for avoiding supporting evidence bias:
Framing Bias
The first step in making a choice is to frame the decision, and it is also the first place you can go wrong. The way a problem is framed strongly influences the choices we subsequently make. People tend to accept the frame they are given; they seldom stop to reframe the problem in their own words. A frame that biases our reasoning leads to poor decisions.
Edward Russo and Paul Schoemaker provide a story to illustrate the power of framing. A Jesuit and a Franciscan each sought permission from their superiors to smoke while they prayed. The Franciscan asked whether it was acceptable for him to smoke while he prayed. His request was denied. The Jesuit framed the question differently: "In moments of human weakness when I smoke, may I also pray?" This frame, of course, elicited the opposite response.
Whether outcomes are described as gains or losses influences people's choices. In one experiment, participants were asked to express their preferences among alternative programs affecting community jobs. They were told that, due to a factory closing, 600 jobs were about to be lost. However, if program A is adopted, 200 jobs will be saved. On the other hand, if program B is adopted, there is a 1/3 probability that 600 jobs will be saved and a 2/3 probability that none of the 600 jobs will be saved. Most people preferred program A. Another group was given a rephrasing of the choice. If program C is adopted, they were told, 400 people will lose their jobs. If program D is adopted, there is a 1/3 probability that nobody will lose their job and a 2/3 probability that 600 will lose their job. This group mainly favored program D.
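The two framings are numerically identical, which a quick expected-value check makes explicit (a sketch; the variable names are mine):

```python
TOTAL = 600  # jobs at stake

ev_A = 200                          # 200 jobs saved for certain
ev_B = (1/3) * 600 + (2/3) * 0      # the gamble, framed as jobs "saved"
ev_C = TOTAL - 400                  # "400 lost" is the same as 200 saved
ev_D = (1/3) * 600 + (2/3) * 0      # the same gamble, framed as jobs "lost"

# All four programs save 200 jobs on average; A/C and B/D are identical outcomes.
print(ev_A, ev_B, ev_C, ev_D)
```

Since A is the same outcome as C, and B the same gamble as D, a consistent decision maker should express the same preference in both groups. The reversal shows that the "saved" frame invites risk aversion while the "lost" frame invites risk seeking.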
Similar effects occur in everyday decision making. For example, the typical New York taxi driver chooses how long to work each day based on a personal target for daily earnings. Failing to achieve the target is perceived as a loss. Thus, on slow days the driver works more hours in order to achieve the target and avoid the loss. On busy days the driver hits the target more quickly and quits early. However, it would be more efficient to work longer hours on busy days and knock off early on slow days. The driver would end up with more income and fewer hours worked.
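A small numeric sketch shows why the target strategy is inefficient. The hourly earnings and the daily target below are invented for illustration, not taken from the study:

```python
slow_wage, busy_wage = 20, 40   # hypothetical hourly earnings ($/hr)
target = 240                    # hypothetical daily earnings target ($)

# Target strategy: on each day, work until the target is reached.
target_hours = target / slow_wage + target / busy_wage  # 12 + 6 = 18 hours
target_income = 2 * target                              # $480

# Efficient strategy: same 18 hours, shifted toward the busy day.
efficient_income = 6 * slow_wage + 12 * busy_wage       # $600

print(target_hours, target_income, efficient_income)
```

For the same 18 hours of work, shifting hours from the slow day to the busy day raises income from $480 to $600; equivalently, the driver could match the $480 with fewer total hours.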
Project proponents intuitively understand the advantage of focusing attention on upside potential rather than downside risk. It sounds more positive to say that a new product launch has a "1-in-5 chance of succeeding" than to make the mathematically equivalent statement that it has an "80% chance of failing." If people are rational, they should make the same choice in every situation in which the outcomes and their probabilities are identical. It shouldn't matter whether those outcomes are described as "gains" or "losses" or as "successes" or "failures." But the words establish different frames, and decisions may differ because of it.
Another example, described by Hammond, Keeney, and Raiffa, involves automobile insurance laws voted on in New Jersey and Pennsylvania. Each state gave voters a new option: By accepting a limited right to sue they could lower their insurance premiums. New Jersey framed the initiative by automatically giving drivers the limited right to sue unless they specified otherwise. Pennsylvania framed it by giving drivers the full right to sue unless they specified otherwise. Both measures passed, and in both cases large majorities of drivers defaulted to the status quo. But, because of the way Pennsylvania framed the choice, drivers in that state failed to gain about $200 million in expected insurance savings.