Choosing the Wrong Portfolio of Projects

Part 1:  Errors and Bias in Judgment


Why do we need decision aids? Can't people make good choices on their own? Like many decision analysts, I was first attracted to the science of decision making as a result of reading about the errors and biases that affect people's judgments.

Remarkably, it appears that our brains have been hard-wired to make certain kinds of errors. Hundreds of different biases have been identified and categorized, including biases that distort our judgments, that introduce errors into the estimates and forecasts that we produce, and that cause us to make the wrong choices.

If you're not already familiar with the major results from this fascinating field of research, this introduction should help you to appreciate the value of formal decision-aiding tools. Without the assistance of such tools, the decisions made within organizations, including choices of which projects to conduct, will be systematically biased. Errors and bias in judgment is the first of my 5 reasons why organizations choose the wrong projects.

Heuristics and Judgmental Biases


The fact that people's intuitive decisions are often strongly and systematically biased has been firmly established over the past 50 years by literally hundreds of empirical studies. Psychologist Daniel Kahneman received the 2002 Nobel Prize in Economics for his work in this area. The conclusion reached by Kahneman and his colleagues is that people use unconscious shortcuts, termed heuristics, to help them make decisions. "In general, these heuristics are useful, but sometimes they lead to severe and systematic errors" [1].

Understanding heuristics and the errors they cause is important because it can help us find ways to counteract them. For example, when judging distance, people use a heuristic that equates clarity with proximity: the clearer an object, the closer we perceive it to be (Figure 1). Although this heuristic is usually correct, it allows haze to trick us into thinking that objects are more distant than they are. The effect can be dangerous. Studies show that people often drive faster in fog because the reduced clarity and contrast make their speed seem slower than it is. Airline pilots are similarly tricked, so pilots are trained to rely more on their instruments than on what they think they see out the cockpit window.


Figure 1: Houston on clear and hazy days. Haze tricks us into thinking that objects are farther away than they are.


Some of the dozens of well-documented heuristics and related errors and biases include:


Comfort Zone Biases

People tend to do what's comfortable rather than what's important. For example, people:
  • Become attached to the status quo.
  • Value things more highly if they already own them.
  • Seek information that confirms what they already suspect.
  • Ignore information inconsistent with their current beliefs.
  • Are more likely to choose an alternative if it is not the most extreme one considered.
  • Make choices or predict outcomes based on what is pleasing to imagine rather than on rational evidence.
  • Fail to learn and correct their beliefs despite strong evidence that they should.
  • Keep doing the same things, even if they no longer work well.
  • Distort their views of reality in order to feel more comfortable.

Perception Biases

People's beliefs are distorted by faulty perceptions. For example, people:
  • See things according to the conventions of their profession, ignoring other views.
  • Overlook and ignore unexpected data.
  • Anchor on information that is readily available, vivid, or recent.
  • Make insufficient adjustments from their initial anchors.
  • Ascribe more credibility to data than is warranted.
  • Overestimate what they know.
  • See patterns where none exist.
  • Underestimate the time and effort needed to complete a difficult task.
  • Perceive recent events as more distant and very distant events as less distant.
  • Give different answers to the same question posed in different ways.

Motivation Biases

People's motivations and incentives tend to bias their judgments. For example, people:
  • Behave differently when they think they are being observed.
  • Unconsciously distort judgments to "look good" and "get ahead."
  • Remember their decisions as being better than they were.
  • Take actions as if concerned only with short-term consequences.
  • Attribute good outcomes to their skill and bad outcomes to others' failures or bad luck.
  • Escalate commitments to avoid questioning earlier decisions.
  • Favor actions that shield them from unfavorable feedback.
  • May have the urge to do the opposite of what is suggested, to resist a perceived constraint on their freedom of choice.

Errors in Reasoning

People use flawed reasoning to reach incorrect conclusions. For example, people:
  • Believe they can control outcomes that they can't.
  • Simplify inappropriately.
  • Are persuaded by circular reasoning, false analogies, and other fallacious arguments.
  • Are surprised by statistically likely "coincidences."
  • Judge the credibility of an argument by its manner of presentation.
  • Disregard probabilities when making decisions.
  • Abhor risk but seek bigger risks to avoid a sure loss.
  • Prefer eliminating a small risk to reducing a large risk.
  • Think a string of identical random outcomes (e.g., several "heads" in a row) makes that outcome more likely in the future.
  • Cannot solve even simple probability problems in their heads.

Group Think

Group dynamics create additional distortions. For example, groups:
  • Give preferential treatment to those perceived to be group members.
  • Reinforce beliefs via a bandwagon effect.
  • "Dive in" when solving problems without having all the necessary information.
  • Are excessively cautious in sharing data.
  • Assume they agree when they don't.
  • Avoid expressing inconsistent, opposing views.
  • Jump to conclusions prematurely or get bogged down trying to reach agreement.
  • Discount solutions "not invented here."
  • Create illusions of invulnerability and ignore external views of the morality of their actions.

The following summary, including ideas for countering biases, is derived from some of the many excellent papers on the subject, especially the popular 1998 Harvard Business Review article by Hammond, Keeney, and Raiffa [2].

Status Quo Bias

Status quo bias refers to the tendency people have to prefer alternatives that perpetuate the status quo. Psychologists call this a "comfort zone" bias based on research suggesting that breaking from the status quo is, for most people, emotionally uncomfortable. It requires increased responsibility and opening oneself up to criticism. For example, if you are a company executive considering introducing a lower-cost/lower-quality version of an existing product to your product line, you face the risk of adversely affecting the perceptions of customers who choose your high-quality products. If your company's reputation for quality declines, you could be accused of making a bad choice. Just considering the change forces you to confront the trade-off between increased profits and the risk of damaging your brand image. Sticking to the status quo is easier because it is familiar; it creates less internal tension.

Would you trade?

Admittedly, there are often good reasons for leaving things unchanged. But studies show that people overvalue the status quo. In a famous experiment, students were randomly given a gift of either a coffee mug or a candy bar. When offered the chance to trade, few wanted to exchange their gift for the alternative. Since the gifts were assigned at random, it is unlikely that so many students simply happened to receive the one they naturally preferred. Apparently, "owning" what they had been given made it appear more valuable.

The power of this bias was quantified in a related experiment. Students were randomly chosen to receive mugs. Those with mugs were asked to name the minimum price at which they would sell; those without were asked to name the maximum price at which they would buy. The median selling price was more than twice the median offer price. Again, ownership increased perceived value. Sometimes referred to as the "endowment effect," this bias may help explain why investors are often slow to sell stocks that have lost value, and why executives have trouble terminating failing projects.

Social norms tend to reinforce preference for the status quo. For example, courts (and many organizations) view a sin of commission (doing something wrong) as more serious than a sin of omission (failing to prevent a wrong). As another example, government decision makers are often reluctant to adopt efficiency-enhancing reforms if there are "losers" as well as "gainers." Any change is seen as unfair. The burden of proof is on the side of changing the status quo.

Lack of information, uncertainty, and too many alternatives promote holding to the status quo. In the absence of an unequivocal case for changing course, why face the unpleasant prospect of change? Thus, many organizations continue to support failing projects due to lack of solid evidence that they've failed. Killing a project may be a good business decision, but changing the status quo is typically uncomfortable for the people involved.

What causes status quo bias? According to psychologists, when people face the opportunity of changing their status quo, the loss aspects of the change loom larger than the gain aspects. Losses represent the certain elimination of visible, existing benefits; gains, in contrast, are prospective and speculative. We know what we have; who knows what we will get? We fear regret, and this fear is amplified by our desire to maintain the respect and approval of others. In business, the key to success is often bold action, but for many CEOs the only thing worse than making a strategic mistake is being the only one in the industry to make it. Sticking with the status quo is safer.

The best advice for countering the bias toward the status quo is to consider carefully whether status quo is the best choice or only the most comfortable one:

  1. When you hear comments like "let's wait and see" or "let's meet next month to see how the project is going," question whether you're hearing status quo bias.
  2. Think about what your objectives are and whether they are best served by the status quo or by a change.
  3. Identify who might be disadvantaged by changing the status quo, and look for ways to mitigate or compensate for those disadvantages.
  4. Ask yourself whether you would choose the status quo alternative if, in fact, it wasn't the status quo.
  5. Avoid overestimating the difficulty of switching from the status quo.
  6. Actively manage migration away from the status quo—communicate dissatisfaction with the status quo and repeat the message about the need for change.
  7. Note that a change becomes the status quo over time. Evaluate alternatives in terms of a future perspective as well as the present.

Sunk Cost Bias

We know rationally that sunk costs—past investments that are now irrecoverable—are irrelevant to current decisions. Sunk costs are the same regardless of the course of action that we choose next. If we evaluate alternatives based solely on their merits, we should ignore sunk costs. Only incremental costs and benefits should influence future choices.

Yet, the more we invest in something (financially, emotionally, or otherwise), the harder it is to give up that investment. For example, when making a telephone call, being on hold and hearing the recording, "Your call is important to us...Please stay on the line," often means that you've got a lot longer to wait. Still, having already invested the effort to make the call, it's hard to hang up and call another time.

There is a great deal of research demonstrating the influence of sunk costs. In one study, students were shown to be more likely to eat identical TV dinners if they paid more for them. Another study arranged to have similar tickets for a theater performance sold at different prices—people with the more expensive tickets were less likely to miss the performance. A third study found that the higher an NBA basketball player is picked in the draft, the more playing time he gets, even after adjusting for differences in performance.

The Concorde: an example of sunk cost bias?


The Concorde supersonic airplane is often cited as an example of sunk cost bias. It became obvious early on that the Concorde was very costly to produce and, with its limited seating, unlikely to generate adequate revenue. Few orders for planes were coming in. Yet even though it was clear that the plane would not make money, France and Britain continued to invest.

Sunk cost reasoning shows up frequently in business. For example, you might be reluctant to fire a poor performer you hired, in part because doing so would feel like an admission of earlier poor judgment. You might be inclined to give more weight to information you paid for than to information that was free. And you might find it harder to terminate a project if you've already spent a lot on it.

Why is it so difficult to free oneself from sunk cost reasoning? Many of us appear to be born with strong feelings about wasting resources. We feel obligated to keep investing because, otherwise, the sunk cost will have been "wasted." We would then need to admit (at least to ourselves if not to others) that we made a mistake. It has even been suggested that sunk cost reasoning may be a kind of self-punishment. We may unconsciously force ourselves to follow through on commitments that no longer seem desirable in order to instruct ourselves to be more careful next time.

Techniques for countering sunk cost bias include:

  1. Ask yourself what another manager would do in your place, one without a prior history in the investment.
  2. Seek the opinions of people who were uninvolved in the original choice.
  3. Be alert to sunk cost bias in the decisions and recommendations made by others. Comments like "we've invested so much already" and "we don't want to waste those resources" are signals. Consider re-assigning responsibilities.
  4. Avoid creating a mistake-fearing culture within your organization. Set an example by admitting when you are wrong. Change course quickly without regard to the sunk costs of investments that have gone bad.
  5. Remember that even smart choices (taking into account what was known at the time the decision was made) can have bad outcomes (because of uncertainty). Cutting your losses does not necessarily mean that you were foolish to make the original choice.
  6. Try thinking of the sunk cost not as an investment that was wasted, but as an investment that (perhaps indirectly) led to valuable information indicating that a course change is needed.
  7. Follow the advice of Warren Buffett: "When you find yourself in a hole, the best thing you can do is stop digging."

Supporting Evidence Bias

Supporting evidence bias is our tendency to want to confirm what we already suspect and look for facts to support it. This bias not only affects where we go to collect information, but also how we interpret the evidence that we receive. We avoid asking tough questions and discount new information that might challenge our preconceptions.

Suppose, for example, you are considering an investment to automate some business function. Your inclination is to call an acquaintance who has been boasting about the good results his organization obtained from doing the same. Isn't it obvious that he will confirm your view that, "It's the right choice"? What may be behind your desire to make the call is the likelihood of receiving emotional comfort, not the likelihood of obtaining useful information.

Supporting evidence bias influences the way we listen to others. It causes us to pay too much attention to supporting evidence and too little to conflicting evidence. Psychologists believe the bias derives from two fundamental tendencies. The first is our tendency to decide, subconsciously, what we want to do before figuring out why we want to do it. The second is our inclination to be more attracted to experiences that make us feel good than to experiences that make us feel uncomfortable.

What rule generated the sequence 2, 4, 6?

Despite our inclination to look for supporting evidence, it is usually much more informative to seek contradictory evidence. Confirming evidence often fails to discriminate among possibilities. To illustrate, in one study students were given the sequence of numbers 2, 4, 6 and told to determine the rule that generated the numbers. To check hypotheses, they could choose a possible next number and ask whether that number was consistent with the rule. Most students asked whether a next number "8" would be consistent with the rule. When told it was, they expressed confidence that the rule was, "The numbers increase by 2." Actually, the rule was, "Any increasing sequence." A better test would have been to check whether a next number incompatible with the hypothesis (e.g., "7") was consistent with the unknown rule.
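
To see why the confirming test is so weak, here is a minimal sketch in Python (the two rules are written out exactly as described above; the code itself is only an illustration). The guess "8" is accepted by both the students' hypothesis and the actual rule, so it cannot distinguish between them; the guess "7" is rejected by the hypothesis but accepted by the actual rule, so it is the test that carries information.

  # Students' hypothesis: each number is exactly 2 more than the last.
  def increases_by_two(seq):
      return all(b - a == 2 for a, b in zip(seq, seq[1:]))

  # Actual (hidden) rule: any strictly increasing sequence.
  def any_increasing(seq):
      return all(b > a for a, b in zip(seq, seq[1:]))

  for guess in (8, 7):
      trial = [2, 4, 6, guess]
      print(guess, increases_by_two(trial), any_increasing(trial))

  # Prints: 8 True True   (both rules accept 8, so the test discriminates nothing)
  #         7 False True  (only the hidden rule accepts 7, so the test is informative)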

Supporting evidence bias can cause us to perpetuate our pet beliefs. For example, if a manager believes people are basically untrustworthy, that manager will closely monitor their behavior. Every questionable act will increase suspicions. Meanwhile, employees will notice that their actions are being scrutinized. Closely watching employees will make it impossible to develop trust. Studies show that when people are placed in situations where authority figures expect them to cheat, more of them do, in fact, cheat. The behavior pattern reinforces itself to everyone's detriment.

Changing what we believe takes effort. When first encountered, data that conflicts with our preconceptions is often attributed to error or to some other external factor. Only after repeatedly encountering the conflicting information are we willing to make the effort to change our beliefs.

Some advice for avoiding supporting evidence bias:

  1. Check to see whether you are examining all the evidence. Avoid the inclination to accept confirming evidence without question.
  2. Get in the habit of looking for counter arguments.
  3. In meetings, consider appointing someone to serve as devil's advocate—to argue against the prevailing point of view. If that seems too uncomfortable, at least appoint a devil's inquisitor—someone with responsibility to ask tough questions.
  4. When you encounter something that conflicts with your beliefs, dig deeper. Resist the temptation to dismiss data that makes you uncomfortable.
  5. Be honest with yourself. Are you really gathering information to help you make a smart choice, or are you just looking to confirm what you already believe?
  6. Don't surround yourself with "yes men."

Framing Bias

The first step in making a choice is to frame the decision, but it is also where you can first go wrong. The way a problem is framed strongly influences the subsequent choices we make. People tend to accept the frame they are given; they seldom stop to reframe it in their own words. A frame that biases our reasoning causes us to make poor decisions.

Edward Russo and Paul Schoemaker [3] provide a story to illustrate the power of framing. A Jesuit and a Franciscan were seeking permission from their superiors to smoke while they prayed. The Jesuit asked whether it was acceptable for him to smoke while he prayed. His request was denied. The Franciscan asked the question a different way: "In moments of human weakness when I smoke, may I also pray?" Framed that way, the story goes, the request was granted.

Whether outcomes are described as gains or losses influences people's choices. In one experiment, participants were asked to express their preferences among alternative programs affecting community jobs. They were told that, due to a factory closing, 600 jobs were about to be lost. If program A were adopted, 200 jobs would be saved. If program B were adopted, there would be a 1/3 probability that all 600 jobs would be saved and a 2/3 probability that none would be saved. Most people preferred program A. Another group was given a rephrasing of the choice: if program C were adopted, 400 people would lose their jobs; if program D were adopted, there would be a 1/3 probability that nobody would lose their job and a 2/3 probability that all 600 would. This group mainly favored program D.
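
Yet the two framings describe exactly the same options. A quick check (a sketch whose only purpose is the arithmetic) shows that A and C are the identical certain outcome, and B and D are the identical gamble, all with an expected 200 of the 600 jobs saved:

  TOTAL = 600   # jobs at risk
  p = 1/3       # probability that the risky program saves every job

  program_A = 200                                          # 200 jobs saved for certain
  program_B = p * 600 + (1 - p) * 0                        # gamble framed as jobs saved
  program_C = TOTAL - 400                                  # "400 jobs lost" = 200 saved
  program_D = p * (TOTAL - 0) + (1 - p) * (TOTAL - 600)    # same gamble framed as jobs lost

  print(program_A, program_B, program_C, program_D)        # 200 200.0 200 200.0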

A New York taxi: poor framing?

Similar effects occur in everyday decision making. For example, the typical New York taxi driver chooses how long to work each day based on a personal target for daily earnings. Failing to achieve the target is perceived as a loss. Thus, on slow days the driver works more hours in order to reach the target and avoid the loss. On busy days the driver hits the target more quickly and quits early. However, it would be more efficient to work longer hours on busy days and knock off early on slow days. The driver would end up with more income and fewer hours worked.
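
A toy calculation makes the point; the hourly rates and target below are made-up numbers for illustration, not figures from the study. Suppose slow days pay $20 per hour, busy days pay $40 per hour, and the driver's daily target is $200. Over a two-day stretch with one slow day and one busy day:

  slow_rate, busy_rate = 20, 40   # hypothetical earnings per hour
  target = 200                    # hypothetical daily earnings target

  # Target rule: work each day until the target is reached.
  target_hours = target / slow_rate + target / busy_rate   # 10 + 5 = 15 hours
  target_income = 2 * target                               # $400

  # Reverse rule: quit early on the slow day, work longer on the busy day.
  reverse_hours = 4 + 9                                     # 13 hours
  reverse_income = 4 * slow_rate + 9 * busy_rate            # $80 + $360 = $440

  print(target_hours, target_income)    # 15.0 400
  print(reverse_hours, reverse_income)  # 13 440

Under these assumed rates, the reverse rule yields more income for fewer total hours worked.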

Project proponents intuitively understand the advantage of focusing attention on upside potential rather than downside risk. It sounds more positive to say that a new product launch has a "1-in-5 chance of succeeding" than to make the mathematically equivalent statement that it has an "80% chance of failing." If people are rational, they should make the same choice in every situation in which the outcomes and their probabilities are identical. It shouldn't matter whether those outcomes are described as "gains" or "losses" or as "successes" or "failures." But the words establish different frames, and decisions may differ because of it.

Another example, described by Hammond, Keeney and Raiffa [2], involves automobile insurance laws voted on in New Jersey and Pennsylvania. Each state gave voters a new option: By accepting a limited right to sue they could lower their insurance premiums. New Jersey framed the initiative by automatically giving drivers the limited right to sue unless they specified otherwise. Pennsylvania framed it by giving drivers the full right to sue unless they specified otherwise. Both measures passed, and in both cases large majorities of drivers defaulted to the status quo. But, because of the way Pennsylvania framed the choice, drivers in that state failed to gain about $200 million in expected insurance savings.

Advice for countering framing bias:

  1. Ask yourself if you are working on the real problem.
  2. Look for implicit assumptions or unnecessary constraints in the way that you perceive your problem.
  3. To promote objective reasoning, avoid framing alternatives with value-laden terminology (e.g., labeling a proposed resource allocation as "fair").
  4. Try posing problems in a neutral way that combines gains and losses, adopts alternative reference points, or promotes objectivity.
  5. Look at the problem from other perspectives. For example, reverse the context. If you are the seller, how would you see things if you were the buyer?
  6. Choose a frame that captures all of what's important. For example, ask, "What's the total cost of ownership?" not "What's the price?"
  7. Watch out for leading questions—questions or phrasing designed to create a frame intended to elicit a particular response.
  8. Choose a high-level perspective for framing. For example, looking only at project-by-project risk may result in a portfolio of overly conservative projects.