

Man, I’m Smart About How Stupid I Am!

Mark Simon

Executives who make inaccurate predictions will surely encounter difficulties. Being absolutely certain that one’s erroneous predictions are right, however, is where the real danger lies. It can turn what would have been a minor problem into a major disaster. To help managers avoid being overconfident of their business predictions, here are some lessons from my own personal work experience.

At the School of Business Administration where I started working four years ago, faculty vote on the criteria used to distribute the merit pay component of our raises. Yet despite our freedom to innovate and our endless debating (as faculty are prone to do), I had watched us shoot down countless proposals over the years. Then I, a relative newcomer, figured out a plan that met everyone’s concerns. The vote was just moments away, and in my heart of hearts I was sure that everyone would just love it. I had no doubt of its almost unanimous acceptance, which in turn would lead me to be crowned the hero.

Even when I first came up with the plan one week earlier, I was pretty certain it would receive a favorable reaction, though I realized there might be a small chance of my being wrong. After all, I had just finished writing two articles on overconfidence which, consistent with countless other studies, found that people’s confidence in their predictions often exceeds the accuracy of those predictions. That knowledge, combined with the little detail that I was about to affect people in their most sensitive area (their pocketbooks), made me decide to double-check my assumption regarding the proposal’s overwhelming popularity. It wouldn’t hurt to make sure that a significant minority of faculty wouldn’t be mad at me when my proposal passed (especially since my tenure decision wasn’t too far off).

So I searched my memory and realized that my plan was pretty similar to a past proposal that had received a favorable reaction. Okay, it never passed; in fact, I hadn’t even voted for it. But that was only because it could generate a problem that we were concerned about, a problem my idea eliminated. Great! My plan had all the best parts of the prior proposal and yet avoided the reason for its downfall.

Still, being ever a cautious person, I discussed my plan with the four faculty I knew best. They loved it! In fact, only one person had even minor objections, and two of the others were so excited they decided to be cosponsors. As the cosponsors and I talked, we came up with more and more reasons why the proposal would pass, including remembering cases in which people had worried about some dilemma our plan solved. Growing increasingly confident, I ran my idea by my department head, who used the word “great” when he told me I should go ahead with it. He did, however, recommend I meet with some of the senior faculty who had their fingers on the School’s pulse. Although there was just one day to go before the vote, I managed to speak to one of them. I became absolutely positive the plan would pass when this knowledgeable person encouraged me, saying, “Sure, go ahead with the proposal.”

So I did. And it failed overwhelmingly: 20 percent for, 80 percent against.

After recovering from my initial shock, I began to review what I knew about how people became overconfident. My hope was to understand whether I, an expert on the topic, could have fallen prey to the very problem I study. I came to realize that at every step along the way I had become ensnared by common judgment errors that lead to overconfidence, including ignoring base-rates, false consensus effects, belief in the law of small numbers, and prior hypothesis bias. Let’s discuss each of these as they relate to my experience.

Ignoring Base-Rates. In part, I was overconfident because I ignored base-rates; that is, I formed conclusions about my endeavor without considering the past outcomes of similar actions. Probability theory suggests that one needs to take base-rates into account before predicting an outcome, unless the endeavor is a complete departure from what came before it, a situation that rarely occurs. In my case, every single plan (more than half a dozen of them) that had urged changing the status quo had been overwhelmingly defeated; maybe this should have been a clue that my proposal would not sail through with flying colors. I would have reached a more realistic conclusion had I formally tabulated the outcome of other raise proposals and used that number as a starting point.
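
In code, that tabulation is trivial. Here is a minimal sketch; the counts are illustrative stand-ins for the School’s actual voting record, and the smoothing rule is just one way to keep the estimate from collapsing to an absolute zero:

```python
# A minimal sketch of using a base rate as a starting point.
# The tallies are hypothetical stand-ins for the actual voting history.
past_proposals = 7   # change-the-status-quo proposals brought to a vote
passed = 0           # how many actually passed

# The naive base rate is 0/7 = 0%. Laplace's rule of succession (add one
# imaginary success and one imaginary failure) keeps the starting point
# usable when no successes have ever been observed.
base_rate = (passed + 1) / (past_proposals + 2)
print(f"Starting-point estimate that a new proposal passes: {base_rate:.0%}")
# -> roughly 11%, a far cry from "almost unanimous acceptance"
```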

The False Consensus Effect. The false consensus effect refers to the tendency to exaggerate how common one’s opinions are, and I suffered from it big time. Not only was I convinced that everyone would share my general conclusions, I was also sure my colleagues voted down the “similar” proposal for the same reason I did. After all, it was obvious they agreed with me regarding what was good and bad about it, wasn’t it? Actually, during my quest to figure out why my plan failed, faculty told me they thought I had incorporated some of the worst parts of the other proposal while eliminating some of the best. Perhaps if I had double-checked my assumptions initially, rather than assuming everybody believed as I did, I might have been better off.

Belief in the Law of Small Numbers. Of course, I didn’t just rely on my memory of past proposals; the opinions of some faculty members I spoke to also led me to believe my plan would be accepted by all. Yup, of the four people I initially asked, an overwhelming majority (three) unquestionably supported my idea, and one had only minor concerns. But hadn’t I just written several papers finding that individuals often erroneously express, as Tversky and Kahneman (1974) put it, a “belief in the law of small numbers”? In other words, they draw firm conclusions from small samples, even though the smaller the sample, the more likely the results differ from those of the population. Surely I couldn’t have done that, could I?
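
A back-of-the-envelope calculation shows just how little a four-person sample pins down. The 20 percent support level below is simply the eventual vote share, used here as an assumed population value for illustration:

```python
# How uncertain is a sample proportion at n = 4 versus n = 40?
from math import sqrt

p = 0.20  # assumed true support in the full faculty (the actual vote share)
for n in (4, 40):
    se = sqrt(p * (1 - p) / n)  # standard error of a sample proportion
    print(f"n = {n:>2}: estimated support swings roughly +/- {2 * se:.0%}")
# n =  4: +/- 40%  -> a 3-of-4 straw poll is nearly uninformative
# n = 40: +/- 13%  -> a larger sample starts to pin the estimate down
```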

I fear it gets worse. As is so often the case when constructing a small sample, without explicitly intending to, I developed a biased sample. Subconsciously I may have decided to approach the four people I was closest to because they were the ones most likely to agree with me. I might have gotten a far different picture had I asked a few more people and selected them at random.
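
A toy simulation makes the danger concrete. The agreement rates are pure assumptions chosen for illustration: close colleagues are taken to agree 75 percent of the time, the faculty at large only 20 percent:

```python
# Convenience sampling versus random sampling, simulated.
import random

random.seed(1)

def straw_poll_majority(p_agree: float, n: int = 4, trials: int = 10_000) -> float:
    """Fraction of straw polls in which a majority of n respondents agree."""
    wins = 0
    for _ in range(trials):
        agrees = sum(random.random() < p_agree for _ in range(n))
        wins += agrees > n / 2
    return wins / trials

print(f"Close colleagues (biased sample): {straw_poll_majority(0.75):.0%} of polls show majority support")
print(f"Random faculty (unbiased sample): {straw_poll_majority(0.20):.0%} of polls show majority support")
```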

Prior Hypothesis Bias: Information Considered. The prior hypothesis bias is rather general (possibly incorporating the other biases above) and occurs when an individual emphasizes information that confirms his hypothesis and downplays evidence that contradicts it. Problematically, not only did I initially sample only four people, I may have allowed my initial hypothesis that people would favor my plan to affect my conclusion about the person who was for it despite “a few minor concerns.” When I asked this person why she had voted against the plan after she had touted its benefits earlier, she looked confused. She reminded me that for any advantage she had mentioned, she had brought up at least two disadvantages. Writing down any negative feedback when it occurs would have helped me remember it later.

My cosponsors and I also may have fallen prey to the bias when we talked extensively about why the proposal would pass and never once tried to think of reasons why people might object to it. If one of us had played devil’s advocate, we might have gained a more accurate perspective. Moreover, although we generated much evidence that supported our beliefs, we forgot to ask ourselves whether the evidence was redundant. Even though redundant data do not contain new information, they often lead people to grow more certain of their conclusions. We recalled multiple instances in which one person expressed a desire that our proposal could fulfill, but no instance really provided us with new information; it was the same individual discussing the same need. Not double-counting redundant information would have helped me avoid the prior hypothesis bias.
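
A simple screen is to count distinct sources rather than mentions. A minimal sketch, with a hypothetical evidence list:

```python
# Screening evidence for redundancy: deduplicate by (source, claim).
evidence = [
    ("colleague A", "wants the dilemma our plan solves fixed"),
    ("colleague A", "wants the dilemma our plan solves fixed"),  # same person, same need
    ("colleague A", "wants the dilemma our plan solves fixed"),
    ("colleague B", "likes tying merit pay to clear criteria"),
]

mentions = len(evidence)          # how the cosponsors and I counted
independent = len(set(evidence))  # distinct (source, claim) pairs
print(f"{mentions} supporting mentions, but only {independent} independent data points")
```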

Prior Hypothesis Bias: Interpreting Information. This bias not only causes one to search for and use a disproportionate amount of evidence to enhance one’s belief in a hypothesis, it also leads one to conclude that ambiguous data support that hypothesis. I interpreted my chairman’s explicit comment that it would be “great” if I introduced the proposal to mean that he thought it would pass. But he never even came close to implying that. Instead, at least in retrospect, he clearly indicated that I should proceed because our faculty do not penalize anyone for stating an opinion (something I’m eternally grateful for), and he believes that having assistant professors voice their opinions keeps the place vibrant.

But what of the senior faculty member who told me “to go for it”? He was merely commenting on my right to bring a motion before the School. In fact, my interpretation that he was in favor of the proposal and thought it would pass really called for some mental gymnastics on my part, because I somehow never got around to describing what my proposal was when I “solicited” his opinion. (After all, had I shown him a copy, he might have shattered my illusion that it was a panacea for all our problems.) After making this omission, it was child’s play for me to discount his caveat that most faculty were pretty tired of debating the merit pay issue (surely he wasn’t relating that observation to anything I had developed). I suspect I might have assessed his opinion more accurately if I had actually asked him the question I wanted answered, made a note of his response, and confirmed that I interpreted his answer correctly.

Figure 1. Precautions to Avoid Biases

| Action | Bias Leading to Overconfidence | Precaution to Avoid Bias |
|---|---|---|
| Did not consider all the past proposals voted down when making my predictions. | Ignoring Base-Rates | Use outcomes of similar actions as a starting point. |
| Assumed everyone had the same reasons I did for rejecting a similar proposal. | False Consensus Effect | Ask for input rather than assuming others think like you. |
| Initially solicited only four individuals’ opinions. | The Law of Small Numbers | Recognize that small samples generate unreliable results. Develop larger samples. |
| Asked the opinions of people to whom I was closest. | Using a Biased Sample | Select random samples. |
| Overemphasized one person’s positive comments and ignored her negative ones; thought about why the proposal would pass and not why it wouldn’t. | Prior Hypothesis Bias (Information Considered) | Seek out and record confirming and disconfirming evidence. Play devil’s advocate. |
| Recalled multiple instances in which one person expressed a desire that our proposal could fulfill. | Considering Redundant Information | Systematically review your evidence and examine how much of it represents new information. |
| Interpreted department head’s and senior faculty’s comments to mean my proposal would pass. | Prior Hypothesis Bias (Interpreting Information) | Ask the actual question you want answered, note the response, and confirm your interpretation. |

As Figure 1 above indicates, at almost every step I generated biases that I could have avoided by taking some simple precautions. Did I know before the vote on my proposal that I should take these precautions to avoid becoming overconfident of my prediction? Certainly. I had been studying this area for years. So why didn’t I use my knowledge? Perhaps I was overconfident about my expertise on overconfidence! But if I ultimately fell prey to these common traps, couldn’t anyone?

References

A. Tversky and D. Kahneman, “Judgment Under Uncertainty: Heuristics and Biases,” Science, September 1974, pp. 1124-1131.

Mark Simon is an assistant professor of management at Oakland University in Rochester, Michigan. He would like to thank Greg Textley for his helpful com- ments on earlier drafts of the manuscript.

Business Horizons, July-August 2001