There are three main sources of overconfidence: the confirmation bias, the hindsight bias, and the inside perspective. When people are presented with a question they need to answer, they often seek out evidence that supports their own proposition rather than information that would contradict it (Textbook, p. 124). This tendency is most commonly known as the confirmation bias. Jennifer Crocker conducted a study on how confirmation bias affects our day-to-day conclusions about specific questions. She asked one group of participants to determine whether working out the day before an important tennis match made a player more likely to win, and another group to determine whether working out the day before made a player more likely to lose (Crocker, 1982). Each group could collect only four types of information before reaching a conclusion: the number of players in a sample who worked out the day before and won their match, the number who worked out and lost, the number who did not work out and won, and the number who did not work out and lost. When the results were recorded, Crocker noticed a striking similarity between the two groups: participants were especially interested in examining information that could potentially confirm the proposition they were investigating. Those trying to find out whether practicing leads to winning were more interested in the number of players who practiced and won than were those trying to find out whether practicing leads to losing, and vice versa. What these participants failed to consider is that evidence consistent with a proposition is not enough to draw a firm conclusion, as there might be even more evidence against it. To truly test a proposition, we must seek out the evidence against it as well as the evidence for it.
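Why confirming cases alone cannot settle the question can be made concrete with a small worked example. The numbers below are hypothetical, invented purely for illustration; they are not data from Crocker's study:

```python
# Hypothetical 2x2 table of tennis outcomes (illustrative numbers only):
#                        won   lost
# practiced day before    80     20
# did not practice        90     10

practiced_won, practiced_lost = 80, 20
rested_won, rested_lost = 90, 10

# Fixating on the single "practiced and won" cell (80 players) seems to
# confirm that practicing helps -- the confirmation-bias trap.
win_rate_practiced = practiced_won / (practiced_won + practiced_lost)
win_rate_rested = rested_won / (rested_won + rested_lost)

# Comparing both rows tells the opposite story: in this made-up sample,
# players who rested actually won more often (0.9 vs 0.8).
print(win_rate_practiced)  # 0.8
print(win_rate_rested)     # 0.9
```

The point of the sketch is that a conclusion requires all four cells: a large count of confirming cases is meaningless until it is compared against the disconfirming rows.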
The second source, and perhaps one of the most common biases in our day-to-day lives, is the hindsight bias. Hindsight bias is the tendency to believe, after learning an outcome, that you could have predicted it, when in fact you could not have predicted it accurately without being given the answer. Remember the story from the introduction? Say, for example, that once your teacher begins going over your answers, she says that the correct answer to #14 was D. Not surprisingly, a particular group of people will shout out, “Aw, man, I knew it!” What you see in play here is the hindsight bias. Once we hear some new fact, we can think of reasons why it might be expected to be true. That is a useful thing to do, except that dredging up those reasons often leaves us with the feeling that we could have predicted the outcome. In studies, people kept in ignorance of an outcome commonly make incorrect predictions, while those told the outcome are confident they could have predicted it correctly (Bradfield & Wells, 2005; Fischhoff, Gonzalez, Lerner, & Small, 2005; Guilbault, Bryant, Brockway, & Posavac, 2004).
The last source of overconfidence is known as the inside perspective. The inside perspective involves two related ideas, the pre-mortem and the planning fallacy, and the two often go hand in hand. Oftentimes, we look back on projects and essays that have gone horribly wrong and ask ourselves, “How did this happen?” or “What did I do wrong?” This is referred to as a post-mortem: figuring out the reasons why a specific thing went wrong. What you should have done instead is hold a pre-mortem. In simple terms, in a pre-mortem you think about what you want to happen and imagine that your efforts have failed; next, you figure out what you would have to do to prevent that outcome. This way, you can look ahead at challenges that could cause everything to fail and create a plan to avoid them (http://www.riskology.co/pre-mortem-technique/). Closely related to the pre-mortem is the planning fallacy: we typically estimate that projects will be completed sooner than they actually are, even when we are aware of past efforts that took much longer than originally planned (Textbook, p. 146). To shed light on the planning fallacy, Roger Buehler, Dale Griffin, and Michael Ross (1994) conducted a number of studies of people’s estimates of completion times. In one study, they asked students in an honors program to predict as accurately as possible when they would turn in their theses. The students also estimated what the completion date would be “if everything went as poorly as it possibly could.” Less than a third of the students finished by the time they had estimated. More remarkably, fewer than half finished by the time they had estimated for the worst-case scenario.
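The pattern Buehler and colleagues observed can be sketched with a few lines of arithmetic. The estimate/actual pairs below are hypothetical numbers invented for illustration only; they are not data from the actual study:

```python
# Hypothetical thesis-completion data (illustrative only): each pair is
# (estimated days to finish, actual days taken) for one student.
estimates_vs_actuals = [
    (30, 55), (45, 44), (25, 40), (60, 58), (35, 70),
    (40, 90), (50, 49), (20, 45), (55, 80), (30, 65),
]

# Count how many students actually finished within their own estimate.
finished_on_time = sum(
    1 for estimated, actual in estimates_vs_actuals if actual <= estimated
)
fraction_on_time = finished_on_time / len(estimates_vs_actuals)

# In this made-up sample, only 3 of 10 students met their own prediction,
# echoing the "less than a third" pattern the essay describes.
print(fraction_on_time)  # 0.3
```

The design point is simply that a forecast should be checked against a base rate of past outcomes, not against the optimistic inside view alone.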
If we were to carefully examine our daily actions as amateur psychologists, we would see that most of us carry this sense of overconfidence when planning ahead and making decisions. Overconfidence can lead to faulty judgments and can even cause us to make major mistakes. The best way to prevent this is to attack the root of the problem: guarding against the biases described above. While it may seem like an impossible task, it is certainly possible to minimize your reliance on these biases and approach situations from a more careful, logical perspective.