The ability to make good decisions is particularly important in the context of business, as decisions affect an organisation’s resources, both financial and human, and its sustainability over the long term. Furthermore, as Campbell, Whitehead and Finkelstein (2009) affirm, decision-making is an integral part of our personal and professional lives. The dichotomy between Homo economicus and Homo sapiens highlighted by Tversky and Kahneman (1974) suggests that, especially in economic and business decisions, we often assume people to be rational actors who properly weigh costs against benefits (utility weighted by probability). This dichotomy highlights the issue of rationality versus irrationality as a result of cognitive bias. Aronson (1973) posits that although individuals view themselves as rational beings, in truth they are rationalizing, often seeking to appear reasonable to themselves and others. Individuals seek reassurance that they made the right choice by gathering information that supports the choice or by changing their thoughts about it.
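The rational-actor model mentioned above scores each option by its expected utility, the probability-weighted sum of its outcomes. The following minimal sketch illustrates that calculation; the option names and payoffs are invented for illustration and do not come from the sources cited.

```python
# Hedged sketch of expected utility: score an option by summing
# probability * utility over its possible outcomes.

def expected_utility(outcomes):
    """outcomes: list of (probability, utility) pairs."""
    return sum(p * u for p, u in outcomes)

# A risky venture versus a safe one (hypothetical payoffs):
risky = [(0.3, 100.0), (0.7, -20.0)]   # 30% chance of a big win, 70% of a small loss
safe = [(1.0, 10.0)]                   # a certain modest gain

# The rational actor of the model would prefer the risky option here,
# since 0.3 * 100 - 0.7 * 20 = 16 exceeds the certain 10.
assert abs(expected_utility(risky) - 16.0) < 1e-9
assert abs(expected_utility(safe) - 10.0) < 1e-9
```

The heuristics discussed below describe how actual judgments depart from this idealised weighing.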
Decision-making is generally framed by personal characteristics, norms and habits, and is defined by the various acts and options from which we must choose, the possible consequences and outcomes of these options, and the contingencies and probabilities associated with particular choices (Tversky & Kahneman 1981). In situations of uncertainty, we tend to make decisions based on previous experiences and patterns. While it may be instinctive for a few lucky individuals, making good choices can be a difficult and grueling process for many (Mellers & Locke 2007). In the decision-making process, we often have to process large amounts of information arriving from all directions. As a result, we often depend on mental shortcuts, or heuristics, to assist us in the process (Locke 2015). A heuristic is a practical problem-solving shortcut that may not be rationally optimal. We use heuristics to simplify decisions in uncertain situations. Heuristics may also help us to reduce the complex task of assessing the probabilities involved in our decision-making, but they are prone to cognitive biases and may sometimes lead to significant errors (Tversky & Kahneman 1974).
Individuals may be viewed as ‘maximizers’ or ‘satisficers’ in their choice-making strategies (Iyengar, Wells & Schwartz 2006). Maximizers tend to explore a range of possibilities in their quest for the best option, while satisficers often seek options that are ‘good enough’ and as close as possible to their threshold of acceptability (Iyengar et al. 2006). As we rationalize in making our decisions, we often rely on pattern recognition and emotional tagging (Campbell et al. 2009). When faced with new situations, we may integrate information from various parts of the brain in a process of pattern recognition, thereby making assumptions based on our past judgments and experiences. We may also use emotional tagging, thereby basing our actions or choices on the emotional information stored in our memories.
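The two search strategies described above can be sketched as procedures. In this illustrative sketch (the option data and function names are invented, not drawn from Iyengar et al.), a maximizer evaluates every option before choosing, whereas a satisficer stops at the first option that clears an acceptability threshold.

```python
# Hedged sketch: maximizing evaluates every option; satisficing stops
# at the first option that is 'good enough'.

def maximize(options, score):
    # Inspect all options and return the highest-scoring one.
    return max(options, key=score)

def satisfice(options, score, threshold):
    # Return the first option whose score clears the threshold;
    # fall back to the last option examined if none does.
    chosen = None
    for option in options:
        chosen = option
        if score(option) >= threshold:
            break
    return chosen

offers = [("A", 6), ("B", 9), ("C", 7)]   # hypothetical (name, score) pairs
score = lambda offer: offer[1]

assert maximize(offers, score) == ("B", 9)       # examined every offer
assert satisfice(offers, score, 5) == ("A", 6)   # stopped at the first acceptable one
```

The contrast makes the trade-off concrete: the maximizer pays a search cost proportional to the number of options, while the satisficer's result depends on the order in which options happen to be encountered.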
Several examples of flawed decision-making by highly qualified executives attest to the use of these strategies and demonstrate that although these processes may be useful for making quick decisions, they are often distorted by misleading memories, emotional attachments or self-interest (Campbell et al. 2009). For example, Jürgen Schrempp, as CEO of Daimler-Benz, spearheaded the doomed merger between Chrysler and Daimler. A strong motivating factor in the deal was reportedly Schrempp’s personal drive to get it done, rooted in his personal beliefs. There is also the case of Matthew Broderick, a retired Marine brigadier general with 30 years’ experience, who, in the wake of Hurricane Katrina, crafted a situation report indicating that the levees in New Orleans had not been breached. He had discounted the reports he received because prior experience had taught him that early reports on a major event are often false. Subsequent investigative reports indicated multiple levee breaches after Katrina made landfall (Campbell et al. 2009). These examples demonstrate how pattern recognition and emotional tagging may mislead us into erroneously thinking that we understand the complexities of the problems we face (Campbell et al. 2009).
Decision-making, as a cognitive process, is subject to the influence of bias. To mitigate the effect of bias, Mellers and Locke (2007) recommend that decision-makers try to approach a problem from a variety of perspectives. Since we seldom know all that is required to resolve a problem, there is a tendency to focus on evidence that confirms our own beliefs and hypotheses about the problem (Mellers & Locke 2007). Confirmation bias tends to be self-serving and makes it difficult to view one’s accomplishments and efforts objectively. It results in over-confidence and is also closely related to the ‘above average effect’: the perception that one is better than others with regard to desirable attributes such as honesty, intelligence, and managerial skill (Mellers & Locke 2007). Such self-serving biases can negatively influence perceptions of fairness in the business context.
According to Tversky and Kahneman (1974), three heuristic principles underlie many of our cognitive biases, namely representativeness, availability, and anchoring and adjustment. These three heuristic principles have significant implications for decision-making in the sphere of business. Representativeness may be defined as the tendency to assume commonality between objects which appear to be similar. Using this approach in formulating judgments of probability can result in errors because several factors are not taken into consideration. For instance, judgments based on representativeness are insensitive to the prior probability (or base-rate frequency) of outcomes; insensitive to sample size, which can lead to the selection of inadequate samples and the overinterpretation of research findings; and subject to misconceptions of chance, since they overlook the fact that deviations from chance are usually not corrected but merely diluted (Tversky & Kahneman 1974). Similarly, insensitivity to predictability, the illusion of validity, and misconceptions of regression are not factored into judgments based on representativeness (Tversky & Kahneman 1974).
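Insensitivity to prior probability can be made concrete with Bayes’ rule, which shows how much the base rate should constrain a judgment. The numbers below are invented for illustration: suppose a personality description seems twice as likely to fit an engineer as a lawyer (a likelihood ratio of 2), but only 30% of the group in question are engineers.

```python
# Hedged sketch of base-rate sensitivity via Bayes' rule in odds form.
# All figures are illustrative, not taken from Tversky & Kahneman (1974).

def posterior(prior, likelihood_ratio):
    """P(engineer | description), given the base rate P(engineer) and
    the ratio P(description | engineer) / P(description | lawyer)."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

p = posterior(0.30, 2.0)
# Prior odds 3:7 times the likelihood ratio 2 give posterior odds 6:7,
# i.e. a probability of 6/13, roughly 0.46.
assert abs(p - 6 / 13) < 1e-9
```

A judgment driven purely by representativeness would attend only to the likelihood ratio and estimate something near two-thirds, ignoring the 30% base rate entirely.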
Tversky and Kahneman (1974) explain the availability heuristic as the tendency to base decisions on information which is readily available in one’s memory. This principle helps individuals to estimate the likelihood of an occurrence, the frequency of co-occurrences, and the retrievability of instances (Tversky & Kahneman 1974). From our life experiences, we have learned that likely occurrences are more easily imagined than unlikely occurrences. Similarly, associative connections between events tend to be strengthened when events co-occur with some frequency (Tversky & Kahneman 1974). Availability can lead to biases arising from the retrievability of information on the basis of familiarity, salience, imaginability, or illusory correlation (i.e., the frequency with which two events tend to co-occur), or to biases due to the effectiveness of a search set (Tversky & Kahneman 1974). For example, imagining the various difficulties a given business could encounter is one means of evaluating its probability of failure.
The adjustment and anchoring heuristic is the tendency to anchor on an initial value and then fail to adjust adequately away from it. In anchoring, the initial value may arise from the formulation of the problem or may result from a partial computation (Tversky & Kahneman 1974). Adjustments are typically insufficient, so different starting points yield different estimates, each biased toward its starting value. These biases can lead to unwarranted optimism concerning the success of a project or to underestimates of the probability of failure (Tversky & Kahneman 1974). In developing a new product, for instance, a series of events may need to occur. Even if each event is individually likely, the overall probability of success for the project may be quite low if the number of events is sufficiently large. Lastly, there is the ‘bandwagon effect’, the bias that propels people to enter an industry because ‘everyone else is’ (e.g. cryptocurrency and blockchain, which currently exhibit a strong bandwagon effect). This bias relates to social proof, one of Cialdini’s (2007) six principles of persuasion, as in both cases people are persuaded of something while being relatively uninformed about the situation.
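The conjunctive-events point above is easy to verify with arithmetic. Assuming the steps of a project succeed or fail independently (an illustrative simplification), the overall success probability is the product of the per-step probabilities, and it shrinks quickly as steps accumulate:

```python
# Worked example: ten steps, each 90% likely to succeed, assumed
# independent, yield an overall success probability of about 35%.
import math

def overall_success(p_step, n_steps):
    # Probability that every one of n independent steps succeeds.
    return p_step ** n_steps

p = overall_success(0.90, 10)
assert math.isclose(p, 0.9 ** 10)
assert p < 0.35   # barely one chance in three, despite 'likely' steps
```

An anchoring-driven judgment that starts from the 90% per-step figure and adjusts downward only slightly will badly overestimate the project’s prospects.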
There are several examples which demonstrate that past experience is ignored in planning and scheduling. Some professionals, such as scientists and writers, tend to underestimate the time required to complete a project despite past experiences in which projects failed to be completed on schedule and deadlines were missed (Kahneman & Tversky 1977). The term ‘planning fallacy’ describes the tendency to neglect distributional data and to adopt an ‘internal approach’ to predicting outcomes, focusing on the constituents of the problem rather than on the distribution of outcomes in similar cases. This approach is likely to produce underestimation, and the bias may be countered by adopting an ‘external approach’, which treats the problem as one of several similar cases, without focusing on the uniqueness of the case and the specifics that could lead to failure (Kahneman & Tversky 1977).
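The ‘external approach’ can be sketched as a simple calculation: rather than reasoning from the internals of one project, scale the current estimate by the schedule-overrun ratios observed in comparable past projects. The historical ratios and the 10-week estimate below are invented for illustration; the sources cited do not prescribe this particular procedure.

```python
# Hedged sketch of reference-class ('external') forecasting: apply the
# median overrun ratio of similar past projects to the current estimate.

def external_forecast(internal_estimate_weeks, past_overrun_ratios):
    ratios = sorted(past_overrun_ratios)
    n = len(ratios)
    # Median of the reference class's actual/planned duration ratios.
    median = (ratios[n // 2] if n % 2 == 1
              else (ratios[n // 2 - 1] + ratios[n // 2]) / 2)
    return internal_estimate_weeks * median

# Similar past projects took 1.2x to 1.8x their planned time:
history = [1.2, 1.5, 1.4, 1.8, 1.6]
forecast = external_forecast(10, history)   # internal estimate: 10 weeks
assert forecast == 15.0   # the reference class suggests 15 weeks, not 10
```

The internal estimate survives as an input, but the distribution of similar past cases, not the specifics of this one, does the predictive work.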
As Hawkins and Hastie (1990) observe, past events may often appear simple, comprehensible, and predictable when compared with future events. This can lead to hindsight bias: the tendency for individuals who have outcome knowledge to claim that, in hindsight, they would have estimated a higher probability of occurrence for a given outcome than they would have estimated in foresight (i.e., without the outcome information) (Hawkins & Hastie 1990; Arkes et al. 1988). This ‘knew it all along’ bias seems likely to occur when the event in focus has well-defined alternative outcomes, has emotional or moral significance, or is subject to imaginative consideration before its outcome is made known (Hawkins & Hastie 1990). Interestingly, the three mechanisms deemed to constitute the primary sources of hindsight bias, namely selective memory, evidence evaluation, and model change through evidence integration, are also critical to adaptive learning and to the development of the proficient judgment processes needed to overcome bias (Hawkins & Hastie 1990).
Much of the mental work in the decision-making process is unconscious, which makes it difficult to perform the necessary data and logic checks when making choices (Campbell et al. 2009). We often make hasty decisions without considering alternatives, and fail to revisit our initial framing of a situation. As a result, the classical model of decision-making is not followed: we do not lay out and examine the options, define our objectives, and assess each option against those objectives. Instead, we tend to analyse the situation using heuristics, such as pattern recognition, to make our decision or formulate a course of action (Campbell et al. 2009).
Managers, as decision-makers, must have an awareness of bias, as well as the motivation to pay attention and avoid stereotypes (Locke 2015). However, given the unconscious nature of decision-making and the difficulty of establishing the requisite checks and balances to counteract bias, there is a need for techniques that help in this regard. Scholars have identified a number of debiasing techniques. These include ensuring that there is adequate time to think through the available options, making blind judgments, using computer algorithms where possible, and collective decision-making. Keeping a tally of the proportion of events that actually occur among those to which a probability was assigned, and attempting to make probability judgments compatible with subject-matter expertise, are also useful practices (Tversky & Kahneman 1974). Undoubtedly, a better understanding of heuristics and their associated biases can go a long way in helping to improve managerial judgments and decisions, particularly decisions characterised by uncertainty (Tversky & Kahneman 1974).
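The ‘keep a tally’ practice mentioned above amounts to a calibration check: group past probability judgments into bins and compare each bin’s stated confidence with the fraction of events that actually occurred. The sketch below shows one simple way to do this; the sample judgments are invented for illustration.

```python
# Hedged sketch of a calibration tally for probability judgments.
from collections import defaultdict

def calibration_table(judgments):
    """judgments: list of (stated_probability, occurred: bool).
    Returns {binned probability: observed frequency}."""
    bins = defaultdict(lambda: [0, 0])    # bin -> [occurred, total]
    for p, occurred in judgments:
        key = round(p, 1)                 # bin to the nearest 0.1
        bins[key][0] += int(occurred)
        bins[key][1] += 1
    return {k: hit / total for k, (hit, total) in sorted(bins.items())}

judgments = [(0.9, True), (0.9, True), (0.9, False), (0.9, False),
             (0.5, True), (0.5, False)]
table = calibration_table(judgments)
assert table[0.9] == 0.5   # '90% sure' events occurred only half the time
assert table[0.5] == 0.5   # the 50% judgments were well calibrated
```

A persistent gap between a bin’s stated confidence and its observed frequency, as in the 0.9 bin here, is exactly the over-confidence signal the tally is meant to expose.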
In summary, cognitive judgments may be biased because heuristics and implicit theories are often used in decision-making. The examples provided highlight the importance of good decision-making in business. The decisions made by managers have significant implications for the efficient and effective management of human and financial resources, and for the success and profitability of the business. Managerial decisions need to be characterized by good judgment and to demonstrate fairness. Managers, therefore, need to be aware of their biases to become good decision-makers. With a better understanding of how the brain works, it will be easier to anticipate the circumstances in which certain errors in judgment are likely to occur, and to put the requisite safeguards in place to avoid them (Campbell et al. 2009). In this regard, it would be useful for managers to develop an understanding of the decision-making process, along with techniques for avoiding the pitfalls that bias may present when they are confronted with the need to make quick, effective decisions.