Julian Talbot

The Psychology of Risk

“There is no dearth of evidence in everyday life that people apprehend reality in two fundamentally different ways, one variously labelled intuitive, automatic, natural, non-verbal, narrative and experiential, and the other analytical, deliberative, verbal and rational.” - Seymour Epstein[1]

Biases and heuristics change the way we see the world. That is generally fine because, on average, they work in our favour. Heuristics help us interpret what we experience and respond quickly, with minimal effort or thought. Mostly they work for the better, but not always. The fact that they are, almost by definition, unconscious and automatic is both a blessing and a curse. But what are they exactly?

Volumes have been written on the topic, and this article is just a brief overview for risk managers, CEOs, company directors, bus drivers, landscape gardeners, and, well, basically all Homo sapiens.

Our daily experience, backed up by decades of research, shows that we are overloaded with information. Cognitive limitations cause us to use rules of thumb to lighten the burden of mentally processing that information when making judgments and decisions.

TL;DR: Life is complex and we receive too much information every day to process it fully. So we use decision shortcuts.



Heuristic? Or Bias? What is the difference?


Biases and heuristics are rules of thumb that are often helpful in dealing with complexity and ambiguity. Under many circumstances, however, they produce systematic and predictable errors in judgment, known as cognitive biases.

Cognitive biases are mental errors caused by our simplified information processing strategies. It is crucial to distinguish cognitive biases from other forms of bias, such as cultural bias, organizational bias, or bias resulting from self-interest.

A cognitive bias doesn't result from any emotional or intellectual predisposition toward a particular judgment. Rather, it arises from subconscious mental procedures for processing information. A cognitive bias is a mental error that is consistent and predictable.[2]

Amos Tversky and Daniel Kahneman were among the first researchers to identify that decision-makers use 'heuristics' or 'rules of thumb' to arrive at their judgements.[3] Heuristics and cognitive biases are similar but not identical. Nor are there universally agreed definitions of these terms.

Ideally, we would describe a heuristic as any shortcut for decision-making that almost always leads to a correct solution, whereas a cognitive bias almost always produces an error. Unfortunately, the situation is not as simple as that.

Mental Shortcuts

A heuristic is a rule-of-thumb shortcut that uses experience-based techniques for decision-making, problem-solving, learning, and discovery. Heuristics are mental shortcuts used to speed up the process of finding a satisfactory solution where an exhaustive rational analysis is impractical[4] or time is short.

We make these sorts of decisions every day. Deciding, based on non-verbal communication, whether we like someone within 90 seconds of meeting them. Taking an umbrella because it is grey and miserable outside, rather than looking up the forecast for all the locations we will visit. Trying to pick a restaurant for dinner and ultimately choosing the one we are most familiar with.

The advantage of heuristics is that they reduce the time and effort required to make decisions and judgements.

Judgmental shortcuts generally get us where we need to go – and quickly – but at the cost of occasionally sending us off course. It is, for example, easier to guess the outcome of a horse race or sporting competition based on which names you recognize most than to engage in a long and tedious rational analysis. In most cases, rough approximations are sufficient.

A heuristic can, in some instances, be safer and more accurate than a calculation, and the same heuristic can underlie both conscious and unconscious decisions[5]. They are highly economical and usually practical but can lead to systematic and predictable errors.

Broadly speaking, a heuristic is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational but is nevertheless sufficient for reaching an immediate, short-term goal or approximation.[6]

A cognitive bias, by comparison, is a systematic pattern of deviation from norm or rationality in judgment. An individual's construction of reality, not the objective input, may dictate their behavior in the world. Thus, cognitive biases may sometimes lead to perceptual distortion, inaccurate judgment, illogical interpretation, or what is broadly called irrationality. Although it may seem that such misperceptions would be aberrations, biases can help humans find commonalities and shortcuts to navigate everyday situations effectively[7].

There is no special magic to biases and heuristics. They are patterns of decision-making that increase the likelihood of surviving long enough to pass on our DNA.

The short version:

  • Cognitive biases are patterns of thought that produce illogical results which, for various reasons, often still achieve a good (or adequate) outcome.

  • Heuristics are practical approaches to thought that produce best guesses that aren't guaranteed to be correct.
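
To make that trade-off concrete, here is a minimal, purely illustrative sketch in Python (not drawn from the article or any of the cited studies; the site coordinates and function names are invented). It compares a greedy "nearest site first" rule of thumb with an exhaustive analysis of a tiny routing problem: the heuristic is far cheaper and usually lands close to the optimum, but nothing guarantees it.

```python
from itertools import permutations
from math import dist

# Hypothetical site coordinates, e.g. locations a risk assessor must inspect.
sites = [(0, 0), (2, 1), (5, 0), (1, 4), (4, 3)]

def route_length(order):
    """Total distance travelled visiting the sites in the given order."""
    return sum(dist(order[i], order[i + 1]) for i in range(len(order) - 1))

def exhaustive_best(points):
    """'Rational analysis': check every possible visiting order (cost grows factorially)."""
    return min(permutations(points), key=route_length)

def greedy_heuristic(points):
    """Heuristic: always visit the nearest unvisited site next (fast, not guaranteed optimal)."""
    route, remaining = [points[0]], list(points[1:])
    while remaining:
        nearest = min(remaining, key=lambda p: dist(route[-1], p))
        route.append(nearest)
        remaining.remove(nearest)
    return route

print(f"Exhaustive optimum: {route_length(exhaustive_best(sites)):.2f}")
print(f"Greedy heuristic:   {route_length(greedy_heuristic(sites)):.2f}")
```

With five sites the exhaustive search is trivial, but its cost grows factorially with each added site; the greedy rule stays cheap. That is exactly the bargain heuristics offer: speed and frugality, bought at the risk of an occasional poor answer.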


How Many Heuristics Are There?

More than 250 recognized cognitive biases, effects, and heuristics affect the judgement and decision-making of humans. Most biases and effects are socially conditioned, and some of the most common cognitive biases are described below.

  • Optimism bias is a cognitive bias that causes an individual to believe that they will do better than most others engaged in the same activity. This bias has been shown in many studies, and it explains why we think car accidents are more likely to happen to 'other people'. One study[8] asked university students which of 18 positive and 24 negative events (e.g., getting a good job, developing a drinking problem) were more likely to happen to them versus others. On average, students considered themselves 15% more likely than others to experience positive events and 20% less likely to experience negative events. The authors of this book are familiar with this bias, which manifested in our confidence that we could produce this book in less time than it might take other authors. We estimated that it would take a year to produce the book. We started in 2009 and finally published more than ten years later. Without optimism bias, far fewer book projects, or enterprises of any nature, would likely be commenced.

  • Control bias refers to a cognitive bias whereby people are more likely to accept risks if they feel they have some control over them. For example, studies show that most people feel safer driving than flying cross-country. Even though flying is statistically safer, being in control of a car rather than being a passive passenger creates an illusion of greater safety.

  • Negativity bias, also known as the negativity effect, is a cognitive bias that results in adverse events having a more significant impact on our psychological state and processes than neutral or positive events of equal intensity. It is closely related to loss aversion: losses loom larger in our psyche than gains.

  • Confirmation bias is seeking or interpreting evidence in ways that are partial to existing beliefs, expectations, or hypotheses.[9] It results in a tendency to search for, favor, interpret, and recall information in a way that affirms a prior belief, while ignoring, devaluing, or overlooking contrary evidence.

  • Focalism, also known as the focusing illusion, is an example of how cognitive biases can influence mental health. Focalism is the tendency to focus on or emphasize a single factor or piece of information when making judgments or predictions. For example, consider the relationship between dating and overall happiness. When participants were asked two questions in this order ('How happy are you with your life in general?' and then 'How many dates did you have last month?'), the correlation between the answers was not different from zero. When the order of the questions was reversed in another sample of participants, the correlation jumped to 0.66; thinking about that aspect of life led the research participants to associate it more closely with their general happiness.[10] Questions on marriage and physical health have been shown to have virtually identical effects. The quote from Schkade and Kahneman[11], "Nothing in life is quite as important as you think it is while you are thinking about it," summarizes the concept of focalism and its effect on expectations of happiness.

  • Anchoring heuristic refers to the tendency of people to start with one piece of known information and then adjust it to create an estimate of an unknown risk, although the adjustment will usually not be big enough. Anchoring is a cognitive bias that involves relying too heavily on an initial piece of information (the anchor) to make subsequent judgments. Studies show, for example, that if you are considering how much to pay for a house, the asking price will influence your estimate.[12] Even irrelevant information can affect the range of possible outcomes.[13] Anchoring effects have been observed in experiments where the last few digits of a respondent's Social Security number were used as the anchor for tasks such as estimating the number of physicians in their city. In another study, judges with an average of over 15 years on the bench first read a description of a woman who had been caught shoplifting and then rolled a pair of dice. The dice were loaded so that every roll resulted in a total of only three or nine. After rolling the dice, the judges were instructed to specify the exact prison sentence they would give the shoplifter. Those who had rolled a nine said, on average, they would sentence her to 8 months; those who rolled a three said they would sentence her to 5 months.[14]

  • Availability heuristic is the tendency to overestimate the likelihood of events that are easier to recall. Their 'availability' in memory makes them seem more important, and it is influenced by how recent the memories are or how unusual or emotionally charged they may be. Events that can easily be brought to mind or imagined are judged more likely than events that cannot. That's why many people think homicides or acts of terrorism are more dangerous than cancer or diabetes. They are reported so frequently in the media that we unconsciously recall them much more easily. Ironically, they are shown in the media more often than cancer or diabetes precisely because they are rare.

  • Recognition heuristic refers to the inference that, if one of two objects is recognized and the other is not, the recognized object has the higher value with respect to the criterion in question[15] (whatever that may be). Marketing experts, for example, know all too well that the recognition heuristic produces a feeling about which product to trust.

  • Asymmetry between gains and losses, or loss aversion. Most people are risk-seeking when it comes to losses, preferring to take a chance on losing nothing or losing a large amount rather than accepting a certain but small loss. By contrast, most people are risk-averse when it comes to gains and pick a small certain gain rather than take a chance on winning a lot or winning nothing.[16] (A short numerical sketch of this asymmetry follows this list.)

  • Dunning–Kruger effect refers to research showing that incompetent people fail to realize they are incompetent because they lack the skill or self-awareness to distinguish between competence and incompetence. It is related to the cognitive bias of illusory superiority. People who lack competence tend to be overconfident about their abilities. In contrast, experts tend to underestimate their competence and state of knowledge[17]. Generally speaking, true experts in their field understand the limitations of their knowledge and remain open to new information. A tempting (but strictly speaking, incorrect) interpretation of the Dunning–Kruger effect is that "the stronger someone's opinions or beliefs, the less likely they are to be right."

  • Fundamental attribution error is a tendency for people to attribute benefits or costs to something associated with an event, even if it is not causal. It involves a tendency to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior. Considerable research has demonstrated an illusory causation effect in which people who are visually or aurally more noticeable are perceived as more causal of events in social interactions than their less-salient counterparts.[18] Amos Tversky and psychologist Eric Johnson showed that the influence of bad feelings can also extend beyond the thing generating the feelings. They asked Stanford University students to read one of three versions of a story about a tragic death, the cause being either leukemia, fire, or murder, which contained no information about how common such tragedies are. They then gave the students a list of risks – including the risk in the story and 12 others – and asked them to estimate how often each one kills. As we might expect, those who read a tragic story about a death caused by leukemia rated leukemia as more lethal than did a control group of students who didn't read the story. The same was true for fire and murder. More surprisingly, reading the stories led to increased estimates for all the risks, not just the one portrayed. The fire story caused an overall increase in perceived risk of 14 per cent. The leukemia story raised estimates by 73 per cent. However, the murder story led the pack, raising risk estimates by a whopping 144 per cent. A 'good news' story had precisely the opposite effect – driving down perceived risks across the board.[19]

  • Affect heuristic refers to an automatic affective valuation that is the basis for many judgments and behaviors. Simply put, the affect heuristic says that an overall good feeling toward a situation leads to lower risk perception, and a general bad feeling leads to higher risk perception. This explains why people tend to underestimate risks for actions that also have some ancillary benefit (e.g., smoking, skydiving). In one experiment,[20] subjects were shown either a happy face, frowning face, or neutral face, followed by a random Chinese ideograph. Subjects tended to prefer ideographs they saw after the happy face, even though the face was visible for only ten milliseconds, and they had no conscious memory of seeing it.

  • Durability bias is the subconscious inclination to forecast future events based on past events. It involves the assumption that past trends will continue into the future and plays a pernicious role in risk management failures. See also: Gambler's fallacy, Hot-hand fallacy, and Icarus Paradox.

  • Gambler's fallacy is the tendency to believe that future probabilities are altered by past events when, in reality, they are unchanged. For example, 'The roulette wheel has produced an even number for the last ten spins. Therefore it is more likely to produce an odd number on the next spin.'

  • Hot-hand fallacy refers to the belief that a person who has experienced success has a greater chance of further success in the future. Sometimes also known as the 'I can do no wrong' bias. See also Icarus Paradox.

  • Hindsight bias is the tendency to see past events as being predictable at the time those events happened. Colloquially referred to as '20/20 Hindsight'.

  • Primacy effect and recency effect. Items near the end of a list are the easiest to recall (recency effect), followed by the items at the beginning of a list (primacy effect). They are therefore typically perceived as more significant than information in the middle of a list.

  • Sunk cost effect is the reluctance to pull out of a project when significant resources have been invested, even if the additional resources could be better used elsewhere. The reluctance to accept a loss or recognize a mistake can result in further losses.
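
The gains-versus-losses asymmetry noted above (loss aversion) can be made concrete with the value function from prospect theory (references [13] and [16]). The sketch below uses the median parameter estimates Tversky and Kahneman reported in their 1992 paper; the exact figures vary between studies, so treat it as an illustration of the shape of the asymmetry rather than a precise model.

```python
# A minimal sketch of loss aversion using the prospect theory value function.
# Parameter values (alpha = beta = 0.88, lambda = 2.25) follow the median
# estimates in Tversky and Kahneman (1992), reference [13]; treat them as
# illustrative, not as settled constants.

ALPHA = 0.88   # curvature for gains
BETA = 0.88    # curvature for losses
LAMBDA = 2.25  # loss-aversion coefficient: losses weigh ~2.25x as much as gains

def subjective_value(outcome: float) -> float:
    """Perceived value of a gain or loss under the prospect theory value function."""
    if outcome >= 0:
        return outcome ** ALPHA
    return -LAMBDA * ((-outcome) ** BETA)

for amount in (100, -100):
    print(f"Outcome {amount:+}: subjective value {subjective_value(amount):+.1f}")
```

Running it shows a $100 loss carrying roughly 2.25 times the subjective weight of a $100 gain – the 'losses loom larger' effect in numbers.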

In risk management terms, heuristics are mental shortcuts that allow us to make decisions more quickly, frugally, and accurately than if we considered additional information. They are a largely unconscious approach to problem-solving that takes individual experience into account and then seeks to optimize a decision based on limited information. Heuristics reduce the mental effort of retrieving and storing information in memory and speed up decision-making by reducing the amount of information necessary to make a choice or pass judgment. Although heuristics can speed up our problem-solving and decision-making, they can introduce errors and biased judgements.[21]



Cognitive biases represent predispositions to favor a given conclusion over other conclusions. This can affect any element of the risk management process, including risk identification, analysis, or treatment. Cognitive biases can be considered the leanings, priorities, and inclinations that influence our decisions to produce a pattern of deviation in judgement. These usually come from social or environmental factors. Risk culture is critical in this regard, as individuals create their own 'subjective social reality' from perceptions based on their interactions and engagement with others in groups and organizations.[22]

These biases influence individual decision-making but also impact even the largest organizations. For almost 100 years, Kodak led the photographic industry. They invented the digital camera in 1975 but then stayed focused on the analog (film) photography business. Confirmation bias and group-think undoubtedly played a part in Kodak's leadership decision to discount the potential threat of digital photography. Kodak had 145,000 staff, and the business was optimized for analog photography, with massive chemical installations to develop the films. This investment created a sunk cost bias. The leadership team exhibited optimism bias in their ability to compete against digital. Once the Apple of its day, famously owning 90% of the market for film and cameras, Kodak was bankrupt in 2012.

Another example of cognitive biases in risk management can be seen in what is perhaps the most analytical of all industries, hedge fund management. The billionaire hedge-fund manager Bill Ackman offers an illustration. Ackman and his partners operated a hedge fund known as Pershing Square which, for roughly a decade after its founding in 2004, consistently outperformed all but a few of its peers. Then, between 2013 and 2018, they lost over $4 billion in one investment (Valeant Pharmaceuticals) and another $1 billion in a short position (Herbalife). The fund lost 20.5 per cent in 2015, 13.5 per cent in 2016, and 4 per cent in 2017 as these positions unwound. A host of cognitive biases, including post-purchase rationalization, confirmation bias, optimism bias, and selective perception, played a role. In the words of James Rickards:

‘None of the geniuses at Pershing Square suddenly became dumb. Still, they did succumb to behavioral biases; indeed, the strength of their bias was amplified by prior success, a kind of “I can do no wrong” bias.’[23]
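
Taking the annual figures quoted above at face value, a quick compounding exercise shows how those unwinding years stack up (a minimal sketch only; published returns vary slightly by share class and fee structure):

```python
# Rough compounding of the annual returns quoted above (illustrative only).
annual_returns = {2015: -0.205, 2016: -0.135, 2017: -0.04}

value = 1.0
for year, r in sorted(annual_returns.items()):
    value *= 1 + r
    print(f"End of {year}: {value:.3f} of starting capital")

print(f"Cumulative decline: {1 - value:.1%}")
```

Three consecutive down years of 20.5, 13.5, and 4 per cent compound to a cumulative decline of roughly a third of the fund's starting value.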

We are all subject to cognitive biases and heuristics, which influence our risk management. Even artificial intelligence and machine learning employ heuristics in decision-making, and these systems are not immune from developing the machine equivalent of cognitive biases.


Heuristics and biases deserve a book of their own, and there are many excellent ones from which to choose. For a quick overview of this topic, consider Eliezer Yudkowsky's chapter from Global Catastrophic Risks[24], 'Cognitive Biases Potentially Affecting Judgment of Global Risks', which is available online.[25]



Understanding how heuristics and biases work can give us better insight into our risk management influences and (in theory) lead to better problem-solving and decision-making. Designing risk management frameworks and decision tools that recognize the potential for human (and machine) cognitive biases is a critical and evolving area of risk management.

Hopefully, this article has at least given you some useful insight into how we humans sometimes make bad decisions – and why.

If you're now mired in self-doubt, that is pretty normal when first exposed to these ideas. Perhaps you will appreciate this article, which offers four key strategies to minimise the negative effects of heuristics and biases.


 

Another article on one of my other websites that you might find interesting talks about how human factors influence root cause analysis. It is an excerpt from a book that taught me how and why optimism bias is an author's second-best friend (second only to my incredibly supportive wife).



 

[1] Epstein, S. "Integration of the Cognitive and the Psychodynamic Unconscious." American Psychologist 49, no. 8 (1994): 709–24.
[2] Heuer, Richards J., and Center for the Study of Intelligence (U.S.). Psychology of Intelligence Analysis, 2019.
[3] Tversky, A., and D. Kahneman. "Judgment under Uncertainty: Heuristics and Biases." Science 185, no. 4157 (September 27, 1974): 1124–31. https://doi.org/10.1126/science.185.4157.1124.
[4] Long, Robert Douglas. Real Risk: Human Discerning and Risk, 2014.
[5] Gigerenzer, Gerd. Risk Savvy: How to Make Good Decisions, 2015. http://search.ebscohost.com/login.aspx?direct=true&scope=site&db=nlebk&db=nlabk&AN=1124783.
[6] "Heuristic." In Wikipedia, December 3, 2020. https://en.wikipedia.org/w/index.php?title=Heuristic&oldid=992176642.
[7] "Cognitive Bias." In Wikipedia, November 29, 2020. https://en.wikipedia.org/w/index.php?title=Cognitive_bias&oldid=991293844.
[8] Weinstein, Neil D. "Unrealistic Optimism About Future Life Events." Journal of Personality and Social Psychology 39 (1980): 806–20. Accessed June 17, 2022. https://www.scribd.com/document/47349310/Weinstein-Unrealistic-optimism-JPSP-1980.
[9] Nickerson, Raymond S. "Confirmation Bias: A Ubiquitous Phenomenon in Many Guises." Review of General Psychology 2, no. 2 (1998): 175–220. https://journals.sagepub.com/doi/10.1037/1089-2680.2.2.175.
[10] Strack, Fritz, Leonard L. Martin, and Norbert Schwarz. "Priming and Communication: Social Determinants of Information Use in Judgments of Life Satisfaction." European Journal of Social Psychology 18, no. 5 (1988): 429–42. https://doi.org/10.1002/ejsp.2420180505.
[11] Schkade, David A., and Daniel Kahneman. "Does Living in California Make People Happy? A Focusing Illusion in Judgments of Life Satisfaction." Psychological Science 9, no. 5 (September 1998): 340–46. https://doi.org/10.1111/1467-9280.00066.
[12] Kahneman, Daniel. Thinking, Fast and Slow. London: Penguin Books, 2012.
[13] Tversky, Amos, and Daniel Kahneman. "Advances in Prospect Theory: Cumulative Representation of Uncertainty." Journal of Risk and Uncertainty 5, no. 4 (October 1, 1992): 297–323. https://doi.org/10.1007/BF00122574.
[14] Englich, Birte, Thomas Mussweiler, and Fritz Strack. "Playing Dice With Criminal Sentences: The Influence of Irrelevant Anchors on Experts' Judicial Decision Making." Personality and Social Psychology Bulletin 32, no. 2 (2006): 188–200.
[15] Goldstein, Daniel G., and Gerd Gigerenzer. "Models of Ecological Rationality: The Recognition Heuristic." Psychological Review 109, no. 1 (2002): 75–90. https://doi.org/10.1037/0033-295X.109.1.75.
[16] Kahneman, Daniel, and Amos Tversky. "Prospect Theory: An Analysis of Decision Under Risk." SSRN Scholarly Paper. Rochester, NY: Social Science Research Network, 1979. https://papers.ssrn.com/abstract=1505880.
[17] Kruger, Justin, and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments." Journal of Personality and Social Psychology 77, no. 6 (1999): 1121–34. https://doi.apa.org/doiLanding?doi=10.1037%2F0022-3514.77.6.1121.
[18] Robinson, Janet, and Leslie Zebrowitz McArthur. "Impact of Salient Vocal Qualities on Causal Attribution for a Speaker's Behavior." Journal of Personality and Social Psychology 43 (1982): 236–47.
[19] Johnson, Eric, and Amos Tversky. "Affect, Generalization, and the Perception of Risk." Journal of Personality and Social Psychology 45 (July 1, 1983): 20–31. https://doi.org/10.1037/0022-3514.45.1.20.
[20] Winkielman, Piotr, Robert B. Zajonc, and Norbert Schwarz. "Subliminal Affective Priming Resists Attributional Interventions." Cognition & Emotion 11, no. 4 (August 1997): 433–65. https://doi.org/10.1080/026999397379872.
[21] Dale, Steve. "Heuristics and Biases: The Science of Decision-Making." Business Information Review 32, no. 2 (June 1, 2015): 93–99. https://doi.org/10.1177/0266382115592536.
[22] Long, Robert Douglas. Real Risk: Human Discerning and Risk, 2014.
[23] Rickards, James. Aftermath: Seven Secrets of Wealth Preservation in the Coming Chaos. New York: Portfolio/Penguin, 2019.
[24] Bostrom, Nick, ed. Global Catastrophic Risks. Repr. Oxford: Oxford University Press, 2012.
[25] Yudkowsky, Eliezer. "Cognitive Biases Potentially Affecting Judgment of Global Risks." https://intelligence.org/files/CognitiveBiases.pdf.
