Physicians are typically viewed as individuals with the proven capacity to analyze information objectively, solve problems, and make decisions. However, that is not always the case. The rate of diagnostic error in clinical decisions is estimated at 10% to 15%.1 These errors stem not only from gaps in knowledge but also from problems with critical thinking and clinical reasoning known as cognitive biases.
DUAL PROCESSING IN DECISION-MAKING
Recent studies in cognitive science show that two information-processing systems are involved in medical decision-making (as well as in other types of decision-making). These systems—known as System 1 and System 2—are completely different but complementary. Each involves different cortical mechanisms with associated neuroanatomic and neurophysiologic substrates.2,3
System 1. This system is fast, autonomous, reflexive, intuitive, unconscious, and economical—it demands little energy from the brain. System 1 is a rapid way to reach solutions when information is incomplete. Generally speaking, this system gets it right, but it does make mistakes.
System 2. This system is the exact opposite of System 1. It is conscious, deliberate, analytical, deductive, logical, scientific, and rational. It is more accurate and reliable than System 1, but it is slower and more demanding and requires many resources.
Neither of these systems is better than the other; in all contexts, they are complementary. In fact, the best physicians are those who skillfully combine both to process information.
COGNITIVE BIASES AND HEURISTICS
Traditionally, it was believed that human beings made decisions rationally. After Amos Tversky, PhD, and Daniel Kahneman, PhD, published their work in 1974,4 it became evident that most of our important decisions rest on a limited number of heuristic principles rather than on formal analysis of the problem.
Cognitive biases and heuristics are patterns of decision-making that deviate from logical reasoning. They are mental processes that enable us to solve problems rapidly: mental shortcuts consisting mostly of intuitive judgments based on partial knowledge, previous experience, and suppositions. Each cognitive bias serves a function, mainly related to saving time and energy. Below is a brief overview of six of these patterns.5
Anchoring. When anchoring, a physician relies too heavily, or anchors, on the main traits or pieces of information acquired in the patient’s first visit. If the physician does not adjust this first impression in light of new information, the result may be a premature diagnosis.
Availability. Some physicians may tend to overestimate the likelihood of diagnoses with greater availability in their memory. A recent experience with an illness may increase the probability of diagnosing it again. Conversely, if an illness has not been diagnosed for a long time (making that diagnosis less available), it may be underdiagnosed.
Confirmation Bias. Confirmation bias refers to the inclination to look for signs, symptoms, and evidence to confirm an initial diagnosis and to ignore anything that contradicts it.
Omission Bias. Some physicians have a tendency toward inaction that is strongly linked to the “do no harm” principle. Events occurring as part of the natural course of an illness are more acceptable than those that could be directly attributed to the doctor’s action.
Outcome Bias. With outcome bias, a physician is more likely to diagnose illnesses with favorable prognoses before those with unfavorable ones, thus avoiding the displeasure associated with the latter.
Overconfidence Bias. Overconfidence bias is the belief that one knows more than one actually does. It reflects a tendency to act on the basis of incomplete information, intuitions, or hunches, and it is a variation of the Dunning-Kruger effect.
Given that it is human nature to be influenced by such biases, it may seem that little to nothing can be done to combat them. Fortunately, that is not so. We can count on effective strategies6 to lessen the influence of biases and perhaps even do away with some of them. A few possible strategies include:
- Recognizing, through metacognition, when we are using System 1 and deliberately switching to System 2;
- Assessing diagnosis alternatives;
- Asking ourselves, “Could this be something else?”;
- Minimizing the demand for a fast diagnosis; and, above all,
- Instructing physicians about the existence of cognitive biases.
In view of this, I hope that this information raises awareness of cognitive biases and heuristics, helping physicians become less prone to biased decision-making.
1. Graber M, Gordon R, Franklin N. Reducing diagnostic errors in medicine: what’s the goal? Acad Med. 2002;77:981-992.
2. Hammond KR. Intuitive and analytic cognition: information models. In: Sage A, ed. Concise Encyclopedia of Information Processing in Systems and Organizations. Oxford, United Kingdom: Pergamon Press; 1990: 306-312.
3. Stanovich KE, West RF. Individual differences in reasoning: implications for the rationality debate. Behav Brain Sci. 2000;23:645-665.
4. Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185:1124-1131.
5. Friedman HH. Cognitive biases that interfere with critical thinking and scientific reasoning: a course module. 2017; doi: 10.2139/ssrn.2958800.
6. Dobler CC, Morrow AS, Kamath CC. BMJ Evid Based Med. 2018; doi:10.1136/bmjebm-2018-111074.