Clinical Decision-Making and Debiasing Strategies
Debnath Chatterjee, MD, FAAP, and Gina Whitney, MD
In a recent PAAD (September 13), we reviewed common cognitive errors, which are thought-process errors linked to failed heuristics or biases, and which can lead to patient harm. Clinical decision-making is a complex process, and all of us are prone to biases. The quality of our clinical decisions depends on several individual factors (general affective state, personality, intelligence, fatigue, cognitive load, sleep deprivation, distractions, etc.) and ambient conditions in the immediate environment (context, team factors, patient factors, resource limitations, ergonomic factors, etc.).[1] In today’s PAAD, we will review two papers from the safety literature that discuss the dual-process theory of clinical decision-making, how biases are generated, and debiasing strategies.
Original articles
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf. 2013 Oct;22 Suppl 2(Suppl 2):ii58-ii64. doi: 10.1136/bmjqs-2012-001712. Epub 2013 Jul 23. PMID: 23882089
Croskerry P, Singhal G, Mamede S. Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf. 2013 Oct;22 Suppl 2(Suppl 2):ii65-ii72. doi: 10.1136/bmjqs-2012-001713. Epub 2013 Aug 30. PMID: 23996094
According to the dual-process theory, human cognitive tasks are processed by two systems that operate in parallel.[1] Type 1 processing is intuitive, fast, automatic, and relies largely on pattern recognition. We spend up to 95% of our time in this intuitive mode, which is characterized by heuristics or mental shortcuts (“seen this multiple times before”). Its major advantage is that it is effortless and very often accurate. In contrast, type 2 processing is slow, analytical, effortful, and deliberative, and involves conscious reasoning. Its major advantage is that it can handle complex, novel problems. However, because it is slow and effortful, it is usually unsuitable for time-sensitive tasks.[1]
Clinical decision-making usually involves a blend of intuitive and analytical processing in varying degrees (Figure 1). Our brains generally default to type 1 processing whenever possible, and we often toggle back and forth between the two systems. Repeated performance of a skill using type 2 processing eventually allows the task to be handled by type 1 processing; this transition from type 2 to type 1 processing of cognitive tasks is the basis of skill acquisition and is particularly relevant to less experienced clinical decision-makers.[1]
While biases can occur in both processes, most biases are associated with heuristics and type 1, intuitive processes. Biases in type 1 affecting unconscious or automatic responses are termed implicit bias, while biases in type 2 affecting conscious attitudes and beliefs are termed explicit bias. There are two main sources of biases: innate, hard-wired biases from our evolutionary past and acquired biases that we develop in our working environments (social/cultural habits, hidden curriculum, etc.). Certain high-risk situations, such as fatigue, sleep deprivation, and cognitive overload, predispose clinicians to use type 1 processes for decision-making.
So, what can we do about these biases? The key to debiasing is first to be aware of our own biases and be motivated to correct them. The next critical step involves deliberate decoupling from type 1 intuitive processing and moving to type 2 analytical processing.
In the second paper, the authors discuss three groups of debiasing strategies.[2] Selected examples are listed below.
Educational Strategies
1. Educational curricula covering theories of decision-making, cognitive and affective biases, and their applications.
2. Simulation scenarios with cognitive error traps.
3. Teaching specific skills to mitigate biases.
Workplace Strategies
1. Getting more information about a case before making a diagnosis.
2. Metacognition, the process of reflecting on one’s own thought processes and decision-making behavior. The ability to step back and observe one’s thinking may help.
3. Slowing-down strategies: making a conscious effort to slow down to avoid premature closure.
4. Mindfulness techniques help focus attention and reduce diagnostic errors.
5. Second opinion or “fresh set of eyes” of a trusted colleague during a challenging case.
6. Improving feedback on clinical decisions to reduce feedback bias.
7. Avoiding cognitive overload, fatigue, and sleep deprivation.
8. Ready availability of protocols and clinical guidelines to reduce variance.
9. Incorporating checklists, cognitive aids, and clinical decision support tools into electronic medical records.
Cognitive Forcing Functions
Cognitive forcing functions are rules that require the clinician to internalize and deliberately consider alternative options. Some examples include:
1. Checklists such as surgical safety checklist or central line insertion checklist.
2. Rule out worst-case scenario - a simple strategy to avoid missing important diagnoses.
3. Standing rules - a given diagnosis cannot be made unless other must-not-miss diagnoses have been ruled out.
4. Prospective hindsight - the clinician imagines a future in which his or her decision proved wrong and then answers the question, “What did I miss?”
5. Stopping rules - predefined criteria for when enough information has been gathered to make an optimal decision.
While all of us are prone to developing cognitive “shortcuts” to speed up and simplify decision-making, it is important to appreciate their capacity for error. A diagnosis is often merely a hypothesis, and we must take care to consider additional possibilities even as we actively manage its most immediate effects. Tools and processes designed to decrease cognitive load are particularly helpful within the clinical practice of anesthesiology, where clinical situations are dynamic and often urgent. The impacts of time pressure, workload, task complexity, and other performance-shaping factors on our ability to make important decisions quickly should not be underestimated.
References
1. Croskerry, P., G. Singhal, and S. Mamede, Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf, 2013. 22 Suppl 2(Suppl 2): p. ii58-ii64.
2. Croskerry, P., G. Singhal, and S. Mamede, Cognitive debiasing 2: impediments to and strategies for change. BMJ Qual Saf, 2013. 22 Suppl 2(Suppl 2): p. ii65-ii72.