System 2 thinking: Of zebras and horses
System 2 is analytic thinking: it involves pondering and seeking out the optimal answer rather than the “good-enough” answer.
“The good news about system 2 is that it really can monitor system 1,” said Dr. Nagler, who has a master’s degree in health professions education. “If you spend the time to do analytic reasoning, you can actually mitigate some of those errors that may occur from intuitive judgments from system 1 thinking. System 2 spends the time to say ‘let’s make sure we’re doing this right.’ ” In multiple-choice tests, for example, people are twice as likely to change a wrong answer to a right one as to change a right one to a wrong one.
System 2 thinking allows for the reasoning needed to assess questions in the gray zone. It is vigilant, reliable, and effective; it acknowledges uncertainty and doubt; it can be safe in terms of providing care; and it has high scientific rigor. But it also has disadvantages: it is slower and resource intensive, requiring greater cognitive demand and more time and effort.
“Sometimes the quick judgment is the best judgment,” Dr. Nagler said. System 2 thinking also is sometimes unnecessary and counter to value-based care. “If you start to think about all the possibilities of what a presentation may be, all of a sudden you might find yourself wanting to do all kinds of tests and all kinds of referrals and other things, which is not necessarily value-based care.” When system 2 thinking goes astray, it makes us think everything we see is a zebra rather than a horse.
Sonia Khan, MD, a pediatrician in Fremont, Calif., found this session particularly worthwhile.
“It really tries to explain the difference between leaping to conclusions and learning how to hold your horses and do a bit more, to double-check that you’re not locking everything into a horse stall and missing a zebra, while avoiding going too far with system 2 and thinking that everything’s a zebra,” Dr. Khan said. “It’s a difficult talk to have because you’re asking pediatricians to look in the mirror and own up, to learn to step back and reconsider the picture, and consider the biases that may come into your decision-making; then learn to extrude them and rethink the case to be sure your knee-jerk diagnostic response is correct.”
Types of cognitive errors
The presenters listed some of the most common cognitive errors, although their list is far from exhaustive.
- Affective error. Avoiding unpleasant but necessary tests or examinations because of sympathy for the patient, such as avoiding blood work to spare a needle stick in a cancer patient with abdominal pain because the mother is convinced it’s constipation from opioids. This is similar to omission bias, which places excessive concern on avoiding a therapy’s adverse effects when the therapy could be highly effective.
- Anchoring. Clinging to an initial impression or salient features of the initial presentation, even as conflicting data accumulate, such as diagnosing a patient with fever and vomiting with gastroenteritis even when the patient has an oxygen saturation of 94% and tachypnea.
- Attribution error. Negative stereotypes lead clinicians to ignore or minimize the possibility of serious disease, such as evaluating a confused teen covered in piercings and tattoos for drug ingestion when the actual diagnosis is new-onset diabetic ketoacidosis.
- Availability bias. Overestimating or underestimating the probability of disease because of recent experience (what was most recently “available” to your brain), such as ordering head imaging on several vomiting patients in a row because you recently had one with a new brain tumor diagnosis.
- Bandwagon effect. Accepting the group’s opinion without assessing a clinical situation yourself, such as sending home a crying, vomiting infant with a presumed viral infection only to see the infant return later with intussusception.
- Base rate neglect. Ignoring the true prevalence of disease by either inflating or reducing it, such as searching for cardiac disease in all pediatric patients with chest pain (see the brief illustration after this list).
- Commission. A tendency toward action with the belief that harm may only be prevented by action, such as ordering every possible test for a patient with fever to “rule everything out.”
- Confirmation bias. Subconscious cherry-picking: A tendency to look for, notice, and remember information that fits with preexisting expectations while disregarding information that contradicts those expectations.
- Diagnostic momentum. Clinging to that initial diagnostic impression that may have been generated by others, which is particularly common during transitions of care.
- Premature closure. Settling on a diagnosis without considering other diagnoses or asking enough questions about additional symptoms that might have opened up other diagnostic possibilities.
- Representation bias. Making a decision in the absence of appropriate context by incorrectly comparing two situations because of a perceived similarity between them or, on the flip side, evaluating a situation without comparing it with other situations.
- Overconfidence. Making a decision without enough supportive evidence yet feeling confident about the diagnosis.
- Search satisfying. Stopping the search for additional diagnoses after the anticipated diagnosis has been made.
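Base rate neglect is easiest to see with a quick back-of-the-envelope calculation. Using purely illustrative numbers (not figures from the session): suppose a truly cardiac cause underlies about 1% of pediatric chest pain visits, and a workup is 95% sensitive and 90% specific. Bayes’ theorem then puts the positive predictive value at (0.95 × 0.01) / (0.95 × 0.01 + 0.10 × 0.99) ≈ 0.09, meaning roughly 9 of every 10 positive results would be false positives. That is why hunting for cardiac disease in every patient with chest pain, rather than weighing the low prevalence first, ignores the base rate.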