Medical Decision Making: Time to Get Acquainted with our Biases


My Commission Bias 

The long-awaited PEXIVAS trial was finally published. I was excited to prepare and present this landmark paper at our departmental journal club. The discussion was fascinating until I started realizing that I was fighting hard for plasmapheresis! I brought up the MEPEX trial to suggest that the populations were different. I pointed out the very active kidney biopsies of patients in the MEPEX trial, desperately insinuating that there could still be a role for plasmapheresis in patients with severe kidney disease. I also highlighted the low number of patients with severe pulmonary hemorrhage, questioning whether the overall non-significant signal for plasmapheresis could be applied to this particular population. While all these points might be valid (or not), as I reflected on why I was trying to dismiss the strong negative signal of PEXIVAS, I came to the realization that I was being driven by my commission bias: the inclination that something needs to be done, when perhaps the right thing to do is to do less, and wait.

It Happens Every Day

This experience brought to mind a prior encounter. A patient on peritoneal dialysis was admitted with encephalopathy. While there were quite a few reasons for the altered sensorium, there was some concern for inadequate dialysis. I recall the residents on the internal medicine team, and then the medicine attending, calling me repeatedly to inquire about switching the patient over to hemodialysis. Not satisfied by my answer that the prescription had been intensified, they reached out, repeatedly, to my attending nephrologist asking for something to be done! Needless to say, clearances improved significantly after the adjustments. Reflecting on the scenario, the primary team wanted to do the best for their patient, and their commission bias was pushing them to do something. Something had to be done now…in their opinion, the answer was switching to hemodialysis!

Omission vs commission: the tendency to wait and withhold an intervention (omission) vs the inclination to do something immediately (commission). These are two very powerful biases that play a central role in our decision-making process. We are all somewhere on a spectrum between omission and commission, but knowing where our inclinations lie would help us critically evaluate our decisions. But are these the only biases affecting our decisions?

The Dual-Process Theory

This raises the following questions: how do we make decisions? How do we think? What other “biases” dictate the way we evaluate a clinical scenario? Daniel Kahneman, a US-Israeli psychologist, won the Nobel Prize in Economic Sciences in 2002 for his work challenging the assumption of human rationality in economic theory. In his book, Thinking, Fast and Slow, Kahneman describes two systems of thinking: the fast, intuitive (system 1) and the slow, logical (system 2). This concept has since been extrapolated to clinical medicine and medical education in what is now known as the dual-process reasoning theory.

System 1 thinking, or the non-analytic reasoning process, is the dominant thinking process most of the time. It relies heavily on mental shortcuts called heuristics, which allow us to simplify complicated scenarios by focusing on what is familiar to us. Several concepts (availability, illness scripts, and pattern recognition, for example) constitute our heuristics and improve our efficiency in day-to-day practice. For example, you see a patient with infective endocarditis and an acute kidney injury with red blood cell casts on the urine sediment, and you are already thinking about an infection-related glomerulonephritis (GN). It is a known pattern that fits your illness script of the disease and is immediately available in your mind for quick retrieval. The beauty of system 1 is how fast it is. It is effortless, efficient, requires minimal cognitive effort, and is accurate most of the time. However, as one would imagine, it remains prone to errors.

System 2 thinking, on the other hand, is the analytic thinking process. It entails a careful and systematic approach to the problem at hand. In-depth reflection on the situation, guided by literature reviews and statistical analyses, makes this a robust decision-making process. System 2 thinking is slow, inefficient, and requires a high degree of cognitive effort, but it allows for scientific rigor and is potentially less error prone. Ultimately, the thinking process in clinical medicine is the product of an intricate interplay between systems 1 and 2.

The Thinking Traps

While both systems can be activated, system 1 is typically the more dominant pattern of thinking. However, it remains error prone and can certainly fail. Failures of system 1 thinking become our thinking traps, adversely affecting our diagnostic process. These thinking traps are referred to as cognitive biases.

A 2003 paper in Academic Medicine by Croskerry highlights some frequent cognitive biases encountered in medicine and discusses strategies to minimize them. Below are highlights:

  • Availability bias: Things seem more likely if they readily come to mind. This is an important heuristic whereby a nephrologist is more readily able to diagnose IgA nephropathy than a primary care provider, simply because the nephrologist sees IgA nephropathy on a regular basis. However, this can also be a thinking trap, or a bias. While rounding in the intensive care unit, a nephrologist will likely see one acute tubular necrosis (ATN) after another. The odds are the next one is going to be an ATN as well, until a GN is missed.
  • Framing effect: The way the patient is presented to you “frames” your thinking in one direction or the other. However, the presentation might have been strongly influenced by the presenter’s biases and be misleading. An interesting form of framing is “triage cueing,” where triage is destiny: if you are on the cardiology floor, you must have a heart problem, and as such your kidney injury has to be cardiorenal.
  • Diagnostic momentum: This one goes hand in hand with anchoring bias and premature closure. Once a diagnosis is made, the thinking stops, no matter what additional evidence surfaces. Diagnoses are sticky and get carried over from person to person, note to note, hospitalization to hospitalization. It is not until the 8th admission for an asthma exacerbation that a clinician decides to pause and investigate what eventually turns out to be a case of eosinophilic granulomatosis with polyangiitis! The tendency to “anchor” to one element of the story and “prematurely close” the thinking process only perpetuates the diagnostic momentum.
  • Visceral bias: This is a particularly strong bias where our feelings toward patients, negative or positive, affect our overall care. They affect the time we spend at the bedside, the story we unpack, and then the diagnoses we make. Our patients on dialysis are a particularly vulnerable population, where negative feelings regarding non-compliance can have adverse consequences on the care they receive.
  • Commission vs omission: As detailed above, these are very powerful biases. In general, inpatient-based specialties tend to lean toward commission while outpatient specialties favor omission. Postgraduate training, with its heavy inpatient component, tends to leave trainees and freshly minted practitioners on the commission end of the spectrum. Both have been implicated in errors: rushing to perform a high-risk kidney biopsy that causes severe complications and yields a result that is ultimately not actionable, vs waiting too long to obtain a biopsy that ultimately shows a paraproteinemic kidney disease in an advanced fibrotic stage.

Between Diagnostic Uncertainty and Diagnostic Errors

So why do we need to discuss cognitive biases? Well, diagnostic errors happen every day. Graber estimated a 10-15% diagnostic error rate in internal medicine specialties, with unconscious cognitive biases potentially playing a significant role in these pitfalls. The National Academy of Medicine (NAM) highlighted diagnostic errors in its 2015 publication, stating,

“It is likely that most of us will experience at least one diagnostic error in our lifetime, sometimes with devastating consequences”

The NAM called for better training in decision making and diagnostic reasoning. As a community, we still shy away from discussing cognitive errors. Our diagnostic reasoning, particularly in the medicine specialties, is our most valued asset, and it certainly is not easy to discuss the traps we might get caught in. However, to better serve our patients, we need to talk about this.

William Osler once said, “Medicine is a science of uncertainty and an art of probability.” Tolerating uncertainty is not an easy task, and these uncertain situations present the greatest risk for diagnostic errors. When faced with uncertainty, we need to slow down and reflect on our thinking process. Metacognition is the awareness and understanding of one’s own thinking process and the potential cognitive biases affecting it. As such, metacognitive strategies have been proposed as potential de-biasing strategies, with the ultimate goal of mitigating diagnostic errors. While certainly time-consuming, these strategies are needed when diagnostic uncertainty presents itself.

The Cognitive Autopsy

The concept of a cognitive autopsy has been suggested as one metacognitive strategy to uncover the unconscious cognitive biases that might have led to a diagnostic pitfall. This concept has been successfully applied in a group setting where the housestaff dissect a case in which a diagnostic error occurred. They collectively evaluate their thinking process and determine what might have unconsciously driven the decisions that eventually culminated in the error. This reflective exercise, in a safe environment, helps bring the unconscious biases to the attention of the trainees while emphasizing the important role these traps play in diagnostic errors.

An important concept highlighted in these exercises is the high cognitive hazard situation: a situation in which we are more likely to fall prey to our thinking traps, with potential adverse downstream effects. These include high workload, time pressure, sleep deprivation, and physical or mental exhaustion. By recognizing the unconscious biases that drive us and the situations that put us at risk, we can reflect on our thinking in action and, hopefully, improve our patient care.

Conclusion

My commission bias plays a significant role in what I do…among many other biases. I am aware of this bias of mine and I hope that this awareness makes me pause and reflect when dealing with a complicated scenario that does not fit any predefined illness script. I do not believe that we can completely eliminate all of our cognitive biases. However, I do believe that knowing the tendencies of our biases and realizing our “high cognitive hazard” situations can help us mitigate those diagnostic traps! As training programs strive to teach clinical reasoning, it seems prudent to start discussing cognitive biases and help residents and fellows get acquainted with their favorite ones.

Ali Mehdi, MD, MEd
Nephrology Fellow, Cleveland Clinic
@AliMehdiMD
