Quality Improvement Essentials

Some Pitfalls to Avoid

WYSIATI

"What you see is all there is" - Kahnemann suggests that the combination of our "coherence-seeking system 1" thinking and our lazy, effortful system 2 thinking leads us to endorse many beliefs without digging too far underneath the surface. It's therefore quite natural for us to jump to conclusions on the basis of the limited evidence that is available to us. As Kahnemann succinctly puts it:

"System 1 is radically insensitive to both the quality and quantity of the information that gives rise to impressions and intuitions"

...and experimentation supports the thesis that "it is the consistency of the information that matters for a good story, and not the completeness." We can quickly evaluate and make sense of the information in front of us, but we are far less skilled at stepping back and asking what is missing.

Overconfidence

A natural outgrowth of our ability to quickly synthesize and evaluate the information at hand is that we readily accept the story it creates. We are prone to accepting whatever pattern our associative system activates from the available information, which also leads us to suppress doubt and ambiguity - two feelings that make us far less comfortable than confidence does.

It is critically important in quality improvement work that we recognize how these tendencies lead us to conclusions that may be without merit. Once we have formed an initial association between causes and effects, we more readily accept information that confirms that association than information that might cast doubt on it. The upshot is that we have a "strong bias toward believing that small samples closely resemble the population from which they have been drawn" and we "exaggerate the consistency and coherence of what we see". We must always be on guard against accepting cause and effect where randomness is a more likely explanation.

One significant associated problem is that this also leads to a lot of push-back from team members and other process stakeholders who have already built a coherent story of the problem or issue under review. Our tendency to see patterns in the random variation within a system is very difficult to resist.
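To make the small-sample point concrete, here is a minimal sketch in Python. The process mean, spread, sample size, and number of trials are illustrative choices of my own, not figures from the text; the point is simply to show how often a perfectly stable process produces a run of points that looks like a trend.

```python
import random

random.seed(42)

def apparent_trend(points):
    """Return True if the points are strictly rising or strictly falling."""
    diffs = [b - a for a, b in zip(points, points[1:])]
    return all(d > 0 for d in diffs) or all(d < 0 for d in diffs)

TRIALS = 10_000
SAMPLE_SIZE = 4  # illustrative choice, not from the text

hits = 0
for _ in range(TRIALS):
    # A stable process: same mean and spread throughout, no real change.
    sample = [random.gauss(50, 5) for _ in range(SAMPLE_SIZE)]
    if apparent_trend(sample):
        hits += 1

print(f"{hits / TRIALS:.1%} of {SAMPLE_SIZE}-point samples from an "
      f"unchanged process looked like a strictly rising or falling trend.")
```

With only four points, roughly one sample in twelve will be strictly rising or falling by chance alone - exactly the kind of "pattern" a team can mistake for a real change in the process.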

Common Cognitive Biases

There are a number of cognitive biases that affect our judgment in all walks of life. The table below, drawn from an article by Courtney, Spivey, and Daniel, summarizes the behavioral economics underpinning some common biases, along with the strategies they suggest for communicating with patients. Those strategies apply equally well to team members and other stakeholders, and I've substituted stakeholders for patients where relevant:

| Principle | Behaviors | Strategy with stakeholders |
| --- | --- | --- |
| Affect/emotion | Our emotional associations can powerfully affect our actions; we make decisions based on emotion or affect, not statistics | Use anecdotes and stories to emphasize a point |
| Loss aversion | Losing something causes us more mental anguish than gaining something of the same value | Frame information in terms of what the person may lose rather than what they may gain |
| Incentives | We respond to prizes or privileges; responses to incentives are shaped by things such as strongly avoiding losses | Offer incentives of some kind; frame information in terms of what the person may lose rather than what they may gain (strong loss aversion) |
| Messenger | We are heavily influenced by who communicates to us; we value information more from those we trust and respect | Use your personal power in a positive way; try to earn the stakeholder's trust and respect |
| Salience | We pay more attention to information that seems relevant to us | Make information or recommendations personal and tailored |
| Framing | We are influenced by the way in which information is presented to us | Frame messages carefully when conveying important information about a choice |
| Commitments | We seek to be consistent with our public promises | Ask for commitment (sign a pledge, tell family and friends) |
| Norms | We respond to social pressure and are strongly influenced by what we think other people are doing | Use examples of others who take action |
| Defaults | We tend to go with the flow of preset options | Make it easier to decide to act than not to act (e.g., require opt-out) |
| Priming | Our acts are often influenced by subconscious cues | Plan positive cues to stimulate choices |
| Ego | We take actions that make us feel better about ourselves | Recognize and praise beneficial behaviors |
| Present bias/discounting | We have "present-bias" preferences: present benefits and costs are valued more than future benefits and costs; in other words, we discount the future | Help the stakeholder consider the present benefit or cost of a behavior rather than the future one |
| Rule of thumb | We adopt rules of thumb to deal with limited information-processing capacity | Assess the stakeholder's usual approach to decision making; try not to overwhelm them with information at one sitting |


There is a growing body of literature examining the impact of cognitive biases on our decision making. Some potentially helpful articles include:

Courtney MR, Spivey C, Daniel KM. Helping patients make better decisions: how to apply behavioral economics in clinical practice. Patient Prefer Adherence. 2014;8:1503-1512. doi:10.2147/PPA.S71224

Redelmeier DA, Rozin P, Kahneman D. Understanding Patients' Decisions: Cognitive and Emotional Perspectives. JAMA. 1993;270(1):72–76. doi:10.1001/jama.1993.03510010078034

Tay SW, Ryan P, Ryan CA. Systems 1 and 2 thinking processes and cognitive reflection testing in medical students. Can Med Educ J. 2016;7(2):e97-e103. Published 2016 Oct 18.

Antony MM. Treatment Plans and Interventions for Depression and Anxiety Disorders. J Psychiatry Neurosci. 2001;26(5):422–423.

Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78(8):775-780.

Kahneman D. Thinking, Fast and Slow. pp. 85-88, 113-118.