Quality Improvement Essentials
Consider a "Premortem"
Daniel Kahneman, in his seminal tome Thinking, Fast and Slow, identifies the planning fallacy whereby - in spite of ample evidence from past project implementations pointing to the contrary - we are consistently over-optimistic in our forecasts of project outcomes. We tend to adopt an "inside view" that is "unrealistically close to best-case scenarios" (Kahneman, p.250).
Overconfidence (frequently associated with social and/or economic pressures) manifests in all manner of environments: commercial business, health care, education, and our private lives. Consider:
- 90% of drivers believe they are better than average
- A study of patients who died in the ICU compared autopsy results with the diagnoses clinicians had provided while the patients were alive; clinicians who were "completely certain" of the diagnosis antemortem were wrong 40% of the time
- A study of S&P 100 companies found zero correlation between the stock market return estimates of their chief financial officers and actual returns
One proposed solution, adopted from Bent Flyvbjerg, is to consult available statistics of similar cases and situations and allow that distributional information to inform your forecast - i.e., take the "outside view". According to Kahneman, Flyvbjerg's recommendations (sketched in code after the list below) are to:
- Identify an appropriate reference class
- Obtain statistics of that class and use them to generate a baseline
- Use specific information about your individual case to adjust the baseline prediction, i.e. identify reasons to be optimistic - or pessimistic - about your specific project outcomes. (Kahneman, p.251-252)
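To make the baseline-and-adjust mechanics concrete, here is a minimal sketch in Python. It assumes, purely for illustration, that you have gathered completion times (in weeks) from a reference class of comparable projects; the figures, variable names, and adjustment factor are hypothetical and are not drawn from Kahneman or Flyvbjerg.

```python
from statistics import median, quantiles

# Hypothetical reference class: completion times (in weeks) of similar
# past improvement projects, gathered from your own records or from
# published benchmarks.
reference_class_weeks = [18, 22, 25, 26, 30, 31, 34, 40, 52]

# Step 2: let the distribution of the reference class set the baseline.
baseline = median(reference_class_weeks)
p25, _p50, p75 = quantiles(reference_class_weeks, n=4)

# Step 3: adjust the baseline with case-specific information. A factor
# below 1.0 encodes reasons for optimism (e.g., an unusually experienced
# team); a factor above 1.0 encodes reasons for pessimism. The 1.1 here
# is purely illustrative.
case_adjustment = 1.1
forecast = baseline * case_adjustment

print(f"Reference-class baseline: {baseline:.0f} weeks "
      f"(interquartile range {p25:.0f}-{p75:.0f})")
print(f"Adjusted forecast for this project: {forecast:.0f} weeks")
```

The order of operations is the point: the forecast starts from the distribution of comparable projects rather than from a best-case plan, and the case-specific adjustment is applied last.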
Of course, such statistics may not be available for examination. In such cases, Gary Klein's premortem approach may be useful (Kahneman, p.264). The main benefit of this approach is that it legitimizes doubt that might otherwise be suppressed - for any number of reasons - once a group leader throws support behind a particular plan or approach.
Look forward and imagine the first implementation of your improvement effort: it was an unmitigated and deeply embarrassing disaster. Write a brief history of what went wrong:
- List all relevant reasons why implementation failed
- Was the effort structured appropriately for those who had to implement it?
- Was it aimed at the right organizational level?
- Was there a wide range of learning stages evident in the implementation team?
- Did the implementation team understand the objective?
- Were required changes appropriately communicated to the team?
- Did any implementation team member exhibit a clear and substantial misunderstanding of the content?
- Did any aspect of the design inhibit understanding?
- Was there a resource gap or failure that impacted implementation?
- Review the project evaluation (a simple scoring sketch follows this list):
  - Which failure points now seem controllable?
  - Which controllable failure point:
    - Seems most likely to occur?
    - Would likely have the most significant impact on the team?
  - What steps can be taken to reduce the likelihood that the event will occur?
  - What steps can be taken to reduce the impact of the event, should it occur?
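If it helps to structure that evaluation, the prioritization questions can be reduced to a simple likelihood-times-impact scoring of the controllable failure points. The Python sketch below is one illustrative way to do so; the failure points, the 1-5 scales, and the scores are hypothetical stand-ins for your team's own judgment, not part of Klein's or Kahneman's method.

```python
from dataclasses import dataclass

@dataclass
class FailurePoint:
    """A controllable failure point surfaced by the premortem."""
    description: str
    likelihood: int  # 1 (rare) to 5 (almost certain), per team judgment
    impact: int      # 1 (minor) to 5 (severe), per team judgment

    @property
    def risk_score(self) -> int:
        # Simple risk matrix: likelihood multiplied by impact.
        return self.likelihood * self.impact

# Hypothetical entries drawn from the premortem questions above; real
# scores would come from the implementation team's own discussion.
failure_points = [
    FailurePoint("Objective not understood by the implementation team", 3, 5),
    FailurePoint("Required changes poorly communicated", 4, 4),
    FailurePoint("Resource gap during rollout", 2, 4),
]

# Rank the failure points so mitigation effort goes first to the items
# most likely to occur and most damaging if they do.
for fp in sorted(failure_points, key=lambda f: f.risk_score, reverse=True):
    print(f"{fp.risk_score:>2}  {fp.description}")
```

The two mitigation questions then map directly onto the two scores: steps that reduce a point's likelihood lower one factor, and steps that soften its consequences lower the other.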