Surgical Safety Checklists: Are they Useless in Ontario?

Posted by Tanya Hewitt on Wednesday March 11th, 2015

Surgery is on the rise globally, and the increase in surgical volume brings a corresponding increase in surgical safety problems. The WHO recognized this in the late 2000s and commissioned a group to address the impending hazard with a fairly universal, low-cost solution that could be applied throughout the world. The Safe Surgery Saves Lives consortium published its intervention, tested in eight cities (four high income, including Toronto, Ontario, and four low income), in the New England Journal of Medicine (NEJM) in 2009 (1). With reported improvements of 30% to 40% across categories of patient death and major complications, the Surgical Safety Checklist of Safe Surgery Saves Lives was, in Ross Baker's words, an "intervention on steroids" (2), and it set the standard for many improvement efforts since. The publication of Atul Gawande's "Checklist Manifesto" (3) brought to the general public the success of this seemingly inexpensive yet powerful tool, used for decades in aviation, now applied to the surgical suite.

In July 2010, the Ontario Ministry of Health and Long-Term Care mandated that all hospitals use the surgical safety checklist, which by then had been adopted by the Canadian Patient Safety Institute (4). As epidemiologists taking advantage of a natural experiment, Urbach et al. decided to evaluate the efficacy of this checklist intervention in Ontario (5). Comparing the three months before and the three months after the introduction of the checklist in 101 hospitals, they found that the adjusted risk of death within 30 days of surgery changed from 0.71% [0.66%, 0.76%] to 0.65% [0.60%, 0.70%] (odds ratio 0.91, P=0.13). They also checked a wide variety of outcomes, from bleeding to coma, and could not find any substantial change across the population exposed to surgery. The Urbach et al. study concludes that "implementation of surgical safety checklists in Ontario, Canada, was not associated with significant reductions in operative mortality or complications."
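For readers curious how the reported risks relate to the quoted odds ratio, the following is a rough arithmetic sketch only: it treats the published adjusted risks as point estimates and ignores the risk adjustment and confidence intervals of the authors' actual model.

```python
# Rough back-of-the-envelope check (not the authors' adjusted analysis):
# convert the reported 30-day mortality risks into odds and take their ratio.

def odds(p: float) -> float:
    """Convert a risk (proportion) into odds."""
    return p / (1.0 - p)

risk_before = 0.0071  # adjusted 30-day mortality before checklist introduction (0.71%)
risk_after = 0.0065   # adjusted 30-day mortality after checklist introduction (0.65%)

odds_ratio = odds(risk_after) / odds(risk_before)
print(f"Approximate odds ratio: {odds_ratio:.2f}")  # prints ~0.91, matching Urbach et al.
```

An odds ratio this close to 1, with P=0.13, is consistent with the authors' conclusion of no detectable effect at the population level.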

This set off a firestorm of response in the NEJM, with Lucian Leape giving five reasons why the Urbach et al. study did not find any change after the introduction of the checklist: poor team communication, unaddressed social and behavioural problems, lack of implementation resources, gaming, and the duration of change. Tellingly, Leape finishes his editorial with "The likely reason for the failure of the surgical checklist in Ontario is that it was not actually used." (6) This did not sit well with other researchers, who cited issues such as randomized controlled trials refuting studies based on observational data, and publication bias. Highlighted problems with the study itself ranged from the inclusion of low-risk eye procedures to the need to educate physicians on the value of near misses. (There was also a bit of a "power of the study" discussion, but this seemed more like a pissing contest than something worth focussing on.)

If the surgical safety checklist was "on steroids", the Central Line-Associated Bloodstream Infection (CLABSI) intervention by Pronovost et al. out of Johns Hopkins is the "poster child of patient safety" (2). The Michigan Keystone ICU programme reported stunning results: "100 intensive care units in Michigan reduced CLABSIs by 66%, and sustained a median infection rate of zero and a mean of one infection per 1000 catheter-days for more than three years" (7). However, to call this the success of a checklist is folly. In the paper "Reality Check for Checklists", Bosk, Dixon-Woods, Goeschel, and Pronovost (2009) note that the intervention was far more than a checklist. Recognizing that the challenges of the technical work (e.g. undertaking systematic reviews to create the checklist, assembling the cart of supplies) were far outweighed by the problems of the adaptive work (e.g. empowering the nurse to "abort takeoff" if the sterile environment was compromised) meant that resources had to be partitioned accordingly. Comprehensive Unit-based Safety Programmes (CUSPs) were created to ensure equality of voices, recognition of hazards, and accountable plans to deal with problems. "What happened in Michigan involved the creation of social networks with a shared sense of mission, whose members were each able to reinforce the efforts of the other to cooperate with the interventions. Implementing the entire programme occurred over nine months—it was not simply the case that the units were handed the checklist and immediately fell in line. The work was arduous and often laden with emotions." (8, pg 444). The paper concludes that checklists, on their own, are likely useless. Other factors critical to the Keystone success include customization of the checklist, compliance with the checklist, preparatory work in the guides to checklist use, "mindful" versus "mindless" autonomy and innovation, and the importance of humility and transdisciplinarity.

This may well have been known to Urbach et al. Quotes from their paper (5) such as "Only studies including team training or a more comprehensive safety system that includes multiple checklists have shown effectiveness similar to that seen in the WHO study" (pg 1030) and "Implementation of surgical safety checklists is not uniform" (pg 1030), followed by "Hospital-reported compliance with checklists was high" (pg 1031), indicate they might have known there was more at play. Noting that "Although materials were available to assist in the implementation of surgical safety checklists in hospitals, no formal team training was required before public reporting, and implementation was not standardized" (pg 1035), they concluded that "Although a greater effect of surgical safety checklists might occur with more intensive team training or better monitoring of compliance, surgical safety checklists, as implemented during the study period, did not result in improved patient outcomes at the population level." (pg 1036)

So, are checklists truly useless in Ontario? Well, checklists on their own are likely useless anywhere. Without being integrated into a larger socio-cultural weave of hierarchy flattening, local ownership, transdisciplinary work, and dedication over time, checklists will likely not deliver the benefits for which the Ontario government had naïvely hoped.

 

References

1. Haynes AB, Weiser TG, Berry WR, Lipsitz SR, Breizat A-HS, Dellinger EP, et al. A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med. 2009;360(5):491.

2. Baker R. Patient Safety: 10 Years Later. Why is Improvement So Hard? Health Achieve [Internet]. Toronto, Ontario; 2014. p. 1–24. Available from: http://www.healthachieve.com/2014/program-schedule/Pages/Speaker-Presentations.aspx#

3. Gawande AA. The Checklist Manifesto - an interview [Internet]. Blackwell Bookshops. 2010 [cited 2015 Jan 27]. Available from: https://www.youtube.com/watch?v=s0hhpdPCq4Y

4. Canadian Patient Safety Institute. CPSI Surgical Safety Checklist [Internet]. 2009. p. 1. Available from: Safe Surgery Saves Lives

5. Urbach DR, Govindarajan A, Saskin R, Wilton AS, Baxter NN. Introduction of surgical safety checklists in Ontario, Canada. N Engl J Med [Internet]. 2014 Mar 13 [cited 2014 May 2];370(11):1029–38. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24620866

6. Leape LL. The checklist conundrum. N Engl J Med [Internet]. 2014;370:1063–4. Available from: http://www.ncbi.nlm.nih.gov/pubmed/24620871

7. Pronovost PJ. Learning accountability for patient outcomes. JAMA. 2010;304:204–5. 

8. Bosk CL, Dixon-Woods M, Goeschel CA, Pronovost PJ. Reality check for checklists. Lancet [Internet]. 2009 Aug [cited 2013 Nov 12];374(9688):444–5. Available from: http://linkinghub.elsevier.com/retrieve/pii/S0140673609614409 


Comments

March 18th, 2015
Thanks Tanya

Governments mandating change:
Sadly, I think that mandating any behaviour change (however well intentioned) poisons the well before anyone draws water from it. Hopefully governments will learn that culture change does not occur by flipping a switch (or dropping a writ).

Science?:
I agree with Dr. Leape. I find it curious that any journal (much less the NEJM) would publish a three-month pre/post intervention study at the provincial level...

Thanks for the summary


Tanya Hewitt is a PhD candidate in the Population Health Programme at the University of Ottawa. She is passionate about safety science, and her qualitative study is focused on incident reporting systems in hospitals. Tanya worked for 10 years as a licensing assessment officer and inspector, primarily of cancer clinics, for the Canadian Nuclear Safety Commission. She holds a graduate certificate in risk management and an MSc in medical physics, and has enthusiastically taken many (free) courses, from Social Psychology to the Science of Safety in Healthcare.
