Further Thoughts on the Road to Safety-II

Posted by Sam Sheps/Karen Cardiff on Tuesday June 17th, 2014

Healthcare has a difficult time understanding issues of patient safety. The narrow clinical focus of most patient safety activities and the "diagnose and treat" approach to solutions ignore the fact that safety is an emergent property of clinical work as well as of organizational context. Thus it is refreshing to read Carl Macrae's (2014) recent piece "Early warnings, weak signals, and learning from healthcare disasters" in BMJ Quality & Safety.

Macrae's focus is on the organizational, social and psychological dimensions of safety, and his paper is primarily, though not exclusively, a reflection on what can be learned from the Mid-Staffordshire National Health Service (NHS) Foundation Trust disaster (Francis Report, 2013). There is much useful wisdom in Macrae's perspective, as well as some interesting gaps, both of which we briefly discuss below.

Macrae provides an overview of the Mid-Staffordshire disaster and cites the many important documents that have been generated to describe and attempt to understand the complex array of factors that were at play over many years. In particular, he asks: "…how can healthcare organizations – and those that supervise and regulate them – interpret weak signals, identify early warnings and investigate and address the risks that underlie major failures of care such as those at Mid-Staffordshire?" He answers this question by summarizing the findings of the investigations and enquiries: "At all levels of the healthcare system data on safety and quality are not systematically collected or shared, warnings were not acted upon, bad news was minimized, indicators of harm were explained away and complaints mishandled." Macrae then, strikingly, cites Barry Turner (1976), a British sociologist whose work from almost 40 years ago focused on the organizational origins of disaster: "…the overall findings reported here could be restated as the proposition that disaster-provoking events tend to accumulate because they have been overlooked or misinterpreted as a result of false assumptions, poor communications, cultural lag, and misplaced optimism."

Macrae then reviews the more recent literature (see the Macrae citation below for specific references) and in summary claims: “Disasters are organized events. To occur, they typically require systematic and prolonged neglect of warning signs, and signals of danger, creating deep pockets of ignorance, organizational silence, and organizational blindness. When signals of risk are not noticed or misunderstood in organizations, then safeguards and defenses against those risks can be allowed to degrade – or may never be created in the first place. Critically, it is the shared beliefs, collective assumptions, cultural norms and patterns of communication across organizations that shape what information is attended to and how is it interpreted, and most importantly, what is overlooked, discounted and ignored [emphasis ours].”

This claim, and what follows by way of elaboration, is powerful stuff. It nails the problem directly to the C-Suite and Boardroom doors where it belongs, and also provides a fundamental insight regarding the inevitable 'surprise' that organizations (as well as individuals) express in the wake of adverse events. It also calls to mind and reinforces the importance of such concepts as 'drift into failure' (Dekker, 2011), the need for 'preoccupation with failure' (Weick and Sutcliffe, 2001) and 'requisite imagination' (Westrum, 1993), the need to distinguish between 'work as imagined' and 'work as actually done', and acceptance of the implications of the 'efficiency-thoroughness trade-off' (ETTO) (Hollnagel, 2009). These are bedrock concepts of resilience.

Despite the acuity and comprehensiveness of Macrae's discussion, his paper contains some interesting omissions. For example, he cites the Francis Report (2013) finding that potentially unsafe staffing levels (about which many concerns were expressed) were an important factor in the Mid-Staffordshire disaster. Macrae notes that this, like other relevant information about Mid-Staffordshire's operations, was apparently discounted by organizational leaders and regulators. However, he does not consider the influence of broader governmental policy and objectives. Over at least the past 50 years, governments of every political stripe have repeatedly focused on NHS efficiency, cost cutting, rationalization and re-organization. Efficiency has been conceived as an 'end' rather than a 'means to an end'; cost-cutting demoralizes, rationalization confuses and endless reorganization exhausts. So it is perhaps not surprising that, in such a toxic mix, small signals are either ignored or considered unbelievable (not to mention against policy) and discounted. It is critically important to consider the highest levels of policy, which, though trumpeted as merely a response to the reality of, among other things, 'these difficult financial times', are essentially ideological claptrap designed to stifle thoughtful discourse and possibly pointed questions regarding the validity of such claims of necessity. After all, governments seem to have all the money they need for the things they want to do, as opposed to the glaring gaps in the social or health fabric that strongly suggest what they should (indeed need) to do. To be fair, Macrae does note regarding current investigational processes that "…none have a mandate to examine all aspects of the healthcare system as an integrated whole…". However, he is short on identifying how broad that 'whole' is: he discusses policy and regulation only briefly, and only "within the aviation system", and does not point out how policies across sectors can potentially be at cross-purposes.

Macrae also notes that the Francis Report (2013) calls for more and better information, but observes that healthcare organizations already generate a mass of information, and that "more" may simply create noise, not signal. But again it is government anxiety about accountability that creates much of the information glut. Dashboards, balanced scorecards, the never-ending search for indicators and any number of other managerial fads are not only overwhelming but largely, and ironically, uninformative, serving only to absorb an enormous amount of professional and administrative staff time that might more usefully be employed in, say, reflection on how the work actually gets done on a day-to-day basis and on why things go right (the majority of the time – Safety-II) (Hollnagel, 2014) as well as why they go wrong.

Investigations, as an information source, are problematic since healthcare organizations generally investigate themselves and thus are obviously prone to the same issues of ignorance and discounting that generate safety problems in the first place. Aviation is correctly invoked as having an independent process for understanding events (near misses as well as accidents), but Macrae simply says there is no organization in the NHS that has a similar responsibility. Oddly, he does not mention the National Patient Safety Agency established in 2001 (now defunct), which did in fact have the responsibility of receiving and analyzing critical incident reports. But it also had a stake in regulatory, policy and educational agendas (it was NOT simply investigative and analytic) and, more importantly, was not at arm's length, as is true of aviation and other industry investigative bodies. In healthcare, investigation is an internal rather than an independent process. Moreover, to be useful, investigation reports need to be comprehensive (to tell the 'second story', not simply repeat the 'first story'). They need to go beyond the 'root cause' and preserve the psychological dimension of adverse events for those involved, which, as Waring (2009) notes, is usually shorn from reports, making them less compelling with regard to actual change. There is also a critical need for processes able to provide practical responses to the concerns reported as well as the problems found. Too often in healthcare, staff are exhorted to report issues but rarely receive any acknowledgment or feedback. Until such processes become routine and robust, Richard Cook's 'No Reports' campaign (see Richard Cook's blog post on this website dated November 7, 2013) will continue to be relevant, if mystifying (not to mention annoying) to the status quo.

Macrae mentions 'cultural norms' in the quote cited above but never really discusses what this is supposed to mean. Much of the patient safety literature invokes 'Just Culture' as a policy objective, by which most mean a 'blame-free culture', but the concept of culture is so underspecified as to be essentially meaningless, entailing vague references to culture as 'the way we do things around here'. Such phrases are platitudes that appear deep but are shallow, and are presented as explanatory when they are not even effectively descriptive. Culture is a contested concept, and healthcare hardly represents a monolithic culture: not only is there more than one professional 'culture' (e.g. clinical or administrative, as Waring (2009) has excellently pointed out), but more recently we have been confronted with perhaps the most important, highly variable and complex culture(s): that of the patient and family. It is these cultures, and the people with whom we engage every day, to heal when possible and to succor and support when not, who are the point of all we do in healthcare, though unfortunately not all involved really understand this perspective. Despite this general state of affairs, culture can be characterized as a collective way of knowing and communicating within the dynamic and challenging context of sensemaking (Weick, 1995) about how work gets done and what aspects of the environment may be problematic. Thus, for example, culture may represent a capacity of practitioners to sense (and communicate) when they are operating at the margins of safety, a collective vision of what they fear in any given set of unexpected circumstances (keeping the discussion of potential failure alive), and ways of working together to address or work around the problems encountered. Such a conceptualization of culture is rare in healthcare, which tends towards the concrete ("let's fix the hand-washing problem"): the social, psychological and political dimensions remain unaddressed. Sadly, it has also been observed that reporting adverse events may be a subtle form of organizational forgetting (Cook, 1998) rather than learning: a valid but rather disheartening observation regarding healthcare organizational culture.

Finally, Macrae, like most commentators on patient safety, tends to focus on what has recently been characterized as Safety-I ("accidentology", as per Hollnagel, 2014). Safety-II, an understanding of why and how care is provided successfully, is, unsurprisingly, not mentioned, much less discussed. However, he does emphasize the importance of "paying attention" to what people do in day-to-day work, which is, much of the time, successful. This train of thought provides an entrée to Safety-II and is easily linked to considerations of how practitioners think about what they do, what works for them in challenging circumstances, how organizational policies and procedures can be facilitative or obstructive, and how culture, as elaborated above, can foster a sensitivity to early warnings and weak signals.

Macrae makes important observations about failure in healthcare organizations, and it is but a small step to incorporate his insights into an even more robust understanding not only of failure but also of success, which provides so much more of an opportunity for learning how to provide safe and effective care.


References

Cook, RI (1998). Two Years Before the Mast: Learning How to Learn about Patient Safety. Plenary session at The Scientific Investigation of Avoidable Patient Injury. Available at: http://www.ctlab.org/documents/LearningToLearn.PDF; accessed June 17, 2014.

Dekker, S. Drift into Failure: From Hunting Broken Components to Understanding Complex Systems. Ashgate Publishing, 2011.

Francis Report: The Mid-Staffordshire NHS Foundation Trust Public Inquiry, 2013. Available at: http://www.midstaffspublicinquiry.com/report; accessed June 17, 2014.

Hollnagel, E. The ETTO Principle: Efficiency-Thoroughness Trade-Off: Why Things That Go Right Sometimes Go Wrong. Ashgate Publishing, 2009.

Hollnagel, E. Safety-I and Safety-II: The Past and Future of Safety Management. Ashgate Publishing, 2014.

Macrae, C (2014). Early warnings, weak signals, and learning from healthcare disasters. BMJ Quality & Safety; doi:10.1136/bmjqs-2013-002685. Available at: http://qualitysafety.bmj.com/content/early/2014/04/26/bmjqs-2013-002685.full; accessed June 17, 2014.

Turner B. (1976). The organizational and interorganizational development of disasters. Administrative Science Quarterly; 21:378-97.

Waring, JJ (2009). Constructing and re-constructing narratives of patient safety. Social Science & Medicine; 69(12):1722-1731. Available at: http://www.sciencedirect.com/science/article/pii/S027795360900656X; accessed June 17, 2014.

Weick, KE and Sutcliffe, KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. Jossey-Bass (University of Michigan Business School Management Series), 2001.

Weick, KE. Sensemaking in Organizations. Sage Publications, 1995.

Westrum, R. Cultures with requisite imagination. In: Wise, JA, Hopkin, VD and Stager, P (eds). Verification and Validation of Complex Systems: Human Factors Issues. Springer, 1993. Available at: https://doi.org/10.1007/978-3-662-02933-6_25; accessed June 17, 2014.


Comments

June 18th, 2014 by Bob Wears
Nice analysis, Sam and Karen. I'd add one additional problem in moving systems to Safety-II: no one ever gets credit for problems that never happened. That is, to the extent that resilient performance prevents catastrophic failures, rather than facilitating mitigation and recovery, it tends to look unnecessary.

Repenning NP, Sterman JD. Nobody Ever Gets Credit for Fixing Problems that Never Happened: creating and sustaining process improvement. California Management Review 2001;43(4):64-88.

Dekker SWA, Woods DD. To Intervene or not to Intervene: The Dilemma of Management by Exception. Cogn Technol Work 1999;1(2):86-96.


Sam Sheps is a Professor in the School of Population and Public Health at the University of British Columbia (UBC). Dr. Sheps is a health services researcher and was a member of the BC team collecting data for the Canadian Adverse Events Study. His other funded patient safety projects include: a Health Canada project assessing approaches to governance and safety in non-health industries; a Canadian Patient Safety Institute funded study on high reliability in healthcare; and, more recently, a Canadian Health Services Research Foundation (now the Canadian Foundation for Healthcare Improvement) funded research project focused on building capacity to undertake critical incident investigation in healthcare settings, applying the concepts of Resilience Engineering. He has also co-hosted a number of workshops examining the application of Resilience Engineering in healthcare.

Karen Cardiff is a researcher in the School of Population and Public Health at UBC with an interest in health services research. Karen has over a decade of experience in patient safety research. She was a member of the BC team for the Canadian Adverse Events Study, and since that time has worked on a range of projects examining safety in high-risk industries; more recently, her interest has focused on the application of Resilience Engineering principles in healthcare. She is a member of the Resilient Health Care Net and the Resilience Engineering Association. She obtained her BScN from the University of British Columbia, an MHSc in Community Health and Epidemiology from the University of Toronto, and an MSc in Human Factors and System Safety from Lund University.
