© Springer-Verlag London 2015
Paul R. Barach, Jeffrey P. Jacobs, Steven E. Lipshultz and Peter C. Laussen (eds.), Pediatric and Congenital Cardiac Care, DOI 10.1007/978-1-4471-6566-8_26


26. Resilience and Systems Engineering



Karen Harrington (1) (Corresponding author) and Peter C. Laussen (2, 3)


(1)
Departments of Pediatrics and Critical Care, CHU Ste Justine, 3175 Côte Ste Catherine, Montreal, QC, H3T 1C5, Canada

(2)
Department of Critical Care Medicine, The Hospital for Sick Children, Toronto, ON, Canada

(3)
Department of Anaesthesia, University of Toronto, 555 University Avenue, Toronto, ON, M5G 1X8, Canada

 



 

Abstract

Pediatric cardiac surgery is a complex, high-risk field characterized by a vulnerable patient population, technically demanding surgery, and technological and team challenges. Human factors studies have identified the importance of teamwork, communication, and standardization of some processes of care in improving outcomes. With demonstrable improvement in safety still lacking, continued reflection is warranted on the fundamental concepts that underlie safety efforts. Most studies have adapted a linear accident model, with an emphasis on error commission and recovery, adverse events, and latent conditions. More recent approaches to safety based on systems and resilience engineering are more applicable to complex systems of care, such as cardiac surgery. Systems engineering seeks to minimize risk through redesign; resilience engineering explores how individuals and organizations negotiate complexity to create safety.


Keywords
Pediatric cardiac surgery · Patient safety · Accident model · Complexity · Systems engineering · Resilience



Introduction


It has been over a decade since the Institute of Medicine (IOM) report To Err is Human [1] presented its alarming estimates of preventable medical harm. The report demanded a new approach to safety, highlighted systemic issues, and called for a 50 % reduction in medical errors by 2004. That target was not achieved, despite dedicated efforts at local, regional, and national levels, and despite turning to other high-risk industries and organizational safety models for instruction.

Initiatives to improve patient safety have continued to gain momentum over recent years, yet it is worth remembering that significant efforts to improve safety and outcomes had already begun within pediatric cardiac surgery before the IOM report was published. Marc de Leval introduced the concept of human factors to the field [2] and adapted an organizational accident model to study the role of human factors in surgical outcomes [3]. Two high-profile inquiries into pediatric cardiac surgery deaths, in the United Kingdom [4] and in Canada [5], highlighted the presence of system-wide problems and echoed the call for a systemic approach to safety in the treatment of children with congenital heart disease.

More than a decade after the first IOM report, and despite significant effort and financial investment, it is in fact unclear whether patient safety has improved [6–9]. While efforts have gone into refining error counting and data collection, continued reflection is warranted not only on how to measure safety reliably [8], but also on the fundamental concepts of safety that guide our improvement efforts.

All efforts to improve safety are based on an underlying understanding, and conceptual model, of safety and risk. As in many industries, the traditional emphasis in healthcare has been on the frontline practitioner; more recently, a linear accident model has been applied to explore all contributing factors. Complex linear accident models portray adverse events as the result of a chain of events and failures, which may be active or latent [10]. Linear models are appealing for their clarity and ease of understanding, and they offer conceptually straightforward solutions: eliminate root causes, strengthen defenses, and/or introduce barriers between the hazard and the patient. The very simplicity that makes them appealing, however, limits their applicability to complex systems. When complex interactions are artificially simplified into linear causal chains, the adaptive function that a "failure" in one chain of events may serve in another part of the system goes unrecognized. Opportunities to explore why people acted as they did are limited, and solutions (for example, punishing or banning certain actions, or introducing barriers) may inadvertently introduce more complexity and risk into the system.

While fault-based and linear accident models may remain useful in simple systems, in which processes are linear and there is one best way of doing things, they are insufficient to adequately describe risks in complex systems. With the development of increasingly complex systems, scientific thinking about safety has moved towards a "systemic view", in which outcomes are seen to emerge from the complex functioning of the system as a whole [11]. It is not sufficient to understand and improve the function of one system component; interdependencies and relationships must be recognized, explored, and optimized.

Safety research in pediatric cardiac surgery has developed in parallel with safety theory in general, concentrating first on the performance of the individual surgeon [2], with more recent studies highlighting team and organizational factors [3, 12, 13]. Methods and measures have predominantly been adapted from a linear accident model (the Swiss cheese analogy [10]), with an emphasis on error commission and recovery, adverse events, and latent conditions. The existing research has identified the importance of teamwork, communication, and standardization of some processes of care.

Pediatric cardiac surgical care delivery is a complex system. It is characterized by a highly vulnerable patient population, technically demanding surgery, complex monitoring and life support technology, and the coming together of individuals from several disciplines to form a team at each stage of patient care. Advances in diagnosis, surgical technique, perfusion, and ICU management have allowed younger patients with more complex lesions to be treated, and survival has increased significantly. Paradoxically, these same advances have also increased system complexity and introduced new sources of risk that cannot adequately be studied and addressed using a linear accident model.

This chapter will briefly review the contributions and limitations of the linear accident model for our understanding of safety in pediatric cardiac surgery. It will then discuss two systems approaches to safety in complex systems: systems engineering and resilience engineering. The two approaches are complementary when studying a complex socio-technical system: systems engineering emphasizes design and process and seeks to minimize risk through system redesign, while resilience engineering focuses more strongly on the human dimension, exploring how individuals and organizations create safety by managing complexity.


Traditional View of Safety: The “Person Model”




I will prescribe regimens for the good of my patients according to my ability and my judgment and never do harm to anyone -Hippocratic Oath, 4th Century BC

The traditional understanding of safety, in healthcare as in other industries, emphasizes the frontline individual’s contribution to an outcome. The underlying belief is that competent individuals will produce a desired, “safe” outcome. Undesired outcomes are the result of error, and error is caused by incompetence or complacency. In this view, sometimes called the “bad apple theory”, the system is inherently safe, but is threatened by incompetent or complacent individuals (bad apples). Once offending individuals are identified, removed, or rehabilitated (“naming, shaming, blaming, retraining”) [14], system safety is restored.

Belief in the bad apple/person model of safety remains strong in healthcare. The medical culture is rooted in a deep sense of personal responsibility, accountability, and patient ownership [15]. Healthcare, and especially surgery, is often delivered in a one-to-one or few-to-one manner [16], and patients as well as healthcare professionals believe that safety "lies foremost in the hands through which care ultimately flows to the patient" [17]. As such, a poor surgical outcome is still often ultimately concluded to be the "fault" of an individual surgeon. A failed intubation attempt or line insertion is attributed to the anesthetist's lack of skill, error, or complacency. A medication error is blamed on the incompetence or carelessness of the doctor prescribing or the nurse administering the drug.

While individual competence remains essential, and while a high degree of personal accountability is laudable, as Dekker asks: are "individual virtue, competence, and strength of character the only things, the main things, we want to rely on" [17] to optimize patient safety? Is the system so frail that the difference between a good outcome and catastrophe is one person?

The deeply held belief that competence equals perfection, that error equals incompetence, and that good doctors do not make mistakes has fostered a culture of secrecy, shame, and blame that has not only limited opportunities for safety learning, but also offered few strategies for safety growth beyond insisting that people be better and try harder. Far from producing demonstrable improvement in safety, this is the approach that is alleged to cause upwards of 100,000 deaths each year in the United States alone. The first IOM report [1], the Bristol inquiry [4], and the Winnipeg inquiry [5] all demanded a new approach to patient safety, moving beyond the traditional person-based model to examine systemic factors.


Early “Systems” Approach: The Linear Accident Model




Rather than being the main instigators of an accident, operators tend to be the inheritors of system defects. Their part is that of adding the final garnish to a lethal brew that has been long in the cooking. -James Reason [18]

At the time that safety research in pediatric cardiac surgery began, James Reason’s Swiss cheese analogy [10] had recently emerged. It has provided the framework for most of the safety research in this field.

The Swiss cheese analogy is a linear accident model. An accident is seen as the result of a number of contributing factors, some active and some latent, that come together at a particular place and time. Reason's metaphor describes layers of defenses, barriers, and safeguards between a hazard and its potential victim. The Swiss cheese imagery refers to the weaknesses ("holes") that exist in each defensive layer. Weaknesses are the result of "active failures" by people (slips, lapses, fumbles, mistakes, and procedural violations) and of "latent conditions" within the system itself, for example understaffing or chronic worker fatigue resulting from managerial decisions. An accident occurs when the holes in many layers line up, allowing the hazard through. In this model, risk can be managed proactively by tracing back and eliminating the root cause, by identifying and remedying latent conditions in the system, and/or by introducing barriers into the hazard trajectory.

The Swiss cheese metaphor has been widely adopted in healthcare and in safety research in pediatric cardiac surgery [3, 12, 13, 19]. In this analogy, if the "hazard" is improper transfer of a coronary artery during the arterial switch operation, layers "upstream" from the surgeon might include pre-operative diagnosis and decision-making, as well as latent conditions in the cardiac surgery department, hospital management, and healthcare organization [20] that cause distraction and suboptimal surgical performance. A layer "downstream" might be diagnostic techniques to detect the coronary problem before separation from cardiopulmonary bypass, allowing immediate revision. A "hole" in this layer of defense could be failure by the cardiologist to identify an incorrect coronary attachment on the postoperative echocardiogram. A "hole" in the anesthesia layer of defense might be "cognitive tunnel vision on insertion of lines to the neglect of monitoring the ECG screen" [3], which could delay the identification of patient instability and reevaluation of coronary blood flow.
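To make the model's logic concrete, consider a toy calculation (a sketch for illustration only; it does not appear in the studies cited, and the layer names and probabilities are invented). If each defensive layer is assumed to fail independently with some fixed probability, the model predicts that accident risk is simply the product of the per-layer "hole" probabilities, and that each added barrier multiplies the residual risk by its own failure rate.

```python
# Toy sketch of the linear (Swiss cheese) accident model, applied to the
# arterial switch example above. All probabilities are invented for
# illustration; the point is the model's arithmetic, not the numbers.
from math import prod

# Assumed probability that a "hole" is open in each defensive layer
# on a given case (hypothetical values).
layers = {
    "preoperative diagnosis and planning": 0.02,
    "surgical coronary transfer": 0.01,
    "anesthesia ECG monitoring": 0.03,
    "pre-separation echocardiography": 0.05,
}

# The model treats layers as independent: an accident requires every
# hole to line up, so the per-layer risks simply multiply.
p_accident = prod(layers.values())
print(f"P(accident) = {p_accident:.1e} per case")  # 3.0e-07

# The model's prescription follows from the same arithmetic: each added
# independent barrier multiplies residual risk by its own hole rate.
p_with_barrier = p_accident * 0.10  # hypothetical new barrier, hole rate 10 %
print(f"With one more barrier: {p_with_barrier:.1e} per case")  # 3.0e-08
```

The independence and fixed-probability assumptions baked into this arithmetic are precisely what the systemic critique in the following sections challenges: in real clinical work, the layers interact and adapt, so risks do not simply multiply.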


Contributions and Limitations of the Linear Accident Model




One of the greatest obstacles to progress on safety is, paradoxically, the attraction of neat solutions -Charles Vincent [21]

There is no doubt that the widespread adoption of the linear accident model has added to our understanding of adverse events in healthcare. It has helped to deflect blame from frontline workers and to increase our awareness of upstream factors. Safety studies in pediatric cardiac surgery have used a linear accident model to highlight the importance of teamwork, clear communication, and standardization of some processes. But is a linear accident model such as the Swiss cheese metaphor sufficient to understand safety? Does it accurately and fully describe how things go wrong and, more importantly for future learning, explain why?

Paradoxically, the very simplicity that has given the linear accident model such appeal and resulted in its widespread use may hinder further progress on safety by artificially reducing the complexity of real work. The model reduces complex interactions into artificially simple linear sequences, and hindsight provides a seemingly predictable view of the path to failure. The inherent outcome bias of an accident model further limits progress on safety by labeling behavior and actions as errors and failures, rather than considering alternative explanations for why people behaved as they did with the information available to them at the time.

Using a linear accident model approach, a multi-center study of human factors in the arterial switch operation [3] classified similar events as "minor" or "major" according to outcome. For example, "cognitive tunnel vision on insertion of lines to the neglect of monitoring the ECG screen (where no major event results)" is classified as a minor event, while "delayed diagnosis of a major deterioration in the patient's condition" [3] is listed as a major event. In this case, who determines that tunnel vision and neglect of monitoring occurred? If line insertion was difficult, was the attentiveness of the anesthetist to the task of inserting the lines "tunnel vision", and was it inappropriate? The same behavioral "event" can therefore be classified as either minor or major according to the outcome.

The linear accident model has increased our understanding of upstream contributions to accidents by identifying vulnerabilities throughout the system, rather than implicating only the frontline worker. However, overreliance on the linear accident model may hinder further progress in safety by artificially representing complex processes as linear sequences, and by concluding the investigation at the identification of fault or error rather than considering how observed actions made sense to clinicians at the time. Beyond sequencing observable events and listing problems, a different mental model and different tools are required to develop a meaningful understanding of complex work [22].


Systems Approach to Safety


Faced with increasingly complex systems that stretched the limits of traditional industrial accident models and safety engineering techniques [11], scientific thinking about safety has evolved towards a “systems view”, in which outcomes are seen to emerge from the complex functioning of the system as a whole.