Improving Safety




Abstract


The care of pediatric patients with cardiac disease has improved significantly in the last several decades. However, preventable patient harm, adverse events, and serious safety lapses occur with alarming frequency. Medical professionals in all roles are facing an era in which the general public is demanding ever-increasing information and transparency about the practice and business of health care, and public reporting of various metrics, including cardiac surgical outcomes, has expanded accordingly. Our understanding of how health care system design can negatively impact patient outcomes grows continuously, in part due to innovative work in the field of pediatric cardiac surgery, but much work remains. High-reliability organizations in industries such as aviation have a great deal to teach medicine about safe performance despite extreme risk, and we would do well to embrace some of their pioneering approaches. It is of critical importance that every pediatric provider assume shared responsibility for providing safe care in the particularly complex and high-risk environment of cardiac surgery. Doing so requires the practitioner to understand the basic language of patient safety, to embrace the fundamental concepts of a safety culture, to appreciate that a child with cardiac disease is uniquely at risk for certain preventable adverse events, and to practice with heightened vigilance to mitigate this risk. This chapter aims to introduce the reader to a selection of core concepts and frameworks necessary to provide the safest possible care for pediatric cardiac patients.




Key Words

Preventable patient harm, Safety culture, High-reliability organization, Standardized surgical handoff

 


In 2000 the Institute of Medicine's (IOM) landmark publication To Err Is Human made headlines around the nation with its assertion that nearly 100,000 people died each year in the United States due to medical errors. Crossing the Quality Chasm, published 1 year later, identified safety as the first dimension of health care quality, without which the subsequent dimensions cannot be reliably achieved. In the years since then, appreciation in the medical community of the real scope of preventable patient harm has only deepened. It truly is an epidemic: a 2013 study examining data from 2008 to 2011 suggests that 200,000 to 400,000 Americans suffer premature death each year as a result of preventable medical error. Patient safety and preventable patient harm have become topics of the utmost importance to both the medical community and the general public, particularly as more institution-specific outcome metrics and adverse event data are readily accessible with a simple Internet search.


The morbidity related to medical error in hospitalized pediatric patients is less well quantified, but it is assuredly significant. In a retrospective review of 15 pediatric intensive care units (PICUs) across the United States, Agarwal et al. found that over 60% of PICU patients experienced an adverse event; 45% of these events were deemed preventable, and 10% were either life threatening or permanent. Patients in a PICU are also known to be at increased risk of nosocomial infection. In addition, estimates of adverse drug events in PICU patients range from 22 to 59 per 1000 doses, with up to 11% of those events being of life-threatening severity.


When specifically assessing the risk of preventable medical error for a child with cardiac disease, there is a relative paucity of incidence data. There is agreement that adverse events and medical errors do occur in cardiac surgery, owing to the high workload, the complexity of the tasks involved, and the tendency of management plans to be uncertain and subject to change. It is reasonable to suspect that, given their fragile underlying physiology and the high-risk procedures they often require, pediatric cardiac intensive care unit (PCICU) patients are particularly vulnerable to medical error. Jacques et al. examined the clinical course of 191 patients with hypoplastic left heart syndrome (or its physiologic equivalent) between 2001 and 2011 for errors that could impact patient outcome. The most striking findings were that errors were common overall, with nearly a quarter of patients experiencing a postoperative error (most prevalently a delay in recognizing or managing a clinical scenario, an error relating to airway management or extubation, or failure of attempted delayed sternal closure); that the majority of postoperative errors were considered foreseeable; and that postoperative errors were associated with an increased risk of death or transplant. Pediatric cardiac surgery clearly is a high-risk specialty with a very small margin for error, and the nature of the work (complex procedures taking place within a sophisticated organizational structure, dependent on frequent multidisciplinary collaboration, and tasking individuals with high-level technical and cognitive responsibilities) may lend itself to the human factors engineering and crew resource management approaches long used in aviation. In this vein, Hickey et al. took an innovative approach, applying the "threat and error" model of the National Aeronautics and Space Administration (NASA) to over 500 pediatric cardiac surgical admissions. This work demonstrated that fully half of cases contained one or more errors and that cycles with multiple errors were strongly associated with permanent harmful end states, including residual hemodynamic lesions, end-organ injury, and death.


Given the mounting evidence that hospitalized children are indeed at risk of medical error and its accompanying morbidity, it is paramount that every pediatric provider actively share in the responsibility to improve patient safety. There is perhaps no clinical environment in which this recognition, that improving patient safety is a shared and fundamental obligation, is more critical than in the perilous world of cardiac surgery. This chapter aims to introduce the reader to a selection of essential concepts and frameworks necessary to provide the safest possible care for pediatric cardiac patients, beginning with a review of key definitions and essential principles relating to patient safety and preventable patient harm, followed by a discussion of selected safety issues unique to pediatric cardiac patients.


First, error, adverse event, and other key terms are defined and categorized in a useful taxonomy from original work by Kohn et al. in To Err Is Human (Table 8.1).



TABLE 8.1

Basic Language of Patient Safety

Patient Safety: Freedom from accidental injury; ensuring patient safety involves the establishment of operational systems and processes that minimize the likelihood of errors and maximize the likelihood of intercepting them when they occur.
Adverse Event: An injury resulting from a medical intervention.
Error: Failure of a planned action to be completed as intended or use of a wrong plan to achieve an aim; the accumulation of errors results in accidents.
Active Error: An error that occurs at the level of the frontline operator and whose effects are felt almost immediately.
Latent Error: Errors in design, organization, training, or maintenance that lead to operator errors and whose effects typically lie dormant in the system for lengthy periods of time.
System: A set of interdependent elements interacting to achieve a common aim. These elements may be both human and nonhuman (equipment, technologies, etc.).
Human Factors: The study of the interrelationships between humans, the tools they use, and the environment in which they live and work.

From The Institute of Medicine. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Washington, DC: The National Academies Press; 2000:26.


In the years following the original IOM report, work by Reason and others has proposed that both human and nonhuman factors (i.e., systems), and the interaction of the two, are key in the origin of the majority of medical errors. The concept of an organizational or systems-level analysis of adverse events in health care is fundamental to current approaches to increasing patient safety. Health care providers are likely familiar with Reason's famous "Swiss cheese" model of patient safety, which is based on analysis showing that accidents are rarely the result of individual errors, but rather of multiple errors within a fundamentally flawed system. In a complex system such as health care, both latent errors (due to organizational system or design failures) and active errors (due to an individual's failure) can occur, and the way to protect patients is either to prevent the error from occurring or to prevent the error from causing harm through the application of multiple steps that function as a safety net. The process of verifying and dispensing a medication dose on an inpatient ward is a simple example of how medicine applies this safety-net concept to daily work flow. Both the ordering clinician and a pharmacist independently verify that the dose is correct and appropriate for the patient and does not violate the patient's medication allergy profile. A modern electronic medical record typically has built-in dose maximums and automatic warnings that notify a clinician if the chosen dose falls outside of typical prescribing norms. Often, two nurses also independently verify the medication name, dose, route, patient identifier, and infusion pump settings before administering the medication to the patient. These steps are designed not necessarily to prevent a clinician from ever inadvertently ordering an incorrect medication dose (which would represent a focus on active error) but to reduce the likelihood that an incorrect dose will ever reach and harm a patient via intentional system redundancies and double checks (a contrasting focus on latent error). This model has become hugely popular as a model of accident causation in many industries, including health care, and it does offer useful constructs for understanding the constant interplay between individual humans and larger organizational systems and how each may contribute to adverse events. There is, however, debate in the literature about its validity, particularly regarding its potential to oversimplify events and the concern that it has swung the pendulum too far toward placing responsibility for accidents or errors on senior management rather than on individuals at the "front lines."
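To make the layered-defense idea concrete, the short sketch below models the three independent checks described above as separate functions and allows administration only when every layer agrees. It is a minimal illustration only: the drug name, dose limit, and patient record are hypothetical, and a real system would draw this information from the formulary and the electronic medical record.

# Minimal sketch of layered ("Swiss cheese") medication-dose checks.
# The drug, dose limit, and order below are hypothetical illustrations,
# not real prescribing guidance.

DOSE_LIMIT_MG_PER_KG = {"gentamicin": 7.5}  # assumed daily maximum for the example

def order_entry_check(drug, dose_mg_per_kg):
    """Layer 1: electronic order entry rejects doses above the built-in maximum."""
    limit = DOSE_LIMIT_MG_PER_KG.get(drug)
    return limit is not None and dose_mg_per_kg <= limit

def pharmacist_check(drug, dose_mg_per_kg, allergies):
    """Layer 2: a pharmacist independently verifies the dose and the allergy profile."""
    return order_entry_check(drug, dose_mg_per_kg) and drug not in allergies

def bedside_double_check(order, prepared_dose):
    """Layer 3: two nurses confirm drug, dose, route, and patient identifier match the order."""
    return order == prepared_dose

order = {"patient_id": "A123", "drug": "gentamicin", "dose_mg_per_kg": 7.0, "route": "IV"}
layers = [
    order_entry_check(order["drug"], order["dose_mg_per_kg"]),
    pharmacist_check(order["drug"], order["dose_mg_per_kg"], allergies=set()),
    bedside_double_check(order, prepared_dose=dict(order)),
]
# The dose reaches the patient only if every independent layer agrees;
# a failure caught at any single layer stops the error from propagating.
print("administer" if all(layers) else "hold and review")

Each layer is deliberately redundant with the others; no single check is trusted to catch every error, which is the essence of the latent-error focus described above.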


Although the definitions and mental models described earlier are a useful starting point, readers should also heed a note of caution about the imperfect standardization of the language of patient safety. As increasing attention has been paid to the frequency of adverse events in pediatric patients, it has become apparent that the definitions of adverse event and preventable adverse event and the ability of teams to consistently evaluate for “preventability” vary significantly. Intriguing approaches to better define and identify adverse events using “trigger tool” methodology and targeted retrospective chart review have been piloted, but at this time much of what we know about incidence of preventable harm in hospitalized pediatric patients comes from incident reporting systems, which are subject to underreporting and other limitations.




Embedding a Culture of Safety Into Pediatric Cardiac Intensive Care


Lessons From “High-Reliability Organizations”


Embracing the concept of a "culture of safety" is fundamental to institution- or unit-level efforts to reduce preventable patient harm. The Agency for Healthcare Research and Quality (AHRQ) reports that the term culture of safety originated with high-reliability organizations (HROs), organizations that operate with high potential for error yet experience few adverse outcomes, the term typically referring to the nuclear power and aviation industries. In its current state, medicine is alarmingly far from establishing margins of safety comparable to these HROs. As a striking comparison from Weick and Sutcliffe, the 400,000 people who die annually as a result of hospital-associated preventable harm are the equivalent of two 747 passenger jets crashing every day, every year. Numbers like these would bring air travel to a grinding halt, yet health care continues largely unaffected.
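As a rough arithmetic check of that comparison (a back-of-the-envelope sketch that assumes roughly 500 passengers aboard a full 747, a figure not given in the chapter):

# Back-of-the-envelope arithmetic behind the two-747s-per-day comparison.
# The passenger count per jet is an assumption for the estimate.
annual_preventable_deaths = 400_000
passengers_per_747 = 500        # assumed capacity of a fully loaded jumbo jet
crashes_per_day = 2

equivalent_annual_toll = crashes_per_day * passengers_per_747 * 365
print(equivalent_annual_toll)   # 365,000, on the same order as 400,000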


HROs are defined by Weick and Sutcliffe as sharing the core characteristics listed in Box 8.1, on a foundation of mutual trust.



Box 8.1

Characteristics of High-Reliability Organizations


Preoccupation with failure (being highly aware of all error and potential for error)


Reluctance to simplify (understanding and appreciating the complexity of the work)


Sensitivity to operations (awareness of the work being done on the front lines)


Commitment to resilience (having the capacity to identify, contain, and improve from error)


Deference to expertise (allowing frontline workers to make decisions, avoid rigid hierarchies)


Modified from Hershey K. Culture of safety. Nurs Clin North Am. 2015;50:139-152; Weick KE, Sutcliffe KM. Managing the Unexpected: Assuring High Performance in an Age of Complexity. San Francisco: Jossey-Bass; 2001.


When evaluating whether the concepts of HROs can translate fully to medicine, one should acknowledge an important limitation at the outset. A core principle of reliability is a focus on defects (errors or adverse events) that can be measured as rates (defects as the numerator, the population at risk as the denominator) and that are free from reporting bias. In applying reliability to health care, this focus translates well to problems that have clear operational definitions and that occur at discrete points in time, such as central line-associated bloodstream infections. In truth, most patient safety issues do not lend themselves to measurement in this manner. This distinction may be part of why the success of efforts to transform medicine into an HRO has been somewhat limited to date.
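As an illustration of a defect measured as a rate with a clear denominator, the sketch below computes a central line-associated bloodstream infection rate per 1000 catheter-days; the event count and exposure figures are invented for the arithmetic.

# Illustration of a defect measured as a rate: infections (numerator) over the
# population at risk, expressed as catheter-days (denominator). Counts are invented.
clabsi_events = 3
central_line_days = 2150

rate_per_1000_line_days = clabsi_events / central_line_days * 1000
print(f"{rate_per_1000_line_days:.2f} infections per 1000 central-line days")  # 1.40

Contrast this with a concern such as a communication breakdown during a handoff, for which neither a numerator nor a denominator is readily defined; that is the limitation described above.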


A great deal may be gained for health care, however, by understanding the organizational culture at the heart of HROs. Fundamentally, an HRO is a system that has developed a culture sensitive to safety, one that enables employees to maintain a low probability of adverse events despite unpredictable threats. Within an HRO there is an expectation that employees routinely question practices and search for anomalies that may create risk for error, refuse to oversimplify safety issues, work collaboratively and in deference to expertise rather than to a rigid organizational hierarchy, and create solutions when error does occur; in essence, reliability is viewed as a continuous, ongoing, and active pursuit rather than a simple numeric measure of past performance. Embedding this culture into the practice of medicine has real potential to change patient outcomes for the better. Roberts et al. published a compelling case report of a sustained decrease in mortality and serious safety events in a tertiary care PICU after adoption of HRO principles, followed by a recrudescence of such adverse events after a leadership change in the unit and abandonment of the HRO approach. Anesthesia may also be particularly suited to the introduction of the HRO model to reduce serious safety events.


Clearly there are differences between aviation and health care that may require alterations in HRO-based methodology. Scheduled operation of a machine that is assumed to be functioning at peak performance is quite distinct from guiding an unexpectedly deteriorating human being from illness to health. However, the principle of high reliability—ability to perform with minimal adverse events despite high risk—is something that health care should certainly endeavor to embody. Indeed, if “the only realistic goal of safety management in complex health-care systems is to develop an intrinsic resistance to its operational hazards,” HROs can provide a road map for building this intrinsic resistance. The existing body of evidence for implementing HRO principles in the practice of medicine, though small, mandates our attention as we strive to reduce adverse outcomes for our patients.


Defining and Building a Culture of Safety for Health Care


Specific to health care, the Joint Commission has defined safety culture as “the summary of knowledge, attitudes, behaviors, and beliefs that staff share about the primary importance of the well-being and care of the patients they serve, supported by systems and structures that reinforce the focus on patient safety.” Several key themes emerge when reviewing literature on how to construct a safety culture in health care.


First, there is a clear emphasis on examining medical errors through the lens of health care systems and how systems and individual workers intersect in ways that may either predispose to, or protect from, error. Returning to Reason's foundational work, individual error is referred to as active error and is committed by a frontline health care worker at the so-called sharp end of health care, whereas systems error is latent error, originating from someone or something remote from direct patient care (such as managers, system designers, or administrators). According to Reason, systems-based or latent errors are the greatest threat to complex industries like health care and are the root cause of most error. There is certainly a growing body of evidence that nontechnical errors are more prevalent in health care delivery than technical errors and that the majority are driven by communication breakdowns or by problematic team dynamics. Catchpole et al. found evidence that the primary threat to quality in pediatric cardiac surgery is error related to cultural and organizational failures. Indeed, some go so far as to suggest that "the need to implement effective health care organizing has become as pressing as the need to implement medical breakthroughs." Health care is a dynamic system, with a basic structure and organization into which individual workers bring their own attitudes, behavior, and knowledge. Each continuously influences the other, and a focus solely on individual workers as the cause of medical error will be less effective than a strategy that acknowledges the critical interplay between systems and individuals that constantly occurs during patient care.


Second, Chassin and Loeb propose three central attributes of a safety culture that reinforce one another: trust, report, and improve. Team members must trust their colleagues and their management structure in order to feel safe speaking up about unsafe conditions that may endanger patients. Trust is strengthened when frontline workers see that improvements have been made based on their concerns. Unfortunately, trust is not a given in all health care systems. The 2013 National Healthcare Quality Report found that many health care workers still believe that mistakes will be held against them. In the same report, half of respondents reported no adverse events at their facility in the preceding year, a number that seems implausibly low and raises concern that fear of blame may lead to underreporting of medical error and continued risk to patients. A culture of blame, in which fear of criticism or punishment fosters an unwillingness to take risks or accept responsibility for mistakes, simply can no longer be tolerated in a health care system striving to improve patient safety. A focus on blame perpetuates silence in the face of near misses and performance problems, ensuring that patients continue to be at risk of preventable harm; a just culture, in contrast, provides a supportive environment in which workers can question practices, express concerns, and admit mistakes without suffering ridicule or punishment.


Underlying issues like trust and fear of blame is the larger construct of communication within health care systems. Communication barriers are among the biggest safety challenges that critical care teams face. As these barriers have become better studied and understood, the old paradigm of the physician as the unquestioned captain of the team is, in safety-focused health care environments, gradually giving way to communication strategies that prioritize care delivery over hierarchy. For example, family-centered rounds include parents and caregivers in daily discussions of the patient's progress and plans and have been shown to improve family satisfaction, discharge planning, and communication. Including a daily goals sheet or checklist during rounds improves team cohesiveness, helps tailor daily care plans to the specific needs of each patient, prompts regular review of simple but important safety items such as central venous catheter duration or the need for venous thromboembolism prophylaxis, and has been shown in some studies to decrease intensive care unit (ICU) length of stay. Employing structured communication frameworks to convey changes in patient status or clinical concerns, such as the popular "SBAR" (situation, background, assessment, recommendations) tool, can improve situational awareness, reduce problems related to organizational hierarchy and differences in experience, and improve collaboration between nurses and physicians.
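As one way to picture a structured handoff, the sketch below encodes the four SBAR fields in a simple template; the clinical details are invented solely to illustrate the structure and are not a recommendation for any particular patient.

# Minimal sketch of an SBAR-structured communication. The clinical content is
# invented purely to illustrate the four fields of the framework.
from dataclasses import dataclass

@dataclass
class SBARReport:
    situation: str       # what is happening with the patient right now
    background: str      # relevant history and clinical context
    assessment: str      # what the communicator believes the problem is
    recommendation: str  # what the communicator asks the listener to do

call_to_clinician = SBARReport(
    situation="Postoperative cardiac patient with a rising lactate over the past hour",
    background="Stage 1 palliation earlier today; on milrinone and low-dose epinephrine",
    assessment="Concerned for inadequate systemic perfusion",
    recommendation="Please evaluate at the bedside now and advise on further workup",
)

for field_name, text in vars(call_to_clinician).items():
    print(f"{field_name.upper()}: {text}")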


Finally, the concept of accountability is also fundamental to a safety culture in health care. Workers must feel empowered to hold not only themselves but also their coworkers to shared high standards in a manner that engenders transparency rather than attempts to assign blame. For example, the Joint Commission Center for Transforming Healthcare focuses on accountability as a key strategy to improve hand hygiene performance, a simple practice that is known to reduce incidence of hospital-associated infections yet one for which compliance rates are only around 40%. Many hospitals have implemented programs that encourage any observer (including patients and families) to speak up if they note that hand hygiene was not performed before patient care. More broadly, safety event reporting systems discussed in the next section provide a method for concerned team members to report issues of various types for review. Actively giving and openly receiving feedback on issues relating to patient safety is an essential part of a safety culture in health care.


Safety Reporting Systems and Approaches to Analyzing Patient Safety Events


Patient safety incident reporting systems are now common in hospitals, increasingly embedded into electronic medical records or Web-based technology, and are fundamental to detecting safety events. AHRQ proposes four key elements for an effective safety event reporting system (Box 8.2).



Box 8.2

Key Components of an Effective Event-Reporting System


  • The institution must have a supportive environment for event reporting that protects the privacy of staff who report occurrences.




  • Reports should be received from a broad range of personnel.



  • Summaries of reported events must be disseminated in a timely fashion.



  • A structured mechanism must be in place for reviewing reports and developing action plans.



Reprinted with permission of AHRQ PSNet. Key Components of an Effective Event Reporting System. Reporting Patient Safety Events. Patient Safety Primers. AHRQ Patient Safety Network Web Site. Available at: https://psnet.ahrq.gov/primers/primer/13 .


Error-reporting systems can take many forms—voluntary or mandatory disclosures of events as they occur, automated surveillance, or chart review. Voluntary and mandatory reporting systems are common. Advantages of these kinds of systems are that they often permit any type of health care worker, regardless of position, to make a report, and that they may remove fear of punishment for speaking up by allowing reporting to be anonymous. Limitations include recall bias and underreporting—the latter often due to perception that little or no follow-up will occur after a report is made.
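A minimal sketch of the kind of record a voluntary reporting system might capture, reflecting the Box 8.2 elements (reports accepted from any role, reporter privacy protected, and a field that makes follow-up visible); all field names and the example entry are assumptions for illustration, not a particular vendor's schema.

# Minimal sketch of a voluntary incident report record. Field names and the
# example entry are assumptions for illustration, not a specific product's schema.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    reporter_role: str                  # any staff role may report; no name is stored
    event_description: str
    harm_reached_patient: bool
    contributing_factors: list = field(default_factory=list)
    reported_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    review_status: str = "pending"      # updated so reporters can see timely follow-up

report = IncidentReport(
    reporter_role="bedside nurse",
    event_description="Infusion pump programmed at ten times the intended rate; "
                      "caught during the independent double check",
    harm_reached_patient=False,
    contributing_factors=["look-alike concentrations", "interruption during programming"],
)
print(report.review_status)  # a structured review process should move this to "closed"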


The perception that reports go nowhere highlights that having a system in place for reporting safety events accomplishes little if it is not paired with a robust method for analyzing and addressing the content of the reports. What follows is a brief discussion of a few selected approaches to safety event review with which the pediatric cardiac intensivist should be familiar.


Root cause analysis (RCA) and apparent cause analysis (ACA): RCA is a commonly used, formally structured approach to safety event analysis that originated in the investigation of industrial accidents and is now widely applied to health care. It is a retrospective, systems-based method for identifying both active and latent error. RCAs typically start with data collection, followed by a detailed reconstruction of how the events leading up to the event in question occurred (the active errors) and why they occurred (the latent errors), with the end goal of eliminating the latent errors so that the adverse outcome cannot recur. The Joint Commission has mandated that RCAs be performed for sentinel events since 1997. Although RCAs are widely used, evidence for their effectiveness is fairly limited, and there is concern that the significant resources required to carry them out are not balanced by the results they yield, given that follow-up and corrective actions are often inconsistently implemented and vary widely across institutions. Related to the concept of RCA is ACA, a more limited investigation employed for less severe adverse events. ACAs may be completed more quickly and by a broader range of staff members than RCAs, but as with RCAs, their impact depends on the quality and rigor of the follow-up.


Failure modes and effects analysis (FMEA): Originating from engineering, FMEA is, in contrast to RCA, a prospective process that uses five steps to identify potential vulnerabilities in a health care process and to subsequently test the proposed solutions to ensure no new or continued risk to patients. The basic steps for FMEA in health care consist of defining a topic (a process or situation thought to represent a potential safety risk), assembling a multidisciplinary team, graphically describing the process with a flow diagram, conducting a hazard analysis (reviewing any/all ways in which the process in question may fail and compromise patient outcomes), and finally developing actions and outcome measures. The FMEA model has been associated with successful reduction of postanesthesia complications, improved safety in radiology departments, decreased error in chemotherapy orders, and safety gains in many other components of health care delivery.
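One common way to operationalize the hazard-analysis step is to score each failure mode for severity, likelihood of occurrence, and difficulty of detection, then rank by the product (a risk priority number). That scoring convention comes from the engineering FMEA literature rather than from this chapter, and the failure modes and scores below are invented for illustration.

# Sketch of the FMEA hazard-analysis step using a risk priority number
# (severity x occurrence x detectability), a convention borrowed from
# engineering FMEA practice. The failure modes and scores are invented.
failure_modes = [
    # (description, severity 1-10, occurrence 1-10, detectability 1-10; 10 = hardest to detect)
    ("Heparin concentration mis-selected while priming the bypass circuit", 9, 3, 6),
    ("Chest tube output not totaled across a shift change",                 6, 5, 4),
    ("Temporary pacing wires mislabeled at the OR-to-ICU handoff",          8, 2, 7),
]

scored = [(description, sev * occ * det) for description, sev, occ, det in failure_modes]
for description, rpn in sorted(scored, key=lambda item: item[1], reverse=True):
    print(f"RPN {rpn:>3}  {description}")  # highest-priority vulnerabilities first

The team would then direct corrective actions and outcome measures at the highest-ranked failure modes, closing the loop described in the final FMEA step.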


Structured morbidity and mortality reviews: This is a general categorization of a helpful construct—that of approaching traditional “M&M” conferences with a specific structure to better uncover and deconstruct issues underlying serious safety breaches. A growing body of evidence suggests a structured morbidity and mortality conference can be a driver of quality improvement initiatives and practice changes that increase patient safety. This objective can be accomplished in many ways; the reader is directed to resources on the specifics of two selected examples, the Learn From Defects Tool and Ishikawa diagrams, for detailed description of these methods and how to implement them.


Threat and error management: Edward Hickey has suggested an intriguing approach to preventable patient harm that draws direct lessons from the safety culture of the airline industry. Aviation experts recognize and accept that error is "ubiquitous, inevitable, and needs to be managed"; trained observers of over 3500 commercial flights have concluded that 80% contain error. This model stands in contrast to the traditional tendency of the medical profession to underestimate the frequency of errors, to view errors as stemming from personal failure, and to resist making errors and their impacts transparent to the public. Threat and error management therefore encourages medical teams to actively seek out error and to review all patient cases, rather than focus only on morbidities and mortalities. Hickey's group instituted a model of real-time assessment of every pediatric cardiac surgical patient that included a combination of third-party review of active clinical management, weekly discussion of each patient in an open forum, and preoperative completion of a "flight plan." The flight plan frames how the medical team views each patient's hospital course: it describes the potential threats for that specific case and the operative intentions, and it models the patient's projected journey from the operating room (OR) to the ICU to discharge, analogous to an aircraft's intended flight plan. When 524 consecutive patient "flights" were analyzed, 70% contained threats; 66% contained consequential errors; and 60% of consequential errors led to a chain of further error and progressive deviation from the ideal flight plan. These findings suggest that, as in aviation, it is these chains of events that lead to progressive loss of safety margins and increasing danger of adverse outcomes. Halting such a chain requires the ability to recognize that one is in such a cycle and an active effort to rescue the situation using the principles of crew resource management, the nontechnical skills that are mandatory for airline pilots and crews but which medicine has yet to embed into training or practice.
