Patient Safety


Thoralf M. Sundt



The term “patient safety” is widely, and sometimes loosely, used today in a variety of contexts and can convey a variety of meanings. The suggestion made over a decade ago that patients were unsafe in our hospitals was initially greeted with skepticism if not hostility, but it has now given way to acceptance as the medical community has come to recognize that there are indeed instances of avoidable harm. With a turn in focus to reducing “adverse events” and “errors,” a veritable “movement” has arisen within the medical community.

The scope of the topic is enormously broad, ranging from equipment design, in which devices are made more intuitive to use through legible, proximity-compatible information displays and ergonomic structure, to process design, which aims to make care processes simpler and more resilient. Less obvious may be issues of teamwork and communication. While all aspects of medical error affect the entire medical community, the emphasis varies among practitioners of different specialties. For pharmacists or medical oncologists, the area of greatest risk, and hence greatest focus, may be avoiding medication error. Surgeons have focused for many years on technical error. As the complexity of surgical care has increased, however, failures of nontechnical skills, such as cognitive errors, communication errors, and lapses in situational awareness, have become more important. We will focus here on these domains in the interest of developing a basis for understanding the interventions proposed for the surgical arena. The potential impact of such interventions is powerful, as disruptions in teamwork and communication have been shown repeatedly to be strongly correlated with errors, outcomes, and even legal actions. Furthermore, the interventions required to improve teamwork and communication are almost entirely within our own personal control.

It is also worthwhile to define our terms carefully. “Error,” as defined in the Institute of Medicine report, is “the failure of a planned action to be completed as intended or the use of the wrong plan to achieve an aim.” As such, no blame is implied, nor is the occurrence of an adverse event requisite. Regardless of the clinical consequences, an error stands as an error just the same. Accordingly, its pathogenesis can be examined, as can any recovery efforts, successful or not. Indeed, many would argue that the most information is to be gained by examining “near misses,” as they are (thankfully) more plentiful than adverse events, and that successful error capture and recovery has more to teach us about resilient teams than uncorrected errors do.

The meaning of the term “adverse event” is self-explanatory: an injury or complication occurring during medical treatment. Adverse events may or may not be avoidable, and they may or may not be related to an error. Adverse events resulting from the patient’s underlying condition have long been a staple of the medical literature, such as the incidence of dialysis-dependent renal failure and its association with comorbidities such as long-standing hypertension or diabetes. To be sure, “patient safety” can be improved via interventions to mitigate that risk as well; more often, however, when reference is made to patient safety efforts, it is in the context of adverse events occurring as a consequence of medical error, particularly those that might be preventable. Of course, not all adverse events lead to lasting or significant harm, and the definition of “preventability” is thorny as well. A narrow perspective may rest on the legal definition of negligence, which addresses whether the care provided meets the standard that can reasonably be expected of an average physician in the same circumstances. By contrast, a more liberal definition of preventability will likely lead to more progress in improving outcomes.

There has been significant argument over whether it is more productive to focus on errors or on adverse events. The argument in favor of the former identifies errors as the ultimate precursors and rests on the presumption that preventing error will reduce adverse events and preventable harm. The difficulty is that errors are so common that simply keeping track of them can be overwhelming. Those who argue in favor of a focus on adverse events suggest that such an approach will concentrate attention on the most serious errors, those most worthy of attention. There can be no real resolution to this argument, as both sides have merit. It is worth noting, however, that one must be clear about definitions at the outset. Improved safety will be the inevitable product of reducing all of these.

Although it was the Institute of Medicine report “To Err is Human,” published in 2000, that raised the topic most prominently to our collective consciousness, the origins of this alternate perspective can be traced back at least as far as Ernest Amory Codman, who challenged the medical community in Boston to adopt what de Leval has called a “forensic” approach to surgical outcomes in contrast to the more traditional “statistical” approach. Codman insisted on reporting of 1-year outcomes, including the reasons why perfection had not been attained. He classified errors and adverse events and queried the role of human, organizational, and equipment factors. This approach was not welcomed in his hometown or nationally. The same theme, however, was taken up by de Leval, initially surrounding the arterial switch operation in his own hands and subsequently in the United Kingdom broadly. That study, reported to the American Association for Thoracic Surgery in 1999, highlighted the impact of even minor errors on outcomes, as well as the role of “Human Factors” in general. He had begun his journey many years previously, as told in his Lancet article “Human Factors in Cardiac Surgery: A Cartesian Dream,” a summary of his Mannheimer lecture, which focused on his own approach to technical error after the death of a child early in his career. In it, he outlines the lessons to be learned from other disciplines, his journey into systems thinking, and his recognition that many factors beyond patient risk factors and surgical skill affect outcomes. Importantly, such a perspective provided him, and us, a window into other opportunities to influence those factors and improve outcomes.



LEARNING FROM OTHERS

Among the fields de Leval encourages us to explore is human factors science. This field incorporates knowledge from cognitive psychology, engineering and industrial design, and ergonomics to understand human capabilities and limitations and their impact on the performance of tasks, be they medical or otherwise. This understanding helps in the design of systems and technologies that reduce the chances of human error. While the origins of process design can be found in 19th-century “time and motion studies” conducted to improve factory efficiency, human factors came into its own as a recognized field during World War II, when aircraft became so remarkably complex that their operation exceeded the cognitive capacity of a single human mind. The result was an effort to design the plane to fit the human rather than vice versa. For example, Alphonse Chapanis demonstrated that shape coding the cockpit controls for the flaps and landing gear reduced landing accidents. It is because of the dramatic accomplishments in the field of aircraft engineering that the analogy between medicine and aviation is cited so frequently, sometimes to the chagrin of physicians. The analogy holds, however, as fundamentally the issue is human performance in complex environments. Closely related is the field of cognitive psychology, including biases, heuristics, and intuitive decision-making.

The relevance of these fields to surgery resides in practical solutions to common human challenges in complex environments, solutions that have been explored and developed elsewhere but can be modified and applied to medical care in general and surgical care in particular. The cognitive aspects are especially applicable to surgery, and to cardiothoracic surgery in particular, as our field demands that a large number of decisions be made on the basis of incomplete information, on a short timeline, and with profound implications for our patients. When things go wrong, we can apply the discipline of accident analysis to identify interventions that might prevent the same events in the future. The applicability to cardiac surgery of the Human Factors Analysis and Classification System, which was employed productively to reduce accidents among military pilots, has been demonstrated.

A nontrivial barrier to the application of forensic analysis and human factors science to health care is the fear of retribution, whether from peers or from the legal system. The former is within our control even if the latter is not. To maximize our ability to learn from errors, we must develop a “blame-free” culture in which errors can be openly discussed. This does not imply the disappearance of accountability but rather a reframing of it. Individuals remain accountable for their own actions, such as willful rule violations or reckless behavior, whereas accountability for systems issues is shared among those responsible for the systems themselves. In some instances, individual accountability is replaced by collective accountability; accountability itself does not disappear. This concept has been termed “Just Culture.”


COMPLEXITY

It is reasonable to ask, “why change from the way we have done things in the past?” Apart from concern about “political correctness,” what is so different about medical care today? Why is a conscious focus on medical error now necessary?

Just as occurred with the rapid technological developments in the military during World War II, the complexity of our environment has increased exponentially in recent years. The body of surgical knowledge, the complexity of our technology, the comorbidities of our patients, and the organizational complexity of medical institutions, including interdisciplinary collaborations, have made medicine a much different field today than it was only a few years ago. The consequences of this increased complexity are profound.

Complexity itself is now a field of scientific endeavor. Complexity theory and systems science focus on the properties of complex systems. Complex, interconnected systems behave as more than the sum of their parts. Complex systems are, by their very nature, nonlinear, sensitive to initial conditions, and self-organizing. The interactions among individual elements lead to unpredictability and unintended consequences. Perrow has argued convincingly, on the basis of studies of nuclear power, petrochemical plants, mines, deep-sea oil drilling operations, and marine commerce, that accidents in highly complex environments are inevitable; they are “Normal.” The risks such accidents pose to the system as a whole depend in significant measure on whether the interactions among system components are tightly or loosely coupled, and our ability to capture them depends on whether the relationships are linear or complex. Ironically, technological “solutions” such as safety systems layered on top of these complex systems may simply make them even more complex and incomprehensible.

The implication, if we wish to continue to make progress, is that we must develop means of dealing with such accidents. The so-called “High Reliability Organizations” (HROs), such as the nuclear power industry or flight-deck operations on aircraft carriers, have done so. HROs are organizations that must function reliably in highly complex environments in which the consequences of error are catastrophic. In such organizations, the emphasis is on error management, not just prevention, and on anticipation of the unexpected; they stress error identification, capture, and recovery, as well as the critical importance of resilience. Complexity is fundamentally best dealt with by decentralized control, and HROs demonstrate this in the manner in which they are structured.


HUMAN COGNITION

Recent years have seen marked advances in our understanding of human cognition, including its strengths and limitations. Metacognition, thinking about thinking, has become the subject of popular books and Nobel Prizes (Kahneman). An understanding of these cognitive processes and their limitations is critical for surgeons, who are regularly called upon to make decisions quickly and under pressure. Uncertainty, time pressure, and the gravity of the decisions increase our vulnerability to error. For these reasons, it is important to understand how our brains work and, perhaps more importantly, when we are at greatest risk of making an error and how that risk can be mitigated. Again, this is an area of opportunity for us to improve outcomes.
