Postinjury Hemotherapy and Hemostasis
The ponderous literature on the subject of hemostasis could perhaps be considered a classical example of the infinite ability of the human mind for abstract speculation. For several years, the number of working theories of the hemostatic mechanism greatly exceeded and not always respected the confirmed experimental facts. In recent years, however, the revived interest in this field has led to an accumulation of new findings which has been almost too rapid for their orderly incorporation into a logical working pattern. As a result, we have rapidly gone from a state of “orderly ignorance” to one of “confused enlightenment,” from which we have not emerged as yet.
Mario Stefanini, April 19541
The first recorded blood product transfusion to a human being occurred in 1667 in France and involved transfusion of approximately three tablespoons of whole blood from a calf to a man who was suffering from insanity.2 The physician performing the transfusion postulated that the calm temperament of the calf would transfer to the patient via its blood. The procedure was well tolerated, although the patient developed severe flank pain and tar-colored urine following the subsequent three transfusions: the first recorded evidence of immune-mediated hemolysis, albeit unbeknownst to the physician at the time.
Although transfusion medicine has undergone enormous development since this sentinel event, as summarized by Stefanini, important gaps in scientific knowledge persist, and several fundamental issues involving hemotherapy following major trauma remain controversial. There has been an explosion in the science of hemostasis, with a resultant revolution in all aspects of the care of the coagulopathic trauma patient. Our understanding of the mechanisms of coagulation has shifted from that of a simple enzymatic cascade to a cell-based paradigm, in which endothelium, erythrocytes, leukocytes, and platelets interact to coordinate a delicate balance between thrombosis and fibrinolysis. Furthermore, both an early postinjury endogenous coagulopathy associated with traumatic shock and a myriad of secondary factors that exacerbate this condition have been elucidated. Diagnosis of coagulopathy is shifting from the routine use of laboratory tests designed to monitor anticoagulation therapy toward point-of-care testing, which provides essential real-time clinical correlates. Treatment algorithms of traumatic coagulopathy have emphasized early replacement of both clotting factors and platelets with concomitant restraint of crystalloid administration (termed damage control resuscitation), as well as pharmacologic adjuncts that exploit the endogenous coagulation system. On the other hand, documentation of the deleterious effects of overzealous blood component replacement has led to a reevaluation of this strategy, in an attempt to reach a balance between abatement of coagulopathy and minimization of subsequent organ dysfunction. This chapter will attempt to synthesize recent developments in the complex management of the bleeding, coagulopathic trauma patient.
RED BLOOD CELL TRANSFUSION
Red blood cell (RBC) transfusion is lifesaving in the face of critical anemia associated with hemorrhagic shock. However, the optimal target hematocrit during resuscitation remains unknown. Shock is defined broadly as the development of an oxygen debt due to impaired delivery, utilization, or both, with resultant anaerobic metabolism and organ dysfunction. Elimination of this oxygen debt involves optimization of oxygen delivery, which is the product of cardiac output and arterial oxygen content. The arterial oxygen content, in turn, is dependent primarily on the hemoglobin concentration and oxygen saturation. Oxygen consumption, defined as the product of the cardiac output and the difference between the arterial and venous oxygen content, represents a more specific marker of oxygen availability at the cellular level.
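The delivery and consumption relationships described above can be made concrete with a short numerical sketch. Python is used here purely for illustration; the patient values are hypothetical, and the coefficients (1.34 mL O2 per g hemoglobin, 0.0031 mL O2/dL per mm Hg dissolved) are the standard physiologic constants.

```python
def arterial_o2_content(hb_g_dl, sat_fraction, po2_mm_hg):
    """O2 content (mL O2/dL): hemoglobin-bound plus dissolved oxygen."""
    return 1.34 * hb_g_dl * sat_fraction + 0.0031 * po2_mm_hg

def o2_delivery(cardiac_output_l_min, cao2_ml_dl):
    """DO2 (mL O2/min) = cardiac output x arterial O2 content (x10 converts dL to L)."""
    return cardiac_output_l_min * cao2_ml_dl * 10

def o2_consumption(cardiac_output_l_min, cao2_ml_dl, cvo2_ml_dl):
    """VO2 (mL O2/min) = cardiac output x (arterial - venous O2 content)."""
    return cardiac_output_l_min * (cao2_ml_dl - cvo2_ml_dl) * 10

# Hypothetical patient: Hb 10 g/dL, SaO2 98% (PaO2 90), SvO2 70% (PvO2 40), CO 5 L/min
cao2 = arterial_o2_content(10, 0.98, 90)   # ~13.4 mL O2/dL
cvo2 = arterial_o2_content(10, 0.70, 40)   # mixed venous content, ~9.5 mL O2/dL
do2 = o2_delivery(5, cao2)                 # ~670 mL O2/min
vo2 = o2_consumption(5, cao2, cvo2)        # ~195 mL O2/min
```

Note how the hemoglobin-bound term dominates the arterial oxygen content, which is why anemia, rather than dissolved oxygen, drives delivery during hemorrhage.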
During resuscitation, a balance must occur between the competing goals of maximal oxygen content (hematocrit = 100%) and minimal blood viscosity (hematocrit = 0%). Furthermore, irrespective of hematocrit, the oxygen-carrying capacity of transfused allogeneic erythrocytes is impaired due to storage-induced changes in both deformability and hemoglobin oxygen affinity. Accordingly, although many studies have measured an increase in oxygen delivery following transfusion of allogeneic RBCs, almost none have reported an increase in oxygen consumption.3 Finally, beyond a role in oxygen delivery, erythrocytes are integral to hemostasis via their involvement in platelet adhesion and activation, as well as thrombin generation. The hematocrit is thus relevant to hemorrhagic shock as it relates to both oxygen availability and hemostatic integrity.
Early canine models of hemorrhagic shock suggested that oxygen consumption is optimized at a relatively high hematocrit (range 35–42%).4 However, hematocrit variation was achieved via autotransfusion of the animal’s shed whole blood, eliminating the aforementioned limitations of allogeneic erythrocytes, and rendering the results inapplicable to modern resuscitation of hemorrhagic shock. Furthermore, acute normovolemic hemodilution of dogs to a hematocrit of 10% is well tolerated, with little decrement in oxygen delivery secondary to a compensatory increase in cardiac output.5
Retrospective observations among critically ill surgical patients in the 1970s suggested a hematocrit of 30% as optimal for both oxygen-carrying capacity and survival.6 Such studies formed the basis of the traditional recommendation to maintain the hematocrit >30%, although the marked limitations of this retrospective literature were ultimately recognized. As the deleterious effects of RBC transfusion became increasingly evident, interest in the ideal transfusion trigger was renewed. The Transfusion Requirements in Critical Care (TRICC) Trial, which compared restrictive (hemoglobin <7.0 g/dL) and liberal (hemoglobin <9.0 g/dL) transfusion triggers among 838 patients, provided the first level I evidence regarding RBC transfusion strategies among the critically ill.7 Although inclusion criteria did not specify ongoing resuscitation, 37% of patients were in shock at the time of enrollment, as evidenced by the need for vasoactive drugs. No difference in 30-day mortality was observed between groups. However, in-hospital mortality, as well as mortality among less severely ill patients (Acute Physiology and Chronic Health Evaluation II score <20) and younger patients (age <55 years), was significantly lower in the restrictive transfusion group. Current evidence thus suggests that a hemoglobin concentration of >7 g/dL is at least as well tolerated as a hemoglobin concentration of >9 g/dL among critically ill patients.
It is possible that hemoglobin concentrations below 7 g/dL are safe, particularly in younger patients. However, a hemoglobin concentration of 5 g/dL appears to be the threshold for critical anemia. Whereas hemodilution of healthy volunteers as low as a hemoglobin concentration of 5 g/dL is well tolerated,8 a study of postoperative patients who refused RBC transfusion reported a sharp increase in mortality below this same hemoglobin concentration.9 Such populations differ fundamentally from the multiply injured, exsanguinating patient in need of resuscitation. However, these data are provocative, and future large-scale trials of lower transfusion triggers for the resuscitation of hemorrhagic shock are warranted in light of the accumulating evidence documenting the untoward effects of RBC transfusion.
In addition to oxygen transport, RBCs play an important role in hemostasis. As the hematocrit rises, platelets are displaced laterally toward the vessel wall, placing them in contact with the injured endothelium; this phenomenon is referred to as margination. Platelet adhesion via margination appears optimal at a hematocrit of 40%.10 Erythrocytes are also involved in the biochemical and functional responsiveness of activated platelets. Specifically, RBCs increase platelet recruitment, production of thromboxane B2, and release of both ADP and P-thromboglobulin. Furthermore, RBCs participate in thrombin generation through exposure of procoagulant phospholipids. Interestingly, animal models suggest that a decrease of the platelet count of 50,000 is compensated for by a 10% increase in hematocrit.11 Despite these experimental observations, no prospective data exist detailing the relationship between hematocrit, coagulopathy, and survival among critically injured trauma patients.
In summary, prior investigations into the ideal hematocrit for oxygen-carrying capacity during hemorrhagic shock are in large part irrelevant to modern-day resuscitation with allogeneic blood. Banked erythrocytes are subject to a time-dependent diminution of oxygen-carrying capacity, and the effect of blood transfusion on oxygen consumption, regardless of hematocrit, remains questionable. The TRICC trial suggested that patients in shock tolerate a hemoglobin concentration of 7.0 g/dL at least as well as 9.0 g/dL, although this hypothesis was not tested during the initial resuscitation of hemorrhagic shock specifically. Furthermore, the role of erythrocytes in hemostasis must be considered. In practice, clinical circumstance (e.g., ongoing hemorrhage with hemodynamic instability and coagulopathy), as opposed to an isolated laboratory measurement, should inform the decision to transfuse. However, until there is definitive evidence to challenge the TRICC data, a hemoglobin concentration of <7 g/dL should be considered the default transfusion trigger for resuscitation from shock.
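The default strategy argued for here can be sketched as a simple decision rule. This is a hypothetical helper for illustration only, not a clinical algorithm: the 7 g/dL threshold is the TRICC restrictive cutoff, and the override arguments encode the point that clinical circumstance, not the isolated laboratory value, should drive the decision.

```python
def transfusion_indicated(hb_g_dl, active_hemorrhage=False,
                          hemodynamically_unstable=False):
    """Restrictive default trigger (Hb < 7 g/dL), overridden by clinical
    circumstance such as ongoing bleeding with hemodynamic instability."""
    if active_hemorrhage and hemodynamically_unstable:
        # The clinical picture, not the lab value, drives the decision here.
        return True
    return hb_g_dl < 7.0
```

For example, a stable patient with a hemoglobin of 8 g/dL would not meet the default trigger, whereas the same laboratory value in an actively bleeding, unstable patient would.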
POSTINJURY COAGULOPATHY PERSPECTIVE
Uncontrolled hemorrhage is the leading cause of preventable morbidity and mortality following trauma. Hemorrhage is responsible for nearly one half of all trauma deaths, and is the second leading cause of early death, preceded only by central nervous system injury.12 Most hemorrhagic deaths occur within the first 6 hours postinjury, and require tremendous resource mobilization in terms of blood component therapy. Although most life-threatening hemorrhage originates as major vascular injury that is amenable to either surgical or angiographic control, a diffuse coagulopathy frequently supervenes.
Originally described over 60 years ago,1 postinjury hemorrhage that persists despite control of surgical bleeding has been referred to by many names, including medical bleeding, diffuse bleeding diathesis, posttransfusion bleeding disorder, medical oozing, and disseminated intravascular coagulation (DIC). Clinically, the coagulopathy is manifest as nonsurgical bleeding from mucosal lesions, serosal surfaces, and wound and vascular access sites that continues after control of identifiable vascular bleeding. Although postinjury coagulopathy has long been recognized, several authors have struggled to elucidate both its predictors and mechanisms. Reporting on a large cohort of combat casualties, Simmons et al. appropriately identified the relationship between major trauma and coagulopathy, but were unable to predict coagulopathy using a myriad of both clinical and laboratory parameters.13 In 1982, our group described the “bloody vicious cycle,” in which the synergistic effects of acidosis, hypothermia, and coagulopathy combined to create an irreversible clinical deterioration among patients who had received large-volume blood transfusion, eventuating in death by exsanguination despite surgical control of bleeding.14 In the late 1980s, Lucas and coworkers from Wayne State University detailed the relationship among large-volume blood transfusion, decrement in clotting factor concentrations, and the corresponding prolongation of traditional measures of coagulopathy, such as the prothrombin time (PT) and activated partial thromboplastin time (aPTT).15 Development of coagulopathy following massive transfusion (MT), which was postulated to be secondary to both consumption and dilution of clotting factors, was similarly unable to be predicted by either clinical or laboratory parameters.16 Most recently, evidence of an endogenous coagulopathy associated with severe traumatic injury has emerged, which occurs early and is independent of the secondary effects of body temperature, acidosis, and clotting factor consumption or dilution.17
The burden of postinjury coagulopathy on the severely injured trauma patient is enormous. Overt coagulopathy affects at least one in four seriously injured patients and is associated independently with increased mortality.18 In a large series from our institution, over one half of deaths due to exsanguination occurred after control of surgical bleeding and were thus due to coagulopathy.14 Persistent hemorrhage despite surgical control of bleeding remains the most common reason for abandonment of definitive repair of injuries (damage control surgery). Finally, patients who develop postinjury coagulopathy nearly universally require MT of blood products, placing an incalculable financial burden on both institutional and national health care delivery systems.
CELL-BASED COAGULATION CONSTRUCT
Effective management of postinjury coagulopathy requires an understanding of the coagulation process. Hemostatic integrity involves an intricate balance between hemorrhage and thrombosis, achieved in concert by complex interactions between the anticoagulant, procoagulant, and fibrinolytic systems. The inciting event for thrombosis following injury is the exposure of tissue factor (from both the subendothelium and mononuclear cells) to circulating clotting factors. From this point forward, multiple enzymatic cascades, orchestrated by a myriad of cells, direct the balance of thrombosis and hemorrhage based on both substrate availability and the status of global tissue perfusion (i.e., shock). Whereas clotting factors exist in concentrations sufficient to maintain hemostasis in health, major trauma overwhelms the capacity of the coagulation system, with resultant systemic thrombosis and hemorrhage. For example, an isolated lobar pulmonary contusion may involve a surface area large enough to exhaust the body’s endogenous fibrinogen and platelet reserves. Major proteins involved in the procoagulant, anticoagulant, and fibrinolytic systems are listed in Table 13-1.
TABLE 13-1 Proteins Involved in Coagulation, Anticoagulation, and Fibrinolysis
The coagulation process has been considered traditionally a cascade of proteolytic reactions occurring in isolation. In this classic view of hemostasis (extrinsic and intrinsic pathways), the cell surface serves primarily to provide an anionic phospholipid region for procoagulant complex assembly. Whereas this model is supported by traditional laboratory tests of isolated coagulation in a test tube, it does not correlate with current concepts of hemostasis occurring in vivo.
This antiquated model has been supplanted by the cell-based model (CBM) of coagulation. This model recognizes the important contributions of both cellular and plasma components to clot formation, as opposed to the more simplistic schema of the classic view. The CBM suggests that procoagulant properties result from expression of a variety of cell-based features, originating at the endothelial level, including protein receptors, which activate components of the coagulation system at specific cell surfaces. Furthermore, this model allows for improved understanding of, and potential mechanistic links underlying, the cross-talk between inflammation and coagulation. In addition, platelet receptors, endothelial cells, proteases, cytokines, and phospholipids have important roles in coagulation. This model also incorporates RBCs and their aforementioned interactions with the hemostatic process. The CBM occurs in three overlapping phases: initiation (which occurs on tissue factor–bearing cells), amplification, and propagation. Amplification and propagation involve platelet and cofactor activation eventuating in the generation of massive amounts of thrombin, known as the thrombin burst. Both amplification and propagation occur on the cell surface of platelets, underscoring the central role of the platelet in the hemostatic process.
In summary, the CBM represents a major paradigm shift from a theory that views coagulation as being controlled by concentrations and kinetics of coagulation proteins to one that considers the process to be driven by diverse cellular interactions. Coagulation factors work as enzyme/cofactor/substrate complexes on the surface of activated cells, and hemostasis requires the interaction of endothelium, plasma proteins, platelets, and RBCs.
HEMOSTASIS MANAGEMENT CONTROVERSIES
Acute Coagulopathy of Trauma
Coagulation disturbances following trauma follow a trimodal pattern, with an immediate hypercoagulable state, followed quickly by a hypocoagulable state, and ending with a return to a hypercoagulable state.19 Conceptualization of the early hypocoagulable state has changed markedly over the last 10 years. Trauma-induced coagulopathy was considered traditionally to be the consequence of clotting factor depletion (via both hemorrhage and consumption), dilution (secondary to massive resuscitation), and dysfunction (due to both acidosis and hypothermia). However, several recent reports have detailed that many trauma patients present with a coagulopathy prior to fluid resuscitation and in the absence of the aforementioned parameters.17,18,20 In a study by Brohi et al., clotting factor concentrations on emergency department entry were correlated with both hypoperfusion (measured by the base deficit) and coagulopathy (measured by both the PT and PTT) for 208 trauma activations from 2003 to 2004.17 Coagulopathy was observed only in the presence of hypoperfusion (base deficit >6) and was not related to clotting factor consumption as measured by prothrombin fragment concentrations. Similarly, in a review of trauma patients from our institution who required at least one transfusion, we noted that early (<1 hour postinjury) fibrinolysis occurred frequently among the most severely injured, and correlated significantly with markers of hypoperfusion, such as presenting systolic blood pressure, arterial pH, and base deficit.21
Such studies provide evidence of what we have termed an “acute endogenous coagulopathy of trauma,” which occurs early after injury, is independent of traditional mechanisms of coagulopathy, and is correlated closely with hypoperfusion. Such a mechanism may have evolved to protect hypoperfused vascular beds from thrombosis in the event of ischemia, but is clearly pathologic in the setting of diffuse tissue injury with resultant hemorrhagic shock. Trauma patients who present with this endogenous coagulopathy incur a 4-fold increase in mortality as compared to those patients who do not develop the coagulopathy.22 Furthermore, these patients are eight times more likely to die in the first 24 hours,20 and have an increased incidence of multiple organ failure (MOF), transfusion requirements, intensive care unit (ICU) length of stay, and mortality.22
Although the existence of an endogenous coagulopathy of trauma has been well documented, potential mechanistic links to this process remain elusive. Brohi et al. noted in their study that an increasing base deficit was significantly and directly correlated with thrombomodulin concentration (an auto-anticoagulant protein expressed by the endothelium in response to ischemia [Table 13-1]), and inversely correlated to protein C concentration.17 Moreover, a decreased concentration of protein C was correlated with a prolongation of the PTT, suggesting increased activation of protein C via thrombomodulin upregulation as a possible mechanism. Activated protein C (APC), in turn, both inhibits the coagulation cascade via inhibition of factors Va and VIIIa and promotes fibrinolysis via irreversible inhibition of plasminogen activator inhibitor (PAI). A decreased concentration of protein C also correlated with a decrease in the concentration of PAI, an increase in tissue plasminogen activator (tPA) concentration, and an increase in D-dimers. This final observation suggested that protein C–mediated hyperfibrinolysis via consumption of PAI may contribute to traumatic coagulopathy.
Further associations between the endogenous coagulopathy of trauma and the APC pathway have since been described in both animals23 and humans.22 Using a mouse model of hemorrhagic shock, Chesebro et al. documented an association between coagulopathy and an elevated APC concentration (as opposed to the surrogate protein C concentration utilized in the study of Brohi et al.17).23 Inhibition of APC with mAb1591 prevented coagulopathy associated with traumatic hemorrhage (as measured by the PTT). However, complete inhibition of APC caused universal death at 45 minutes due to thrombosis and perivascular hemorrhage, underscoring the delicate balance between hemorrhage and thrombosis.
Others have argued that the early coagulopathic changes following severe injury simply reflect the traditional concepts of DIC.24 Specifically, the hematologic consequences following injury may be considered to represent a generic coagulopathic response to any insult that induces widespread inflammation (e.g., trauma, infection, ischemia/reperfusion). The release of proinflammatory cytokines, in turn, has two main effects on the coagulation system: (1) release of tissue factor with subsequent clotting factor consumption and massive thrombin generation and (2) hyperfibrinolysis due to upregulation of tPA. In favor of this argument is the long-standing documentation of diffuse intravascular thrombi in multiple, uninjured organs of victims of hemorrhagic shock.25 Furthermore, the cytokine elaboration patterns of both trauma and septic patients are nearly identical, suggesting a potential common pathophysiologic mechanism.26 However, this argument is limited by the aforementioned finding that clotting factor levels are relatively preserved in trauma patients following shock.17 Furthermore, fibrinogen levels are inconsistently depressed in patients with acute traumatic coagulopathy. Moreover, the degree of fibrinolysis, when present, appears substantially higher in the endogenous coagulopathy of trauma as compared to DIC.21 Lastly, DIC occurs classically in the setting of an underlying hypercoagulable state (e.g., malignancy, septic shock) and is associated with an upregulation of PAI-1,27 as opposed to the early hypocoagulable state observed in the bleeding trauma patient, which reflects a predominance of both t-PA upregulation and PAI-1 inhibition.
Our current conceptualization of the acute endogenous coagulopathy of trauma emphasizes the integral role of fibrinolysis. Specifically, diffuse endothelial injury leads to both massive thrombin generation and systemic hypoperfusion. These changes, in turn, result in the widespread release of tPA, leading to fibrinolysis. Both injury and ischemia are well-known stimulants of tPA release,28 and we have observed a strong correlation between hypoperfusion, fibrinolysis, hemorrhage, and mortality among injured patients who require transfusion.21 The various proposed pathways involved in the endogenous coagulopathy of trauma are depicted in Fig. 13-1.
FIGURE 13-1 Proposed pathways for the acute endogenous coagulopathy of trauma. Note the prominent role of fibrinolysis via multiple mechanisms, the necessary thrombin substrate, and the positive feedback cycle that perpetuates the coagulopathy. tPA, tissue plasminogen activator; APC, activated protein C; PAI, plasminogen activator inhibitor.
Regardless of the inciting mechanism, elucidation of an endogenous coagulopathy of trauma has important therapeutic implications. Given that the driving force of early coagulopathy appears mediated initially by hypoperfusion as opposed to clotting factor consumption, replacement of clotting factors at this time would be ineffective. In fact, early clotting factor replacement in the face of ongoing hypoperfusion may serve to exacerbate coagulopathy via generation of additional thrombin substrate for thrombomodulin. For this reason, we have noted the endogenous coagulopathy of trauma to be “fresh frozen plasma (FFP) resistant.” By contrast, the development of a secondary coagulopathy due to the complications of massive resuscitation renders the patient clotting factor deficient and thus “FFP responsive.” Elucidation of the integral role of fibrinolysis also raises the possibility of mitigation of the coagulopathy via early administration of antifibrinolytic drugs (discussed below).
Refinement of the mechanisms underlying the endogenous coagulopathy of trauma represents one future goal within the area of postinjury coagulopathy research. Currently, neither a standardized definition nor diagnostic criteria for the endogenous coagulopathy of trauma exist. Furthermore, little is known about the initiators of both upregulation and eventual downregulation of thrombomodulin during traumatic shock. Overexpression of APC has been inferred in humans from a decreased concentration of protein C rather than direct measurement of the APC concentration. Finally, although a correlation between markers of shock and clotting factor expression profiles has been documented, causality remains to be proven. Despite these limitations, description of the endogenous coagulopathy of trauma represents a major turning point in our understanding of the hemostatic derangements following injury.
Although the endogenous coagulopathy of trauma results in an immediate hypocoagulable state among shocked patients following injury, several secondary conditions may develop, which exacerbate this preexisting coagulopathy. Such conditions are, in large part, due to the complications of massive fluid resuscitation, and include clotting factor dilution, clotting factor consumption, hypothermia, and acidosis. Although these factors were considered traditionally as the driving force of traumatic coagulopathy, recent evidence suggests that their effect may have been overestimated.
The positive interaction between hypothermia, acidosis, and coagulopathy has been termed both the “bloody vicious cycle” and the “lethal triad of death,” which we proposed at the 40th annual meeting of the American Association for the Surgery of Trauma in 1981.14 Each of these three factors exacerbates the others, eventuating in uncontrolled hemorrhage and exsanguination. Many causes of hypothermia exist for the trauma patient, including altered central thermoregulation, prolonged exposure to low ambient temperature, decreased heat production due to shock, and resuscitation with inadequately warmed fluids. The enzymatic reactions of the coagulation cascade are temperature dependent and function optimally at 37°C; a temperature <34°C is associated independently with coagulopathy following trauma.29 However, both experimental and clinical evidence suggest that the effect of hypothermia is modest at best, with each 1°C decrease corresponding to a decrease in clotting factor activity of approximately 10%.30 When coagulopathy was defined using a prolongation of clotting times (in seconds), it did not correlate with hypothermia among a large cohort of trauma patients.22 Thrombin generation was likewise unaffected by hypothermia. Coagulopathy may be relevant clinically in severe hypothermia (T <32°C),31 but this condition is present in less than 5% of trauma patients. Furthermore, it is unclear whether the increased mortality observed in severely hypothermic patients is causal or merely circumstantial. Hypothermia also affects both platelet function32 and fibrinolysis33; however, pronounced platelet dysfunction is observed only below 30°C, and clinical data correlating platelet dysfunction secondary to severe hypothermia with adverse outcomes are lacking. Thus, although severe hypothermia exacerbates coagulopathy, advances in resuscitation of the trauma patient have minimized the risk of this degree of hypothermia, thereby limiting its relevance; isolated hypothermia within the range commonly seen in trauma patients (33–36°C) likely has minimal clinical impact on hemostasis.
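The cited figure of roughly a 10% loss of clotting factor activity per degree Celsius can be turned into a back-of-the-envelope estimate. The linear decline below 37°C assumed here is purely illustrative, not a validated physiologic model.

```python
def estimated_factor_activity(temp_c, loss_per_degree=10.0):
    """Approximate clotting factor activity (% of normal at 37 C),
    assuming an illustrative linear ~10%-per-degree decline below 37 C."""
    activity = 100.0 - loss_per_degree * (37.0 - temp_c)
    return max(activity, 0.0)  # activity cannot fall below zero

# At 34 C (the threshold independently associated with coagulopathy),
# estimated activity falls only to ~70% of normal, consistent with the
# modest effect of mild-to-moderate hypothermia described above.
print(estimated_factor_activity(34))  # 70.0
```

Under this rough approximation, clinically meaningful factor depletion emerges only at the severe hypothermia seen in a small minority of trauma patients.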
Clotting factor activity is also pH dependent, with 90% inhibition occurring at pH 6.8.34 Coagulopathy secondary to acidosis is apparent clinically below a pH of 7.2. Because hypoperfusion results in anaerobic metabolism and acid production, it is difficult to discern the independent effect of acidosis on hemostatic integrity. A recent study adjusted the pH of blood samples from healthy volunteers using hydrochloric acid.35 Using thrombelastography (TEG), a significant correlation was observed between pH and clot-forming time over the pH range of 6.8–7.4. However, no difference was found in either clotting time (a measurement of the time to initiation of a clot) or maximum clot firmness over the range of pH, with the possible exception of a pH equal to 6.8. Similarly, Brohi et al. found that although thrombin generation increased with increasing injury severity, there was no relationship between degree of acidosis (as measured by the base deficit) and either thrombin generation or factor VII concentration.22 Finally, conflicting evidence exists regarding the ability of correction of acidosis via buffer to reverse coagulation disturbances.35,36 Although the independent effect of acidosis on hemostatic integrity remains unclear, correction of acidosis via resuscitation remains a valuable therapeutic end point in terms of minimizing the aforementioned hypoperfusion-induced endogenous coagulopathy of trauma. Furthermore, maintenance of the arterial pH during resuscitation of shock (with bicarbonate, if necessary) maximizes the efficacy of both endogenous and exogenous vasoactive drugs.
Finally, although both consumption and dilution of clotting factors have been implicated in postinjury coagulopathy, there is little experimental evidence to support this theory. The amount of thrombin generated is not related to coagulopathy in patients without shock,17 and there is no effect of dilution on coagulopathy either in vitro37 or in healthy volunteers.38
In summary, an endogenous coagulopathy occurs following trauma among patients sustaining shock, and does not appear to be secondary to coagulation factor consumption or dysfunction. Rather, current evidence suggests that it is due to both ischemia-induced anticoagulation and hyperfibrinolysis, and is resistant to clotting factor replacement. Although the hematologic changes observed following severe trauma demonstrate many characteristics of DIC with a fibrinolytic phenotype, clotting factor consumption does not appear integral. During this time frame, therapy should focus on definitive hemorrhage control, timely restoration of tissue perfusion, and point-of-care monitoring in an effort to identify fibrinolysis. Following restoration of tissue perfusion, an “FFP-sensitive” pathway may emerge, which is characterized by coagulopathy due to traditional factors, such as acidosis, hypothermia, consumption, and dilution. Recognition of the transition from the “FFP-resistant” to the “FFP-sensitive” pathway is a critical objective of current research. Fig. 13-2 depicts our “updated” bloody vicious cycle, which incorporates both the acute endogenous coagulopathy of trauma and the aforementioned secondary factors. Finally, a hypercoagulable state supervenes following restoration of tissue perfusion, usually within 72 hours of injury.
FIGURE 13-2 Updated bloody vicious cycle. It incorporates both the early acute endogenous coagulopathy of trauma, which is resistant to clotting factor replacement with fresh frozen plasma (FFP resistant), and a subsequent secondary coagulopathy that may be due to hypothermia, acidosis, clotting factor deficiency (FFP sensitive), or any combination thereof.
Hypotensive Resuscitation
Permissive hypotension involves deliberate tolerance of lower mean arterial pressures in the face of uncontrolled hemorrhagic shock in order to minimize further bleeding. This strategy is based on the notion that decreasing perfusion pressure will maximize success of the body’s natural mechanisms for hemostasis, such as arteriolar vasoconstriction, increased blood viscosity, and in situ thrombus formation. Animal models of uncontrolled hemorrhage have revealed that crystalloid resuscitation to either replace three times the lost blood volume39 or maintain 100% of pre-injury cardiac output40 exacerbates bleeding39,40 and increases mortality39 as compared to more limited fluid resuscitation.
Randomized clinical trials (RCTs) that compare fluid management strategies prior to control of hemorrhage among human subjects are limited. In the first large-scale trial, Bickell et al. randomized 598 patients in hemorrhagic shock (systolic blood pressure <90 mm Hg) who had sustained penetrating torso trauma to either crystalloid resuscitation or no resuscitation prior to operative intervention.41 Prespecified hemodynamic targets were not used. Mean systolic arterial blood pressure was significantly decreased on arrival to the emergency department for the delayed resuscitation group as compared to the immediate resuscitation group (72 mm Hg vs. 79 mm Hg, respectively, P = .02) with a corresponding increase in survival (70% vs. 62%, respectively, P = .04). A trend toward a decreased incidence of postoperative complications was also observed for the delayed resuscitation group. However, a subsequent subgroup analysis documented that these benefits occurred only among patients who had cardiac injury with tamponade.42
Two recent trials have failed to replicate these findings. Turner et al. randomized 1,306 trauma patients with highly diverse injury patterns and levels of stability to receive early versus delayed or no fluid resuscitation.43 Although no mortality difference was observed (10.4% for the immediate resuscitation group versus 9.8% for the delayed/no resuscitation group), protocol compliance was poor (31% for the early group and 80% for the delayed/no resuscitation group), limiting interpretability. Most recently, Dutton et al. randomized 110 trauma patients presenting in hemorrhagic shock (systolic blood pressure <90 mm Hg) to receive crystalloid resuscitation to a systolic blood pressure of >70 mm Hg versus >100 mm Hg.44 Randomization occurred following presentation to the emergency department. Not all patients required operation, and hemorrhage control was determined at the discretion of the trauma surgeon or anesthesiologist. Although there was a significant difference in mean blood pressure during bleeding between the conventional and low groups (114 mm Hg vs. 110 mm Hg, respectively, P < .01), the mean blood pressure was substantially higher than the intended target of 70 mm Hg for the low group, and the absolute difference between groups was likely clinically insignificant. Mortality was infrequent and did not vary by resuscitation arm (7.3% for each group).
Methodological variability between these trials has precluded a meaningful meta-analysis,45 and may help to explain the discrepant mortality findings. It is clear that the degree of hemorrhagic shock was most pronounced in the study of Bickell et al., as evidenced by the lowest presenting systolic blood pressure as well as the highest mortality. Furthermore, randomization was accomplished in the prehospital setting, and all patients required operative intervention. By contrast, mortality was infrequent in the study of Dutton et al., and the target systolic blood pressure of 70 mm Hg in the “low” group was, on average, not achieved. Thus, at present, it is possible to conclude that limited volume resuscitation prior to operative intervention may be of benefit among patients with penetrating trauma resulting in cardiac injury, although the optimum level of permissive hypotension remains unknown. The benefit of such therapy among a more diverse cohort of patients in hemorrhagic shock, with a low associated risk of death, is not clear. Finally, regardless of therapeutic benefit, reliable achievement of permissive hypotension appears challenging once hospital care has begun.
Preemptive Blood Components
The widespread replacement of whole blood by component therapy in the early 1980s allowed for improved specificity of therapy, increased storage time of individual components, and decreased transmission of infectious disease. However, the relative amounts (if any) of components indicated for resuscitation of the exsanguinating trauma patient were not addressed, and remain debated approximately 30 years later. Traditional doctrine, as espoused by Advanced Trauma Life Support training, calls for 2 L of crystalloid followed by RBCs in the case of persistent hemodynamic instability; clotting factor and platelet replacement are indicated only in the presence of laboratory derangements (PT and platelet count, respectively).46
Although this approach is reasonable for patients who have sustained relatively minor hemorrhage (<20% of circulating blood volume), replacement of lost blood with isolated erythrocytes becomes problematic in the face of ongoing hemorrhagic shock requiring a large volume of blood transfusion. In this case, replacement of shed blood with isolated RBCs will result in a dilutional coagulopathy. Several authors have attempted to quantify the amount of RBCs transfused for which dilutional coagulopathy mandates concomitant component replacement therapy, with definitions ranging from loss of one blood volume to the need for greater than 10 U of RBCs in the first 24 hours following injury. The latter criterion is the most commonly accepted definition of massive transfusion (MT), and is the time period on which most studies of empiric component replacement therapy are based. However, because over 80% of blood component therapy transfused to patients who require MT is administered within the first 6 hours of injury,47 we believe the first 6 hours postinjury to be a more appropriate period both for analysis and for the focus of preemptive blood component therapy.
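The magnitude of dilutional coagulopathy with factor-free replacement can be appreciated from the classic one-compartment exchange model, in which clotting factor concentration falls exponentially as shed blood is replaced by fluid containing no factors. The sketch below is illustrative only; the function name, the assumption of instantaneous mixing, and the approximation of 10 U RBCs to one adult blood volume are ours, not drawn from the cited studies:

```python
import math

def factor_fraction_remaining(volumes_exchanged: float) -> float:
    """Fraction of baseline clotting-factor concentration remaining after
    isovolemic exchange of 'volumes_exchanged' blood volumes with
    factor-free fluid (RBCs plus crystalloid), assuming a single
    well-mixed compartment: C/C0 = exp(-V_exchanged / V_blood)."""
    return math.exp(-volumes_exchanged)

# Roughly 10 U of RBCs approximates one blood volume in a 70-kg adult,
# the traditional massive transfusion threshold. Under this model,
# ~37% of baseline factor activity remains after one exchanged volume
# and ~14% after two.
for n in (0.5, 1.0, 2.0):
    print(f"{n:.1f} blood volumes exchanged -> "
          f"{factor_fraction_remaining(n):.0%} of baseline factors")
```

The exponential form explains why coagulopathy accelerates late in an RBC-only resuscitation: each additional unit dilutes an already depleted factor pool.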
The debate regarding preemptive blood components began with platelets during the time of whole blood resuscitation.48 Recognition of the dangers of isolated RBC therapy during MT followed shortly after the widespread institution of component therapy in the early 1980s. Our group and others noted that mortality among massively transfused patients was reduced when increased amounts of both plasma and platelets were administered empirically. Specifically, when introducing the concept of RBC:FFP ratios, we reported increased mortality among a cohort of patients with major vascular trauma associated with RBC:FFP ratios greater than 5:1, with overt coagulopathy observed nearly universally with ratios exceeding 8:1.14 In 2007, Borgman et al. published a series of 254 massively transfused US soldiers in Iraq and Afghanistan, reporting markedly improved survival among those transfused with an RBC:FFP ratio in the range of 1.5:1, as compared to higher ratios.49
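The ratio thresholds described above can be made concrete with a simple calculation. The following sketch applies the cutoffs reported for the vascular-trauma cohort cited in the text (mortality rising above roughly 5:1, overt coagulopathy nearly universal above 8:1); the function names and the discrete risk labels are illustrative, not a validated clinical rule:

```python
def rbc_ffp_ratio(rbc_units: int, ffp_units: int) -> float:
    """RBC:FFP ratio expressed as units of RBC per unit of FFP."""
    if ffp_units == 0:
        return float("inf")  # no plasma given at all
    return rbc_units / ffp_units

def coagulopathy_risk(rbc_units: int, ffp_units: int) -> str:
    """Classify a transfusion episode against the historical thresholds
    described in the text (>5:1 increased mortality; >8:1 near-universal
    overt coagulopathy). Illustrative only, not a treatment guideline."""
    ratio = rbc_ffp_ratio(rbc_units, ffp_units)
    if ratio > 8:
        return "overt coagulopathy nearly universal"
    if ratio > 5:
        return "increased mortality observed"
    return "within historically favorable range"

print(coagulopathy_risk(12, 2))   # 6:1 -> prints: increased mortality observed
print(coagulopathy_risk(10, 10))  # 1:1 -> prints: within historically favorable range
```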