
Chapter 5 Shock, Electrolytes, and Fluid




Surgeons are the masters of fluids because they need to be. They care for patients who cannot eat or drink for various reasons—for example, they have hemorrhaged, undergone surgery, or lost fluids from tubes, drains, or wounds. Surgeons are obligated to know how to care for these patients, as these patients have put their lives in our hands. This topic might appear simple only to those who do not understand the complexities of the human body and its ability to regulate and compensate for fluid shifts. In reality, the task of managing patients’ blood volume is one of the most challenging burdens surgeons face, often requiring complete control of the intake and output of fluids and electrolytes, frequently in the presence of blood loss. Surgeons do not yet completely understand the physiology of shock and resuscitation, and our knowledge is superficial. Given the nature of our profession, we have studied fluids and electrolytes as we dealt with patients who have bled and even exsanguinated. Historically, wartime experience has always helped us move ahead in our knowledge of the management of fluids and resuscitation; the current wars in Iraq and Afghanistan are no exceptions, as we have learned much from them as well.


Constant attention to, and titration of, fluid therapy is required, because the human body is dynamic. The key to treatment is to recognize the patient’s initial condition and to understand that the fluid status is constantly changing. Bleeding, sepsis, neuroendocrine disturbances, and dysfunctional regulatory systems can all affect patients who are undergoing the dynamic changes of illness and healing. The correct management of blood volume is highly time-dependent. If blood volume is managed well, surgeons are afforded the chance to deal with other aspects of surgery, such as nutrition, administration of antibiotics, drainage of abscesses, relief of obstruction and incarceration, treatment of ischemia, and resection of tumors. Knowing the difference among dehydration, anemia, hemorrhage, and overresuscitation is vital.


The human body is predominantly water, which resides in the intravascular, intracellular, and interstitial (or third) spaces. Water moves among these spaces, and its movement depends on many variables. Because the intravascular space is the only compartment surgeons can control directly, this chapter concentrates on its correct management, as it is the only means of influencing the other two fluid compartments.


This chapter will also examine historical aspects of shock, fluids, and electrolytes—not just to note interesting facts or pay tribute to deserving physicians, but also to try to understand how this knowledge was gained. Doing so is vital to understanding past changes in management and to accepting future changes. Surgeons are often awed by the discoveries made, yet also astounded by how often they were wrong, and why. Future surgeons will look back at the current body of knowledge and be amazed at how little was known. Recent changes in the management of shock, fluids, and electrolytes have been major ones. Knowledge of the history helps explain why these changes were required. As a consequence of not studying the past, we have often repeated history in many ways.


After the historical highlights, this chapter will discuss fluids that are now used, along with fluids under development. Finally, caring for perioperative patients will be explored from a daily needs perspective.



History


History may be disliked by those who are in a hurry to learn only the basics. Learning from the past, however, is essential to knowing which treatments have and have not worked. Dogma must always be challenged and questioned. Were the treatments based on science? To understand what to do, surgeons must know how the practice evolved to the current management methods. Studying the history of shock is important for at least three reasons:






Resuscitation


One of the earliest authenticated resuscitations in the medical literature is the miraculous deliverance of Anne Green, who was executed by hanging on December 14, 1650. Green was executed in the customary way by being forced off a ladder to hang by the neck. She hung for 30 minutes, during which time some of her friends pulled “with all their weight upon her legs, sometimes lifting her up, and then pulling her down again with a sudden jerk, thereby the sooner to dispatch her out of her pain”1 (Fig. 5-1). When everyone thought she was dead, the body was taken down, put in a coffin, and carried to the private house of Dr. William Petty—who, by the king’s orders, was allowed to perform autopsies on the bodies of everyone who had been executed.



When the coffin was opened, Green was observed to take a breath and a rattle was heard in her throat. Petty and his colleague, Thomas Willis, abandoned all thoughts of a dissection and proceeded to revive their patient. They held her up in the coffin and then, by wrenching her teeth apart, poured hot cordial into her mouth, which caused her to cough. They rubbed and chafed her fingers, hands, arms, and feet; after 15 minutes of these efforts, they put more cordial into her mouth. Then, after tickling her throat with a feather, she opened her eyes momentarily.


At that stage, they opened a vein and bled 5 ounces of blood. They continued administering the cordial and rubbing her arms and legs. Next, they applied compression bandages to her arms and legs. Heating plasters were put to her chest, and another plaster was inserted as an enema “to give heat and warmth to her bowels.” They then put her in a warm bed, with another woman to lie with her to keep her warm. After 12 hours, Green began to speak; 24 hours after her revival, she was answering questions freely. After 2 days, her memory was normal, apart from her recollection of her execution and the resuscitation.



Shock


Hemorrhagic shock has been extensively studied and written about for many years. Injuries, whether intentional or not, have occurred so frequently that much of the understanding of shock has been learned by surgeons taking care of the injured.


What is shock? The current widely accepted definition is inadequate perfusion of tissue. However, many subtleties lie behind this statement. Cells require nutrients, but exactly which nutrients are needed is not well defined at this point. The most critical nutrient is oxygen, but concentrating on oxygenation alone probably represents elemental thinking. Blood is highly complex and carries countless nutrients, buffers, cells, antibodies, hormones, chemicals, electrolytes, and antitoxins. Even if we think in an elemental fashion and try to optimize the perfusion of tissue, the delivery side of the equation is affected by blood volume, anemia, and cardiac output. Moreover, the use of nutrients is affected by infection and drugs. Vascular tone plays a role as well; for example, in neurogenic shock, the sympathetic tone is lost and, in sepsis, systemic vascular resistance decreases, because of a broken homeostatic process or possibly because of evolutionary factors.


Many advances in medicine have been achieved by battlefield observations. Unfortunately, in military and civilian trauma, hemorrhagic shock is the leading cause of preventable death. Repeatedly, wounded patients have survived their initial injuries, with adequate control of the hemorrhage, only to undergo malaise and deterioration, resulting in death. Such cases led to many explanations; most observers theorized a circulating toxic agent, thought to be secondary to the initial insult. The first record available that shows an understanding of the need for fluid in injured patients was apparently from Ambroise Paré (1510-1590), who urged the use of clysters (enemas to administer fluid into the rectum) to prevent “noxious vapors from mounting to the brain.” Yet, he also wrote that phlebotomy is “required in great wounds when there is fear of deflexion, pain, delirium, raving, and unquietness”; he and others practiced bloodletting during that era, because shock accompanying injury was thought to be from toxins.


The term shock appears to have been first used in 1743 in a translation of the French treatise of Henri Francois Le Dran regarding battlefield wounds. He used the term to designate the act of impact or collision, rather than the resulting functional and physiologic damage. However, the term can be found in the book Gunshot Wounds of the Extremities, published in 1815 by Guthrie, who used it to describe the physiologic instability.


Humoral theories persisted until the late 19th century but, in 1830, Herman provided one of the first clear descriptions of intravenous (IV) fluid therapy. In response to a cholera epidemic, he attempted to rehydrate patients by injecting 6 ounces of water into the vein. In 1831, O’Shaughnessy also treated cholera patients by administering large volumes of salt solutions intravenously and published his results in Lancet.2 Those were the first documented attempts to replace and maintain the extracellular internal environment or the intravascular volume. Note, however, that the treatment of cholera and dehydration is not the ideal treatment of hemorrhagic shock.


In 1872, Gross defined shock as “a manifestation of the rude unhinging of the machinery of life.” His definition, given its accuracy and descriptiveness, has been repeatedly quoted in the literature. Theories on the cause of shock persisted through the late 19th century; although it was unexplainable, it was often observed. George Washington Crile investigated it and concluded, at the beginning of his career, that the lowering of the central venous pressure in the shock state in animal experiments was caused by a failure of the autonomic nervous system.3 Surgeons witnessed a marked change in ideas about shock between 1888 and 1918. In the late 1880s, there were no all-encompassing theories, but most surgeons accepted the generalization that shock resulted from a malfunctioning of some part of the nervous system. Such a malfunctioning has now been shown not to be the main reason—but surgeons are still perplexed by the mechanisms of hemorrhagic shock, especially regarding the complete breakdown of the circulatory system that occurs in the later stages of shock.


In 1899, using contemporary advances with sphygmomanometers, Crile proposed that a profound decline in blood pressure (BP) could account for all symptoms of shock. He also helped alter how physicians diagnosed shock and followed its course. Before Crile, most surgeons relied on respiration, pulse, or declining mental status when evaluating the condition of patients. After Crile’s first books were published, many surgeons began measuring BP. In addition to changing how surgeons thought about shock, Crile was part of the therapeutic revolution. His theories remained generally accepted for almost 2 decades, predominantly in surgical circles. Crile’s work persuaded Harvey Cushing to measure BP during all operations, which in part led to the general acceptance of BP measurement in clinical medicine. Crile also concluded that shock was not a process of dying, but rather a marshaling of the body’s defenses in patients struggling to live. He later deduced that the reduced volume of circulating blood, rather than the diminished BP, was the most critical factor in shock.


Crile was instrumental in forming numerous theories of shock but was also known for the “anoci-association” theory of shock, which accounted for pain and its physiologic response during surgery. He realized that the constant administration of nitrous oxide during surgery was required, which necessitated having an additional professional at the operating table—the skilled nurse anesthetist. In 1908, he trained Agatha Hodgins, one of his nurses at Western Reserve, who later founded the American Association of Nurse Anesthetists.


Crile’s theories evolved as he continued his experimentations; in 1913, he proposed the kinetic system theory. He was interested in thyroid hormone and its response to wounds, but realized that adrenalin was a key component of the response to shock. He relied on experiments by Walter B. Cannon, who found that adrenalin was released in response to pain or emotion, shifting blood from the intestines to the brain and extremities. Adrenalin release also stimulated the liver to convert glycogen to sugar for release into the circulation. Cannon argued that all the actions of adrenalin aided the animal in its effort to defend itself.4


Crile incorporated Cannon’s study into his theory. He proposed that impulses from the brain after injury stimulated glands to secrete their hormones, which in turn resulted in sweeping changes throughout the body. Crile’s kinetic system included a complex interrelationship among the brain, heart, lungs, blood vessels, muscles, thyroid gland, and liver. He also noted that if the body underwent too much stress, the adrenal glands would run out of adrenalin, the liver of glycogen, the thyroid of its hormone, and the brain itself of energy, accounting for autonomic changes. Once the kinetic system ran out of energy, BP would fall, and the animal would go into shock.


At the end of the 19th century, surgeons for the most part used a wide variety of tonics, stimulants, and drugs. Through careful testing, Crile demonstrated that most of those agents were ineffective, stressing that only saline solutions, adrenalin, blood transfusions, and safer forms of anesthesia were beneficial for treating shock. In addition, he vigorously campaigned against the customary approach of polypharmacy, instead promoting only drugs of proven value. He stated that stimulants, long a mainstay of treatment in shock, did not raise BP and should be discarded: “a surgeon should not stimulate an exhausted vasomotor center with strychnine. That would be as futile as flogging a dead horse.”


Henderson recognized the importance of decreased venous return and its effect on cardiac output and arterial pressure. His work was aided by advances in techniques that allowed careful recording of the volume curves of the ventricles. Fat embolism also led to a shocklike state, but its possible contribution was questioned because study results were difficult to reproduce. The vasomotor center and its contributions in shock were heavily studied in the early 1900s. In 1914, Mann noted that unilaterally innervated vessels of the tongues of dogs, ears of rabbits, and paws of kittens appeared constricted during shock, as compared with contralaterally denervated vessels.


Battlefield experiences continued to intensify research on shock. During the World War I era, Cannon used clinical data from the war and data from animal experiments to examine the shock state carefully. He theorized that toxins and acidosis contributed to the previously described lowering of vascular tone. He and others then focused on acidosis and the role of alkali in preventing and prolonging shock. The adrenal gland and effect of cortical extracts on adrenalectomized animals were studied with fascination during this period.


Then, in the 1930s, a unique set of experiments by Blalock5 determined that almost all acute injuries were associated with changes in fluid and electrolyte metabolism. Such changes were primarily the result of reductions in the effective circulating blood volume. Blalock showed that those reductions after injury could be the result of several mechanisms (Box 5-1). He clearly showed that fluid loss in injured tissues involved the loss of extracellular fluid (ECF) that was unavailable to the intravascular space for maintaining circulation. The original concept of a “third space,” in which fluid is sequestered and therefore unavailable to the intravascular space, evolved from Blalock’s studies.



Carl John Wiggers first described the concept of irreversible shock.6 His 1950 textbook, Physiology of Shock, represented the attitudes toward shock at that time. In an exceptionally brilliant summation, Wiggers assembled the various signs and symptoms of shock from various authors in that textbook (Fig. 5-2), along with his own findings. His experiments used what is now known as the Wiggers prep. In his usual experiments, he used previously splenectomized dogs and cannulated their arterial systems. He took advantage of an evolving technology that allowed him to measure the pressure in the arterial system, and he studied the effects of lowering BP through blood withdrawal. After removing the dogs’ blood to an arbitrary set point (typically, 40 mm Hg), he noted that their BP soon spontaneously rose as fluid was spontaneously recruited into the intravascular space.



To keep the dogs’ BP at 40 mm Hg, Wiggers had to continually withdraw additional blood during this compensated stage of shock. During compensated shock, the dogs could use their reserves to survive. Water was recruited from the intracellular compartment as well as the extracellular space. The body tried to maintain the vascular flow necessary to survive. However, after a certain period, he found that to keep the dogs’ BP at the arbitrary set point of 40 mm Hg, he had to reinfuse shed blood; he termed this phase uncompensated or irreversible shock. Eventually, after a period of irreversible shock, the dogs died.


If the dogs had not yet gone into the uncompensated phase, any type of fluid used for resuscitation would have made survival likely. In fact, most dogs at that stage, even without resuscitation, would self-resuscitate by going to a water source. Once they entered the uncompensated phase of shock, however, their reserves were exhausted; even when the shed blood was given back, survival rates were better only if additional fluid of some sort was also administered. Uncompensated shock is surely what Gross meant by “unhinging of the machinery of life.” Currently, hemorrhagic shock models are classified as involving controlled or uncontrolled hemorrhage. The Wiggers prep is controlled hemorrhage and is referred to as pressure-controlled hemorrhage.


Another animal model that uses controlled hemorrhage is the volume-controlled model. Arguments against this model include the inconsistency of the blood volume from one animal to another and the variability in response. Calculating blood volume is usually based on a percentage of body weight (typically, 7% of body weight), but such percentages are not exact and result in variability from one animal to another. However, proponents of the volume model and critics of the pressure model argue that a certain pressure during hypotension elicits a different response from one animal to another. Even in the pressure-controlled hemorrhage model, animals vary highly in regard to when they go from compensated to uncompensated shock. The pressure typically used in the pressure-controlled model is 40 mm Hg; the volume used in the volume-controlled model is 40%. The variance in the volume-controlled model can be minimized by specifying a narrow weight range for the animals (e.g., rats within 10 g, large animals within 5 pounds). It is also important to have the same experimenters doing the exact same procedure at the same time of the day in animals that were prepared and hydrated in exactly the same way.
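
As a rough illustration of the arithmetic behind these models, the following short Python sketch (illustrative only; it assumes the 7% of body weight estimate cited above and treats 1 kg of blood as roughly 1000 mL) computes an estimated blood volume and the volume withdrawn in a 40% volume-controlled hemorrhage:

def estimated_blood_volume_ml(body_weight_kg, fraction=0.07):
    # Blood volume approximated as a fraction of body weight (the 7% figure cited above),
    # treating 1 kg of blood as roughly 1000 mL.
    return body_weight_kg * fraction * 1000

def volume_controlled_hemorrhage_ml(body_weight_kg, shed_fraction=0.40):
    # Target withdrawal for a 40% volume-controlled hemorrhage model.
    return estimated_blood_volume_ml(body_weight_kg) * shed_fraction

# Hypothetical example weights: a 0.35-kg rat and a 25-kg swine
for weight_kg in (0.35, 25.0):
    print(weight_kg, "kg:",
          round(estimated_blood_volume_ml(weight_kg)), "mL estimated blood volume,",
          round(volume_controlled_hemorrhage_ml(weight_kg)), "mL withdrawn at 40%")

The narrow weight ranges mentioned above (e.g., rats within 10 g of one another) keep this estimate, and therefore the shed volume, comparable from animal to animal.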


The ideal model is uncontrolled hemorrhage, but its main problem is that the volume of hemorrhage is uncontrolled by the nature of the experiment. Variability is the highest in this model, even though it is the most realistic. Computer-assisted pressure models can be used that mimic the pressures during uncontrolled shock to reduce the artificiality of the pressure-controlled model.



Fluids


How did the commonly used IV fluids, such as normal saline, enter medical practice? It is often taken for granted, given the vast body of knowledge in medicine, that they were adopted through a rigorous scientific process, but that was not actually the case.


Normal saline has been used for many years and is extremely beneficial, but we now know that it also can be harmful. Hartog Jakob Hamburger, in his in vitro studies of red cell lysis in 1882, incorrectly suggested that 0.9% saline was the concentration of salt in human blood. This fluid is often referred to as physiologic or normal saline, but it is neither physiologic nor normal. Supposedly, 0.9% normal saline originated during the cholera pandemic that afflicted Europe in 1831, but an examination of the composition of the fluids used by physicians of that era found no resemblance to normal saline. The origin of the concept of normal saline remains unclear.7


In 1831, O’Shaughnessy described his experience in the treatment of cholera8:



O’Shaughnessy wrote those words at the age of 22, having just graduated from Edinburgh Medical School. He tested his new method of infusing intravenous saline on a dog and observed no ill effects. Eventually, he reported that the aim of his method was to restore blood to its natural specific gravity and to restore its deficient saline matters. His experience with human cholera patients taught him that the practice of bloodletting, then highly common, was good for “diminishing the venous congestion” and that nitrous oxide (laughing gas) was not useful for oxygenation.


In 1832, Robert Lewins reported that he witnessed Thomas Latta injecting extraordinary quantities of saline into veins, with the immediate effects of “restoring the natural current in the veins and arteries, of improving the color of the blood, and [of] recovering the functions of the lungs.” Lewins described Latta’s saline solution as consisting of “two drachms of muriate, and two scruples of carbonate, of soda, to sixty ounces of water.” Later, however, Latta’s solution was found to equate to 134 mmol/liter of Na+, 118 mmol/liter of Cl−, and 16 mmol/liter of HCO3−.


Over the next 50 years, many reports cited various recipes to treat cholera, but none resembled 0.9% saline. In 1883, Sydney Ringer reported on the influence exerted by the constituents of the blood on the contractions of the ventricle (Fig. 5-3). Studying hearts cut out of frogs, he used 0.75% saline and a blood mixture made from dried bullocks’ blood.9 In his attempts to identify which aspect of blood caused better results, he found that a “small quantity of white of egg completely obviates the changes occurring with saline solution.” He concluded that the benefit of white of egg was because of the albumin or potassium chloride. To show what worked and what did not, he described endless experiments, with alterations of multiple variables.



However, Ringer later published another article stating that his previously reported findings could not be repeated; through careful study, he realized that the water used in his first article was actually not distilled water, as reported, but rather tap water from the New River Water Company. It turned out that his laboratory technician, who was paid to distill the water, took shortcuts and used tap water instead. Ringer analyzed the water and found that it contained many trace minerals (Fig. 5-4). Through careful and diligent experimentation, he found that calcium bicarbonate or calcium chloride—in doses even smaller than those in blood—restored good contractions of the frog ventricles. The third component that he found essential to good contractions was sodium bicarbonate. He knew the importance of the trace elements. He also stated that fish could live for weeks unfed in tap water, but would die in distilled water in a few hours; minnows, for example, died in an average of 4.5 hours. Thus, the three ingredients that he found essential were potassium, calcium, and bicarbonate. Ringer’s solution soon became ubiquitous in physiologic laboratory experiments.



In the early 20th century, fluid therapy by injection under the skin (hypodermoclysis) and infusion into the rectum (proctoclysis) became routine. Hartwell and Hoguet reported its use in intestinal obstruction in dogs, laying the foundation for saline therapy in human patients with intestinal obstruction.


As IV crystalloid solutions were developed, Ringer’s solution was modified, most notably by pediatrician Alexis Hartmann. In 1932, attempting to develop an alkalinizing solution to administer to his acidotic patients, Hartmann modified Ringer’s solution by adding sodium lactate. The result was lactated Ringer’s (LR), or Hartmann’s solution. He used sodium lactate (instead of sodium bicarbonate)—the conversion of lactate into sodium bicarbonate was slow enough to lessen the danger posed by sodium bicarbonate, which could rapidly shift patients from compensated acidosis to uncompensated alkalosis.


In 1924, Rudolph Matas, regarded as the originator of modern fluid treatment, introduced the concept of the continued IV drip but also warned of the potential dangers of saline infusions. Normal saline has continued to gain popularity, but the problems with its metabolic derangements have been repeatedly shown and seem to have fallen on deaf ears. In healthy volunteers, normal saline has been shown to cause abdominal discomfort and pain, nausea, drowsiness, and decreased mental capacity to perform complex tasks.


The point is that normal saline and LR solutions were formulated for conditions other than the replacement of blood, and the reasons for their formulation are archaic. Such solutions have been useful for dehydration; when used in relatively small volumes (1 to 3 liters/day), they are well tolerated and relatively harmless, they provide water, and the human body can tolerate the amounts of electrolytes they contain. Over the years, LR has attained widespread use for the treatment of hemorrhagic shock. However, normal saline and LR pass freely through the vascular membrane and are poorly retained in the vascular space. After a few hours, only about 175 to 200 mL of a 1-liter infusion remains in the intravascular space. In countries other than the United States, LR is often referred to as Hartmann’s solution, and normal saline is referred to as physiologic (sometimes even spelled “fisiologic”) solution. With the advances in science in the last 50 years, it is hard to understand why more advances in resuscitation fluids have not been made.



Blood Transfusions


Concerned about the blood that injured patients lost, Crile began to experiment with blood transfusions. As he stated, “After many accidents, profuse hemorrhage often led to shock before the patient reached the hospital. Saline solutions, adrenalin, and precise surgical technique could substitute only up to a point for the lost blood.” At the turn of the 20th century, transfusions were seldom used. Their use waxed and waned in popularity because of transfusion reactions and difficulties in preventing clotting in donated blood. Through his experiments in dogs, Crile showed that blood was interchangeable: he transfused blood without blood group matching. Alexis Carrel was able to sew blood vessels together with his triangulation technique, using it to connect blood vessels from one person to another for the purpose of transfusions. However, Crile found Carrel’s technique too slow and cumbersome in humans, so he developed a short cannula to facilitate transfusions.


By World War II, shock was recognized as the single most common cause of treatable morbidity and mortality. At the time of the Japanese attack on Pearl Harbor on December 7, 1941, no blood banks or effectual blood transfusion facilities were available. Most military locations had no stocks of dried pooled plasma. Although the wounded of that era were evacuated quickly to a hospital, the mortality rate was still high. IV fluids of any type were essentially unavailable, except for a few liters of saline manufactured by means of a still in the operating room. IV fluid was usually administered using an old Salvesen flask and reused rubber tubing. Often, a severe febrile reaction resulted from the use of that tubing.


The first written documentation of resuscitation in World War II patients was 1 year after Pearl Harbor, in December 1942, in notes from the 77th Evacuation Hospital in North Africa. Churchill stated that “The wounded in action had for the most part either succumbed or recovered from any existing shock before we saw them. However, later cases came to us in shock, and some of the early cases were found to be in need of whole blood transfusion. There was plenty of reconstituted blood plasma available. However, some cases were in dire need of whole blood. We had no transfusion sets, although such are available in the United States: no sodium citrate; no sterile distilled water; and no blood donors.”


The initial decision to rely on plasma rather than blood appears to have been based in part on the view held by the Office of the Surgeon General of the Army, and in part on the opinion of the civilian investigators of the National Research Council. Those civilian investigators thought that in shock, the blood was thick and the hematocrit level high. On April 8, 1943, the Surgeon General stated that no blood would be sent to the combat zone. Seven months later, he again refused to send blood overseas because of the following: (1) his observations of overseas theaters had convinced him that plasma was adequate for the resuscitation of wounded men; (2) from a logistics standpoint, it was impractical to make locally collected blood more available than that from general hospitals in the combat zone; and (3) shipping space was too limited. Vasoconstricting drugs such as adrenalin were condemned because they were thought to decrease blood flow and tissue perfusion as they dammed the blood in the arterial portion of the circulatory system.


During World War II, out of necessity, efforts to make blood transfusions available heightened and led to the institution of blood banking for transfusions. Better understanding of hypovolemia and inadequate circulation favored the use of plasma as a resuscitative solution, in addition to whole blood replacement. Thus, the treatment of traumatic shock greatly improved. The administration of whole blood was thought to be extremely effective, so it was widely used. Whole blood mixed with sodium citrate in a 6 : 1 ratio, which bound the calcium in the blood and prevented clotting, worked well.


However, no matter which solution was used—blood, colloids, or crystalloids—the blood volume seemed to increase by only a fraction of what was lost. In the Korean War era, it was recognized that more blood had to be infused to adequately regain the blood volume that had been lost. The reason for the need for more blood was unclear, but was thought to be because of hemolysis, pooling of blood in certain capillary beds, and loss of fluid into tissues. Considerable attention was given to elevating the feet of patients in shock.



Physiology of Shock



Bleeding


Research and experience have both taught us much about the physiologic responses to bleeding. The Advanced Trauma Life Support (ATLS) course defines four classes of shock (Table 5-1). In general, that categorization has helped point out the physiologic responses to hemorrhagic shock, emphasizing the identification of blood loss and guiding treatment. Shock can be thought of anatomically at three levels (Fig. 5-5). It can be cardiogenic, with extrinsic abnormalities (e.g., tamponade) or intrinsic abnormalities (e.g., pump failure caused by infarct, overall cardiac failure, or contusion). Large vessels can cause shock if they are injured and bleeding results. If the anatomic problem is at the small vessel level, neurogenic dysfunction or sepsis can be the culprit.




The four classes of shock as taught in the ATLS course are problematic because they were not rigorously tested and proven. The developers of the ATLS course have agreed that these classes were fairly arbitrary and not necessarily based on rigorous scientific data. Patients in shock do not always follow the physiology as taught in the ATLS course, and a high degree of variance exists among patients, particularly in children and older patients. Children, in general, seem to be able to compensate, even after large volumes of blood loss, because of the higher water composition of their bodies. However, when they decompensate, the process can be rapid. Older patients do not compensate well; when they start to collapse physiologically, the process can be devastating because their ability to recruit fluid is not as good and their cardiac reserves are less.


The problem with the signs and symptoms classically shown in the ATLS classes is that in reality, the manifestations of shock can be confusing and difficult to assess. For example, consider whether an individual patient’s change in mental status is caused by factors such as blood loss, traumatic brain injury (TBI), pain, or illicit drugs. The same dilemma applies for respiratory rate and skin changes. Are alterations in a patient’s respiratory rate or skin caused by factors such as pneumothorax, rib fractures, or inhalation injury?


To date, despite the many potential methods of monitoring shock, none has been found clinically reliable for replacing BP. Clinicians all know that there is a wide range of normal BPs. The question often is this: What is the baseline BP of the patient being treated? When a seemingly normal BP is treated, is that hypotension or hypertension compared with the patient’s normal BP? How do we know how much blood has been lost? Even if blood volume is measured directly (rapid methods are now available), what was the patient’s baseline blood volume? To what blood volume should the patient be resuscitated? The end point of resuscitation has been elusive. The variance in all the variables makes assessment and treatment a challenge.


One important factor to recognize is that clinical symptoms are relatively few in patients who are in class I shock. The only change in class I shock is anxiety, which is practically impossible to assess—is it the result of factors such as blood loss, pain, trauma, or drugs? A heart rate higher than 100 beats/min has been used as a physical sign of bleeding, but evidence of its significance is minimal. Brasel and colleagues10 have shown that heart rate is neither sensitive nor specific in determining the need for emergent intervention, need for packed red blood cell (PRBC) transfusions in the first 2 hours after an injury, or severity of an injury. Heart rate was not altered by the presence of hypotension (systolic BP <90 mm Hg).


In patients who are in class II shock, we are taught that their heart rate is increased but, again, this is a highly unreliable marker; pain and mere nervousness can also increase heart rate. The change in pulse pressure—the difference between systolic and diastolic pressure—is also difficult to identify, because the baseline BP of patients is not always known. The change in pulse pressure is thought to be caused by an adrenalin response, which constricts vessels and results in higher diastolic pressures. It is important to recognize that the body compensates well.


Not until patients are in class III shock does BP supposedly decrease. At this stage, patients have lost 30% to 40% of their blood volume; for an average man weighing 75 kg (165 lbs), that can mean up to 2 liters of blood loss (Fig. 5-6). It is helpful to remember that a can of soda or beer is 355 mL; a six pack is 2130 mL. Theoretically, if a patient is hypotensive from blood loss, we are looking for a six pack of blood. Small amounts of blood loss should not result in hypotension. Although intracranial bleeding can cause hypotension in the last stages of herniation, it is almost impossible for hypotension to be the result of large volumes of blood loss intracranially, because there is not enough room for that volume of blood. It is critical to recognize uncontrolled bleeding, and even more critical to stop bleeding before patients go into class III shock. It is more important to recognize blood loss than it is to replace blood loss. A common mistake is to think that trauma patients are often hypotensive; hypotension is rare in trauma patients (occurring less than 6% of the time).
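
As a rough check on that figure (a sketch only, assuming blood volume is approximately 7% of body weight, as in the animal models discussed earlier):

# Estimated class III blood loss for a 75-kg man (assumption: blood volume ~7% of body weight)
blood_volume_liters = 75 * 0.07                                    # ~5.25 liters
class_iii_loss_liters = (0.30 * blood_volume_liters, 0.40 * blood_volume_liters)
print(class_iii_loss_liters)                                       # ~(1.6, 2.1) liters, roughly the "six pack" described above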



In addition, the ATLS course, which was designed for physicians who are not surgeons, does not recognize many subtle but important aspects of bleeding. The concepts of the course are relatively basic. However, surgeons know that there are nuances of the varied responses to injuries in animals and humans. In the case of arterial hemorrhage, for example, we know that animals do not necessarily manifest tachycardia as their first response when bleeding, but actually become bradycardic. It is speculated that this is a teleologically developed mechanism because a bradycardic response reduces cardiac output and minimizes free uncontrolled exsanguination; however, a bradycardic response to bleeding is not consistently shown in all animals, including humans. Some evidence has shown that this response, termed relative bradycardia, does occur in humans. Relative bradycardia is defined as a heart rate lower than 100 beats/min when the systolic BP is less than 90 mm Hg. When bleeding patients have relative bradycardia, their mortality rate is lower. Interestingly, up to 44% of hypotensive patients have relative bradycardia. However, patients with a heart rate lower than 60 beats/min are usually moribund. Bleeding patients with a heart rate of 60 to 90 beats/min have a higher survival rate compared with patients who are tachycardic (heart rate >90 beats/min).11


The physiologic response to bleeding also differs subtly according to whether the source of bleeding is arterial or venous. Arterial bleeding is obviously problematic, but often stops temporarily on its own; the human body has evolved to trap the blood loss in adventitial tissues, and the transected artery will spasm and thrombose. A lacerated artery can actually bleed more than a transected artery, because the spasm of a lacerated artery can enlarge the hole in the vessel. Thrombosis of the artery sometimes does not occur in transected or lacerated vessels. Arterial bleeding, when constantly monitored, results in rapid hypotension: there is a leak in the arterial system and, because the arterial system is valveless, the recorded BP drops early, even before large-volume loss occurs. In these patients, hypotension ensues quickly, but because ischemia has not yet had a chance to occur, measurements of lactate or base deficit often yield normal results.


Venous bleeding, however, is slower; the human body compensates, and sometimes large volumes of blood are lost before hypotension ensues. In venous bleeding, there is time for lactate and base deficit results to become abnormal. Blood loss is often slower, but can still be massive before it is reflected in hypotension. The slower nature of venous bleeding also allows compensatory mechanisms to come into play as water is recruited intravascularly from cells and the interstitial spaces.


It is generally taught that the hematocrit or hemoglobin level is not reliable for predicting blood loss. This is true for patients with a high hematocrit or hemoglobin level, but in patients resuscitated with fluids, a rapid drop in the hematocrit and hemoglobin levels can occur immediately. Bruns and associates12 have shown that the hemoglobin level can be low within the first 30 minutes after the patient arrives at a trauma center. Therefore, although patients with a high or normal hemoglobin level may have significant bleeding, a low hemoglobin level, because it occurs rapidly, usually reflects the actual hemoglobin level and extent of blood loss. Infusion of acellular fluids often will dilute the blood and decrease the hemoglobin level even further.


The lack of good indicators to distinguish which patients are bleeding has led many investigators to examine heart rate variability or complexity as a potential new vital sign. Many clinical studies have shown that heart rate variability or complexity is associated with poor outcome, but this has yet to catch on, perhaps because of the difficulty of calculating it. Heart rate variability or complexity would have to be calculated using software, with a resulting index on which clinicians would have to rely; this information would not be available merely by examining patients. Another issue with heart rate variability or complexity is that the exact physiologic mechanism for its association with poor outcome has yet to be elucidated.13 This new vital sign may be programmable into currently used monitors, but its usefulness has yet to be confirmed.


Hypotension has been traditionally set, arbitrarily, at 90 mm Hg and below. However, Eastridge and coworkers14 have suggested that hypotension be redefined as 110 mm Hg and below, because that BP is more predictive of death and hypoperfusion. They concluded that 110 mm Hg would be a more clinically relevant cutoff point for hypotension and hypoperfusion. In 2008, Bruns and colleagues15 confirmed that concept, showing that a prehospital BP lower than 110 mm Hg was associated with a sharp increase in mortality, and 15% of patients with that BP would eventually die in the hospital. As a result, they recommended redefining prehospital triage systems. Of note, especially in older patients, normal vital signs may miss occult hypoperfusion as indicated by increased lactate levels and base deficit.16



Lactate and Base Deficit


Lactate has been a marker of injury, and possibly ischemia, and has stood the test of time.16 However, new data question the cause and role of lactate. Emerging information is confusing; it suggests that we may not fully understand what lactate truly signifies. Lactate has long been thought to be a byproduct of anaerobic metabolism and is routinely perceived to be an end waste product that is completely unfavorable. Physiologists are now questioning this paradigm and have found that lactate behaves more advantageously than not. An analogy would be that firefighters are associated with fires, but that does not mean that firefighters are bad, nor does it mean that they caused the fires.


Research has shown that lactate accumulates in muscle and blood during exercise; it is at its highest level at, or just after, exhaustion. Accordingly, it was assumed that lactate was a waste product. We also know that lactic acid appears in response to muscle contraction and continues to be produced in the absence of oxygen. In addition, accumulated lactate disappears when oxygen is present in tissues.


Recent evidence has indicated that lactate is an active metabolite, capable of moving among cells, tissues, and organs, where it may be oxidized as fuel or reconverted to form pyruvate or glucose. It now appears that increased lactate production and concentration, as a result of anoxia or dysoxia, are often the exception rather than the rule. Lactate seems to be a shuttle for energy; the lactate shuttle is now the subject of much debate. The end product of glycolysis is pyruvic acid. Lack of oxygen is thought to convert pyruvate into lactate. However, lactate formation may allow carbohydrate metabolism to continue through glycolysis. It is postulated that lactate is transferred from its site of production in the cytosol to neighboring cells and to various organs (e.g., heart, liver, kidney), where its oxidation and continued metabolism can occur.


Lactate is also being studied as a pseudohormone because it seems to regulate the cellular redox state through exchange and conversion into pyruvate and through its effects on the ratio of nicotinamide adenine dinucleotide to nicotinamide adenine dinucleotide (reduced)—the NAD+/NADH ratio. It is released into the systemic circulation and taken up by distal tissues and organs, where it also affects the redox state in those cells. Further evidence has shown that it affects wound regeneration, with the promotion of increased collagen deposition and neovascularization. Lactate may also induce vasodilation and catecholamine release and stimulate fat and carbohydrate oxidation.


Lactate levels in blood are highly dependent on the equilibrium between production and elimination from the bloodstream. The liver is predominantly responsible for clearing lactate; acute or chronic liver disease affects lactate levels. Lactate was always thought to be produced from anaerobic tissues, but it now seems that various tissue beds that are not undergoing anaerobic metabolism produce lactate when signaled of distress.


In canine muscle, lactate is produced by moderate-intensity exercise when the oxygen supply is ample. A high adrenergic stimulus also causes a rise in lactate level as the body prepares or responds to stress. A study of climbers of Mount Everest has shown that the resting PO2 on the summit was approximately 28 mm Hg and decreased even more during exercise.17 The blood lactate level in those climbers was essentially the same as at sea level. These studies have allowed us to question lactate and its true role.


In humans, lactate may be the preferred fuel in the brain and heart; infused lactate is used before glucose at rest and during exercise. Because it is glucose sparing, lactate allows glucose and glycogen levels to be maintained. However, some data point to lactate’s protective role in TBIs.18 Lactate fuels the human brain during exercise. The level of lactate, whether it is a waste product or source of energy, seems to signify tissue distress, from anaerobic conditions or other factors.19 Release of epinephrine and other catecholamines will result in higher lactate levels.


Base deficit, a measure of the number of millimoles of base required to correct the pH of 1 liter of whole blood to 7.4, seems to correlate well with lactate level, at least in the first 24 hours after an injury. Rutherford, in 1992, showed that a base deficit of 8 is associated with a 25% mortality rate in patients older than 55 years without a head injury or in patients younger than 55 with a head injury. When base deficit remains elevated, most clinicians believe that it is an indication of ongoing shock.


One of the problems with base deficit is that it is commonly influenced by the chloride from various resuscitation fluids, resulting in a hyperchloremic nongap acidosis. In patients with renal failure, base deficit can also be a poor predictor of outcome. In the acute stage of renal failure, a base deficit lower than 6 mmol/liter is associated with a poor outcome.20 When hypertonic saline (HTS), which has three to eight times the sodium chloride concentration of normal saline (depending on the concentration used), is given to trauma patients, the resulting hyperchloremic acidosis has been shown to be relatively harmless. However, when HTS is used, the base deficit should be interpreted with caution.



Compensatory Mechanisms


When needed, blood flow is diverted from less critical tissues to more critical tissues. The earliest compensatory mechanism in response to a decrease in intravascular volume is an increase in sympathetic activity. Such an increase is mediated by pressure receptors, or baroreceptors, in the aortic arch, atria, and carotid bodies. A decrease in pressure inhibits parasympathetic discharge while norepinephrine and epinephrine are liberated, activating adrenergic receptors in the myocardium and vascular smooth muscle. Heart rate and contractility are increased; peripheral vascular resistance is also increased, resulting in an increased BP. However, the various tissue beds are not affected equally; blood is shunted from less critical organs (e.g., skin, skeletal muscle, splanchnic circulation) to more critical organs (e.g., brain, liver, kidneys).


Then, the juxtaglomerular apparatus in the kidney—in response to the vasoconstriction and decrease in blood flow—produces the enzyme renin, which generates angiotensin I. The angiotensin-converting enzyme located on the endothelial cells of the pulmonary arteries converts angiotensin I to angiotensin II. In turn, angiotensin II stimulates an increased sympathetic drive, at the level of the nerve terminal, by releasing hormones from the adrenal medulla. In response, the adrenal medulla affects intravascular volume during shock by secreting catechol hormones—epinephrine, norepinephrine, and dopamine—which are all produced from phenylalanine and tyrosine. They are called catecholamines because they contain a catechol group derived from the amino acid tyrosine. The release of catecholamines is thought to be responsible for the elevated glucose level in hemorrhagic shock. Although the role of glucose elevation in hemorrhagic shock is not fully understood, it does not seem to affect outcome.21


Cortisol, released from the adrenal cortex, also plays a major role in that it helps control fluid equilibrium. In the adrenal cortex, the zona glomerulosa produces aldosterone in response to stimulation by angiotensin II. Aldosterone is a mineralocorticoid that modulates renal function by increasing the recovery of sodium and excretion of potassium. Angiotensin II also has a direct action on the renal tubules, promoting sodium reabsorption. The control of sodium is a primary mechanism whereby the human body controls water absorption or excretion in the kidneys. One of the problems in shock is that the release of hormones is not infinite; the supply can be exhausted.


This regulation of intravascular fluid status is further affected by the carotid baroreceptors and atrial natriuretic peptides. Signals are sent to the supraoptic and paraventricular nuclei in the brain. Antidiuretic hormone (ADH) is released from the pituitary, causing retention of free water at the level of the kidney. Simultaneously, volume is recruited from the extravascular and cellular spaces. A shift of water occurs as hydrostatic pressures fall in the intravascular compartment. At the capillary level, hydrostatic pressures are also reduced, because the precapillary sphincters are vasoconstricted more than the postcapillary sphincters.



Lethal Triad


The triad of acidosis, hypothermia, and coagulopathy is common in resuscitated patients who are bleeding or in shock from various factors. Our basic understanding is that inadequate tissue perfusion results in acidosis caused by lactate production. In the shock state, the delivery of nutrients to the cells is thought to be inadequate, so adenosine triphosphate (ATP) production decreases. The human body relies on ATP production to maintain homeostatic temperatures; ATP is the source of heat in all homeothermic (warm-blooded) animals. Thus, if ATP production is inadequate to maintain body temperature, the body will trend toward the ambient temperature. For most patients, this is 22° C (72° F), the temperature inside typical hospitals. The resulting hypothermia then affects the efficiency of enzymes, which work best at 37° C. For surgeons, the critical problem with hypothermia is that the coagulation cascade depends on enzymes affected by hypothermia; if enzymes are not functioning optimally because of hypothermia, coagulopathy worsens, which in surgical patients can contribute to uncontrolled bleeding from injuries or the surgery itself. Further bleeding continues to fuel the triad. The optimal method to stop this vicious cycle of death is to stop the bleeding and address the causes of hypothermia. In most typical scenarios, hypothermia is not spontaneous from ischemia but is induced by the use of room-temperature fluids or cold blood products.



Acidosis


Bleeding causes a host of responses. During the resuscitative phase, the lethal triad (acidosis, hypothermia, and coagulopathy) is frequent, most likely because of two major factors. First, tissue ischemia from the lack of blood flow results in lactic acidosis. Some believe that the acidotic state is not necessarily undesirable, because the body tolerates acidosis better than alkalosis. Oxygen is more easily offloaded from the hemoglobin molecules in the acidotic environment; many who try to preserve tissue have found that cells live longer in an acidotic environment. Correcting acidosis with sodium bicarbonate has classically been avoided because it is treating a number or symptom when the cause needs to be addressed. Treating the pH alone has shown no benefit, but it can lead to complacency; patients appear to be better resuscitated, but the underlying cause of their acidosis has not been adequately addressed. It is also argued that rapidly injecting sodium bicarbonate can worsen intracellular acidosis because of the diffusion of the converted CO2 into the cells.


The best fundamental approach to metabolic acidosis from shock is to treat the underlying cause of shock. However, some clinicians believe that treating the pH has advantages, because the enzymes necessary for the coagulation cascade work better at an optimal temperature and optimal pH. Coagulopathy can contribute to uncontrolled bleeding, so some have recommended treating acidosis for patients in dire scenarios. Treating acidosis with sodium bicarbonate may have a benefit in an unintended and unrecognized way. Rapid infusion is usually accompanied by a rise in BP in hypotensive patients, which is usually attributed to correcting the pH. However, sodium bicarbonate in most urgent situations is given in ampules. The 50-mL ampule of sodium bicarbonate has 1 mEq/mL—in essence, similar to giving a hypertonic concentration of sodium, which quickly draws fluid into the vascular space. Given its high sodium concentration, a 50-mL bolus of sodium bicarbonate has physiologic results similar to those of 325 mL of normal saline or 385 mL of LR. Essentially it is like giving small doses of HTS. Sodium bicarbonate quickly increases CO2 levels by its conversion in the liver, so if the minute ventilation is not increased, respiratory acidosis can result.
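
The equivalence cited above can be checked with simple arithmetic (a sketch only; it assumes the usual sodium concentrations of approximately 154 mEq/liter for normal saline and 130 mEq/liter for LR):

# Sodium load of one 50-mL ampule of sodium bicarbonate at 1 mEq/mL
ampule_sodium_meq = 50 * 1.0                       # 50 mEq of sodium
normal_saline_ml = ampule_sodium_meq / 0.154       # normal saline: ~154 mEq Na+ per liter -> about 325 mL
lactated_ringers_ml = ampule_sodium_meq / 0.130    # LR: ~130 mEq Na+ per liter -> about 385 mL
print(round(normal_saline_ml), round(lactated_ringers_ml))   # 325 385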


THAM (tromethamine; tris[hydroxymethyl]aminomethane) is a biologically inert amino alcohol of low toxicity that buffers CO2 and acids. It is sodium-free and limits the generation of CO2 in the process of buffering. At 37° C, the pKa of THAM is 7.8, making it a more effective buffer than sodium bicarbonate in the physiologic range of blood pH. In vivo, THAM supplements the buffering capacity of the blood bicarbonate system by generating sodium bicarbonate and decreasing the partial pressure of CO2. It rapidly distributes to the extracellular space and slowly penetrates the intracellular space, except in the case of erythrocytes and hepatocytes, and is excreted by the kidney. Unlike sodium bicarbonate, which requires an open system to eliminate CO2 to exert its buffering effect, THAM is effective in a closed or semiclosed system and it maintains its buffering ability during hypothermia. THAM acetate (0.3 M; pH, 8.6) is well tolerated, does not cause tissue or venous irritation, and is the only formulation available in the United States. THAM may induce respiratory depression and hypoglycemia, which may require ventilatory assistance and the administration of glucose.


The initial loading dose of THAM acetate (0.3 M) for the treatment of acidemia may be estimated as follows:



THAM (mL of 0.3 M solution) = lean body weight (kg) × base deficit (mmol/liter)



The maximal daily dose is 15 mmol/kg/day for an adult (3.5 liters of a 0.3-M solution in a patient weighing 70 kg). It is indicated in the treatment of respiratory failure (acute respiratory distress syndrome [ARDS] and infant respiratory distress syndrome) and has been associated with the use of hypothermia and permissive hypercapnia (controlled hypoventilation). Other indications are diabetic and renal acidosis, salicylate and barbiturate intoxication, and increased intracranial pressure associated with brain trauma. It is used in cardioplegic solutions and during liver transplantation. Despite these features, THAM has not been documented clinically to be more efficacious than sodium bicarbonate.
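
A brief worked example may help (illustrative numbers only, using the loading-dose estimate above, with total body weight standing in for lean body weight, and the stated maximal daily dose of 15 mmol/kg/day):

# Hypothetical 70-kg patient with a base deficit of 10 mmol/liter
weight_kg, base_deficit_mmol_per_liter = 70, 10
loading_dose_ml = weight_kg * base_deficit_mmol_per_liter     # ~700 mL of 0.3-M THAM acetate
max_daily_mmol = 15 * weight_kg                                # 1050 mmol/day ceiling
max_daily_ml = max_daily_mmol / 0.3                            # at 0.3 mmol/mL, 3500 mL/day (the 3.5 liters cited above)
print(loading_dose_ml, max_daily_ml)                           # 700 3500.0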



Hypothermia


Hypothermia can be beneficial and detrimental. A fundamental knowledge of hypothermia is of vital importance in the care of surgical patients. The beneficial aspects of hypothermia are mainly because of decreased metabolism. Injury sites are often iced, creating vasoconstriction and decreasing inflammation through decreased metabolism. This concept of cooling to slow metabolism is also the rationale behind using hypothermia to decrease ischemia during cardiac, transplantation, pediatric, and neurologic surgery. Also, amputated extremities are iced before reimplantation. Cold water near-drowning victims have higher survival rates because of the preservation of the brain and other vital organs. The Advanced Life Support Task Force of the International Liaison Committee on Resuscitation now recommends cooling (to 32° to 34° C) for 12 to 24 hours in unconscious adults who have spontaneous circulation after out-of-hospital cardiac arrest caused by ventricular fibrillation. Induced hypothermia is vastly different from spontaneous hypothermia, which is typically from shock, inadequate tissue perfusion, or cold fluid infusion.


Medical or accidental hypothermia is also very different from trauma-associated hypothermia (Table 5-2). The survival rates after accidental hypothermia range from approximately 12% to 39%; the average temperature drop is to approximately 30° C (range, 13.7° to 35.0° C). The lowest recorded temperature in a survivor of accidental hypothermia (13.7° C [56.7° F]) was in an extreme skier in Norway; she was trapped under the ice and eventually fully recovered neurologically.


Table 5-2 Classification of Hypothermia by Cause

                      CAUSE
DEGREE        TRAUMA                ACCIDENTAL
Mild          36°-34° C             35°-32° C
Moderate      34°-32° C             32°-28° C
Severe        <32° C (<90° F)       <28° C (<82° F)

The data in patients with trauma-associated hypothermia differ. Their survival rate falls dramatically as core temperature drops, reaching 100% mortality once the temperature falls to 32° C at any point, whether in the emergency room, operating room, or intensive care unit (ICU). In trauma patients, hypothermia is caused by shock and is thought to perpetuate uncontrolled bleeding because of the associated coagulopathy. Trauma patients with a postoperative core temperature lower than 35° C have a fourfold increase in mortality; lower than 33° C, a sevenfold increase. Hypothermic trauma patients tend to be more severely injured and older, with more bleeding, as indicated by blood loss and transfusion requirements.22


Surprisingly, in a study using the National Trauma Data Bank, Shafi and associates have shown that hypothermia and its associated poor outcome are not related to the state of shock. It was previously thought that a core temperature lower than 32° C was uniformly fatal in trauma patients, who have the additional insults of tissue injury and bleeding. However, a small number of trauma patients have now survived despite a recorded core temperature lower than 32° C. In a multi-institutional trial, Beilman and coworkers23 have recently demonstrated that hypothermia is associated with more severe injuries, bleeding, and a higher rate of multiorgan dysfunction in the ICU, but not with death.


To understand hypothermia, remember that humans are homeothermic (warm-blooded) animals, in contrast to poikilothermic (cold-blooded) animals, such as snakes and fish. To maintain a body temperature of 37° C, the hypothalamus uses various mechanisms to control core body temperature tightly. Oxygen is the key ingredient, or fuel, used to generate heat in the mitochondria in the form of ATP. When ATP production falls below a minimal threshold, body temperature drifts toward the ambient temperature, which is typically lower than core body temperature. In contrast, during exercise we use more oxygen because more ATP is required, and we produce excess heat; in an attempt to modulate core temperature, we perspire and exploit the cooling properties of evaporation.


Hypothermia, although potentially beneficial, is detrimental in trauma patients, mainly because it causes coagulopathy. Cold produces coagulopathy by decreasing enzyme activity, enhancing fibrinolytic activity, and causing platelet dysfunction. Platelets are affected by inhibition of thromboxane B2 production, resulting in decreased aggregation. A heparin-like substance is released, causing a disseminated intravascular coagulation (DIC)-like syndrome. Hageman factor (factor XII) and thromboplastin are among the enzymes most affected. Even a drop in core temperature of only a few degrees reduces the activity of some enzymes by as much as 40%.


Temperature affects the coagulation cascade so much that when blood is drawn from a cold patient and sent to the laboratory, the sample is warmed to 37° C before testing, because even 1° or 2° C of cooling delays clotting and makes the results inconsistent. Thus, if the coagulation profile of a cold, coagulopathic patient shows an abnormality, the result reflects the level of coagulopathy that would exist if the patient (and not just the sample) had been warmed to 37° C. A cold patient is therefore always more coagulopathic than the coagulation profile indicates, and a normal profile does not necessarily represent what is occurring in the body.


Heat is measured in calories. One calorie is the amount of energy required to raise the temperature of 1 mL of water by 1° C (water has, by definition, a specific heat of 1.0); it therefore takes 1 kcal to raise the temperature of 1 liter of water by 1° C. If an average man (weight, 75 kg) consisted of pure water, it would take 75 kcal to raise his temperature by 1° C. However, humans are not made of pure water; blood, for example, has a specific heat of 0.87, and the human body as a whole has a specific heat of 0.83. Therefore, it actually takes 62.25 kcal (75 kg × 0.83) to raise the body temperature by 1° C, and a loss of 62.25 kcal lowers it by 1° C. This basic science is important when choosing methods to retain heat or to treat hypothermia or hyperthermia, because it allows the efficacy of one method to be compared with another.
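As a minimal sketch of this arithmetic, the Python snippet below computes the kilocalories needed to change core temperature, using the specific heat of 0.83 quoted above; the function name is illustrative.

# Kilocalories required to change body temperature, using the specific heat
# of 0.83 kcal/kg/degree C quoted in the text.
BODY_SPECIFIC_HEAT = 0.83

def kcal_for_temp_change(weight_kg, delta_degrees_c):
    return weight_kg * BODY_SPECIFIC_HEAT * delta_degrees_c

print(kcal_for_temp_change(75, 1))   # 62.25 kcal to raise a 75-kg patient by 1 degree C
print(kcal_for_temp_change(75, 5))   # about 311 kcal to rewarm from 32 to 37 degrees C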


The normal basal metabolic heat generation is approximately 70 kcal/hr; shivering can increase this to 250 kcal/hr. Heat is transferred to and from the body by contact or conduction (as in a frying pan or Jacuzzi), by air or convection (as in an oven or sauna), by radiation, and by evaporation. Convection is an extremely inefficient way to transfer heat because air molecules are far apart compared with those in liquids and solids. Conduction and radiation are the most efficient ways to transfer heat. However, heating a patient with radiation is fraught with inconsistencies and technical challenges and is difficult to apply clinically, so conduction is left as the practical means of transferring energy efficiently.


Warming or cooling by manipulating the temperature of IV fluids is useful because it uses conduction to transfer heat. Although IV fluids can be warmed, the U.S. Food and Drug Administration (FDA) only allows fluid warmers to be set at a maximum of 40° C. Therefore, the differential between a cold trauma patient (34° C) and the warmed fluid is only 6° C, so 1 liter of warmed fluid can transfer only 6 kcal to the patient. As previously calculated, approximately 62 kcal is needed to raise the core temperature by 1° C; therefore, 10.4 liters of warmed fluid are needed to raise the core temperature by 1° C, to 35° C. Once that has been achieved, the differential between the patient and the warmed fluid is only 5° C, so it actually takes 12.5 liters of warmed fluid to raise the patient's temperature from 35° to 36° C. A cold patient at 32° C needs to be given 311 kcal (75 kg × 0.83 × 5° C) to be warmed to 37° C. Note also that warmed fluid must be infused at the highest rate possible; if the infusion is slow, the fluid cools toward room temperature as it passes through IV tubing exposed to ambient air. To avoid this line cooling, devices that warm fluids up to the point of insertion into the body should be used.


Warming patients by infusing warmed fluids is difficult, but fluid warmers are still critically important; the main reason to warm fluids is to avoid cooling the patient. Cold fluids can cool patients quickly. The fluids typically infused are at room temperature (22° C) or, in the case of blood products, at 4° C, the temperature of the refrigerator in which they are stored. It therefore takes only about 5 liters of 22° C fluid, or 2 liters of cold blood, to cool a patient by 1° C. Again, the main reason for using fluid warmers is not necessarily to warm patients but to prevent their cooling during resuscitation.
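The fluid volumes quoted in the two preceding paragraphs follow from the same specific-heat arithmetic. The sketch below, again in Python and purely illustrative, assumes each liter of infused fluid exchanges roughly one kilocalorie per degree of difference between the fluid and the patient, and that about 62 kcal shifts a 75-kg patient by 1° C.

# Liters of infused fluid needed to change a 75-kg patient's core temperature
# by 1 degree C, assuming each liter exchanges roughly
# (patient temperature - fluid temperature) kcal.
KCAL_PER_DEGREE_75KG = 62.25

def liters_per_degree(patient_temp_c, fluid_temp_c):
    kcal_per_liter = abs(patient_temp_c - fluid_temp_c)
    return KCAL_PER_DEGREE_75KG / kcal_per_liter

print(liters_per_degree(34, 40))   # about 10.4 L of 40-degree fluid to warm from 34 to 35 C
print(liters_per_degree(35, 40))   # about 12.5 L to warm from 35 to 36 C
print(liters_per_degree(37, 22))   # roughly 4-5 L of room-temperature fluid cools by 1 C
print(liters_per_degree(37, 4))    # about 2 L of refrigerated blood cools by 1 C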


Rewarming techniques are classified as passive or active; active warming is further classified as external or internal (Table 5-3). Passive warming involves preventing heat loss. Examples include drying the patient to minimize evaporative cooling, giving warmed fluids to prevent cooling, and covering the patient so that the air temperature immediately around the patient is higher than the room temperature. Covering the patient's head reduces a large amount of heat loss; aluminum-lined head covers are preferred because they reflect back the infrared radiation that is normally lost through the scalp. Warming the room also reduces the heat-loss gradient, but the surgical staff usually cannot work in a humidified room at 37° C. Passive warming also includes closing open body cavities, such as the chest or abdomen, to prevent evaporative heat loss. The most important way to prevent heat loss is to treat hemorrhagic shock by controlling bleeding; once shock has been treated, the body's metabolism will heat the patient from his or her core. This point cannot be overemphasized.


Table 5-3 Classification of Warming Techniques

                                        ACTIVE
PASSIVE                      EXTERNAL             INTERNAL
Drying the patient           Bair Hugger          Warmed fluids
Warm fluids                  Heated warmers       Heated ventilator air
Warm blankets, sheets        Lamps                Cavity lavage (chest tube, abdomen, bladder)
Head covers                  Radiant warmers      Continuous arterial or venous rewarming
Warming the room             Clinitron bed        Full or partial bypass

Active warming transfers calories to the patient, either externally through the skin or internally. Skin and fat are designed to be highly efficient at preventing heat transfer, so active external warming is inefficient compared with internal warming. Forced-air heating, such as with Bair Hugger temperature management therapy (Arizant Healthcare, Eden Prairie, Minn), is technically classified as active warming, but air is a very inefficient medium and few calories are actually delivered to the patient. Forced-air heating mainly raises the patient's ambient temperature, and it can actually cool the patient initially by increasing evaporative heat loss if the patient is wet from blood, fluids, clothes, or sweat. Warming the skin may feel good to the patient and surgeon, but it tricks the thermoregulatory nerve input from the skin and thereby decreases shivering, which is a highly efficient method of internal heat generation. Because forced-air heating relies on convection, the actual amount of active warming it provides is estimated to be only 10 kcal/hr.


Active external warming is better performed by placing patients on heating pads, which use conduction to transfer heat. Beds that can warm patients faster are available, such as the Clinitron bed (Hill-Rom, Batesville, Ind), which uses heated, air-fluidized beads. Such beds are not practical in the operating room but are applicable in the ICU. Removing wet sheets and wet clothes remains an essential aspect of rewarming.


The best method to warm patients is to deliver the calories internally (Table 5-4). Heating the air used for ventilators is technically internal active warming, but it is inefficient because, again, the heat transfer method is convection. The surface area of the lungs is massive, but the energy is transferred mainly through humidified water droplets, using convection rather than conduction, so the amount of heat delivered through warmed humidified air is minimal compared with methods that use conduction. Body cavities can be lavaged by infusing warmed fluids through chest tubes or by irrigating the abdominal cavity with warmed fluids. Other methods, which have been described but are rarely used in practice, include gastric and esophageal lavage with special tubes; if gastric lavage is desired, one method is to place two nasogastric tubes and infuse warm fluid through one while the other suctions the fluid back out. Bladder irrigation with an irrigation Foley catheter is useful. Instruments that warm the hand through conduction show much promise but are not yet readily available.


Table 5-4 Calories Delivered by Active Warming

METHOD                       kcal/hr
Airway from ventilator       9
Overhead radiant warmers     17
Heating blankets             20
Convective warmers           15-26
Body cavity lavage           35
CAVR                         92-140
Cardiopulmonary bypass       710

CAVR, Continuous arteriovenous rewarming.


The best means to deliver heat is through countercurrent exchange of fluids, using conduction to transfer calories. Again, heating IV fluids is technically active internal warming, but because of the limits on how much the fluids can be heated, it is relatively inefficient; heating fluids before infusion minimizes cooling more than it provides active warming. Full cardiopulmonary bypass is unmatched, delivering more than 5 liters/min of heated blood to every part of the body that contains capillaries. If full cardiopulmonary bypass is not available or not desired, alternatives include continuous venovenous or arteriovenous rewarming. Venovenous rewarming is most easily accomplished using the roller pump of a dialysis machine, which is often more readily available to the average surgeon. A prospective study has shown continuous arteriovenous rewarming to be highly effective; it can warm patients to 37° C in approximately 39 minutes, as compared with an average warming time of 3.2 hours using standard techniques. A special Gentilello arterial warming catheter is inserted into the femoral artery, and a second line is inserted into the opposite femoral vein. The pressure from the artery produces flow, which is directed through a fluid warmer and back into the vein. This method is highly dependent on the patient's BP, because flow is directly related to BP.


Over the last several decades, with the changes in resuscitation methods, the incidence of hypothermia has decreased and is now less of a problem. Dilutional coagulopathy also occurs less frequently because the use of crystalloids has been minimized.
