Lung Transplantation


Human lung transplantation was first attempted in 1963, but it was not until nearly 2 decades later that extended survival was achieved. Further refinements in patient selection, surgical technique, immunosuppression, and postoperative care have since facilitated the successful application of lung transplantation to a wide variety of advanced disorders of the airways, lung parenchyma, and pulmonary vasculature. The field has realized dramatic growth, with more than 47,000 procedures performed worldwide to date with approximately 3700 now performed annually. Nonetheless, serious problems persist that limit the utility of this procedure. The donor pool remains insufficient to meet the demands of the many desperately ill patients awaiting transplantation. Immunosuppressive therapy is associated with a number of troubling side effects, most notably a significant risk of infection and malignancy. Despite the use of immunosuppressive agents, rejection develops frequently and continually threatens organ function. Though lung transplantation offers the prospect of improved functional status and quality of life, long-term survival remains an elusive goal, with only half of recipients living beyond 5 years. In order to optimize outcomes in the face of these shortcomings, judicious selection of candidates is essential and care of recipients must be rendered in a meticulous and vigilant fashion by clinicians familiar with the hazards of posttransplant life.

Indications and Candidate Selection

Lung transplantation is a therapeutic option for a broad spectrum of chronic debilitating pulmonary disorders of the airways, parenchyma, and vasculature. Leading indications include chronic obstructive pulmonary disease (COPD; 28% of cases), idiopathic pulmonary fibrosis (IPF; 29% of cases), and cystic fibrosis (CF; 15% of cases). Other less common indications include emphysema due to alpha-1 antitrypsin deficiency, sarcoidosis, non-CF bronchiectasis, and lymphangioleiomyomatosis. Once a common indication for transplantation, idiopathic pulmonary arterial hypertension (IPAH) now accounts for less than 3% of procedures, reflecting major advances in the medical management of these patients. Transplantation of patients with lung involvement due to collagen vascular disease remains controversial due to concerns that extrapulmonary manifestations of the systemic disease could compromise the posttransplant course. In particular, the esophageal dysmotility and reflux that frequently characterize scleroderma could increase the risk of aspiration and accelerated graft loss. The demonstration that posttransplantation survival of scleroderma patients is comparable with that of other patient populations provides some reassurance that carefully selected patients can benefit from this procedure. Use of lung transplantation for locally advanced bronchioloalveolar carcinoma (now referred to as adenocarcinoma in situ) has largely been abandoned due to an unacceptably high rate of cancer recurrence.

Many transplant centers define an age cutoff for transplant eligibility, typically 65–70 years. In support of this policy, advanced recipient age has been consistently identified as a risk factor for increased posttransplant mortality. Nonetheless, there has been a growing trend to expand the age range on the basis of the argument that “functional” rather than chronologic age should be considered. This trend has been most pronounced in the United States, where patients 65 years and older accounted for 27% of transplant recipients in 2011 compared with 3% in 2001. Two recent single-center case series involving 50 and 78 patients, respectively, who were 65 years or older found no difference in 1-year and 3-year posttransplant survival rates compared with younger cohorts. However, the United Network for Organ Sharing (UNOS) database of U.S. transplants documents a 10-year survival rate among recipients 65 and older of only 13% compared with 23% for those 50 to 64 years and 38% for those younger than 50 years.

There are surprisingly few remaining absolute contraindications to lung transplantation. There is general consensus that the following contraindicate transplantation: (1) recent malignancy (other than nonmelanoma skin cancer); (2) active infection with hepatitis B or C virus associated with histologic evidence of significant liver damage; (3) active or recent cigarette smoking, drug abuse, or alcohol abuse; (4) severe psychiatric illness; (5) repeated noncompliance with medical care; and (6) absence of a consistent and reliable social support network. Infection with human immunodeficiency virus (HIV) is still viewed by most centers as an absolute contraindication, but promising results with liver, kidney, and heart transplantation in HIV-positive recipients, as well as a recent case report of successful lung transplantation, may soon remove this barrier.

The presence of significant extrapulmonary vital organ dysfunction precludes isolated lung transplantation, but multiorgan procedures such as heart-lung or lung-liver can be considered in highly selected patients. Both obesity and underweight nutritional status increase the risk of posttransplant mortality, but cutoffs for exclusion of candidates vary among centers. The risk posed by other chronic medical conditions such as diabetes mellitus, osteoporosis, gastroesophageal reflux, and coronary artery disease must be assessed individually on the basis of severity of disease, presence of end-organ damage, and ease of control with standard therapies.

Prior pleurodesis is associated with an increased risk of intraoperative bleeding, particularly when cardiopulmonary bypass is used, but is not a contraindication to transplantation in experienced surgical hands. Pleural thickening associated with aspergillomas similarly complicates anatomic dissection and explantation of the native lung and carries the additional risk of soiling the pleural space with fungal organisms.

Among candidates with CF, colonization with certain species comprising the Burkholderia cepacia complex, in particular Burkholderia cenocepacia (previously known as genomovar III), is considered a strong contraindication by the majority of centers, owing to the demonstrated propensity of this organism to cause lethal posttransplant infections. In contrast, the presence of pan-resistant Pseudomonas aeruginosa in this patient population is associated with acceptable outcomes and should not be viewed as a contraindication.

Transplantation of patients on mechanical ventilation is associated with increased short-term posttransplant mortality though it does not appear to affect outcomes beyond the first year. Although transplantation of these patients was previously discouraged, the new lung allocation system in the United States has prompted reconsideration of this perspective by assigning high allocation scores to ventilator-dependent patients. Many programs are now willing to maintain some ventilator-dependent patients on their active waiting list, anticipating that the high allocation score will expedite transplantation, but reserving the option of de-listing patients who develop intercurrent complications or progressive debility. An analysis of 586 ventilator-dependent patients in the UNOS database documents inferior but not necessarily prohibitively poor short-term outcomes; 1-year and 2-year survival rates were 62% and 57%, respectively, compared with 79% and 70% for nonventilated patients. Even more controversial is transplantation of patients on extracorporeal membrane oxygenation (ECMO) support, for whom 1-year and 2-year posttransplant survival rates were only 50% and 45%, respectively, in the UNOS database. More recent single-center reports document more promising outcomes, and increasing availability of ambulatory ECMO techniques may improve outcomes in the future.

Timing of Referral and Listing

Listing for transplantation is considered at a time when the lung disease limits basic activities of daily living and is deemed to pose a high risk of death in the short term. Disease-specific guidelines for timely referral and listing of patients, based on available predictive indices, have been published ( Table 106-1 ). The imprecise nature of these predictive indices can make decisions about transplant listing problematic for all but the most severely ill patients. The patient’s perception of an unacceptably poor quality of life is an important additional factor to consider but should not serve as the sole justification for listing of a patient whose disease is not deemed to be at an advanced and potentially life-threatening stage.

Table 106-1

Disease-Specific Guidelines for Listing for Lung Transplantation


Chronic Obstructive Pulmonary Disease

  • BODE index of 7–10 or at least one of the following:

    • History of hospitalization for exacerbation associated with acute hypercapnia (PCO2 > 50 mm Hg)

    • Pulmonary hypertension or cor pulmonale, or both, despite oxygen therapy

    • FEV1 < 20% and either DLCO < 20% or homogeneous distribution of emphysema


Idiopathic Pulmonary Fibrosis

  • Histologic or radiographic evidence of UIP and any of the following:

    • DLCO < 39% predicted

    • ≥10% decrement in FVC during 6 months of follow-up

    • Decrease in pulse oximetry to < 88% during a 6MWT

    • Honeycombing on HRCT (fibrosis score > 2)


Cystic Fibrosis

  • FEV1 < 30% of predicted or rapidly declining lung function if FEV1 > 30% (females and patients < 18 yr have a poorer prognosis; consider earlier listing) and/or any of the following:

    • Increasing oxygen requirements

    • Hypercapnia

    • Pulmonary hypertension


Idiopathic Pulmonary Arterial Hypertension

  • Persistent NYHA class III or IV on maximal medical therapy

  • Low (<350 m) or declining 6MWT distance

  • Failing therapy with intravenous epoprostenol or equivalent

  • Cardiac index < 2 L/min/m2

  • Right atrial pressure > 15 mm Hg


Sarcoidosis

  • NYHA functional class III or IV and any of the following:

    • Hypoxemia at rest

    • Pulmonary hypertension

    • Elevated right atrial pressure > 15 mm Hg

BODE, [b]ody mass index, airflow [o]bstruction, [d]yspnea, [e]xercise capacity; DLCO, diffusing capacity for carbon monoxide; FEV1, forced expiratory volume in 1 second; FVC, forced vital capacity; HRCT, high-resolution computed tomography; 6MWT, 6-minute walk test; NYHA, New York Heart Association; PCO2, partial pressure of carbon dioxide; UIP, usual interstitial pneumonia.

Modified from Orens JB, Estenne M, Arcasoy S, et al: International guidelines for the selection of lung transplant candidates: 2006 update. J Heart Lung Transplant 25:745–755, 2006.
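The BODE index cited for COPD combines four measures, each scored against published cutoffs. The sketch below illustrates the calculation, with thresholds taken from the original description of the index (Celli et al., NEJM 2004); it is offered for orientation, not as a clinical tool.

```python
def bode_index(bmi: float, fev1_pct: float, mmrc: int, walk_m: float) -> int:
    """BODE index (0-10) for COPD: [B]ody mass index, airflow
    [O]bstruction (FEV1 % predicted), [D]yspnea (mMRC scale 0-4),
    and [E]xercise capacity (6-minute walk distance in meters).
    Higher totals indicate a worse prognosis."""
    score = 0

    # Airflow obstruction: FEV1 % predicted
    # >=65 -> 0 pts; 50-64 -> 1; 36-49 -> 2; <=35 -> 3
    if fev1_pct <= 35:
        score += 3
    elif fev1_pct <= 49:
        score += 2
    elif fev1_pct <= 64:
        score += 1

    # Exercise capacity: 6-minute walk distance (m)
    # >=350 -> 0 pts; 250-349 -> 1; 150-249 -> 2; <=149 -> 3
    if walk_m <= 149:
        score += 3
    elif walk_m <= 249:
        score += 2
    elif walk_m <= 349:
        score += 1

    # Dyspnea: modified MRC scale (0-1 -> 0 pts; 2 -> 1; 3 -> 2; 4 -> 3)
    if mmrc == 4:
        score += 3
    elif mmrc == 3:
        score += 2
    elif mmrc == 2:
        score += 1

    # Body mass index: >21 -> 0 pts; <=21 -> 1
    if bmi <= 21:
        score += 1

    return score
```

A patient with a BMI of 20, FEV1 30% predicted, mMRC grade 4 dyspnea, and a 100-m walk distance scores the maximum of 10, within the 7–10 range that supports listing.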

Allocation System

Rules governing allocation of organs vary among countries but typically employ a time-based or need-based ranking of candidates on the waiting list, or some combination of the two systems. Examination of the systems that have been operative in the United States permits an appreciation of the advantages and limitations of both approaches. From 1990 to 2005, lung allocation in the United States prioritized candidates on the basis of the amount of time they had accrued on the waiting list, without regard to severity of illness. Based on a simple and objective parameter, this system was easily understood but was ultimately called into question because it failed to accommodate those patients with a more rapidly progressive course who often could not survive the prolonged waiting times. In response to the perceived inequities of the time-based system, and under mandate of the federal government, a new system was implemented in 2005. It allocates lungs on the basis of both medical urgency (risk of death without a transplant) and “net transplant benefit” (the extent to which transplantation will extend survival). It uses predictive models, incorporating more than a dozen variables, to generate predictions for a given patient of 1-year survival with and without transplantation. A raw lung allocation score (LAS) is then calculated on the basis of these survival predictions and normalized to a scale of 0 to 100 for ease of use. Because 1-year survival without transplantation is factored into both the net transplant benefit and medical urgency measures, it affects the LAS more than posttransplantation survival, which is used only in the net transplant benefit calculation. As designed, the system preferentially allocates lungs to sicker patients while attempting to avoid situations in which outcomes are so poor that there would be no meaningful survival benefit.
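The double weighting of waiting-list survival can be made concrete with a small sketch. Assuming the construction described in OPTN policy documents, in which the raw score equals the transplant benefit measure (predicted days survived in the first posttransplant year minus predicted days survived on the waiting list) minus the medical urgency measure (predicted days survived on the waiting list), the normalization to 0–100 looks like this:

```python
def lung_allocation_score(wl_auc_days: float, pt_auc_days: float) -> float:
    """Normalized LAS from predicted survival areas-under-curve.

    wl_auc_days: expected days survived during the next year WITHOUT
        a transplant (area under the waiting-list survival curve, 0-365).
    pt_auc_days: expected days survived during the first year AFTER
        transplant (0-365).
    """
    transplant_benefit = pt_auc_days - wl_auc_days
    medical_urgency = wl_auc_days
    # Waiting-list survival enters both measures, so it is counted twice.
    raw = transplant_benefit - medical_urgency
    # Raw score spans -730 (survives the year only without transplant)
    # to +365 (survives the year only with one); rescale to 0-100.
    return 100.0 * (raw + 730.0) / 1095.0
```

For example, a candidate predicted to survive 300 of the next 365 days without transplantation and 330 days with one receives a score of about 42; shortening the predicted waiting-list survival raises the score far faster than lengthening the predicted posttransplant survival does, which is the behavior described above.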

Since its implementation, the LAS system has had a profound and favorable effect on the dynamics of lung transplantation in the United States. Because there is no longer an incentive to place patients on the active waiting list simply to accrue time (many of whom were ultimately deactivated rather than transplanted), the number of actively listed patients has fallen to approximately one half of the previous level. Median waiting time, which had ranged from 2 to 3 years under the time-based allocation system, has decreased to less than 6 months, and one quarter of patients are waiting less than 35 days. Importantly, there has been a significant reduction in the annual death rate of patients on the waiting list, one of the stated objectives of the new system. Notably, preferential transplantation of sicker patients has not resulted in an increase in early mortality following transplantation. Further experience will be required to determine the impact of the new system on long-term outcomes following transplantation.

Bridging to Transplantation: Artificial Lung Technologies

As mentioned earlier, ECMO has been used to bridge critically ill patients to lung transplantation, though, historically, outcomes following transplantation have been suboptimal. Advances in artificial lung technology, including improved membranes, improved pumps, and even ambulatory support systems, make it increasingly possible to support selected patients successfully, permitting them to survive the wait for a suitable donor lung, and, importantly, to achieve a successful posttransplant outcome.

Patients with isolated hypercapnic respiratory failure can be bridged with pumpless devices such as the interventional lung assist (iLA) from Novalung, a low-resistance device with a meshwork of hollow fibers that maximizes blood/gas diffusion; with this device, blood is propelled by the patient's own arterial pressure. Patients requiring oxygenation support can be supported with veno-venous configured pump devices. Patients requiring circulatory support, as well as gas exchange support, can be managed with a conventional veno-arterial configuration. It is important to understand the underlying physiology of the patient and to select the device configuration that provides the necessary support ( Fig. 106-1 ). A strategy unique to patients with pulmonary arterial hypertension is placement of the pumpless iLA device from pulmonary artery to left atrium to offload the right ventricle and provide an “oxygenating septostomy” physiology. This strategy has effectively abolished wait-list mortality in the group of patients that traditionally has the highest mortality on the wait list.

Figure 106-1

Selection of extracorporeal lung support device and configuration.

The choice of support device is largely dependent on the type of respiratory failure (hypercarbic or hypoxic) and the hemodynamic status (stable or unstable). LA, left atrium; PA, pulmonary artery; PH, pulmonary hypertension; RV, right ventricle.

(From Cypel M, Keshavjee S: Extracorporeal life support pre and post lung transplantation. ECMO Extracorporeal Cardiopulmonary Support in Critical Care (ELSO Red Book), ed 4. Ann Arbor, MI, 2011, Extracorporeal Life Support Organization.)

Donor Selection and Management

In addition to meeting strict criteria for declaration of brain death, cadaveric lung donors are selected on the basis of established guidelines ( Table 106-2 ). Lungs are a particularly fragile organ in the brain-dead patient and are frequently compromised by volume overload, contusion, aspiration of gastric contents, or pneumonia, as well as by extensive prior smoking. As a result, the vast majority of donors fail to meet standard criteria for lung donation, leading to a historical recovery rate of only 15% from cadaveric organ donors deemed able to donate other organs. Although it is reasonable to be conservative with patient safety in mind, there is mounting evidence that these standard criteria may in fact be too stringent, leading to unnecessary wastage of suitable lungs. In one study, 29 pairs of lungs that had been rejected for transplantation were assessed for the magnitude of extravascular water content, intactness of alveolar fluid clearance capacity, and presence of pneumonia or emphysema. Twelve pairs (41%) were found to have minimal or no abnormalities and thus to be “potentially suitable” for transplantation. Additional evidence comes from published reports documenting that outcomes with use of “extended criteria” donors are similar to those achieved with use of donors meeting standard criteria. Use of modified donor management protocols to optimize lung function through judicious fluid management, therapeutic bronchoscopy, and lung recruitment maneuvers has also been shown to enhance lung retrieval rates. Additionally, a recent multicenter, randomized trial demonstrated that use of a low tidal volume, lung-protective ventilatory protocol (6 to 8 mL/kg; PEEP 8 to 10 cm H2O) in brain-dead potential organ donors resulted in a doubling of lung harvest rates (54% vs. 27%) compared with a conventional ventilatory protocol (10 to 12 mL/kg; PEEP 3 to 5 cm H2O).

Table 106-2

Standard Lung Donor Criteria

  • Age < 55 yr

  • Clear chest radiograph

  • PaO2 > 300 mm Hg on FIO2 of 1.0, PEEP 5 cm H2O

  • Cigarette smoking history < 20 pack-years

  • Absence of significant chest trauma

  • No evidence of aspiration or sepsis

  • No prior thoracic surgery on side of harvest

  • Absence of organisms on sputum Gram stain

  • Absence of purulent secretions and gastric contents at bronchoscopy

  • Negative for HIV antibody, hepatitis B surface antigen, and hepatitis C antibody

  • No active or recent history of malignancy (excluding localized squamous or basal cell skin cancer, localized cervical cancer, and primary brain tumors with low metastatic potential and in the absence of invasive procedures to the brain and skull)

  • No history of significant chronic lung disease

FIO2, fractional concentration of oxygen in inspired gas; HIV, human immunodeficiency virus; PaO2, arterial oxygen pressure; PEEP, positive end-expiratory pressure.

Despite increases in the number of organs successfully retrieved, the demand for organs continues to outstrip supply, prompting a search for alternatives to the brain-dead donor pool. One emerging source is the non–heart-beating or donation after cardiac death (DCD) donor who has experienced either an out-of-hospital (i.e., uncontrolled) arrest or a planned withdrawal of life support in the operating room. Currently only 1% of lung transplants performed in the United States utilize DCD donors; in contrast, DCD donors account for 12% of lung transplants in Australia. Data suggest that short- and medium-term outcomes are as good as or better than those associated with use of traditional brain-dead donors.

Once a donor has been identified, matching with potential recipients is based on size and ABO blood group compatibility. Prospective human leukocyte antigen (HLA) matching is not performed. However, potential candidates identified through standard pretransplant screening as having preformed circulating antibodies to foreign HLA antigens require either prospective donor-recipient lymphocytotoxic cross-matching or avoidance of donors with specific incompatible antigens.

Lung Preservation

The standard of lung preservation is hypothermic flush preservation, most commonly with Perfadex solution (Vitrolife, Sweden). Cold flush preservation at 4° C decreases the metabolic rate to 5% of normal and thereby slows the dying process of the lung. Although this approach has been useful for clinical lung transplantation, cold static preservation has significant limitations: (1) a decision regarding utilization must be made quickly, with limited information, in the donor hospital; (2) once the organ is flushed, there is no second chance to reevaluate it before removal of the cross clamp at reperfusion; and (3) the focus is on slowing the dying process rather than on diagnosing, treating, repairing, or regenerating the donor lung.

Ex vivo lung perfusion has been developed to address these limitations. It is now possible to perfuse lungs ex vivo at normothermia for extended periods, thus creating a platform for more detailed assessment of lung function, more accurate diagnosis, and targeted treatment of donor lung injuries to improve the function of the lung after transplantation. This creates the opportunity to engineer donor organs with gene therapy, cell therapy, and other advanced treatments to create “super-organs” that hopefully will one day afford the recipient long-term allograft function.

Ex vivo lung perfusion has been shown to increase donor lung utilization by enabling transplantation of lungs that previously could not be used. Short-term outcomes using lungs conditioned in this fashion have been highly favorable. Ex vivo lung perfusion is now standard practice in the Toronto Lung Transplant Program and is being increasingly applied worldwide. The U.S. Food and Drug Administration recently approved the XVIVO Perfusion System for use in the United States.

Available Surgical Techniques

Four surgical techniques have been developed: heart-lung transplantation (HLT), single-lung transplantation (SLT), bilateral-lung transplantation (BLT), and living donor bilobar transplantation. The choice of procedure is dictated by such factors as the underlying disease, age of the patient, survival and functional advantages, donor organ availability, and center-specific preferences. Currently, SLT and BLT account for more than 97% of all procedures performed.

Heart-Lung Transplantation

HLT was the first procedure to be performed successfully, but it has largely been supplanted by techniques to replace the lungs alone. Currently, fewer than 100 procedures are performed worldwide annually. Indications are largely restricted to Eisenmenger syndrome with surgically uncorrectable cardiac lesions and to advanced lung disease with concurrent severe left ventricular dysfunction or extensive coronary artery disease. In the past, the presence of profound right ventricular dysfunction in the setting of severe pulmonary hypertension was deemed to be an indication for heart-lung transplantation. However, subsequent experience with isolated lung transplantation has demonstrated the remarkable ability of the right ventricle to recover once pulmonary artery pressures have normalized.

Single-Lung Transplantation

SLT was, until recently, the most commonly performed procedure. Traditionally, a standard posterolateral thoracotomy was utilized, but some surgeons now employ a less invasive anterior axillary muscle-sparing approach in selected cases. Three anastomoses are executed—main-stem bronchus, pulmonary artery, and left atrium (incorporating the two pulmonary veins). Compared with BLT, SLT permits more efficient use of the limited donor supply and is better tolerated by less robust patients, but it provides less functional reserve in the setting of allograft dysfunction. It is an acceptable option for patients with pulmonary fibrosis and COPD. SLT has also been performed successfully in carefully selected patients with severe pulmonary hypertension. In this setting, however, there is an increased risk of perioperative allograft edema because the freshly transplanted lung must bear the burden of nearly the entire cardiac output. This concern has prompted the vast majority of centers to abandon this approach in favor of the bilateral procedure. Because of infectious concerns, SLT is contraindicated in patients with suppurative lung disorders such as CF.

Bilateral-Lung Transplantation

BLT involves the performance of two single-lung transplant procedures in succession during a single operative session. Surgical approaches include transverse thoracosternotomy (“clamshell”) incision, bilateral anterolateral thoracotomies (sparing the sternum), and median sternotomy. In the absence of severe pulmonary hypertension, cardiopulmonary bypass can often be avoided by sustaining the patient on the contralateral lung during implantation of each allograft. The principal indications for this procedure are CF, other forms of bronchiectasis, and severe primary and secondary forms of pulmonary hypertension. In addition, many programs now advocate its use for patients with COPD, arguing that it offers functional and survival advantages over SLT. Although it is also being employed with increasing frequency in treatment of fibrotic lung disorders, the justification for this is less clear. As a result of these trends, BLT now accounts for three quarters of all procedures performed worldwide.

Living Donor Bilobar Transplantation

Living donor bilobar transplantation was developed chiefly to serve the needs of candidates with far-advanced or deteriorating status that would not allow them to tolerate a protracted wait for a cadaveric donor. The procedure involves transplantation of lower lobes from each of two living, blood group–compatible donors. In order to ensure that the lobes will adequately fill the hemithoraces, it is preferable to employ donors who are taller than the recipient. Patients with CF are particularly well suited as a target population because, even as adults, they tend to be of small stature. Intermediate-term functional outcomes and survival among recipients are similar to those achieved with cadaveric transplantation. Concerns about excessive risk to the donor have thus far proved to be unfounded. In the two largest series published to date involving a combined total of 315 donors, there were no deaths or episodes of postoperative respiratory failure, and only 9 donors (2.9%) experienced complications of sufficient magnitude to warrant surgical reexploration. Donation of a lobe results in an average decrement in vital capacity of 17%, a degree of loss that should be of little functional significance in an otherwise normal individual. Despite the apparent low risk posed to the donor, living donor transplantation has not gained widespread acceptance. Its use has been further undermined by the LAS allocation system, which expedites transplantation of more severely ill candidates; only nine living-donor transplantation procedures have been performed in the United States since implementation of the LAS system.

Routine Posttransplantation Management and Outcomes

Care of the lung transplant recipient requires close surveillance to ensure that the allograft is functioning properly, that immunosuppressive medications are properly administered and tolerated, and that complications are detected early and treated expeditiously. Most centers require patients to return frequently for office visits, blood tests, and chest radiographs during the initial 2 to 3 months following transplantation and to participate in an intensive pulmonary rehabilitation program during this time. Analogous to home glucose monitoring of the diabetic patient, lung transplant recipients chart their pulmonary function on a daily basis with a handheld microspirometer and are instructed to contact the transplant center if a sustained fall of greater than 10% in the forced expiratory volume in 1 second (FEV1) or forced vital capacity is documented.
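The home-monitoring rule lends itself to a simple screening function. The sketch below interprets “sustained” as three consecutive daily readings more than 10% below baseline; that operational definition is an assumption for illustration, not a standard taken from the text.

```python
def flag_sustained_decline(baseline_fev1: float, daily_fev1: list[float],
                           threshold: float = 0.10, run_length: int = 3) -> bool:
    """Flag a sustained fall of more than `threshold` (fraction) in daily
    home-spirometry FEV1 relative to baseline.

    'Sustained' is modeled here (an assumption) as `run_length`
    consecutive readings below baseline * (1 - threshold).
    """
    cutoff = baseline_fev1 * (1.0 - threshold)
    consecutive = 0
    for value in daily_fev1:
        if value < cutoff:
            consecutive += 1
            if consecutive >= run_length:
                return True  # patient should contact the transplant center
        else:
            consecutive = 0  # an isolated dip resets the run
    return False
```

With a baseline FEV1 of 3.0 L, readings of 2.9, 2.6, 2.6, 2.6 L would trigger the flag, whereas a single low reading bracketed by normal values would not.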

Many transplant programs employ frequent surveillance bronchoscopies and transbronchial lung biopsies within the first posttransplant year as a means of monitoring the allograft. Such an approach has been demonstrated to detect low-grade rejection and subclinical cytomegalovirus (CMV) pneumonitis in up to 30% of asymptomatic, clinically stable patients. However, it has yet to be determined whether treatment of clinically silent disease has a beneficial impact on long-term graft function.

Immunosuppressive therapy is initiated immediately at the time of transplantation and is maintained lifelong. No consensus currently exists on the role of induction therapy with lymphocyte/thymocyte-depleting globulin preparations or interleukin-2 (IL-2) receptor antagonists (basiliximab and daclizumab), and only half of all centers currently employ this strategy. The lack of consensus reflects insufficient and conflicting data on the ability of these agents to reduce the incidence of acute rejection and bronchiolitis obliterans syndrome (BOS) in the lung transplant population. Maintenance therapy consists of a calcineurin inhibitor (cyclosporine or tacrolimus), purine synthesis inhibitor (azathioprine or mycophenolate), and prednisone. Sirolimus (also known as rapamycin ), an inhibitor of IL-2-stimulated T-cell proliferation, is the newest immunosuppressive agent to be introduced into clinical practice. Use of this agent in place of a purine synthesis inhibitor does not reduce the incidence of acute rejection or BOS and is associated with a number of bothersome side effects that commonly lead to discontinuation of the drug. Lacking inherent nephrotoxicity, sirolimus has been successfully substituted for calcineurin inhibitors in patients with renal insufficiency, leading to recovery of renal function without undue risk of rejection. Sirolimus impairs wound healing and has been associated with life-threatening bronchial anastomotic dehiscence when used immediately following transplantation. As a result, the drug should never be initiated until complete healing of the bronchial anastomosis has been documented.

Individuals providing care to transplant recipients must be familiar with the administration, side effects, and drug interactions of these immunosuppressive agents ( Table 106-3 ). Although serving as the cornerstone of therapy, the use of calcineurin inhibitors is particularly challenging. When administered orally, the bioavailability of these agents is poor and unpredictable, necessitating frequent monitoring of blood levels to ensure appropriate dosing. These drugs are metabolized via the hepatic cytochrome P-450 system, and blood levels are influenced by the concurrent administration of other drugs that affect this enzymatic pathway. Adverse effects of these agents, as well as of the other drugs commonly utilized, are legion and contribute significantly to the morbidity associated with transplantation.

Table 106-3

Commonly Used Immunosuppressive Medications

Medication (Class): Cyclosporine and tacrolimus (calcineurin inhibitors)
Dosing*: Cyclosporine dosed to achieve a whole blood trough level of 250–350 ng/mL (first year), then 200–300 ng/mL; tacrolimus dosed to achieve a whole blood trough level of 10–12 ng/mL (first year), then 6–8 ng/mL
Adverse effects: Neurotoxicity (tremor, seizures, white matter disease, headache); hemolytic-uremic syndrome; hirsutism (cyclosporine); gingival hyperplasia (cyclosporine)
Drug interactions: Macrolide antibiotics (except azithromycin); azole antifungals; diltiazem, verapamil; grapefruit juice

Medication (Class): Sirolimus (mTOR inhibitor)
Dosing*: Dosed to achieve a whole blood trough level of 6–12 ng/mL
Adverse effects: Thrombocytopenia; peripheral edema; impaired wound healing; interstitial pneumonitis
Drug interactions: Same as calcineurin inhibitors

Medication (Class): Azathioprine (purine synthesis inhibitor)
Dosing*: 2 mg/kg/day
Adverse effects: Leukopenia; macrocytic anemia; hypersensitivity reaction (fever, hypotension, rash)
Drug interactions: Synergistic bone marrow suppression when administered with allopurinol

Medication (Class): Mycophenolate mofetil (purine synthesis inhibitor)
Dosing*: 1000–1500 mg bid
Adverse effects: Diarrhea
Drug interactions: Concurrent use of cyclosporine may decrease serum concentrations of mycophenolate by limiting biliary secretion/enterohepatic recycling

Medication (Class): Prednisone (corticosteroid)
Dosing*: 0.5 mg/kg/day for 6–12 wk, then tapered to 0.15 mg/kg/day
Adverse effects: Hyperglycemia; weight gain; avascular necrosis; mood changes
Drug interactions: No significant interactions

Medication (Class): Polyclonal antilymphocyte or antithymocyte globulin
Dosing*: Dose depends on specific preparation used
Adverse effects: Leukopenia; serum sickness; “cytokine release syndrome” (fever, hypotension)
Drug interactions: No significant interactions

Medication (Class): Basiliximab (monoclonal IL-2 receptor antagonist)
Dosing*: 20 mg IV on days 1 and 4
Adverse effects: Hypersensitivity reactions (rare)
Drug interactions: No significant interactions

IL-2, interleukin-2.

* Dosing is based on the protocol used at the Hospital of the University of Pennsylvania; dosing may vary among transplant centers.

Measured by high-performance liquid chromatography assay.
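Because trough targets differ by agent and by time since transplantation, the monitoring logic behind Table 106-3 can be sketched as a simple lookup. This is an illustrative sketch only: the ranges below are the Table 106-3 values from a single center's protocol, targets vary among transplant programs, and the `TROUGH_TARGETS` table and `trough_in_range` function are hypothetical names introduced here for the example.

```python
# Target whole blood trough levels (ng/mL) from Table 106-3.
# Illustrative only; actual targets vary among transplant centers.
TROUGH_TARGETS = {
    "cyclosporine": {"first_year": (250, 350), "later": (200, 300)},
    "tacrolimus":   {"first_year": (10, 12),   "later": (6, 8)},
    "sirolimus":    {"first_year": (6, 12),    "later": (6, 12)},
}

def trough_in_range(drug: str, level_ng_ml: float, months_post_tx: int) -> bool:
    """Return True if a measured trough falls within the target range.

    Targets tighten after the first posttransplant year for the
    calcineurin inhibitors; sirolimus uses one range throughout.
    """
    period = "first_year" if months_post_tx < 12 else "later"
    low, high = TROUGH_TARGETS[drug][period]
    return low <= level_ng_ml <= high
```

For example, a tacrolimus trough of 11 ng/mL is within target at 6 months but above the 6–8 ng/mL target beyond the first year, illustrating why dosing must be reassessed as the posttransplant interval lengthens.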

Management of medical comorbidities is an essential component of the care of the lung transplant recipient. Common medical issues that emerge in this population include osteoporosis, hypertension, renal insufficiency, coronary artery disease, diabetes mellitus, and hyperlipidemia. Treatment of these conditions is similar to that of the general population.


Survival

Current 1-, 5-, and 10-year survival rates following lung transplantation are 82%, 55%, and 33%, respectively. Survival rates have steadily improved over time, as indicated by an increase in median survival from 3.9 years in 1990–1997 to 6.1 years in 2005–2012. Disease-specific differences in survival are apparent but may be confounded by differences in severity of illness, comorbidities, and average age among these populations. In descending order, median survival is 8.3 years for CF, 6.4 years for alpha-1 antitrypsin deficiency, 5.7 years for sarcoidosis, 5.5 years for COPD and IPAH, and 4.7 years for IPF.

Mortality is highest during the first year, with primary graft dysfunction and infection representing the most common causes of death. Factors portending an increased risk of early death include ventilator dependence of the recipient before transplantation, a pretransplant diagnosis of pulmonary arterial hypertension, elevated bilirubin, and advanced recipient age. Beyond the first year, attrition slows to an annual rate of approximately 5% to 8%. Most late deaths are attributable to the development of BOS, the lethal effects of which are due to both progressive respiratory failure and an increased susceptibility to infection.

Whether lung transplantation truly extends survival compared with the natural history of the underlying disease remains a matter of some debate. In the absence of randomized trials, this question has been approached by comparing observed posttransplant survival with survival of wait-list patients or by simulating survival with and without transplantation by statistical modeling; both approaches suffer from significant methodologic shortcomings. In the case of IPF, a disease with an extremely poor short-term prognosis, studies have suggested that lung transplantation does confer a survival advantage. This has been more difficult to demonstrate for COPD, which typically follows a protracted course even in the advanced stages, and available studies comparing wait-list and posttransplant survival have yielded conflicting results. A more complex analysis of this issue, employing prognostic models of survival with and without transplantation, found that approximately 45% of COPD patients would gain a survival benefit of at least 1 year by undergoing BLT; only 22% would derive such a benefit if SLT were employed. Survival benefit was heavily influenced by pretransplant FEV1, as well as a number of other functional and physiologic parameters. As an example, nearly 80% of patients with an FEV1 less than 16% of predicted, but only 11% of those with an FEV1 greater than 25% of predicted, were projected to gain at least a year of life with BLT. Adults with CF also appear to derive a survival advantage from lung transplantation, though one study found that this was limited to patients with a predicted 5-year survival without transplantation of less than 50% and without Burkholderia cepacia infection or CF-related arthropathy. In contrast, modeling studies have suggested that CF patients younger than 18 years rarely achieve a survival benefit, a contention that has been challenged by several authors, who point out potential methodologic shortcomings of these studies.

Pulmonary Function

The peak effect of lung transplantation on pulmonary function parameters is usually not realized until 3 to 6 months following the procedure, at which time the adverse impact of such factors as postoperative pain, weakness, altered chest wall mechanics, and ischemia-reperfusion lung injury has dissipated. Complete normalization of pulmonary function is the anticipated result of BLT. Following SLT for COPD, the FEV1 increases severalfold to a level of approximately 50% to 60% of the predicted normal value. Similarly, SLT for pulmonary fibrosis results in marked but incomplete improvement in lung volumes, with persistence of a restrictive pattern.

Transplantation also leads to correction of gas exchange abnormalities. Oxygenation improves rapidly, permitting the majority of patients to be weaned off supplemental oxygen within the first week. Hypercapnia may take longer to resolve, due to lingering abnormalities in the ventilatory response to carbon dioxide.

Exercise Capacity

Exercise tolerance improves sufficiently to permit the majority of transplant recipients to achieve functional independence and resume an active lifestyle. Although free of limitations with usual activity, transplant recipients with normal allograft function demonstrate a characteristic reduction in peak exercise performance as assessed by cardiopulmonary exercise testing. Specifically, patients typically achieve a maximum oxygen consumption at peak exercise of only 40% to 60% of predicted. Suboptimal exercise performance persists in subjects tested as late as 1 to 2 years following transplantation. Despite the greater magnitude of improvement in pulmonary function experienced by bilateral transplant recipients, there is no significant difference in peak exercise performance between this group and those who receive only one lung.

Characteristically, breathing reserve, oxygen saturation, and heart rate reserve remain normal during exercise while anaerobic threshold is reduced, a pattern most consistent with skeletal muscle dysfunction. Factors possibly contributing to this include chronic deconditioning, steroid myopathy, and calcineurin inhibitor-induced impairment in muscle mitochondrial respiration.


Hemodynamic Function

When performed in patients with pulmonary hypertension, both SLT and BLT lead to immediate and sustained normalization of pulmonary arterial pressure and enhanced cardiac output. In response to the decrease in afterload, right ventricular geometry and performance gradually normalize in the majority of patients. A threshold of right ventricular dysfunction below which recovery will not occur has yet to be defined.

Quality of Life

After successful lung transplantation, quality of life measures improve markedly across most domains, achieving levels approximating that of the general population. Nonetheless, several important limitations have been observed. Although improved from pretransplant status, impairments in psychological functioning—including increased levels of depression and anxiety, and poor perception of body image—persist. In addition, troubling side effects from immunosuppressive medications adversely affect quality of life. Finally, the development of BOS is associated with a significant deterioration in quality of life measures.

Despite improvements in performance status and quality of life, fewer than half of lung transplant recipients return to the workforce. Factors cited by recipients as barriers to employment include employer bias against hiring an individual with a chronic medical condition, the potential loss of disability income or medical benefits, side effects of medications, concerns about risk of infection in the workplace, and prioritization of recreational activities over work as a posttransplantation goal.


Primary Graft Dysfunction

Primary graft dysfunction (PGD) is a term applied to the development, within 72 hours of transplantation, of radiographic opacities in the allograft(s) associated with impaired oxygenation, in the absence of identifiable insults such as volume overload, pneumonia, rejection, atelectasis, or pulmonary venous outflow obstruction. PGD is presumed to be a consequence of ischemia-reperfusion injury, but inflammatory events associated with donor brain death, surgical trauma, and lymphatic disruption may be contributing factors. Supporting the concept of PGD as a form of acute, nonimmunologic lung injury, histologic examination of lung tissue from affected patients reveals a prevailing pattern of diffuse alveolar damage. A widely used grading system classifies the severity of PGD based on the ratio of arterial oxygen pressure to fractional concentration of oxygen in inspired gas (PaO2/FiO2) ( Table 106-4 ). In most cases, the process is mild and transient, but in approximately 10% to 20% of cases, injury is sufficiently severe to cause life-threatening hypoxemia (PGD grade 3) and a clinical course analogous to the acute respiratory distress syndrome.

Table 106-4

Grading System for Primary Graft Dysfunction

Grade  PaO2/FiO2  Radiographic Evidence of Pulmonary Edema
0      >300       Absent
1      >300       Present
2      200–300    Present
3      <200       Present

PaO2/FiO2, ratio of arterial oxygen pressure to fractional concentration of oxygen in inspired gas.

From Christie JD, Carby M, Bag R, et al: Report of the ISHLT Working Group on Primary Lung Graft Dysfunction Part II: Definition. A consensus statement of the International Society for Heart and Lung Transplantation. J Heart Lung Transplant 24:1454–1459, 2005.
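The grading scheme in Table 106-4 can be expressed as a short classifier. This is a sketch of the table's logic for illustration, not clinical software; the function name `pgd_grade` is introduced here, and cases with a low PaO2/FiO2 ratio but no radiographic edema (which the table does not grade) are treated as grade 0.

```python
def pgd_grade(pao2_fio2: float, edema_on_radiograph: bool) -> int:
    """Grade primary graft dysfunction per Table 106-4.

    pao2_fio2: ratio of arterial PO2 (mm Hg) to inspired oxygen fraction.
    edema_on_radiograph: radiographic evidence of pulmonary edema.
    """
    if not edema_on_radiograph:
        # Without radiographic edema, the table assigns grade 0.
        return 0
    if pao2_fio2 > 300:
        return 1
    if pao2_fio2 >= 200:
        return 2
    return 3  # ratio <200 with edema: severe, life-threatening PGD
```

For example, a recipient with bilateral opacities and a PaO2/FiO2 of 150 would be classified as grade 3, the category associated with an ARDS-like course.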

A recent prospective, multicenter cohort study identified a number of risk factors for development of severe PGD. Many of these were procedure-related factors: use of an elevated FiO2 during reperfusion, use of cardiopulmonary bypass, SLT, and administration of large volume blood product transfusions. Recipient risk factors were a diagnosis of sarcoidosis, presence of pulmonary hypertension, and overweight or obese body habitus. The only donor-related risk factor identified was a history of smoking. Notably, graft ischemic time was not identified as a risk factor in this study. In another study, an elevated level of IL-8 in bronchoalveolar lavage (BAL) fluid recovered from the donor was associated with the development of severe PGD, supporting the notion that inflammatory events preceding organ harvest may play a role.

Treatment of severe PGD is supportive, relying on conventional mechanical ventilation utilizing low tidal volume strategies, as well as on such adjunct measures as independent lung ventilation and extracorporeal life support for selected patients who otherwise cannot be stabilized. The use of nitric oxide in patients with established graft injury has been associated with sustained reduction in pulmonary artery pressures and improvement in oxygenation. However, the prophylactic administration of nitric oxide to all recipients at the time of reperfusion does not reduce the incidence of severe PGD. Results of emergency retransplantation in this setting have been poor.

With an associated perioperative mortality rate of 20% to 40%, severe PGD is a leading cause of early deaths among transplant recipients. The risk of death remains excessive even beyond the first year, suggesting that PGD has lingering adverse consequences well after resolution of the acute event. Recovery among survivors is often protracted and incomplete, though attainment of normal lung function and exercise tolerance is possible. There appears to be an increased risk of BOS following development of PGD, but data are conflicting on whether the increased risk spans all grades of PGD or is seen exclusively following the most severe grade.

Airway Complications

During implantation of the allograft, no attempt is routinely made to reestablish the bronchial arterial circulation. As a consequence, the donor bronchus is precariously dependent on retrograde blood flow through low-pressure pulmonary venous to bronchial vascular collaterals, placing the airway at risk for ischemic injury. Rarely, this may result in bronchial anastomotic dehiscence, which, when extensive, can lead to mediastinitis, pneumothorax, hemorrhage, and death. Treatment of this life-threatening complication previously required risky and often unsuccessful surgical intervention to buttress the anastomosis. More recently, success has been reported with temporary placement of a bare metal airway stent across the dehiscence in order to provide a scaffolding on which granulation tissue can form. For lesser degrees of dehiscence, conservative management with reduction in corticosteroid dosing and chest tube evacuation of associated pneumothorax will often lead to successful healing ( Fig. 106-2 ).

Figure 106-2

Bronchial anastomotic dehiscence.

A, Bronchoscopic view immediately distal to the main carina demonstrates partial dehiscence of the right bronchial anastomosis at the 1 o’clock position. B, After several weeks of expectant management, a repeat bronchoscopy demonstrates near-complete healing of the dehiscence.

Ischemic injury to the airway more commonly manifests as necrosis of the anastomotic cartilage and as patchy areas of bronchial mucosal ulceration and pseudomembranes. These devitalized areas in turn place the patient at increased risk for fungal superinfection of the airway (see later).

The most common airway complication currently encountered is bronchial anastomotic stenosis, with a reported frequency of 10% to 15% in contemporary series. Narrowing can be due to excessive granulation tissue, fibrotic stricture ( Fig. 106-3 ), or bronchomalacia (the latter two mechanisms likely a sequela of prior ischemic injury). Occasionally, fibrotic strictures can extend beyond the anastomosis, leading to narrowing of the bronchus intermedius ( eFig. 106-1 ) or lobar bronchi. Anastomotic narrowing typically develops within several weeks to months following transplantation. Clues to its presence include focal wheezing on the involved side, recurrent bouts of pneumonia or purulent bronchitis, and suboptimal pulmonary function studies demonstrating airflow obstruction and truncation of the flow-volume loop. Bronchoscopy both confirms the diagnosis and permits therapeutic interventions including balloon dilation, laser debridement, endobronchial brachytherapy, and stent placement. Although these measures are often successful in the short term, recurrent stenosis is common, necessitating repeated interventions and leading to compromised functional outcomes and excess mortality.

Figure 106-3

Bronchial anastomotic stricture.

Bronchoscopic view of a left main-stem bronchial anastomosis demonstrates marked narrowing of the lumen due to formation of a fibrous web. The true outer margin of the bronchus is outlined by the suture material.

Phrenic Nerve Injury

Phrenic nerve injury following lung transplantation can result from intraoperative traction, use of an iced slurry to cool the allograft in the chest cavity before reperfusion, or transection of the nerve in the setting of extensive fibrous adhesions and difficult hilar dissection. Depending in part on whether screening is restricted to clinically suspected cases or more broadly to all recipients, the reported incidence of phrenic nerve injury ranges from 3% to 30%. Important albeit nonspecific clues to the presence of phrenic nerve injury include difficulty in weaning from mechanical ventilation, persistent hypercapnia, orthopnea, and radiographic evidence of persistent elevation of the diaphragm and associated basilar atelectasis. Phrenic nerve injury has been associated with increases in ventilator days, tracheostomy rates, and intensive care unit length of stay. Achievement of a normal functional outcome is ultimately possible for those with reversible injury, but recovery in some cases may be protracted or incomplete. For severely impaired patients, nocturnal noninvasive ventilatory support and diaphragmatic plication have been successfully employed.

Native Lung Hyperinflation

Acute hyperinflation of the native lung leading to respiratory and hemodynamic compromise in the immediate postoperative period has been reported in 15% to 30% of emphysema patients undergoing SLT. Although risk factors remain poorly defined, the combination of positive-pressure ventilation and significant allograft edema serves to magnify the compliance differential between the two lungs and may predispose to this complication. Acute hyperinflation can be rapidly addressed by initiation of independent lung ventilation, ventilating the native lung with a low respiratory rate and a long expiratory time to facilitate complete emptying. Beyond the perioperative period, some SLT recipients with underlying emphysema demonstrate exaggerated or progressive native lung hyperinflation that more insidiously compromises the function of the allograft. In this setting, surgical volume reduction of the native lung can result in significant functional improvement.


Infectious Complications

Infection rates among lung transplant recipients are severalfold higher than among recipients of other solid organs. The greater risk is likely related to the unique exposure of the lung allograft to microorganisms via inhalation and aspiration and to the higher level of immunosuppression maintained in these patients. A comprehensive discussion of infectious complications is beyond the scope of this chapter; only the most common pathogens are discussed.


Bacterial Infections

Bacterial infections of the lower respiratory tract account for the majority of infectious complications and have a bimodal temporal distribution. Bacterial pneumonia is most frequently encountered within the first month posttransplantation. In addition to the immunosuppressed status of the recipient, factors that predispose to early bacterial pneumonia include the need for prolonged mechanical ventilatory support, blunted cough due to postoperative pain and weakness, disruption of lymphatics, and ischemic injury to the bronchial mucosa with resultant impairment in mucociliary clearance. Although passive transfer of occult infection with the transplanted organ is an additional concern, the presence of organisms on Gram stain of donor bronchial washings is not predictive of subsequent pneumonia in the recipient. Bacterial infections, in the form of purulent bronchitis, bronchiectasis, and pneumonia, reemerge as a late complication among patients who develop BOS. Gram-negative pathogens, in particular P. aeruginosa (see eFig. 91-1 ), are most frequently isolated in association with both early and late infectious events.


Cytomegalovirus

CMV is the most common viral pathogen encountered following lung transplantation, though in the era of effective prophylaxis, its incidence and impact have diminished considerably. Infection can develop by transfer of virus with the allograft or transfused blood products or by reactivation of latent virus remotely acquired by the recipient. Seronegative recipients who acquire organs from seropositive donors are at greatest risk for developing infection, and these primary infections tend to be the most severe. Although donor-positive/recipient-negative mismatching has been identified as a risk factor for increased mortality in the International Society for Heart and Lung Transplantation Registry, this may no longer be the case with the current widespread use of effective prophylactic regimens.

In the absence of prophylaxis, CMV infection typically emerges 1 to 3 months following transplantation; antiviral prophylaxis shifts the onset to later in the course, often in the initial months after the antiviral agent is discontinued. Infection is often subclinical, evidenced only by silent viremia or shedding of virus in the respiratory tract. Clinical disease may present as a mononucleosis-like syndrome of fever, malaise, and leukopenia (“CMV syndrome”) or as organ-specific invasion of the lung, gastrointestinal tract, central nervous system, or retina. Detection of virus in peripheral blood by either the pp65 antigenemia assay or polymerase chain reaction (PCR) techniques establishes a diagnosis of CMV infection but does not necessarily reflect events at the tissue level. A diagnosis of CMV pneumonia, the most common manifestation of invasive disease in the lung transplant recipient (see eFigs. 91-2 and 91-3 ), is unequivocally established only by demonstration of characteristic viral cytopathic changes on lung biopsy or on cytologic specimens obtained by BAL, but the sensitivity of these findings is relatively low. Caution must be exercised in interpretation of a positive viral culture or PCR of BAL specimens because virus can be shed into the respiratory tract in the absence of tissue invasion.

Standard treatment of CMV syndrome and tissue-invasive disease consists of a 2- to 3-week course of intravenous ganciclovir at a dose of 5 mg/kg twice daily, adjusted for renal insufficiency. Monitoring of peripheral blood viral load should be performed weekly to confirm response to therapy. Treatment should be continued until at least 1 week after an undetectable viral load is documented. Some experts advocate the addition of CMV hyperimmune globulin in treatment of severe disease, but evidence supporting this practice is scant. Although treatment is effective, relapse rates of up to 60% in primary infection and 20% in seropositive recipients have been reported. Initiation of oral valganciclovir as secondary prophylaxis after completion of definitive treatment is a common practice, but its impact on relapse rates is uncertain.
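The weight-based arithmetic behind the standard induction regimen can be made explicit. This sketch computes only the unadjusted per-dose amount; the renal dose reduction the text requires depends on creatinine clearance and local protocol and is deliberately not modeled, and the function name `ganciclovir_iv_dose_mg` is hypothetical.

```python
def ganciclovir_iv_dose_mg(weight_kg: float) -> float:
    """Unadjusted IV ganciclovir induction dose for CMV disease.

    Standard dosing is 5 mg/kg per dose, given twice daily for
    2 to 3 weeks. NOTE: this omits the mandatory reduction for
    renal insufficiency, which must follow local protocol.
    """
    return 5.0 * weight_kg
```

For a 70-kg recipient with normal renal function, this works out to 350 mg per dose, administered twice daily.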

In an attempt to minimize the adverse impact of CMV infection on the posttransplantation course, emphasis has shifted to preventive strategies. Numerous prospective, randomized trials have documented the efficacy of antiviral prophylaxis in delaying the onset and reducing the incidence and severity of CMV infection. Oral valganciclovir has largely replaced intravenous ganciclovir as the prophylactic agent of choice, due to its excellent bioavailability, ease of administration, and demonstrated efficacy. Universal prophylaxis of all donor-seropositive/recipient-seronegative patients is recommended because the risk of CMV disease is high. Because the risk of disease is significantly lower in seropositive recipients (independent of donor status), it has been argued that universal prophylaxis of this group leads to overtreatment, increasing costs and unduly exposing patients to the risk of drug toxicity. In this population, preemptive strategies targeting antiviral therapy exclusively to patients demonstrating a rising viral load in peripheral blood have been advocated, but many programs still adhere to a universal prophylaxis strategy. Consensus guidelines recommend a minimum of 6 months of prophylaxis for donor-positive/recipient-negative patients and 3 to 6 months for recipient-positive patients. However, a recent randomized, controlled trial of at-risk lung transplant recipients (either donor or recipient seropositive) demonstrated a marked reduction in the incidence of CMV disease with use of a 12-month course of valganciclovir prophylaxis compared with a 3-month course (4% vs. 32%). Additional studies are required to determine whether 12 months is necessary or excessive and whether all at-risk subgroups require the same regimen.

Emergence of ganciclovir-resistant strains of CMV has been reported in 5% to 15% of lung transplant recipients with CMV infection. Risk factors that have been identified include donor-positive/recipient-negative CMV status, use of potent immunosuppressive agents such as antilymphocyte antibodies and daclizumab, increased number of CMV episodes, and prolonged exposure to ganciclovir. Foscarnet, administered alone or in combination with ganciclovir, is the agent of choice for treatment of ganciclovir-resistant disease. The drug is potentially nephrotoxic, and careful monitoring of renal function is essential. Although treatment is often successful, the presence of ganciclovir-resistant disease is associated with decreased survival in lung transplant recipients.


Fungal Infections

Aspergillus species are the most frequently encountered fungal pathogens among lung transplant recipients. As a ubiquitous organism acquired by inhalation, Aspergillus colonizes the airways of approximately one quarter of transplant recipients. Airway colonization itself does not appear to pose a major risk of subsequent progression to invasive disease. Whether this is due to the inherently benign nature of colonization or to the common practice of initiating fungal prophylaxis when colonization is detected is unclear.

Aspergillus infects the bronchial tree in approximately 5% of lung transplant recipients. In most cases, infection is localized to the bronchial anastomosis, where devitalized cartilage and foreign suture material create a nurturing environment. Less commonly, infection may present as a more diffuse ulcerative bronchitis with formation of pseudomembranes, typically following in the wake of a severe ischemic injury to the bronchial mucosa. Clustered within the first 6 months posttransplantation, these airway infections are usually asymptomatic and detected only by surveillance bronchoscopy. Although usually responsive to oral azoles or to inhaled or intravenous amphotericin, airway infections have rarely progressed to invasive pneumonia or have resulted in fatal erosion into the adjacent pulmonary artery. An increased risk of subsequent bronchial stenosis or bronchomalacia has also been reported, but it is unclear whether this is a consequence of the infection or of an underlying ischemic injury to the bronchus that predisposed to infection.

Invasive aspergillosis, a far more serious form of infection, develops in 5% of lung transplant recipients, most commonly within the first year. It nearly always involves the lung but may disseminate to distant sites, particularly the brain, in a minority of patients. Symptoms are nonspecific and include fever, cough, pleuritic chest pain, and hemoptysis. Radiographically, pulmonary aspergillosis may appear as single (see eFigs. 91-7 and 91-8B ) or multiple nodular (see eFig. 91-8A ) or cavitary opacities or as alveolar consolidation ( Fig. 106-4 ). The “halo sign”—a rim of ground-glass attenuation surrounding a central nodular opacity—is a suggestive but uncommon finding on chest computed tomography (CT) scans.
