Serial assessments of left ventricular ejection fraction (LVEF) are customary in patients with breast cancer receiving trastuzumab. Radionuclide angiography (RNA) is often used; a typical monitoring schedule could include 5 scans in a year. We evaluated the proportion of imaging-related ionizing radiation attributable to RNA in 115 patients with breast cancer, from 3 medical centers in the United States, Ireland, and Japan, who completed 12 months of trastuzumab treatment. Estimated radiation dose (ERD) was used to calculate exposure associated with imaging procedures spanning the 18 months before and after trastuzumab therapy. In addition, 20 cardiologists and oncologists from participating centers were surveyed for their opinions regarding the contribution of RNA to overall radiation exposure during trastuzumab treatment. When RNA was used to monitor LVEF, the mean ERD from imaging was substantial (34 ± 24.3 mSv), with the majority attributable solely to RNA (24.7 ± 14.8 mSv, 72.6%). Actual ERD associated with RNA in this population differed significantly from the perceptions of surveyed cardiologists and oncologists: 70% of respondents believed that RNA typically accounted for 0% to 20% of overall radiation exposure from imaging, whereas RNA actually accounted for more than 70% of ERD. In conclusion, RNA was used to monitor LVEF in most patients in this cohort during and after trastuzumab therapy. This significantly increased ERD and accounted for a greater proportion of radiation than that perceived by surveyed physicians. ERD should be taken into account when choosing a method of LVEF surveillance. Alternative techniques that do not use radiation should be strongly considered.
Serial monitoring of left ventricular ejection fraction (LVEF) is customary in patients undergoing treatment with a potentially cardiotoxic cancer therapy, such as trastuzumab (Herceptin; Genentech, Inc.). Surveillance of LVEF may involve transthoracic echocardiography (TTE), radionuclide angiography (RNA or multigated acquisition, also known as MUGA), or cardiovascular magnetic resonance imaging (CMR). The utility of RNA for this purpose is well established; however, it is associated with ionizing radiation (IR) that may become significant when sequential testing is recommended. The product insert for trastuzumab advises an LVEF determination at baseline and quarterly during treatment. Each RNA typically involves 4 to 10 millisieverts (mSv) of IR. Thus, a patient on trastuzumab for 1 year could have up to 5 RNA scans, amounting to approximately 20 to 50 mSv of IR. RNA is not the only source of IR for these patients; computed tomography (CT) may also be performed. Thus, the aim of this study was to evaluate imaging-associated IR exposure in patients with breast cancer treated with trastuzumab, to determine what proportion of this exposure is attributable to RNA, and to assess how this compares with the prevailing perception among specialists who perform LVEF surveillance.
Methods
This was a retrospective multicenter international observational study involving 3 medical centers: the University of Chicago (USA), St. James's Hospital (Ireland), and the University of Occupational and Environmental Health (Japan). Women younger than 75 years with stage I, II, or III human epidermal growth factor receptor 2-positive breast cancer who had received trastuzumab for 12 months and survived for at least 1 year after treatment completion were included. Institutional Review Board or Ethics Committee approval was obtained at all sites. Medical records were used to identify all imaging procedures involving an estimated radiation dose (ERD) of ≥1 mSv (procedures involving IR <1 mSv were excluded). To capture IR exposure recent enough to affect a physician's choice of method to monitor LVEF, we considered IR exposure in the 12 months preceding baseline LVEF assessment, along with an additional 6 months to account for delays between first LVEF assessment and trastuzumab initiation. In addition, IR data were collected for the 1-year period of trastuzumab treatment and the 6 months after treatment, to which LVEF surveillance is often extended.
The ERD for all procedures involving ≥1 mSv was tallied. For CT scans, the region scanned and the dose-length product (DLP) were obtained. The estimated dose of radiation was then calculated using the region-specific conversion factor k (mSv·mGy⁻¹·cm⁻¹; see Supplementary Table 1) in the equation E (mSv) ≈ k × DLP. For RNA, the activity administered (in millicuries) was recorded (the tracer in all cases was technetium-99m sestamibi) and conversion factors (see Supplementary Table 2) were used to obtain ERD. Approximate ERD was used if any of the previously mentioned information could not be obtained. Methods for estimating ERD were reviewed by a medical physicist (JC).
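The two dose calculations above can be sketched as follows. This is a minimal illustration only: the k factor and the RNA activity-to-dose conversion factor used below are placeholder values, since the actual region-specific values are given in Supplementary Tables 1 and 2, which are not reproduced here.

```python
def erd_from_ct(dlp_mgy_cm: float, k: float) -> float:
    """Effective dose for a CT scan: E (mSv) ≈ k (mSv·mGy⁻¹·cm⁻¹) × DLP (mGy·cm)."""
    return k * dlp_mgy_cm

def erd_from_rna(activity_mci: float, msv_per_mci: float) -> float:
    """Effective dose for an RNA study from the administered tracer activity."""
    return activity_mci * msv_per_mci

# Illustrative example (placeholder factors, not the study's supplementary values):
# a chest CT with DLP 400 mGy·cm and k = 0.014, plus one RNA with 20 mCi of
# tracer at an assumed 0.25 mSv per mCi.
ct_dose = erd_from_ct(400, 0.014)    # ≈5.6 mSv
rna_dose = erd_from_rna(20, 0.25)    # 5.0 mSv
total_erd = ct_dose + rna_dose
```

Summing these per-procedure estimates over all qualifying studies in the 18 months before and after treatment yields each patient's total ERD.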
Continuous variables are summarized as mean ± SD. Categorical variables are presented as numbers and percentages. Unpaired t tests were used to evaluate the significance of the difference between those with and without reduced LVEF. SPSS version 21 (SPSS Inc., Chicago, Illinois) was used for all calculations.
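The unpaired t test comparison can be reproduced from the reported summary statistics. The sketch below uses the pooled-variance (Student's) form; whether SPSS was run with pooled or Welch variance is not stated in the text, and the group sizes (14 with reduced LVEF, 101 without, of 115 total) are inferred from the Results, so this is an approximation rather than a re-run of the original analysis.

```python
from math import sqrt

def unpaired_t(m1, s1, n1, m2, s2, n2):
    """Student's two-sample t statistic (pooled variance) from summary statistics.

    Returns the t statistic and degrees of freedom (n1 + n2 - 2).
    """
    sp2 = ((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2)
    t = (m1 - m2) / sqrt(sp2 * (1 / n1 + 1 / n2))
    return t, n1 + n2 - 2

# Total ERD: reduced-LVEF group (41.5 ± 18.9 mSv, assumed n = 14) vs
# preserved-LVEF group (29.5 ± 24.8 mSv, assumed n = 101).
t_stat, df = unpaired_t(41.5, 18.9, 14, 29.5, 24.8, 101)
```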
Twenty-five physicians comprising both oncologists and cardiologists from participating centers who perform LVEF surveillance for patients treated with trastuzumab were contacted by email and surveyed to solicit their perceptions regarding the role of RNA in LVEF surveillance and the degree of IR exposure attributable to RNA in patients treated with trastuzumab. Physicians were blinded to the study findings.
Results
One hundred fifteen patients were studied. The mean age was 49 ± 11.4 years. Most patients had early-stage breast cancer (>70% stage I or II) and few cardiovascular risk factors ( Table 1 ). Demographics did not vary significantly between centers, except for ethnicity. Eighty-one of the 115 patients studied (70.4%; Table 2 ) had sequential RNA to assess LVEF (RNA group), 13 (11.3%) had TTE alone (TTE group), and 21 (18.3%) had both TTE and RNA (RNA + TTE group; the 2 methods were not used at the same time point, but rather were used interchangeably over the course of treatment). Within the total population, 14 patients (12.3%) demonstrated a decrease in LVEF to <50%. Although there was a trend toward higher total ERD in those with reduced LVEF compared with those without, this did not reach statistical significance (41.5 ± 18.9 mSv vs 29.5 ± 24.8 mSv; p = 0.052). Patient-specific ERDs were computed for 681 of the 715 imaging studies included in the analysis (95.2%). The use of approximate ERDs was necessary in <5% of studies. The mean ERD in the study population was 40.1 ± 36.9 mSv, of which 55.4% (22.2 ± 12.3 mSv) was attributable to LVEF determination. The remaining exposure was due predominantly to CT (17.4 ± 15.9 mSv; 43.4%), with minimal contribution (0.5 ± 0.3 mSv; 1.3%) from other imaging methods. Among those in whom LVEF was followed with RNA, the mean ERD was 34 ± 24.3 mSv, with almost three-quarters attributable to RNA surveillance alone (24.7 ± 14.8 mSv; 72.6%). The number of RNAs per patient was 5.1 ± 1.7. The average dose per RNA varied by center (range: 3.1 ± 2.9 to 7.9 ± 2.1 mSv) because of differing tracer dosages. The remaining IR exposure resulted mostly from CT (9.1 ± 5.9 mSv; 26.8%). Further stratification within the RNA group revealed that 36 patients (44.4%) were exposed to 1 to 25 mSv, 28 (34.6%) to 26 to 49 mSv, and 17 (21%) to >50 mSv of IR solely for imaging purposes.
| Characteristic | Total population (n = 115) |
|---|---|
| Age (years) | 49 ± 11.4 |
| Stage I/II breast cancer | 83 (72.3%) |
| Hypertension | 32 (27.8%) |
| Diabetes | 6 (5.2%) |
| Dyslipidemia | 23 (20%) |
| Smoking | 2 (1.7%) |
| Coronary artery disease | 2 (1.7%) |
| Peripheral vascular disease | 0 |
| Any cardiovascular risk factor | 45 (39.1%) |
| Prior anthracycline | 47 (40.9%) |
| | Total (n = 115) | RNA (n = 81) | TTE (n = 13) | TTE + RNA (n = 21) |
|---|---|---|---|---|
| Total ionizing radiation (mSv) | 40.1 ± 36.9 | 34 ± 24.3 | 82.2 ± 77.3 | 38.2 ± 23.6 |
| Ionizing radiation from ejection fraction determination (mSv) | 22.2 ± 12.3 | 24.7 ± 14.8 | 0 | 24.9 ± 14.8 |
| Ionizing radiation from computed tomography (mSv) | 17.4 ± 15.9 | 9.1 ± 5.9 | 81.5 ± 77.5 | 13.1 ± 17.1 |
| Ionizing radiation from other sources (mSv) | 0.5 ± 0.3 | 0.2 ± 0.6 | 0.7 ± 1.1 | 0.2 ± 0.3 |
In the TTE group, 9 patients had sequential CT scans, which inflated that group's ERD (82.2 ± 77.3 mSv); when these patients were removed, the mean ERD in the study population as a whole was 33.6 ± 23.7 mSv, of which 70.2% (23.6 ± 15.2 mSv) was attributable to LVEF determination. After excluding those with sequential CT scans, 4 subjects remained in the TTE group, limiting statistical comparisons. Notably, however, the radiation dose in those 4 patients whose LVEF was followed with TTE (5 ± 5.7 mSv) was >6 times lower than in the RNA group (34 ± 24.3 mSv). The group followed with both TTE and RNA was exposed to 38.2 ± 23.6 mSv, with 24.9 ± 14.8 mSv (more than 65%) resulting from LVEF determination and 13.1 ± 17.1 mSv from CT.
Of the 25 physicians contacted, 20 (80%) responded (13 cardiologists and 7 oncologists); the nonresponders were all oncologists. Results showed that 57.1% of oncologists and 84.6% of cardiologists predicted TTE to be the most commonly used method for LVEF surveillance in patients receiving trastuzumab; the remainder selected RNA ( Figure 1 ). In terms of reproducibility, 53.9% of cardiologists considered CMR superior to RNA and TTE, with the remaining cardiologists split between RNA and TTE. Almost all oncologists surveyed (85.7%) replied that RNA was the most reproducible ( Figure 1 ). When asked to estimate the proportion of radiation exposure attributable to RNA used to monitor LVEF in patients treated with trastuzumab, 14 of 20 respondents (70%) estimated that only 0% to 20% of IR from imaging came from RNA use. The response was similar between oncologists and cardiologists (71.4% and 69.3%, respectively; Figure 2 ).