Cybersecurity of Cardiac Wearable and Implantable Devices





Jean-Philippe Couderc

Alex Page

Mark Manning





INTRODUCTION

Cardiac wearable and implantable devices (CWID) such as ECG patches, pacemakers, cardiac defibrillators, cardiac resynchronization therapy devices, and loop recorders are used to monitor, treat, and improve the lives of cardiac patients. Their use has been soaring over the past decade, with implantation rates increasing by 10% to 30%, depending on the country.1 Importantly, these devices have evolved into telemetry and remote monitoring devices with powerful interoperability capabilities. Personalization features (detection threshold programming, etc.) make these devices responsive to changes in the health status of the user, and embedded communication features enable significant improvement of data access, multidevice operability, and patient-specific configurability options. This has led to a growing number of opportunities for privacy and safety to be breached by malicious individuals or adversaries.

Today, health care cybersecurity is one of the key public health concerns requiring immediate and aggressive attention to counter the growing threats that put patient information and well-being at risk. Specifically, the security of CWID is in its infancy, and the future of these devices will increasingly depend on cybersecurity requirements.2 The challenge for companies resides in finding an appropriate balance between safety and utility. Remote monitoring devices constitute a small but important part of the current digital health revolution, which is also progressing at an aggressive pace. All stakeholders share a lack of clear expectations for managing the risks associated with the digitization of our health. The role of regulatory bodies in pacing this race and protecting patients is fundamental to the successful development of future health care systems.

In this chapter, we aim to provide an overview of the challenges associated with cybersecurity of CWID. We will specifically discuss the technologies that are currently used by physicians and clinicians in their everyday practices, the type of information usually acquired and communicated between health care providers and CWID users, the various weaknesses of current technologies, and the new regulatory landscape around cybersecurity of the CWID; finally, we will discuss solutions to counter the growing number of cyberattacks.


HACKING CWID: THREATS, FEARS, AND REALITY

During the past 2 years, the number of Americans who have had their health care information exposed because of cyberattacks increased by 42%, from 113 million in 2015 to 160 million in 2016, according to the U.S. Department of Health and Human Services' Office for Civil Rights. In 2016, the cost of data breaches for the health care sector was estimated at $6.2 billion.3 This trend is strengthened by the increasing rate of digitization of health information. We expect these digitization efforts to continue, because the benefits of digitization, most importantly cost reduction and improved patient outcomes, are too strong to be ignored. In June 2017, the malware called "NotPetya" affected at least 2000 organizations across the globe, in countries including Ukraine, Russia, the United Kingdom, and the United States. These organizations included hospitals. Ransomware is software that cripples a computer system in some way until the attacker is paid to undo the damage. Experts in the field noted that one of "the perfidious characteristics of this ransomware is that its creators offer it on the darknet with an affiliate model which gives distributors a share of up to 85% of the paid ransom amount, while 15% is kept by the malware authors." The advent of "ransomware-as-a-service" has been a growing concern, because it makes the crime possible for nontechnical attackers.4


Monetizing Cyberattacks of Cardiac Monitoring Devices

The preceding example illustrates well the primary reason for hackers to target health-related systems: health care data are easily monetized. One can imagine that the most nefarious cybercriminals could target wearable and implantable health devices because these technologies are coupled to a specific individual to maintain his or her health status or to play a prophylactic role, as with implantable cardioverter defibrillators (ICDs). By gaining control of such devices, hackers could not only extract personally identifiable information (PII) but also disrupt their life-saving functions and hence use them as lethal weapons.5 Even the threat of such actions could be used to extract ransom from victims, and we have already seen that financial gain is a strong motive in cyberattacks.


Another hypothetical and yet realistic scenario would be for hackers to gain access to servers connected to implanted devices to disable their monitoring functionalities and request ransom from device companies to restore services. Such a ransomware scenario should be considered more imminent than hypothetical, considering that hackers have already successfully harmed companies with such attacks.

A less direct way to monetize data breaches is through stock trading. By penetrating companies' networks, hackers can gain valuable information about clinical trial outcomes or company mergers and acquisitions, which can then be used to predict stock movements. Additionally, attackers can short a company's stock before announcing a breach or vulnerability.6 The ramifications for the business are often much more significant than a temporary dip in value; the most recent report studying the impact of cyberattacks on startup companies revealed that 60% of organizations go bankrupt in the 6 months following a successful information breach.7


Life-Threatening Cyberattacks Against Wearable/Implantable Devices

The literature describes successful attempts by academic groups and security companies to gain access to device functionalities and compromise them.8 These attacks have been conducted against cardiac devices such as ICDs9 and noncardiac devices such as insulin pumps.10 The level of threat demonstrated by these experiments was strong enough that Dick Cheney, the 46th US vice president (2001-2009, under President George W. Bush), had the wireless functions of his implanted cardiac device turned off.11 In 2008, a group of researchers led by Dr Kevin Fu of the Archimedes Center for Medical Device Security at the University of Michigan showed that it is possible to extract sensitive personal information from a pacemaker or even to threaten the patient's life by turning off or changing the pacing behavior.12 Fortunately, such an attack required close proximity to the patient and could not be carried out remotely. However, newer pacemakers are often capable of longer-range wireless communication and may therefore be vulnerable at longer distances. Such an attack scenario was developed by the hacker Barnaby Jack, who was planning to give a lecture at the Black Hat conference in 2013 on the possibility of remotely controlling pacemakers via wireless communications at a distance of 15 m. He died just days before the conference, and his research has not been pursued.13 In 2017, the security company WhiteScope purchased used pacemaker programmers on eBay and identified fundamental flaws in systems from all major vendors.14 The potential for attacks against the firmware of programmers or home monitoring systems means that attackers could target many victims at once, with fewer restrictions on proximity.

There have been no reports of cyberattacks against CWIDs lethally harming an individual. Yet many such scenarios can be imagined and should not be neglected or underestimated. Pacemaker programming systems in the clinic have the capability to wirelessly activate several pacemaker modes/features that would be life threatening outside of a controlled clinical environment. For example, they can:



1. Inhibit pacing
2. Induce fibrillation (eg, with a 50 Hz AC voltage)
3. Deliver shocks, causing:
   a. pain
   b. reduced battery life
   c. commotio cordis
   d. ventricular fibrillation
4. Change program settings, so the pacemaker no longer responds to events appropriately

Device programmers (and home transmitters) are available for sale online, so we cannot assume that they are difficult for hackers to access and analyze. And in any case, when designing for security, we must assume that the enemy knows the system.15 Further, their frequency ranges and operating procedures are well detailed in downloadable manuals and U.S. Federal Communications Commission (FCC) filings. Many radios are available for roughly $100 to $400 and are capable of receiving and transmitting on the 400 MHz (MICS), 900 MHz (ISM), and 2.4 GHz (ISM, Bluetooth, Wi-Fi, MBAN) frequencies used by CWID. If an attacker is able to reverse engineer the communications protocol of a pacemaker programmer and finds that the communications are not properly secured, then he or she will likely be able to transmit these commands with the same authority as a physician. Although many companies will claim that the skill set required to perform such an analysis is very specialized and rare, independent researchers and small groups are able to find vulnerabilities in this manner every year.
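What "properly secured" could mean here can be sketched as follows: a command frame that carries a monotonically increasing counter (to block replay) and an HMAC tag (to block forgery), so that a radio capable of transmitting on the right frequency is not, by itself, sufficient to issue commands. The key handling, frame layout, and command names below are assumptions for illustration, not any vendor's actual protocol.

```python
import hmac
import hashlib

# Hypothetical per-device secret; real devices would need secure key provisioning.
DEVICE_KEY = b"per-device-secret"

def sign_command(command: bytes, counter: int) -> bytes:
    """Build a frame: 4-byte counter + command, followed by a 32-byte HMAC tag."""
    payload = counter.to_bytes(4, "big") + command
    tag = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    return payload + tag

def verify_command(frame: bytes, last_counter: int):
    """Return the command only if the tag is valid and the counter is fresh."""
    payload, tag = frame[:-32], frame[-32:]
    expected = hmac.new(DEVICE_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return None  # forged or corrupted frame
    counter = int.from_bytes(payload[:4], "big")
    if counter <= last_counter:
        return None  # replayed frame
    return payload[4:]

frame = sign_command(b"SET_PACING_MODE:VVI", counter=42)
assert verify_command(frame, last_counter=41) == b"SET_PACING_MODE:VVI"
assert verify_command(frame, last_counter=42) is None  # replay rejected
tampered = frame[:-1] + bytes([frame[-1] ^ 1])
assert verify_command(tampered, last_counter=41) is None  # tampering rejected
```

Without such a check, reverse engineering the over-the-air framing is enough to transmit commands "with the same authority as a physician," which is exactly the failure mode described above.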

As we shall see in the next chapter, balancing usability and security in CWID is a difficult problem. To illustrate this challenge, consider a patient with a pacemaker who presents to a hospital with cardiac symptoms and may need the immediate intervention of a programming system that has never been "paired" with his or her pacemaker. This intervention is possible only if there is no security system to prevent it. One way to mitigate the threat while preserving functionality is to allow commands to be processed only over the low frequency (LF) channel (ie, using the short-range "wand" antenna) and reserve the other channels (eg, Bluetooth) for "read only" functions such as viewing the current electrogram. This could help reduce the effective range of an attack.
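The channel-based restriction suggested above amounts to a simple authorization policy on the device side. A minimal sketch, in which the command names and channel list are hypothetical:

```python
from enum import Enum

class Channel(Enum):
    LF_WAND = "low-frequency wand (~10 cm range)"
    MICS = "400 MHz telemetry (2-5 m range)"
    BLUETOOTH = "Bluetooth LE"

# Hypothetical policy: state-changing commands only via the short-range wand;
# longer-range channels are restricted to read-only telemetry.
READ_ONLY = {"READ_EGM", "READ_BATTERY", "READ_SETTINGS"}

def is_allowed(command: str, channel: Channel) -> bool:
    """Read-only commands are allowed on any channel; everything else
    (pacing changes, shock delivery, reprogramming) requires the wand."""
    if command in READ_ONLY:
        return True
    return channel is Channel.LF_WAND

assert is_allowed("READ_EGM", Channel.BLUETOOTH)       # remote monitoring works
assert is_allowed("SET_PACING_MODE", Channel.LF_WAND)  # clinic workflow works
assert not is_allowed("SET_PACING_MODE", Channel.MICS) # remote attack blocked
```

The design choice here is that the physical range limit of the inductive wand link becomes the security boundary: an attacker would need to be within centimeters of the patient's chest, which is the same constraint that made the 2008 attack hard to carry out in practice.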


CWID: A TRADEOFF BETWEEN FUNCTIONALITIES AND RISK

The role of data digitization in improving health care is based on the principles of data portability, access, and sharing. Currently, all implantable cardiac devices and an increasing number of wearable ECG systems have built-in functionalities for wireless communication. In the case of wearable devices, Bluetooth connections to a smart device (such as a smartphone) enable Wi-Fi and/or cellular access to the Internet. This functionality is widely used in ECG sensors (belts, patches, etc.), finger-based ECG sensors, and other Holter-like devices16 (Figure 15.1). In the case of implantable devices, the oldest generation included a near-field interface that allowed the health professional to set the device configuration and sometimes a wireless interface for remote monitoring. These devices communicated at low frequencies near 175 kHz, with a range limited to 10 cm and a bandwidth of around 50 kilobits per second (Kbps), and required the patient or physician to hold a wand over the patient's chest for connectivity. More recent devices can connect to a vendor server via an access point to transmit device logs, health reports, physiologic data, and patient information. Current systems have adopted the Medical Implant Communication Service (MICS) specifications of the FCC, which require operation around the 400 MHz band and benefit from broader bandwidth (250 Kbps) and a communication range of 2 to 5 m.
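A rough back-of-the-envelope comparison of the two generations of telemetry links, using the bandwidth figures above (the sampling rate and sample resolution are illustrative assumptions, and protocol overhead is ignored):

```python
def transmit_time_s(payload_bits: float, rate_kbps: float) -> float:
    """Idealized airtime for a payload, ignoring overhead and retransmissions."""
    return payload_bits / (rate_kbps * 1000)

# One minute of single-lead electrogram: hypothetical 256 Hz, 12-bit samples.
payload = 256 * 60 * 12  # 184,320 bits

legacy = transmit_time_s(payload, 50)   # ~175 kHz inductive wand link
mics = transmit_time_s(payload, 250)    # MICS 400 MHz band

print(f"legacy link: {legacy:.1f} s, MICS: {mics:.2f} s")
# About 3.7 s over the legacy link vs about 0.74 s over MICS.
```

The fivefold bandwidth increase, combined with the jump from a 10 cm wand to a 2-5 m radio range, is what makes unattended nightly transmissions to a bedside access point practical, and it is also what widens the attack surface discussed in this chapter.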

Health professionals and patients recognize that remote accessibility enables long-term monitoring and alarm functions, but it is often overlooked that with connectivity comes vulnerability. Most individuals with prophylactic devices embedding monitoring functionalities have accepted the device with or without knowledge of the type of information these devices share with the company and subsequently with their health care providers. Although vendors claim to have designed security features into their devices and to have addressed published security concerns, the past has demonstrated that, unfortunately, the creativity and resourcefulness of hackers have prevailed, and many sectors considered to be very secure (eg, governmental websites, banking systems, and major retailers) have suffered attacks with devastating impacts. In September 2017, a striking example of a data breach was disclosed by Equifax, which failed to protect PII such as Social Security and driver's license numbers for as many as 143 million consumers.17 These cyber risks are not limited to the financial sector; the current digitization of health information also increases incentives for hackers to develop sophisticated tools for data security breaches and cyberattacks in the health sector.






FIGURE 15.1 Schematization of the various existing cardiac wearable and implantable devices involved in the acquisition of heart-related signals (electrocardiogram [ECG] and photoplethysmography) and the associated means of sharing the data with health organizations. CRT, cardiac resynchronization therapy; ICD, implantable cardioverter defibrillator; PPG, photoplethysmography.

Figure 15.1 provides a schematic and nonexhaustive summary of wearable technologies related to cardiac monitoring and their communication tools. Most wearable cardiac devices are designed either to simply record cardiac physiologic signals or to combine recording with analytical features that support decisions about diagnosis or treatment effects, whereas implantable devices typically add treatment and prophylactic functionalities (eg, ICD, cardiac resynchronization therapy [CRT]). Some wearables, such as the LifeVest, also have treatment functions.


Information Held by CWID

Multiple studies have suggested that physician access to medical devices through remote monitoring can offer a significant reduction of cost for both patients and health care providers.18,19,20 This cost-saving concept depends on the quality of the data communicated by the wearable device to the health care provider via the various communication paths described in Figure 15.1. The wearable device can be a passive recorder that stores the physiologic signal continuously or under specific conditions (the occurrence of an event, or a trigger from the user pressing a button). Automation of the analysis of these signals varies according to the type of device and its intended use. Currently, the integration of alarm systems for arrhythmia detection is a popular trend in ECG patches and Holter systems; alarms relying on photoplethysmographic (PPG) or other physiologic signals are scarcer. Therefore, the data sent to the health care provider can be a simple physiologic signal to be processed and analyzed, the epochs of the signal at the relevant times (eg, during arrhythmias), or a report summarizing the heart's activity (electrical or hemodynamic) for a given period of time. The lists of information recorded, stored, analyzed, and transmitted by CWID are device specific. Implantable devices such as ICDs and CRT devices are required by the United States Food and Drug Administration (USFDA) to hold the patient's name and date of birth, the device setup, and the name of the physician who implanted the device. All this information can be transmitted with the physiologic signal(s) to the company server or to an interrogation system used by a physician.
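Transmitting only the epochs of the signal at the relevant times amounts to a simple windowing step around each detected event. A minimal sketch, in which the window lengths and function name are illustrative assumptions:

```python
def extract_epochs(signal, fs, event_times_s, pre_s=10, post_s=20):
    """Cut signal windows around detected events (eg, arrhythmia onsets),
    clamping each window to the boundaries of the recording.

    signal: list of samples; fs: sampling rate in Hz;
    event_times_s: detected event times in seconds.
    """
    epochs = []
    for t in event_times_s:
        start = max(0, int((t - pre_s) * fs))
        stop = min(len(signal), int((t + post_s) * fs))
        epochs.append(signal[start:stop])
    return epochs

fs = 4  # toy sampling rate
signal = list(range(10 * fs))  # 10-second toy recording
(epoch,) = extract_epochs(signal, fs, [5], pre_s=2, post_s=2)
assert epoch == signal[12:28]  # 4-second window around the event at t = 5 s
```

Sending only such windows, rather than the continuous signal, reduces both transmission cost and the volume of sensitive data exposed in transit.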

The majority of CWIDs record raw cardiac signals: the electrocardiogram or the electrogram (ECG, EG). These are the physiologic signals most commonly recorded by CWID because they provide insights into the electrical activity of the heart. Recent technologies are expanding this list by integrating the pulsatile or photoplethysmographic (PPG) signal, heart sounds (stethogram), the ballistic force of the heart (ballistocardiogram), and, more recently, simple body activity (accelerometer), all of which relate to cardiac activity. The PPG provides insights into the hemodynamic impact of the electrical activity of the heart and benefits from a much lower cost, because it relies on inexpensive sensors with a very long life expectancy compared with ECG electrodes (wet electrodes are good for about 2-10 days). The development of dry electrodes has been associated with lower-quality ECG recordings but represents a promising alternative to standard ECG electrodes; dry electrodes usually do not stick to the skin and therefore record more motion artifacts than wet electrodes during ambulatory recordings.21 It is believed that this multisensor approach will strengthen the clinical value of these devices. Pioneering examples appear in academic research and in commercial devices that combine PPG and ECG signals (eg, smartwatches). Indeed, smartwatches can continuously monitor pulse activity and trigger an alarm to execute an ECG recording when an abnormal pulse pattern is detected. Another approach would be to combine multiple sensors using Internet of Things (IoT) technology or other communication means; these represent new opportunities for cyberattack, including conscription into botnets.
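The PPG-to-ECG trigger described above can be sketched as a screening rule on beat-to-beat pulse intervals. The coefficient-of-variation rule and its threshold are illustrative assumptions, not any vendor's actual detection algorithm:

```python
from statistics import pstdev

def pulse_is_irregular(rr_intervals_ms, cv_threshold=0.15):
    """Flag an irregular pulse when beat-to-beat variability is high.
    Uses the coefficient of variation (SD/mean) of the pulse intervals;
    the 0.15 threshold is illustrative only."""
    mean_rr = sum(rr_intervals_ms) / len(rr_intervals_ms)
    return pstdev(rr_intervals_ms) / mean_rr > cv_threshold

def monitor(rr_windows):
    """Scan successive windows of pulse intervals from the PPG sensor and
    trigger an ECG recording on the first irregular window."""
    for window in rr_windows:
        if pulse_is_irregular(window):
            return "START_ECG_RECORDING"
    return "IDLE"

regular = [800, 810, 795, 805, 800, 790]       # steady ~75 bpm pulse
irregular = [620, 1040, 700, 980, 560, 900]    # AF-like variability

assert not pulse_is_irregular(regular)
assert pulse_is_irregular(irregular)
assert monitor([regular, irregular]) == "START_ECG_RECORDING"
```

This two-stage design lets the cheap, low-power PPG sensor run continuously while the more power-hungry (and diagnostically definitive) ECG recording is reserved for suspicious episodes.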
