Editor’s Note: In addition to Peter Pronovost (photo and biography above), authors of this post include Julius Pham, Assistant Professor, Emergency Medicine, Johns Hopkins University School of Medicine; Sara Singer, Assistant Professor, Harvard School of Public Health, Department of Health Policy and Management, Massachusetts General Hospital; Jerod Loeb, Executive Vice President for Research, The Joint Commission; Eric Campbell, Associate Professor, Institute for Health Policy; Christine Ann Goeschel, Director, Quality and Patient Safety Initiatives, Johns Hopkins University School of Medicine, Anesthesiology and Critical Care; and James Bagian, Chief Patient Safety Officer, U.S. Department of Veterans Affairs – National Center for Patient Safety. Health Affairs Blog publishes this post as the nation observes Patient Safety Awareness Week.

Medical devices have saved countless lives and reduced morbidity. Yet, as illustrated by recent news coverage about hazards from radiation therapy, they have a dark side: they sometimes harm, and even kill, patients. Health care needs an industrywide effort to address patient harm caused by medical devices. A public-private partnership modeled after the Commercial Aviation Safety Team could reform the process of developing devices to ensure that user safeguards are included.

As one powerful article in The New York Times about harm from radiation therapy stated: “Americans today receive far more medical radiation than ever before. The average lifetime dose of diagnostic radiation has increased sevenfold since 1980, and more than half of all cancer patients receive radiation therapy. Without a doubt, radiation saves countless lives, and serious accidents are rare.” Yet “a New York City hospital treating [Scott Jerome-Parks] for tongue cancer failed to detect a computer error that directed a linear accelerator to blast his brain stem and neck with errant beams of radiation. Not once, but on three consecutive days.”

The feature in the Times illustrates the striking imbalance between the investments and advances in biomedical research and the underinvestment and underperformance in the science of health care delivery, an imbalance that exposes patients to needless harm. The article describes rapid changes in radiation technology design that fail to provide sufficient safeguards: many technicians and physicians simply do not know how to use the new machines correctly and safely. As a result, patients across the country like Scott Jerome-Parks are receiving sometimes lethal radiation overdoses, and many of these errors go unreported.

We do not know how often patients suffer preventable harm from this or other therapies. The available evidence suggests that preventable harm from medical devices is common, not only because devices sometimes fail to work properly, but more often because their poor design, which frequently fails to adhere to good human factors engineering principles, causes clinicians to misuse them. And the harm is not limited to radiation therapy: any poorly designed device can injure the patients it is meant to help.

The Current System Fails To Adequately Protect Patient Safety

The Times article raises several concerns regarding the failure of the health care delivery system to ensure the safe use of medical devices. First, it was the lay press rather than the health care community that publicly documented the problem. Because these mistakes were not widely known, similar errors likely occurred at other hospitals; prior to this article, they were largely invisible. Health care needs much more robust mechanisms for reporting and analyzing errors. To improve device safety, health care needs ways to identify hazards, analyze and prioritize them, design and implement interventions to reduce risks, and evaluate the extent to which risks have actually been reduced.
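To make the “analyze and prioritize” steps of that loop concrete, here is a minimal sketch of one common way a reporting system might rank hazards, a failure-modes-style risk score. The hazards, ratings, and scales below are hypothetical illustrations, not data from any actual reporting program.

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    """A reported device hazard, rated FMEA-style on 1 (best) to 10 (worst) scales."""
    description: str
    severity: int    # how badly a patient could be harmed
    occurrence: int  # how often the failure mode shows up in reports
    detection: int   # how hard the failure is to catch before it reaches a patient

    @property
    def risk_priority(self) -> int:
        # Classic FMEA risk priority number: severity x occurrence x detection.
        return self.severity * self.occurrence * self.detection

# Hypothetical reports, for illustration only.
hazards = [
    Hazard("Radiation dose accepted in wrong units", severity=10, occurrence=4, detection=7),
    Hazard("IV tubing mates with epidural catheter", severity=9, occurrence=3, detection=8),
    Hazard("Pump keypad accepts a tenfold dose slip", severity=8, occurrence=5, detection=6),
]

# Investigate the highest risk-priority hazards first.
for h in sorted(hazards, key=lambda h: h.risk_priority, reverse=True):
    print(f"RPN {h.risk_priority:4d}  {h.description}")
```

Scoring of this kind is only a triage aid; the point is that prioritization should be explicit and reproducible rather than left to whichever event happens to attract press attention.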

Second, the health care industry’s responses to device errors are inefficient and ineffective, often reactive rather than proactive, and usually limited to the hospital where the event occurred rather than systemic. One of the major differences between health care and other high-risk industries, such as aviation, is that mistakes in health care usually harm patients one at a time. In aviation, a single accident can harm scores or hundreds of people at once, garnering public attention, generating robust investigations, and producing effective solutions.

Although the total number of preventable deaths in health care is far larger than in other industries, the public knows little about the incidence of these events. Most preventable deaths or adverse events generate little or no attention, and approaches to reducing risk are anemic. While patients are harmed one at a time, the causes of that harm are not unique. Yet most hospitals, regulators, and manufacturers approach the problem as if they were the only party confronting it, as if no collective wisdom existed. Health care institutions and practitioners who experience a problem are expected to solve it, even if they lack the knowledge, technical skills, and financial resources to do so.

How The Industry Responds Now To Safety Concerns

Consider how the industry will respond to the events described in the Times article. Many hospitals will create a team, evaluate their practices, and devise and implement changes internally. In all likelihood, the team will be led by risk managers (generally in the legal department) and will not include systems or human factors engineers, because unfortunately these professionals are still rare in hospitals. Working alone and with insufficient resources, hospitals will understandably produce superficial investigations, and their recommendations will generally be banal and of limited impact: notifying staff of the danger, reeducating them in the use of the technology, and encouraging them to be more careful. Even the “collective” response from the American Society for Radiation Oncology following the Times article envisions a six-point plan calling for more accreditation, training, error tracking, and information sharing, all of which effectively amount to an appeal for individuals to be more careful.

Hospitals are not the only entities working to reduce the risks of radiation therapy. The Joint Commission, the Veterans Health Administration (VHA), the Food and Drug Administration, state regulators, professional societies, and private industry all are working independently to remedy these ills.

Interventions vary widely in their probability of actually reducing risk. Encouraging someone to be more careful (telling anesthesiologists not to connect an epidural to an intravenous catheter) has the lowest probability, while redesigning the device so it is impossible to make the mistake (two catheters that cannot be physically connected) has the highest probability. Checklists and reminders are somewhere in the middle.
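The same hierarchy can be sketched in software terms: a warning is advice the user can ignore, while a redesign makes the error impossible to commit. In the minimal sketch below, the connector classes are hypothetical stand-ins for physical fittings, not models of any real product.

```python
class LuerConnector:
    """Fitting used by intravenous lines (hypothetical model)."""

class NeuraxialConnector:
    """Deliberately incompatible fitting used by epidural lines (hypothetical model)."""

class IVLine:
    def connect(self, fitting: LuerConnector) -> None:
        # The physical analogue of this check: the wrong fitting simply does not mate.
        if not isinstance(fitting, LuerConnector):
            raise TypeError("IV line accepts only Luer fittings")
        print("IV line connected")

class EpiduralCatheter:
    def connect(self, fitting: NeuraxialConnector) -> None:
        if not isinstance(fitting, NeuraxialConnector):
            raise TypeError("Epidural catheter accepts only neuraxial fittings")
        print("Epidural catheter connected")

iv = IVLine()
iv.connect(LuerConnector())           # the correct connection succeeds
try:
    iv.connect(NeuraxialConnector())  # the wrong connection is rejected by design,
except TypeError as e:                # not by asking the clinician to be careful
    print(f"Blocked: {e}")
```

A checklist sits between these two extremes: it lowers the odds of a wrong connection, but it still depends on a fallible human remembering to use it.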

The methods used for these investigations are inefficient, often opaque, and minimally effective, generally lacking detailed analysis by systems and human factors engineers. Moreover, the available resources are usually minimal, and thorough investigation is expensive. Collectively, efforts to reeducate and train staff in every hospital will consume significant resources and provide limited benefit. Even if an effective local solution is identified, there are limited mechanisms for disseminating the intervention nationally or globally; the health care knowledge market is very inefficient.

Although local investigations should continue, alone they probably will not protect patients from complex medical devices. Humans are fallible, and if it remains possible to enter the wrong dose, or to connect tubes that should not be connected, clinicians will do it. There are protective measures that individual hospitals can implement, but simply telling staff to “be more careful” will not protect patients. This is not to say hospitals and caregivers should stop trying to solve safety problems. But health care asks them to perform a Herculean task: to compensate for poor device design. Instead, we need to design devices so that mistakes are unlikely to occur.

There is currently no structure in health care to coordinate industrywide efforts to address patient harm caused by medical devices. Even though manufacturers are probably interested in designing safe devices, without coordinated input, they are likely to receive scores of conflicting suggestions—or, worse yet, no suggestions at all. Health care has well-intentioned parties making the best use of their resources, but struggling in a dysfunctional system.

A Better Approach

A more effective and efficient approach would be to design devices so we cannot make mistakes. Anesthesiologists accomplished this decades ago when they made it impossible to accidentally connect an oxygen hose to a nitrous oxide hose. This mistake, which killed patients by inducing hypoxemia, was eliminated by redesigning the hoses, a process that cost far less than the collective costs of continuously reeducating anesthesiologists.

Unfortunately, such direct solutions are rare. To make them more common, the health care industry must acknowledge that device mistakes will occur, and must ensure that human factors principles are incorporated into new device designs from the start. There must also be investment in efficient and effective mechanisms to design, implement, and evaluate interventions that reduce harm to patients. We learned from the aviation industry that this can be done. Long ago, that industry recognized the foolishness of having individual airline investigations and learning from mistakes in isolation.

The authors of this post previously described a public-private partnership to promote patient safety (P5S), modeled after the Commercial Aviation Safety Team (CAST). Through CAST, commercial aviation brings together the entire industry: major manufacturers, airlines, labor organizations, government agencies such as the Federal Aviation Administration, and international organizations such as the Flight Safety Foundation. CAST members collaborate to prioritize their greatest risks, investigate them thoroughly, and design and implement interventions that work. The group addresses ubiquitous errors that are beyond the ability of any single entity to solve. Since its founding in 1997, CAST has helped the aviation industry improve an already admirable record of safety: between 1994 and 2006, the average rate of fatal accidents decreased from 0.05 to 0.022 per 100,000 departures, a 56 percent reduction.

A group such as P5S could coordinate the investigation of harm from medical devices, identify problems and contributing factors, design and evaluate interventions that have a high probability of reducing risks, and support the broad implementation of those interventions. Medical device errors may be especially appropriate for P5S because product redesign is more likely than other types of interventions to reduce the risk of these errors.

A national or global coordinated effort to reduce device errors has several advantages over local efforts. First, it is expensive to conduct detailed investigations, design interventions, and implement solutions; few local efforts would have sufficient resources to do this well. Second, a thorough investigation requires technical expertise that local institutions may lack. Third, individual hospitals would have little clout in changing product designs. Fourth, this approach has worked exceedingly well in aviation.

How might a coordinated effort work? The leaders from the public and private sectors who chair P5S could convene an analysis team of technical experts, including systems and human factors engineers and clinicians, to investigate the causes of harm from radiation therapy. For each contributing factor, this team could recommend and prioritize interventions to reduce risks, starting with those that would have the largest impact on patients.

The leaders could also convene an implementation team and charge it with reviewing the analysis team’s recommendations and working closely with regulators, manufacturers, and professional societies to design a detailed plan to implement the most effective and efficient interventions. The leaders could transparently report which manufacturers implement the recommendations, providing hospitals and patients with the information needed to make informed purchasing decisions. The leaders could also convene a team of health services researchers to evaluate the extent to which risks have actually been reduced, and to improve the process.

The VHA’s National Center for Patient Safety (NCPS) has used this approach to redesign risks out of devices. The NCPS identified problems with patient-controlled analgesia pumps that could inadvertently deliver lethal overdoses to patients. The center’s multidisciplinary team of human factors experts, engineers, and clinicians investigated the problem and worked with the manufacturer to redesign the device to be more tolerant of clinician misuse, moving away from simply encouraging clinicians to be more careful. The NCPS used a similar approach to reduce life-threatening entrapment hazards in hospital beds and infection risks from endoscopic irrigation tubing. These devices were previously prone to causing harm when clinicians were “not careful enough.” Such an approach requires organizational infrastructure, technical expertise, and resources.
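In software terms, a device that is “tolerant of clinician misuse” validates its own inputs instead of trusting the operator. The sketch below illustrates that idea with layered dose limits; the numbers, class names, and interface are hypothetical and are not those of the actual redesigned pump.

```python
class DoseLimitError(ValueError):
    """Raised when a programmed dose falls outside the pump's safe envelope."""

class PCAPump:
    """Hypothetical patient-controlled analgesia pump with layered dose limits."""

    HARD_MIN_MG = 0.1  # below this, the order is almost certainly an entry error
    HARD_MAX_MG = 2.0  # above this, the pump refuses outright; no override exists
    SOFT_MAX_MG = 1.0  # unusual but plausible doses require a deliberate second step

    def program_dose(self, dose_mg: float, confirmed: bool = False) -> None:
        # Hard limits: the software equivalent of a hose that cannot be misconnected.
        if not (self.HARD_MIN_MG <= dose_mg <= self.HARD_MAX_MG):
            raise DoseLimitError(f"{dose_mg} mg is outside the hard limits")
        # Soft limit: a guard against one-keystroke (tenfold) programming slips.
        if dose_mg > self.SOFT_MAX_MG and not confirmed:
            raise DoseLimitError(f"{dose_mg} mg exceeds the soft limit; confirm to proceed")
        print(f"Dose programmed: {dose_mg} mg")

pump = PCAPump()
pump.program_dose(0.5)                  # typical dose, accepted
pump.program_dose(1.5, confirmed=True)  # high dose, accepted after confirmation
try:
    pump.program_dose(15.0)             # a tenfold slip is rejected outright
except DoseLimitError as e:
    print(f"Rejected: {e}")
```

The hard limit plays the same role as the incompatible anesthesia hoses: the dangerous action is rejected by the device itself, not by an appeal to vigilance.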

Challenges Remain

Although our group has worked to demonstrate proof of concept for a health care CAST, several barriers remain. The first is leadership. To increase its chances of success, P5S should probably be housed in or supported by a governmental agency. The second is financing. The P5S would produce a public good and would need stable financing to conduct its operations. In the radiation therapy case, a moderate amount of funding would be required to support the investigation of the engineering weaknesses that permit error, and of potential solutions. While participating organizations would be expected to contribute personnel time and effort, and other forms of in-kind support, additional financing would be required. We are exploring various financial models to provide the P5S with a stable source of funding, including grant support, social venture capital, and membership fees.

At a recent Senate hearing, one of the authors (PJP) discussed the need for the P5S. A Senator said: “You mean to tell me I cannot stick a diesel fuel pump in a gas-powered car because it won’t fit, yet clinicians can connect IVs to epidural catheters? How can that be?” If we are to realize the benefits from our investments in biomedical technologies, we must increase our ability to use those therapies safely. The time has come to create an effective and efficient mechanism to reduce harm from medical devices. The time has come to join together to create the P5S as a supplement to professional, regulatory, and financial incentives to improve patient safety. We cannot solve the problems we face by working alone.