Figure 1. The NAT concepts of linear vs interactively complex and loosely coupled vs tightly coupled are plotted in quadrants. Several nonmedical systems from Perrow’s original descriptions (italics),5 and approximate plot locations for oncology-related activities are shown in boxes (adapted from Marks et al19). BMT indicates bone-marrow transplant; chemo, chemotherapy; RT, radiation therapy.
Figure 2. The illustrated goal sheet is for a patient with head and neck cancer. The “Result” and “Meet Goal?” columns are color coded, with red indicating that the goal has not been met (“No”); green, that the goal has been met (“Yes”). This sheet facilitates and standardizes plan review. Max indicates maximum; N/A, not applicable; PTV, planning target volume; Rx, prescription.
Figure 3. Not all of our Normal Accident Theory interventions were implemented at the same time. Standardized communication procedures began in 2009, before any measure illustrated on the graph; use of the goal sheet began in June 2010. There were various reasons for replanning, some of which were preventable (eg, unclear physician directives), while others were clinically warranted (eg, demonstrated tumor regression). Our goal was to reduce replanned treatment rates to 5% per month or lower, understanding that some replanned treatments are always necessary.
Figure 4. Implementation of our Normal Accident Theory interventions started in 2008 and has progressively developed to the present; this work is ongoing. Percentages of positive responses in the selected 3 dimensions of patient safety culture (organizational learning, overall perception of patient safety and quality, and office processes and standardization) appear to have increased from 2009 to 2013 (analysis of variance P<.01). In addition, while the number of employees did not change, the number who responded to the survey in 2013 more than doubled from 2011, perhaps reflecting an increased respect for quality-improvement activities. AHRQ indicates Agency for Healthcare Research and Quality.
Chera BS, Mazur L, Buchanan I, Kim HJ, Rockwell J, Milowsky MI, Marks LB. Improving Patient Safety in Clinical Oncology: Applying Lessons From Normal Accident Theory. JAMA Oncol. 2015;1(7):958-964. doi:10.1001/jamaoncol.2015.0891
Concerns for patient safety persist in clinical oncology. Within several nonmedical areas (eg, aviation, nuclear power), concepts from Normal Accident Theory (NAT), a framework for analyzing failure potential within and between systems, have been successfully applied to better understand system performance and improve system safety. Clinical oncology practice is interprofessional and interdisciplinary, and our therapies often have narrow therapeutic windows. Thus, many of our processes are, in NAT terms, interactively complex and tightly coupled within and across systems and are therefore prone to unexpected behaviors that can result in substantial patient harm. To improve safety at the University of North Carolina, we have applied the concepts of NAT to our practice to better understand our systems’ behavior and adopted strategies to reduce complexity and coupling. Furthermore, recognizing that we cannot eliminate all risks, we have stressed safety mindfulness among our staff to further promote safety. Many specific examples are provided herein. The lessons from NAT are translatable to clinical oncology and may help to promote safety.
The Institute of Medicine (IOM) has highlighted patient safety as an urgent health care quality problem,1,2 estimating that 44 000 to 98 000 Americans die annually from preventable medical errors,1,3 and more recent estimates are even higher.4 Safety concerns within clinical oncology are at least as complex as they are in other fields. Herein, we apply concepts from Normal Accident Theory (NAT),5 a framework for analyzing failure potential within and between systems, to the practice of clinical oncology with the goal of identifying areas of safety concern. Specifically, we analyze linear vs interactive complexity and tight vs loose coupling to identify areas for potential safety improvement. In addition, we demonstrate how we have already applied the findings of these analyses to improve safety in clinical oncology practice at the University of North Carolina (UNC).
The medication error rate in the outpatient chemotherapy setting has been reported to be approximately 3% to 19%,6,7 depending on the specific practice setting. In a survey of adult outpatients receiving chemotherapy, 42 (22%) of 193 believed that they had experienced unsafe care.8 A survey of 1013 health care professionals from 9 oncology departments in Switzerland9 reported that 54% observed their colleagues making potentially harmful errors. Seventy percent reported sometimes remaining silent about safety concerns, and 37% reported remaining silent when they could have helped prevent an incident. The same research group10 surveyed patients and found that 16% reported experiencing an error in their care, and 11% were very concerned about the errors.10
Radiotherapy-associated errors were noted to occur in about 1% to 4% of patient treatments in single-institution reports, most of which were not clinically serious.11 Registry data note a much lower error rate (about 1 to 4 in 10 000) because only a small fraction of errors cross the threshold triggering reporting.11 In a survey of radiation therapists, 16% reported being personally reprimanded for raising concerns about safety.12
These data are concerning both for the prevalence of safety issues and the apparent “suppression” of reporting them. National professional societies have recognized these issues, published white papers to describe best practices, and now hold annual safety meetings. For example, the American Society of Clinical Oncology/Oncology Nursing Society has guidelines for chemotherapy administration and convenes a yearly Quality Care Symposium,13 and the American Society of Therapeutic Radiation Oncology and the American Association of Physicists in Medicine have published numerous quality and safety reports14-18 and have held several safety-focused meetings.
We applaud these efforts and support continued initiatives aimed at improving patient safety. However, we strongly believe that major gains in safety will require broad adoption of concepts from NAT.5
Normal Accident Theory (NAT), a framework for analyzing failure potential within and between systems, can be successfully applied to clinical oncology to better understand system performance and improve patient safety.
Many processes in clinical oncology are, in NAT terms, interactively complex and tightly coupled, a combination that inevitably results in errors leading to patient harm.
Safety initiatives should be designed to reduce complexity and coupling of clinical oncology processes.
Strategies of automation, forcing functions (ie, hard stops), standardization, and monitoring of system performance (incident learning systems, peer review) can reduce complexity and coupling.
The practice of clinical oncology may never be error free; a global strategy to promote patient safety is to develop safety mindfulness among clinical oncology health care workers.
Substantial work in non–health care settings has been performed to better understand the causes of errors and investigate potential mitigation strategies. One of the nation’s leading theorists in the area of safety, Dr Charles Perrow,5 has developed NAT, a framework for analyzing failure potential within and between systems. He argues that errors in systems occur often and are indeed expected as part of normal operations. He categorizes systems based on how these errors propagate and interact within the larger system. Systems in which failures propagate and interact predictably are considered linear, and those in which failures behave unpredictably are interactively complex.
He further categorizes systems by their ability to detect and respond to failures. Systems that are relatively slow, allowing relatively more opportunity to detect and respond to failures, are termed loosely coupled, while those that are fast, offering less opportunity to detect and respond to failures, are termed tightly coupled.
For example, the US Post Office system is linear (errors have predictable consequences) and is loosely coupled (errors are largely detected and corrected, and most of the mail ultimately gets delivered). A dammed river system is also linear but is tightly coupled. A dam breach will often lead to a flood because the time scale for fixing the breach is too long to mitigate the rapid downstream effects. A university is interactively complex because events occurring within its many varied components (eg, multiple departments, schools, social events, athletics) can interact in unforeseen ways (Figure 1).19
Perrow5 argues that systems that are both interactively complex and tightly coupled have a particular propensity for catastrophic failure. Since errors in subsystems are assuredly going to occur, and since these will propagate in unforeseen ways that cannot be fully understood or mitigated, major global system failures are probable. In other words, complex systems cannot be fully understood, and thus their behavior will always have some element of chaos. He argues that only a change in their structure—reducing coupling or reducing interactive complexity—can reduce the probability of a catastrophic event.
Where does clinical oncology lie in the NAT construct? We submit that clinical oncology practice is a relatively complex system (Figure 1). Most patients require multidisciplinary care involving numerous diverse specialists and caregivers (eg, surgical, medical, and radiation oncologists; nurses; social workers), often with multiple care transitions (eg, from one physician office to another and from outpatient to inpatient settings). Thus, the number of handoffs, interactions, and unforeseen interactions can be high, each carrying with it the potential for error. Furthermore, the interconnected nature of much of oncology practice tends to propagate individual errors, and interactions between multiple errors, through the system and to the patient (eg, how a radiologist obtains or reviews images can affect a surgical decision, which can affect a pathologic assessment, which can alter a treatment decision). All of these complexities mean that it is frequently difficult to know how errors will propagate through the system.
Clinical oncology processes have variable degrees of coupling. When treatment must be delivered quickly (eg, acute leukemic crisis), coupling can be tight, and errors can rapidly reach the patient. Even if the pace of care is slow (eg, outpatient hospice care), processes without routine downstream safety checks may still enable errors (and unforeseen interactions between errors) to reach the patient. On the other hand, the multidisciplinary nature of oncology may reduce coupling. For example, typically several clinicians will review the same set of images or reports; cases are discussed in tumor boards with multiple attendees; and multiple clinicians perform the same history and physical examination. Thus, there are often opportunities for upstream errors to be mitigated, and unforeseen interactions within the complex systems might be detected by the diverse types of people involved in the processes.20 Based on these considerations, we have categorized a number of oncology processes as examples of the application of NAT (Figure 1).
Different quality-assurance strategies are more or less effective for systems with different levels of complexity and coupling. For example, people working in systems that are linear and tightly coupled (upper left quadrant in Figure 1) must strictly adhere to processes, and there should be frequent monitoring of key system performance metrics (ideally automatically). The same is true for systems that are interactively complex and tightly coupled (upper right quadrant in Figure 1), but here the monitoring of performance measures should be comprehensive (since the manner in which the system will fail cannot be predicted). In loosely coupled systems, the monitoring can be less frequent and people based (eg, peer review).
The terms quality and safety are not rigorously defined and hence require clarification for our purposes. Safety often refers to the avoidance of catastrophic failures (eg, death or serious injury from a medication misadministration), and quality is often used more broadly (eg, patient satisfaction scores, wait times in clinics). However, there is clearly a continuum, and the distinction between safety and quality (particularly in medicine) is often indistinct. Most medical errors do not cause fatality or serious injury, but the factors that determine whether a fatality or serious injury occurs can be unpredictable. The distinction between safety and quality is particularly blurry in interactively complex systems where seemingly minor quality issues can interact in unforeseen ways resulting in safety issues. We submit that this is the case in much of medicine (particularly in oncology), and thus safety and quality initiatives are inherently linked.
Interactive complexity is reduced by promoting desired processes (and reducing process variation) by, in order of efficacy, automation, forcing functions (ie, hard stops), and standardization. This approach is especially useful in tightly coupled, interactively complex processes, but should be effective in all processes. A straightforward example from radiation treatment planning is nomenclature. During radiation treatment planning, alternative beams and plans are considered and compared. Without a naming convention, it can become difficult to keep track of the alternatives, and occasionally an incorrect beam or plan might propagate through the process potentially causing errors. At UNC, we have modified our software such that beams and plans are automatically given unambiguous names (eg, based on orientation, creator’s name, date and time).21 Additional descriptors can be added to the name as desired.
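The idea of deterministic, automatic naming can be illustrated with a small sketch. The article does not specify the exact convention used at UNC, so the format below (gantry angle, creator initials, creation timestamp) is a hypothetical example of how a name could be generated from a beam’s attributes so that two alternatives can never share an ambiguous label:

```python
from datetime import datetime

def auto_beam_name(gantry_deg, creator, when=None):
    """Compose an unambiguous beam name from its defining attributes.

    Hypothetical convention (not the actual UNC format): zero-padded
    gantry angle, creator initials, and a creation timestamp.
    """
    when = when or datetime.now()
    return f"G{gantry_deg:03d}_{creator}_{when:%Y%m%d-%H%M}"

# Two plans created by different people at different times can never
# collide on a vague name like "AP2" or "final_v3".
name = auto_beam_name(90, "BC", datetime(2015, 5, 14, 9, 30))
# name == "G090_BC_20150514-0930"
```

Because the name is derived from the beam itself rather than typed free-hand, the convention removes one source of interactive complexity: a reviewer can tell at a glance which beam is which, and stale alternatives are easy to spot.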
Another example from radiation treatment planning is our “3 Ps” system for communicating vital patient information in radiation planning directives. During radiation treatment planning, it is important to know and communicate with other caregivers whether the patient is pregnant (P1), has a pacemaker (P2), or has received prior irradiation (P3). Failure to consider these factors may have severe consequences (particularly the prior radiation). Similarly, clinicians need to communicate their goals (eg, patient positioning, treatment start date, dose and volume goals). We have defined a single location within our Mosaiq radiation therapy electronic health record (Elekta AB) where clinicians convey this information in a consistent location and format. Staff are empowered (supported by the chairperson) to “stop the line” (ie, hard stop—not proceed with simulation or planning) if these items are unaddressed. Prior to instituting this approach, we had more variation in communication (eg, telephone message, text page, email, sticky note, verbal exchange), more chaos, more rework, and more events with potential patient harm. The standardized means of electronic communication are intended to complement, and not replace, face-to-face discussions. The importance of using this standard process is reinforced in our daily morning departmental huddle.
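The “stop the line” rule is a classic forcing function, and its logic can be sketched in a few lines. The field names below are hypothetical stand-ins for however a directive is actually stored; the point is that the check refuses to proceed, rather than merely warning, when any of the 3 Ps is unaddressed:

```python
def ready_for_simulation(directive):
    """Hard stop: refuse to proceed unless the 3 Ps are addressed.

    `directive` is a hypothetical dict of planning-directive fields.
    A value of None means the question was never answered; an explicit
    False ("not pregnant") is a valid, addressed answer.
    """
    required = ("pregnancy", "pacemaker", "prior_irradiation")
    missing = [p for p in required if directive.get(p) is None]
    if missing:
        raise ValueError(f"Stop the line: unaddressed items {missing}")
    return True
```

A forcing function like this converts a tightly coupled step (an error flows straight downstream) into a loosely coupled one: the process cannot continue until the upstream omission is corrected.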
For purposes of standardized communications, we distinguish between processes and medical decisions. For example, one wants to standardize how a medical oncologist communicates the chemotherapy treatment plan to the infusion nurse. This is distinctly separate from the medical decision as to the optimal chemotherapeutic agent to use. The processes are easier to standardize because they are more readily accepted by clinicians. There is also some utility in standardizing certain medical decisions in order to streamline workflow. However, for both processes and medical decisions, there must be some flexibility, which is easier to conceptualize for medical decisions. Thus, we suggest that standardization initiatives focus initially largely on processes.
Review of dosimetric parameters within a radiation treatment plan can be cumbersome and perhaps haphazard: there are many images and parameters to review. Typically, the clinician will review a dose volume histogram, identify some critical values (eg, mean lung dose), and compare these data against some standard. The comparison is often done “in one’s head,” perhaps aided by dose limits thumbtacked to the wall. At UNC, we have standardized the manner in which plans can be reviewed by creating a digital goal sheet where predetermined parameters from the plan are automatically compared with departmental standards. Color coding is used to facilitate easy review of the data (eg, parameters meeting the standards and goals are green, and those out of range are red). Goal sheets are also helpful during peer review, and their use enables harmonization of departmental standards and the rapid deployment of new or modified standards (Figure 2).
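The core of the goal sheet is a mechanical comparison of each plan parameter against a departmental standard, with the result rendered as a color. The goals and values below are hypothetical illustrations, not UNC’s actual standards:

```python
# Hypothetical departmental dose-volume goals; real standards differ
# by disease site and prescription.  Each goal is (operator, limit).
GOALS = {
    "spinal cord max (Gy)": ("<=", 45.0),
    "mean lung dose (Gy)":  ("<=", 20.0),
    "PTV coverage (%)":     (">=", 95.0),
}

def review_plan(plan_values):
    """Compare plan parameters against goals; return color-coded rows.

    Each row mirrors a line of the goal sheet: parameter name, the
    plan's value, the departmental limit, and green/red for met/unmet.
    """
    rows = []
    for name, (op, limit) in GOALS.items():
        value = plan_values[name]
        met = value <= limit if op == "<=" else value >= limit
        rows.append((name, value, limit, "green" if met else "red"))
    return rows
```

Because the standards live in one shared table rather than on thumbtacked wall charts, updating a departmental limit immediately changes every subsequent review, which is what enables the harmonization and rapid deployment the text describes.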
Radiation treatment planning can be complicated and time-consuming, and changes result in replanning, rework, and treatment delays. Understanding that some replanning is often necessary (eg, tumor shrinkage), our goal is to keep the monthly replanning rate at 5% or lower. Figure 3 shows our monthly radiation treatment replanning rates. We believe that standardizing communication of planning directives and use of the goal sheet have reduced the radiation replanning in our department.
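The metric tracked in Figure 3 is simple enough to state precisely. A minimal sketch of the monthly calculation, with the 5% threshold as the default goal:

```python
def replan_rate(replans, plans_completed):
    """Monthly replanning rate as a percentage of completed plans."""
    return 100.0 * replans / plans_completed

def meets_goal(replans, plans_completed, threshold_pct=5.0):
    """True if the month's replanning rate is at or below the goal."""
    return replan_rate(replans, plans_completed) <= threshold_pct
```

Tracking the rate monthly, rather than counting raw replans, keeps the metric comparable as patient volume fluctuates.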
We recently instituted a multidisciplinary initiative called the Enhanced Recovery After Surgery (ERAS) program to standardize preoperative, intraoperative, and postoperative processes for patients undergoing surgical treatment for pancreatic cancers, with plans to implement it in all major oncologic resections. This setting is particularly fitting for such an approach because it is interactively complex (multiple clinicians, handoffs, and care settings) and at times tightly coupled. The ERAS program is a multimodality, perioperative, evidence-based care pathway designed to achieve early recovery for patients undergoing major surgery.22-24 The ERAS protocols standardize care, thereby accelerating recovery, improving safety, and optimizing use of health care resources. Multidisciplinary team education (anesthesiology, surgery, and outpatient and inpatient nursing) and timely communication were integral throughout the process, along with early integration of patient education to highlight the advantages of this approach.
Prior to implementation of ERAS, the perioperative processes with the most variances included intraoperative fluid management and perioperative pain control. The ERAS protocol has standardized perioperative fluid management, with close monitoring of the goal-directed fluid therapy delivered by the anesthesia clinicians, and widely incorporated the use of thoracic epidural analgesia. During the postoperative period, the nurses and patients are educated on the benefits of early mobilization, and a large portion of the surgical care has been outlined in clinical algorithms. Our preliminary results are encouraging: we have seen decreased lengths of stay, decreased numbers of complications, decreased pain scores, and improved patient satisfaction (as measured by Press Ganey analytic tools). Although changes in the culture of an institution often come about slowly, processes are now being established to sustain these NAT improvement measures throughout the health care system.
For systems that are interactively complex and tightly coupled (upper right quadrant in Perrow’s model;5 Figure 1), the monitoring of performance measures should be comprehensive because the manner in which the system will fail cannot be predicted. Thus, patient-specific pretreatment checks are more comprehensive and include technical (eg, treatment plan quality and robustness) and clinical (eg, patient’s treatment decisions, target definition, planned doses) factors. The Department of Radiation Oncology at UNC has a long history of daily pretreatment physician peer review. This approach has been expanded to include a broader cross-section of the department (eg, including nonphysicians and students). Decisions regarding each patient are reviewed and discussed publicly, and nonphysician staff members are encouraged to participate (eg, “Dr Marks, your target looks tighter than your usual”).21 These sessions are conducted as part of our broader daily huddle and are well attended (20-30 people) and viewed favorably by the vast majority of staff.21 Since beginning the sessions, we have noted a reduction in the percentages of patients needing reworking of their radiation treatment plans.21 Similarly, we have a robust multidisciplinary tumor board program in our cancer center (9 disease-specific meetings per week), where most of our cases are publicly discussed.
We have created detailed process flow maps (PFMs) to understand how information is passed from one step to the next (eg, handoffs) and where existing quality and safety checks are located. We have also implemented a robust “good catch” incident learning system, where workers report errors and near-errors (eg, missing or incorrect information, scheduling problems). By mapping the good catches to the PFMs, we quantitatively assess where errors are detected, where they originated (ie, their root causes), and the utility of existing safety barriers. This information is used to inform improvement work aimed at modifying processes, safety barriers, or other practices or procedures. Between 2012 and 2014, 880 good catches were reported, leading to 63 formal improvement activities. Typically, these events lead to changes in our processes and the elimination, modification, or institution of safety barriers aimed at reducing coupling (ie, reducing the likelihood that errors propagate).
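The mapping of good catches onto PFM steps amounts to tallying, for each reported event, the step where the error originated and the step where it was caught. The step names and records below are hypothetical, but the tallies show how the analysis localizes both root causes and the safety barriers doing the work:

```python
from collections import Counter

# Hypothetical good-catch records, each keyed to steps in the
# process flow map (PFM): where the error originated and where
# a downstream check detected it.
catches = [
    {"origin": "simulation", "detected": "physics check"},
    {"origin": "planning",   "detected": "physician peer review"},
    {"origin": "simulation", "detected": "therapist timeout"},
]

origins   = Counter(c["origin"] for c in catches)
detectors = Counter(c["detected"] for c in catches)

# Steps that originate many errors are candidates for process
# redesign; steps that catch many are the effective safety barriers.
print(origins.most_common(1))  # → [('simulation', 2)]
```

Aggregated over hundreds of reports, the origin tallies point improvement work at root causes, while the detection tallies reveal which barriers are actually intercepting errors and which are contributing little.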
The importance of the good catch program is recognized and publicly emphasized through many visual displays. We recently surveyed all members of the UNC Department of Radiation Oncology (71% response rate) to assess their knowledge, understanding, and perceptions of the good catch program. Overall, 84% of respondents (100% of physicians) agreed or strongly agreed that the good catch program enhances patient safety mindfulness. Only 5% of respondents reported being dissuaded by a physician, supervisor, or peer from submitting good catches. These results suggest that the use of an incident learning system can enhance patient safety mindfulness and promote a safety culture. The UNC Department of Radiation Oncology’s improvement in patient safety culture is also reflected in our Agency for Healthcare Research and Quality patient safety survey results (Figure 4). A similar good catch program is being developed throughout the cancer hospital (North Carolina Cancer Hospital), with initial efforts focused on the outpatient pediatric and adult oncology infusion clinics.
We recognize that we cannot fully engineer all risks out of clinical oncology practice. The complex nature of health care dictates that errors and unforeseen interactions will likely still occur despite our best efforts. Nevertheless, we can create systems that have some degree of flexibility to accommodate the unexpected. Furthermore, we need to assure that staff members have safety mindfulness—a persistent focus on safety, a recognition that unforeseen errors will occur, and a desire to proactively improve their systems and processes. We have tried to create a culture, environment, and infrastructure that allow all individuals (staff members and patients) to develop an understanding of safety mindfulness and to feel comfortable in openly speaking about errors and suboptimal systems. This is a challenge because medicine has traditionally promoted the concept that errors are the result of people’s individual failings3 and are associated with blame, shame, and disciplinary actions. Dr Lucian Leape has stated that such a “climate of blame and punishment … has been the major barrier to making progress in safety over the years.”25 We need to promote the (true) belief that errors will occur and that they are due largely to systems’ flaws rather than human character flaws.
Toward this goal at UNC, we have instituted several important practices. Leaders speak often and openly about safety concerns. They encourage, recognize, reward, and publicly celebrate people who participate in improvement work (eg, reporting good catches) and are involved in formal improvement events (Figure 3). Also, our decisions regarding compensation and promotion take into account the employee’s participation in quality and safety improvement work. In addition, we conduct daily morning huddles, in conjunction with our daily peer review, where we review the day’s upcoming activities (eg, number of patients, anticipated challenges). This huddle is a consistent reminder to all that we operate within systems that are somewhat interactively complex and coupled and that unforeseen events can occur. Similar huddle concepts are being applied more broadly in the North Carolina Cancer Hospital.
Leaders regularly conduct safety rounds, where they speak with front-line workers at their worksites (eg, treatment machine, clinic) to discuss potential safety and quality issues. At first, workers were often reluctant to disclose their concerns for fear of blame, reprimand, or loss of job security. Over time, we believe that staff members have become more comfortable with this initiative. Since 2010, we have had 14 safety rounds sessions, visited all workspaces, collected over 200 suggestions and concerns, and provided follow-up on most of these.
We have been trying to systematically apply concepts from NAT to help us better understand how our systems behave and to implement more effective safety initiatives. During this time, we have noted improved process measures (eg, reduced wait times, reduced interruptions, reduced percentage of patients requiring radiation therapy replanning), improved workers’ perceptions about quality and safety (Figure 4), and improved financial performance. Our experiences at UNC suggest that this approach is both feasible and useful.
Accepted for Publication: March 12, 2015.
Corresponding Author: Bhishamjit S. Chera, MD, Department of Radiation Oncology, University of North Carolina Hospitals, 101 Manning Dr, CB No. 7512, Chapel Hill, NC 27599-7512 (email@example.com).
Published Online: May 14, 2015. doi:10.1001/jamaoncol.2015.0891.
Author Contributions: Dr Chera had full access to all of the data in the study and takes responsibility for the integrity of the data and the accuracy of the data analysis.
Study concept and design: Chera, Mazur, Kim, Rockwell, Marks.
Acquisition, analysis, or interpretation of data: Chera, Mazur, Buchanan, Kim, Rockwell, Milowsky.
Drafting of the manuscript: Chera, Mazur, Buchanan, Rockwell, Marks.
Critical revision of the manuscript for important intellectual content: Chera, Mazur, Buchanan, Kim, Rockwell, Milowsky, Marks.
Administrative, technical, or material support: Chera, Buchanan, Kim, Rockwell.
Study supervision: Chera, Kim, Rockwell, Marks.
Design of figures, selection of references: Mazur.
Conflict of Interest Disclosures: Drs Mazur and Marks have received research funding in the area of patient safety from Accuray and Elekta. No other conflicts are reported.
Funding/Support: This study was supported in part by funding from the Agency for Healthcare Research and Quality, the Centers for Disease Control and Prevention, and additional support from the UNC Innovations Center, UNC Institute for Healthcare Quality Improvement, and UNC Health Care System.
Role of the Funder/Sponsor: The funding institutions had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation, review, or approval of the manuscript; and decision to submit the manuscript for publication.
Correction: This article was corrected on June 11, 2015, to fix a byline error: Ian Buchanan’s name was misspelled.