Error Reduction in Health Care (E-Book)

Description

Error Reduction in Health Care: A Systems Approach to Improving Patient Safety, 2nd Edition

Completely revised and updated, this book offers a step-by-step guide for implementing the Institute of Medicine guidelines to reduce the frequency of errors in health care services and mitigate the impact of those errors that do occur. It explores the fundamental concepts and tools of error reduction and shows how to design an effective error reduction initiative. The book pinpoints how to reduce and eliminate medical mistakes that threaten the health and safety of patients, and teaches how to identify the root cause of medical errors, implement strategies for improvement, and monitor the effectiveness of these new approaches.


Page count: 622

Publication year: 2011




Table of Contents

Cover

Table of Contents

Title page

Copyright page

FIGURES, TABLES, AND EXHIBITS

FOREWORD

PREFACE

THE EDITOR

THE AUTHORS

Part 1: THE BASICS OF PATIENT SAFETY

CHAPTER 1 A FORMULA FOR ERRORS: GOOD PEOPLE + BAD SYSTEMS

Surgery on Wrong Patient

Why Mistakes Occur

How to Error-Proof Processes

Role of Senior Leaders

Conclusion

Discussion Questions

Key Terms

CHAPTER 2 THE HUMAN SIDE OF MEDICAL MISTAKES

Health Care: A Unique Socio-Technical System

Humans as Problem Solvers

Conclusion

Discussion Questions

Key Terms

CHAPTER 3 HIGH RELIABILITY AND PATIENT SAFETY

High-Reliability Principles

Applying HRO Principles to Health Care

Highly Reliable Processes

Conclusion

Discussion Questions

Key Terms

Part 2: MEASURE AND EVALUATE PATIENT SAFETY

CHAPTER 4 MEASURING PATIENT SAFETY PERFORMANCE

Monitor High-Risk Processes

Measure Performance

Collect Measurement Data

Conclusion

Discussion Questions

Key Terms

CHAPTER 5 ANALYZING PATIENT SAFETY PERFORMANCE

Reporting Measurement Data

Safety Measurement Results

Conclusion

Discussion Questions

Key Terms

CHAPTER 6 USING PERFORMANCE DATA TO PRIORITIZE SAFETY IMPROVEMENT PROJECTS

Quantify Opportunity Priorities

Opportunity Analysis Steps

Conclusion

Key Terms

Part 3: REACTIVE AND PROACTIVE SAFETY INVESTIGATIONS

CHAPTER 7 ACCIDENT INVESTIGATION AND ANTICIPATORY FAILURE ANALYSIS

Accident Investigation

Anticipatory Failure Analysis

Conclusion

Key Terms

CHAPTER 8 MTO AND DEB ANALYSIS CAN FIND SYSTEM BREAKDOWNS

Framework for Investigating Medical Accidents

Framework for Proactive Safety Improvements

Conclusion

Key Terms

CHAPTER 9 USING DEDUCTIVE ANALYSIS TO EXAMINE ADVERSE EVENTS

Adverse Event Investigation Techniques

Application of Deductive Analysis

Deconstruction of the Logic Tree

Conclusion

Discussion Questions

Key Terms

Part 4: HOW TO MAKE HEALTH CARE PROCESSES SAFER

CHAPTER 10 PROACTIVELY ERROR-PROOFING HEALTH CARE PROCESSES

Anatomy and Physiology of a Process

How Processes Fail

Why Processes Fail

Medical Management of the Health Care Process

Conclusion

Discussion Questions

Key Terms

CHAPTER 11 REDUCING ERRORS THROUGH WORK SYSTEM IMPROVEMENTS

Error Management Strategies

Multiple Improvements Are Needed

Conclusion

Discussion Questions

Key Terms

CHAPTER 12 IMPROVE PATIENT SAFETY WITH LEAN TECHNIQUES

Going Lean

Building Lean Processes

Lean Improvement Tools

Applying Lean Tools

Conclusion

Discussion Questions

Key Terms

Part 5: FOCUSED PATIENT SAFETY INITIATIVES

CHAPTER 13 HOW INFORMATION TECHNOLOGY CAN IMPROVE PATIENT SAFETY

What We Have Learned from Research and Practice

HIT Implementation Challenges

Solutions

Conclusion

Discussion Questions

Key Terms

CHAPTER 14 A STRUCTURED TEAMWORK SYSTEM TO REDUCE CLINICAL ERRORS

Teamwork Training Addresses Safety Challenges

Teamwork Challenges in Emergency Care

The Teamwork System

Identifying Teamwork Failures in Incidents

Implementing a Teamwork System

Conclusion

Discussion Questions

Key Terms

CHAPTER 15 MEDICATION SAFETY IMPROVEMENT

National Efforts to Improve Medication Safety

Impact of Organizational Culture

Understanding Medication Errors

Medication Error Prevention Strategies and Practices

Measuring and Improving the Quality of Medication Delivery

Role of Patients in Medication Safety

Conclusion

Discussion Questions

Key Terms

GLOSSARY

Index

Copyright © 2011 by John Wiley & Sons, Inc. All rights reserved.

Published by Jossey-Bass

A Wiley Imprint

989 Market Street, San Francisco, CA 94103-1741—www.josseybass.com

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the publisher, or authorization through payment of the appropriate per-copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400, fax 978-646-8600, or on the Web at www.copyright.com. Requests to the publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, 201-748-6011, fax 201-748-6008, or online at www.wiley.com/go/permissions.

Readers should be aware that Internet Web sites offered as citations and/or sources for further information may have changed or disappeared between the time this was written and when it is read.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

Jossey-Bass books and products are available through most bookstores. To contact Jossey-Bass directly call our Customer Care Department within the U.S. at 800-956-7739, outside the U.S. at 317-572-3986, or fax 317-572-4002.

Jossey-Bass also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic books.

Library of Congress Cataloging-in-Publication Data

Error reduction in health care : a systems approach to improving patient safety / Patrice L. Spath, editor.—Second Edition.

p. ; cm.

 Includes bibliographical references and index.

ISBN 978-0-470-50240-2 (pbk.); 978-1-118-00151-6 (ebk.); 978-1-118-00155-4 (ebk.); 978-1-118-00156-1 (ebk.)

1. Health facilities–Risk management. 2. Medical errors–Prevention. I. Spath, Patrice L., 1949- editor.

 [DNLM: 1. Medical Errors–prevention & control. 2. Quality Assurance, Health Care–methods. 3. Risk Management–methods. WB 100]

 RA971.38.E77 2011

 362.1–dc22

2010047564

FIGURES, TABLES, AND EXHIBITS

Figures

Figure 1.1 Active Errors Leading to Mr. Murphy’s Unnecessary Surgery

Figure 1.2 Error-Reduction Strategies

Figure 2.1 Typical Medical Accident Trajectory

Figure 4.1 Patient Incident Report Form

Figure 4.2 Sample Summary Report of Incidents for One Month

Figure 4.3 Flowchart of Blood Transfusion Process

Figure 5.1 Tabular Report of Inpatient Patient Safety Measures

Figure 5.2 Dashboard Tabular Report of Inpatient Patient Safety Measures

Figure 5.3 Bar and Line Graph of Patient Fall Incidents

Figure 5.4 Percentage of Patient Incidents That Did Not Result in Discomfort, Infection, Pain, or Harm to the Patient

Figure 5.5 Control Chart of Number of Patient Falls

Figure 6.1 Basic Blood Drawing Process Flow Diagram

Figure 6.2 Blood Redraw Process Significant Few Events

Figure 7.1 Root Cause Analysis Results of the Death of Patient Following Blood Transfusion (Case #1)

Figure 7.2 Root Cause Analysis Results of Serious Disability Following Elective Arthroscopic Knee Surgery (Case #2)

Figure 7.3 One Process Step from FMEA on the Process of Ordering Medication for a Hospitalized Patient

Figure 9.1 The 5-Whys Analytical Tool

Figure 9.2 Ishikawa Diagram

Figure 9.3 Causal Tree

Figure 9.4 PROACT Logic Tree

Figure 9.5 Surgical Fire Incident Top Box

Figure 9.6 Drill Down of Modes in Surgical Fire Incident

Figure 9.7 Body of the Logic Tree for Surgical Fire Incident

Figure 9.8 Physical, Human, and Latent Root Causes

Figure 9.9 Completed Logic Tree for Surgical Fire Event

Figure 9.10 Surgical Fire Analysis Results Using the 5-Whys Method

Figure 10.1 Levels of Analysis

Figure 11.1 Thrombosis Risk Assessment for Surgical and Medical Patients

Figure 11.2 Physician Orders for VTE Prophylaxis

Figure 12.1 Equipment Red Tag for 5S Exercise

Figure 12.2 Visual Aid for Clinical Staff Managing Behavioral Health Patient

Figure 12.3 Kanban Card for Inventory Control

Figure 14.1 Most Frequent Teamwork Errors

Figure 14.2 Care Resources Managed by the ED Core Team

Figure 14.3 Interrelationships of the Five Team Dimensions

Figure 14.4 The Teamwork Check Cycle

Figure 14.5 Teamwork Failure Checklist

Figure 14.6 Individual Claim Assessment Process

Figure 14.7 Example of a Completed Teamwork Failure Checklist

Figure 14.8 Teamwork System Implementation

Figure 15.1 Hospital Medication Administration Process

Figure 15.2 Anonymous Medication Error Report

Figure 15.3 Medication System Error Analysis

Figure 15.4 Heart Failure Clinical Pathway

Figure 15.5 Do-Not-Use Abbreviations

Figure 15.6 Medication Administration Process: Manual Versus Computerized

Figure 15.7 Patient Version of Pneumonia Guideline

Tables

Table 3.1 Process Reliability Levels and Related Improvement Strategies

Table 4.1 Examples of Patient Safety Indicators

Table 4.2 Task Criticality Scoring System for the Process of Enteral Feeding Product Procurement, Preparation, and Administration

Table 4.3 Commonly Used Patient Safety Culture Assessment Tools

Table 4.4 ICD-9-CM Diagnosis Codes for Maternal and Fetal Complications of in Utero Procedures

Table 4.5 Data Definitions Applicable to Reporting to a Patient Safety Organization

Table 5.1 Organization-Wide Report of Safety Measures for the Function of Medication Administration

Table 6.1 Blood Drawing Process Spreadsheet Headings

Table 6.2 Blood Redraw Process OA Spreadsheet

Table 8.1 Example of a Blank MTO Diagram

Table 8.2 Taxonomy of Contributing Causes

Table 8.3 Example of a Schematic Diagram from an MTO Analysis

Table 9.1 Proprietary RCA Deductive Analysis Software Products

Table 9.2 The 5 Ps of Data Collection

Table 10.1 Probability of Success in a Process

Table 12.1 Categories of Waste and Hospital Examples

Table 12.2 Five Phases of the 5S Lean Improvement Tool

Table 14.1 Team Characteristics

Table 14.2 Teamwork Behavior Matrix

Table 14.3 Potential Uses of Teamwork Failure Checklist Findings

Table 15.1 Process Redesign Tactics Applied to Medication Delivery

Table 15.2 Medication Measures

Exhibits

Exhibit 2.1 Case #1: Midwife Takes Wrong Drug Vial off the Shelf

Exhibit 2.2 Case #2: Wrong Method Used for Anesthesia

Exhibit 2.3 Case #3: Missed Diagnosis of Diabetic Ketoacidosis

Exhibit 2.4 Case #4: Nurse Administers Ten Times the Prescribed Amount of Insulin

Exhibit 3.1 Organization Self-Assessment of High-Reliability Principles

Exhibit 3.2 National Center for Patient Safety Hierarchy of Improvement Actions

Exhibit 4.1 General Risk-Related Performance Measures

Exhibit 4.2 Examples of Performance Measures for Safety-Critical Tasks in Major Patient Care Functions

Exhibit 4.3 Form Used to Gather Observation Data to Measure Housekeeping Staff Compliance with Infection Control Practices

Exhibit 6.1 Impact Assumptions Used in OA of Blood Redraw Process

Exhibit 6.2 Conclusions and Recommendations for Blood Redraw OA

Exhibit 10.1 Checklist for Proactive Risk-Reduction Activities

Exhibit 11.1 Checklist for Auditing the Safety of the Physical Environment

Exhibit 11.2 Sources of Work Redesign and Patient Safety Improvement Ideas and Patient Engagement Materials

Exhibit 12.1 Mistake-Proofing Case Study in Perioperative Services

Exhibit 12.2 Kaizen Event Summary: Improving Timeliness of Inpatient Admissions from the Emergency Department

Exhibit 13.1 Information Technology Implementation Scenario

Exhibit 13.2 Information Technology Work-Around Scenario

Exhibit 14.1 Senior Leader Actions Necessary to Support Teamwork Implementation

Exhibit 15.1 Medication Error Case Study

FOREWORD

Lucian L. Leape

It is hard to believe that it was only 11 years ago that the Institute of Medicine shocked the world with the revelation that hundreds of thousands of hospitalized patients are injured by medical errors and as many as 98,000 die annually as a result. In that relatively brief period the patient safety “movement” has grown exponentially, so that now every hospital has a patient safety officer and some kind of a safety program. There are very few providers—nurses, doctors, pharmacists, or technicians—who have not been involved in at least one systems change designed to improve safety.

Yet, most of us feel woefully ill-equipped to take on the patient safety challenge. For one thing, it is so complicated. In the increasingly complex world of modern health care, there seems to be an infinite number of possibilities for things to go wrong. And, in response, there seems to be an infinite number of types of changes we are being called on to make. Where to start?

A good place to start is by getting a serious understanding not just of the extent of medical injury, but also of the theories of why people make mistakes and how you can prevent them. The next step is to learn what is available in terms of methods for understanding risk, analyzing mistakes, and designing care processes that make mistakes difficult to commit. Error Reduction in Health Care does just that. The first two chapters provide a succinct yet comprehensive description of what is known about the extent of medical errors and the current thinking about why people make mistakes and how to design systems to reduce risk. Then there are chapters on how to analyze accidents, how to use deductive methods to understand hazards before accidents occur, and how to prioritize risks in order to know where to focus efforts at systems redesign. The chapter on the application of human factors principles in designing systems changes is especially valuable, and the chapter on performance measurement provides perhaps the most comprehensive list of measures available—a veritable treasure trove that can be used to assess safety and, equally important, progress in improvement.

But there is more to safety than measuring risk and redesigning processes using human factors concepts. In health care, more than any other industry, the processes are people processes. Although there are important examples where automation is being used effectively, for most patients, most of the time, health care is the result of human interactions. James Reason pointed out years ago that the essence of a safe culture is to be found in the interrelationships of the caregivers—how we work together, or don’t. The power of this observation comes through in the several chapters in Error Reduction in Health Care. Implementing information technology, changing medication systems, creating “lean” systems, and, of course, functioning effectively in teams, all require that caregivers work well together. Creating a culture where that happens is proving to be the great challenge in patient safety. It requires knowledge, commitment, and leadership. But first it requires that you have a clear vision of where you want to go. Error Reduction in Health Care is a good place to start.

PREFACE

Medical accidents, near-miss situations, and recommendations for preventing these events are not new topics. For example, in 1886 Dr. Frank Hamilton wrote that “a few years ago a strong, healthy woman died in the dentist’s chair in New York City while under the influence of nitrous oxide.” Hamilton then goes on to describe how future events of this sort might be prevented, saying, “The danger to life would no doubt in these cases be diminished if the patient were in the recumbent position. Recent experiments and observations seem to have shown that the admixture of oxygen gas with the nitrous oxide in certain proportions averts the danger of asphyxia, while it does not diminish the anesthesia” (Hamilton, 1886, p. 946). In 1915 Gordon Christine, MD, wrote about problems related to ownership of patient records. According to Christine, “there is a widespread notion among nurses that bedside clinical records of a patient are the property of the attending nurse, and that they can therefore be rightfully removed by her from the home of the patient at the conclusion of her services” (Christine, 1915, p. 22). Christine relates an incident in which the nurse took a patient’s chart and refused to return it even though the continuity of care was being compromised. In two other instances Christine notes that records were removed “because the nurses wished to cover up some of their mistakes.”

Safe health care is recognized by the Institute of Medicine as one of the key dimensions of health care quality. In Crossing the Quality Chasm, safe health care is defined as “avoidance of unintended patient injuries” (IOM, 2001). Since publication of this report, much has been done to improve patient safety. We’ve learned a lot and have made progress toward achieving the safe health care goal, yet there’s still much more learning and work to be done.

Patient safety improvement is what this book is all about. In the pages that follow you’ll find out why errors occur at the front lines of patient care and what is needed to prevent these errors. Some of the fixes are fairly simple, such as using checklists to remind caregivers of required actions. Some are costly, such as implementing computerized order entry systems. Some challenge our traditions, such as breaking down professional silos. All of the fixes require systems thinking: problems must be viewed as parts of the overall system, and solutions must address the underlying causes.

The basics of patient safety are covered in Part One of the book. These chapters provide a foundation for further learning. In Chapter One, McClanahan, Goodwin, and Perlin present an overview of issues surrounding health care accidents. Using a real-life case study, the authors describe how our system of care actually fosters mistakes. And although errors are often attributed to the action of an individual, there is usually a set of external forces and preceding events that leads up to the error. Quality experts agree that the most common cause of performance problems is the system itself, not the individuals functioning within the system. Though a human error may have occurred, the root cause is likely to be found in the design of the system that permitted such an error to be made. The professionals who work together to provide patient care do not function in isolation. The activities of caregivers are influenced by multiple factors, including personal characteristics, attitudes, and qualifications; the composition of teams, organizational culture, and climate; physical resources; and the condition of the patient. These factors affect performance as well as influence decision making, task prioritization, and conflict resolution.

A fundamental understanding of the kinds of errors that health care professionals make can help us design better systems. In Chapter Two Ternov draws from work in the cognitive sciences and analyses of human performance to provide an in-depth review of the causes of medical mistakes. Ternov’s previous work as principal investigator of medical accidents for the Board of Health and Welfare in Sweden has offered a unique insight into the causes of mistakes. He describes several medical accidents and near-miss events and provides commentary as to why they occurred and ways to keep them from recurring.

Is health care reliable? What is the probability that a health care process will adequately perform its intended purpose? Are health care professionals mindful of the safety risks associated with patient care? These questions, and more, are addressed in Chapter Three. To further advance patient safety, senior leaders must create an environment where everyone is aware of the error potential in operations and safe behaviors and attitudes are rewarded. In addition, work processes must be designed to perform as expected a high proportion of the time. Dlugacz and Spath detail what must be done by health care organizations if they are to become highly reliable.

The chapters in Part Two of the book address measurement and evaluation of patient safety performance data. Accident investigators have found that most disasters in complex organizations had long incubation periods characterized by a number of discrete danger signals that were often overlooked or misinterpreted. This observation has important implications for health care organizations. Patient safety can be enhanced with the introduction of measures that continually evaluate risk-prone processes. By monitoring the performance of these processes, health care professionals can detect impending problems before an undesirable event occurs. Included in Chapter Four is advice on how to select the important tasks that should be regularly evaluated. The authors offer a scoring matrix that can be used to identify safety-critical patient care activities. More than 100 patient safety measures are included in this chapter.

Collecting measurement data is just the first step toward making improvements. The information gathered must be analyzed for danger signals needing further investigation. Chapter Five describes techniques for evaluating the results of patient safety measurement, starting with effective data display. Performance can be judged using statistical process control tools as well as comparison data from other health care organizations. In Chapter Six Latino illustrates how performance data can be used to select the process failures most in need of fixing. This involves gathering and analyzing data to make the business case for improvement projects.

In Part Three of the book you’ll find chapters describing how health care organizations use retrospective and prospective investigation techniques to uncover root causes and latent failures. Feldman began studying the causes of sentinel events in the early 1960s. In Chapter Seven, he and Roblin of Kaiser Permanente show how the private industry model of root cause analysis can be applied to untoward patient care events to find the underlying causes. You’ll learn how to identify the root causes of adverse events and fix the latent failures that contribute to medical accidents. In addition, Feldman and Roblin introduce anticipatory failure analysis—prospective risk assessment techniques designed to identify and resolve process failures before an accident occurs. It is not enough to wait for an accident to happen and then start improving patient care tasks. Better system reliability cannot be achieved by acting only on what is learned from yesterday’s mistakes—proactive patient safety initiatives should be used to keep ahead of the accidents.

In Chapter Eight Ternov describes methods used in Sweden to retrospectively analyze medical accidents and proactively study high-risk health care processes so that preventive measures can be taken before an adverse event occurs. A deductive analysis approach to analyzing adverse events is favored by Latino, author of Chapter Nine. Deductive analysis is an investigation technique that works from the general to the specific to determine how the system failed and why wrong decisions were made. He details the advantages of using a deductive analysis tool to identify failures and appropriate corrective actions.

Once the decision is made to improve the safety of a process, how should the work be redesigned? Much has been learned about improving the reliability and safety of health care processes in the past ten years and this learning is detailed in Part Four of the book. In Chapters Ten and Eleven readers learn how to design patient care processes to be more resistant to error occurrence and more accommodating of error consequences. When errors cannot be completely eliminated, then clinicians must learn how to quickly recognize the mistake and take appropriate actions to mitigate the consequences. Many of the techniques used to create more efficient health care processes also help to make them safer. In Chapter Twelve Lavallee shows how lean techniques borrowed from private industry can be used to reduce the likelihood of harmful mistakes during the delivery of health care services.

Targeted patient safety improvement recommendations are found in Part Five. An often-cited suggestion for improving patient safety is technology—tools that automate or mechanize clinical and administrative processes. In Chapter Thirteen, Slovensky and Menachemi describe how automation can reduce human errors in health care work processes, though not without challenges. The chapter also covers the current state of the art in information technology and patient safety, as well as recommendations for avoiding common automation-induced hazards.

The aviation industry has discovered that faulty teamwork among crew members is a frequent causal factor in airline accidents. Many scientists involved in improving airline crew performance are now applying the same concepts to health care teams. By adopting structured teamwork improvement strategies, caregivers are finding that medical accidents can be prevented. Tactics for enhancing teamwork and communication among health care professionals are included in Chapter Fourteen. In this chapter readers will find a checklist that can be used to identify the teamwork and communication problems that lead to an adverse patient event as well as a teamwork improvement action plan.

It is estimated that each year medication errors injure approximately 1.3 million people in the United States. In a study of fatal medication errors from 1993 to 1998, improper dosing of medicine accounted for 41% of fatal errors. Giving the wrong drug and using the wrong route of administration each accounted for 16% of the errors (Institute of Medicine, 2006). The important topic of how to reduce the incidence of medication errors is covered by Dlugacz in Chapter Fifteen.

Following publication in 2000 of the first edition of Error Reduction in Health Care, there was considerable national attention on the problem of patient safety, and this focus has not subsided. We hope that some of the progress we’ve made toward the goal of safe patient care can be attributed to what readers learned in the first edition. Eleven years later we know more about what works and what doesn’t work. We know errors cannot be eliminated by simply disciplining the people who make the mistakes. Quick fixes must give way to systems thinking and adoption of reliability principles that have improved safety in other complex, high-risk industries. We should seek to prevent errors, but also design systems that more readily catch and mitigate the effects of errors. Most important is maintaining an organizational culture of safety and reliability.

Patrice L. Spath, MA, RHIT

Health Care Quality Specialist

Forest Grove, Oregon

THE EDITOR

Patrice L. Spath, MA, RHIT, is a health information management professional with broad experience in health care quality and safety improvement. She is president of Brown-Spath & Associates (www.brownspath.com), a health care publishing and training company based in Forest Grove, Oregon. During the past twenty-five years, Patrice has presented more than 350 educational programs on health care quality and patient safety topics and has completed numerous quality program consultations for health care organizations.

Ms. Spath has authored and edited many books and peer-reviewed articles for Health Administration Press, AHA Press, Jossey-Bass, AHC Media LLC, Brown-Spath & Associates, and other groups. Her recent books include Engaging Patients as Safety Partners (AHA Press, 2008) and Leading Your Health Care Organization to Excellence (Health Administration Press, 2005) for which she received the James A. Hamilton Book of the Year Award. This award is given annually to the author of a management or health care book judged outstanding by the American College of Healthcare Executives’ Book of the Year Committee.

Ms. Spath is an adjunct assistant professor in the Department of Health Services Administration at the University of Alabama, Birmingham, where she teaches online quality management courses. She has also had teaching responsibilities in the health information technology program at Missouri Western State University in St. Joseph and the graduate health administration program at Montana State University in Billings.

Ms. Spath currently serves as consulting editor for Hospital Peer Review and is an active member of the advisory board for WebM&M (http://webmm.ahrq.gov), an online case-based journal and forum on patient safety and health care quality that is supported by a contract from the Agency for Healthcare Research and Quality.

This book is dedicated to everyone involved in improving the safety of health care services.

THE AUTHORS

Richard J. Croteau, MD, is a senior patient safety advisor at Joint Commission International with a principal focus on international patient safety activities in collaboration with the WHO’s World Alliance for Patient Safety. For the past twenty years, he has held several positions with The Joint Commission including executive director for patient safety initiatives and vice president for accreditation services. Earlier activities include chief of surgery at South County Hospital in Rhode Island and rocket systems analyst for NASA’s Lunar Module program, Project Apollo. He currently splits his time between international patient safety activities and woodworking.

Yosef D. Dlugacz, PhD, is the senior vice president and chief of clinical quality, education and research of the Krasnoff Quality Management Institute, a division of the North Shore-Long Island Jewish Health System, which was the recipient of the National Quality Forum’s 2010 National Quality Healthcare Award. Dr. Dlugacz was instrumental in designing the impressive and sophisticated quality management structure that integrates quality management methods into every level of care within the vast 15-hospital system. His latest book, Value Based Health Care: Linking Finance and Quality (Jossey-Bass, 2010), explores the relationship between quality care and organizational efficiency.

Sanford E. Feldman, MD, FACS, a general surgeon in private practice in San Francisco, was a pioneer in the area of medical peer review. He was a founder and first director of the San Francisco Medical Peer Review Organization and became director of the California Medical Review Organization. After retirement from surgery Dr. Feldman continued to work for the improvement of patient care. He was an expert in the assurance of quality care by hospitals and wrote about physician and hospital error for various medical journals. Dr. Feldman passed away in 2008 at the age of 93.

Karen Ferraco was a risk management consultant at ProNational Insurance Company in Okemos, Michigan, and at PHICO Insurance Company in Pennsylvania. She specialized in risk management with an emphasis on professional and hospital liability. Ms. Ferraco was frequently involved in training and educational presentations for medical professionals and medical staff and hospital boards. In addition to her strengths in analyzing high-risk exposures, she practiced as a registered nurse for 10 years. Ms. Ferraco passed away in 2002 at the age of 55.

Susan T. Goodwin, RN, MSN, CPHQ, FNAHQ, FACHE, is assistant vice president in the Clinical Services Group for the Hospital Corporation of America (HCA), currently concentrating on the development and redesign of credentialing, privileging, and peer review systems and processes for HCA affiliates. Ms. Goodwin is the 2011 president of the National Association of Healthcare Quality (NAHQ), and most recently served as chairman and NAHQ representative for the Joint Commission’s Professional and Technical Advisory Committee for the Hospital Accreditation Program.

Robert J. Latino is CEO of Reliability Center, Inc. (RCI) (www.reliability.com). RCI is a reliability consulting firm specializing in improving equipment, process, and human reliability. He has facilitated root cause analyses and failure mode and effect analyses with his clientele in more than 20 countries for 25 years and has taught more than 10,000 students in the PROACT® methodology. Mr. Latino is the author of Root Cause Analysis: Improving Performance for Bottom Line Results (2006, Taylor & Francis) and Patient Safety: The PROACT Root Cause Analysis Approach (2008, Taylor & Francis) as well as coauthor of several other patient safety articles and publications.

Danielle Lavallee, PharmD, PhD, is the senior health care consultant for Lean Hospitals, LLC (www.leanhospitals.org). Her work and research focus on improving the efficiency of health care practices through the incorporation of lean principles, and she is a Lean Hospitals Certified Six Sigma Black Belt. Dr. Lavallee holds degrees of doctor of pharmacy from the University of Kansas and doctor of philosophy from the Department of Pharmaceutical Health Services Research at the University of Maryland, School of Pharmacy.

Susan McClanahan, RN, BSN, is a director in the Clinical Services Group for Hospital Corporation of America (HCA). She joined the company in 1995. Through her work providing consultative and educational assistance to each of the HCA affiliates, she promotes foundational quality of care and compliance with accreditation, legislative, and regulatory standards. A graduate of the University of Tennessee at Knoxville, Ms. McClanahan specialized in emergency department nursing for eight years. After obtaining her certification as a paralegal in 1990, she worked as a claims investigator in the risk management department at Vanderbilt University Medical Center.

Nir Menachemi, PhD, MPH, is an associate professor of health care organization and policy at the University of Alabama at Birmingham (UAB). His research focuses on health information technologies, patient safety, medical malpractice, and health care quality issues. At UAB, Dr. Menachemi teaches courses in strategic management and health information management and directs the doctoral program in public health.

John C. Morey, PhD, CHFP, is a board-certified human factors professional and the senior research psychologist at Dynamics Research Corporation. He has more than thirty years of experience in training evaluation and conducts research in teamwork training programs for aviation, combat systems, and health care. Dr. Morey was a member of the original development team for the MedTeams® project, a joint civilian and military program to transition lessons learned from aviation crew resource management to health care. He holds a doctorate in experimental psychology from the University of Georgia.

Jonathan B. Perlin, MD, PhD, MSHA, FACP, FACMI, is president of clinical services and chief medical officer for Hospital Corporation of America (HCA). He holds adjunct appointments as professor of medicine and biomedical informatics at Vanderbilt University and professor of health administration at Virginia Commonwealth University. Dr. Perlin previously served as the chief executive officer of the Veterans Health Administration, where his work helped propel the nation’s largest integrated health system to international recognition for achievements in quality, safety, and use of electronic health records.

Matthew M. Rice, MD, JD, FACEP, is an ABEM-certified practicing emergency medicine physician who works clinically at more than 26 hospitals in the Pacific Northwest. Dr. Rice spent 22 years in the U.S. Army in various positions, retiring as a colonel in 2000 from Madigan Army Medical Center, where he was department chair and program director of the Emergency Medicine Residency. He is a senior member of the American College of Emergency Physicians (ACEP), the national organization representing emergency medicine, and he chairs and serves on various committees, including the ACEP standards of care committee. He is on the board of directors of the National Patient Safety Foundation. He has special professional interests in legal medicine, quality of medicine, local and national politics, and future issues in health care.

Douglas W. Roblin, PhD, is a senior research scientist with the Center for Health Research/Southeast at Kaiser Permanente Georgia (KPG) and adjunct assistant professor of health policy and management at the Rollins School of Public Health at Emory University. He is a social anthropologist with research interests in chronic disease management, health care access, social epidemiology, and the association of patient-level outcomes with organizational and financial characteristics of delivery systems. Dr. Roblin has more than twenty years of experience with management and analysis of Kaiser Permanente (KP) databases and has directed a number of surveys of KP enrollees. He is KPG’s site principal investigator for three large multisite research consortia of the HMO Research Network: the HMO Center for Research and Education on Therapeutics, the HMO Cancer Research Network (CRN), and the HMO Cancer Communication Research Center.

Daniel T. Risser, PhD, is an experimental social psychologist with thirty years of research experience examining teams and organizations. He is a senior scientist in the Concepts and Analysis Division of VT Aepco in Gaithersburg, Maryland. He has been a member of the American Society for Healthcare Risk Management (ASHRM), a member of the ASHRM Claim Data Gathering Task Force, and an assistant professor of organizational behavior in a business school. Dr. Risser was the lead researcher examining clinical errors in a large, 10-hospital project (the MedTeams Project) designed to improve teamwork and reduce errors in emergency departments. He has also conducted teamwork research on obstetrical delivery units and Army helicopter aircrews, conducted research on human links in weapon systems design, military command and control systems, and unmanned aerial systems (UAS), and helped to develop Human Systems Integration (HSI) policy for weapon design for the Office of Secretary of Defense.

Mary Salisbury, RN, MSN, is a nurse with forty years of continuous service in operative, critical care, and emergency medicine and a current focus on safety research. A member of the original MedTeams® research team working to translate the principles of crew resource management into health care, Ms. Salisbury remains a participating author and designer of the TeamSTEPPS™ training and evaluation methodologies. As founder and president of the Cedar Institute, Inc., Ms. Salisbury helps both military and civilian health care organizations translate research into practice—leader and team training, implementation, sustainment, and performance coaching—and focuses her work on the facilitating elements of team-driven safety, including simulation, evaluation, and managing disruptive behaviors.

Paul M. Schyve, MD, is the senior vice president of The Joint Commission. Before joining the Commission in 1986, he held a variety of professional and academic appointments in the areas of mental health and hospital and health system administration. A Distinguished Life Fellow of the American Psychiatric Association, he has served on numerous advisory panels for the Centers for Medicare and Medicaid Services, the Agency for Healthcare Research and Quality, and the Institute of Medicine, and published in the areas of psychiatric treatment and research, psychopharmacology, quality assurance, continuous quality improvement, health care accreditation, patient safety, health care ethics, and cultural and linguistic competence.

Robert Simon, PhD, has a doctorate in education from the University of Massachusetts. He is a human factors specialist and educator with more than thirty years’ experience. For the last twenty years he has specialized in research, development, and training for high-performance, high-stress teams in aviation and medicine. He was the principal investigator for the U.S. Army’s Aircrew Coordination Training Program, the U.S. Air Force’s Crew Resource Management Program, and the MedTeams® program. He joined the Center for Medical Simulation in Cambridge, Massachusetts, as education director in December 2002. He is the director of the Institute for Medical Simulation, a jointly sponsored endeavor of the Center for Medical Simulation and the Harvard-MIT Division of Health Science and Technology intended to foster high-quality simulation-based health care education.

Donna J. Slovensky, PhD, RHIA, FAHIMA, is a professor and associate dean for Academic and Student Affairs in the School of Health Professions at the University of Alabama at Birmingham (UAB). Her teaching and scholarship interests include health information management, strategic management, and health care quality issues.

Sven Ternov, MD, is a PhD student at Lund Institute of Technology, studying risk management in complex socio-technical systems. He is a consultant for the Swedish air navigation services provider and a consultant for the Danish Board of Health. He previously served as an inspector for the Swedish Board of Health and Welfare at the regional unit in Malmoe.

Part 1: THE BASICS OF PATIENT SAFETY

CHAPTER 1

A FORMULA FOR ERRORS: GOOD PEOPLE + BAD SYSTEMS

Susan McClanahan, Susan T. Goodwin and Jonathan B. Perlin

LEARNING OBJECTIVES

Understand the prevalence of health care–associated errors and error consequences

Describe the concepts of latent failures and human factors analysis

Demonstrate how to apply mistake-proofing techniques to reduce the probability of errors

Discuss the role of leaders in supporting patient safety initiatives

Since the first edition of this book was published in 2000, there has been ongoing news media coverage of medical misadventures and increasing evidence of quality, safety, and efficiency gaps; patient safety has thus continued to be a growing concern for the public, policymakers, and everyone involved in the delivery of health care services. Although the standard of medical practice is perfection (error-free patient care), most health care professionals recognize that some mistakes are inevitable.

In this book, readers discover how to examine medical mistakes and learn from them. This first chapter sets the stage for this learning by providing a general overview of the causes of medical mistakes and what can be done to eliminate or reduce the occurrence of such errors. The chapter starts with a description of a case involving surgery on the wrong patient. The case scenario is extrapolated from actual events, although the details of the case have been materially altered, including the use of fictitious names, to protect patient privacy and confidentiality.

Surgery on Wrong Patient

Mr. Murphy slipped on a wet floor in the locker room of the clubhouse at his favorite golf course. He fell heavily on his right hip and was in pain when he arrived by ambulance at the hospital’s emergency department (ED). While Murphy was being examined, Mr. Jenkins was being admitted to the same ED. Jenkins was a resident of a local long-term care facility and he had also fallen on his right side that morning.

In addition to caring for Murphy and Jenkins, the ED staff members were very busy with other patients. As was typical when the department was crowded, the admissions registrar was behind in getting patients fully registered and putting identification bands on each patient. The registrar’s time was also occupied by other duties. To prevent delays in patient care and to maintain patient flow in an already overcrowded ED, the physicians typically ordered needed diagnostic tests and pain medication in advance of conducting a physical examination of a patient. Staff members providing care relied on their memory of each patient’s name and on verbal verification from the patient, but this was not done consistently. Mr. Jenkins, who had no attendant or family members with him, was not coherent enough to speak for himself, and only his transfer documents accompanied him from the long-term care facility. Orders for right hip radiographs for both Murphy and Jenkins were entered into the computer by the nursing staff.

Murphy was transported to the radiology department first. A requisition for a radiograph of the right hip was printed out in the radiology department; however, his medical record did not accompany him. The radiology technologist took the requisition from the printer and, noting that it was for a right hip radiograph, verbally confirmed with Murphy that he was hurting in his right hip and was there for a hip radiograph. The technologist did not identify the patient using two patient identifiers (which for this department in this facility were name and date of birth). Unfortunately, the radiograph requisition was for Jenkins and it was Jenkins’ name that was placed on Murphy’s radiographs.

While radiographs were being taken of Murphy’s hip, Jenkins was transported to the radiology department. A technologist who had just come back from her lunch break took the Murphy requisition from the department’s printer and confirmed with the transporter that the patient on the stretcher was there for a right hip radiograph. She proceeded to perform the diagnostic study. The technologist did not know that there was another patient in the department for the same study, and she assumed she had the right requisition for the right patient (essentially repeating the error of the first technologist). Murphy’s name was then placed on Jenkins’ radiographs.

After both patients were transported back to the ED, the radiologist called the ED physician to report that the radiographs labeled with Murphy’s name indicated a fracture. The radiographs labeled with Jenkins’ name were negative for a fracture. Because metabolic diagnostic studies done on Jenkins indicated other medical problems, he was admitted to the hospital. Murphy was also admitted with a diagnosis of “fractured right hip.” The radiologist had not been given any clinical information related to either patient. If he had, he may have noted that one of Murphy’s diagnoses was obesity and his radiographs showed very little soft tissue. Jenkins, however, was very frail and thin and his radiographs showed a large amount of soft tissue.

Having been diagnosed with a fractured hip, Murphy was referred to an orthopedist. The orthopedist employed a physician assistant (PA) who performed a preoperative history and physical examination, noting in the medical record that there was shortening and internal rotation of the right leg. The orthopedic surgeon did not personally confirm these findings prior to authenticating the history and physical examination, even though he had had to admonish the PA in the past for doing less than thorough exams. The orthopedic surgeon had not communicated the performance issues related to the PA to anyone at the hospital. Likewise, the hospital’s quality management department did not collect or report performance measurement data or conduct ongoing professional practice evaluations for any allied health professionals.

Surgery for Murphy was scheduled for the next day. Meanwhile, Jenkins continued to complain of severe pain in his right hip and refused to bear weight on that side. A repeat radiograph of his right hip was performed late that evening. The radiologist read the radiograph the next morning and a fracture was noted. Although the staff recognized the discrepancy in diagnoses between the first and second radiographs, no immediate investigation of the reason for this was done. The case was merely flagged for retrospective peer review.

Although Murphy’s diagnostic images were digitally available through the Picture Archiving and Communication System (PACS) at this facility, they were not appropriately displayed in the operating room in accordance with the hospital policy addressing the Universal Protocol and procedures for avoiding surgical errors involving the wrong patient, wrong site, or wrong procedure. Once again, the discrepancy between the patient’s physique and the soft tissue evident in the radiographs was not detected. Surgery proceeded until after the incision was made and the surgeon found no fracture. While waiting for the patient to recover from anesthesia, the surgeon made a quick call to the hospital risk manager to discuss how he should deliver the news of the unnecessary surgery to Murphy and his family.

Prevalence of Incidents

Fortunately, incidents like the one described in the case scenario are not everyday occurrences, but they happen more often than they should. As of March 31, 2010, wrong site/wrong patient surgery remained the most prevalent sentinel event reported to The Joint Commission (TJC), constituting 13.4% of the 6,782 sentinel events reviewed by TJC since 1995 (The Joint Commission, 2010).

How often do incidents involving patient harm actually occur? A study prepared by Healthgrades (2008) estimates that patient safety incidents resulted in 238,337 potentially preventable deaths during 2004 through 2006. It is estimated that each year 100,000 patients die of health care–associated infections (Klevens et al., 2002). Medication errors are among the most common medical errors, harming at least 1.5 million people every year (Institute of Medicine, 2006). Although the exact number of injurious patient incidents is not clearly known, what we do know is that medical errors can have serious consequences and may result in patient death, disability, or other physical or psychological harm, additional or prolonged treatment, and increased public dissatisfaction with the health care system. Health care can be made safer and making it safer is a national imperative.

Incident Contributors

The causes of wrong site/wrong patient surgery generally involve more than one factor and the case described at the start of the chapter illustrates some of the common causes: incomplete patient assessment, staffing issues, unavailability of pertinent information in the operating room, and organizational cultural issues.

Mr. Murphy was the unlucky victim of less than ideal circumstances that led to a series of human errors that were not caught and corrected. Emergency department staff members were busy caring for patients, which is not surprising given that annual ED visits throughout the United States increased by 31% between 1995 and 2005 (Nawar, Niska, & Xu, 2007). High patient loads frequently caused overcrowding in this facility’s ED (a contributing factor to this case, related in part to staffing challenges). Staff did not follow procedures for properly identifying patients and surgical site verification (an organizational cultural factor). The radiologist had not been given any clinical information related to either patient (a contributing factor related to incomplete patient assessment). Conflicting diagnostic test findings did not arouse curiosity and were not investigated immediately. The PA who performed a preoperative history and physical examination noted in the medical record that there was shortening and internal rotation of the right leg; however, the orthopedic surgeon did not personally confirm these findings prior to authenticating the history and physical examination (resulting in an incomplete patient assessment).

Although Mr. Murphy’s radiographs were available for viewing electronically, they were not appropriately displayed in the operating room (a factor related to availability of pertinent information in the operating room). The end result, as James Reason observed, is that the greatest risk of accident in a complex system such as health care is “not so much from the breakdown of a major component or from isolated operator errors, as from the insidious accumulation of delayed human errors” (1990, p. 476). In this instance, each contributing factor or cultural issue—none of which alone would necessarily have led to the untoward outcome—aligned like the holes in Reason’s famous Swiss cheese model, allowing the accident trajectory to penetrate each potential barrier (Reason, 2000).

Why Mistakes Occur

Mistakes are unintended human acts (either of omission or commission) or acts that do not achieve their intended goal. No one likes to make mistakes, but everyone is quick to point them out. In the minds of society and medical professionals alike, health care mistakes are unacceptable. Why are health care professionals so quick to find fault and place blame? Psychologists call it “the illusion of free will.” “People, especially in Western cultures, place great value in the belief that they are free agents, the captains of their own fate” (Reason, 1997). Because people are seen as free agents, their actions are viewed as voluntary and within their control. Therefore, medical mistakes have traditionally been blamed on clinicians who were characterized as careless, incompetent, or thoughtless.

However, because human action is always limited by local circumstances and the environment of action, free will is an illusion, not a certainty (Reason, 1997). Investigations of incidents such as the Three Mile Island and the Challenger disasters indicate that “accidents are generally the outcome of a chain of events set in motion by faulty system design that either induces errors or makes them difficult to detect” (Leape et al., 1995). Mr. Murphy’s unnecessary surgery illustrates the relationship between human errors and faulty systems. Several erroneous decisions and actions occurred that had an immediate impact on the chain of events. These types of errors, known as active failures, are often conspicuous and recognized as slips, mistakes, and violations of rules or accepted standards of practice. Active errors are usually committed by the persons who appeared to be in control of the system at the time the incident evolved. Examples of active errors that led to Mr. Murphy’s unnecessary surgery are summarized in Figure 1.1.

FIGURE 1.1 Active Errors Leading to Mr. Murphy’s Unnecessary Surgery

Errors by the “frontline operators” created the local immediate conditions that allowed the latent failures in the system to become manifest. Latent failures are contributory factors in the system that may lie dormant for a long time (days, weeks, or months), having only a delayed impact on the function of the system until they finally contribute to an incident (Reason, 1997). Many times these latent failures are recognized only after an incident occurs. Listed below are some of the latent failures that created conditions which made possible the occurrence of an unnecessary surgery:

Staffing for the admissions registration area was not adequate for the volume of patients experienced during the busier times in the ED. There was no contingency plan to increase staffing during these times. Instead, the staff prioritized their workload and improperly categorized patient registration and placement of ID bands as tasks that could wait. There were no policies and procedures to guide staff in what to do in a busy situation, nor was there a “safety culture” that facilitated identifying the environment as potentially unsafe and encouraged resolution of concerns.

The facility’s policy regarding patient identification did not address safety measures to be taken in the event that the patient was uncommunicative or disoriented and therefore unable to verbally confirm his or her identity.

There was a lack of standardized “hand-off” communication of important information. Patient identification was not appropriately communicated between caregivers.

The quality management activities of the hospital did not cover an entire category of care providers. There were no performance measurement data and no systematic ongoing professional practice evaluation for allied health professionals, in this case the PA. Traditionally, the quality management activities of the hospital most frequently resulted in peer review letters of sanction, and fear of this had prevented the orthopedic surgeon from communicating performance information about the PA for whom he was responsible. The surgeon also did not provide adequate supervision of the PA.

Combination of Factors

As shown by the accident scenario, adverse patient incidents rarely result from a single mistake. System safeguards and the abilities of caregivers to identify and correct errors before an accident occurs make single-error accidents highly unlikely. Rather, accidents typically result from a combination of latent failures, active errors, and breach of defenses (Leape, 1994). System defenses, often called barriers, function to protect potential victims and assets from potential hazards. Defenses include engineered mechanisms (for example: alarms, physical barriers, automatic shutdowns), people (surgeons, anesthesiologists, nurses), and procedural or administrative controls (time-out procedures, patient identification verifications). The breach of a defense occurs when latent failures and active errors momentarily line up to permit a trajectory of accident opportunity, bringing hazards into contact with victims, as demonstrated by James Reason’s Swiss cheese model (2000).

Evidence from a large number of accident inquiries indicates that bad events are more often the result of error-prone situations and error-prone activities than they are of error-prone people (Reason, 2004). The balance of scientific opinion clearly favors system improvements rather than individual discipline as the desired error management approach for the following reasons:

Human fallibility can be moderated to a point, but it can never be eliminated entirely. It is a fixed part of the human condition, partly because in many contexts it serves a useful function (for example, trial-and-error learning in knowledge-based situations).

Different types of errors have different psychological mechanisms, occur in different parts of the organization, and require different methods of management.

Safety-critical errors happen at all levels of the system; they are not just made by those directly involved in patient care.

Corrective actions involving sanctions, threats, fear, appeals, and the like have only limited effectiveness, and in many cases these actions can harm morale, self-respect, and a sense of justice.

Errors are the product of a chain of causes in which the precipitating psychological factors—momentary inattention, misjudgment, forgetfulness, preoccupation—are often the last and least manageable links in the causal chain.

Health safety researchers have come to realize that individuals are not the primary cause of occasional sporadic accidents. Individuals can, however, be dynamic agents of patient safety by identifying and eliminating factors that undermine people’s ability to do their jobs successfully (Smith, Boult, Woods, & Johnson, 2010). In the next section readers are introduced to the science of human factors analysis and what health care organizations can learn from the error-reduction efforts in other complex, yet highly reliable, safe industries.

How to Error-Proof Processes

Systems that rely on error-free human performance are destined to fail. Traditionally, however, individuals have been expected to not make errors. The time has come for health care professionals to universally acknowledge that mistakes happen and to aim improvement activities at the underlying system failures rather than at the people who, though predominantly well intentioned, are working in systems that are not robust in protecting against mistakes or critically harmful outcomes. For example, if a nurse gives the wrong medication to a patient, typically two things occur. First, an incident report is completed and sent to the nurse’s department manager and risk management. Next, the nurse is “counseled” by management to pay closer attention next time. She is possibly told to read educational materials on the type of medication that was given in error. She may be warned that a second incident will result in a letter of reprimand being placed in her personnel file.

These individual-focused actions, however, will not fix the latent failures (for example: look-alike or sound-alike medication names, confusing product packaging, similar patient names) that continue to smolder behind the scenes and will invariably manifest themselves when another medication error is made by a different nurse. There may be the rare case of purposeful malevolence, malfeasance, or negligence, which is appropriately dealt with by sanction, but it is inappropriate to react with disciplinary actions for every error.

Human Factors Engineering

The discipline of human factors engineering (HFE) has been dealing with the causes and effects of human error since the 1940s. Originally applied to the design of military aircraft cockpits, HFE has since been effectively applied to the problem of human error in nuclear power plants, NASA spacecraft, and computer software (Welch, 1997). The science of HFE has more recently been applied to health care systems to identify the causes of significant errors and develop ways to eliminate or ameliorate them. Two particular concepts from the science of HFE have been introduced to health care systems to proactively improve safety. One is the use of a risk assessment technique—failure mode and effect analysis—to anticipate failures that may occur in high-risk processes. The process is then redesigned to reduce the severity and frequency of failures (Burgmeier, 2002). A second very promising proactive concept is the identification and examination of close call events (where a mistake almost reached a patient but was caught just in time). Information derived from close call events provides an understanding of latent failures that need to be resolved to prevent an actual harmful event from occurring (Cohoon, 2003).
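The scoring step of a failure mode and effect analysis can be sketched in a few lines of code. In the sketch below, the failure modes and their 1–10 ratings are invented examples loosely based on the chapter’s case, and the severity × occurrence × detection “risk priority number” is one common FMEA scoring convention rather than anything prescribed by the chapter:

```python
# Illustrative FMEA scoring step; failure modes and ratings are hypothetical.
from dataclasses import dataclass

@dataclass
class FailureMode:
    process_step: str
    failure: str
    severity: int    # 1 (minor) to 10 (catastrophic)
    occurrence: int  # 1 (rare) to 10 (frequent)
    detection: int   # 1 (almost always caught) to 10 (rarely caught)

    @property
    def rpn(self) -> int:
        """Risk priority number: the highest-scoring modes are redesigned first."""
        return self.severity * self.occurrence * self.detection

modes = [
    FailureMode("Registration", "ID band not applied before testing", 9, 6, 7),
    FailureMode("Radiology", "Requisition matched to wrong patient", 9, 4, 8),
    FailureMode("Pre-op exam", "H&P findings not confirmed by surgeon", 7, 5, 6),
]

# Rank failure modes so that redesign effort targets the highest risks first.
for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
    print(f"RPN {m.rpn:3d}  {m.process_step}: {m.failure}")
```

The point of the ranking is prioritization: an FMEA team scores every step of a high-risk process this way and then redesigns the steps with the highest risk priority numbers before an actual failure occurs.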

By adopting the error-reduction strategies that have been successfully applied in other industries, many health care delivery systems can be redesigned to significantly lessen the likelihood of errors. Some of the tactics that have been summarized in health care literature are illustrated in Figure 1.2 and described in the following paragraphs (Leape, 1994; Cook & Woods, 1994; Grout, 2007; Clancy, 2007; Zwicker & Fulmer, 2008).

Reduce reliance on memory. Work should be designed to minimize the need for human tasks that are known to be particularly fallible, such as short-term memory and vigilance (prolonged attention). Checklists, protocols, and computerized decision aids are examples of tools that can be incorporated into health care processes to reduce mistakes. In a recent study related to clinical information technologies and patient outcomes, researchers found that hospitals with automated notes and records, order entry, and clinical decision support had fewer complications, lower mortality rates, and lower costs (Amarasingham, Plantinga, Diener-West, Gaskin, & Powe, 2009).

FIGURE 1.2 Error-Reduction Strategies

Improve information access. Creative ways must be developed to make information more readily available to caregivers. Information must be displayed where it is needed, when it is needed, and in a form that permits easy access by those who need it. For example, placing printed resuscitation protocols on “crash carts” gives caregivers a ready reference during cardiopulmonary resuscitation.

Error-proof processes. Where possible, critical tasks should be structured so that errors cannot be made. The use of forcing functions is helpful. For example, computerized systems can be designed to prevent entry of an order for a lethal drug dose or to require weight-based dosing calculations for pediatric patients.
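The defining property of a forcing function is that the unsafe path is blocked outright, not merely flagged with a warning the user can dismiss. A minimal sketch of the weight-based pediatric dosing check described above; the drug name, per-kilogram limit, and function names are hypothetical placeholders, not any real system's rules:

```python
# Sketch of a forcing function in an order-entry system: a pediatric order
# cannot proceed without a weight-based dose check. The drug name and the
# per-kg maximum below are hypothetical placeholders.

MAX_DOSE_MG_PER_KG = {"exampledrug": 15.0}  # hypothetical dosing limit

def validate_pediatric_order(drug, dose_mg, weight_kg):
    """Return True only if the order passes the weight-based check."""
    if weight_kg is None or weight_kg <= 0:
        # Forcing function: the order is blocked, not merely warned about.
        raise ValueError("Patient weight required before ordering")
    limit = MAX_DOSE_MG_PER_KG.get(drug.lower())
    if limit is None:
        raise ValueError(f"No dosing limit on file for {drug}")
    return dose_mg <= limit * weight_kg

# A 120 mg order for a 10 kg patient is within the 150 mg ceiling.
print(validate_pediatric_order("ExampleDrug", 120.0, 10.0))  # True
```

An order with no recorded weight never reaches the dose comparison at all, which is exactly the behavior that distinguishes a forcing function from a reminder.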

Standardize tasks. An effective means of reducing error is to standardize processes wherever possible. If a task is done the same way every time—by everyone—there is less chance for error.

Reduce the number of hand-offs. Many errors come from slips in the transfer of materials, information, people, instructions, or supplies. Processes with fewer hand-offs reduce the chances for such mistakes.

The system and task redesigns suggested here could serve as the basis for improving processes that led to the unnecessary surgery described at the beginning of this chapter. The following specific corrective actions would likely be effective in decreasing the possibility of future adverse patient occurrences caused by latent failures in the system that cared for patients Murphy and Jenkins:

Reduce reliance on memory. In reverting to alternative procedures when patients were not wearing identification bands, the staff needed to remember to ask patients their identity. Strictly applied protocols for patient treatment and diagnostic testing would incorporate the step of checking two patient identifiers and would not allow informal variations from this requirement.

Improve information access. The case illustrates many gaps in information communication (for example, patient identity, clinical information, and practitioner performance data). Health information technologies that permit all appropriate practitioners to access clinical information might have helped the radiologist identify the error. Also needed are methods for collecting and trending practitioner performance data in ways that foster a culture of improvement and safety rather than the punitive culture generally associated with the peer review process.

Error-proof processes. Systems have been created that force the critical task of verifying patient identification before care can proceed. For example, requiring patient identifier information to be entered before the picture archiving and communication system (PACS) allows the radiology technologist to proceed with a diagnostic imaging study makes the process more error-proof. A point-of-care bar-coding system that matches the identifying information in the system to the bar code on a patient’s ID band would also greatly reduce mistakes.
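The bar-code match described above is, at its core, a hard equality check between the scanned wristband identifier and the record open in the system. A minimal sketch under that assumption; the identifier format and record fields are illustrative, not any vendor's schema:

```python
# Sketch of point-of-care bar-code verification: the scanned wristband ID
# must match the patient record selected in the system before the workflow
# may proceed. Identifier formats and record fields are illustrative.

def verify_patient(scanned_band_id, selected_record):
    """Block the workflow unless the wristband matches the open record."""
    if scanned_band_id != selected_record["patient_id"]:
        # Error-proofing: a mismatch halts care rather than logging a note.
        raise RuntimeError(
            "ID mismatch: wristband %s vs record %s"
            % (scanned_band_id, selected_record["patient_id"])
        )
    return True

record = {"patient_id": "MRN-001", "name": "Murphy"}
print(verify_patient("MRN-001", record))  # True: care may proceed
```

As with the order-entry forcing function, the safety comes from making the mismatch stop the process, so that a busy technologist cannot proceed on memory or assumption.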

Standardize tasks. Safety-critical tasks should be standardized and processes created to ensure that all steps are followed. One example is the use of a standardized checklist to ensure consistency and compliance with all measures of the Universal Protocol developed by The Joint Commission (TJC) to prevent surgery on the wrong patient (The Joint Commission, 2009). Another is the Surgical Safety Checklist developed by the World Health Organization (WHO), which helps ensure that OR teams consistently follow critical safety steps in the surgical process, with the goal of minimizing the most common and avoidable risks that endanger surgical patients. Pilot testing of the WHO Surgical Safety Checklist in eight hospitals demonstrated that when the checklist was used, the rate of death decreased from 1.5% to 0.8% and the rate of complications from 11% to 7% (World Health Organization, 2008).

Reduce the number of hand-offs.