Source:

AJN, American Journal of Nursing

May 2004, Volume 104, Number 5, pp. 81-84

Author

  • Ronda G. Hughes PhD, MHS, RN

Outline

  • A CULTURE OF SAFETY

  • ONLY HUMAN

  • TRAINING THROUGH SIMULATIONS

  • PREVENTING AND MITIGATING ERRORS

  • REFERENCES





The odds are greater that a person will be injured or die as a result of medical error than as a consequence of driving or flying, according to Lucian Leape, MD, speaking at the Agency for Healthcare Research and Quality's Second Annual Patient Safety Research Conference in March 2003. While errors leading to injuries have become rare in other complex and high-risk industries, they remain pervasive in health care.[1] Clinicians can adopt lessons learned from other industries, such as aviation, to reduce threats to patient safety.

A CULTURE OF SAFETY

It's now understood that errors that harm employees and consumers are generally the result of failures in two categories: active (system) failures, which stem from characteristics of communication systems, job design, or equipment; and latent (organizational) failures, which often arise from decisions made by people with indirect responsibilities.[2] Researchers have found that, with few exceptions, both accidents and recurrent, "everyday" errors result from a series of human decisions associated with active and latent failures. Although active failures tend to have immediate consequences and may be more obvious, it's often latent failures that ultimately undermine efforts to ensure safety.

To address error in the aviation industry, for example, changes were made in systems and procedures. Yet the extent and causes of many errors remained hidden, in part because individuals who were involved in or had observed an accident or an unsafe situation were still not being rewarded for reporting it or punished for failing to do so.[3] With the establishment of the Aviation Safety Reporting System (ASRS) in 1975, reports of accidents and potential hazards could be made voluntarily and anonymously; the ASRS also guaranteed that it would not "use [such] information against reporters in enforcement actions."[3] Rather than assigning blame, the industry acknowledged that everyone makes mistakes and focused instead on understanding root causes. This new "culture of safety" has been credited with making aviation accidents rare.[4]

Errors in health care range from unnecessary procedures or tests to treatment complications that prolong a hospital stay to patient injury and even death.[5,6] One study found that between 44,000 and 98,000 people die in U.S. hospitals annually as a result of medical errors,[5,6] a majority of which are avoidable. One formidable barrier to improving patient safety is the fear of punishment (such as being sued for malpractice or losing one's job), which inhibits people from acknowledging, reporting, and discussing their errors. If the lessons learned in other industries apply to health care, the best way to prevent errors and "near misses" is to monitor continually for threats to patient safety and to recognize that individual errors reflect organizational and system failures. The question that should be asked then becomes not "Who?" but "Why?"[7]

ONLY HUMAN

When human factors such as lack of experience, skill, or motivation inhibit a person's ability to perform well, errors and near misses become inevitable.[2] The impact of these factors is magnified when the person is fatigued, stressed, or distracted, and all three tend to worsen during a crisis. Safety researchers have applied human factors engineering (which studies the mental and physical abilities and limitations of humans and uses this knowledge to guide the design of organizational systems), industrial engineering, and other disciplines to improve safety in many non-health care industries, such as transportation.

Because human factors will always be involved, many non-health care industries have focused on identifying processes that could be redesigned and, in some cases, automated. Cross-industry analyses of accidents and errors have found that although redesign and automation have decreased the incidence of errors, they haven't eliminated them: some error remains inevitable as long as humans perform any of the tasks.[2] Many tasks require vigilance to be performed safely, but vigilance is continually compromised by interruptions, distractions, and fatigue. The less a process is automated, the greater the reliance on human vigilance and the greater the risk of error. Indeed, one study of crew error in the aviation industry found that lapses in vigilance caused almost half of the known errors and accidents.[8]

In health care also, and specifically in nursing, safety can be improved by lessening the effects of human factors. There are three essential ways to achieve this. First, staff fatigue and stress must be minimized: a nurse should work no more than 12 hours per day (and no more than 60 hours in any seven-day period) and should carry a patient load small enough to ensure that she can meet each patient's needs.[9,10] Second, staff vigilance against potential threats to patient safety must be supported; for example, work schedules should be arranged so that nobody works longer than the recommended daily and weekly hours.[10] Third, what is known must be incorporated into practice now, and doing so effectively may require system redesign. Many health care delivery processes, especially those vulnerable to human error (such as medication administration, where the majority of all medical errors occur[11]), need to be redesigned with human limitations in mind, with the goal of reducing factors such as fatigue, stress, and distractions.

TRAINING THROUGH SIMULATIONS

The aviation industry has recognized that many errors can be averted through better and recurrent training, with the goal of improving human performance in stressful situations. Courses in areas such as flight simulation and crew resource management were developed to minimize the risk of human error associated with nonautomated operations. (Examples of such errors include acting upon missing or incorrect information or a misinterpreted communication.)

Simulations brought flight crews together to promote teamwork and gave all crew members (not just pilots) opportunities to learn how to prevent errors. Teamwork, including the importance of communication among all involved, was emphasized. Changes to this end included setting aside the command hierarchy to allow any person, regardless of rank, to address any critical situation. Although it was costly to train flight crews using highly realistic simulations (real passengers may be all that's missing), the expense was largely offset by savings associated with significantly lower error and accident rates.

Like the members of a flight crew, health care professionals are generally highly trained, skilled, dedicated people working together in complex systems. Yet knowledge and technology, as well as the responsibilities of clinicians, are constantly changing. Thus, clinicians' efforts to improve their knowledge and skills and to foster a work environment conducive to learning must also be ongoing. Simulations, drills, and mock codes are integral to these efforts. And regardless of clinical setting, the costs associated with medical errors that result in legal charges of negligence or malpractice outweigh the costs of conducting simulations and training programs.

PREVENTING AND MITIGATING ERRORS

The roles and responsibilities of nurses put them in what the Joint Commission on Accreditation of Healthcare Organizations has called the "front-line position" of a patient's defense against medical errors.[12] For example, one 1995 study found that nurses intercepted 86% of all medication errors before the medication was given.[13] But to be effective in this regard, nurses must remain vigilant, identify potential errors, recognize an error as an error even if no adverse outcome ensues, and determine the root cause or causes of errors.

So what should a nurse do when she sees an error in the making? In the aviation industry, any team member (whether ground or flight crew) who becomes aware of a situation that could result in an error has the authority to "raise a red flag," indicating the need to proceed with caution or to stop, even if doing so terminates a flight. Table 1 (page 83) lists some situations in which nurses may raise a red flag.

 
Table 1. When to Raise a Red Flag

REFERENCES

 

1. Why do errors happen? In: Kohn LT, et al., editors. To err is human: building a safer health system. Washington, DC: National Academies Press; 2000. p. 42-57. [Context Link]

 

2. Reason J. Human error. Cambridge, U.K.: Cambridge University Press; 1990. [Context Link]

 

3. Aviation Safety Reporting System. Program overview. 2003. http://asrs.arc.nasa.gov/overview.htm. [Context Link]

 

4. Barnett A, Higgins MM. Airline safety: the last decade. Manage Sci 1989;35(1):1-20. [Context Link]

 

5. Brennan TA, et al. Incidence of adverse events and negligence in hospitalized patients. Results of the Harvard Medical Practice Study I. N Engl J Med 1991;324(6):370-6. [Context Link]

 

6. Leape LL, et al. The nature of adverse events in hospitalized patients. Results of the Harvard Medical Practice Study II. N Engl J Med 1991;324(6):377-84. [Context Link]

 

7. Creating and sustaining a culture of safety. In: Page A, editor. Keeping patients safe: transforming the work environment of nurses. Washington, DC: National Academies Press; 2003. p. 285-311. [Context Link]

 

8. Dornheim M. Crew distractions emerge as new safety focus. Aviat Week Space Technol 2000;153(3):58-60. [Context Link]

 

9. Maximizing workforce capability. In: Page A, editor. Keeping patients safe: transforming the work environment of nurses. Washington, DC: National Academies Press; 2003. p. 162-225. [Context Link]

 

10. Work and workspace design to prevent and mitigate errors. In: Page A, editor. Keeping patients safe: transforming the work environment of nurses. Washington, DC: National Academies Press; 2003. p. 225-84. [Context Link]

 

11. Kaushal R, Bates D. Computerized physician order entry (CPOE) and clinical decision support systems (CDSSs). In: Shojania KG, et al., editors. Making health care safer: a critical analysis of patient safety procedures. Rockville, MD: Agency for Healthcare Research and Quality; 2001. [Context Link]

 

12. Joint Commission on Accreditation of Healthcare Organizations. Front line of defense: the role of nurses in preventing sentinel events. Oakbrook Terrace, IL: Joint Commission Resources; 2001. [Context Link]

 

13. Leape LL, et al. Systems analysis of adverse drug events. JAMA 1995;274(1):35-43. [Context Link]