

CHAPTER 2
Errors: Organizations, Individuals, and Unsafe Acts



Errors at the sharp end are symptomatic of both human fallibility and underlying organizational failings. Fallibility is here to stay. Organizational and local problems, in contrast, are both diagnosable and manageable.


James Reason (2005)



Rather than being the instigators of an accident, operators tend to be the inheritors of system defects…their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking.


James Reason (1990b)


While observing the aftermath of an incident that caused the death of a patient, a colleague commented, “I can’t even imagine this error happening.” Unfortunately the unimaginable often occurs when local conditions necessary for error generation exist within the work environment and are triggered––activated––by actions taken at the human-system interface (Reason 1990b). In reality, everything we humans devise, use, or do is prone to error and failure (Haerkens et al. 2015). So where do we begin in order to gain an understanding of how errors occur so that we can prevent them? To answer this question we have drawn heavily on the model developed by James Reason (Reason 1990a, 1990b) and subsequently adapted by others specifically to address errors and adverse incidents in medicine (Diller et al. 2014; Karsh et al. 2006; Leape 1994; Vincent et al. 1998, 2014). These models are based on systems and human factors analysis approaches, which focus on multiple error-generating factors found at the organizational, supervisory, environmental, personnel, and individual levels.


We have drawn on these models and modified them with the goal of characterizing the environment within which veterinary anesthetists work, an environment that includes technical, organizational, and human factors domains and the multiplicity of factors in those domains involved in error generation (Figure 2.1). In Figure 2.1 the domains are bounded by broken lines to reflect the real world in which anesthetists work, a world in which elements within and outside the work environment can influence our practice of anesthesia and yet are often beyond our control. The arrows between the various elements are bi-directional, reflecting the fact that these interactions are two-way, one influencing the other and vice versa. This environmental model serves as the outline for this chapter.

Radial diagram of the environments and systems surrounding a client and a patient: the individual, the team, medications and delivery systems, physical and technological environments, and the organization.

Figure 2.1 This graphic shows the environment within which a veterinary anesthetist functions when managing an anesthetized patient. The outermost and all inner borders are broken lines that reflect the real world in which we work, a world in which elements within and outside our work environment, often beyond our control, can influence our practice of anesthesia. The arrows between the various elements are bi-directional, reflecting the fact that these interactions are two-way, one influencing the other and vice versa. The figure incorporates Reason's factors involved in the commission of unsafe acts, including the organization, the individual, the team, the physical and technological environments, and medications and their delivery systems.


Error causation: technical factors


Errors do occur as a result of technical or equipment failures, but they are infrequent (Reason 2005). This is not to belittle or ignore these types of failures, especially when they harm either a patient or a healthcare provider. An issue with technical or equipment failures is how to quickly identify and correct them when they occur so that they do not cause further patient harm. Chapter 4 presents a few cases involving equipment failures, how they were detected, and the strategies used to quickly correct them.


Error causation: organizational and supervision factors


Before discussing this topic in depth we need to ask, is a discussion of organizations relevant to veterinary medicine? More specifically, are private veterinary practices organizations? Most of us would probably agree that multi-veterinarian practices, such as referral practices/hospitals and university teaching hospitals, are organizations, but what about single- or two-veterinarian practices? An organization is defined as:



…a body of people structured and managed to meet a specific goal; as such it has a management structure that determines relationships between the different activities and members of the organization, and assigns roles, responsibilities, and authority to carry out different tasks. Organizations are open systems that affect and are affected by their environment (our emphasis).


Modified from: http://www.businessdictionary.com/definition/organization.html (accessed November 8, 2015)


We contend that these organizational elements exist in all veterinary practices, be they large or small. That said, it is important to note that each veterinary practice, be it a single- or multi-person practice, has its own unique ways of accomplishing the day-to-day tasks inherent in its operation. These tasks, often referred to as "standard operating procedures," may be routine for one practice but not for another. These aspects of the practice directly or indirectly affect all aspects of patient management, including anesthesia. A procedure or process deeply embedded in one practice may not even exist in another practice. What may raise a question in the mind of a visitor to a practice may not even seem worthy of consideration by those working within the practice because "it's just the way we do things here"; it is part and parcel of the organization's culture.


So what role does the organization play? It is true that people make errors or at the very least are the final common pathway by which errors occur. But people do not go to work intending to make errors or cause harm. Error generation is often due to factors inherent in the organization that influence the behavior and actions of those within it (Battles & Shea 2001; Garnerin et al. 2002; Klemola 2000; Kruskal et al. 2008; Reason 2004; Wald & Shojania 2001). A number of terms have been used to describe these factors, including latent conditions or latent failures (also known as root causes or "resident pathogens"). These conditions exist as a result of defensive gaps, weaknesses, or absences unwittingly created in a system by earlier decisions made by the designers, builders, regulators, managers, and supervisors of the organization or system. Examples of latent conditions include vials of similar shape, size, and color containing entirely different drugs, or similar labels for two different drugs (for an example see Case 5.1) (Garnerin et al. 2002; Reason 2004).


Latent conditions possess two important properties: their effects are usually longer lasting than those created by active failures (the latter are failures that occur due to actions taken by individuals at the human-system interface); and they exist within the system prior to an adverse event. These two properties mean that latent conditions can be detected and repaired before they cause harm (Reason 2004). As such, latent conditions are the primary targets of any safety management system (Reason 2004).


Management, too, has a role in error generation. For example, those at the frontlines of healthcare may be the recipients of a variety of latent failures attributable to supervision. In these situations there may be a failure of leadership exemplified by inadequate training or a lack of professional guidance or oversight, all of which encourage non-standard approaches to patient care (Diller et al. 2014). There may be a lack of operational planning, a failure to correct known problems, or inadequate or missing supervisory ethics, such as turning a blind eye to violations of standard operating procedures. Resource management, that is, the allocation and maintenance of organizational resources, including human resources, monetary budgets, and equipment design, can also create latent conditions that set the stage for error generation.


Corporate decisions about allocation of such resources usually focus on two objectives: (1) quality of the work, and (2) on-time and cost-effective operations. In many situations quality is sacrificed for cost control or efficiency, thus setting the stage for adverse incidents (Diller et al. 2014). This concept is perhaps best outlined by Hollnagel's Efficiency-Thoroughness Trade-Off (ETTO) principle (Hollnagel 2009). In general, this principle refers to the idea that during their daily activities individuals and organizations must make "trade-offs" between the resources (time, effort, personnel, etc.) they expend on preparing, planning, and monitoring an activity (their thoroughness) and the resources (again time, effort, personnel, etc.) they expend on performing the activity (their efficiency). Safety-conscious individuals and organizations favor thoroughness over efficiency, while those focused on productivity favor efficiency over thoroughness. The ETTO principle assumes that it is impossible to maximize both thoroughness and efficiency at the same time, while recognizing that an activity will not succeed without some degree of both. Hollnagel gives a number of reasons commonly used to justify ETTO decisions, including "it is normally OK, there is no need to check because it will be checked later," "we always do it this way," and "this way is much quicker." Our experiences suggest that these formal and informal organizational processes, such as operational tempo, time pressures, schedules, and balancing thoroughness against efficiency, also occur in veterinary anesthesia and give a sense of the influence organizational climate can have on the individual and on the culture of patient safety.


Senior management should ensure that the organization's culture and climate focus on patient safety. This can be accomplished through operational processes, including formal processes, procedures, and oversight within the organization. All of this implies that an organization with a culture attuned to error prevention and patient safety is willing and able to learn from errors. This state of organizational being has been variously described as that of the learning organization or the high reliability organization (HRO) (Sutcliffe 2011). These organizations are skilled at creating, acquiring, and transferring knowledge and at modifying their behavior to reflect new knowledge and insights gained from error reporting and analysis (Palazzolo & Stoutenburgh 1997; Sutcliffe 2011; Vogus & Hilligoss 2015). HROs possess the following essential components (Palazzolo & Stoutenburgh 1997; Sutcliffe 2011):



  • Systems thinking—individuals within the organization recognize that dynamic complexity in complex systems means that problems are a meshwork of interrelated actions.
  • Personal mastery—there is a continuous process and state of mind that enables individuals within the learning organization to master a discipline.
  • Mental models—individuals recognize that they have biased images of reality and that they can challenge those views and develop different views or models of reality.
  • Building shared visions—a shared view of the organization’s vision is developed so that it fosters genuine commitment to the vision, not just compliance.
  • Team learning—teams are the fundamental learning unit of the organization for it is in teams that views of reality are shared and assumptions are challenged and tested.

A learning organization is also distinguished by how it acts (heedful interrelating) and what it does (heedful attending), both of which lead to mindful performance (Weick 2002). According to Weick, heedful attending is embodied in five processes (Weick 2002):



  1. A preoccupation with failure such that people assume each day will be a bad day and act accordingly.
  2. A reluctance to simplify interpretations because they know that hubris is their enemy, and optimism is the height of arrogance.
  3. A sensitivity to operations so as to maintain situational awareness.
  4. Commitment to resilience and the ability to cope with unanticipated dangers after they have occurred, doing so by paying close attention to their ability to investigate, learn, and act without prior knowledge of what they will be called upon to act on.
  5. A willingness to organize around expertise thus letting those with the expertise make decisions.

Taking some liberties here, the flip side of the learning organization might be what Reason calls the vulnerable system (Reason et al. 2001). A vulnerable system or organization is one that displays the "vulnerable system syndrome" (VSS), a cluster of pathologies that render it more liable to experience errors and adverse incidents. Reason describes the syndrome as possessing three interacting and self-perpetuating characteristics: (1) blaming errors on front-line individuals; (2) denying the existence of systemic error-provoking weaknesses; and (3) the blinkered pursuit of productive and financial indicators (Reason et al. 2001). However, Reason also states:



Even the most resistant organizations can suffer a bad accident. By the same token, even the most vulnerable systems can evade disaster, at least for a time. Chance does not take sides. It afflicts the deserving and preserves the unworthy.


Reason (2000).


It is unwise to define success based on a chance occurrence. In anesthesia, success in safety means that an outcome is achieved by minimizing the risk of harm without relying on the quick wits of the anesthetist, a robust patient, and a pinch of good fortune; that is, it should not be defined merely as having an alive and conscious patient at the end of anesthesia.


This brings us to resilience in healthcare systems. Resilience is the intrinsic ability of a system to adjust its functioning in response to changes in circumstances so that it can continue to function successfully, even after an adverse incident, or in the presence of continuous stress or latent conditions. It revolves around clinicians’ abilities to make appropriate judgments regarding when and how to follow control measures and how the existing system supports this decision-making process. Resilience is the manner in which a system is able to respond to unexpected events and meet new demands while buffering challenges to safety. Rather than suppressing human variability by adding more and more control measures, resilience embraces human variability and the ability to make moment-to-moment adaptations and adjustments in the face of changing events in an uncertain and dynamic world (Reason 2000). There are many hallmarks of a resilient organization, including its culture and subcultures, which shape the organization’s ability to meaningfully confront errors wherever and whenever they occur and to learn from them (see “Developing a safety culture” in Chapter 8). Resilience is an important aspect of error prevention within an organization.


Leape states that error prevention efforts must focus on system-associated errors that occur as a result of design, and that design implementation must be considered a part of error prevention (Leape 2002). This approach requires methods of error reduction at each stage of system development, including design, construction, maintenance, allocation of resources, and training and development of operational procedures. The design process must take into consideration the reality that errors will occur and must include plans for recovering from errors. Designs should automatically correct errors when they occur, but when that is not possible, the design should detect errors before they cause harm. This means the system should build in both buffers (design features that automatically correct for human or mechanical errors) and redundancy (duplication of critical mechanisms and instruments so that failure of a component does not result in loss of function). Tasks should be designed to minimize errors including simplifying and standardizing tasks so as to minimize the load on the weakest aspects of cognition, which are short-term memory, planning, and problem-solving.


Prevention is the process of removing factors (root causes) that contribute to unsafe situations, but it is not the only means for reducing errors (Garnerin et al. 2002, 2006; Leape 1994). Another process, that of absorption, is intended to eliminate root causes (Reason's "bad stuff"), including cultural causes such as organizational roadblocks, which hinder early identification and correction of active failures (Garnerin et al. 2002, 2006). Absorption involves incorporating buffers into a system so that errors are identified and absorbed or intercepted before they cause patient harm (Garnerin et al. 2006; Leape 1994). Using both prevention and absorption eliminates more errors than using either approach alone (Garnerin et al. 2002). An example of prevention is a policy that makes it widely known within an organization that there is the potential for a particular type of error to occur. An example of absorption is the adoption of specific techniques or procedures within the organization to prevent the occurrence of that error. A real-life example serves to make this point.


In the Equine/Farm Animal Hospital of the Cornell University Hospital for Animals, any large animal patient undergoing anesthesia for any reason is aseptically catheterized intravenously with a 14-gauge, 5.25-inch catheter. These catheters are typically inserted into a patient’s jugular vein and secured in place by suturing the catheter hub to the skin; a catheter may remain in a patient for up to 24 to 36 hours depending on postoperative care. The catheter that is the focus of this example is actually designed for use in human patients, not veterinary patients.


In the mid-1990s when these catheters first started to be used in the hospital, it was discovered that partial separation of the catheter shaft from the hub occurred occasionally when removing a catheter from a patient. Unfortunately, in one patient a complete separation occurred and the catheter traveled down the jugular vein and lodged in the patient’s lung. The manufacturer was contacted regarding this problem and to the company’s credit, company representatives visited the hospital to gain a better understanding of how the catheters were used and the nature of the problem. The manufacturer made some changes in catheter design and assembly and as a result this problem disappeared for a number of years.


The problem unexpectedly reappeared a few years later when, during anesthesia of a horse, the anesthetist noticed that the IV fluids being administered to the patient were leaking from the catheter under the skin and creating a very large fluid-filled subcutaneous mass. The fluids were stopped and another catheter was inserted into the opposite jugular vein so that fluid administration could continue. The defective catheter was removed, inspected, and found to have a hole and tear at the catheter-hub interface (Figure 2.2). A test determined that a needle inserted through the injection cap into the catheter was not long enough to cause the hole and tear, thus the problem was attributed to a flaw in the catheter itself. To prevent harm to other large animal patients, this problem was dealt with using a two-pronged approach: first, the problem with the catheter was made widely known throughout the Equine/Farm Animal Hospital, specifically at a regularly scheduled monthly meeting that involved all faculty, house officers, and technicians; second, a technique was presented showing how to block a catheter from floating down the jugular vein should it tear free from the hub during its removal from a patient. This is an example of the processes of prevention and absorption, which, when used together, increase the likelihood of eliminating hazards within the work environment.

Photos of a 14-gauge catheter used for intravenous catheterization of horses and cattle displaying the back (left) and side (right) views.

Figure 2.2 Two views of a 14-gauge catheter typically used for intravenous catheterization of horses and cattle. The hole and tear in this catheter were noticed after it was removed from a horse's jugular vein. Had there been more skin-associated drag on the catheter, it may well have torn off the hub and traveled down the jugular vein to lodge in the horse's lungs.


Error causation: environmental factors


Environmental factors include the physical environment, with its lighting, noise, smells, clutter, and room layout. They also include the technological environment, with its equipment and control design, display, and interface characteristics.


Error causation: personnel factors


Personnel factors involve communication and information flow, such as miscommunication between individuals or information that is incomplete or unavailable. Other personnel factors include coordination failures that occur when individuals work independently rather than as team members; planning failures that occur when providers fail to anticipate a patient's needs or create inappropriate treatment plans; and issues of fitness for duty, which can include many possibilities, such as sickness, fatigue, and self-medication with licit or illicit drugs that impair function.


Error causation: human factors


Both Reason and Diller use the term “unsafe acts” to describe the actions of those at the human-system interface that cause errors. As previously mentioned, Reason’s unsafe acts are due to the basic error types of slips, lapses, and mistakes (Figure 2.3) (Reason 1990a). Diller, drawing on Reason’s framework and applying it to healthcare, states that unsafe acts, or active failures, are those actions taken by individuals that cause errors and violations (Diller et al. 2014) (Figure 2.4 and Table 2.1). In Diller’s approach, errors can be categorized as:



  • Decision errors—occur when information, knowledge, or experience is lacking.
  • Skill-based errors—occur when a care provider makes a mistake while engaged in a very familiar task (see Case 5.1). This type of error is particularly susceptible to attention or memory failures, especially when a care giver is interrupted or distracted.
  • Perceptual errors—occur when input to one of the five senses is degraded or incomplete, such as poor hearing or eyesight.
  • Violations—intentional departures from accepted practices, so by definition they are not errors. Violations include routine violations, those that are habitual by nature and often enabled by management that tolerates "bending the rules"; and exceptional violations, willful behaviors outside the norms and regulations that are not condoned by management, not engaged in by others, and not part of the individual's usual behavior.
Tree diagram of unsafe acts categorized as intended and unintended actions, where the basic error types are slip, lapse, and mistake.

Figure 2.3 This graphic relates unsafe acts to unintended and intended actions and the basic error types and cognitive failures that underlie them. Of special note is that violations are not errors; they are intentional actions that may or may not cause harm.


From: James Reason (1990) Human Error. Cambridge, UK: Cambridge University Press, p. 207. With permission of the publisher.

Tree diagram of the preconditions and violations of unsafe acts, where factors listed under preconditions are grouped as errors.

Figure 2.4 This graphic outlines how unsafe acts can lead to errors when any number of preconditions exist within the environment. The preconditions consist of human factors domains as described by Diller et al.


From: Thomas Diller et al. (2014) The Human Factors Analysis Classification System (HFACS) applied to health care. American Journal of Medical Quality 29(3): 181–90. With permission of the publisher.


Table 2.1 Diller et al.'s classification system of factors involved in error generation is based on the Human Factors Analysis Classification System and is intended for application to healthcare. It includes elements of Reason's Generic Errors Model.


Organizational influences

  • Resource management—allocation failures of organizational resources
  • Organizational climate—factors that adversely influence worker performance
  • Organizational processes—failure of formal processes, procedures, and oversight

Supervision

  • Inadequate leadership
  • Inappropriate planned operations
  • Failure to correct known problems
  • Supervisory violations (supervisory ethics)

Preconditions for unsafe acts

  • Environmental factors
    • Physical environment
    • Technological environment
  • Personnel factors—provider behavior contributing to an adverse incident
    • Communication and information flow
    • Coordination failures
    • Planning failures
    • Fitness for duty—fatigue, illness, self-medication that reduces capability
  • Condition of the operator
    • Adverse mental state
    • Adverse physiological state
    • Chronic performance limitations—lack of knowledge, inadequate training, lack of experience, lack of technical knowledge

Unsafe acts

  • Errors
    • Skill-based error—a mistake made while engaged in a very familiar task
    • Decision error—information, knowledge, or experience is lacking
    • Perceptual error—input to one of the five senses is degraded or incomplete
  • Violations—intentional departures from accepted practice
    • Routine—often enabled by management
    • Exceptional—behavior outside the norm and not condoned by management

From: Thomas Diller et al. (2014) The Human Factors Analysis Classification System (HFACS) applied to health care. American Journal of Medical Quality 29(3): 181–90. With permission of the publisher.


Actions taken, regardless of whether intended or unintended, are preceded by cognitive processes, so we must understand the role of cognitive processes in error causation if we want to prevent errors or minimize their effects. Reason (1990a) has developed a cognitive model to explain in general terms how errors occur, and Drs Lucian Leape (1994) and Thomas Diller (Diller et al. 2014) have adapted that model to the field of medicine. According to this model, which is based on human cognition, the human mind functions in two modes (Leape 1994; Stanovich 2011; Stiegler et al. 2012; Wheeler & Wheeler 2005):



  1. Schematic control mode (also called intuitive or Type I cognitive processing)—an automatic, fast-response mode of cognition in which the mind has unconscious mental models composed of old knowledge—schemata—that are activated with very little conscious thought, or activated by sensory inputs that the mind tends to interpret in accordance with the general character of earlier experiences. In this mode thinking (mental functioning) is automatic, rapid, and effortless (Leape 1994). For ease of use and understanding we refer to this as the intuitive mode.
  2. Attentional control mode (also called analytical or Type II cognitive processing)—a controlled, conscious, analytical mode of cognition that requires effort, is difficult to sustain, and uses stored knowledge; it is called into play when a new situation is encountered or when the intuitive mode has failed. In this mode of thinking, deliberate effort must be made to determine what information to pay attention to and what to ignore. For ease of use and understanding we refer to this as the analytical mode.

The intuitive mode is characterized by the use of heuristics, a process of learning, discovery, or problem-solving by trial-and-error methods. It is also a process by which we use cognitive short-cuts––rules of thumb––to reduce the cognitive cost of decision-making (Croskerry et al. 2013a; Reason 2008; Stiegler & Tung 2014). If we think of problem-solving, such as making a diagnosis, as being linked by some cognitive pathway to stored knowledge, then heuristics lies along that pathway and is just another way of applying stored knowledge to novel problems; it is neither a faulty (error-prone) nor faultless (error-free) process (McLaughlin et al. 2014). In fact, experienced decision makers use heuristics in ways that increase their decision-making efficiency (Kovacs & Croskerry 1999). Heuristics saves time and effort in making daily decisions. Indeed, while performing daily activities we spend about 95% of our time in the intuitive mode using heuristics (Croskerry et al. 2013a). This is an acceptable approach when time and circumstances permit, but potentially detrimental in an emergent situation (Croskerry et al. 2013a). Cognitive scientists recognize that the human mind prefers to function as a context-specific pattern recognizer rather than use the analytical mode and calculate, analyze, or optimize (Reason 1990a). In fact, we humans prefer pattern matching over calculation to such a degree that we are strongly biased to search for a prepackaged solution before resorting to a more strenuous knowledge-based level of performance (Leape 1994). Thus we have a prevailing disposition to use heuristics. While heuristics works well most of the time, it can lead to errors in decision-making, including clinical decision-making, due to the influence of biases (Croskerry et al. 2013a; Hall 2002).


The analytical mode is used for conscious problem-solving, which is required when a problem is confronted that has not been encountered before or when the intuitive mode has failed. It requires more cognitive effort and draws upon stored knowledge and past experience to aid decision-making (Leape 1994).


Within this cognitive model three levels of human performance have been identified and used in error analysis (Leape 1994; Reason 1990a):



  1. Skill-based (SB) level performance is governed by stored patterns of preprogrammed instructions—schemata—that are largely unconscious, characterized as highly routinized, and occur in familiar circumstances. Skill-based performance relates to technical performance and proper execution of tasks (Kruskal et al. 2008).
  2. Rule-based (RB) level performance consists of actions or solutions governed by stored rules of the type if…then. Rule-based performance requires conscious thought; it relates to supervision, training and qualifications, communication, and interpretation (Kruskal et al. 2008).
  3. Knowledge-based (KB) level performance occurs when synthetic thought is used for novel situations. This level of performance requires conscious analytical processing and stored knowledge, and it requires effort.

Conceiving of and executing an action sequence involves the cognitive processes of planning, storage (memory), and execution. Errors can occur within any of these three stages, and the types of errors are characterized as slips, lapses, or mistakes—Reason's basic error types. Slips are actions that do not go as planned even though the plan may have been correct; that is, the actual execution of the action was wrong. Slips are usually observable (overt). As an aside, slips have been described as failures of low-level mental processing (Allnutt 1987). This terminology—"low-level mental processing"—is not a reflection of intelligence nor is it meant to be derogatory, but recognizes that an individual who makes a slip is distracted by any number of possible causes, and his or her full attention is not on the task at hand. Absent-minded slips increase the likelihood of making errors of omission, that is, failing to take a required action (Reason 1990a).


Lapses involve failures of memory that occur when an individual’s attention is distracted or preoccupied. They are usually apparent only to the person who experiences them (Reason 1990a). Both slips and lapses occur at the skill-based level (Reason 1990a).


Mistakes, on the other hand, occur when a plan is inadequate to achieve its desired goal even though the actions may run according to plan; mistakes occur at the planning stage of both rule-based and knowledge-based levels of performance (Helmreich 2000; Reason 1990b, 2005). Rule-based errors (mistakes) occur when a good rule is misapplied because an individual fails to interpret the situation correctly, or a bad rule that exists in memory is applied to the situation (Reason 2005). Knowledge-based errors (mistakes) are complex in nature because it is difficult to identify what an individual was actually thinking, what the cognitive processes were, prior to and at the time of an error. The usual scenario is that a novel situation is encountered for which the individual does not possess preprogrammed solutions (no schemata) and an error arises for lack of knowledge, or because the problem is misinterpreted. Mistakes have been described as failures of higher-level cognition, that is, failure of the analytical mode of cognition (Allnutt 1987). Mistakes occur as a result of the same physiological, psychological (including stress), and environmental factors that produce slips (see Figure 2.3).


Leape has somewhat modified Reason's model by focusing only on slips and mistakes and considers lapses to be slips (Box 2.1). According to Leape (1994), slips are unintended acts. Usually the operator has the requisite skills, but there is a lack of a timely attentional check, so slips are failures of self-monitoring. This phrase—"failures of self-monitoring"—implies that the individual is solely responsible for the slip, but often slips occur as a result of factors that distract or divert the individual's attention from the task at hand. Leape has identified a number of attention-diverting factors, or distractions, most of which are familiar to everyone in veterinary medicine, not just in veterinary anesthesia (Leape 1994):



  • Physiological—fatigue, sleep loss, alcohol or drug abuse, illness.
  • Psychological (may be internal or external to the individual)—other activity (busyness), emotional states such as boredom, frustration, fear, anxiety, anger, or depression.
  • Environmental—noise, heat, visual stimuli, odors, motion, clutter, room layout.