
How Do Medical Errors Happen?

Author: Capella Healthcare

what's covered
This lesson will discuss how medical errors happen and the role of human factors and systems engineering. Specifically, this lesson will cover:
  1. The Systems Approach
  2. The Swiss Cheese Model of Accident Causation

1. The Systems Approach

Prior to the 1990s, it was thought that errors in healthcare were caused by individual incompetence. People assumed that well-trained, punctilious healthcare professionals do not make errors. This view was built on a legalistic framework and resulted in punishing individuals as a means of motivating them to be more vigilant. This culture of blame had a toxic effect: people rarely reported mistakes, which made learning from these errors almost impossible.

Since then, we have learned that adverse events do not occur because people intentionally want to hurt patients. Rather, they arise because of the complexity of healthcare systems. While healthcare is becoming more effective, it increasingly relies on new technologies, medications, and treatments. Patients are older and are entering the healthcare system with multiple comorbidities, requiring more difficult decisions. Growing economic pressures on health systems lead to overburdened systems across the continuum of care.

Just consider what it takes to transfer a patient from the hospital to home. The doctor must write the discharge order and summary and reconcile the medication list; a pharmacy must be organized to deliver home medications to the bedside; case managers must arrange durable medical equipment and home healthcare if needed; the patient must arrange transportation; and nurses must educate the patient about discharge instructions and schedule follow-up appointments with the primary care physician. As you can see, this is a highly complex process with multiple opportunities for something to go wrong. When so many different types of healthcare providers are involved, it is very difficult to ensure safe care unless our systems are designed to give all health professionals timely and complete information and a shared understanding.

In the contemporary patient safety movement, the systems approach to human error replaces the traditional, person-focused approach. The systems approach rests on the basic premise that humans are fallible by nature and errors are to be expected. Rather than blaming individual healthcare professionals, this approach focuses on identifying the underlying cause of an error and building defenses into the system to prevent the error in the future or limit its impact. The person-focused approach alone cannot address the error-provoking properties within the system that led to the error.

Additionally, a systems approach involves studying the culture, policies, and all the individual components of an organization as well as external influences. James Reason thought that many errors committed by individuals had their origin in these "upstream" influences. Automation using technology can foster efficiencies in tasks, but it can also add to the complexity of problem-solving and the cognitive burden that can occur when something goes awry. High reliability organizations focus on the entire system, support the workforce, and espouse a culture of "intelligent wariness," or always being on the lookout for a potential system failure.


2. The Swiss Cheese Model of Accident Causation

The contemporary field of systems analysis was pioneered by the British psychologist James Reason, who analyzed accidents in the fields of aviation and nuclear power to study the nature of preventable adverse events. He concluded that catastrophic safety failures are almost never caused by isolated errors committed by individuals. Instead, most accidents arise from multiple, smaller errors in environments with serious underlying flaws. He introduced the Swiss Cheese Model to describe how accidents occur.

Swiss Cheese Model of Accident Causation
Source: https://psnet.ahrq.gov/web-mm/right-left-neither

Reason (1990) used this model to make the following point: "Rather than being the main instigators of an accident, operators tend to be inheritors of system defects created by poor design, incorrect installation, faulty maintenance, and bad management decisions. Their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking."

In this model, errors made by individuals result in catastrophic outcomes due to flawed systems, or what we can call "the holes in the cheese." The model helps explain how the holes in the defenses line up to cause an accident and points the way toward solutions. The strategy is to identify the holes and shrink them so they do not line up in a similar way in the future.
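Reason's model is conceptual, but the "holes lining up" idea can be made concrete with a little probability: if each defensive layer fails independently, the chance that an error penetrates every layer is the product of the individual failure rates, so shrinking any single hole shrinks the risk for the whole system. The short Python sketch below is a hypothetical illustration only; the four layers and their failure probabilities are invented for this example, not drawn from the lesson or from real data.

import random

# A toy Monte Carlo sketch of the Swiss Cheese Model (illustrative only).
# ASSUMPTION: four hypothetical, independent defensive layers, each with a
# made-up probability of failing (the size of its "hole"). Harm reaches
# the patient only when the holes in every slice line up.
LAYER_FAILURE_PROBS = {
    "order entry check": 0.05,
    "pharmacist review": 0.03,
    "barcode scan at bedside": 0.02,
    "nurse double-check": 0.10,
}

def holes_align() -> bool:
    """One trial: True only if every defense fails at once."""
    return all(random.random() < p for p in LAYER_FAILURE_PROBS.values())

def estimated_harm_rate(n_trials: int = 2_000_000) -> float:
    """Estimate the chance that an error penetrates all layers."""
    return sum(holes_align() for _ in range(n_trials)) / n_trials

if __name__ == "__main__":
    # With independent layers, the true rate is the product of hole sizes:
    # 0.05 * 0.03 * 0.02 * 0.10 = 3e-6. Halving any one failure rate
    # halves the whole product, which is the "shrink the holes" strategy.
    print(f"Estimated harm rate: {estimated_harm_rate():.1e}")

In reality, the slices are rarely independent; a single latent failure such as understaffing can widen several holes at once, which is exactly why Reason emphasizes the upstream, organizational contributions to error.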

As stated previously, human error is inevitable, especially in systems as complex as those in healthcare. Simply striving for perfection will not significantly improve patient safety, as it is unrealistic to expect flawless performance from individuals who are working in complex, high-stress environments. The systems approach contends that efforts to prevent human errors before they occur, or to block them from causing harm, will be more effective than trying to create perfect healthcare professionals.

According to Reason, in a complex system, there are two main categories of errors: active failures and latent failures. Active failures are errors at the "sharp end" of the system that involve frontline healthcare providers.

EXAMPLE

A surgeon performs wrong-site surgery, a lab tech labels a tube incorrectly, or a nurse administers the wrong medication.

Errors at the "sharp end" are usually quite apparent since they are committed by the healthcare professionals closest to the patient. The "sharp end" of the system is the point where the patient receives healthcare, and it also signifies the point of contact between the individual and the larger system (i.e., the human-system interface). The "blunt end" refers to the many layers of the healthcare system that do not come into direct contact with patients but influence the personnel and equipment at the sharp end that do reach the patient.

Latent failures, or errors that occur at the "blunt end" of the system, tend to be less apparent. They are attributed to broader organizational influences, including system design, culture, policies, quality management systems, supervision issues, environmental factors, allocation of resources, regulators, payers, and financial constraints, to name a few. Patient safety is irreducibly a matter of systems: the microsystem (the "sharp end") is where the success or failure of every system meant to ensure safety converges.

At the same time, patient safety needs to focus on all systems. Patient safety prevents avoidable adverse events by paying attention to systems and interactions, including the human interface, and by allowing all parties to learn from near misses and adverse events. Through a concerted effort, all those involved should act to minimize the extent and impact of unavoidable adverse events by creating well-designed systems and reducing the impact when errors occur. This is critical to becoming a high reliability organization.

In the illustration below, you can see how latent and active conditions lined up through failed defenses to cause a catastrophic event. In the air traffic controller situation, human factors (fatigue) as well as organizational issues (deficient training, inadequate staffing due to budget cuts) led to the airplane crash. This information informs leadership and frontline staff of the deficiencies in the system and is a starting point for system redesign.

Swiss Cheese Model
Source: www.btcmanager.com/bitcoin-scams-interview-peter-todd/

Authored by Cindy Ebner, MSN, RN, CPHRM, FASHRM


Support

If you are struggling with a concept or terminology in the course, you may contact RiskManagementSupport@capella.edu for assistance.

If you are having technical issues, please contact learningcoach@sophia.org.