In this lesson, you will be introduced to a real-world example of teamwork and communication in the aviation industry. Specifically, this lesson will cover:
1. Course Introduction
In the United States, an estimated 85% of the population has at least one medical encounter annually, while 25% of these people have as many as 8 to 9 encounters. A single visit involves a multidisciplinary team of healthcare professionals, administrative staff, patients, and their families or friends. Often such an encounter results in multiple visits to other clinicians or services across multiple organizations using different medical records. Fragmented care has been identified as a common contributor to medical errors. Studies have also indicated that more than half of adverse events in healthcare are attributable to surgical care, most of which are deemed preventable. Furthermore, root cause analyses have associated patient outcomes with non-technical aspects of performance, such as teamwork and safety culture, rather than with a mere lack of technical expertise. The effective delivery of surgical care requires the orchestration of a multidisciplinary team working in harmony in a time-sensitive and complex environment. Any breakdown in these interdependent processes can potentially lead to performance failures and adverse events. In an interview study of surgeons, communication was found to be a causal factor in 43% of errors made in surgery.
2. Real-World Example
The aviation industry has demonstrated the importance of effective teamwork on flight safety and has verified the effectiveness of specific team training to improve safety. As aircraft systems became more complex in the 1970s, industry investigators noted that more accidents were occurring (Fig. 1) and conducted a formal investigation to identify root causes so they could reduce the number of accidents.
The record of the investigations provides alarming documentation of ways in which crew coordination failed at critical moments.
The following are human errors caused by interpersonal miscommunication (Helmreich & Foushee, 2010):
A crew, distracted by the failure of a landing gear indicator light, failed to notice that the automatic pilot was disengaged and allowed the aircraft to descend into a swamp.
A co-pilot, concerned that take-off thrust was not properly set during a departure in a snowstorm, failed to get the attention of the captain; the aircraft stalled and crashed into the Potomac River.
A crew failed to review instrument landing charts and their navigational position with respect to the airport and further disregarded repeated Ground Proximity Warning System alerts before crashing into a mountain below the minimum descent altitude.
A crew distracted by nonoperational communication failed to complete checklists and crashed on take-off because the flaps were not extended.
A breakdown in communication between a captain, co-pilot, and Air Traffic Control regarding fuel state led to complete fuel exhaustion and a crash.
A crew crashed on take-off because of icing on the wings after having inquired about de-icing facilities. In the same accident, a flight attendant failed to communicate credible concerns about the need for de-icing that had been expressed by pilots traveling as passengers.
Consider: have any of these types of miscommunication happened in your work?
As a result of this investigation, the focus shifted from individual issues to crew-level issues that compromised safety; it was a significant achievement in our understanding of what determines safety in flight operations. Just as performance can suffer because of poor technology or inadequate training, so too can system effectiveness be reduced by errors in the design and management of crew-level tasks and of organizations.
Beginning in 1980, NASA endorsed Cockpit Resource Management (CRM, later named Crew Resource Management). CRM focuses on interpersonal communication, leadership, and decision making and is used in environments where human error can have devastating effects. It is regarded as a countermeasure with three main goals:
* Avoiding errors altogether
* Trapping emerging errors before they amplify risk
* Mitigating the consequences of any errors committed
CRM training, like any new approach in a well-established traditional enterprise, was not universally accepted in its early years. When it was first introduced to pilots and the FAA, it was viewed with skepticism despite a string of recommendations from the National Transportation Safety Board that CRM training be required of the nation's airlines. Training began in 1980; however, it was noted in the 1990s that organizational culture was an obstacle to effective implementation. Again, culture eats safety for lunch. Really!
Over the last few decades, extensive efforts have been made to apply lessons learned from aviation and other high-risk, complex industries to healthcare and develop new healthcare-specific evidence for teamwork and patient safety. In this course, we will explore the fundamentals of teamwork and communication in healthcare and how to create effective teams to achieve safe, reliable, and effective care.