Lessons from the China Airlines Flight 140 Crash: Human Factors and Error Prevention

Introduction

On April 26, 1994, at 8:12 PM, China Airlines Flight 140, an Airbus A300B4-622R en route from Taipei, Taiwan to Nagoya, crashed during its landing approach at Nagoya Airport (now Nagoya Airfield, commonly known as Komaki Airport). Of the 271 people on board (15 crew members and 256 passengers), 264 lost their lives. This tragedy ranks as the second-deadliest aviation accident in Japanese history, after the Japan Airlines Flight 123 crash in 1985 (520 fatalities), and remains the worst aviation disaster of the Heisei era.

This catastrophe was not merely a case of pilot error but resulted from a complex interplay of multiple contributing factors. This article examines the lessons learned from this accident and explores the critical importance of human factors and error prevention.

Sequence of Events

Flight Overview

The accident aircraft departed Chiang Kai-shek International Airport (now Taiwan Taoyuan International Airport) at 5:53 PM Japanese time, carrying 15 crew members and 256 passengers, bound for Nagoya Airport. The aircraft, an Airbus A300B4-622R manufactured in December 1990, was relatively new with 8,572 flight hours and 3,910 takeoff and landing cycles.

Critical Phase Leading to the Accident

At 8:07 PM, the aircraft received clearance for an Instrument Landing System (ILS) approach and began its landing sequence, with the first officer flying the aircraft manually. At approximately 1,000 feet altitude and about 5.5 kilometers from the runway threshold, the first officer inadvertently activated the go-around lever, a control that initiates the landing-abort (go-around) mode and that should not have been touched at this stage of the approach.

Fatal Chain of Events

This erroneous input caused the autopilot system to select go-around mode and automatically increase engine thrust. Simultaneously, the horizontal stabilizer began moving to pitch the nose up. The captain and first officer recognized the error and attempted to continue the landing manually. The captain issued three warnings to the first officer and pushed the control column forward to lower the nose.

However, a critical problem emerged. The Airbus A300-600R’s autopilot system at that time was designed so that once go-around mode was activated, the autopilot’s commands took precedence over manual pilot inputs. Therefore, even as the captain pushed the control column forward to lower the nose, the autopilot system continued commanding the horizontal stabilizer to pitch up.

This created a conflict in the aircraft’s pitch control: while the control column was held forward, the autopilot ran the horizontal stabilizer to its maximum nose-up position. This abnormal condition persisted for approximately 15 seconds. When the first officer relaxed pressure on the control column, the accumulated nose-up trim prevailed and the nose pitched up sharply.
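
The dynamics of that 15-second standoff can be made concrete with a toy model. The sketch below is a deliberately simplified illustration, not Airbus flight-control logic; the trim rate, travel limit, and authority weighting are invented numbers chosen only to show how nose-up stabilizer trim accumulates against a fully forward control column and then dominates once the column is released.

    # Deliberately simplified model of the go-around-mode conflict.
    # All numbers are illustrative assumptions, not Airbus flight-control data.

    THS_MAX_NOSE_UP = 12.0   # assumed stabilizer travel limit, degrees
    THS_RATE = 0.5           # assumed trim rate, degrees per second

    def net_pitch_command(elevator_deg: float, ths_deg: float) -> float:
        """Crude net pitch authority: the trimmed stabilizer outweighs the elevator."""
        return 2.0 * ths_deg + elevator_deg

    ths_deg = 0.0
    elevator_deg = -10.0  # control column held fully forward (nose down)
    for second in range(1, 16):  # the roughly 15-second standoff
        # In go-around mode the autopilot keeps trimming nose-up,
        # and column force alone does not disengage it.
        ths_deg = min(ths_deg + THS_RATE, THS_MAX_NOSE_UP)
        print(f"t={second:2d}s  THS={ths_deg:4.1f}deg  "
              f"net pitch={net_pitch_command(elevator_deg, ths_deg):+5.1f}")

    # When the column is released (elevator_deg back to 0), the accumulated
    # nose-up trim produces an abrupt pitch-up.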

The aircraft climbed steeply and entered an aerodynamic stall, then fell toward the ground, crashing approximately 500 meters from the southern end of Nagoya Airport’s runway and bursting into flames. Rescue operations commenced within one minute of the crash, and the fire was extinguished approximately 90 minutes later. Of the 271 people aboard, 264 died; the seven survivors had all been seated in the forward section of the aircraft (seat rows 7-15).

Accident Causes from a Human Factors Perspective

The detailed investigation by the Aircraft Accident Investigation Commission (now the Japan Transport Safety Board) revealed that multiple factors contributed to this accident.

1. Inadequate Pilot Training and Lack of Awareness

China Airlines’ training program did not include adequate procedures for handling inadvertent go-around mode activation. The first officer did not understand the correct procedure to cancel this mode, and the captain similarly lacked sufficient knowledge. Furthermore, the pilots did not accurately understand the priority relationship between the autopilot system and manual control.

2. Autopilot System Design Issues

In the Airbus A300-600R at that time, when go-around mode was activated, the autopilot system’s commands took precedence over the pilot’s manual inputs. This design philosophy was intended to ensure that pilots could reliably execute a go-around in emergency situations, but it became a fatal problem when inadvertently activated. Even when pilots pushed the control column with maximum force, they could not completely counteract the nose-up force.

3. Inadequate Warning Systems

The aircraft was not equipped with a system to clearly warn of conflicts between the autopilot system and manual control. Although warning tones sounded in the cockpit, the pilots could not accurately understand their meaning. A more intuitive and explicit warning system might have enabled earlier recognition of the situation’s severity.

4. Lack of Crew Resource Management (CRM)

Communication between the captain and first officer was problematic. Although the captain issued three warnings to the first officer, clear instructions and role assignments were not established. Additionally, while the first officer recognized his mistake, he could not implement appropriate corrective action. This highlights the importance of effective communication and teamwork among crew members.

Evolution of Human Factors Countermeasures in the Aviation Industry

The China Airlines Flight 140 accident renewed awareness of the importance of human factors countermeasures in the aviation industry. It is estimated that approximately 70% of aviation accidents and incidents involve human factors, and since this accident, further safety measures have been advanced.

Development of CRM (Crew Resource Management) Training

CRM is defined as the effective utilization of all available resources (people, equipment, information, etc.) to achieve safe and efficient operations. Its developmental history reflects the evolution of aviation safety itself.

First Generation: Interpersonal Management Skills (Late 1970s-1980s)

Following the 1977 Tenerife Airport disaster (583 fatalities, the worst accident in aviation history), the problem of authority gradient in the cockpit was recognized. The excessive authority of captains prevented first officers and flight engineers from expressing doubts or concerns, which was identified as a factor leading to major accidents. CRM training during this period focused primarily on communication skills and leadership.

Second Generation: Error Management (1990s)

Beyond improving interpersonal relationships, it became recognized that error countermeasures considering human cognitive characteristics and limitations were important. Based on the premise that “humans are fallible beings,” the mainstream approach shifted from completely preventing errors to detecting them early and minimizing their impact. During this phase, skills such as situation awareness sharing, cross-monitoring, and assertiveness (constructive advocacy) were emphasized.

Third Generation: Threat and Error Management (2000s onward)

The University of Texas Human Factors Research Project developed the Threat and Error Management (TEM) model. In addition to errors themselves, this model focuses on the “threats” that make errors more likely, such as adverse weather, equipment malfunctions, time pressure, fatigue, and stress.

The TEM model was developed through a methodology called LOSA (Line Operations Safety Audit). In LOSA, trained observers accompany actual flights to collect and analyze data on crew behavior and environmental factors. This enables identification of potential risks in daily operations and implementation of preventive measures.

Modern CRM training includes the following elements:

  • Ensuring Psychological Safety: Creating an environment where anyone can speak up freely for safety, regardless of rank or experience
  • Assertiveness: Skills to clearly express one’s opinions and concerns while respecting others
  • Cross-Monitoring and Check-Back: Methods for mutually checking each other’s actions and confirming instructions and communications
  • Decision-Making Processes: Effective team decision-making methods
  • Workload Management: Methods to appropriately distribute tasks and maintain each member’s performance at a certain level or above
  • Sharing Situation Awareness: Techniques for the entire team to accurately grasp and share the situation

Technical Improvements

Following the China Airlines Flight 140 accident, Airbus modified the autopilot system software. The revised system allows manual control to take precedence when pilots apply sufficient force to the control column, even when go-around mode is activated. This enables pilots to reliably control the aircraft in emergency situations.
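
Based only on this public description of the change, the precedence rule can be sketched as follows. The force threshold and all names here are illustrative assumptions, not the actual Airbus implementation; the point is the before/after difference in which channel commands pitch trim in go-around mode.

    # Hypothetical sketch of the precedence change; the threshold value and
    # all names are illustrative assumptions, not the actual Airbus logic.

    OVERRIDE_FORCE_N = 150.0  # assumed column break-out force, newtons

    def pitch_trim_authority(go_around_mode: bool, column_force_n: float,
                             modified_software: bool) -> str:
        """Return which channel commands pitch trim in this simplified model."""
        if not go_around_mode:
            return "pilot"
        if modified_software and abs(column_force_n) >= OVERRIDE_FORCE_N:
            return "pilot"      # post-fix: a firm column input takes precedence
        return "autopilot"      # pre-fix: the autopilot always wins in GA mode

    # Pre-fix: full forward column, yet the autopilot still commands nose-up trim.
    assert pitch_trim_authority(True, -200.0, modified_software=False) == "autopilot"
    # Post-fix: the same input now takes precedence over the autopilot.
    assert pitch_trim_authority(True, -200.0, modified_software=True) == "pilot"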

Warning systems were also improved to provide clearer alerts when conflicts occur between the autopilot system and manual control. These technical improvements were implemented on all China Airlines aircraft by September 1994, and Airbus recommended them to all operators worldwide in December of that year.

Regulatory Response

In Japan, following this accident, CRM training became legally mandatory for all airline pilots operating domestically from April 2000 (Heisei 12). However, Japan’s major airlines had already been voluntarily conducting CRM training since around 1990.

On May 10, 1994, the Ministry of Transport Civil Aviation Bureau (now the Ministry of Land, Infrastructure, Transport and Tourism Civil Aviation Bureau) issued the following guidance to Japan Air System (now Japan Airlines), which operated aircraft of the same model:

  • Thorough confirmation of autopilot mode during approach
  • Clarification of go-around mode cancellation procedures
  • Thorough adherence to cautions regarding autopilot system use
  • Compliance with operational regulations

Taiwan’s Civil Aeronautics Administration also issued the following directives to China Airlines:

  • Urgent implementation of flight computer modifications (directive issued May 3, 1994)
  • Enhanced training and re-evaluation of A300-600R pilots (issued May 7)

China Airlines completed special inspections of the flight control and autopilot systems on all of its aircraft by May 31 and carried out skill assessments of all pilots. Computer software modifications were completed by September 7.

Basic Principles of Human Factors Countermeasures

The lessons learned from the China Airlines Flight 140 accident can be broadly applied to other industrial sectors. The basic principles of human factors countermeasures can be organized as follows.

Analysis Using the SHELL Model

The SHELL model, proposed by Frank H. Hawkins of KLM Royal Dutch Airlines in 1975, is a framework for analyzing human factors. It places humans (Liveware) at the center and examines interactions with surrounding elements.

  • S (Software): Work procedures, manuals, training programs, regulations, etc.
  • H (Hardware): Equipment, facilities, tools, interfaces, etc.
  • E (Environment): Physical environment (lighting, noise, temperature, humidity), organizational environment, workspace, etc.
  • L (Liveware): The worker at the center of the model
  • L (Liveware): The people surrounding that worker: other team members, supervisors, stakeholders, etc.

Later, the m-SHELL model was proposed, positioning M (Management) as an independent element surrounding the system. This emphasizes the importance of organizational management factors.

Analyzing the China Airlines Flight 140 accident using the SHELL model yields the following breakdown (a structured-data sketch follows the list):

  • Software: Inadequate training programs, unclear emergency procedures
  • Hardware: Autopilot system design issues, insufficient warning systems
  • Environment: Time pressure, high-stress landing situation
  • Liveware (Center): Pilots’ knowledge deficiency, lack of situation awareness
  • Liveware (Surrounding): Insufficient communication between captain and first officer
  • Management: Airline’s training system, safety culture
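
Because the model is essentially a structured checklist, it is straightforward to record such an analysis as data so that findings can be grouped and tracked per interface. A minimal sketch, with field names of our own choosing:

    # Minimal data model for an m-SHELL analysis; field names are our own.
    from dataclasses import dataclass, field

    @dataclass
    class MShellAnalysis:
        software: list[str] = field(default_factory=list)
        hardware: list[str] = field(default_factory=list)
        environment: list[str] = field(default_factory=list)
        liveware_center: list[str] = field(default_factory=list)
        liveware_others: list[str] = field(default_factory=list)
        management: list[str] = field(default_factory=list)

    flight_140 = MShellAnalysis(
        software=["inadequate training program", "unclear emergency procedures"],
        hardware=["autopilot precedence design", "insufficient warning system"],
        environment=["time pressure", "high-stress landing phase"],
        liveware_center=["knowledge gaps", "loss of situation awareness"],
        liveware_others=["captain/first-officer communication breakdown"],
        management=["training system weaknesses", "safety culture"],
    )
    print(sum(len(v) for v in vars(flight_140).values()), "findings recorded")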

Three-Stage Error Management Approach

R. Helmreich of the University of Texas proposed the error management concept, which consists of three stages:

Stage 1: Error Prevention

Anticipating potential dangers and preventing errors before they occur. This includes appropriate training, clear procedures, effective use of checklists, and improved situation awareness. In the case of Flight 140, appropriate training on handling inadvertent go-around mode activation might have prevented the crash.

Stage 2: Error Detection and Correction

Detecting error-prone situations early and promptly pointing out and correcting errors. This requires cross-monitoring, check-backs, and effective communication. In the China Airlines Flight 140 accident, although the captain recognized the first officer’s error, he could not take effective corrective action.

Stage 3: Impact Mitigation

Minimizing damage even when errors occur and their effects manifest. This includes appropriate emergency response procedures, backup systems, and risk management. In the China Airlines Flight 140 accident, the autopilot system design made it difficult for pilots to minimize the impact.

Fostering Organizational Safety Culture

Beyond improving individual skills, it is important to foster a safety culture throughout the organization. This includes the following elements:

Establishing a Non-Punitive Reporting Culture

NASA’s ASRS (Aviation Safety Reporting System), initiated in 1976, collects reports of anomalous operations from individual pilots under conditions of confidentiality, anonymity, and immunity. The reports are compiled into a database, published online, and used to investigate the causes of anomalous operations and to improve aviation systems.

The key to this system’s success is creating an environment where reporters can report without fear of punishment. A culture is important where reporting errors and near misses is encouraged, and such information is used for organizational learning and improvement.

Continuous Learning and Improvement

An attitude of learning from past accidents and incidents and continuously improving is necessary. Using methods such as RCA (Root Cause Analysis), it is important to identify not only surface causes but also organizational and systemic root causes and implement countermeasures.

Management Commitment

It is essential to clarify management policies that prioritize safety and allocate necessary resources (budget, personnel, time). When management commits to fostering a safety culture, safety-first values permeate throughout the organization.

Application to Other Industries

The knowledge of human factors countermeasures cultivated in the aviation industry has been widely applied to other industrial sectors requiring high safety standards, including healthcare, nuclear power, railways, maritime transport, and manufacturing.

Application in Healthcare

In the healthcare field, programs such as TeamSTEPPS (Team Strategies and Tools to Enhance Performance and Patient Safety) have been developed, incorporating CRM training concepts from the aviation industry. Human factors concepts are being applied to improve teamwork in operating rooms, prevent medication errors, and ensure patient safety.

Examples of human factors countermeasures in healthcare:

  • Time-outs: All team members gather before surgery to confirm patient information, surgical procedures, and risks
  • Use of checklists: Standardized confirmation procedures such as WHO Surgical Safety Checklist
  • Cross-monitoring: Team members check each other’s actions and speak up if they have concerns
  • Standardized communication: Structured communication methods such as SBAR (Situation, Background, Assessment, Recommendation)
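
The value of SBAR is that every handoff carries the same four fields in the same order. The following minimal sketch, which assumes nothing beyond the acronym itself, shows how a handoff tool could refuse an incomplete report:

    # Minimal SBAR handoff record; all names and the example are illustrative.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class SbarReport:
        situation: str       # what is happening right now
        background: str      # relevant history and context
        assessment: str      # what the reporter thinks the problem is
        recommendation: str  # what the reporter wants done

        def __post_init__(self):
            # A structured format only helps if every field is filled in.
            for name, value in vars(self).items():
                if not value.strip():
                    raise ValueError(f"SBAR field '{name}' must not be empty")

    report = SbarReport(
        situation="Patient in bed 4 is hypotensive, BP 82/50.",
        background="Post-op day 1 after bowel resection; on IV fluids.",
        assessment="Possible internal bleeding.",
        recommendation="Please assess now; consider CBC and cross-match.",
    )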

Application in Manufacturing

In manufacturing, human factors concepts have been incorporated into quality control, process management, and safety management. Methods such as poka-yoke (error-proofing), visualization, standard work, and root cause analysis are widely used to prevent human errors and improve quality and safety; a small poka-yoke illustration follows the list below.

Examples of human factors countermeasures in manufacturing:

  • Work simplification: Breaking down complex work into understandable and executable procedures
  • Fail-safe design: Design that prevents accidents even if incorrect operations are performed
  • Fool-proof design: Design that prevents incorrect operations
  • Standard operating procedures: Reducing work variations and sharing best practices
  • Near-miss reporting systems: Early discovery of potential dangers and implementation of countermeasures
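
Poka-yoke works by making the wrong action impossible, or immediately visible, rather than relying on operator vigilance. A minimal sketch contrasting the two approaches; the part names and fixture logic are invented for illustration:

    # Illustrative poka-yoke; part names and fixture logic are invented.

    COMPATIBLE_PARTS = {"housing-A": {"gasket-A"}, "housing-B": {"gasket-B"}}

    def assemble_vigilance(housing: str, gasket: str) -> str:
        # Relies on the operator noticing a mismatch: error-prone.
        return f"assembled {housing} with {gasket}"

    def assemble_poka_yoke(housing: str, gasket: str) -> str:
        # The fixture itself rejects an incompatible combination,
        # so the error cannot propagate downstream.
        if gasket not in COMPATIBLE_PARTS.get(housing, set()):
            raise ValueError(f"{gasket} does not fit {housing}: line stopped")
        return f"assembled {housing} with {gasket}"

    print(assemble_poka_yoke("housing-A", "gasket-A"))    # succeeds
    # assemble_poka_yoke("housing-A", "gasket-B")         # raises at the source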

Application in Nuclear Industry

In the nuclear industry, human factors research advanced significantly following the Three Mile Island accident in 1979 and the Chernobyl accident in 1986. Human factors concepts have been widely incorporated into operator training, control room design, and emergency response procedures.

Application to Regulatory Fields: Lessons for the Pharmaceutical and Medical Device Industries

Human factors concepts are also extremely important in the regulatory field of pharmaceuticals and medical devices. Particularly in ensuring data integrity, operating quality systems, and manufacturing management, human errors can have serious consequences.

Data Integrity and Human Factors

Regulatory requirements such as FDA 21 CFR Part 11 (Electronic Records; Electronic Signatures), EU GMP Annex 11 (Computerised Systems), and PIC/S PI 041-1 (Good Practices for Data Management and Integrity in Regulated GMP/GDP Environments) demand various controls to ensure data reliability. However, these technical controls alone are insufficient; addressing human factors is indispensable.

Applying lessons from the China Airlines Flight 140 accident to the regulatory field:

Importance of Training

In the China Airlines Flight 140 accident, pilots’ insufficient understanding of the autopilot system’s operating principles contributed to the accident. Similarly, in the pharmaceutical and medical device fields, if employees do not accurately understand systems and procedures, data integrity violations and quality issues may result.

Application to regulatory fields:

  • Computerized system user training: Including not just correct system usage, but also the importance of data integrity and responses to erroneous operations
  • GMP training: Not mere memorization of rules, but understanding why rules exist and the consequences of violations
  • Regular retraining: Confirming knowledge retention and sharing the latest regulatory requirements

Clear Procedures and Checklists

In the aviation industry, checklists are widely used to prevent the omission of important procedures. Appropriate use of Standard Operating Procedures (SOPs) and checklists is equally important in the pharmaceutical and medical device fields; a small enforcement sketch follows the list below.

Application to regulatory fields:

  • Detailed SOPs for critical processes
  • Checklists for deviation management, CAPA (Corrective Action and Preventive Action), change control
  • Checklists for data review
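
A checklist prevents omissions only if completion is actually enforced. A minimal sketch of a deviation-closure gate that refuses to proceed while any item is unconfirmed; the item wording is invented:

    # Illustrative checklist gate; the item wording is invented.

    DEVIATION_CLOSURE_CHECKLIST = [
        "deviation described and classified",
        "impact on product quality assessed",
        "root cause analysis documented",
        "CAPA defined or justified as not required",
        "QA approval recorded",
    ]

    def close_deviation(confirmed: set[str]) -> None:
        missing = [item for item in DEVIATION_CLOSURE_CHECKLIST
                   if item not in confirmed]
        if missing:
            raise RuntimeError("cannot close deviation; unconfirmed: "
                               + "; ".join(missing))
        print("deviation closed")

    close_deviation(set(DEVIATION_CLOSURE_CHECKLIST))  # all items confirmed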

Ergonomic Considerations in System Design

In the China Airlines Flight 140 accident, the autopilot system design being counter-intuitive to pilots was problematic. In the pharmaceutical and medical device fields, ergonomic considerations in system and process design are also necessary.

Application to regulatory fields:

  • User-friendly interface design
  • Error-proof design: Preventing incorrect operations
  • Appropriate design of warnings and alerts: Ensuring important warnings are not overlooked
  • Automatic recording of audit trails: Preventing errors from manual recording
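
The last bullet deserves emphasis: the error-prone step disappears when the system, not the user, writes the audit entry. The sketch below illustrates the idea of automatic, append-only capture of who changed what, when, and why; it is not a 21 CFR Part 11 compliant implementation, and all names are our own:

    # Illustrative append-only audit trail; NOT a 21 CFR Part 11 implementation.
    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class AuditedRecord:
        values: dict = field(default_factory=dict)
        _trail: list = field(default_factory=list)

        def set(self, user: str, key: str, new_value, reason: str) -> None:
            # Every change is captured automatically: who, when, old, new, why.
            self._trail.append({
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "user": user,
                "field": key,
                "old": self.values.get(key),
                "new": new_value,
                "reason": reason,
            })
            self.values[key] = new_value

        def trail(self) -> list:
            return list(self._trail)  # a copy: past entries are never edited

    rec = AuditedRecord()
    rec.set("analyst1", "assay_result", 98.7, "initial entry")
    rec.set("analyst1", "assay_result", 99.1, "transcription error corrected")
    print(len(rec.trail()), "audit entries")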

Organizational Culture and Reporting Systems

A non-punitive reporting culture like aviation’s ASRS is also important in the pharmaceutical and medical device fields. Creating an environment where employees can easily report errors and deviations enables early problem detection and improvement.

Application to regulatory fields:

  • Non-punitive deviation reporting systems
  • Establishment of near-miss reporting systems
  • Examination of organizational factors in root cause analysis of quality issues
  • Fostering an open quality culture

Human Factors in Medical Device Design

FDA, ISO, and IEC emphasize human factors (usability) engineering in medical device design.

  • IEC 62366-1:2015+AMD1:2020 (Medical devices — Application of usability engineering to medical devices)
  • FDA Guidance: Applying Human Factors and Usability Engineering to Medical Devices (2016)

These standards and guidance require consideration of users from the design stage to minimize use errors and ensure patient safety. Lessons from the China Airlines Flight 140 accident can be applied to medical device design as follows:

  • Intuitive user interface design
  • Design to prevent erroneous operations (fool-proof)
  • Clear and understandable warning displays
  • Design appropriate to user training levels
  • Design considering the use environment (lighting, noise, stress, etc.)
  • Continuous improvement through formative and summative evaluations

Conclusion

The China Airlines Flight 140 crash on April 26, 1994, was a tragedy that claimed 264 precious lives. However, the lessons learned from this accident have contributed to safety improvements not only in the aviation industry but also in many industries including healthcare, manufacturing, nuclear power, and pharmaceutical/medical device regulation.

The fundamental concept of human factors is based on the premise that “humans are fallible beings,” and rather than completely eliminating errors, it involves constructing systems that coexist with errors while minimizing their impact. This requires appropriate training, clear procedures, ergonomically excellent design, effective teamwork, and fostering an organizational safety culture.

The China Airlines Flight 140 accident demonstrates that technological advancement alone cannot ensure safety, and highlights the importance of appropriately designing and managing human-machine interactions. In today’s computerized complex systems, human factors concepts are becoming increasingly important.

In the pharmaceutical and medical device regulatory field as well, human factors considerations are required in all aspects including data integrity, quality systems, manufacturing management, and device design. Beyond merely meeting regulatory expectations, it is important for entire organizations to engage in continuous improvement based on human factors principles to protect patient safety.

More than 30 years have passed since the China Airlines Flight 140 accident. We must never forget the lessons learned from this tragedy and continue our efforts toward realizing a safer society.

References and Sources

  1. Aircraft Accident Investigation Commission (1996). Aircraft Accident Investigation Report: China Airlines Flight 140 Airbus A300B4-622R, Nagoya Airport, April 26, 1994. Ministry of Transport, Japan.
  2. Japan Transport Safety Board. China Airlines Flight 140 Crash Overview. https://jtsb.mlit.go.jp/jtsb/aircraft/detail.php?id=851
  3. JAXA Aeronautical Technology Directorate. Human Factors Technology. https://www.aero.jaxa.jp/research/basic/flight/human/
  4. Japan Institute of Human Factors. Various Safety Seminar Courses. http://www.jihf.com/
  5. Medical Safety Promotion Network. CRM Training at Airlines. https://www.medsafe.net/
  6. International Civil Aviation Organization (ICAO). Safety Management Manual (SMM), Doc 9859, 4th Edition, 2018.
  7. Helmreich, R.L., Merritt, A.C., & Wilhelm, J.A. (1999). The Evolution of Crew Resource Management Training in Commercial Aviation. International Journal of Aviation Psychology, 9(1), 19-32.
  8. Reason, J. (1990). Human Error. Cambridge University Press.
  9. FDA. Guidance for Industry: Applying Human Factors and Usability Engineering to Medical Devices. 2016.
  10. IEC 62366-1:2015+AMD1:2020. Medical devices — Application of usability engineering to medical devices.

This article is written with deep condolences to the victims of the China Airlines Flight 140 crash and their bereaved families, with the hope that lessons learned from this tragedy will contribute to realizing a safer society.
