
Improving aircraft safety

Don Harris discusses the role of human error in air accidents and how aviation psychology has contributed to making flying as safe as possible

03 February 2014

At 23:42 (EST) on 29 December 1972, an Eastern Airlines Lockheed L1011 (Tristar) crashed into the Florida Everglades. The aircraft had only a minor technical failure (a blown bulb on the landing gear status display) but crashed because no one was actually flying it. All the crew were ‘head down’ trying to fix the problem. As the aircraft circled near Miami airport at 2000 feet, and despite an alarm sounding, none of them noticed that the autopilot had disconnected, causing the aircraft to enter a gentle descent.

The accident report concluded that ‘the probable cause of this accident was the failure of the flightcrew to monitor the flight instruments during the final four minutes of flight, and to detect an unexpected descent soon enough to prevent impact with the ground. Preoccupation with a malfunction of the nose landing gear position indicating system distracted the crew’s attention from the instruments and allowed the descent to go unnoticed’ (National Transportation Safety Board, 1973). Other accidents around this time highlighted instances of the captain doing all the work while other crew were almost unoccupied; dominant personalities suppressing teamwork and error checking; or simply poor crew cooperation, coordination and/or leadership.

The human factor
During the last 50 years the accident rate in commercial jet aircraft (excluding those manufactured in the USSR and former Soviet states) has declined sharply, from approximately 5.0 to just 0.35 accidents per million departures (Boeing Commercial Airplanes, 2013). As reliability and structural integrity have improved, the number of accidents resulting from engineering failures has fallen dramatically. Human error is now the principal threat to flight safety: it is estimated that up to 75 per cent of all aircraft accidents now have a major human factors component (Civil Aviation Authority, 2013).

The main focus of aviation psychology is to reduce human error throughout the system from the flight deck to the ground staff. But ‘human error’ is merely the beginning of an explanation. In the same way that aircraft accidents seldom have a solitary cause, human mistakes also rarely have a single underlying contributory factor. Error is the product of design, procedures, training and/or the environment, including the organisational environment (see Dekker, 2001b). As we shall see, it is an oversimplification to suggest that any accident is caused by ‘human error’ or ‘system failure’ alone.

The vastly increased use of aircraft during the Second World War was a major impetus for serious, coordinated work on aviation safety. Analysis of US Army Air Corps pilot losses had shown them to be equally distributed between three principal causes: about one third of pilots were lost in training crashes, one third in operational accidents and, perhaps surprisingly, only one third in combat (Office of Statistical Control, 1945). This suggested there were safety deficiencies inherent throughout the whole system of training and operation.

Early aviation psychology in the US focused on human engineering issues in aircraft design. Reviewing his work at the Aero Medical Laboratory in the early 1940s, the human factors pioneer Alphonse Chapanis (1999) described how, after landing certain types of aircraft, stressed and fatigued pilots occasionally retracted the undercarriage instead of the flaps. Chapanis observed that in aircraft where this occurred, the controls for the undercarriage and flaps were identical in shape and located next to each other. The remedy he proposed was simple: physically separate the controls and make their shape analogous to the function they control (e.g. a wheel shape for the undercarriage and an aerofoil shape for the flaps).

Another US researcher, Walter Grether, described the difficulties pilots had reading the early three-pointer altimeter (see Figure 1). Experiments found that this instrument took over seven seconds to interpret and produced reading errors of 1000 feet or more on 11 per cent of occasions (Grether, 1949). Most human performance research in the UK during this time was conducted at the Medical Research Council (MRC) Applied Psychology Unit in Cambridge. Here fundamental work was undertaken on issues such as the direction-of-motion relationships between controls and displays (Craik & Vince, 1945); the effects of prolonged periods of vigilance on performance, especially in radar operators (Mackworth, 1948); and pilot fatigue (Drew, 1940).

Even this early work demonstrated that ‘pilot error’ was not a sufficient explanation for the causes of many accidents. Pilots often fell into a trap left for them by the cockpit interfaces – what became known as ‘design-induced’ error.

Crew resource management
During the 1970s CFIT (Controlled Flight Into Terrain) accidents, such as the one described at the beginning of this article, began to dominate safety thinking in commercial aviation. CFIT accidents involve an airworthy aircraft, under the control of its crew, being flown into terrain; they invariably involve human error in some way. Accidents of this type resulted in many airlines instigating crew resource management (CRM) programmes that optimise the use of all the human resources on board the aircraft – not just the crew on the flight deck. CRM training is now mandatory for all commercial flight crew. Reading through the Civil Aviation Authority’s syllabus requirements for CRM (CAA, 2006) – including company safety culture, workload management, decision making, leadership and team behaviour – is like reading through the British Psychological Society’s syllabus for occupational psychology. In fact, the practice of aviation psychology is essentially a specialist application of occupational psychology.

The widespread uptake of CRM programmes produced a culture change in aviation. Early CRM approaches were predicated on avoiding error and focused on improving management style and interpersonal skills on the flight deck. Emphasis was placed upon improving communication, attitudes and leadership to enhance teamwork. The syllabus for second-generation CRM training built upon these concepts but added stress management, human error, decision making and group dynamics. Furthermore, CRM programmes began to train the whole aircraft crew together to encourage teamwork. Eventually CRM concepts permeated airline organisations as a whole, taking in further issues such as safety culture and national culture (aviation is a truly international industry). The real change in safety culture, however, occurred when CRM training per se actually started to disappear as its concepts were absorbed into all aspects of training and the development of procedures (see Helmreich, 1994; Pariés & Amalberti, 1995). Within aviation, CRM is now a ‘way of life’.

CRM programmes instigated an organisational culture change and have resulted in significant safety gains; their concepts form a key part of any airline-wide safety management programme. The modern view of CRM regards humans as fundamentally fallible, especially under stress, and considers error as part of the human condition: it is pervasive. Emphasis is not simply on attempting to eliminate error but on applying the error management troika: avoid errors; trap errors already made; and/or mitigate the consequences of those errors.

CRM concepts are now being adapted extensively for use in a wide range of other safety-critical domains outside the aerospace industry, such as patient safety (particularly in surgery), rail safety, nuclear sites and the offshore oil and gas industry. These aviation-inspired practices (often referred to as NOTECHS – non-technical skills) have had to be adapted to suit the environment and cultures of these new application domains. While the basic syllabus components have remained the same (e.g. team cooperation, leadership and managerial skills, situation awareness and decision making: van Avermaete, 1998), the manner in which they have been instantiated has been adapted to the new application areas. For example, in surgical applications teams are much bigger, with a great deal more specialisation; furthermore, members of the team may change halfway through lengthy operations or to accommodate new specialities. As Musson (2009) points out, unlike on the flight deck, in the operating theatre there are also subcultures at work (surgeons, anaesthetists, perfusionists, nurses) that can lead to interprofessional frictions.

Flying by the book
The omission of critical actions, or the undertaking of the wrong action, has been identified as the primary factor in many accidents: nearly one quarter of fatal approach-and-landing accidents involved such an event (Ashford, 1999). On the flight deck, to help reduce the likelihood of error, all aircraft are operated in a highly proceduralised manner. Standard operating procedures (SOPs) and checklists dominate daily life. Aircraft are flown on a ‘monitor and cross-monitor’ and ‘challenge and response’ basis: one pilot reads out checklist items and monitors the other pilot as they complete each task.

But this approach is not confined to the flight deck. Aircraft are handed over from dispatchers to pilots using highly structured checklists, and checklists are used to transfer critical information between outgoing and incoming shifts in the maintenance hangar to make sure nothing is omitted. Handovers in an air traffic control centre are undertaken in a similar way using acronyms such as ‘PRAWNS’ (Voller et al., 2005): Pressure (barometric); Runways in use; Airports and Airways; Weather briefing; Non-standard procedures in use and priority information; and Strips (aircraft movements). In this way no vital information is omitted – everything is coordinated. Nevertheless, accidents still happen as a result of not completing the required procedures. This may be for a variety of reasons, such as distractions on the flight deck, interruptions, abnormally high workload, incorrect management of priorities (poor CRM), poor checklist design or simply complacency. However, checklists and SOPs have been a further factor in the reduction of error.
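
To make the idea concrete, the sketch below is purely illustrative (it is not drawn from Voller et al.; the item wording and the Python representation are assumptions). It models a PRAWNS-style handover as a simple challenge-and-response checklist in which every item must be explicitly acknowledged before the position can be handed over.

```python
# Illustrative sketch only: a PRAWNS-style handover modelled as a
# challenge-and-response checklist. The item wording is an assumption
# based on the acronym described above.
PRAWNS_ITEMS = [
    "Pressure (barometric)",
    "Runways in use",
    "Airports and Airways",
    "Weather briefing",
    "Non-standard procedures in use and priority information",
    "Strips (aircraft movements)",
]

def outstanding_items(acknowledged: dict[str, bool]) -> list[str]:
    """Return the checklist items not yet acknowledged by the incoming controller."""
    return [item for item in PRAWNS_ITEMS if not acknowledged.get(item, False)]

# Example: one item has not been briefed, so the handover is flagged as incomplete.
briefed = {item: True for item in PRAWNS_ITEMS[:-1]}
missing = outstanding_items(briefed)
print("Handover complete" if not missing else f"Outstanding: {missing}")
```

The design mirrors the ‘challenge and response’ principle described above: the handover is complete only when the list of outstanding items is empty, so nothing can silently be omitted.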

In surgical settings such standardised processes or checklists were not commonplace until relatively recently, either in the operating theatre or when handing over patients to the post-operative care team. Implementation of the World Health Organization’s Surgical Safety Checklist has promoted better team communication and dynamics (essential parts of good CRM), with consequent improvements in patient safety in terms of perioperative morbidity and mortality. The Safe Surgery Saves Lives group investigated the impact of the WHO checklist in eight hospitals worldwide before and after its implementation: the overall death rate was reduced from 1.5 per cent to 0.8 per cent and inpatient complications dropped from 11 per cent to 7 per cent (Haynes et al., 2009).

Nevertheless, the introduction of such checklists was not straightforward. The surgical environment required an approach with more flexibility than that found on the flight deck (Walker et al., 2012): although every flight is different, there is far greater variation between surgical operations. Interestingly, this reaction against over-proceduralisation (and toward increased flexibility) in surgery is now reflected in the teachings of some airline training captains. An emerging mantra seems to be that checklists are ‘check’ lists, not ‘do’ lists. Emphasis is being placed once again on judgement and decision making rather than the slavish application of procedures, which cannot specify every circumstance and hence cannot guarantee safety (see Dekker, 2001a).

Modern safety challenges
An airline’s SMS (safety management system) provides the framework to integrate the majority of the work undertaken by human factors specialists in the aviation industry. An effective airline SMS is mandated under international law. The International Civil Aviation Organization’s approach to safety management is based around Jim Reason’s ‘Emmental (or Swiss) cheese’ model of accident causation (see Figure 2), which describes the contributions of organisational and psychological factors – including stress, poor training and poor scheduling – to the accident process (ICAO, 2009; Reason, 1997).
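
Reason’s model is often explained in probabilistic terms: an accident occurs only when the ‘holes’ in every defensive layer happen to line up at the same moment. The short simulation below is purely illustrative (it is not part of the ICAO or Reason material, the failure probabilities are invented and the layers are assumed to be independent), but it shows why each additional, imperfect barrier still cuts the overall accident rate so sharply.

```python
import random

def accident_occurs(layer_failure_probs: list[float]) -> bool:
    """An accident occurs only if every defensive layer fails on this occasion."""
    return all(random.random() < p for p in layer_failure_probs)

def estimate_accident_rate(layer_failure_probs: list[float], trials: int = 100_000) -> float:
    """Monte Carlo estimate of the proportion of occasions on which all layers fail."""
    failures = sum(accident_occurs(layer_failure_probs) for _ in range(trials))
    return failures / trials

# Three imperfect, independent barriers (say procedures, crew cross-checking and
# a warning system), each failing 10 per cent of the time, let an accident through
# on roughly 0.1 x 0.1 x 0.1 = 0.001 of occasions; a fourth barrier cuts that by
# another factor of ten.
print(estimate_accident_rate([0.1, 0.1, 0.1]))
print(estimate_accident_rate([0.1, 0.1, 0.1, 0.1]))
```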

Although there is increasing recognition of the importance of the human component in aviation safety, further work is required. The science base and regulations still lag behind changes in the nature of modern flight operations. For example, Reason’s model implicitly assumes a ‘semi-closed’ organisational system, typical of the nature of major organisations during the 1980s (when the model was developed). However, with the advent of low-cost operators the nature of the airline business has changed dramatically. Modern airlines are far more ‘open’ systems, with more outsourcing and subcontracting of functions. For example, they operate into a wide range of airports (none of which they own), and maintenance is often provided by third parties. Some low-cost carriers may not even own their aircraft, or employ their own ground and check-in personnel. In extreme cases, they don’t even employ their own pilots! And, as ever, there is the possibility of misinformation crossing organisational boundaries, since air traffic management and control are provided by the various national authorities of the countries into which (or over which) airlines fly.

Airline operations have also become more integrated. For example, the turn-around process on the gate involves the airport operator, the airline, air traffic control, ground handling, catering, fuelling and cleaning contractors, and a Central Flow Management Unit, to name but a few. Commercial air transport as a whole is actually a ‘system of systems’ (Harris & Stanton, 2010). Maier (1998) characterised a ‘system of systems’ as possessing five basic traits: operational independence of elements; managerial independence of elements; evolutionary development; emergent behaviour; and geographical distribution of elements. In the context of aviation, aircraft operations, maintenance and air traffic management/control all have distinct operational and managerial independence (they are offered by independent companies or national providers). While they are bound by a set of common operating principles and international regulations, there are now many more inter-organisational boundaries that information and resources must cross compared with 30 years ago. Accidents in civil aviation are now often characterised by errors propagating across organisational (and system) boundaries (Harris & Li, 2011). This can be seen in the Air Inter A320 accident described in the box below. Furthermore, the person making the final error may not be one of the victims of the accident. Preventing accidents like these is a challenge: safety management now has to extend beyond the immediate organisation.

As commercial aviation is a ‘system-of-systems’, aviation psychology must respond with a similar systemic approach. There needs to be greater integration between the various subdisciplines – selection, training, equipment design and organisational pressures do not exist in isolation. They combine to contribute to accidents so they should be tackled in an integrated manner (Harris, 2011).

Beyond safety
Despite the net safety gains aviation psychology has contributed, in some eyes it has come to be seen almost as a ‘hygiene factor’: consuming money without ‘adding value’. There are opportunities for aviation psychology to make a positive impact on the financial bottom line of airlines, but to truly enhance operating efficiency the human part of any system cannot be examined in isolation from all the other components – a wider, socio-technical perspective must be adopted (Harris, 2006). By taking an integrated, long-term approach to tracking human-related costs and safety issues, significant and wide-ranging benefits will accrue.

First, though, a change of attitude is required – one in which cost savings and operational efficiency are regarded as objectives just as acceptable as safety. In mainstream occupational psychology improving business efficiency is an acceptable objective for research and consultancy, yet the vast majority of aviation psychology is aimed at improving safety. The subdiscipline can make significant contributions to improving the ‘bottom line’ in airlines but is rarely used to do so; it tends to confine (and define) itself almost solely within the remit of improving safety.

So, despite the advances in safety that it has helped deliver, aviation psychology needs to avoid its natural inclination to define itself solely within this province. While increasing specialisation in areas such as human-centred design, selection, training and error has served to develop the science base, this fragmentation has impeded a coherent, systemic application of psychology in commercial aviation. The field has come of age, but it must coalesce if the maximum benefit of an integrated, long-term approach is to be realised. The opportunity now exists to capitalise on the progress made by this relatively new subdiscipline of psychology.

Box text

The manifold roots of error
The crash of Air Inter Flight 148, an Airbus A320, at Mont Sainte-Odile, near Strasbourg, in 1992 was at least partly attributed to an error prompted by the design of one aspect of the flight deck (Bureau d’Enquêtes et d’Analyses pour la Sécurité de l’Aviation Civile, 1992). The primary cause of the accident was that the crew selected the wrong mode on the flight management and guidance system – ‘vertical speed’ mode instead of ‘flight path angle’ mode. During the approach they entered ‘33’ in the autoflight system, intending a 3.3° descent angle (corresponding to about 800 feet per minute), but because the aircraft was in the wrong mode this input commanded a descent rate of 3300 feet per minute. Crucially, the read-outs for both vertical speed and flight path angle appeared on the same shared digital display, the only distinction being the inclusion of a decimal point in ‘flight path angle’ mode. The aircraft subsequently flew into a ridge at approximately 2700 feet, around eight miles short of the airport.
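
A rough calculation shows why the same two digits produce such different outcomes. Assuming a typical approach ground speed of about 140 knots (roughly 14,200 feet per minute; an assumed figure for illustration, not one taken from the accident report), the descent rate that follows from a 3.3° flight path angle is approximately:

```latex
\dot{h} \approx V_g \tan(3.3^{\circ}) \approx 14\,200 \times 0.058 \approx 820\ \text{ft/min}
```

In ‘vertical speed’ mode, however, the same ‘33’ entry is read directly as 3300 feet per minute, roughly four times the intended rate of descent.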

But this is not the whole story. There were other human factors at play in the run-up to the accident. The crew had difficulty in establishing their exact position as a result of incorrect information from air traffic control. This caused high workload on the flight deck as they tried to re-align the aircraft prior to commencing the final stages of the approach. The high rate of descent should still have been detected by the crew, but was missed because the CRM on the flight deck was poor, and in particular there was a significant lack of checks and cross-checks on the aircraft’s position. There was only minimal communication recorded between crew members. Adding to the problems, the accident occurred at night and in poor weather (low cloud and light snow).

Finally, the airline had also decided not to equip the aircraft with a ground proximity warning system, which would have alerted the crew prior to impact with the terrain. Air Inter operated in several mountainous areas, and it was felt that such systems gave too many nuisance warnings. In short, no single human error or problem caused this accident, but a single appropriate intervention on the part of the flight crew (or air traffic control) might have avoided it.

This is the nature of safety management. Install multiple barriers to stop or trap error: you only need one of them to hold.

Don Harris
Human Systems Integration Group, Faculty of Engineering and Computing, Coventry University
[email protected]

References

Ashford, R. (1999). Study of fatal approach and landing accidents worldwide, 1980-96. Flight Safety Digest (November 1998–February 1999). Alexandria, VA: Flight Safety Foundation.
Boeing Commercial Airplanes (2013). Statistical summary of commercial jet airplane accidents (worldwide operations 1959–2012). Seattle, WA: Author.
Bureau d’Enquêtes et d’Analyses pour la Sécurité de l’Aviation Civile (1992). Rapport de la commission d’enquête sur l’accident survenu le 20 janvier 1992 près du Mont Sainte-Odile (Bas Rhin) à l’Airbus A 320 immatriculé F-GGED exploité par la compagnie Air Inter (F-ED920120). Le Bourget: Author.
Chapanis, A. (1999). The Chapanis Chronicles: 50 years of human factors research, education, and design. Santa Barbara, CA: Aegean Publishing.
Civil Aviation Authority (2006). Crew resource management (CRM) training. London: Author.
Civil Aviation Authority (2013). Global fatal accident review 2002–2011 (CAP 1036). London: Author.
Craik, K.J.W. & Vince, M.A. (1945). A note on the design and manipulation of instrument-knobs (Applied Psychology Laboratory Report). Cambridge: Cambridge University.
Dekker, S.W.A. (2001a). Follow the procedure or survive. Human Factors and Aerospace Safety, 1, 381–385.
Dekker, S.W.A. (2001b). The re-invention of human error. Human Factors and Aerospace Safety, 1, 247–266.
Drew, G.C. (1940). An experimental study of mental fatigue (Flying Personnel Research Committee Memorandum No. 227). London: British Air Ministry, Flying Personnel Research Committee.
Grether, W.F. (1949). The design of long-scale indicators for speed and accuracy of quantitative reading. Journal of Applied Psychology, 33, 363–372.
Harris, D. (2006). The influence of human factors on operational efficiency. Aircraft Engineering and Aerospace Technology, 78(1), 20–25.
Harris, D. (2011). Human performance on the flight deck. Aldershot: Ashgate.
Harris, D. & Li, W-C. (2011). An extension of the human factors analysis and classification system (HFACS) for use in open systems. Theoretical Issues in Ergonomic Science, 12(2), 108–128.
Harris, D. & Stanton, N.A. (2010). Aviation as a system of systems. Ergonomics, 53(2), 145–148.
Haynes, A.B., Weiser, T.G., Berry, W.R. et al. (2009). A surgical safety checklist to reduce morbidity and mortality in a global population. New England Journal of Medicine, 360, 491–499.
Helmreich, R.L. (1994). The anatomy of a system accident: The crash of Avianca Flight 052. International Journal of Aviation Psychology, 4, 265–284.
International Civil Aviation Organization (2009). Safety management manual (2nd edn). ICAO Doc 9859. Montreal: Author.
Mackworth, N.H. (1948). The breakdown of vigilance during prolonged visual search. Quarterly Journal of Experimental Psychology, 1, 6–21.
Maier, M.W. (1998). Architecting principles for system of systems. Systems Engineering, 1, 267–284.
Musson, D. (2009). Putting behavioural markers to work: Developing and evaluating safety training in healthcare settings. In R. Flin & L. Mitchell (Eds.) Safer surgery: Analysing behaviour in the operating theatre (pp.423–435). Aldershot: Ashgate.
National Transportation Safety Board (1973). Aircraft accident report, Eastern Air Lines, Inc., Miami, Florida, December 29, 1972, L-1011, N310EA (Report No. NTSB-AAR-73-14). Washington, DC: Author.
Office of Statistical Control (1945). Army Air Forces statistical digest – World War II. Available from the Air Force Historical Research Agency (www.usaaf.net/digest).
Pariés, J. & Amalberti, R. (1995). Recent trends in aviation safety: From individuals to organisational resources management training. Risøe National Laboratory Systems Analysis Department Technical Report, Series 1 (pp.216–228). Roskilde, Denmark: Risøe National Laboratory.
Reason, J.T. (1997). Managing the risks of organizational accidents. Aldershot: Ashgate.
van Avermaete, J.A.G. (1998). NOTECHS: Non-technical skill evaluation in JAR-FCL. NLR-TP-98518. Amsterdam: National Aerospace Laboratory.
Voller, L., Glasgow, L., Heath, N. et al. (2005). Development and implementation of a position hand-over checklist and best practice process for air traffic controllers. In B. Kirwan, M. Rodgers & D. Shaefer (Eds.) Human factors impacts in air traffic management (pp.25–42). Aldershot: Ashgate.
Walker, I.A., Reshanwalla, S. & Wilson, I.H. (2012). Surgical safety checklists: Do they improve outcomes? British Journal of Anaesthesia, 109(1), 47–54.