
Chapter 5

Flight Crew Decision-Making

Judith M. Orasanu

NASA-Ames Research Center, Moffett Field, CA 94035-1000

Crew Resource Management

The contents of this chapter are held in the Public Domain.

Introduction

Flight crews make decisions all the time, from the captain’s acceptance of the aircraft and

flight plan prior to departure to docking at the gate after landing. Unfortunately, the

decisions that get the most attention are those that result in disasters, for example the

decision to take off with snow and ice on the plane after the de-ice time had expired at

Washington National Airport (NTSB, 1982), or the decision to take off without being

sure the runway was clear of traffic in heavy fog at Tenerife, the Canary Islands (Dutch

Aircraft Accident Inquiry Board, 1979). However, commercial aviation remains an

incredibly safe mode of transportation, in good part due to the skills and judgment of its

pilots.

While an industry-wide analysis showed that over 70% of aviation accidents

resulted from crew coordination or communication problems (as opposed to lack of

individual technical skills, Lautman & Gallimore, 1987), in an analysis of 37 aircraft

accidents between 1978 and 1990 in which flight crew behavior contributed to the

accident, the National Transportation Safety Board (NTSB, 1994) found that 25

involved what the Board considered ‘‘tactical decision errors.’’ Line Operations

Safety Audits (LOSA) involving observations of crew behavior during ‘‘normal’’

flights found that decision errors were among the least likely type of error to

occur (~6%), but were more likely (along with proficiency errors) to become

consequential (e.g. result in a hazardous aircraft state) than other types of errors

(Klinect et al., 1999). Hence, maintaining safe flight operations depends on assuring

effective crew decision-making, especially under threatening conditions (Helmreich

et al., 2001).

Because decision-making takes mental energy and because a large body of research

suggests that people do not always make optimal decisions, aircraft designers, carriers and

the FAA try to simplify crew decision-making by establishing standard procedures and

checklists to cover anticipated failures or emergencies (Billings, 1991; Wiener, 1988),

and through crew training and automated systems. However, poor decisions may occur

even when problem situations are fairly straightforward because of conditions that

increase risk, such as high workload, weather or heavy traffic. In other cases, simple

problems cascade or interact, precluding ‘‘by the book’’ solutions. In still rarer cases,

completely unforeseen catastrophic problems arise, like the loss of all hydraulic systems

due to an engine explosion (NTSB, 1990). Given the impossibility of designing

error-proof or fully automated systems that can cope with any and all emergencies, the

last line of defense is the flight crew. Thus, the bottom line is: How can flight crews be

trained and supported to make the best decisions possible, especially under challenging

high-risk conditions? To address this question, we first must ask: How do crews typically


make decisions? What constitutes effective decision-making? What factors make

decisions difficult and contribute to poor decisions?

The short answer to these questions is the following: crew decision-making is not

one thing. Crews make many different kinds of decisions, but all involve situation

assessment and choice of a course of action that satisfies goals while managing risks.

However, decisions differ in the degree to which they call upon different types of

cognitive processes. A decision to abort a takeoff requires different decision processes

from choosing an alternate airport for landing with a system failure or determining the

cause of a master caution warning light. The nature of the processes involved in

a decision depends on the structure of the decision task and the conditions surrounding

it. How familiar is the problem? Is a response prescribed or must it be developed? How

much time is available? Given the variety of decisions that are made routinely in the

flight deck no single approach can be prescribed. No silver bullet exists to make crews

better decision-makers. The long answer to the above questions is the subject of this

chapter.

Since the original version of this chapter appeared in 1993, the concept of

decision-making in the cockpit has evolved, deepening its roots in the naturalistic

decision-making (NDM) framework and growing toward the goal of threat and error

management (TEM). Perhaps the biggest change in aviation decision-making (ADM)

over the past 16 years is the concept of ADM as risk management. This shift emphasizes

the importance of risk perception and risk assessment as essential components of

effective decision-making. It also aligns error detection and correction with ADM, in

keeping with the TEM framework (Helmreich, 2002).

In this chapter the term ADM refers to decision-making by a flight crew, usually

consisting of two members, but also includes solo pilots of varying skill levels and aircraft

capabilities.

This chapter is organized as follows: first, the processes by which flight crews make

decisions are described; then we address factors that contribute to decision difficulty

and poor decisions, followed by factors that provide crew resilience in the face of high-

risk challenges. The final section explores strategies for improving crew decision-

making.

5.1. Aviation Decision-Making

Aviation decision-making is viewed in this chapter as a form of ‘‘naturalistic

decision-making’’ (NDM) (Klein et al., 1993). Naturalistic decision-making focuses

on understanding how people with domain expertise use their knowledge to make

decisions, typically in safety-critical environments (Cannon-Bowers et al., 1996;


Zsambok & Klein, 1997). Certain features characterize naturalistic decision-making

and distinguish it from classical analytical decision-making (Lipshitz et al., 2001).

According to Lipshitz et al., the essential characteristics of the classical approach are: (1) choice

(conceptualizing decision-making as choosing among concurrently available

alternatives), (2) input–output orientation (focusing on predicting which alternative

will, or should be, chosen given a decision-maker’s preferences), (3) comprehensiveness

(conceptualizing decision-making as a deliberate and analytic process that requires

a relatively thorough information search, particularly for optimal performance), and

(4) formalism (the development of abstract, context-free models amenable to

quantitative testing).

In contrast, in the NDM approach emphasis is placed on situation assessment prior to

choosing a course of action; a process orientation replaces the input–output orientation;

satisficing (i.e. achieving a ‘‘good enough’’ rather than optimal solution) replaces

comprehensiveness and optimality; and context-sensitive informal models replace

formalism (Lipshitz et al., 2001). These shifts stem from the fact that human information

processing limitations1 preclude exhaustive information search and simultaneous

comparison of multiple options. Moreover, in many high-risk consequential

environments, time for making a decision is limited, information is incomplete,

conditions change dynamically, and goals shift, rendering analytic decision-making

impractical, if not impossible.

In NDM, decisions are integral to a task and are made in order to achieve operational

goals, such as transporting passengers to their destinations while managing any threats to

safety. The decision-maker’s knowledge, often acquired through many years of training

and experience, plays a key role in the decision process. Knowledge is the basis for

recognizing situations that require decisions to be made, assessing the type and degree of

threat present, determining what information is relevant to the decision, and deciding

on an appropriate course of action. Also, team members, when present, expand the

cognitive resources and help to overcome potential limitations of a single decision-

maker.

5.1.1. Theoretical Foundations

Several naturalistic decision models contributed to the ADM model described in this

chapter. Four were especially influential: Klein’s (1993a) Recognition-Primed Decision

(RPD) model, Hammond’s Cognitive Continuum theory (Hammond et al., 1987),

1 Simon (1991) has termed this ‘‘bounded rationality.’’


Rasmussen’s (1993) Levels of Cognitive Control model, and Cohen’s (Cohen et al.,

1996) Recognition/Metacognition model. Each model contributed in the following

ways to the aviation decision model.

Klein: Recognition-Primed Decision-Making

Klein’s notion of how domain-specific knowledge guides recognition of situations and

retrieval of appropriate responses is the foundation for many aviation decisions,

especially those made under time pressure. Schema-based knowledge provides the link

between recognizable situation patterns and actions that have worked in the past under

similar conditions; this knowledge also is the basis for mentally simulating outcomes of

the actions.

Hammond: Cognitive Continuum Theory

This theory asserts that decisions vary from intuitive to analytic depending on the nature

of the situations in which they are made. In intuitive decisions, people rely on pattern

matching; in analytical decisions, they use more thorough evaluation processes.

Accompanying this continuum is a task continuum that reflects the nature of the cues

available to the decision-maker. The amount and type of data determine where

a problem sits on the continuum. According to Hammond, good decisions depend on

correspondence between the decision strategy and the task: an analytical strategy applied

to non-engineered cue patterns is inefficient, whereas an intuitive strategy applied to

numeric data may be suboptimal.

Rasmussen: Levels of Cognitive Control

Rasmussen’s (1985, 1993) Cognitive Control model distinguishes between skill-based,

rule-based and knowledge-based behaviors. Skill-based behaviors are highly practiced

and exercised automatically in response to changes in cue input. Rule-based behaviors

involve conscious action, but typically are codified in standard procedures. Knowledge-

based behaviors involve deliberate thinking that is needed when skill- and rule-based

guidance is lacking. Analytic or choice decisions and creative problem-solving fall into

this category.

Cohen: Recognition/Metacognition Model

Some form of control process is needed to manage the shifts along a cognitive

continuum, levels of cognitive control and types of recognitional processes. Cohen et al.

(1996) formalized a critical thinking model that combined recognitional and


metacognitive processes. The recognition component involves a quick test to assess

situation familiarity and problems with the retrieved solution, time available to make

a decision and the stakes if an error is made. If no problems are found, then the RPD-

elicited response is adopted. But if stakes are high, time is available and the problem is

atypical, then the metacognitive component is invoked. It provides a basis for critiquing

and correcting the problem definition and solution until the decision-maker is confident

of the best interpretation and solution. Cohen’s quick test for problem familiarity, time

and stakes maps closely to the situation assessment processes in the ADM model depicted

in Figure 5.1.
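To make the control flow concrete, here is a minimal Python sketch of the quick test and metacognitive hand-off described above; the function names, parameters, and branching details are illustrative assumptions rather than Cohen et al.’s formal model.

```python
# A minimal sketch of the quick test described above. All names and the
# branching details are assumptions for illustration only.

def critique_and_correct(candidate: str) -> str:
    """Stand-in for the metacognitive loop: critique and correct the
    problem definition and solution until confident (elided here)."""
    return candidate

def quick_test(problem_familiar: bool, time_available: bool,
               stakes_high: bool, rpd_response: str) -> str:
    # Typical situation, low stakes: adopt the RPD-elicited response as-is.
    if problem_familiar and not stakes_high:
        return rpd_response
    # Atypical problem, high stakes, time in hand: invoke metacognition.
    if stakes_high and time_available and not problem_familiar:
        return critique_and_correct(rpd_response)
    # Otherwise (e.g., high stakes but no time): act on the best response.
    return rpd_response

print(quick_test(problem_familiar=False, time_available=True,
                 stakes_high=True, rpd_response="divert to alternate"))
```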

Figure 5.1 Schematic of aviation decision process model in response to an environmental threat. Decision processes are shown as a function of environmental conditions pertaining to cue diagnosticity and familiarity, time pressure, and risk, as well as response affordances: availability and familiarity of response options.

[Figure 5.1 elements: a CUE Pattern / Threat Event feeds Situation Assessment (What’s the problem? How much time to resolve? How severe are potential consequences?), leading to a Course of Action. Depending on conditions and affordances: a familiar or understood problem with a condition-action rule available leads to Apply Rule; multiple options available leads to Choose Option; no option available leads to Create Novel Solution; a problem not understood, with time available and risk high or low, leads to Gather More Information, while limited time and high risk force action without full diagnosis.]


5.1.2. The Role of Expertise

Expertise contributes to cockpit decision-making in three ways. First, expert knowledge

facilitates rapid and accurate perception of cues and interpretation of problems

(Cannon-Bowers et al., 1990). Second, this knowledge includes stored condition–action

patterns, the basis for recognition-primed decisions (Klein, 1989, 1993). Third, expert

knowledge provides a basis for risk assessment and for estimating the likelihood of

occurrence of various kinds of problems.

Knowledge is not a shield against errors. Expertise is the foundation for heuristics,

which sometimes result in poor judgments. Deep knowledge is responsible for efficient

functioning most of the time, but occasionally leads one astray.

Expert knowledge confers an advantage primarily for problems that are meaningful

within the expert’s domain (Klein, 1998). Chess masters show remarkable memory for

the location of chess pieces that represent positions during play, a basis for strategic

moves (Chase & Simon, 1973). But if those same pieces are placed randomly on the

chessboard, the masters’ recall is no better than the novice’s.

5.1.3. Aviation Decision Process Model

The aviation decision-making model described in Orasanu (1993) still holds, with some

minor adjustments as shown in Figure 5.1. Using NDM research methods, Orasanu and

Fischer (1997) analyzed aviation incident reports (from the NASA Aviation Safety

Reporting System, ASRS), National Transportation Safety Board (NTSB) accident

reports and observations of flight crews in complex flight simulations to develop an

aviation decision process model. The model involves two major components: situation

assessment (SA) and choosing a course of action (CoA).
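As a rough illustration of how the two components fit together, the following Python sketch mirrors the branching structure of Figure 5.1; the dataclass fields and the returned labels are assumptions for illustration, not notation from the model itself.

```python
# A minimal sketch of the two-component model (situation assessment, then
# course-of-action selection), following the structure of Figure 5.1.
# Field names and returned labels are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class SituationAssessment:
    problem_understood: bool   # What's the problem?
    time_limited: bool         # How much time to resolve?
    risk_high: bool            # How severe are potential consequences?

def choose_course_of_action(sa: SituationAssessment,
                            rule_available: bool,
                            options_available: bool) -> str:
    if not sa.problem_understood:
        if sa.time_limited and sa.risk_high:
            # Act without thorough understanding of the problem.
            return "manage procedurally as an emergency"
        return "gather more information, then reassess"
    if rule_available:
        return "apply condition-action rule"
    if options_available:
        return "choose among available options"
    return "create a novel solution"

# Example: familiar problem with a prescribed procedure.
sa = SituationAssessment(problem_understood=True, time_limited=True, risk_high=True)
print(choose_course_of_action(sa, rule_available=True, options_available=False))
```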

Situation Assessment

Aviation decisions typically are prompted by off-nominal or changed conditions that

require adjustment of the planned course of action. Situation assessment involves

defining the problem, assessing the level of risk associated with it and determining the

amount of time available for solving it. Available time appears to be a major determinant

of subsequent strategies. If the situation is not understood, diagnostic actions may be

taken, but only if sufficient time is available. External time pressures may be mitigated by

crews to support information search and problem solution (Orasanu & Strauch, 1994),

e.g. crews may buy time through holding, or reprioritize tasks to reduce workload


(cf. Wickens & Raby, 1991). If risk is high and time is limited, action may be taken

without thorough understanding of the problem.

Risk Assessment

Perhaps the most significant elaboration of the ADM model since 1993 is increased

attention to the risk assessment component of situation assessment. In order to manage

risk, threats must be perceived and accurately assessed. Risk includes two components:

the likelihood of a threat and the severity of its potential consequences. The two vary

independently, although in aviation, events with the most severe consequences typically

occur quite infrequently. These include engine failures, fires and other emergencies.

Other threats are more common, such as traffic, weather and schedule delays, but

strategies for their management are more accessible.

While relatively little research has been conducted on risk perception in aviation (but

see O’Hare and Smitheram, 1995), Fischer et al. (2003) conducted a study that clearly

demonstrated the role of risk perception in decision-making. Using a think-aloud

protocol with dynamically evolving ambiguous flight situations, the investigators

analyzed the information pilots requested, their evaluation of conditions in terms of risk,

sensitivity to ambiguity, time available and goal conflicts, and ultimate decisions and risk

management strategies. Findings showed that decisions (either accepting or avoiding

risks) depended on how the pilots perceived the ambiguous conditions. If they felt

conditions were not too severe, they were willing to take a risk, but if the risk level

passed a threshold, they avoided the risk.
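A toy numeric sketch can make the two-component view of risk concrete. The multiplicative combination, the 0-1 scales, and the threshold value below are assumptions for illustration; they are not the model used by Fischer et al. (2003).

```python
# Illustrative only: a toy risk score combining the two components named
# above (likelihood of a threat, severity of its consequences) with an
# accept/avoid threshold. Form, scales, and cutoff are assumptions.

def risk_score(likelihood: float, severity: float) -> float:
    """Both inputs on a 0-1 scale; higher means more likely / more severe."""
    return likelihood * severity

ACCEPTANCE_THRESHOLD = 0.25  # assumed cutoff

def risk_decision(likelihood: float, severity: float) -> str:
    if risk_score(likelihood, severity) > ACCEPTANCE_THRESHOLD:
        return "avoid risk (e.g., divert)"
    return "accept risk (e.g., continue)"

print(risk_decision(likelihood=0.6, severity=0.7))  # 0.42 -> avoid
print(risk_decision(likelihood=0.3, severity=0.4))  # 0.12 -> accept
```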

Choosing a Course of Action

After the problem is defined and the conditions are assessed, a course of action is chosen

based on the structure of the options present in the situation. Building on Rasmussen

(1985), Orasanu and Fischer (1997) specified three types of response structures: rule-based,

choice and creative. All involve application of knowledge but vary in the degree to which

the response is determined by the situation. What constitutes an appropriate course of

action depends on the affordances of the situation. Sometimes a single response is

prescribed in company manuals or procedures. At other times, multiple options exist from

which one must be selected. On some rare occasions, no option is readily available and the

crew must invent a course of action. In order to deal appropriately with the situation, the

decision-maker must be aware of what constitutes a situationally appropriate process.

While various types of decisions can be distinguished for analytical purposes, in

practice any given flight situation may require the use of several different decision

strategies. Making one decision or taking the prescribed action may present a new set of


conditions requiring a different type of decision. To an observer, these may appear as

a smooth flow of action, although decisions are hidden behind the actions.

5.1.4. Situational Constraints and Affordances in Choosing a Course of Action

Rule-based or RPD decisions

Single-response situations correspond to Klein’s (1989, 1993a) recognition-primed

decisions and Rasmussen’s (1985, 1993) rule-based actions. Single option cases are the

simplest decisions because they require the least cognitive work.

In many high-risk and time pressured situations, such as smoke in the cockpit, an

engine stall, or rapid decompression, an action is prescribed in response to specific

situation cues. These and other abnormal situations are deemed to be sufficiently

consequential that procedures are specified to reduce the crew’s need to invent solutions

for the problem. The primary decision is whether any circumstances suggest that the

pre-defined response should not be implemented.

Sometimes an action may be planned or in process and a stimulus triggers a decision

to terminate the action, a go/no go decision. Stimulus conditions that elicit this response

may be quite diverse. For example, a rejected takeoff may be triggered by an explosive

engine failure, cargo door light, runway traffic, compressor stall, or engine overheat

lights (also see Chamberlin, 1991). Likewise, a missed approach (a decision to

terminate a landing) may be triggered by inability to see the runway at decision height,

by air or ground traffic, autopilot disengagement, or an unstable approach.

All rule-based decisions involve risk assessment, particularly when ground speed or

altitude is near a decision threshold. Certain conditions, like a wet runway or system

malfunction that result in poor braking, will complicate the decision.
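The condition-action structure of such go/no-go decisions can be sketched as a simple rule table. The Python below is illustrative only: the trigger set is taken from the examples above, and the V1 speed gate is a simplified stand-in for the decision threshold; it is in no sense an operational procedure.

```python
# A sketch of condition-action pairing for a go/no-go decision, using the
# rejected-takeoff trigger cues listed in the text. Illustrative only.

REJECT_TAKEOFF_TRIGGERS = {
    "explosive engine failure",
    "cargo door light",
    "runway traffic",
    "compressor stall",
    "engine overheat light",
}

def takeoff_decision(observed_cues: set, speed_kt: float, v1_kt: float) -> str:
    # Past the V1 decision speed, rejecting is normally no longer viable.
    if speed_kt >= v1_kt:
        return "continue takeoff"
    # Any single trigger cue elicits the prescribed response.
    if observed_cues & REJECT_TAKEOFF_TRIGGERS:
        return "reject takeoff"
    return "continue takeoff"

print(takeoff_decision({"cargo door light"}, speed_kt=110, v1_kt=140))  # reject
print(takeoff_decision({"cargo door light"}, speed_kt=145, v1_kt=140))  # continue
```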

Decision-making in aviation differs from many other high-consequence domains

such as medicine or fire-fighting in that many rule-based aviation decisions are codified in

FAA regulations or company operations manuals. In other domains the basis for

condition–action pairings is the decision-maker’s experience, which builds deep domain

knowledge. Klein (1989, 1998) found that experienced decision-makers recognize a cue

configuration as signaling a particular type of problem and then generate an appropriate

response based on previous experience with similar problems. Once a response option

has been retrieved, it is evaluated by mentally simulating its consequences to determine

whether the response will satisfy the decision-maker’s goals. If so, the action is accepted.

If not, another option is generated and evaluated, or the situation is reassessed.
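This serial generate-and-evaluate cycle can be sketched in a few lines of Python; the option source and the goal-satisfaction predicate standing in for mental simulation are assumptions for illustration.

```python
# A sketch of the serial generate-and-evaluate cycle described above:
# options are considered one at a time and screened by mental simulation;
# the first satisfactory one is accepted. The predicate is a stand-in.

from typing import Callable, Iterable, Optional

def rpd_cycle(options: Iterable[str],
              satisfies_goals: Callable[[str], bool]) -> Optional[str]:
    """Return the first option whose simulated outcome meets the goals;
    None signals that the situation should be reassessed."""
    for option in options:
        if satisfies_goals(option):  # mental simulation of consequences
            return option
    return None

# Example: the first retrieved option fails simulation; the next succeeds.
acceptable = {"continue to destination": False, "divert to alternate": True}
print(rpd_cycle(["continue to destination", "divert to alternate"],
                acceptable.get))
```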


Multiple-option decisions

Some flight decisions involve choice from among alternatives present in the situation.

These choices map most closely onto our everyday notion of decision-making. For

example, a crew may need to select an alternate landing site in response to an on-board

medical emergency in bad weather. Landing alternates are prescribed in the flight plan if

weather conditions at the destination deteriorate, and procedure manuals provide

guidance on how to deal with medical emergencies. However, weather conditions may

be deteriorating at the nearest airport that offers appropriate medical facilities, and

precious time may be required to reach a different airport. In this case, the crew needs to

weigh the risks of trying to land in borderline weather conditions versus the possible

danger to the passenger of flying to a more distant airport.

Strategies used by crews to select from among alternatives vary, but observations to

date (Klein, 1993a; Orasanu, 1993) suggest that they do not correspond to a full

analytical procedure. A full analysis would involve evaluation of each possible option in

terms of every variable relevant to the decision (e.g. weather, fuel consumption, runway

length), and a mathematical formula would be used to combine all the information to

yield the optimal choice. In fact, crews appear to make decisions in the most economical

way, taking short-cuts in this process. They work toward a suitable (but not necessarily

the best) decision in the shortest time, investing the least possible cognitive work.

Options often are eliminated on the basis of one feature, such as weather, and are out

of the running thereafter, unless no suitable alternate can be found and the process must

be reopened. This is essentially an elimination by aspects strategy (Tversky, 1972).

However, if a few candidates are available, one is chosen to match the constraints of the

circumstances and the crew’s goals and perceived risks. Usually, the most safety-critical

constraint prevails. However, organizational policy also plays an important role. The

crew may choose an alternate that has a company maintenance facility or where

replacement planes will be available for passengers to continue their flight.
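A minimal sketch of this elimination-by-aspects short-cut, contrasted with scoring every option on every variable, might look like the following; the airports, aspects, and pass/fail values are invented for illustration.

```python
# A sketch of elimination by aspects (Tversky, 1972) as described above:
# options are screened one aspect at a time, most safety-critical first,
# rather than scored on every variable and combined mathematically.
# Airport names, aspects, and values are illustrative assumptions.

ALTERNATES = {
    "ALPHA":   {"weather_ok": False, "runway_ok": True,  "fuel_ok": True},
    "BRAVO":   {"weather_ok": True,  "runway_ok": True,  "fuel_ok": True},
    "CHARLIE": {"weather_ok": True,  "runway_ok": False, "fuel_ok": True},
}

def eliminate_by_aspects(options: dict, aspects: list) -> list:
    """Drop any option failing the current aspect; stop when one remains."""
    remaining = dict(options)
    for aspect in aspects:
        remaining = {name: attrs for name, attrs in remaining.items()
                     if attrs[aspect]}
        if len(remaining) <= 1:
            break
    return list(remaining)

# Weather eliminates ALPHA; runway length eliminates CHARLIE; BRAVO remains.
print(eliminate_by_aspects(ALTERNATES, ["weather_ok", "runway_ok", "fuel_ok"]))
```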

Ill-defined problems

Two other types of decisions hardly look like decisions at all. They consist of ill-defined

problems that may or may not be clarified in the process of situation assessment.

Ambiguous cues may make it impossible to define the problem that needs fixing. Two

strategies may be used to cope with this type of situation: manage the situation as though

it is an emergency without clearly defining the problem (procedural management), or

diagnose and define the problem, and then generate a novel solution (creative problem-

solving) because no prescribed procedures exist for dealing with it.


Procedural management. Certain cues are ominous but leave the crew without

a clear idea of the underlying problem. Various noises, thumps, vibrations,

rumblings, pressure changes, or control problems indicate that something has

happened, but not necessarily what. The cues signal potentially dangerous conditions

that trigger emergency responses, regardless of the source of the problem. Smoke, loss

of pressure, an acrid smell, an explosion, or loss of control all signal ‘‘Land now.’’ Little

time is devoted to determining the source of the cues. All energies are devoted to

finding an appropriate airport, running necessary checklists, getting landing clearance,

declaring an emergency, dumping fuel and landing. These problems are essentially

treated as RPD situations, with the condition broadly labeled as ‘‘emergency landing.’’

A recent example of a procedural management decision was the landing on the Hudson

River by a US Airways A320 aircraft after both engines were lost due to bird strikes on

takeoff from LaGuardia Airport in New York (NTSB, 2009). Captain Sullenberger

initially planned to land at Teterboro Airport in New Jersey after the dual engine loss,

but realized they had insufficient altitude to travel the 6 miles; instead he opted to make

a river landing.

The cognitive work done for this class of decision is primarily risk assessment.

Responses are clearly prescribed and highly procedural, once the situation is defined as

an emergency. If the risk is judged to be high, then emergency procedures are

undertaken. If the situation is not immediately defined as an emergency, then additional

energy may be devoted to situation diagnosis.

Diagnosis of the problem underlying ambiguous cues can serve two purposes. It can

clarify what the problem is so that an appropriate action can be taken, or it can provide

information that may be useful for fixing the problem. When workload is relatively low and

time is available, the crew may try to diagnose and fix the problem. But even if diagnosis

does not lead to fixing the malfunction, it can turn the problem into one with a better-

defined response (essentially a recognition-primed decision). Defining the problem clearly

may lead to a more specific response than simply treating it as an emergency.

Creative problem-solving. Perhaps the most difficult types of decisions are those

requiring creative problem-solving. In addition to diagnosing the situation, a solution

must be invented that will satisfy the goal. These cases tend to be low-frequency events;

neither aircraft designers nor operations personnel imagined such a situation would arise,

so no procedures were designed to cope with them.

Diagnosis is critical and typically involves causal reasoning, which is reasoning

backward from effects to cause, as well as hypothesis generation and testing. The range of

tests that can be performed will vary. For example, in response to a power loss indication

for one engine, the crew can manipulate the throttle to see its effect. If they find no


effect, they may shut down the engine since it is not working. They may check to see if

fuel is flowing to the engine. Tests often are embedded in checklists.

Even if the nature of the problem has been determined, no ready solutions are

prescribed for these problems. Perhaps the most celebrated case of creative problem

solving was United Airlines flight 232 (NTSB, 1990) in which the DC-10 lost all

hydraulic systems due to an explosion in the number two engine. The captain invested

considerable energy on situation assessment, determining what capability he had left

after the hydraulic failure (Predmore, 1991). The two outboard engines were still

running, but no flight controls were operative. Knowing that the only control he had

was engine thrust, he and his crew determined that they could use asymmetrical engine

thrust to turn the plane and power level to control the altitude.

While the case of UAL 232 is extreme, ASRS reports indicate that crews do, in fact,

encounter novel situations that are not covered by the Federal Aviation Regulations

(FARs), Minimum Equipment Lists (MEL), or checklists. For example, the captain of

a large transport on a cross-country flight reported a low level of oxygen in the crew

emergency tanks while at FL310. No guidance concerning how to proceed was available

in company manuals. The cause of oxygen depletion could not be determined in flight,

nor could the problem be fixed. Regulations require emergency oxygen in case of rapid

decompression, so rather than land immediately, the crew came up with a creative

solution. They descended to FL250 and borrowed the flight attendants’ walk-around

oxygen bottles. (Different O2 requirements are specified for flight attendants above and

below FL250.) This solution allowed them to continue to their destination rather than

to divert or to descend to 10,000 feet, which would have eliminated the need for the O2.

However, the latter option would have meant the flight would not have had sufficient

fuel to reach its destination because of rerouting around bad weather.

This example is interesting because it illustrates consideration of multiple options,

creation of a novel solution, sensitivity to constraints and explicit risk assessment. In

creating his solution, the captain was aware that he would not be able to communicate with

ATC in an emergency if he were using the walk-around O2 bottle, as it had no microphone.

But he judged the likelihood of a rapid decompression to be sufficiently low that he chose

this option. Another constraint was fuel; the captain wanted to conserve fuel because of the

possibility of a missed approach or diversion due to poor destination weather. An early

decision to divert would have been the most conservative decision, but it would not have

met the goal of getting the passengers to their destination in a timely manner.

The above effort to classify decisions in terms of situational demands and affordances is

a first step toward understanding what makes certain kinds of decisions difficult, the

cognitive effort they require and possible weak links. The various types of decisions fall on

a continuum ranging from simple to complex, requiring anywhere from little cognitive work to


considerable effort. One reason for laying out these differences is to create an

appreciation for the fact that no single decision method will work for all types of situations.

5.2. What Factors Make Decisions Difficult?

Before examining factors that make decisions difficult and contribute to errors, the

concept of ‘‘decision error’’ within an NDM framework must be considered.

5.2.1. Decision Errors: Outcome vs. Process

Defining decision errors in naturalistic contexts is fraught with difficulties. First, errors

typically are defined as deviations from a criterion of accuracy. However, defining the

‘‘best’’ decision in a natural work environment may be impossible. Second, a loose

coupling of decision processes and event outcomes works against using outcomes as

reliable indicators of decision quality. Redundancies in the system can ‘‘save’’ a poor

decision from serious consequences. Conversely, even the best decision may be

overwhelmed by events over which the decision-maker has no control, resulting in an

undesirable outcome. A third problem is hindsight bias. Fischhoff (1975), Hawkins and

Hastie (1990) and others point out a tendency to define errors by their consequences.

But in natural contexts the analyst does not know how often exactly the same decision

process was used or the same decision was made in the face of similar situations with no

negative consequences. Were those prior decisions also ‘‘errors’’?

Unsatisfying though it may be, the following definition is adopted here: decision

errors are ‘‘deviations from some standard decision process that increase the likelihood

of bad outcomes’’ (Lipshitz, 1997, p. 152). Although outcome alone may not be

a good indicator of decision quality, the decision-maker’s goal or intended outcome

remains important. In naturalistic work contexts, decisions contribute to performance

goals. Decisions do not stand alone as events to be judged independent of the broader

task.

5.2.2. How Can Decision Processes Go Wrong?

Decision errors may arise within the two major components of the aviation decision

model: (a) pilots may develop an incorrect interpretation of the situation, which leads to

an inappropriate decision, or (b) they may establish an accurate picture of the situation,

but choose an inappropriate course of action. In addition, they may not appropriately

assess the risks inherent in the situation.


Faulty situation assessment

Situation assessment errors can be of several types: cues may be misinterpreted,

misdiagnosed, or ignored, resulting in an incorrect picture of the problem (Endsley,

1995); risk levels may be misassessed (Johnston, 1996; Orasanu et al., 2004); or the

amount of available time may be misjudged (Keinan, 1988; Maule, 1997; Orasanu &

Strauch, 1994). Problems may arise when conditions change and pilots do not update

their situation models (Woods & Sarter, 1998).

For example, one accident that can be traced to an incorrect assessment of the

situation was the decision by the crew of a B-737 to shut down an engine; unfortunately,

the wrong one:

The crew sensed a strong vibration while in cruise flight at 28,000 ft. A burning smell and fumes were present in the passenger cabin, which led the crew to think there was a problem in the right engine (because of the connection between the cabin air conditioning and the right engine). The captain throttled back the right engine and the vibration stopped. However, this was coincidental. In fact, the left engine had thrown a turbine blade and gone into a compressor stall. The captain ordered the right engine shut down and began to return to the airport. He again questioned which engine had the problem, but communication with air traffic control and the need to reprogram the flight management computer took precedence, and they never did verify the location of the problem. The faulty engine failed completely as they neared the airport, and they crashed with neither engine running. (AAIB, 1990)

The problem was incorrectly defined because the cues (vibration and burning

smell) supported the interpretation of a right engine problem. The crew did not verify

this interpretation before taking an action that was irrevocable at that point in the

flight.

Faulty selection of action

Errors in choosing a course of action may also be of several types. In rule-based

decisions, the appropriate response may not be retrieved from memory, either because it

was not known or because some contextual factor militated against it. Conversely, an

inappropriate rule may be applied, especially a frequently used one. In choice decisions,

options may not be considered. Constraints that determine the adequacy of various

options may not be used in evaluating them.

Creative decisions may be the most difficult because they involve the least support

from the environment; candidate solutions must be invented to fit the goals and existing

conditions. Any solution that meets one’s goal may be considered a success. As Klein


(1993b) noted, decision-makers may not project the consequences and uncertainties

associated with candidate actions, resulting in a poor choice of action.

An accident in which an inappropriate course of action was chosen in the face of

fairly complete information about the nature of a problem occurred near

Pinckneyville, IL.

About two minutes into a night flight in instrument conditions, a Hawker Siddeley commuter aircraft lost its left generator. In error, the first officer isolated the right generator and then was unable to restart it. This meant total loss of ability to generate electrical power, which was needed to run all cockpit instruments. Under the best of circumstances batteries might be expected to last for 30 minutes. The captain decided to continue to the destination airport 45 minutes away, rather than diverting. Continued use of non-essential electrical equipment shortened the battery life. A complete electrical failure and subsequent loss of flight instruments critical for IFR flight led the plane to crash. (NTSB, 1985)

The crew’s decision to continue as planned despite the mechanical failure, rather

than to land as soon as possible, was fatal.

Decision difficulties arise when goals conflict or when no good choice is available.

For example, weather at the destination airport might be satisfactory when the plane

takes off, but may deteriorate rapidly and be below minimums by the time the flight

arrives. The alternate airport may have clear weather, but it may be more distant,

straining fuel resources. All options are evaluated in terms of their level of risk, but

sometimes no low risk option is available. Then risk must be played off against what will

be gained in each case, factoring in the crew’s level of confidence that they can follow

through with the choice. The crew needs to think about what might happen down the

line. They are in a dynamic state; their equipment may be changing over time, the

weather is changing over time and their location is changing over time.

Faulty risk assessment

Poor decisions may also arise when a flight crew is aware of conditions that require

a decision, but underestimates the level of risk associated with the conditions, especially

when they are changing dynamically. For example, when approaching Dallas for

landing, the first officer of an L-1011 commented on lightning in the storm lying on

their flight path (NTSB, 1986a). Yet, the crew flew into it and encountered wind shear.

We know that risk is important to pilots, because potential risk was the dominant

dimension considered by captains from several airlines when making judgments about

flight-related decision situations (Fischer et al., 1995). Why then do crews appear to

underestimate risk in potentially critical situations?


One possible explanation is that crews lack the relevant experience or are unable to

retrieve the knowledge needed to assess risk appropriately in those specific

circumstances (cf. Klein, 1993b). Another arises from pilots’ routine experience. If

similar risky situations have been encountered in the past and a particular course of

action has succeeded, the crew will expect to succeed the next time with the same

response, a phenomenon Reason (1990) called ‘‘frequency gambling.’’ Given the

uncertainty of outcomes, in many cases they will be correct, but not always. Hollenbeck

et al. (1994) found that past success influences risk-taking behavior. Baselines become

misrepresented over time as a situation becomes familiar and the individual becomes

more comfortable with it. Likewise, Sitkin (1992) argued that uniformly positive

experiences provide no baseline by which to determine when the situation is becoming

more dangerous.

5.2.3. Plan Continuation Errors (PCE)

Examination of decision errors in the NTSB’s (1994) analysis of 37 crew-involved

accidents revealed an emergent theme: about 75% of the decision errors involved

continuation of the original flight plan in the face of cues that suggested changing the

course of action (Berman, 1995). These included taking off in snowy conditions,

landing during an unstable approach, or continuing a VFR flight in instrument

conditions (cf. O’Hare and Smitheram, 1995). More recent analyses confirm this

pattern, called ‘plan continuation errors’ (Orasanu et al., 2002) or plan continuation

biases (Dismukes et al., 2007).

Although it is not possible to determine the cause of these patterns from post hoc

analyses, our efforts focused on examining factors that might lead crews to

make plan continuation decisions. In many cases it appeared that the

crew failed to appreciate the risks inherent in the evolving conditions or those

associated with pressing on with their original course of action. Both contextual and

cognitive factors were hypothesized as potential contributors to these types of decision

errors.

5.2.4. Error-Inducing Contexts

Three types of contextual factors extracted from accident analyses may contribute to

poor aviation decisions, including PCEs: (1) poor quality information, including

ambiguous dynamic conditions or poorly displayed information, (2) organizational

pressures, and (3) environmental threats and stressors.


Information quality

Cues that signal a problem are not always clear-cut. Poor interface design that does not

provide adequate diagnostic information or action feedback can lead a crew astray

(Woods & Sarter, 1998). For example, in the Kegworth crash, information about which

engine had the problem was poorly displayed, contributing to the flight crew shutting

down the wrong engine (AAIB, 1990). Conditions can deteriorate gradually, and the

decision-maker’s situation awareness may not keep pace. Ambiguous cues permit

multiple interpretations. If ambiguity is not recognized, a crew may be confident in their

understanding of a situation, when in fact they are wrong.

In addition to making it difficult to assess the situation, ambiguity can influence the

decision indirectly. A crewmember may recognize that something ‘‘doesn’t seem right’’

(as stated by the first officer in the Air Florida takeoff crash in Washington, DC, during

heavy snow with a frozen pitot tube, NTSB, 1982), but may find it difficult to justify

a change in plan when cues are ambiguous. For decisions that have expensive

consequences, such as rejecting a takeoff or diverting, the decision-maker may need to

feel very confident that the change is warranted. Ambiguity thus may contribute to

plan continuation errors.

Organizational pressures

An organization’s emphasis on productivity may inadvertently set up goal conflicts

with safety. As Reason (1997) has documented, organizational decisions about

levels of training, maintenance, fuel usage, keeping schedules, etc. may set up

latent pathogens that undermine safety. For example, on-time arrival rates are

reported to the public. Companies also emphasize fuel economy and getting

passengers to their destinations rather than diverting, perhaps inadvertently

sending mixed messages to their pilots concerning safety versus productivity.

Mixed messages, whether explicit or implicit in the norms and organizational

culture, create conflicting motives, which can affect pilots’ risk assessment and the

course of action they choose.

Environmental threats and stressors

Operational factors that may affect pilots’ ability to make reasoned decisions include

high workload, limited time, heavy traffic, poor weather, last minute runway changes,

and schedule delays. An extensive literature documents the deleterious effects of stress

on cognitive functioning (Hancock & Desmond, 2001; Hockey, 1979), including

attentional focus, working memory capacity, and risk taking. These influence decision


processes through their effects on information scanning, cue detection, hypothesis

generation, and option evaluation. Stress also may affect crew communication, which

can interfere with building situation models, sharing information, contingency

planning, and error trapping.

One of the few studies to examine stress and pilot decision making found that

stress had little effect on decisions that drew on domain expertise and relied on

perceptual knowledge, i.e., recognition-primed decisions (Stokes et al., 1997). This is

consistent with the notion that decisions are more difficult when the problem is not

well understood and no clear response is available, i.e., in ill-defined problem

situations.

Certain phases of flight typically induce higher levels of stress due to heavy workload,

traffic and little room for error recovery (Strauch, 1997), such as during takeoff and from

top of descent to landing. Under stress, decision-makers often fall back on familiar

responses (Hockey, 1979), but these responses may not be appropriate to the situation.

For example, about one minute after takeoff the captain of a four-engine aircraft

retarded power on all four engines in response to a vibration throughout the aircraft, an

action that resulted in a crash (NTSB, 1986b). Reducing power so close to the ground

was not appropriate because insufficient time was available for recovery. The same action

might have been fully appropriate at a higher altitude.

Other potentially dangerous conditions may permit more time to diagnose the

problem and consider what to do (e.g. fuel leaks, or hydraulic, electrical or

communication failures). However, under stress, people often behave as though they

are under time pressure, when in fact they are not (Keinan, 1988).

5.2.5. Cognitive Factors

Ambiguous cues, dynamically changing risks, organizational pressures and

environmental stressors may not in themselves be sufficient to cause poor decisions.

However, when the decision maker’s cognitive limits are stressed, these factors may

combine to induce errors.

Schema-based decisions

Consider that more than one half of the decision errors in the NTSB (1994) database

(29 out of 51) involved omissions, or failures to do something that should have been done.

Crews may have been captured by a familiar schema in these cases, leading them to

do what they normally do, that is, to carry on with the usual plan, even though another


action was called for. Guidance by routine knowledge also supports the ‘‘cognitive

miser’’ perspective: people will do as little as possible to get the job done at a satisfactory

(though not necessarily optimal) level, Simon’s (1957) concept of ‘‘satisficing.’’

Lack of knowledge

A cognitive economy or a schema dominance explanation, however, fails to account for

errors of commission (NTSB, 1994). These are cases in which crews took actions that

were out of the ordinary, such as attempting to blow snow off their aircraft using the

engine exhaust from the aircraft ahead of them (NTSB, 1982). These cases may reflect

‘‘buggy’’ mental models or gaps in knowledge (VanLehn, 1990). Buggy models may lead

to success in some cases, so decision-makers may have great confidence in these

inadequate models.

Novices are at a disadvantage in making decisions because they lack the deep and

well-integrated knowledge of experts (Chi et al., 1988; Klein, 1998). This may be

manifest in what appears to be inadequate situation assessment or choice of risky options.

For example, Driskell and colleagues (1998) found that general aviation (GA) pilots only

matched experts’ choices of options under a variety of flight conditions in terms of their

‘‘riskiness’’ 50% of the time. One reason may be their less-informed risk models. When

Fischer et al. (2003) asked GA and commercial pilots to categorize flight scenarios on the

basis of risk, novices focused only on the severity of the consequences, whereas experts

included both the timeline (How long do I have to make a decision?) and controllability

(What can I do about the situation?).

Expert–novice differences were manifest in behaviors of more and less experienced

pilots in several low-fidelity simulations involving deteriorating weather conditions.

More experienced pilots made decisions earlier and traveled less into bad weather prior

to diverting than did less experienced pilots (Wiegmann et al., 2002), suggesting

inadequate situation awareness stemming either from lack of knowledge or different risk

standards in the more junior pilots.

Social factors

Social factors may create goal conflicts that increase decision difficulty. Perceived

expectations among pilots may encourage risky behavior or may induce one to behave as

if one were knowledgeable, even when ignorant. For example, a runway collision in

near zero visibility (due to fog) resulted when one aircraft stopped on an active runway

because the crew did not realize where they were (NTSB, 1991). The captain was

unfamiliar with the airport and was making his first unsupervised flight after a long


period of inactivity. The first officer boasted of his knowledge of the airport but, in fact,

gave the captain incorrect information about taxiways. Rather than questioning where

they were, the captain went along.

Based on critical incident interviews with pilots flying in the extreme conditions of

Alaska, Paletz and colleagues characterized several social phenomena that may lead crews

into taking risks and perhaps plan continuation errors (Paletz et al., 2009). These

include:

• Informational social influence: accepting information obtained from another as

evidence about reality, as in follow-the-leader behavior

• Foot-in-the-door persuasion technique: agreement to a small request increasing

likelihood of agreement to a large one later

• Normalization of deviance: an incremental acceptance of a progressively lower

level of safety by a group of people

• Impression management: not looking bad to themselves or to others

• Self-consistency motives: acting in ways consistent with one’s beliefs.

Personal stress

Personal stressors include concern with family matters, job security, or health issues.

While some pilots may be able to put these matters out of mind on the flight deck, others

may be distracted by them. These personal factors may also affect decision making by

interfering with sleep, which can have negative effects on alertness, attentional focus,

mood, and crew communication. Ill-structured problems and organization-related goal

conflicts require high levels of cognitive effort, which may be compromised in

conjunction with other stress factors (Cannon-Bowers & Salas, 1998). Stress typically

constrains working memory capacity (Hockey, 1979), thus limiting the decision-maker’s

ability to entertain multiple hypotheses or to mentally simulate the consequences of

options (Wickens et al., 1993).

5.3. Behaviors that Characterize Effective Crew Decision-Making

Behaviors associated with effective crew decision making have been identified from

research in both high- and low-fidelity flight simulations (Orasanu, 1994), and validated


in actual line operations, primarily through line operational safety audits (LOSA)

(Thomas, 2004). Both sources provide evidence of how crews manage threats, trap

errors, and maintain positive crew climate essential to making good decisions. These

behaviors can be broken down into taskwork skills and teamwork skills.

5.3.1. Taskwork Skills

Situation Awareness. Effective crews are vigilant. They monitor the environment for

threats that may require a response, as well as monitoring progress of the flight according

to the operative plan. They gather additional information to clarify threats.

Build Shared Situation Models. Effective crews build shared situation models when

threats arise. They assess and communicate the nature of the threat, the degree of risk,

and time available.

Update Plans. Effective crews are adaptive. They adjust to dynamically evolving

conditions and update plans as needed (to avoid plan continuation errors). This

includes building contingency plans to cope with uncertain situations. Threats may

require that goals be updated to support threat management while maintaining the

overall plan.

Manage Tasks and Workload. Effective crews revise task priorities and reassign tasks to

manage workload.

Evaluate Options. Effective crews project the consequences of potential decisions to

decide what to do. They are sensitive to competing goals and risks, such as safety,

productivity, economic and professional consequences.

Metacognitive Strategies. Effective crews are reflective. They check their assumptions,

question missing information, consider what might go wrong, how likely it is and how

serious it would be.

5.3.2. Teamwork Skills

Effective taskwork depends on effective teamwork. Maintaining a positive crew climate

and trust in each other is essential for assuring that all crewmembers, especially

junior ones, contribute to problem assessment and decision making (Salas et al., 2005).

Trust and openness are the basis for error trapping. Accident investigations consistently

point out the role of ‘‘monitoring-challenging’’ failures as links in the accident chain

(NTSB, 1994). These failures are more frequent when the captain is the one making

the error than when both crewmembers are responsible or when the error arises outside

of the flight deck (Orasanu et al., 1998; Thomas, 2004).


Crew Climate. Positive crew climate begins with the captain’s pre-flight briefing

(Ginnett, 1993), which establishes a climate of openness and participation. Studies show

that effective crews are characterized by active participation of all crewmembers (Fischer

et al., 2007; Parke et al., 2000).

Error Trapping. When errors occur, members of effective crews are able to disrupt

the error chain by calling out the error and correcting it (or even preventing it from

occurring by being pro-active). Junior crewmembers are more likely to be effective in

trapping errors made by the captain by using certain communication strategies: clearly

describing the nature of the problem, offering a suggestion for solving it while leaving

the decision up to the captain, and providing justification for the suggestion (why it’s

a good idea) (Fischer et al., 2000). Challenges that are too direct or too mitigated

(weak) are not likely to be effective, the former because they may disrupt crew climate,

the latter because they don’t convey the seriousness of the problem. Effective

challenges invoke a crew orientation (‘‘we’re in this together’’), reflected by use of

‘‘we’’ rather than ‘‘you’’ or ‘‘I’’ in the suggestions, e.g., ‘‘We need to turn 15 degrees

north about now.’’

Back-up Strategies. Finally, members of effective crews monitor each other for stress,

fatigue or workload and back up each other or reassign tasks as needed. Good crews use

compensatory strategies to manage fatigue or stress, such as double-checking

information, status or plans (Petrilli et al., 2006).

5.4. Can We Train Crews to Make Better Decisions?

Team training approaches that focus on team process skills appear to be most effective

in developing resilient teamwork skills essential to effective crew decision making

(Klein et al., 2008; Salas et al., 2007). At this point, no basis exists for believing that it

is possible to develop training to improve all-purpose decision-making skills. Decision

strategies are learned most effectively in conjunction with domain-specific content

(Glaser & Bassok, 1989), a reality guiding the integration of CRM skills with

technical training for pilots under the FAA’s Advanced Qualification Program (AQP)

(FAA, Advisory Circular #120-54A, 6/23/06).

Positive evidence is accruing on the success of training in the perceptual skills and

strategies needed for effective situation assessment (Endsley & Robertson, 2000;

Wiggins & O’Hare, 2003).


5.4.1. Aviation Decision Training Models

Several decision training models have been developed in the aviation industry, two of

which are used by major international carriers. DODAR stands for Diagnosis, Options,

Decide, Assign tasks, Review (Walters, 2002). FOR-DEC stands for Facts, Options,

Risks and Benefits; Decide, Execute, Check (Hoermann, 1995). While both include

steps of gathering information, deciding on the basis of anticipated consequences, and

reviewing the decision, neither capitalizes on the crew’s expertise at recognizing and

sizing up the situation, as in NDM. Both imply concurrent weighing of multiple

options. Neither is tuned to differences in decision situations for which different

decision strategies are appropriate (i.e. rule-based, choice and creative decisions).

Essentially, these remain domain-independent general approaches that could be applied

in any domain by any decision-maker.

5.4.2. NDM-based training

In contrast, training grounded in the NDM framework provides opportunities for

developing rapid pattern recognition, serial consideration of options, use of mental

simulation to evaluate options, and metacognitive skills (Cohen et al., 1996; Means et al.,

1993; Klein, 1993a). Klein (1998) has pointed out that learning to think like an expert

involves developing deep knowledge that serves as a basis for making decisions. He

recommends the following activities to foster this learning:

n Engaging in deliberate practice that includes a goal and evaluation criteria

n Building an extensive experience bank from diverse scenarios

n Obtaining feedback that is accurate, diagnostic and timely

n Reviewing prior experiences to derive new insights and lessons from mistakes.

Situation assessment

Training within an NDM framework emphasizes development of situation assessment

skills, an element that was totally absent from traditional decision models. For aviation

decisions, both rapid pattern recognition and diagnostic skills are needed. Recognition

of danger cues and generation of appropriate responses to them should become

automatic, which can only happen through repeated practice with feedback. Deliberate

rather than recognitional situation assessment skills are required when cues are

ambiguous, contradictory or worrisome. Risk assessment is an essential component.


Course of action selection

The second major component of training within an NDM framework is evaluating

a course of action. In most cases a workable rather than an optimal solution may be

adequate. In the case of a rule-based or RPD decision, the option is generated upon

recognition of the situation. Its adequacy in meeting the decision-maker’s goals is

evaluated by using mental simulation of the likely outcome of taking the action in that

specific context. Training needs to emphasize that, under many circumstances and especially under time pressure, evaluating one option at a time is more appropriate than generating and assessing all possible options.
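This serial, satisficing strategy can be sketched as a simple loop: take the option cued by recognition, mentally simulate its outcome in context, and accept the first workable one. The Python sketch below is a hedged illustration under those assumptions; rpd_select and simulate are hypothetical stand-ins, not an implementation from the NDM literature.

from typing import Callable, Iterable, Optional

def rpd_select(options: Iterable[str],
               simulate: Callable[[str], bool]) -> Optional[str]:
    # Options arrive one at a time, in the order recognition suggests them;
    # simulate() stands in for mental simulation of the outcome in context.
    for option in options:
        if simulate(option):   # workable, though not necessarily optimal
            return option      # satisficing: stop at the first adequate option
    return None                # nothing workable: reassess the situation

# Illustrative use: the first recognized option fails simulation, the next passes.
outcomes = {"continue the approach": False, "go around and divert": True}
print(rpd_select(outcomes, lambda option: outcomes[option]))

The design point is the early return: unlike FOR-DEC or classical choice models, evaluation stops as soon as one option survives mental simulation.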

Metacognitive training

Perhaps the most trainable decision-related skill complex is metacognition. Considerable

research exists supporting the trainability of these skills across wide ranges of populations

(Brown et al., 1986). Training strategies developed by Cohen, Freeman and Thompson

(1998) emphasize goals, environmental conditions and actions under realistic practice

conditions to promote accurate recognition, and repetition with feedback to facilitate

automatic performance. The training aims to sensitize trainees to domain-specific cues including

time constraints, stakes and problem familiarity, as well as conflicts, completeness and

reliability of information. Practice involves making metacognitive processes explicit (i.e.

critiquing and correcting), which benefits from a team context. Devil’s advocate and

crystal ball techniques are used to challenge assumptions and expose weaknesses in situation

assessments and plans.
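One way to picture this sensitization is as a quick screening of the cues named above: whether time permits deliberate critiquing at all, and whether stakes, unfamiliarity or conflicting information call for it. The sketch below is an assumption-laden illustration of that screening, not the published model of Cohen and colleagues; the threshold and field names are hypothetical.

from dataclasses import dataclass

@dataclass
class Cues:
    time_available_s: float  # seconds before a decision is forced
    high_stakes: bool        # severe consequences if the assessment is wrong
    familiar: bool           # situation matches a well-practiced pattern
    conflicting_info: bool   # cues are contradictory, incomplete or unreliable

def should_critique(cues: Cues, min_time_s: float = 60.0) -> bool:
    # With too little time, act on the recognized assessment; otherwise
    # critique when stakes, novelty or conflicting information warrant it.
    if cues.time_available_s < min_time_s:
        return False
    return cues.high_stakes or (not cues.familiar) or cues.conflicting_info

print(should_critique(Cues(300.0, True, False, True)))  # True: time to critique
print(should_critique(Cues(15.0, True, False, True)))   # False: act now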

Thus, crews are prepared to cope when complications arise from multiple or

competing goals, e.g. maintaining flight safety, saving fuel and getting the passengers to

their destinations on time, with different options satisfying these competing goals. Crews

learn to recognize that some level of risk always exists and that tradeoffs must be

managed. Metacognitive assessment includes oneself: Are you fatigued or stressed? What

are your motives in pursuing a particular option? Senders and Moray (1991) noted that pilots need training in ‘‘how to change one’s mind’’ and in avoiding cognitive ‘‘lockup,’’

which may play a role in plan continuation errors.

5.4.3. Communication training

Build Shared Situation Models

As unexpected dynamic conditions arise, it is essential that team members communicate to

build a shared model of the emergent situation and how to cope with it: What is the


problem? What is our plan? Who does what and when? What contingencies must be

planned for? What cues or conditions must we look out for and what will we do? Only if all

participants have a shared model will they be able to contribute efficiently to the shared goal.

The intent is not simply to get crews to talk more. More is not necessarily better:

high levels of talk contribute to workload. What is desired is explicit discussion of the

problem: its definition, plans, strategies and relevant information. Current training

programs that are integrating CRM with technical training encourage crews to use pre-

briefings to assure that all members know what to do in case of time-critical

emergencies, such as how to handle aborted takeoffs.

Establish a positive crew climate through briefings

Briefings conducted by the captain go a long way to assure that team members

understand their role in the effort and feel comfortable offering their contributions,

which may be critical to managing threats in challenging situations (Ginnett, 1993).

Briefings set the tone or team climate; in both aviation and medicine, team members

‘‘follow the leader,’’ adopting the interactional style of the leader (Lingard et al., 2002).

Salas et al. (2005) point out the importance of mutual trust in supporting mutual

performance monitoring essential to flight safety. By establishing positive relationships,

the leader can let the team know that she or he is not invincible and create a crew climate

that is open and productive.

Monitor and challenge threats and errors

Crewmembers also must learn appropriate ways to bring problems to the attention of

the captain (called advocacy and assertion in early CRM parlance). These include being as

specific as conditions allow, pointing out problems, suggesting solutions and providing

reasons for one’s concerns. Strategies identified by Fischer and Orasanu (1999, 2000) to

be most effective in correcting errors involved crew obligation statements (such as ‘‘We

need to deviate right about now’’), preference statements (e.g. ‘‘I think it would be wise

to turn left’’) and hints (e.g. ‘‘That return at 25 miles looks mean’’). In addition, requests

that were supported by problem or goal statements (e.g. ‘‘We need to bump the airspeed

to Vref plus 15. There’s windshear ahead.’’) were rated as more effective than

communications without supporting statements.

5.4.4. Monitoring skills

The above communication skills focus on how to communicate. They are predicated on

adequate skill in monitoring threats in the environment and crew errors. Sumwalt et al.


(2002) recommend training for monitoring that focuses on specific areas of

vulnerability, such as top-of-descent or points at which clearances are expected. Non-

critical tasks should be accomplished during less critical phases. Monitoring other

crewmembers is essential to mutual back-up behaviors.

Effective monitoring thus depends on effective workload and task management

strategies. Overall crew performance depends on the captain’s ability to prioritize tasks

and allocate duties. Demands can be managed by contingency planning, but this depends

on the captain anticipating possible problems, which in turn depends on good situation

awareness and metacognitive skill.

5.5. Conclusions: The Future of Aviation Decision-Making

The jobs of pilots and air traffic controllers are constantly evolving. With modern

equipment on the flight deck, pilots have more information at their disposal. Designing

information displays to support good situation awareness and fast, accurate problem

diagnosis is a theme of current research and development efforts. Providing information

on risks is more problematic. New systems may be able to critique proposed plans for

flaws – essentially, an intelligent automated metacognitive aid. Advances in technology

will be accompanied by changes in roles and responsibilities in the not-too-distant

future. The question will become how to prepare crews and controllers, with their deep

knowledge and adaptability, but also with their vulnerabilities, to manage such a system.

Mutual trust, respect, and a positive crew climate will continue to be the foundation for

effective crew decision making in future automated systems.

Acknowledgments

The author’s research was supported by the National Aeronautics and Space

Administration (NASA) Aviation Safety Program and by the Federal Aviation

Administration (FAA). The opinions expressed in this chapter are the author’s and do

not represent official views of any federal agency.

References

Air Accidents Investigation Branch (AAIB), 1990. Report on the accident to Boeing

737-400 G-OBME near Kegworth, Leicestershire on 8 January, 1989. (Aircraft

Accident Report 4/90). HMSO, London.


Berman, B., 1995. Flightcrew errors and the contexts in which they occurred: 37

major US air carrier accidents. In: Proceedings of the 8th International Symposium

on Aviation Psychology. Ohio State University, Columbus, pp. 1291–1294.

Billings, C.E., 1991. Human-centered aircraft automation: A concept and guidelines (Tech. Mem. No. 103885). NASA-Ames Research Center, Moffett Field, CA.

Brown, A.L., Armbruster, B.B., Baker, L., 1986. The role of metacognition in reading

and studying. In: Orasanu, J. (Ed.), Reading comprehension: From research to

practice. Erlbaum, Hillsdale, NJ, pp. 49–76.

Cannon-Bowers, J.A., Salas, E., 1998. Individual and team decision making under stress:

theoretical underpinnings. In: Cannon-Bowers, J.A., Salas, E. (Eds.), Making Deci-

sions under Stress. American Psychological Association, Washington, DC, pp. 17–38.

Cannon-Bowers, J.A., Salas, E., Converse, S., 1990. Cognitive psychology and team

training: training shared mental models of complex systems. Human Factors Society

Bulletin 33 (12), 1–4.

Cannon-Bowers, J.A., Salas, E., Pruitt, J.S., 1996. Establishing the boundaries of

a paradigm for decision research. Human Factors 38, 193–205.

Chamberlin, R.W., 1991. Rejected takeoffs: Causes, problems, and consequences.

In: Proceedings of the Sixth International Symposium on Aviation Psychology.

The Ohio State University, Columbus, OH, pp. 993–998.

Chase, W.G., Simon, H.A., 1973. Perception in chess. Cognitive Psychology 4, 55–81.

Chi, M.T.H., Glaser, R., Farr, M.J., 1988. The Nature of Expertise. Erlbaum,

Hillsdale, NJ.

Cohen, M.S., Freeman, J.T., Thompson, B., 1998. Critical thinking skills in tactical

decision making: A model and a training strategy. In: Cannon-Bowers, J.A., Salas, E.

(Eds.), Making decisions under stress: Implications for individual and team training.

APA, Washington, D.C., pp. 155–190.

Cohen, M.S., Freeman, J.T., Wolf, S., 1996. Metacognition in time-stressed decision

making: recognizing, critiquing, and correcting. Human Factors 38 (2), 206–219.

Comision de Accidentes, 1978. Colision aeronaves, Boeing 747 PH-BUF de KLM y Boeing 747 N 736 PA de Pan Am en Los Rodeos (Tenerife), el 27 de Marzo de 1977. Ministerio de Transportes y Comunicaciones, Subsecretaria de Aviacion Civil, Madrid.

Dismukes, R.K., Berman, B.A., Loukopoulos, L.D., 2007. The limits of expertise:

Rethinking pilot error and the causes of airline accidents. Ashgate Publishing,

Aldershot, England.

Driskill, W.E., Weissmuller, J.J., Quebe, J.C., Hand, D.K., Hunter, D.R., 1998.

Evaluating the decision-making skills of general aviation pilots (Final Report, DOT/

FAA/AM-98/7). U.S. Department of Transportation, Federal Aviation Administration, Washington, DC.


Endsley, M.R., 1995. Toward a theory of situation awareness in dynamic systems. Human Factors 37 (1), 32–64.

Endsley, M.R., Robertson, M.M., 2000. Training for situation awareness in individuals

and teams. In: Endsley, M.R., Garland, D.J. (Eds.), Situation awareness analysis and

measurement. Lawrence Erlbaum Associates, Mahwah, NJ, pp. 349–366.

Fischer, U., Davison, J., Orasanu, J., 2003. What makes flight situations risky? Examining

commercial and general aviation pilots’ concepts of risk, Proceedings of the 12th

International Symposium on Aviation Psychology. Lawrence Erlbaum Associates.

Fischer, U., McDonnell, L., Orasanu, J., 2007. Linguistic correlates of team perfor-

mance: Toward a tool for monitoring team functioning during space missions.

Aviation, Space and Environmental Medicine 78 (5), B86–B95.

Fischer, U., Orasanu, J., 1999. Say it again, Sam! Effective communication strategies to

mitigate pilot error. In: Jensen, R.S., Rakovan, L.A. (Eds.), Proceedings of the 10th

International Symposium on Aviation Psychology. Ohio State University,

Columbus, Ohio, pp. 362–366.

Fischer, U., Orasanu, J., 2000. Error-challenging strategies: Their role in preventing

and correcting errors, Proceedings of the 44th Annual Meeting of the Human

Factors and Ergonomics Society. HFES, Santa Monica, CA, pp. 30-33.

Fischer, U., Orasanu, J., Wich, M., 1995. Expert pilots’ perceptions of problem situ-

ations. In: Proceedings of the Eighth International Symposium on Aviation

Psychology. Ohio State University Press, Columbus, OH, pp. 777–782.

Fischhoff, B., 1975. Hindsight ≠ foresight: The effect of outcome knowledge on

judgment under uncertainty. Journal of Experimental Psychology: Human Percep-

tion and Performance 1, 288–299.

Ginnett, R.C., 1993. Crew as groups: Their formation and their leadership. In:

Weiner, E., Kanki, B., Helmreich, R. (Eds.), Cockpit resource management.

Academic Press, San Diego, CA, pp. 71–98.

Glaser, R., Bassok, M., 1989. Learning theory and the study of instruction. Annual

Review of Psychology 40, 631–666.

Hammond, K.R., Hamm, R.M., Grassia, J., Pearson, T., 1987. Direct comparison of

the efficacy of intuitive and analytical cognition in expert judgment. Proceedings of

IEEE Transactions on Systems, Man, and Cybernetics, SMC-17, 753–770.

Hancock, P.A., Desmond, P.A., 2001. Stress, Workload and Fatigue. Erlbaum, Mahwah, NJ.

Helmreich, R.L., 2002. Threat and error management: 6th generation CRM training.

In: Proceedings of the First TEM Workshop (ICAO), pp. 1–14. San Salvador,

El Salvador, April 30, 2002.

Helmreich, R.L., Klinect, J.R., Wilhelm, J.A., 2001. System safety and threat and error

management: The line operations safety audit (LOSA). In: 11th International

Symposium on Aviation Psychology. Ohio State University, Columbus, OH.


Hawkins, S.A., Hastie, R., 1990. Hindsight: biased judgments of past events.

Psychological Bulletin 107, 311–327.

Hockey, G.R.L., 1979. Stress and the cognitive components of skilled performance. In:

Hamilton, V., Warbuton, D.M. (Eds.), Human Stress and Cognition: An

Information-processing Approach. Wiley, Chichester.

Hoermann, H.J., 1995. FOR-DEC: A prescriptive model for aeronautical decision

making. Human factors in aviation operations, 17–23.

Hollenbeck, J., Ilgen, D., Phillips, J., Hedlund, J., 1994. Decision risk in dynamic

two-stage contexts: beyond the status-quo. Journal of Applied Psychology 79 (4),

592–598.

Johnston, N., 1996. Managing risk in flight operations. In: Hayward, B.J., Lowe, A.R.

(Eds.), Applied Aviation Psychology: Achievement, Change and Challenge. Ashgate,

Brookfield, VT, pp. 1–19.

Kanki, B.G., Lozito, S., Foushee, H.C., 1989. Communication indices of crew

coordination. Aviation, Space and Environmental Medicine 60, 56–60.

Keinan, G., 1988. Training for dangerous task performance: the effects of expectations

and feedback. Journal of Applied Social Psychology 18 (4, Pt 2), 355–373.

Klein, C., Salas, E., DiazGranados, D., Burke, C.S., Stagl, K.C., Goodwin, G.F., et al.,

2008. Do team training interventions enhance valued team outcomes? A meta-

analytic initiative. Paper presented at the 23rd Annual Conference of the Society for

Industrial and Organizational Psychology, San Francisco, CA.

Klein, G.A., 1989. Recognition-primed decisions. In: Rouse, W.B. (Ed.), Advances in

Man-machine System Research, 5. JAI Press, Greenwich, CT, pp. 47–92.

Klein, G.A., 1993a. A recognition-primed decision (RPD) model of rapid decision

making. In: Klein, G., Orasanu, J., Calderwood, R., Zsambok, C. (Eds.), Decision

Making in Action: Models and Methods. Ablex, Norwood, NJ, pp. 138–147.

Klein, G., 1993b. Sources of error in naturalistic decision making, Proceedings of the

Human Factors and Ergonomics Society 37th Annual Meeting. HFES, Santa

Monica, pp. 368-371.

Klein, G., 1998. Sources of Power: How People Make Decisions. MIT Press, Cambridge.

Klein, G., Orasanu, J., Calderwood, R., Zsambok, C.E. (Eds.), 1993. Decision Making

in Action: Models and Methods. Ablex, Norwood, NJ.

Klinect, J.R., Wilhelm, J.A., Helmreich, R.L., 1999. Threat and error management:

Data from line operations safety audits, Tenth International Symposium on Aviation

Psychology. The Ohio State University, Columbus, OH, pp. 683-688.

Lautman, L.G., Gallimore, P.L., 1987. Control of the crew-caused accident: Results of a 12-operator survey. Airliner, April–June, 1–6. Boeing Commercial Airplane Co., Seattle.


Lingard, L., Reznick, R., Espin, S., Regehr, G., DeVito, I., 2002. Team communi-

cations in the operating room: Talk patterns, sites of tension, and the implications for

novices. Acad. Medicine 77, 232–237.

Lipshitz, R., 1997. Naturalistic decision making perspectives on decision errors. In:

Zsambok, C., Klein, G. (Eds.), Naturalistic Decision Making. Erlbaum, Mahwah,

NJ, pp. 151–162.

Lipshitz, R., Klein, G., Orasanu, J., Salas, E., 2001. Taking stock of naturalistic decision

making. Journal of Behavioral Decision Making 14, 331–352.

Maule, A.J., 1997. Strategies for adapting to time pressure. In: Flin, R., Salas, E.,

Strub, M., Martin, L. (Eds.), Decision Making under Stress: Emerging Themes and

Applications. Ashgate, Aldershot, UK, pp. 271–279.

Means, B., Salas, E., Crandall, B., Jacobs, T.O., 1993. Training decision makers for the

real world. In: Klein, G., Orasanu, J., Calderwood, R., Zsambok, C.E. (Eds.),

Decision Making in Action: Models and Methods. Ablex, Norwood, NJ, pp. 51–99.

National Transportation Safety Board, 1982. Aircraft accident report: Air Florida, Inc.,

Boeing 737-222, N62AF, Collision with 14th Street Bridge near Washington

National Airport, Washington, DC, January 13, 1982 (NTSB/AAR-82-8).

Washington, DC: NTSB

National Transportation Safety Board, 1985. Aircraft accident report: Air Illinois

Hawker Siddeley HS 748-2A, N748LL, near Pinckneyville, Illinois, October 11,

1983 (NTSB AAR 85/03). Washington, DC: NTSB

National Transportation Safety Board, 1986a. Aircraft accident report: Delta Air Lines,

Inc., Lockheed L-1011-385-1, N726DA, Dallas/Fort Worth International Airport,

Texas, August 2, 1985 (NTSB-AAR-86/05). Washington, DC: NTSB

National Transportation Safety Board, 1986b. Aircraft accident report: Galaxy Airlines,

Inc., Lockheed Electra-L-188C, N5532, Reno, Nevada, January 21, 1985 (NTSB

AAR-86-01). Washington, DC: NTSB

National Transportation Safety Board, 1990. Aircraft accident report: United Airlines

Flight 232, McDonnell Douglas DC-10-10, Sioux Gateway Airport, Sioux City,

Iowa, July 19, 1989 (NTSB/AAR-91-02). Washington, DC: NTSB

National Transportation Safety Board, 1991. Aircraft accident report: Northwest Airlines, Inc., Flights 1482 and 299, runway incursion and collision, Detroit Metropolitan/Wayne County Airport, Romulus, Michigan, December 3, 1990 (NTSB AAR 91/05). Washington, DC: NTSB

National Transportation Safety Board, 1994. A review of flightcrew-involved, major

accidents of U.S. Air Carriers, 1978 through 1990 (PB94-917001, NTSB/SS-94/01).

Washington, DC: NTSB


National Transportation Safety Board, 2009. Third update on investigation into

ditching of US Airways jetliner into Hudson River. NTSB Advisory, Feb. 4, 2009.

Washington, DC: NTSB.

O’Hare, D., Smitheram, T., 1995. ‘‘Pressing on’’ into deteriorating conditions: an

application of behavioral decision theory to pilot decision making. International

Journal of Aviation Psychology 5 (4), 351–370.

Orasanu, J., 1993. Decision-making in the cockpit. In: Wiener, E.L., Kanki, B.G.,

Helmreich, R.L. (Eds.), Cockpit resource management. Academic Press, San Diego,

CA, pp. 137–172.

Orasanu, J., 1994. Shared problem models and flight crew performance. In:

Johnston, N., McDonald, N., Fuller, R. (Eds.), Aviation psychology in practice.

Ashgate, Aldershot, UK, pp. 255–285.

Orasanu, J., Fischer, U., 1992. Team cognition in the cockpit: Linguistic control of

shared problem solving. In: Proceedings of the 14th Annual Conference of the

Cognitive Science Society. Erlbaum, Hillsdale, NJ, pp. 272–277.

Orasanu, J., Fischer, U., 1997. Finding decisions in natural environments: The view

from the cockpit. In: Zsambok, C., Klein, G. (Eds.), Naturalistic decision making.

Erlbaum, Mahwah, NJ, pp. 343–357.

Orasanu, J., Fischer, U., Davison, J., 2004. Risk perception and risk management in

aviation. In: Dietrich, R., Jochum, K. (Eds.), Teaming Up: Components of Safety

under High Risk. Ashgate, Aldershot, UK, pp. 93–116.

Orasanu, J., Fischer, U., McDonnell, L.K., Davison, J., Haars, K.E., Villeda, E., et al.,

1998. How do flight crews detect and prevent errors? Findings from a flight

simulation study, Proceedings of the Human Factors and Ergonomics Society 42nd

Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA.

pp. 191-195.

Orasanu, J., Martin, L., Davison, J., 2002. Cognitive and contextual factors in aviation

accidents. In: Salas, E., Klein, G. (Eds.), Naturalistic Decision Making. Lawrence

Erlbaum, Mahwah, NJ, pp. 343–358.

Orasanu, J., Strauch, B., 1994. Temporal factors in aviation decision making. In:

Smith, L. (Ed.), Proceedings of the Human Factors and Ergonomics Society 38th

Annual Meeting, vol. 2. Human Factors and Ergonomics Society, Santa Monica,

CA, pp. 935–939.

Paletz, S.B.F., Bearman, C., Orasanu, J., Holbrook, J., 2009. Socializing the Human

Factors Analysis and Classification System: Incorporating social psychological

phenomena into a human factors error classification system. Human Factors 51 (4), 435.


Parke, B., Kanki, B., Nord, K., Bianchi, A., 2000. Crew climate and performance: Use

of group diagrams based on behavioral ratings, 44th IEA2000/HFES 2000 Congress.

San Diego, CA, pp. 3149–3152.

Petrilli, R.M., Roach, G.D., Dawson, D., Lamond, N., 2006. The sleep, subjective

fatigue, and sustained attention of commercial airline pilots during an international

pattern. Chronobiology International 23, 1357–1362.

Predmore, S.C., 1991. Microcoding of communications in accident analyses: Crew

coordination in United 811 and United 232. In: Proceedings of the Sixth International Symposium on Aviation Psychology. The Ohio State University, Columbus, OH, pp. 350–355.

Rasmussen, J., 1985. The role of hierarchical knowledge representation in decision

making and system management. IEEE Transactions on Systems, Man and Cyber-

netics 15 (2), 234–243.

Rasmussen, J., 1993. Deciding and doing: decision making in natural context. In:

Klein, G., Orasanu, J., Calderwood, R., Zsambok, C. (Eds.), Decision Making in

Action: Models and Methods. Ablex, Norwood, NJ.

Reason, J., 1990. Human Error. Cambridge University Press, Cambridge, UK.

Reason, J., 1997. Managing the Risks of Organizational Accidents. Ashgate,

Brookfield, VT.

Salas, E., Nichols, D.R., Driskell, J.E., 2007. Testing three team training strategies in

intact teams: a meta-analysis. Small Group Research 38, 471–488.

Salas, E., Sims, D.E., Burke, C.S., 2005. Is there a ‘‘big five’’ in teamwork? Small Group

Research 36 (5), 555–599.

Senders, J.W., Moray, N.P., 1991. Human error: Cause, prediction, and reduction.

Lawrence Erlbaum Assoc., Hillsdale, NJ.

Simon, H., 1957. Models of man: Social and rational. Wiley, New York.

Simon, H., 1991. Bounded rationality and organizational learning. Organization

Science 2 (1), 125–134.

Sitkin, S., 1992. Learning through failure: the strategy of small losses. Research in

Organizational Behavior 14, 231–266.

Stokes, A.F., Kemper, K., Kite, K., 1997. Aeronautical decision making, cue recog-

nition, and expertise under time pressure. In: Zsambok, C.E., Klein, G. (Eds.),

Naturalistic decision making. Erlbaum, Mahwah, NJ, pp. 183–196.

Strauch, B., 1997. Automation and decision making – Lessons from the Cali accident. In: Proceedings of the Human Factors and Ergonomics Society 41st Annual Meeting. Human Factors and Ergonomics Society, Santa Monica, CA, pp. 195–199.


Sumwalt, R.L., Thomas, R.J., Dismukes, R.K., 2002. Enhancing flight crew moni-

toring skills can increase flight safety. In Proceedings of the 55th International Air

Safety Seminar, Dublin, Ireland, Nov. 4-7, 2002.

Thomas, M.J.W., 2004. Predictors of threat and error management: identification of

core non-technical skills and implications for training systems design. International

Journal of Aviation Psychology 14 (2), 207–231.

Tversky, A., 1972. Elimination by aspects: A theory of choice. Psychological Review

79, 281–299.

Van Lehn, K., 1990. Mind bugs: The origins of procedural misconceptions. MIT Press,

Cambridge, MA.

Walters, A., 2002. Crew resource management is no accident. Aries, Wallingford.

Wickens, C., Raby, M., 1991. Individual differences in strategic flight management and

scheduling. In: Proceedings of the Sixth International Symposium on Aviation

Psychology. The Ohio State University, Columbus, pp. 1142–1147.

Wickens, C.D., Stokes, A., Barnett, B., Hyman, F., 1993. The effects of stress on pilot

judgment in a MIDIS simulator. In: Svenson, O., Maule, A.J. (Eds.), Time Pressure

and Stress in Human Judgment and Decision Making. Cambridge University Press,

Cambridge, UK, pp. 271–292.

Wiegmann, D.A., Goh, J., O’Hare, D., 2002. The role of situation assessment and flight

experience in pilots’ decisions to continue visual flight rules flight into adverse

weather. Human Factors 44(2), 189–197.

Wiener, E., 1988. Cockpit automation. In: Wiener, E.L., Nagel, D.C. (Eds.), Human

factors in aviation. Academic Press, NY.

Wiggins, M., O’Hare, D., 2003. Weatherwise: Evaluation of a cue-based training

approach for the recognition of deteriorating weather conditions during flight.

Human Factors 45, 337–345.

Woods, D.D., Sarter, N.B., 1998. Learning from automation surprises and "going sour"

accidents: Progress on human-centered automation (NASA Report NCC 2-592). NASA Ames Research Center, Moffett Field, CA.

Zsambok, C., Klein, G. (Eds.), 1997. Naturalistic Decision Making. Erlbaum,

Mahwah, NJ.

