HUMAN FACTORS, 1995, 37(1), 137-148

Situation Awareness Is Adaptive, Externally Directed Consciousness

KIP SMITH¹ and P. A. HANCOCK, University of Minnesota, Minneapolis, Minnesota

We define situation awareness (SA) as adaptive, externally directed consciousness. This definition dispels the artificial and contentious division evident in the literature, according to which SA is either exclusively knowledge or exclusively process. This misdirected rivalry has more to do with general perspectives on the study of human behavior than with SA itself. Through defining SA as an aspect of consciousness, we hope to clarify two key issues. (1) The source of goals with respect to SA is a normative arbiter in the task environment; that is, the behavior that SA generates must be directed at an external goal. (2) SA is the invariant at the core of the agent's perception-action cycle that supports skilled performance; that is, relationships among factors or dimensions in the environment determine what the agent must know and do to achieve the goals specified by the external arbiter. We introduce a construct we call the risk space to represent the invariant relations in the environment that enable the agent to adapt to novel situations and to attain prespecified goals. We articulate this concept of a risk space through use of a specific example in commercial aircraft operations. The risk space structures information about the physical airspace in a manner that captures the momentary knowledge that drives action and that satisfies the goals and performance criteria for safe and efficient flight. We note that the risk space may be generalized to many different means of navigation.

INTRODUCTION

In the early part of this century, consciousness became an unacceptable term in the psychologist's lexicon. In reaction to the unquantifiable subjectivity of introspectionism, Watson (1913) rewrote the course of psychology by excising the "mental" from what is literally the "science of mental life." In the latter part of this century, the concept of consciousness has enjoyed a gradual rehabilitation, albeit cloaked in many guises. Constructs such as attention, mental

¹ Requests for reprints should be sent to Kip Smith, Human Factors Research Laboratory, University of Minnesota, 141 Mariucci Arena, 1901 Fourth St. S.E., Minneapolis, MN 55455.

workload, and now situation awareness (SA) have arisen for consideration. This renaissance is attributable in part to attempts to expand understanding of behavioral abilities beyond simple stimulus-response relationships to include humans performing complex tasks in dynamic environments.

Although human factors has found particular value in each of these constructs, consciousness has yet to be fully reinstated in explanations of an individual's behavior in human-machine systems. Therefore, although the main topic here is situation awareness, our definition of SA and its relation to consciousness has broader implications for the field of human factors and the study of behavior in general.

© 1995, Human Factors and Ergonomics Society. All rights reserved.

Downloaded from hfs.sagepub.com at HFES-Human Factors and Ergonomics Society on September 9, 2016

Our first step is to propose a definition of SA. Second, we characterize SA in terms of the perception-action cycle introduced by Neisser (1976) and as applied by Tenney, Adams, Pew, Huggins, and Rogers (1992). Third, we discuss SA in the contexts of adaptation (Simon, 1982) and of the distinction between competence and performance (Anderson, 1990; Chomsky, 1965; Marr, 1982). Theories of competence specify what must be known in order to solve an information processing problem. Our analysis places SA firmly within the abstract level of explanation that characterizes theories of competence.

We argue that SA specifies what must be known to solve a class of problems posed when interacting with a dynamic environment. Further, we suggest that the existing literature frequently confounds competence with performance. This confound may be responsible for the current failure of SA to gain wide acceptance beyond a cadre of aviation specialists.

We then provide a concrete example of our characterization of SA that uses a novel representation of the en route environment for commercial air traffic operations. This "risk space" representation allows us to illustrate our conception of SA in a practical realm. We argue that the risk space captures the invariant in the perception-action cycle that guides the adaptive behavior of both pilots and air traffic controllers. We indicate how the risk space embodies an operational definition of SA that is amenable to empirical testing and note that it can be used in navigational realms beyond aviation operations.

The final section contrasts our definitions, both theoretical and operational, with the notion of SA as a mental model, which has been propounded by others (e.g., Endsley, 1988). We use this contrast to illustrate both the generality of the position we take and the power of the representation we offer.

SITUATION AWARENESS DEFINED

We propose that SA be defined as adaptive, externally directed consciousness. We take consciousness to be that part of an agent's knowledge-generating behavior that is within the scope of intentional manipulation. As shown in Figure 1, we view SA as generating purposeful behavior (behavior directed toward achieving a goal) in a specific task environment. The products of SA are knowledge about and directed action within that environment. We argue that SA is more than performance. More fundamentally, it is the capacity to direct consciousness to generate competent performance given a particular situation as it unfolds. Given our definition, it is not surprising that we regard SA as directly related to stress, mental workload, and other energetic constructs that are facets of consciousness (Freeman, 1948).

Adaptation to Externally Defined Goals

We see a sequential link from consciousness to SA to adaptation. Adaptation is the process by which an agent channels its knowledge and behavior to attain goals as tempered by the conditions and constraints imposed by the task environment (Holland, 1975/1992). Adaptation ensures that the agent's behavior and goals match the information made available and activity made possible by its environment (Simon, 1982).

Figure 1. An approach to defining situation awareness (SA) through explicit recognition of the centrality of externally oriented consciousness. The central (horizontal) line provides an arbitrary distinction between exogenous and endogenous orientations of consciousness and represents a distinction between SA and introspection.

SA, like adaptation, is a dynamic concept that exists at the interface between the agent and its environment. The study of SA requires assessment of the environment-agent relationship. As with adaptation, SA presumes experience in an environment and the development of an armory of appropriate alternative courses of action (Holland, 1975/1992).

We submit that SA presumes adaptation of a particular kind. As expressed in aviation, and as all skilled practitioners know, SA is about having the "right stuff." This notion implies complete and natural adherence to task goals and to criteria for performance. This, in turn, implies the existence of a specification of the task the agent is to perform and of measures for evaluating that performance. To possess SA, to have the right stuff, the agent must have developed a level of adaptive capability sufficient to match the specification of task goals and the criteria for assessing performance variables. Thus SA is adaptation to a singular source of constraint: a normative arbiter that defines the stuff that is right.

As shown in Figure 2, we see the arbiter and its dicta residing in the task environment. A real problem in the current formulations of SA is the failure to articulate the presence in the environment of normative specifications and criteria for the performance of the agent's task. Although individuals may exhibit situated, outwardly directed consciousness, it is not until the externally defined task is made explicit that their behavior achieves the status we wish to reserve for SA. To qualify for SA, the agent first must intend its goals, beliefs, and knowledge to match the task and performance specified by dicta from its environment and, then, must succeed to some degree in meeting those expectations.

Figure 2. Constraints on SA. The singular constraint is the presence of a normative arbiter of performance in the agent's task environment. The arbiter specifies for the agent task-relevant constraints and criteria for performance. Adaptation to the environment requires the agent to adopt the arbiter's specification of constraints and performance variables. Cues and demands are stimuli that unfold in the environment. The agent's internal constraints are those that shape its intentionality.

In our definition, we intend the phrase externally directed to indicate that the goal of the behavior that SA directs must reside in the task environment rather than in the agent's head. Failure to recognize the role of the normative arbiter of performance has clearly been a source of confusion in the literature on SA. Until an external goal and criteria for achieving it are specified, examination of greater or lesser degrees of SA or even of loss of SA remains impossible, being the same cul-de-sac in which introspectionism found itself. If the agent were to dictate private, incontestable (but dynamic) goals, SA would always be perfect because whatever was perceived would be that goal.

However, in practical realms of human activity, goals and boundaries on performance are often set by others or by nature or made explicit by ourselves at some previous point in time. Such constraints are evident in task-relevant cues and actions. Once the external arbiter is accepted as the key source of constraint on the adaptation of the agent, the directed nature of SA becomes a more manageable construct. Only with a specified goal and concrete performance criteria can we begin to talk about how well adapted a particular agent is with respect to a particular environment.

To grasp the need for an external goal, consider the example of finding your car in a crowded parking lot. If your goal is not explicitly stated, your actions are governed by introspection, which is not SA. However, if you explicitly tell someone else, "I'm looking for my car," then the quality of your performance can be judged, and the competence that guides you to find your car constitutes your SA. This example illustrates that you can be the arbiter of your own performance, as long as you state the goal a priori. Further, to assess SA, it is not sufficient to think you understand the actions of another. Those actions must be assessed in the light of explicitly stated goals.

As it is the relationship between organism and environment that is crucial, SA is appropriately discussed within the framework of the ecological movement (e.g., Gibson, 1979; see also Flach, Hancock, Caird, and Vicente, 1995; Hancock, Flach, Caird, and Vicente, 1995). The ecological approach affirms the importance of the interaction of an agent with its environment in the mutual shaping of the agent's actions. The approach elevates this interaction to make it the "unit of analysis" in any assessment of adapted behavior. Explanation is found in the interaction of agent and environment rather than in the agent (human or machine), separately considered. Given this perspective, to comprehend SA without a viable understanding of the interaction between agents and their task environment would be virtually impossible. An ecological view emphasizes that if SA continues to be studied with techniques that focus overwhelmingly on the agent as the individual unit of concern, the critical invariants in the agent-environment interaction are about as likely to emerge as is the Encyclopaedia Britannica from the combined efforts of Sir Arthur Eddington's anthropoid typists (but see Koestler, 1978).

To return to our definition, an agent's SA generates knowledge and action, given the structure of its environment and the goals and performance criteria specified in that environment. The explicit use of the term given reveals our conviction that the root of SA is the adaptive capacity to guide behavior in response to dynamic situations. It is important to note that this definition denies a claim of SA to any agent who is merely conscious and attending to its environment. Further, it denies a claim of SA to an agent that may be fully capable of SA but that is not actively pursuing the goal specified by the arbiter of performance. Rather, to stake a claim to SA, an agent must be seeking information and taking action in pursuit of an externally specified goal. Without the normative focus of an externally specified goal, SA degenerates into introspection.

Competence, Performance, and Situation Awareness

Adapted, externally directed consciousness (SA) endows the agent with the competence to generate appropriate behavior in response to dynamic situations. The distinction between competence and performance is a persistent theme in the study of cognition (Anderson, 1990; Chomsky, 1965; Marr, 1982). Competence directs behavior but is independent of the situation. Competence melds knowledge, abilities, and cognition to generate appropriate behavior given conditions in the task environment. Competence resides in the agent and enables the agent to generate skilled performance. Performance is action situated in the world, a momentary phenomenon that is guided by competence but must be distinguished from it. Performance is contingent on information available in the environment; competence is independent of the particulars of a situation. The utility of this distinction is the leverage that an analysis of competence provides to developing an understanding of the normatively focused behavior that characterizes SA.

An analysis of competence asks a simple question (Marr, 1982): What is the problem for which this agent's behavior is the solution? Specification of the problem focuses on the agent's knowledge and goals, the information available in the environment, and the actions the agent may take to meet such goals. Specification of competence details what the agent must know and do to attain its goal (to solve the problem). An analysis of competence is unconcerned with the actual processes (e.g., representations, mental models) that produce the agent's performance. As Newell (1981) pointed out, the resulting description of competence places constraints on behavior but does not prescribe performance. Full prescription or emulation of the agent's behavior requires a complete accounting of the agent's representation and processing.

The problem to be solved by SA is coming to know what must be known in order to behave in accord with the mandates of the arbiter of performance. SA is the competence that directs the agent's sampling of factors in the environment that determine how the agent can come to know what it must do. In contrast, the behavior that competence generates is the solution to a specific problem. The behavior that SA generates is the agent's solution to the problem of knowing those cues and demands in the environment that enable it to take action that aligns with the dicta of the arbiter of performance.

Tenney et al. (1992) proposed that Neisser's (1976) perception-action cycle provides a framework for understanding how SA works. In essence, an agent and its environment interact in a manner to satisfy the arbiter by generating skilled performance. We agree. The perception-action cycle is shown in Figure 3. Information and action flow continuously around the cycle. Starting arbitrarily at the top, the environment informs the agent, modifying its knowledge. Knowledge directs the agent's activity in the environment. That activity samples and perhaps anticipates or alters the environment, which in turn informs the agent. The informed, directed sampling and/or anticipation capture the essence of behavior characteristic of SA.

Figure 3. Neisser's (1976) perceptual cycle. The invariant at the core of the cycle specifies the agent's adaptation to its environment. It structures the information made available by the environment, the agent's knowledge, and the actions the agent takes to meet the constraints specified by the arbiter of performance.

To go beyond performance, to capture the competence that is SA, we include in Figure 3 the invariant that links the elements of the perception-action cycle. The invariant is the structure of the agent's adaptation to the environment: It forms the linkage among information, knowledge, and action that produces competent behavior. Specifically, the invariant codifies the information that the environment may make available, the knowledge the agent requires to assess that information, and the action the knowledge will direct the agent to take to attain its goals. We propose therefore that SA is specified by the invariant at the core of an adapted agent's perception-action cycle.

Some readers may be puzzled by the fact that we conjoin adaptability and invariance. We contend that there is no incompatibility. Rather, it is the invariant structure in the agent's interaction with its environment that generates up-to-the-minute knowledge and appropriate action.

The invariant structure enables the agent to anticipate and respond skillfully to novel conditions and signals in a dynamic task environment; that is, to adapt to high rates of change, time pressure, and/or uncertainty.

The environment contributes to the invariant those factors or dimensions that define sources of information to which the agent must attend in order to attain the arbiter's goal. The agent's adaptation contributes the knowledge and methods for assessing that information. The environment specifies appropriate alternative behaviors. The agent's adaptation generates those alternatives. With its consciousness directed at factors relevant to the arbiter's goal, the agent modifies its knowledge in a manner that affords appropriate action (e.g., Gibson, 1979). Although each situation that the agent faces is likely to present a novel array of information, the invariant relations that exist in each array guide successful problem solving and skilled performance of the task.

By defining SA as a generative process of knowledge creation and informed action taking, we expressly deny that SA is merely a snapshot of the agent's current mental model. Rather, SA guides the process of modifying knowledge, that is, of constructing a representation of current and likely events. The experience of air traffic controllers "losing the picture" illustrates this point. As Hopkin (1988) recounted, the job of the controllers is to construct knowledge of their sectors and to take action on the basis of this knowledge. They call the knowledge they generate "the (big) picture." At times, they "lose the picture": their knowledge becomes insufficient to support their task. Experienced controllers on the job are often sufficiently self-aware to recognize they are losing the picture as it happens.

This metaknowledge argues our point: SA is not the controllers' picture. Rather, it is the controllers' SA that builds the picture and that enables them to know that what they know is insufficient for the increasing demands. SA not only supports the construction of the picture but also guides the assessment of its integrity.

THE RISK SPACE REPRESENTATION OF THE FLIGHT DECK/ATC INVARIANT

Commercial aviation may epitomize a task that qualifies for SA. The task environment contains clear and unequivocal normative arbiters of performance: airlines and the Federal Aviation Administration (FAA, and its counterparts around the world). The agents are pilots and controllers. The arbiter establishes the agents' goals and performance criteria for safe and efficient flight. In what follows we present a hypothesis about the invariant structure of the aircraft-airspace system that guides skilled performance on the flight deck and in the air traffic control (ATC) suite.

Our candidate representation for the invariant in commercial aviation is a multidimensional risk space (Smith and Hancock, 1992). The risk space is a generalization of a mathematical formulation of human performance in process control tasks (Moray, 1986; Phatak and Bekey, 1969). The axes of any risk space are those factors in the environment that compromise safety. These safety factors define sources of information to which the agent attends in order to satisfy the arbiter's norms for performance. Thresholds of safety parse the risk space into decision regions. The thresholds are performance criteria, defined by either the arbiter of performance or the agent, that differentiate alternative control decisions. Each decision region is associated with a unique set of appropriate alternatives (actions). Information generated by the task environment (the airspace) is continually posted and updated within the risk space. Figure 4 schematically presents a portion of the flight deck/ATC risk space.

Two safety factors, horizontal separation between aircraft and relative velocity at altitude, define the axes of the portion of the ATC risk space sketched in Figure 4. Maintaining separation is one of the FAA's prime directives. Avoiding high-velocity convergence is a paramount concern. Separation and relative velocity determine the information that both pilots and

[Figure 4 appears here: a plot of range (horizontal separation between two aircraft) against range delta (relative velocity at altitude), with heavy threshold lines parsing the space into decision regions; the region nearest the origin is labeled "resolve," and collision lies at the origin itself.]

Figure 4. The horizontal plane of the flight deck/ATC risk space for the en route environment. The y axis is the horizontal distance (separation, range) between two aircraft. The x axis is the relative velocity at which the aircraft are approaching or diverging. At any time, positions and airspeeds define a point in the risk space. Flight through the airspace traces a path through the risk space. The slopes of lines in the risk space have units of time (e.g., 30 and 300 s). The line connecting the point defined by two aircraft to the origin of the axes represents the time remaining before the aircraft collide. We propose that two such lines form thresholds for anticipating and resolving a conflict. These thresholds of safety, shown by heavy lines, parse the risk space into decision regions. Each region is associated with a set of safe and effective decision alternatives. The set of alternatives is defined by the invariant in the flight deck/ATC perception-action cycle.

controllers must attend to in order to know what they must know to attain the goals specified by the FAA. Note that time is not an axis of the risk space. The risk space represents time both implicitly and explicitly. At every instant, the locations and airspeeds of two aircraft define a unique point in the risk space. As the aircraft fly

through the airspace, the point charts a path through the risk space. Figure 4 shows one such path: two aircraft are converging at a rate that is decreasing; one (or both) may be making a course adjustment.

As shown in Figure 4, the slope of any straight line in the risk space has units of time. A line

drawn from the point defined by two aircraft to the intersection of the axes defines the time when those aircraft will collide if they maintain their headings and airspeeds. Lines like those shown in Figure 4 are solutions to the basic problem of en route navigation: They specify the time to collision. We submit that pilots and controllers pay attention to information about separation and airspeed in order to solve the problem of whether (and when) aircraft are likely to collide. The risk space makes the solution to this problem transparent. By capturing the invariant relationship among location, airspeed, and time to collision, the risk space enables pilots and controllers to make appropriate decisions and take action.
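The geometry described above can be sketched in code. The following is a minimal illustration, not taken from the paper: the function names, the flat-plane kinematics, and the units (nautical miles for position, knots for velocity) are our assumptions. It maps a pair of aircraft states to a point in the risk space and reads the time to collision off the slope of the line connecting that point to the origin.

```python
import math
from dataclasses import dataclass

@dataclass
class Aircraft:
    # Horizontal position (nautical miles) and velocity (knots) at a common altitude.
    x: float
    y: float
    vx: float
    vy: float

def risk_space_point(a: Aircraft, b: Aircraft) -> tuple:
    """Map a pair of aircraft to a point in the risk space.

    Returns (range, closure_rate): the horizontal separation in nautical miles
    and the rate (knots) at which that separation is shrinking
    (positive means the aircraft are converging).
    """
    dx, dy = b.x - a.x, b.y - a.y
    rng = math.hypot(dx, dy)
    dvx, dvy = b.vx - a.vx, b.vy - a.vy
    # Range rate is the projection of relative velocity onto the line of sight;
    # negate so that converging aircraft yield a positive closure rate.
    closure = -(dx * dvx + dy * dvy) / rng
    return rng, closure

def time_to_collision(rng: float, closure: float) -> float:
    """Slope of the line from the risk-space point to the origin, in hours.

    If the aircraft are not converging, no collision is projected.
    """
    if closure <= 0:
        return math.inf
    return rng / closure
```

For two aircraft 10 nmi apart and closing head-on at a combined 600 knots, the risk-space point is (10, 600), which lies on the 60 s line; diverging aircraft yield an infinite time to collision and therefore never cross a threshold.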

Thresholds parse the risk space into decision regions associated with alternative actions. We propose that thresholds of time prompt pilots and controllers to invoke specific classes of decisions. Figure 4 depicts three decision regions populated by three classes of decisions (and actions): proceed on course, anticipate potential future conflicts, and resolve impending conflicts. The thresholds and decision regions in the risk space define what pilots and controllers must do in order to attain the goals specified by the FAA.

The decision region labeled "proceed" is the region where there is little danger of collision. The goal of all control decisions is to keep all aircraft in the decision region labeled "proceed." Conversely, the small region labeled "resolve" corresponds to situations where collision is imminent and pilots must take evasive action. The region in the middle, labeled "anticipate," is an oversimplification, but in general it is the region where ATC and pilots go on alert and come to decisions that are intended to prevent a conflict from materializing.

The risk space framework integrates information critical to safety considerations in a manner that specifies the action appropriate for a given sector at a particular time. The decision regions specify (and can be used to communicate) the actions that pilots and controllers need to take given the unfolding of events in the airspace. Together, the axes, thresholds, and decision regions of the risk space present a complete account of what must be known and done to traverse the airspace safely and efficiently. These relations, as represented by the risk space, afford the momentary knowledge of the distribution of aircraft in the airspace that guides the successful performance of both pilots and controllers. The two additional axes, vertical separation and relative velocity, are orthogonal to those shown in Figure 4. The full representation of the flight deck/ATC risk space requires visualization of a four-dimensional hyperspace where thresholds form shells. The thresholds enclose regions in which alternative actions are appropriate and specified. Points pass through these regions as pairs of aircraft fly through the airspace.

The risk space facilitates prognosis but is itself invariant across airspace (sectors) and time. The risk space enables the pilot/controller to adapt to changing conditions in the task environment, to focus on the dimensions of risk that are critical, and consequently to take appropriate action. By capturing the invariant relations that define skilled performance and by prescribing that performance, the risk space specifies the competence of skilled agents.

The risk space is an observer construct: It is a paramorphic representation (Dawes, 1979) of the adaptation of skilled pilots and controllers. We do not claim that any individual pilot or controller necessarily visualizes a four-dimensional risk space. Our claim is that the risk space framework (when fully developed) specifies competence at decision making and, therefore, can be used to explain and predict skilled performance. The risk space explains and predicts skilled performance in commercial aviation. Its axes, thresholds, and decision regions capture competence at knowing what must be known and doing what must be done when traversing the airspace.

The risk space stands as an operational definition of SA in the en route environment. Specifically, the risk space codifies the relevant information (separation and relative velocity) that the environment makes available. Further, the risk space codifies as thresholds of risk the knowledge the agent requires to assess that information. Finally, the risk space embeds in decision regions the actions that an adapted agent's knowledge will direct it to take to meet its goals.

To our knowledge, this specificity is unparalleled in SA. Our current research is directed at defining (1) the locations of the thresholds under a wide range of airspace configurations and (2) the set of appropriate actions given those configurations. The risk space hypothesis of the SA of pilots and controllers in the en route environment is open to empirical challenge. It is comforting, however, to see that this same conceptualization has been independently generated by others for use in the realm of road vehicle collision avoidance (Fancher et al., 1994).

THE CURRENT ALTERNATIVE: A SYMBOL WITH MULTIPLE REFERENTS

Published accounts of SA appear rooted in the assumption that SA is behavior. Such a definition allows SA infinite variability: SA would become whatever an agent has and/or does that makes an observer believe the agent is skilled. At best, such an approach is descriptive. For instance, Endsley's (1988; 1995 [this issue, article on theory]) pioneering account equates SA directly with "an important component of ... performance" (Endsley, 1988, p. 97). She stated that SA "can be conceived of as the pilot's internal model of the world around him at any point in time" (1988, p. 97). The problem here is the attempt to define a new term (SA) with reference to a second (the mental model) that is itself ill defined and the subject of much contention and confusion. Like SA, mental models are intuitively appealing but difficult to corral.

We assume that the term mental model is used in the literature to refer to an organized structure of task-relevant information that allows mental simulation of processes, causes, or events in the environment (e.g., Gentner and Stevens, 1983; Johnson-Laird, 1983). With respect to Endsley's assertion, in order to define SA, we would have to develop a complete specification of an agent's mental model. Such a specification would obviously vary from agent to agent, even in the same situation. To equate SA with momentary knowledge and mental models is to run the risk of allowing SA to degenerate rapidly into "whatever is inside your (skilled) head."

Equating SA with mental models also appears to confound knowledge with process. For instance, Endsley (1988) defined three levels of SA: perception of the elements in the environment within a volume of space and time, comprehension of their meaning, and projection of their status in the near future. These levels sound very similar to three stages of information processing. We suggest that the three levels are, in fact, isomorphic with the processes posited in Newell and Simon's (1972) generic model of the human problem solver. In Newell and Simon's account, the agent samples the environment, generates a representation of that environment, and invokes methods that transform the representation and, on occasion, effect change in the environment. The outputs of the processes of sampling, representation, and transformation (projection) are knowledge and behavior. The processes themselves are neither knowledge nor behavior. Accordingly, it is inconsistent to define SA in terms of three levels of processing while equating it with momentary knowledge.

The evident confusion in the assertion that SA is performance and/or mental model spurs our desire to offer an alternative account. Lest we be viewed as mere critics, we are the first to acknowledge that our definition of SA draws heavily on previous work, especially Endsley's. We take, however, a more ecological approach to define SA as the invariant in the agent-environment system that generates the momentary knowledge and behavior required to attain the goals specified by an arbiter of performance in the environment.

DISCUSSION

We have argued that SA is a facet of consciousness. It is one in a long line of energetic constructs (e.g., attention, stress, and workload) that in a progressive fashion have reintroduced consciousness into scientific discussion of human behavior. As consciousness has re-emerged, age-old questions as to its nature (product or process, knowledge or performance) have percolated to the surface. We submit that a philosophical resolution of these questions is unlikely to be achieved in any facile manner. However, for practical purposes, those in the field of human factors can impose some operational bounds that allow the definition of terms.

We have articulated such bounds. We posit first that SA is externally directed toward a task environment. This means SA is a facet of consciousness, not necessarily all of consciousness. Second, for SA to exist, constraints on performance must be made explicit in the environment. SA is referenced to those goals and boundaries on performance. Finally, we recognize SA as an invariant component in an adaptive cycle of knowledge, action, and information. In this cycle, knowledge directs adaptive behavior, which modifies the environment, which in turn informs knowledge; and so the cycle continues. Adaptive behavior that satisfies the arbiter of performance is the desired by-product of a cycle that is driven by competence.
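The knowledge-action-information cycle just described can be rendered as a loop. The toy environment, policy, and dictionary structure below are illustrative assumptions of ours, not anything the text specifies; the sketch shows only the ordering of the cycle's steps.

```python
def perception_action_cycle(env, agent, steps):
    """Run the adaptive cycle: information -> knowledge -> action -> environment.

    `env` is a dict with a mutable "state" and a "dynamics" function;
    `agent` is a dict with "knowledge" and a "policy" function.
    Both are hypothetical stand-ins for an agent-environment system.
    """
    trace = []
    for _ in range(steps):
        info = env["state"]                    # the environment makes information available
        agent["knowledge"] = info              # information informs knowledge
        action = agent["policy"](agent["knowledge"])  # knowledge directs adaptive behavior
        env["state"] = env["dynamics"](env["state"], action)  # behavior modifies the environment
        trace.append(action)
    return trace
```

For example, with a scalar state and a policy that steps toward zero (`policy = lambda k: -1 if k > 0 else 0`), the trace of actions converges once the environment satisfies the goal, mirroring the cycle driven by competence.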

Our discussion of agents and environments supports our proposal that SA is restricted to tasks that present clear and defined objectives that can be externally verified. Consequently, tasks with goals that are accessible only to introspection do not engender SA in the sense that we define it. SA is the competence that generates skilled performance in task environments that define the agent's goals. As a competence-level construct, SA guides our evaluation of an agent's performance. We therefore advocate developing representations of an agent's SA, like the risk space of Figure 4, for their potential to guide analyses of the interaction of skilled agents and dynamic task environments. However, we caution that much remains to be understood about how SA relates to or coincides with other facets of consciousness that have enjoyed a similar vogue in the past (see Wickens, 1993).


We do not apologize if our approach proves disturbing to those who desire static definitions. Any useful notion of SA must take into account the central problems of generativity and nonstationarity posed by environments that are continually changing and by agents who are continually adapting and learning. The risk space is an observer construct that, we contend, captures the structure of a skilled agent's adaptation to a dynamic environment filled with moving targets. The risk space is defined by sources of information, by criteria for action, and by the actions themselves; together, these factors address the goals and standards for performance specified by a normative arbiter. The sources of information, criteria, and actions are invariant across situations; the risk space is the structure that makes them operative. Like a grammar for language, the risk space generates all possible situations and prescribes adaptive, appropriate behavior. The risk space is an operational definition of the SA of pilots and controllers in the midair (en route) environment. The invariant structure it captures predicts the cycle of adaptive behavior that all recognize as the product of SA.

At a very simple level, SA is an appropriate awareness of a situation. This implies that there are some things in a situation that have central importance and other things that are not relevant at all. Because the situations of interest to those studying human factors are ubiquitously dynamic, the things that are important and the things that are irrelevant can change with little warning. Others have seen fit to seize on this point to describe SA as a characteristic of the individual agent, who may, therefore, be said to have good or poor SA, depending on his or her propensity. What we have tried to affirm here is that SA is not resident in the agent but exists in the invariant interaction of the agent and his or her environment.

We suggest that SA may be good or poor depending on factors that constrain that interaction. It is correct to note that the behavior that SA generates is amenable to manipulation through change in the way information is displayed. With others in the ecological movement, we suggest that context-specific emergent properties of a task may be displayed so that the critical aspects of a situation are brought to awareness (Bennett and Flach, 1992; Hansen, 1995; Vicente and Rasmussen, 1992).

We have proposed here that for the context we have specified, en route navigation, the emergent property of the environment is described in a multidimensional hyperspace with axes representing separation and the rate of change of separation. To achieve the arbiter's goal of collision avoidance, pilots and controllers need be aware only of interobject location in this space. Through this specification we bound the nature of the navigation problem and overcome the infinite regress of context change and awareness change with every fluctuation of the dynamics.
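The two axes named above, separation and its rate of change, are directly computable from aircraft state. The sketch below is ours and assumes planar position and velocity vectors in consistent units; the function name and interface are hypothetical.

```python
import math

def separation_and_closure(p1, v1, p2, v2):
    """Return (separation, closure_rate) for two aircraft.

    p1, p2 are (x, y) positions; v1, v2 are (vx, vy) velocities.
    closure_rate is -d(separation)/dt: positive when the aircraft
    are converging, negative when they are diverging.
    """
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]          # relative position
    dvx, dvy = v2[0] - v1[0], v2[1] - v1[1]        # relative velocity
    sep = math.hypot(dx, dy)
    # Project relative velocity onto the line of sight; negate so that
    # a shrinking separation yields a positive closure rate.
    closure = -(dx * dvx + dy * dvy) / sep if sep > 0 else 0.0
    return sep, closure
```

The pair `(sep, closure)` is exactly the interobject location in the proposed space; tracking it over time traces a trajectory through the space's decision regions.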

The specifications of the risk space also address the question of how future interfaces may be constructed. It is clear that the evolution of the human-machine interface has been driven by the mediation of computer control. Equally clearly, fostering facile interactions has mandated progressively more use of human spatial abilities. Alphanumeric strings have become vestigial descriptions, whereas icons and three-dimensional representations have flourished. As a result, we expect that the emergent spaces that describe the constraints on system performance will be used to generate visual boundaries in virtual reality.

Interaction in virtual reality has yet to match prognostications (Kozak, Hancock, Arthur, and Chrysler, 1993), but the concept of graphic boundaries to operational status should, in time, enhance performance by adaptively displaying the invariants in dynamic operational phase spaces. In sum, we see value in the phenomenon called situation awareness. We expect to see it and other, even clearer, manifestations of consciousness as we design and evaluate purposive human-machine-environment systems.

ACKNOWLEDGMENTS

Parts of this work were presented at the First Annual Center for Applied Human Factors in Aviation International Meeting on Situation Awareness, Orlando, Florida, February 1993, and at the 37th Annual Meeting of the Human Factors and Ergonomics Society in Seattle, Washington, October 1993. Support for portions of the present work came from the Federal Aviation Administration, Grant 93-G-048 (John Zalenchak, technical monitor), and from the U.S. Navy, Contract N/N62269-92-0211 (Jeff Morrison, technical monitor), to the second author. The views expressed are those of the authors and do not necessarily represent those of the named agencies.

REFERENCES

Anderson, J. R. (1990). The adaptive character of thought. Hillsdale, NJ: Erlbaum.

Bennett, K. B., and Flach, J. M. (1992). Graphical displays: Implications for divided attention, focused attention, and problem solving. Human Factors, 34, 513-533.

Chomsky, N. (1965). Aspects of the theory of syntax. Cambridge, MA: MIT Press.

Dawes, R. M. (1979). The robust beauty of improper linear models in decision making. American Psychologist, 34, 571-582.

Endsley, M. R. (1988). Design and evaluation for situation awareness enhancement. In Proceedings of the Human Factors Society 32nd Annual Meeting (pp. 97-101). Santa Monica, CA: Human Factors and Ergonomics Society.

Fancher, P. S., Ervin, R. D., Bareket, Z., Johnson, G. E., Trefait, M., Tiedecke, J., and Hagleitner, W. (1994, April). Intelligent cruise control: Performance studies based upon an operating prototype. Paper presented at the 1994 Annual Meeting of IVHS America, Atlanta, GA.

Flach, J., Hancock, P. A., Caird, J. K., and Vicente, K. (Eds.). (1995). Global perspectives on an ecological approach to human-machine systems. Hillsdale, NJ: Erlbaum.

Freeman, G. L. (1948). The energetics of human behavior. Ithaca, NY: Cornell University Press.

Gentner, D., and Stevens, A. L. (1983). Mental models. Hillsdale, NJ: Erlbaum.

Gibson, J. J. (1979). The ecological approach to visual perception. Boston: Houghton Mifflin.

Hancock, P. A., Flach, J., Caird, J. K., and Vicente, K. (Eds.). (1995). Local applications in an ecological approach to human-machine systems. Hillsdale, NJ: Erlbaum.

Hansen, J. P. (1995). Representation of system invariants by optical invariants in configural displays for process control. In P. A. Hancock, J. M. Flach, J. K. Caird, and K. J. Vicente (Eds.), Local applications in an ecological approach to human-machine systems. Hillsdale, NJ: Erlbaum.

Holland, J. H. (1992). Adaptation in natural and artificial systems. Ann Arbor: University of Michigan Press. (Original work published 1975)

Hopkin, V. D. (1988). Air traffic control. In E. L. Wiener and D. C. Nagel (Eds.), Human factors in aviation (pp. 639-663). San Diego, CA: Academic.

Johnson-Laird, P. N. (1983). Mental models. Cambridge, MA: Harvard University Press.

Koestler, A. (1978). Janus. New York: Random House.

Kozak, J. J., Hancock, P. A., Arthur, E. J., and Chrysler, S. T. (1993). Transfer of training from virtual reality. Ergonomics, 36, 777-784.

Marr, D. (1982). Vision. San Francisco: Freeman.

Moray, N. (1986). Monitoring behavior and supervisory control. In K. R. Boff, L. Kaufman, and J. P. Thomas (Eds.), Handbook of perception and human performance (Vol. 2, pp. 40.1-40.51). New York: Wiley.

Neisser, U. (1976). Cognition and reality: Principles and implications of cognitive psychology. San Francisco: W. H. Freeman.

Newell, A. (1981). The knowledge level. Artificial Intelligence, 18, 87-127.

Newell, A., and Simon, H. A. (1972). Human problem solving. Englewood Cliffs, NJ: Prentice-Hall.

Phatak, A. V., and Bekey, G. A. (1969). Decision processes in the adaptive behavior of human controllers. IEEE Transactions on Systems Science and Cybernetics, 5, 339-352.

Simon, H. A. (1982). The sciences of the artificial (2nd ed.). Cambridge, MA: MIT Press.

Smith, K., and Hancock, P. A. (1992). Managing risk under time stress. In Proceedings of the Human Factors Society 36th Annual Meeting (pp. 1019-1023). Santa Monica, CA: Human Factors and Ergonomics Society.

Tenney, Y. J., Adams, M. J., Pew, R. W., Huggins, A. W. F., and Rogers, W. H. (1992). A principled approach to the measurement of situation awareness in commercial aviation (NASA Contractor Report 4451). Langley, VA: NASA.

Vicente, K. J., and Rasmussen, J. (1992). Ecological interface design: Theoretical foundations. IEEE Transactions on Systems, Man, and Cybernetics, 22, 589-606.

Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20, 158-177.

Wickens, C. D. (1993). Workload and situation awareness: An analogy of history and implications. Insight: The Visual Performance Technical Group Newsletter, 14, 4.

Date received: May 6, 1993
Date accepted: October 4, 1994
