Dynamic Perception of Facial Affect and Identity in the Human Brain

Kevin S. LaBar, Michael J. Crupain, James T. Voyvodic¹ and Gregory McCarthy¹

Center for Cognitive Neuroscience and ¹Brain Imaging and Analysis Center, Duke University, Durham, NC 27708, USA

Cerebral Cortex Oct 2003;13:1023–1033; 1047–3211/03/$4.00
© Oxford University Press 2003. All rights reserved.


Functional magnetic resonance imaging (fMRI) was used to compare brain activation to static facial displays versus dynamic changes in facial identity or emotional expression. Static images depicted prototypical fearful, angry and neutral expressions. Identity morphs depicted identity changes from one person to another, always with neutral expressions. Emotion morphs depicted expression changes from neutral to fear or anger, creating the illusion that the actor was ‘getting scared’ or ‘getting angry’ in real-time. Brain regions implicated in processing facial affect, including the amygdala and fusiform gyrus, showed greater responses to dynamic versus static emotional expressions, especially for fear. Identity morphs activated a dorsal fronto-cingulo-parietal circuit and additional ventral areas, including the amygdala, that also responded to the emotion morphs. Activity in the superior temporal sulcus discriminated emotion morphs from identity morphs, extending its known role in processing biologically relevant motion. The results highlight the importance of temporal cues in the neural coding of facial displays.

Introduction

An important function of facial expression is the social communication of changes in affective states. The dynamic perception of expressive features likely recruits specialized processing resources to direct appropriate actions in response to observed sequences in facial motion. Such dynamic information may be integral to the mental representation of faces (Freyd, 1987).

Behavioral studies have shown that humans are sensitive to temporal cues in facial displays. For example, subjects can temporally order scrambled sequences of videotaped emotional reactions, even when consecutive frames contain subtle transitions in facial expression (Edwards, 1998). In fact, performance improves under time constraints, suggesting that the extraction of dynamic features in facial expression occurs relatively automatically. In other circumstances, dynamic information contributes to face recognition abilities (Christie and Bruce, 1998; Lander et al., 1999) and judgements of facial affect (Bassili, 1978, 1979; Kamachi et al., 2001) and identity (Seamon, 1982; Hill and Johnston, 2001; Thornton and Kourtzi, 2002). As with other aspects of emotional perception, identification of expression changes may exhibit mood-congruent biases. Niedenthal and colleagues (Niedenthal et al., 2000) showed that participants induced into a sad or happy mood take longer than controls to detect a change in morphed expressions that slowly decrease in intensity of displayed sadness or happiness, respectively.

Motion cues may also dissociate perceptual abilities in patients with neurologic and developmental disorders. Humphreys et al. (Humphreys et al., 1993) reported a double dissociation in two patients relative to performance on facial affect and identity tasks. Prosopagnosic patient H.J.A., who sustained ventral occipitotemporal damage, had difficulties with both facial identity and expression judgements using static photographs. However, his performance improved when asked to categorize facial expressions using moving point-light displays. On the other hand, patient G.K., who sustained bilateral parietal lobe damage, had relatively good performance on facial identity tasks but was impaired at facial affect recognition using either static or dynamic cues. Children with psychopathic tendencies also present with selective impairments in identifying emotion from cinematic displays of slowly morphing expressions (Blair et al., 2001). In contrast, autistic children may benefit from slow dynamic information when categorizing emotional expressions (Gepner et al., 2001). This latter finding differs from other autistic deficits on motion-processing tasks that require faster temporal integration, including lip reading (de Gelder et al., 1991; Spencer et al., 2000). Finally, the perception of biological motion in Williams syndrome is spared relative to other aspects of motion perception and visuomotor integration (Jordan et al., 2002).

The neural substrates that mediate dynamic perception of emotional facial expressions are unknown. Previous neuroimaging studies have been limited to posed snapshots that lack temporal cues inherent in everyday socioemotional interactions. These studies have emphasized category-specific representations in the amygdala and associated frontolimbic structures by comparing responses to faces portraying basic emotions [e.g. (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1997, 1998; Sprengelmeyer et al., 1997; Whalen et al., 1998, 2001; Blair et al., 1999; Kesler-West et al., 2001)]. A separate line of research has revealed brain regions responsive to biological motion, including movement of face parts, but the role of emotion has been largely untested. Shifts of eye gaze, mouth movements and ambulation videos using animation or point-light displays elicit activity in the superior temporal sulcus and anatomically related areas [reviewed in Allison et al. (Allison et al., 2000)]. Some of these regions, such as the amygdala (Bonda et al., 1996; Kawashima et al., 1999), also participate in facial affect recognition, suggesting a potential link between dorsal stream processing of biological motion and ventral stream processing of emotional salience.

The present study was designed to integrate these literatures by investigating perception of negative affect using facial stimuli that varied in their dynamic properties. Prototypical expressions of fear and anger were morphed with neutral expressions of the same actors to form the impression that the actors were becoming scared or angry in real-time (Fig. 1). fMRI activation to the emotion morphs was contrasted with the static expressions. In addition, identity morphs were created that blended facial identities across pairs of actors with neutral expressions. This condition was included to evaluate the specificity of the results with respect to changes in facial affect versus identity, and to dissociate the signaling of biologically plausible from implausible motion. Hypotheses about four brain regions were made a priori:



1. The amygdala would preferentially respond to fear stimuli relative to the other categories and would show greater activation to fear morphs than static fear images.

2. The fusiform gyrus would respond to all stimuli but would show a similar bias as the amygdala, given the neuromodulatory influence of the amygdala on face processing along the ventral visual stream (Morris et al., 1998; Pessoa et al., 2002).

3. The superior temporal sulcus would discriminate emotion from identity morphs, since only the emotion morphs are biologically plausible.

4. Visual area V5/MT+ would respond to all dynamic stimuli indiscriminately, given its sensitivity to visual motion cues (Tootell et al., 1995; Huk et al., 2002) and its sparser connectivity with the amygdala relative to rostroventral sectors of the visual stream (Amaral et al., 1992).

Materials and Methods

Subjects

Twelve healthy adults provided written informed consent to participate in the study. Two of these subjects were dropped due to excessive head movement (center-of-mass motion estimates >3.75 mm in the x, y or z planes). The remaining 10 participants (five male, five female; age range = 21–30 years) were included in the statistical analysis. All participants were right-handed and were screened for history of neurologic and psychiatric illness and substance abuse. Procedures for human subjects were approved by the Institutional Review Board at Duke University.

Stimulus Development

Facial affect stimuli that are panculturally representative of basic emotions were taken from the Ekman series (Ekman and Friesen, 1976; Matsumoto and Ekman, 1989). Prototypical expressions of fear and anger were morphed with neutral expressions of the same actor to create the dynamic emotional stimuli. The expression change depicted in the morph always portrayed increasing emotional intensity (i.e. from neutral to 100% fear, or neutral to 100% anger). In addition, pairs of actors with neutral expressions were morphed to create dynamic changes in identity. Identity morphs always combined pairs of actors of the same gender and ethnicity. All actors in the emotion morphs were included in the identity morphs, and all actors in the static images were included in the dynamic stimuli. A subset of actors portrayed both fear and anger.

Emotion morphs were used instead of videotaped expressions to allow experimental control over the rate and duration of the changes, as in previous studies [e.g. (Niedenthal et al., 2000)]. Morphs were created using MorphMan 2000 software (STOIK, Moscow, Russia). All faces were initially cropped with an ovoid mask to exclude extraneous cues (hair, ears, neckline, etc.). The images were then normalized for luminance and contrast and presented against a mid-gray background. Approximately 150 fiducial markers were placed on each digital source image in the morph pair and individually matched by computer mouse to corresponding points on the target image. Areas of the face relevant for perceiving changes in identity and expression, such as the eyes, mouth, and corrugator and orbicularis oculi muscles, were densely sampled (Ekman and Friesen, 1978; Bassili, 1979). All expressions were posed with full frontal orientations (i.e. there were no changes in viewpoint either across or within morphs). Morphs were presented at a rate of 30 frames/s, consistent with previous studies (Thornton and Kourtzi, 2002). Forty-three frames were interpolated between the morph end points to provide smooth transitions across a 1500 ms duration. The final morph frame was presented for 200 ms for a total stimulus duration of 1700 ms. This duration approximates real-time changes of facial affect using videotaped expressions (Gepner et al., 2001). Morphs were saved in .avi format and displayed as movie clips. Static displays of 100% fear, 100% anger, and neutral expressions were taken from the first and last frames of the emotion and identity morph movies and were presented for the same total duration as the morphs. Figure 1 illustrates four frames of a neutral-to-fear morph. Complete examples of neutral-to-anger and identity morph movies can be found at http://www.mind.duke.edu/level2/faculty/labar/face_morphs.htm.
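For concreteness, the frame-timing arithmetic above can be sketched as follows. This is a minimal illustration only: a linear cross-dissolve stands in for MorphMan's fiducial-point warping, so the blending function is an assumption, but the frame counts and durations match the description above.

```python
# Sketch of the morph timing described above: 43 interpolated frames
# plus the two endpoint images give 45 frames, which at 30 frames/s
# spans 1500 ms; holding the final frame for 200 ms yields 1700 ms total.
import numpy as np

FPS = 30
N_INTERP = 43  # frames interpolated between the morph end points

def morph_frames(source, target):
    """45-frame sequence running from `source` (0%) to `target` (100%)."""
    weights = np.linspace(0.0, 1.0, N_INTERP + 2)  # endpoints included
    return [(1 - w) * source + w * target for w in weights]

frames = morph_frames(np.zeros((256, 256)), np.ones((256, 256)))
transition_ms = 1000 * len(frames) / FPS   # 45 / 30 s = 1500 ms
total_ms = transition_ms + 200             # final-frame hold -> 1700 ms
```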

Experimental Design

Participants viewed 36 unique exemplars of each of four stimulus categories — static neutral, static emotional, dynamic neutral (identity morph), dynamic emotional (emotion morph). Half of the emotional stimuli represented fear and half represented anger.

Figure 1. Experimental paradigm. Examples of static angry and neutral expressions and four frames of a fear morph are depicted. ISI = interstimulus interval.



Each exemplar was presented twice during the course of the experiment (total 72 stimuli of each category). Stimuli were presented in a pseudorandom event-related design, subject to the constraint that no more than two exemplars of each category were presented in a row to avoid mood induction effects. Faces were separated by a central fixation cross. The intertrial interval varied between 12 and 15 s (mean 13.5 s) to allow hemodynamic and psychophysiological responses to return to baseline levels between stimulus presentations (Fig. 1). The testing session was divided into eight runs of 8 min 24 s duration. Run order was counterbalanced across participants, and no stimuli were repeated within each half-session of five runs. Stimulus presentation was controlled by CIGAL software (Voyvodic, 1999) modified in-house to present video animations. Participants performed a three-alternative forced-choice categorical judgement in response to each face. Specifically, they used a three-button response box to indicate whether each face depicted an emotion morph (change in emotional expression), identity morph (change from one person to another), or static picture (no changes). Participants were told to respond whenever they could identify the category; speed of response was not emphasized. One example of each category was shown to the participants prior to entering the magnet to familiarize them with the stimuli.
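One way to generate a trial order satisfying the stated run-length constraint is simple rejection sampling, as in the sketch below. The authors' CIGAL-based implementation is not described, so the function and names here are hypothetical and purely illustrative.

```python
# Hypothetical sketch: shuffle the full trial list, then accept the order
# only if no category occurs more than twice in a row, per the constraint
# described above (4 categories x 72 presentations = 288 trials).
import random

CATEGORIES = ["static_neutral", "static_emotional",
              "identity_morph", "emotion_morph"]

def trial_order(n_per_category=72, max_run=2, seed=None):
    rng = random.Random(seed)
    trials = [c for c in CATEGORIES for _ in range(n_per_category)]
    while True:
        rng.shuffle(trials)
        run = 1
        for prev, cur in zip(trials, trials[1:]):
            run = run + 1 if cur == prev else 1
            if run > max_run:
                break
        else:                       # no run exceeded max_run: accept
            return list(trials)

rng = random.Random(0)
order = trial_order(seed=0)
itis = [rng.uniform(12.0, 15.0) for _ in order]  # jittered ITIs, mean ~13.5 s
```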

Imaging Parameters and Data Analysis

MR images were acquired on a 1.5 T General Electric Signa NVi scanner (Milwaukee, WI) equipped with 41 mT/m gradients. The subject’s head was immobilized using a vacuum cushion and tape. The anterior (AC) and posterior (PC) commissures were identified in the mid-sagittal slice of a localizer series. Thirty-four contiguous slices were prescribed parallel to the AC–PC plane for high-resolution T1-weighted structural images [repetition time (TR) = 450 ms, echo time (TE) = 20 ms, field-of-view (FOV) = 24 cm, matrix = 256², slice thickness = 3.75 mm]. An additional series of T1-weighted structural images oriented perpendicular to the AC–PC plane was also acquired using the parameters specified above. Gradient echo echoplanar images sensitive to blood-oxygenation-level-dependent (BOLD) contrast were subsequently collected in the same transaxial plane as the initial set of T1-weighted structural images (TR = 3 s, TE = 40 ms, FOV = 24 cm, matrix = 64², flip angle = 90°, slice thickness = 3.75 mm, resulting in 3.75 mm³ isotropic voxels).
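For reference, the reported voxel geometry follows directly from these acquisition parameters:

```latex
% In-plane resolution = FOV / matrix size; with 3.75 mm slices this
% yields the isotropic voxels reported above.
\[
  \frac{240~\mathrm{mm}}{64} = 3.75~\mathrm{mm}
  \quad\Rightarrow\quad
  3.75 \times 3.75 \times 3.75~\mathrm{mm}^{3}\ \text{voxels}.
\]
```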

The fMRI data analysis utilized a voxel-based approach implemented in SPM99 (Wellcome Department of Cognitive Neurology, London, UK). Functional images were temporally adjusted for interleaved slice acquisition and realigned to the image taken proximate to the anatomic study using affine transformation routines. The realigned scans were coregistered to the anatomic scan obtained within each session and normalized to SPM’s template image, which conforms to the Montreal Neurologic Institute’s standardized brain space and closely approximates Talairach and Tournoux’s (Talairach and Tournoux, 1988) stereotaxic atlas. The functional data were high-pass filtered and spatially smoothed with an 8 mm isotropic Gaussian kernel prior to statistical analysis. The regressors for the time-series data were convolved with a canonical hemodynamic response profile and its temporal derivative as implemented in SPM99. Statistical contrasts were set up using a random-effects model to calculate signal differences between the conditions of interest. Statistical parametric maps were derived by applying linear contrasts to the parameter estimates for the events of interest, resulting in a t-statistic for every voxel. Then, group averages were calculated by employing pairwise t-tests on the resulting contrast images. This sequential approach accounts for intersubject variability and permits generalization to the population at large. Interaction terms were analyzed in subsequent pairwise t-tests after the main effect maps were calculated to avoid false positive activations from the baseline period in the control conditions. The resultant statistical parametric maps were thresholded at a voxelwise uncorrected P < 0.001 and a spatial extent of five contiguous voxels.
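A minimal sketch of this modeling step follows, assuming a textbook double-gamma HRF in place of SPM99's exact canonical form, and omitting the high-pass filter and temporal-derivative regressor; it shows only the core logic of building an event regressor, fitting the GLM, and computing a voxelwise t statistic.

```python
# Minimal event-related GLM sketch: event onsets are convolved with a
# canonical double-gamma HRF to form regressors; betas are fit by least
# squares; a contrast of betas is converted to a voxelwise t statistic
# (to be thresholded, e.g., at P < 0.001 uncorrected as above).
import numpy as np
from scipy.stats import gamma, t as t_dist

TR = 3.0  # s, matching the acquisition above

def canonical_hrf(tr, duration=32.0):
    ts = np.arange(0.0, duration, tr)
    h = gamma.pdf(ts, 6) - gamma.pdf(ts, 16) / 6.0  # peak minus undershoot
    return h / h.sum()

def regressor(onsets_s, n_scans, tr=TR):
    stick = np.zeros(n_scans)
    stick[(np.asarray(onsets_s) / tr).astype(int)] = 1.0
    return np.convolve(stick, canonical_hrf(tr))[:n_scans]

def contrast_t(y, X, c):
    """t statistic and one-tailed P for contrast c on one voxel's series y."""
    beta, ss_res, *_ = np.linalg.lstsq(X, y, rcond=None)  # assumes full rank
    dof = len(y) - X.shape[1]
    sigma2 = ss_res[0] / dof
    var_c = sigma2 * c @ np.linalg.pinv(X.T @ X) @ c
    tval = float(c @ beta / np.sqrt(var_c))
    return tval, float(t_dist.sf(tval, dof))
```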

Results

The experimental hypotheses were primarily targeted at simple effects, which are emphasized below. Additionally, a general emotion × motion interaction analysis was conducted, as well as a main effects analysis for the motion variable to examine the role of visual area MT+.

fMRI Activation to Emotion Morphs

Compared to static emotional expressions, emotion morphs elicited responses along a bilateral frontotemporal circuit, including ventrolateral prefrontal cortex, substantia innominata, amygdala, parahippocampal gyrus, and fusiform gyrus (Fig. 2 and Table 1). The activations in this contrast were predominantly restricted to ventral brain regions, with some additional dorsal activity in the dorsomedial prefrontal cortex, left precentral sulcus, right intraparietal sulcus, and putative visual motion area MT+. This contrast reveals brain regions whose responses were enhanced by dynamic changes in negative facial affect over and above their responses to the same expressions presented statically.

The above analysis combined fear and anger expressions. Nearly identical patterns were found when the fear morphs were selectively compared against static fear images (Table 2). The brain regions whose responses differentiated anger morphs from static anger expressions were somewhat more restricted to a subset of those in the combined analysis (Table 2). When the fear and anger morphs were directly contrasted against each other, category-specific activations were revealed. Relative to anger morphs, fear morphs preferentially engaged the lateral and dorsomedial prefrontal cortex, amygdala, midbrain, parahippocampal gyrus, fusiform gyrus, and posterior cingulate gyrus. An additional site of activation within the posterior superior temporal sulcus was found at a less stringent statistical threshold (Fig. 2 and Table 3). Many of these activations were lateralized to the right hemisphere. Relative to fear morphs, anger morphs preferentially engaged a region of ventrolateral prefrontal cortex and supramarginal gyrus (Fig. 2 and Table 3).

fMRI Activation to Identity Morphs

Relative to static neutral expressions, identity morphs elicited responses along both dorsal and ventral processing streams (Fig. 2 and Table 4). The dorsal circuit included the dorsomedial prefrontal cortex, precentral sulcus, intraparietal sulcus, caudate nucleus, thalamus and visual area MT+. The ventral regions included the inferior frontal gyrus, amygdala, and fusiform gyrus. Most of the activations were bilateral.

Direct comparisons revealed brain regions that selectively coded changes in facial affect versus changes in facial identity (Fig. 2 and Table 5). For these comparisons, fear and anger morphs were combined. Relative to the emotion morphs, identity morphs elicited greater responses in predominantly dorsal regions (dorsal anterior cingulate gyrus, dorsal inferior frontal gyrus, intraparietal sulcus, caudate nucleus), and a large, ventrolateral region of the temporal lobe (inferior temporal/posterior fusiform gyri). Relative to the identity morphs, emotion morphs elicited greater responses in ventral anterior cingulate gyrus and ventromedial prefrontal cortex, middle frontal gyrus (rostral area 8), medial fusiform gyrus, and both anterior and posterior segments of the superior temporal sulcus. These latter brain regions may form a circuit that distinguishes biologically relevant and plausible motion from other types of motion.

Emotion × Motion Interaction

A formal analysis was conducted to determine which brain regions showed an interaction between the emotion and motion factors in the experimental design (Table 6). A double-subtraction procedure was employed to compare the magnitude of the motion effect (dynamic versus static) across the facial expression categories (emotional versus neutral). For this analysis, fear and anger expressions were combined.



Figure 2. Random-effects analysis of the group-averaged fMRI results. (a) Brain regions showing greater activity to dynamic emotional expression changes than to static faces portraying a fixed emotion (fear and anger combined). Color bar indicates T values. (b) Brain regions showing greater activity to dynamic changes in facial identity than to static faces with neutral expression. Color bar indicates T values. (c) Dissociable brain regions for the perception of fearful (red) versus angry (green) emotional expression changes. T values are thresholded and truncated at T = 2.5 for illustration purposes. (d) Dissociable brain regions for the perception of changes in facial affect (red; fear and anger combined) versus changes in facial identity (green). T values are thresholded and truncated at T = 2.5 for illustration purposes. Note that emotion morphs and identity morphs engage different sectors within frontal, temporal and cingulate cortices. Abbreviations: ACG = anterior cingulate gyrus, AMY = amygdala, aSTS = anterior superior temporal sulcus, dmPFC = dorsomedial prefrontal cortex, FFG = fusiform gyrus, IFG = inferior frontal gyrus, IPS = intraparietal sulcus, ITG = inferior temporal gyrus, MFG = middle frontal gyrus, MT = motion-sensitive area of middle temporal gyrus, PCG = posterior cingulate gyrus, pSTS = posterior superior temporal sulcus, SI = substantia innominata, SMG = supramarginal gyrus, Th = thalamus.



Brain regions that were more sensitive to motion within the emotional expressions (compared to the neutral expressions) included the fusiform gyrus, anterior cingulate gyrus/ventromedial prefrontal cortex, the superior temporal gyrus and middle frontal gyrus. Brain regions that were more sensitive to motion within the neutral expressions (compared to the emotional expressions) included the inferior temporal gyrus/posterior fusiform gyrus, intraparietal sulcus, basal ganglia, dorsal anterior cingulate gyrus and lateral inferior frontal gyrus. These results are nearly identical to the activations shown in the simple effects analysis where the emotion morphs and identity morphs were directly contrasted against each other (Fig. 2 and Table 5).
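In contrast-vector terms (a standard way to express the double subtraction, not notation taken from the paper), ordering the condition estimates as [static neutral, static emotional, identity morph, emotion morph], the interaction contrast is:

```latex
% Double-subtraction interaction contrast over the four condition betas,
% ordered [static neutral, static emotional, identity morph, emotion morph]:
% c = [ +1, -1, -1, +1 ]
\[
  c^{\top}\beta
  = (\beta_{\text{emotion morph}} - \beta_{\text{static emotional}})
  - (\beta_{\text{identity morph}} - \beta_{\text{static neutral}})
\]
```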

Main Effect of Motion

Brain regions sensitive to facial motion cues across emotional expression and identity changes were identified by a main effects analysis (Table 7). For this analysis, fear and anger expressions were combined. The results showed significant motion-related activity in six brain regions: visual area MT+, amygdala, inferior frontal gyrus, dorsomedial prefrontal cortex, intraparietal sulcus and caudate nucleus. All activations were bilateral except the caudate nucleus, which was left-sided.

Behavioral Results

A within-subjects ANOVA computed on behavioral accuracy data revealed a significant interaction between factors of emotion (fear, anger, neutral) and motion (dynamic, static), F(2,18) = 14.68, P < 0.001. Main effects of emotion [F(2,18) = 11.65, P < 0.001] and motion [F(2,18) = 48.69, P < 0.0001] were also found. Overall, participants were more accurate in identifying static images than dynamic images. Post hoc t-tests showed that across the static images, accuracy for anger (96 ± 1%) was worse than that for fear (98 ± 1%) or neutral (98 ± 1%) expressions, which did not significantly differ from each other. However, these data are potentially confounded by ceiling effects. Across the dynamic images, accuracy was worse for identity morphs (38 ± 4%) than either anger (64 ± 8%) or fear (63 ± 8%) morphs, which did not significantly differ from each other.
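For concreteness, a sketch of how such a 3 × 2 repeated-measures ANOVA could be run; the data frame and its column names are hypothetical, with identity morphs serving as the dynamic-neutral cell as in the design above.

```python
# Within-subjects emotion (fear, anger, neutral) x motion (dynamic,
# static) ANOVA on accuracy. `acc` is a hypothetical long-format table:
# one row per subject x condition cell (10 x 3 x 2 = 60 rows) with
# columns 'subject', 'emotion', 'motion', 'accuracy'.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

def accuracy_anova(acc: pd.DataFrame):
    return AnovaRM(data=acc, depvar="accuracy",
                   subject="subject", within=["emotion", "motion"]).fit()

# print(accuracy_anova(acc))  # F and P for main effects + interaction
```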

Table 1
fMRI activation to emotion morphs (fear/anger combined)

Emotion morph > static emotion   BA     Side     T value    x    y    z
Middle temporal gyrus            21     left      7.67    –53  –60    0
                                        right     5.90     49  –68   –4
Intraparietal sulcus             7/40   left      6.70    –34  –64   26
                                        right     4.53     34  –83   30
Substantia innominata                   left      6.39    –11   –4  –15
                                        right     3.99     15    4  –15
Amygdala                                left      6.02    –19   –8  –23
                                        right     5.23     26   –4  –23
Parahippocampal gyrus            28/36  right     5.89     19  –26  –11
                                        left      5.32    –11   38  –15
Fusiform gyrus                   19/37  left      5.83    –49  –60  –19
                                        right     5.30     53  –49  –19
Inferior frontal gyrus           45     left      5.42    –53   26   11
                                 47     left      5.28    –34   19  –19
                                        right     5.00     34   23  –11
Dorsomedial prefrontal cortex    32     midline   5.17     –4   26   41
Precentral sulcus                6      left      4.78    –41   –4   38

Table 2
fMRI activation to emotion morphs, broken down by category

Region                           BA     Side     T value    x    y    z
Fear morph > fear static
Substantia innominata                   right     9.02     11   –4  –15
                                        left      8.33    –15    4  –15
Dorsomedial prefrontal cortex    32     midline   7.46      0   26   38
Middle temporal gyrus            21     right     6.64     53  –68   –4
                                        left      5.57    –49  –71    0
Ventral striatum                        left      6.59    –15   11   –4
Midbrain                                left      5.63    –11  –41  –11
                                        right     4.48      4  –30  –23
Fusiform/inferior temporal gyri  19/37  left      5.43    –53  –64  –15
                                        right     5.20     50  –56  –11
Inferior frontal gyrus           47     left      5.16    –30   23  –23
                                 44     right     4.19     53    8   19
Precentral sulcus                6      left      4.82    –54    0   38
Intraparietal sulcus             7      right     4.77     38  –79   23
Amygdala                                right     4.75     26   –4  –26
                                        left      4.31    –15  –15  –11
Anger morph > anger static
Inferior frontal gyrus           47     right     6.77     38   26  –11
                                        left      4.89    –53   26    0
Middle temporal gyrus            21     right     5.91     49  –68    4
                                        left      5.31    –53  –64    8
Precentral sulcus                6      left      5.79    –41   –4   34
Fusiform gyrus                   19/37  left      5.27    –49  –68  –19

Table 3
Comparison of fMRI activation to fear and anger morphs

Region                           BA     Side     T value    x    y    z
Fear morph > anger morph
Fusiform gyrus                   37     right     6.63     38  –68  –15
Middle frontal gyrus             9/46   right     6.50     38   34   23
                                        left      6.36    –34   41   23
Midbrain                                right     6.00     11  –23  –11
Amygdala                                right     5.43     11  –11  –19
Posterior cingulate gyrus        31     midline   5.38      0  –38   45
Parahippocampal gyrus            28/36  left      5.33    –23  –19  –26
Inferior frontal gyrus           44     right     4.96     60    4   23
Dorsomedial prefrontal cortex    32     right     4.30     11    0   49
Superior temporal sulcus^a       22     right     4.06     53  –38    4
Anger morph > fear morph
Supramarginal gyrus              40     right     6.21     49  –53   34
Inferior frontal gyrus           47     right     5.47     19   34   –4

^a Cluster threshold reduced to three voxels.

Table 4
fMRI activation to identity morphs

Identity morph > neutral static  BA     Side     T value    x    y    z
Fusiform gyrus                   19/37  right    12.67     49  –56  –19
                                        left      7.28    –49  –53  –26
Intraparietal sulcus             7/40   right     8.59     34  –56   45
                                        left      5.72    –41  –64   30
Inferior frontal gyrus           47     right     7.92     45   26   –8
                                        left      6.88    –30   19   –8
                                 45     right     6.82     53   23    0
                                        left      5.85    –49   41   –4
Middle temporal gyrus            21     right     7.37     64  –45  –11
                                        left      5.41    –56  –53    0
Thalamus                                right     6.85     11  –15    4
                                        left      6.58     –8  –30  –19
Caudate nucleus                         right     5.99     11    8   11
Precentral sulcus                6      left      6.46    –49    0   45
Dorsomedial prefrontal cortex    32     midline   6.44     –8   23   49
Amygdala                                left      4.09    –15   –8  –19



Behavioral Accuracy and Brain Activation

Inspection of individual subject data revealed that 6 of the 10 participants were less accurate in their recognition judgements of the identity morphs relative to the emotion morphs. We conducted a post hoc analysis of the identity morph versus emotion morph contrast (Table 5 and Fig. 2) to determine any potential influence of behavioral accuracy on the statistical parametric maps. Participants were subdivided into two groups based on behavioral performance — those for whom accuracy was equated across the morph categories (n = 4) and those for whom accuracy was worse for the identity morphs (n = 6). Because of the small sample sizes, we used both conservative (P < 0.001 uncorrected) and liberal (P < 0.05 uncorrected) threshold criteria for determining statistical significance in the identity morph versus emotion morph contrasts from each subgroup. A t-test was then computed across groups and thresholded at P < 0.001 uncorrected. The only brain region that showed differential activation in the identity morph versus emotion morph contrast as a function of behavioral accuracy was the right inferior frontal gyrus (BA 44). This area was more engaged to the identity morphs by the group with poorer performance, perhaps reflecting cognitive effort. However, this brain region emerged only at the more liberal statistical cutoff.

Discussion

Generating emotional expressions requires sequenced movements of facial muscles, which have long been identified psychophysiologically (Ekman and Friesen, 1978; Bassili, 1978, 1979). Voluntary (or unintended) perturbations of characteristic movements or timing in facial expression can significantly alter their meaning from the perspective of an observer (Hess and Kleck, 1997). This has been documented in the distinction between genuine and false smiles (Ekman and Friesen, 1982). The present study shows that specific regions of the brain are sensitive to the perception of fluid changes in physiognomy relevant for the social signaling of changes in affective states. These brain regions preferentially signal real-time increases in facial affect intensity over static, canonical displays of the same emotions. Some of the brain regions selectively responded to perceived changes in fear or anger, whereas others more generally distinguished facial expression changes from facial identity changes induced by stimulus morphing. The functional implications of the findings are discussed relative to the known anatomy of facial affect and biological motion perception.

Role of the Amygdala

The amygdala has been a focus of investigation in facial affect perception, yet its exact function remains debated. Several neuroimaging studies have reported amygdala activation to fearful faces (Breiter et al., 1996; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), but others have failed to replicate these results (Sprengelmeyer et al., 1997; Kesler-West et al., 2001; Pine et al., 2001). The amygdala’s response to fearful faces may be enhanced when the expression is caricatured (Morris et al., 1996, 1998), under conditions of full attention [(Pessoa et al., 2002); but see (Vuilleumier et al., 2001)], or when judgements involve the simultaneous presentation of multiple face exemplars (Hariri et al., 2000; Vuilleumier et al., 2001). Whereas some studies show fear specificity (Morris et al., 1996, 1998; Phillips et al., 1997, 1998; Whalen et al., 1998, 2001), others report generalization to other emotion categories, including happiness (Breiter et al., 1996; Kesler-West et al., 2001; Canli et al., 2002; Pessoa et al., 2002).

Table 5
Comparison of fMRI activation to emotion morphs versus identity morphs

Region                                BA     Side     T value    x    y    z
Emotion morph > identity morph
Fusiform gyrus                        37     right     8.26     30  –49  –11
Middle frontal gyrus                  8      right     6.85     34   30   45
                                             left      6.20    –34   34   38
Anterior cingulate gyrus              24     midline   5.75      0   38   –4
Superior temporal sulcus (anterior)   21/38  right     4.84     53   –4  –25
Ventromedial prefrontal cortex        10/32  midline   4.75      4   53    0
Temporal pole                         38     right     4.02     38    8  –26
Superior temporal sulcus (posterior)  21     right     4.49     68  –30    8
                                      39     left      3.86    –60  –41   26
Identity morph > emotion morph
Inferior temporal gyrus               37     right     6.32     64  –41  –11
                                             left      4.13    –60  –41  –11
Fusiform gyrus                        19     right     5.99     45  –60  –19
Inferior frontal gyrus                44     left      4.96    –53   11   38
                                             right     3.57     49   11   23
                                      9      right     4.15     56   23   23
Anterior cingulate gyrus              24     midline   4.73      0   26   41
Caudate nucleus                              right     4.58     11    8    8
Intraparietal sulcus                  7/40   left      4.20    –34  –64   53
                                             right     4.08     34  –75   45

Table 6
fMRI results: emotion by motion interaction

Region                                BA     Side     T value    x    y    z
Motion effect for emotional expression > motion effect for neutral expression
Fusiform gyrus                        37     right     6.95     30  –49  –11
Middle frontal gyrus                  8      left      6.89    –30   38   38
                                             right     4.64     38   30   41
Anterior cingulate gyrus              24     midline   4.64      8   38   –4
Ventromedial prefrontal cortex        10/32  midline   4.51      8   56    0
Superior temporal sulcus (posterior)  39     left      3.99    –60  –41   26
Motion effect for neutral expression > motion effect for emotional expression
Inferior temporal gyrus               37     right     6.73     64  –41  –15
Fusiform gyrus                        19     right     6.05     41  –60  –19
Inferior frontal gyrus                44     left      5.66    –53   11   34
                                      9      right     5.06     56   23   23
Intraparietal sulcus                  7/40   left      4.83    –34  –68   49
Caudate nucleus                              right     4.64     11    8    8
Ventral striatum                             right     4.30      8    8   –4
Anterior cingulate gyrus              24     midline   4.32      0   26   45

Table 7
fMRI results: main effect of motion

Morphed images > static images        BA     Side     T value    x    y    z
Middle temporal gyrus                 21     right    10.71     49  –56  –15
                                             left      7.61    –53  –60   –4
Amygdala                                     left     10.30    –15   –4  –19
                                             right     8.24     15  –15  –19
Inferior frontal gyrus                47     left      7.57    –38   15  –15
                                             right     7.43     34   23   –8
Intraparietal sulcus                  7/40   right     6.56     34  –75   30
                                             left      6.13    –26  –64   49
Dorsomedial prefrontal cortex         32     midline   6.44     –4   30   38
Caudate nucleus                              left      4.60     –8   11    4



Amygdala activation to faces further varies across subjects according to social (Hart et al., 2000; Phelps et al., 2000), personality (Canli et al., 2002) and genetic (Hariri et al., 2002) factors.

All of these studies presented posed facial displays devoid of temporal cues integral to real-life social exchanges. In the present study, we have shown that the amygdala’s activity is enhanced by faces containing dynamic information relative to static snapshots of the same faces. Consistent with our hypotheses, the amygdala responded more to emotion morphs than to static emotional faces, especially for fearful expressions. Direct comparisons showed a specificity for fear over anger in the dynamic morphs but not in the static images. The category specificity in amygdala processing of dynamic facial displays cannot be attributed to other potential confounding features across the faces. Facial identity was the same across the dynamic and static images, and the morphing procedure allowed experimental control over the rate, intensity, duration and spatial orientation of the expression changes. Dynamic stimuli may be more effective in engaging the amygdala during face perception tasks and can potentially clarify the extent to which its responses exhibit category specificity.

Surprisingly, the amygdala also responded to dynamic changes in facial identity that were emotionally neutral. The intensity of amygdala activation to identity morphs was indistinguishable from that to the emotion morphs, even when the analyses were restricted to fear. We speculate that morphed stimuli containing rapid, artificial changes in facial identity elicit amygdala activity due to their threat or novelty value. In evolution, camouflage and other means of disguising identity have been effectively employed as deception devices in predator–prey interactions (Mitchell, 1993). It is possible that rapid changes in identity are interpreted by the amygdala as potentially threatening and consequently engage the body’s natural defense reactions. Alternatively, the amygdala may play a broader role in the perception of facial motion beyond that involved in emotional expression. The amygdala is known to participate in eye gaze and body movement perception (Brothers et al., 1990), even when the stimuli have no apparent emotional content (Young et al., 1995; Bonda et al., 1996; Kawashima et al., 1999). This account, though, does not explain why the amygdala activation was stronger for fear and identity morphs relative to anger morphs.

Parametric studies have indicated that the amygdala codes the intensity of fear on an expressor’s face [(Morris et al., 1996, 1998); but see (Phillips et al., 1997)]. However, it is not clear whether intense emotional expressions recruit amygdala processing because of perceptual, experiential or cognitive factors. Most imaging studies on facial affect perception have used blocked-design protocols that further complicate interpretation of results in this regard. The application of dynamic stimuli may help distinguish between these potential underlying mechanisms. In the present study, the emotion on the actor’s face did not reach full intensity until the last frame of each morph movie, whereas full intensity was continuously expressed in the static images. Thus, the intensity of expressed emotion in the morphs was, on average, only half of that portrayed in the static displays. If the amygdala simply codes the perceived intensity of fear in the expressor’s face, one would expect more activation for the static than dynamic stimuli. The statistical parametric maps do not support this interpretation. Alternatively, the amygdala’s response may shift in latency to the time point at which full intensity was expressed. The temporal resolution of fMRI may be insufficient to reveal this possibility.
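The "on average, only half" figure follows from simple ramp arithmetic, assuming intensity rises linearly across the morph and neglecting the brief 200 ms hold at full intensity:

```latex
% Time-averaged intensity of a morph that ramps linearly from 0 to I_max
% over duration T, versus a static display held at I_max throughout:
\[
  \bar{I}_{\mathrm{morph}}
  = \frac{1}{T}\int_{0}^{T} I_{\max}\,\frac{t}{T}\,dt
  = \frac{I_{\max}}{2},
  \qquad
  \bar{I}_{\mathrm{static}} = I_{\max}.
\]
```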

Neural processing in the amygdala may relate to abstract or cognitive representations of fear (Phelps et al., 2001). Previous studies have demonstrated that the amygdala’s response to sensory stimuli depends upon contextual cues and evaluative processes. For instance, amygdala activity to facial expressions increases in a mood-congruent fashion (Schneider et al., 1997), and amygdala activation to neutral faces is greater in social phobics, who may interpret these stimuli as threatening (Birbaumer et al., 1998). The amygdala is also engaged by neutral stimuli that signal possible or impending threatening events, as in fear conditioning (Büchel et al., 1998; LaBar et al., 1998) and anticipatory anxiety (Phelps et al., 2001). Rapid changes in fear intensity (and perhaps facial identity) may indicate an imminent source of threat in the environment, which recruits amygdala activity as a trigger for the defense/vigilance system (Whalen, 1998; Fendt and Fanselow, 1999; LaBar and LeDoux, 2001). Further work is needed to elucidate the contributions of the amygdala to perceptual, experiential and cognitive aspects of fear.

The present results may partly explain why patients with amygdala lesions often exhibit deficits in judging the intensity of fear in facial displays. These patients sometimes have sufficient semantic knowledge to label fear, but they typically underestimate the degree of fear expressed in posed displays [(Adolphs et al., 1994; Young et al., 1995; Calder et al., 1996; Broks et al., 1998; Anderson et al., 2000; Anderson and Phelps, 2000); but see (Hamann et al., 1996)]. Fear recognition tasks require perceptual judgements of the extent of physiognomic change in the face relative to less fearful states and/or a canonical template of fear. If the amygdala codes dynamic cues in facial displays, these patients may have difficulty using kinetic information in face snapshots to make intensity judgements without additional contextual cues. Cognitive or experiential aspects of fear may also contribute to performance on recognition tasks. Testing amygdala-lesioned patients with dynamic stimuli may help determine the specific mechanism underlying their deficit and could potentially reduce the variability in performance across individual patients (Adolphs et al., 1999).

Role of the Superior Temporal Sulcus (STS) and Associated Regions

Electrophysiological and brain imaging studies in humans and monkeys have implicated the cortex in and surrounding the banks of the STS in the social perception of biological motion [reviewed in Allison et al. (Allison et al., 2000)]. Biological motion associated with eye gaze direction (Puce et al., 1998; Wicker et al., 1998; Hoffman and Haxby, 2000), mouth movements (Calvert et al., 1997; Puce et al., 1998; Campbell et al., 2001), and hand and body action sequences (Bonda et al., 1996; Howard et al., 1996; Grèzes et al., 1998; Neville et al., 1998; Grossman et al., 2000) engage the STS, particularly in its mid-to-posterior aspect. The present study extends the known role of the STS to the perception of dynamic changes in facial expression. The posterior aspect of the STS region responsive to the emotion morphs overlaps with the areas implicated in these previous studies. Activity of STS neurons in monkeys is evoked by static pictures of grimaces, yawns, threat displays and other expressions relevant for socioemotional interactions with conspecifics (Perrett et al., 1985, 1992; Hasselmo et al., 1989; Brothers and Ring, 1993). These images potentially recruit neural processing in the monkey STS because of implied motion in the expressions (Freyd, 1987; Allison et al., 2000; Kourtzi and Kanwisher, 2000; Senior et al., 2000). This may also explain why the STS was not activated in the emotion morph versus static emotion contrasts.

Interestingly, the STS preferentially signaled dynamic changes in facial expression relative to dynamic changes in facial identity. Given that rapid changes in facial identity do not exist in the physical world, this result supports the hypothesis that the STS distinguishes biologically plausible from biologically implausible or non-biological motion (Allison et al., 2000). In this regard, the STS is dissociated from visual motion area V5/MT+, which is situated just posterior and ventral to the banks of the STS (Zeki et al., 1991; McCarthy et al., 1995; Tootell et al., 1995; Dumoulin et al., 2000; Kourtzi and Kanwisher, 2000; Huk et al., 2002). As predicted, area MT+ was activated by all dynamic stimuli and did not prefer emotion over identity morphs. To our knowledge, this is the first demonstration of responses in area MT+ to dynamic facial expressions in humans. Differential STS processing of emotion morphs may alternatively relate to specific aspects of facial motion present in these stimuli. At a more relaxed statistical criterion, the posterior STS did discriminate fear from anger morphs, which involve distinct facial actions (Ekman and Friesen, 1978). Further research is needed to evaluate whether this STS activity is related to specific facial actions or reflects modulation by the amygdala or other limbic structures.

Multimodal portions of the STS integrate form and motion through anatomic links with both ventral and dorsal visual areas (Rockland and Pandya, 1981; Desimone and Ungerleider, 1986; Oram and Perrett, 1996). In turn, the STS is interconnected with the prefrontal cortex in a gradient from ventral to dorsal regions as one proceeds along its rostrocaudal extent (Petrides and Pandya, 1988). The STS is connected to limbic and paralimbic regions, such as the amygdala and cingulate gyrus, via direct projections (Herzog and Van Hoesen, 1976; Pandya et al., 1981; Amaral et al., 1992) and through dorsal frontoparietal and temporopolar interfaces (Barbas and Mesulam, 1981; Cavada and Goldman-Rakic, 1989; Petrides and Pandya, 1988, 1999; Morecraft et al., 1993). Components of this frontotemporolimbic circuit, including the medial fusiform gyrus, rostral area 8, medial prefrontal cortex/ventral anterior cingulate gyrus and temporopolar cortex/anterior STS, also distinguished emotion morphs from identity morphs in consort with the posterior STS.

Role of the Inferotemporal Cortex

Dissociable regions within the inferior temporal and fusiform gyri signaled dynamic changes in facial identity versus facial expression — anteromedial fusiform gyrus for expression changes and posterolateral inferotemporal cortex (inferior temporal gyrus and posterior fusiform gyrus) for identity changes. Anatomically segregated processing was also found across the superior and inferior temporal neocortex for facial affect and identity, respectively. Such regional specificity may account for the variability in performance across these two domains of face recognition in prosopagnosics with varying locations and extents of brain damage (Hasselmo et al., 1989; Humphreys et al., 1993; Haxby et al., 2000). Portions of the fusiform gyrus exhibited category specificity for fear over anger morphs, perhaps due to modulatory feedback from limbic structures such as the amygdala (Amaral et al., 1992). Previous imaging studies have shown enhanced fusiform gyrus activity for fearful expressions (Breiter et al., 1996; Sprengelmeyer et al., 1997; Pessoa et al., 2002). As revealed by connectivity modeling, the amygdala interacts with various sectors along the ventral visual stream during facial affect perception tasks (Morris et al., 1998; Pessoa et al., 2002).

Computational models and single-cell recordings in monkeys support a role for the inferior temporal gyrus in neural coding of facial identity independent of facial affect (Hasselmo et al., 1989; Haxby et al., 2000). Inferotemporal activity in the present study may reflect dual coding of the identities present within the morph, since this area is hypothesized to participate in the structural encoding of faces (Kanwisher et al., 1997; Allison et al., 1999). This possibility could be confirmed in electrophysiological experiments with high temporal resolution. Importantly, the morph stimuli were created with smooth transitions between frames and presented at a rate that avoided ‘strobing’ effects, which potentially engender recoding of each face in successive frames. Alternatively, face processing along the inferotemporal cortex may be subject to attentional modulation. Campbell et al. (Campbell et al., 2001) found greater activity in inferotemporal cortex during viewing of ‘meaningless’ facial actions (gurning) relative to ‘meaningful’ facial actions of speech-reading. These authors also postulated an attentional account for their results.

Limitations and Future Directions

The present study was limited in three primary ways. First, only morphed images of fear and anger were presented. It is unknown if the results extend to other emotional expression categories. The creation of morphed stimuli is time-consuming and inclusion of all categories is difficult to accommodate within a single event-related fMRI paradigm. Future studies should compare morphed images of fear and anger to other expressions to determine the specificity of the present results. Secondly, only incremental emotional expression changes were presented. Future studies should compare incremental versus decremental changes in fear and anger to determine the sensitivity of the brain regions to directional aspects of morphed expressions. Finally, the individuals in the present study were less accurate in their categorical recognition of dynamic relative to static images. Although the statistical analysis of accuracy revealed only one brain area that may reflect cognitive effort on the task (BA 44), this analysis may have been underpowered due to sample size constraints. Future studies should compare activation to dynamic and static facial expressions under experimental conditions in which task performance is equated and/or unrelated to the primary experimental manipulation (e.g. during gender judgements).

Conclusion

Temporal cues play an important role in the neural coding of facial affect that requires integration between ventrolimbic brain regions sensitive to emotional salience and frontotemporal regions triggered by biologically relevant motion. The role of dynamic information in face perception has been largely ignored by neuroscientists. As noted by Gloor [(Gloor, 1997), p. 220], ‘One only has to think of how much more information the sight of a moving as opposed to an immobile face conveys to realize the importance of such a mechanism’. Many structures implicated in facial affect perception preferred dynamic over static displays, and these regions were dissociated from other brain areas responsive to rapid changes in facial identity. Anatomically segregated patterns of brain activation across temporal, frontal and cingulate cortices provide support for computational models and single-cell recordings in monkeys regarding the partial independence of facial affect and identity processing. The use of an event-related design further ensured that the signal changes were stimulus-driven and not related to potential mood induction across blocks of emotionally valent trials. Adoption of this experimental approach may invigorate new directions of cognitive neuroscience research using interactive stimuli that more closely approximate real-time changes in facial expression relevant for the social communication of emotions.

Notes

This work was supported by National Institutes of Health grants R01 DA14094 and U54 MH06418, a NARSAD Young Investigator Award, and a Ralph E. Powe Junior Faculty Enhancement Award from Oak Ridge Associated Universities. The authors wish to thank Paul Kartheiser and Harlan Fichtenholtz for assistance with data collection and analysis.

Address correspondence to Kevin S. LaBar, Center for Cognitive Neuroscience, Box 90999, Duke University, Durham, NC 27708, USA. Email: [email protected].

References

Adolphs R, Tranel D, Damasio H, Damasio A (1994) Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature 372:669–672.
Adolphs R, Tranel D, Hamann SB, Young A, Calder A, Anderson AK et al. (1999) Recognition of facial emotion in nine subjects with bilateral amygdala damage. Neuropsychologia 37:1111–1117.
Allison T, Puce A, Spencer DD, McCarthy G (1999) Electrophysiological studies of human face perception. I: Potentials generated in occipitotemporal cortex by face and non-face stimuli. Cereb Cortex 9:415–430.
Allison T, Puce A, McCarthy G (2000) Social perception from visual cues: role of the STS region. Trends Cogn Sci 4:267–278.
Amaral DG, Price JL, Pitkänen A, Carmichael ST (1992) Anatomical organization of the primate amygdaloid complex. In: The amygdala: neurobiological aspects of emotion, memory, and mental dysfunction (Aggleton JP, ed.), pp. 1–66. New York: Wiley-Liss.
Anderson AK, Phelps EA (2000) Expression without recognition: contributions of the human amygdala to emotional communication. Psychol Sci 11:106–111.
Anderson AK, Spencer DD, Fulbright RK, Phelps EA (2000) Contribution of the anteromedial temporal lobes to the evaluation of facial emotion. Neuropsychology 14:526–536.
Barbas H, Mesulam M-M (1981) Organization of afferent input to subdivisions of area 8 in the rhesus monkey. J Comp Neurol 200:407–431.
Bassili JN (1978) Facial motion in the perception of faces and of emotional expression. J Exp Psychol Hum Percept Perform 4:373–379.
Bassili JN (1979) Emotion recognition: the role of facial movement and the relative importance of upper and lower areas of the face. J Pers Soc Psychol 37:2049–2058.
Birbaumer N, Grodd W, Diedrich O, Klose U, Erb M, Lotze M et al. (1998) fMRI reveals amygdala activation to human faces in social phobics. Neuroreport 9:1223–1226.
Blair RJR, Morris JS, Frith CD, Perrett DI, Dolan RJ (1999) Dissociable neural responses to facial expressions of sadness and anger. Brain 122:883–893.
Blair RJR, Colledge E, Murray L, Mitchell DGV (2001) A selective impairment in the processing of sad and fearful expressions in children with psychopathic tendencies. J Abnorm Child Psychol 29:491–498.
Bonda E, Petrides M, Ostry D, Evans A (1996) Specific involvement of human parietal systems and the amygdala in the perception of biological motion. J Neurosci 16:3737–3744.
Breiter HC, Etcoff NL, Whalen PJ, Kennedy WA, Rauch SL, Buckner RL et al. (1996) Response and habituation of the human amygdala during visual processing of facial expression. Neuron 17:875–887.
Broks P, Young AW, Maratos EJ, Coffey PJ, Calder AJ, Isaac C et al. (1998) Face processing impairments after encephalitis: amygdala damage and recognition of fear. Neuropsychologia 36:59–70.
Brothers L, Ring B (1993) Mesial temporal neurons in the macaque monkey with responses selective for aspects of social stimuli. Behav Brain Res 57:53–61.
Brothers L, Ring B, Kling A (1990) Response of neurons in the macaque amygdala to complex social stimuli. Behav Brain Res 41:199–213.
Büchel C, Morris JS, Dolan RJ, Friston KJ (1998) Brain systems mediating aversive conditioning: an event-related fMRI study. Neuron 20:947–957.
Calder AJ, Young AW, Rowland D, Perrett DI, Hodges JR, Etcoff NL (1996) Facial emotion recognition after bilateral amygdala damage: differentially severe impairment of fear. Cogn Neuropsychol 13:699–745.
Calvert GA, Bullmore ET, Brammer MJ, Campbell R, Williams SCR, McGuire PW et al. (1997) Activation of auditory cortex during silent lipreading. Science 276:593–596.
Campbell R, MacSweeney M, Surguladze S, Calvert G, McGuire P, Suckling J et al. (2001) Cortical substrates for the perception of face actions: an fMRI study of the specificity of activation for seen speech and for meaningless lower-face acts (gurning). Cogn Brain Res 12:233–243.
Canli T, Sivers H, Whitfield SL, Gotlib IH, Gabrieli JDE (2002) Amygdala response to happy faces as a function of extraversion. Science 296:2191.
Cavada C, Goldman-Rakic PS (1989) Posterior parietal cortex in rhesus monkey: II. Evidence for segregated corticocortical networks linking sensory and limbic areas with the frontal lobe. J Comp Neurol 287:422–445.
Christie F, Bruce V (1998) The role of dynamic information in the recognition of unfamiliar faces. Mem Cognit 26:780–790.
de Gelder B, Vroomen J, van der Heide L (1991) Face recognition and lip-reading in autism. Eur J Cogn Psychol 3:69–86.
Desimone R, Ungerleider LG (1986) Multiple visual areas in the caudal superior temporal sulcus of the macaque. J Comp Neurol 248:164–189.
Dumoulin SO, Bittar RG, Kabani NJ, Baker CL Jr, Le Goualher G, Pike BG et al. (2000) A new anatomical landmark for reliable identification of human area V5/MT: a quantitative analysis of sulcal patterning. Cereb Cortex 10:454–463.
Edwards K (1998) The face of time: temporal cues in facial expressions of emotion. Psychol Sci 9:270–276.
Ekman P, Friesen WV (1976) Measuring facial movement. Environ Psychol Nonverb Behav 1:56–75.
Ekman P, Friesen WV (1978) The facial action coding system. Palo Alto, CA: Consulting Psychologists Press.
Ekman P, Friesen WV (1982) Felt, false, and miserable smiles. J Nonverb Behav 6:238–252.
Fendt M, Fanselow MS (1999) The neuroanatomical and neurochemical basis of conditioned fear. Neurosci Biobehav Rev 23:743–760.
Freyd JJ (1987) Dynamic mental representations. Psychol Rev 94:427–438.
Gepner B, Deruelle C, Grynfeltt S (2001) Motion and emotion: a novel approach to the study of face processing by young autistic children. J Autism Dev Disord 31:37–45.
Gloor P (1997) The temporal lobe and limbic system. New York: Oxford University Press.
Grèzes J, Costes N, Decety J (1998) Top-down effect of strategy on the perception of human biological motion: a PET investigation. Cogn Neuropsychol 15:553–582.
Grossman E, Donnelly M, Price R, Pickens D, Morgan V, Neighbor G et al. (2000) Brain areas involved in perception of biological motion. J Cogn Neurosci 12:711–720.
Hamann SB, Stefanacci L, Squire LR, Adolphs R, Tranel D, Damasio H, Damasio A (1996) Recognizing facial emotion. Nature 379:497.
Hariri AR, Bookheimer SY, Mazziotta JC (2000) Modulating emotional responses: effects of a neocortical network on the limbic system. Neuroreport 11:43–48.
Hariri AR, Mattay VS, Tessitore A, Kolachana B, Fera F, Goldman D et al. (2002) Serotonin transporter genetic variation and the response of the human amygdala. Science 297:400–403.
Hart AJ, Whalen PJ, Shin LM, McInerney SC, Fischer H, Rauch SL (2000) Differential response in the human amygdala to racial outgroup vs. ingroup face stimuli. Neuroreport 11:2351–2355.
Hasselmo ME, Rolls ET, Baylis GC (1989) The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey. Behav Brain Res 32:203–218.


Haxby JV, Hoffman EA, Gobbini MI (2000) The distributed human neural system for face perception. Trends Cogn Sci 4:223–233.
Herzog AW, Van Hoesen GW (1976) Temporal neocortical afferent connections to the amygdala in the rhesus monkey. Brain Res 115:57–69.
Hess U, Kleck RE (1997) Differentiating emotion elicited and deliberate emotional facial expressions. In: What the face reveals: basic and applied studies of spontaneous expression using the facial action coding system (FACS) (Ekman P, Rosenberg EL, eds), pp. 271–286. New York: Oxford University Press.
Hill H, Johnston A (2001) Categorizing sex and identity from the biological motion of faces. Curr Biol 11:880–885.
Hoffman EA, Haxby JV (2000) Distinct representations of eye gaze and identity in the distributed human neural system for face perception. Nat Neurosci 3:80–84.
Howard RJ, Brammer M, Wright I, Woodruff PW, Bullmore ET, Zeki S (1996) A direct demonstration of functional specialization within motion-related visual and auditory cortex of the human brain. Curr Biol 6:1015–1019.
Huk AC, Dougherty RF, Heeger DJ (2002) Retinotopy and functional subdivision of human areas MT and MST. J Neurosci 22:7195–7205.
Humphreys GW, Donnelly N, Riddoch MJ (1993) Expression is computed separately from facial identity, and it is computed separately for moving and static faces: neuropsychological evidence. Neuropsychologia 31:173–181.
Jordan H, Reiss JE, Hoffman JE, Landau B (2002) Intact perception of biological motion in the face of profound spatial deficits: Williams syndrome. Psychol Sci 13:162–167.
Kamachi M, Bruce V, Mukaida S, Gyoba J, Yoshikawa S, Akamatsu S (2001) Dynamic properties influence the perception of facial expressions. Perception 30:875–887.
Kanwisher N, McDermott J, Chun MM (1997) The fusiform face area: a module in human extrastriate cortex specialized for face perception. J Neurosci 17:4302–4311.
Kawashima R, Sugiura M, Kato T, Nakamura A, Hatano K, Ito K et al. (1999) The human amygdala plays an important role in gaze monitoring. Brain 122:779–783.
Kesler-West ML, Andersen AH, Smith CD, Avison MJ, Davis CE, Kryscio RJ et al. (2001) Neural substrates of facial emotion processing using fMRI. Cogn Brain Res 11:213–226.
Kourtzi Z, Kanwisher N (2000) Representation of perceived object shape by the human lateral occipital cortex. Science 283:1506–1509.
LaBar KS, LeDoux JE (2001) Coping with danger: the neural basis of defensive behaviors and fearful feelings. In: Handbook of physiology, section 7: the endocrine system, Vol. IV: coping with the environment: neural and endocrine mechanisms (McEwen BS, ed.), pp. 139–154. New York: Oxford University Press.
LaBar KS, Gatenby JC, Gore JC, LeDoux JE, Phelps EA (1998) Human amygdala activation during conditioned fear acquisition and extinction: a mixed-trial fMRI study. Neuron 20:937–945.
Lander K, Christie F, Bruce V (1999) The role of movement in the recognition of famous faces. Mem Cognit 27:974–985.
Matsumoto D, Ekman P (1989) American–Japanese cultural differences in intensity ratings of facial expressions of emotion. Motiv Emot 13:143–157.
McCarthy G, Spicer M, Adrignolo A, Luby M, Gore J, Allison T (1995) Brain activation associated with visual motion studied by functional magnetic resonance imaging in humans. Hum Brain Mapp 2:234–243.
Mitchell RW (1993) Animals as liars: the human face of nonhuman duplicity. In: Lying and deception in everyday life (Lewis M, Saarni C, eds), pp. 59–89. New York: Guilford Press.
Morecraft RJ, Geula C, Mesulam M-M (1993) Architecture of connectivity within a cingulofrontoparietal neurocognitive network. Arch Neurol 50:279–284.
Morris JS, Frith CD, Perrett DI, Rowland D, Young AW, Calder AJ et al. (1996) A differential neural response in the human amygdala to fearful and happy facial expressions. Nature 383:812–815.
Morris JS, Friston KJ, Büchel C, Frith CD, Young AW, Calder AJ et al. (1998) A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain 121:47–57.
Neville HJ, Bavelier D, Corina D, Rauschecker J, Karni A, Lalwani A et al. (1998) Cerebral organization for language in deaf and hearing subjects: biological constraints and effects of experience. Proc Natl Acad Sci USA 95:922–929.
Niedenthal PM, Halberstadt JB, Margolin J, Innes-Ker Å-H (2000) Emotional state and the detection of change in facial expression of emotion. Eur J Soc Psychol 30:211–222.
Oram MW, Perrett DI (1996) Integration of form and motion in the anterior superior temporal polysensory area (STPa) of the macaque monkey. J Neurophysiol 76:109–129.
Pandya DN, Van Hoesen GW, Mesulam M-M (1981) Efferent connections of the cingulate gyrus in the rhesus monkey. Exp Brain Res 42:319–330.
Perrett DI, Smith PAJ, Potter DD, Mistlin AJ, Head AS, Milner AD et al. (1985) Visual cells in the temporal cortex sensitive to face view and gaze direction. Proc R Soc Lond B 223:293–317.
Perrett DI, Hietanen JK, Oram MW, Benson PJ (1992) Organization and functions of cells responsive to faces in the temporal cortex. Phil Trans R Soc Lond B 335:23–30.
Pessoa L, McKenna M, Gutierrez E, Ungerleider LG (2002) Neural processing of emotional faces requires attention. Proc Natl Acad Sci USA 99:11458–11463.
Petrides M, Pandya DN (1988) Association fiber pathways to the frontal cortex from the superior temporal region in the rhesus monkey. J Comp Neurol 273:52–66.
Petrides M, Pandya DN (1999) Dorsolateral prefrontal cortex: comparative cytoarchitectonic analysis in the human and the macaque brain and corticocortical connection patterns. Eur J Neurosci 11:1011–1036.
Phelps EA, O’Connor KJ, Gatenby JC, Gore JC, Grillon C, Davis M (2000) Activation of the left amygdala to a cognitive representation of fear. Nat Neurosci 4:437–441.
Phelps EA, O’Connor KJ, Cunningham WA, Funayama ES, Gatenby JC, Gore JC et al. (2001) Performance on indirect measures of race evaluation predicts amygdala activation. J Cogn Neurosci 12:729–738.
Phillips ML, Young AW, Senior C, Brammer M, Andrew C, Calder AJ et al. (1997) A specific neural substrate for perceiving facial expressions of disgust. Nature 389:495–498.
Phillips ML, Young AW, Scott SK, Calder AJ, Andrew C, Giampietro V et al. (1998) Neural responses to facial and vocal expressions of fear and disgust. Proc R Soc Lond B 265:1809–1817.
Pine DS, Szeszko PR, Bilder RM, Ardekani B, Grun J, Zarahn E et al. (2001) Cortical brain regions engaged by masked emotional faces in adolescents and adults: an fMRI study. Emotion 1:137–147.
Puce A, Allison T, Bentin S, Gore JC, McCarthy G (1998) Temporal cortex activation in humans viewing eye and mouth movements. J Neurosci 18:2188–2199.
Rockland KS, Pandya DN (1981) Cortical connections of the occipital lobe in the rhesus monkey: interconnections between areas 17, 18, 19 and the superior temporal sulcus. Brain Res 212:249–270.
Schneider F, Grodd W, Weiss U, Klose U, Mayer KR, Nagele T et al. (1997) Functional MRI reveals left amygdala activation during emotion. Psychiatr Res 76:75–82.
Seamon JG (1982) Dynamic facial recognition: examination of a natural phenomenon. Am J Psychol 85:363–381.
Senior C, Barnes J, Giampietro V, Simmons A, Bullmore ET, Brammer M et al. (2000) The functional neuroanatomy of implicit-motion perception or ‘representational momentum’. Curr Biol 10:16–22.
Spencer J, O’Brien J, Riggs K, Braddick O, Atkinson J, Wattam-Bell J (2000) Motion processing in autism: evidence for a dorsal stream deficiency. Neuroreport 11:2765–2767.
Sprengylmeyer R, Rausch M, Eysel UT, Przuntek H (1997) Neural structures associated with recognition of facial expressions of basic emotions. Proc R Soc Lond B 265:1927–1931.
Talairach J, Tournoux P (1988) Co-planar stereotaxic atlas of the human brain. New York: Thieme.
Thornton IM, Kourtzi Z (2002) A matching advantage for dynamic human faces. Perception 31:113–132.
Tootell RBH, Reppas JB, Kwong KK, Malach R, Born RT, Brady TJ et al. (1995) Functional analysis of human MT and related visual cortical areas using magnetic resonance imaging. J Neurosci 15:3215–3230.
Voyvodic JT (1999) Real-time fMRI paradigm control, physiology, and behavior combined with near real-time statistical analysis. Neuroimage 10:91–106.
Vuilleumier P, Armony JL, Driver J, Dolan RJ (2001) Effects of attention and emotion on face processing in the human brain: an event-related fMRI study. Neuron 30:829–841.


Whalen PJ (1998) Fear, vigilance, and ambiguity: initial neuroimaging studies of the human amygdala. Curr Direct Psychol Sci 7:177–188.
Whalen PJ, Rauch SL, Etcoff NL, McInerney SC, Lee MB, Jenike MA (1998) Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge. J Neurosci 18:411–418.
Whalen PJ, Shin LM, McInerney SC, Fischer H, Wright CI, Rauch SL (2001) A functional MRI study of human amygdala responses to facial expressions of fear versus anger. Emotion 1:70–83.
Wicker B, Michel F, Henaff MA, Decety J (1998) Brain regions involved in the perception of gaze: a PET study. Neuroimage 8:221–227.
Young AW, Aggleton JP, Hellawell DJ, Johnson M, Broks P, Hanley JR (1995) Face processing impairments after amygdalotomy. Brain 118:15–24.
Zeki SM, Watson JD, Lueck CJ, Friston KJ, Kennard C, Frackowiak RS (1991) A direct demonstration of functional specialization in human visual cortex. J Neurosci 11:641–649.
