
N0256275

Phil Banyard

April 2012

Emotion Recognition and Emotional Resonance: Exploring the Relationship between Facial Expression Recognition and Empathy

Rebecca Noskeau

BSc Psychology with Criminology

Psychology Division, School of Social Sciences

Nottingham Trent University


Emotion Recognition and Emotional Resonance: Exploring the Relationship between Facial Expression Recognition and Empathy

Abstract

Current research investigating the relationship between empathy and facial expression recognition suggests that the two are inter-related. The purpose of this study was to explore this relationship by comparing participants' scores on the EQ-Short, their levels of accuracy when identifying facial expressions, and their levels of emotional resonance with each expression. A one-way ANOVA highlighted that females scored significantly higher than males on the EQ-Short: F(1, 140) = 4.64, p = .033. A bivariate correlation revealed that scores on the EQ-Short correlated significantly with rates of accuracy when identifying facial expressions: r = .245, n = 142, p = .003. The EQ-Short explained 6% of the variance found in accuracy scores (R² = .060, p = .003). Paired-samples t-tests revealed that the three female facial stimuli produced the highest accuracy rates, t(141) = 5.595, p < .001, and resonance scores, t(141) = 3.834, p < .001. Happiness, surprise, disgust and sadness were identified significantly more accurately than fear and anger: t(141) = 11.52, p < .001. A one-way ANOVA revealed that females recognised anger significantly more accurately than males: F(1, 140) = 3.92, p = .050. Paired-samples t-tests revealed that happiness was both recognised, t(141) = 5.22, p < .001, and resonated with, t(141) = 7.31, p < .001, significantly more than the five other emotions. There were also significant differences in both accuracy and resonance scores in response to the six FEED subjects: t(141) = 14.779, p < .001. These findings lend weight to the growing body of evidence which suggests that empathy and facial expression recognition are intrinsically related to one another. This has real-world implications for the treatment of disorders which are characterised by atypical functioning in these areas.

Key Words: Emotion; Facial Expressions; Empathy; Emotional Resonance; Emotion Recognition.


Introduction

Due to the highly subjective and multi-faceted nature of emotion, researchers have struggled

to agree upon a clear definition of what emotions are, and indeed, what they are not (Scherer,

2005). What has been agreed, however, is that emotion is a complex psycho-physiological

phenomenon, which originates from an interaction between bio-chemical and environmental

influences (Meyer, 2004). Emotion is experienced as a mixture of physiological activation,

conscious experience and expressive behaviours (Meyer, 2004). Researchers believe that there

are at least six distinct basic emotional states: happiness, sadness, anger, disgust, surprise and

fear (Ekman, 1992, 1999; Ekman & Friesen, 1971). These states have been labelled the six basic

emotions, and are thought to blend together in order to create more complex emotional states

(Ekman, 1992, 1999; Ekman & Friesen, 1971; Scherer, 2005). Research has shown that these

emotions have a distinct pattern of expression upon the body and brain, both when we

ourselves are experiencing them, and when we see others doing so (App et al., 2011;

Chakrabarti et al., 2006; Ekman, 1992, 1999; Ekman & Friesen, 1971; Vignemont & Singer, 2006).

There is an ongoing debate as to whether emotions and facial expressions are innate, or a by-product of socialisation (Barrett, 2011; Ekman, 1999; Matsumoto & Hwang, 2012).

Universalists believe that emotions and expressions are innate, adaptive and functional (Darwin, 1872/2009; Ekman, 1992, 1999; Ekman & Friesen, 1971; Shariff & Tracy, 2011).

However, cultural relativists and social constructivists emphasise a dominant role of

socialisation in the formation of emotional constructs and behaviours (Averill, 1980; Barrett,

2011; Mead, 1975; Russell, 1994, 1995). They believe that emotions and facial expressions are

predominantly culturally informed, and then later internalised and reproduced (Averill, 1980;

Barrett, 2011; Mead, 1975; Russell, 1994, 1995). In addition, they argue that by reducing emotion to a few parsed archetypal concepts, such as the six basic emotions, universal emotion researchers are limiting their potential understanding of these phenomena (Averill, 1980; Barrett, 2011; Ekman, 1992, 1999; Ekman & Friesen, 1971; Mead, 1975; Russell, 1994, 1995).

The relativists and constructivists believe that this is epitomised in emotion research when pre-formed emotion labels are used, as such labels limit participants' responses by forcing them to rely on those labels when describing emotional phenomena (Averill, 1980; Barrett, 2011; Mead, 1975; Russell, 1994, 1995).


Nevertheless, despite the prevalence of criticisms against it, there is a huge body of research

that supports a universal theory of emotion, and in particular, the innateness of emotional

facial expressions. For example, researchers have found that members of remote tribes both

display and understand the same basic emotions and facial expressions as people from mass-

socialised societies (Ekman & Friesen, 1971). Furthermore, it has been reported that

congenitally blind people exhibit the same basic facial expressions as sighted people (Shariff

& Tracy, 2011). In addition, it has been shown that babies are able to discriminate and imitate

facial expressions from as early as 36 hours old (Field et al., 1982). Indeed, a recent study even

claims to have discovered that babies produce basic facial expressions in utero (Reissland et

al., 2011). It is, however, important to note that this study was conducted using only two participants, and therefore much more research is necessary before this finding can be accepted or generalised further. What researchers have managed to establish is that, from as early as one year old, we are able to use our facial expression recognition abilities to help us to safely navigate the world around us (Sorce et al., 1985; Vignemont & Singer, 2006).

The emotions and facial expressions of others are often automatically processed and, though they remain below the threshold of conscious awareness, have been shown to significantly affect the functioning of the autonomic nervous system (Côté & Hideg, 2011; Dimberg et al.,

2000; Dimberg et al., 2011). This is supported by the finding that involuntary and voluntary

facial expressions are automatically perceived and reacted to differently, both psychologically and biologically (Ekman & Davidson, 1993; Shore & Heerey, 2011). The Mirror

Neuron System (MNS) is thought to play a significant role in this automatic processing of

others’ emotions, and in our understanding of their intentions and actions (Gallese et al., 2011;

Iacoboni, 2009; Iacoboni et al., 2005; Menon, 2011; Oberman & Ramachandran, 2007;

Ramachandran, 2000). However, the MNS only accounts in part for this process, as it has been

found that an empathiser's own emotional state can bias their attentional system, making people more prone to noticing facial expressions which are congruent with their own present emotional state (Becker & Leinenger, 2011). This means that at times we may not even be aware of others' emotions and facial expressions unless they are similar to our own, and therefore salient to us, at any given moment.


Understanding Empathy

Empathy has been identified as the result of a combination of emotion recognition (the ability to accurately identify another person's emotional state) and emotional resonance (the ability to feel what another person is feeling) (Chakrabarti et al., 2006; Decety & Hodges, 2006; Lama et al., 2008; Lawrence et al., 2004; Shamay-Tsoory, 2011; Singer & Lamm, 2009). Empathic

abilities are believed to facilitate social communication and create social cohesion, and research suggests that this happens both when we ourselves are being empathic towards others and when we perceive others as being empathic towards us (Cohen et al., 2012; Rifkin, 2010; Verhofstadt et al., 2008; Vignemont & Singer, 2006). Researchers have found that empathy is

highly subjective and dependent upon the context and experiences of both the empathiser and their target (Foroni & Semin, 2011; Hein & Singer, 2008; Matsumoto, 1991; Vignemont & Singer, 2006). For example, it has been shown that the relationship dynamic that exists between two people, including their relational hierarchical status to one another, can have a profound influence on the depth and nature of the emotional sharing, mutual understanding and personal identification that ensues (Hein & Singer, 2008; Matsumoto, 1991; Vignemont & Singer, 2006).

Recent studies exploring empathic capabilities have shown that empathy is not a fixed trait, but a skill which can evolve over time and, as such, can be learnt and enhanced (Aziz-Zadeh et al., 2011; Baron-Cohen, 2011; Bastiaansen et al., 2011; Martin et al., 2012;

Trepagnier et al., 2011). This is evident in the significant improvements found in the empathic

abilities of autistic children and adults who completed educational programmes designed to

help them to recognise and relate to others’ emotions and expressions more effectively

(Baron-Cohen, 2011; Hopkins, 2007; Hopkins et al., 2011; Trepagnier et al., 2011). As the MNS

plays such a significant role in the automatic processing of others’ emotions, and indeed, in

our understanding of their intentions and actions, it is believed to be an essential component

of empathy (Gallese et al., 2011; Iacoboni, 2009; Iacoboni et al., 2005; Oberman & Ramachandran,

2007; Menon, 2011; Ramachandran, 2000). It is hypothesised that autistic people have a delay in

the development of their MNS, and as a result are believed to suffer from deficits in cognitive

empathy (Bastiaansen et al., 2011; Oberman & Ramachandran, 2007). Until recently this was seen

as a fixed and life-long pattern; however, recent research suggests that an autistic person's MNS catches up to a neuro-typical level of functioning at around 35-40 years old, at which point it is seen to go on to surpass it (see Figure 1 below) (Bastiaansen et al., 2011). This contrasts with the neuro-typical pattern of development, which sees the MNS decline gradually with age (see Figure 1) (Bastiaansen et al., 2011). Further evidence of the plasticity of

empathy is provided by a study which found that empathy can be enhanced by reading fiction

(Gabriel & Young, 2011). This is believed to occur because the reader has access to the inner motives, thoughts and intentions of the characters in the books, rather than only an exterior view of their actions, thus enabling the reader to resonate more fully with them (Gabriel & Young, 2011). The cumulative weight of these studies supports the hypothesis that empathy is

a dynamic, ever-evolving skill. This has important implications for further research and

treatments which are aimed at helping people with deficits in empathy. In addition, the last

study in particular has strong real-world applications in the reduction of discrimination,

prejudice and conflict.

Figure 1: The delayed pattern of development, and later excelling, found in the MNS of people with autism, as compared with the gradual pattern of decline found in neuro-typical development. Adapted by L. D. Cookson, 2012, from "Age-Related Increase in Inferior Frontal Gyrus Activity and Social Functioning in Autism Spectrum Disorder," by J. A. Bastiaansen, M. Thioux, L. Nanetti, C. V. D. Gaag, C. Ketelaars, R. Minderaa and C. Keysers, 2011, Biological Psychiatry, 69(9), 832-838. Adapted with permission.


In order to understand what empathy is, it is helpful to consider what happens when there are

deficits in empathic abilities. Researchers believe that there are two routes by which empathy is achieved: via emotional affect (affective empathy) and via cognitive rationalisation (cognitive empathy) (Decety & Hodges, 2006; Shamay-Tsoory, 2011; Shamay-Tsoory et al., 2009; Shamay-Tsoory et al., 2010). It has been discovered that imbalances in either of these routes can result in psychiatric and neurological disorders. An example of this, as previously mentioned, is autism, where it is believed that there is a deficiency in cognitive empathy; in contrast, the dark triad of personality (narcissism, Machiavellianism and psychopathy) comprises disorders which are believed to be attributable to impairments in affective empathy (Bastiaansen et al., 2011; Blair, 2005; Oberman & Ramachandran, 2007; Wai & Tiliopoulos, 2012).

Deficits in affective empathy have been shown to emerge when there is damage to the pre-frontal cortex, and pre-frontal cortex injuries have been shown to mimic the deficits in empathy which are present in psychopathy (Shamay-Tsoory et al., 2010). These findings

complement other research which suggests that people high in affective empathy perceive the

emotions of others more vividly, and at a higher intensity, than those with lower affective

empathy (Dimberg et al., 2011).

Perhaps the most acute disorder in which an imbalance in affective and cognitive empathy exists is borderline personality disorder (BPD). In BPD, emotions, facial expression recognition and empathy are all heightened, and at times distorted (Donegan et al., 2003; Harari et al., 2010; Lynch et al., 2006; Wagner & Linehan, 1999). Harari et al. (2010) suggest that there

is a common pattern of increased cognitive empathy and decreased affective empathy in the

general population, but that in BPD this pattern is reversed, meaning that sufferers have an

increased affective empathic response, and thus higher sensitivity, to the emotions of others,

than is found in the norm. As such, this can cause sufferers to be overwhelmingly at the whim

of the emotions of those around them (Harari et al., 2010).

The Dark-Side of Empathy

As is evident in the case of BPD, there are negative aspects to experiencing high levels of emotion and empathy. This is especially true in our modern-day society, with its unique predicament of being flooded with graphic depictions of others in desperate need from all

over the world. It is believed that a high level of constant awareness of the suffering of others


can produce what Karen Armstrong refers to as ‘compassion fatigue’ (Armstrong, 2009). An

additional negative aspect of empathy is that it is not always used to promote well-being - for

example in the case of bullying, whereby empathic abilites are used to target another person’s

weaknesses (Hein & Singer, 2008). This more negative take on empathic abilites complements

an alternative theory of empathy which suggests that pro-social behaviour and empathic

action may originate from a place of self-interest and social desirability, rather than altruistic

concern (Vignemont & Singer, 2006). This theory suggests that an ‘empathiser’ may in

actuality be attempting to avoid experiencing negative emotions, such as guilt and shame, by performing a seemingly empathic action (Vignemont & Singer, 2006). A recent study by Côté and Hideg (2011) proposes that being able to affect people with emotional displays is a desirable skill, because it can enhance perceptions of competence and liking. However, it

could be argued that having a pronounced ability to affect people emotionally may not in all cases be desirable. This is especially true if the emotional effect is largely negative and undermines competence, for example if a person is highly socially awkward.

The Benefits of Being Empathic

Despite the aforementioned negative aspects, empathy has been shown to be a vital tool both

psychologically and physiologically. A recent study revealed that when physicians related

empathically to their patients, it significantly helped to improve the patients' chronic health conditions and, in turn, their long-term health (Hojat et al., 2011). In addition, the study which reported that reading fiction can enhance empathy also discovered that the enhanced sense of emotional resonance felt with the characters provided readers with a sense of belonging, and with the same kinds of mood and life-satisfaction benefits as membership of real-life social groups (Gabriel & Young, 2011). Research investigating the beneficial role that empathy plays within marriages has found that spouses feel most emotionally supported when their partner

emotionally resonates with their distress (Verhofstadt et al., 2008). Echoing these findings, a

recent study suggests that perceived empathic effort produces the strongest relationship

satisfaction (Cohen et al., 2012). These findings thus highlight an additional positive inter-personal dimension to empathy and its real-world applications (Cohen et al., 2012).


The Relationship between Facial Expression Recognition and Empathy

There is much evidence to suggest that facial expression recognition is a key skill of

empathy. This relationship is most evident when we consider what happens when this link is

broken or heightened. Damage to the pre-frontal cortex, as previously discussed, creates a

deficit in affective empathy which echoes the pattern of impairments found in psychopathy.

In psychopathy, and in some cases of criminal offending, it has been found that there is a lack of

emotional responsiveness to both the emotions and facial expressions of others (Blair, 2005;

Shamay-Tsoory et al., 2010). This finding complements research which has found that the

higher a person’s affective empathy is, the quicker they are able to recognise facial

expressions and begin experiencing another person’s emotions (Besel, 2007; Besel & Yuille,

2010). Furthermore, people high in affective empathy are also more facially reactive when experiencing the emotions of others, which researchers have postulated could feed back bi-directionally and increase their empathic feelings (Dimberg et al., 2011; Ekman, 1992).

These findings relate directly to the pattern of heightened affective empathy, emotion and facial expression recognition evident in BPD (Donegan et al., 2003; Harari et al., 2010; Lynch et al., 2006; Wagner & Linehan, 1999). For example, researchers have found that people with BPD have a heightened ability to recognise emotional facial expressions both more quickly and with more accuracy than controls (Donegan et al., 2003; Lynch et al., 2006; Wagner & Linehan, 1999). Furthermore, people with BPD have been found to have a negativity bias towards perceiving fearful faces when presented with neutral facial stimuli; however, this has been shown not to detract from their superior speed and accuracy when identifying emotional facial expressions (Wagner & Linehan, 1999). Further evidence highlighting a connection between

affective empathy and facial expression recognition has shown that people suffering with BPD display significantly higher levels of physiological reactivity within their brains in response to other people's facial expressions and emotions; thus they experience and feel the emotions of others significantly more than controls (see Figure 2 below) (Donegan et al., 2003). It is notable that this study also found that BPD sufferers and healthy controls both had the highest levels of physiological activation in response to the happiness facial stimuli (see Figure 2) (Donegan et al., 2003). Interestingly, a pattern of physiological hyper-reactivity and

distorted facial expression recognition capacity is also found in other emotional disorders,


such as major depression, social anxiety, and post-traumatic stress disorder (Blair, 2005;

Demenescu, et al., 2010; Hirstein & Ramachandran, 1997; Poljac et al., 2011). 

Figure 2: Activation map showing regions in the amygdala slice in which activation exceeded the criterion threshold level of p < .005 for control and BPD groups, for four facial expressions. Reprinted from "Amygdala hyper-reactivity in borderline personality disorder: implications for emotional dysregulation," by N. H. Donegan, C. A. Sanislow, H. P. Blumberg, R. K. Fulbright, C. Lacadie, P. Skudlarski, J. C. Gore, I. R. Olson, T. H. McGlashan and B. E. Wexler, 2003, Biological Psychiatry, 54(11), 1284-1293. Reprinted with permission.

As the aforementioned studies attest, the hypothesis that facial expression recognition and

empathy are connected is emphasised by the finding that people who suffer from disorders in

facial expression recognition also tend to suffer from disorders in their empathic abilities,

and vice versa. This connection has proven vitally important, as researchers have found ways

in which these two abilities can be mutually enhanced in order to alleviate deficits in these

areas (Domes et al., 2007; Duke & Nowicki, 1994; Grinspan et al., 2003; Hopkins, 2007; Hopkins et

al., 2011; Trepagnier et al., 2011). This, in addition, lends weight to the hypothesis that empathy and facial expression recognition are dynamic skills which can be learnt and enhanced over time (Aziz-Zadeh et al., 2011; Baron-Cohen, 2011; Bastiaansen et al., 2011; Martin et al., 2012; Trepagnier et al., 2011).

Methodological Implications

As empathy is believed to be the result of both the ability to recognise, and the ability to feel, the emotions of others, the three individual parts of this experimental study have been designed

to test these two elements of empathy. It is the aim of this study to explore the relationship

between facial expression recognition and empathy by analysing and comparing participants’

scores on an empathy scale; by measuring their levels of accuracy when identifying the six

basic facial expressions; and by measuring their emotional resonance with each expression.

At the time of writing, the author knows of no previous research that has explored the relationship between empathy and facial expression recognition using these methodological constructs.

Researchers exploring sex differences in empathy have found that there is a significant

female advantage in empathic abilities (Baron-Cohen & Wheelwright, 2004; Goldenfeld et al.,

2010). Baron-Cohen believes that this is demonstrative of the essential difference between

male and female brains (Baron-Cohen, 2003). However, this view has been criticised as being largely based upon findings which utilise self-report measures of empathy, and thus as being more attributable to differences in gender display rules and socialisation than indicative of innate sex differences in empathic abilities (Ickes et al., 2000; Martin et al., 1996; Riggio et al., 1989). As this study will utilise psychometrics, it is predicted that there will be a significant

female advantage in empathic abilities for this measure. Furthermore, it is predicted that there

will be significant age-differences in empathic abilities, as research suggests that empathy

and facial expression recognition are dynamic skills which evolve and develop over time

(Aziz-Zadeh et al., 2011; Baron-Cohen, 2011; Bastiaansen et al., 2011; Martin, et al., 2012).


Drawing from previous research, the predictions of this study are as follows:

1. Scores from the EQ-Short will positively correlate with recognition and resonance scores.

2. Females will score higher than males on the EQ-Short.

3. There will be sex-differences in recognition and resonance scores.

4. There will be inter-sex differences in recognition and resonance scores when viewing

male vs. female facial stimuli.

5. There will be differences in recognition and resonance scores for each of the six FEED subjects.

6. There will be higher accuracy and emotional resonance scores for the happiness facial

stimuli.

7. There will be age-differences in EQ-Short, accuracy and resonance scores.

Methods

Design

Research was conducted using a nomothetic, two-way mixed (between-within) experimental design, which utilised psychometrics.

Participants

A total of 142 participants, 62 male (44%) and 80 female (56%), took part in this online study. They

ranged in age from 18-70 years old (M = 32.07, SD = 13.75). Participants were sourced from

the local area of Nottinghamshire and the internet. Invitations were sent to online contacts via

email and Facebook messages (see Appendix 1a & 1b). Participants were also recruited

through two online research hosting websites (The Inquisitive Mind, 2012; Psychological

Research on the Net, 2012); and the SONA system (Nottingham Trent University’s research-credits

programme).


Materials

The study was built using the online survey hosting website Zoomerang.com (Zoomerang, 2012). Zoomerang was utilised as it allows users to incorporate videos within surveys; has highly customisable features; and has a user-friendly interface. The first page of the study contained an integrated informed consent form (see Appendix 2). Throughout the study, participants were required to answer every question on each page before continuing to the next. The first part of the study was the administration of the EQ-Short (see Figure 1, Part One, and Appendices 3 & 4). The EQ-Short was placed at the beginning of the study so that participants could complete it before becoming too fatigued.

Figure 1, Part One – The EQ-Short: This image shows how the EQ-Short was formatted within the study and how a 4-point Likert scale was utilised.

The second part of this study was created using 36 video clips selected from the Facial Expression and Emotion Database (FEED) (Wallhoff, 2006). Three male and three female subjects were chosen from the FEED, each of whom expressed the six basic emotions: anger, sadness, happiness, disgust, surprise and fear (see Appendix 5). The videos were coded, uploaded to a private YouTube address, and then linked to Zoomerang via embedded codes (see Appendix 5). Participants were asked to watch each video and then select which emotion they thought


was being experienced (see Appendix 6a & 6b). In response to some of the criticisms that earlier studies exploring universal facial expressions have faced, participants were able to select from eight options for each video. They could choose one of the six basic emotions, 'I do not know', or 'Other Emotion', thus aiming to reduce the experimenter effects of a total forced-choice methodology (Russell, 1994). In addition, as the FEED does not use posed emotional expressions, it was hoped that this would further reduce experimenter biases (Russell, 1994). The eight choices were contained within a drop-down box beneath each video

(see Appendix 6a & 6b). The videos were presented in a randomised order for each participant in order to counter-balance any extraneous order effects. Below each video was the 'resonance bar', which participants were required to use to rate their own emotional state (see Appendix 6a & 6b). The resonance bar was created using a 5-point Likert scale and scored using the Cambridge Scoring System (Baron-Cohen & Wheelwright, 2004). Participants' responses to the 'positive' emotion videos (happiness and surprise) were scored 0, 0, 0, 1, 2, whilst responses to the 'negative' emotion videos (anger, fear, disgust, sadness) were scored 2, 1, 0, 0, 0.
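As a minimal illustration of this scoring scheme, the Python sketch below (with hypothetical names; not the study's actual materials) maps a participant's 1-5 resonance-bar rating onto a 0-2 resonance score:

# Sketch of the Cambridge-style scoring described above (hypothetical
# names, not the study's actual code). Ratings are 1-5 on the resonance bar.
POSITIVE = {"happiness", "surprise"}                 # scored 0, 0, 0, 1, 2
NEGATIVE = {"anger", "fear", "disgust", "sadness"}   # scored 2, 1, 0, 0, 0

def resonance_score(emotion: str, rating: int) -> int:
    """Convert a 1-5 resonance-bar rating into a 0-2 resonance score."""
    if emotion in POSITIVE:
        return {1: 0, 2: 0, 3: 0, 4: 1, 5: 2}[rating]
    if emotion in NEGATIVE:
        return {1: 2, 2: 1, 3: 0, 4: 0, 5: 0}[rating]
    raise ValueError(f"unknown emotion: {emotion}")

# Example: a strongly positive rating (5) for a happiness video scores 2;
# a strongly negative rating (1) for an anger video also scores 2.
assert resonance_score("happiness", 5) == 2
assert resonance_score("anger", 1) == 2

The design choice this illustrates is that resonance is scored as congruence with the emotion's valence, so only ratings in the expected direction earn points.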

The final page of the study featured a photo of a polar-bear cub. This was included as a positive counter-measure in order to neutralise any negative mood induction that may have occurred from empathising with the 'negative' emotions (see Appendix 7). Underneath the photo was an embedded debrief, which included information concerning the subjects investigated within the study; contact details for the researcher; and details of people whom participants could contact with any concerns. Before launching the study, a pilot study was performed using six participants. The pilot established an average time-frame for completing the study (20 minutes); helped to make sure that all the tasks, text and formatting were easily comprehensible; and ascertained that all elements of the study were working correctly.

The Empathy Quotient (EQ)

The Empathy Quotient (EQ) is a 40-item multi-dimensional scale, containing statements designed to measure empathy (Baron-Cohen et al., 2001; Baron-Cohen & Wheelwright, 2004; Lawrence et al., 2004). A recent study suggests that the EQ places an emphasis on measuring cognitive empathy and social understanding (Besel & Yuille, 2010). The EQ was


found to have a high alpha reliability of .84 in a study exploring the effects of Asperger's Syndrome on empathic abilities (Lawrence et al., 2004). In addition, the EQ has been shown to have moderate concurrent validity, demonstrated by the moderate correlations between the EQ and the 'empathic concern' and 'perspective-taking' subscales of the Interpersonal Reactivity Index (IRI) (Davis, 1983; Lawrence et al., 2004). When completing the EQ, participants are required to state how much they agree or disagree with each statement, using a 4-point Likert scale.

For the purposes of this study the EQ-Short was utilised (Wakabayashi et al., 2006) (see Appendix 4). The EQ-Short is a 22-item version of the EQ. It has been shown to have higher levels of internal consistency than the original, thus suggesting that some of the 40 items are unnecessary (Wakabayashi et al., 2006). However, it is notable that by removing these 'unnecessary' items, some of the face validity of the EQ also appears to have been diminished: the depth and breadth of empathic experience appears minimised, and the questions seem to become more cognitively specialised. Nevertheless, the correlations between scores on the 40-item EQ and the 22-item EQ-Short are high: r = 0.93 (r = 0.93 in males, and r = 0.93 in females) (Wakabayashi et al., 2006). In addition, the EQ-Short has been found to have good test-retest reliability (Wakabayashi et al., 2006). In the current study the Cronbach alpha coefficient for the EQ-Short was a robust .90. Online administration of the EQ has been reported to have strong concurrent validity and reliability (M = 39.00, SD = 11.44) compared with face-to-face administration (M = 38.83, SD = 12.40) (Baron-Cohen et al., 2003).
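For readers unfamiliar with the statistic, the sketch below shows how a Cronbach alpha coefficient of the kind reported above can be computed from a participants-by-items score matrix. It is an illustrative Python sketch with random placeholder data, not the study's SPSS analysis (real EQ-Short data yielded alpha = .90; random data will give a value near zero):

# Illustrative Cronbach's alpha computation (placeholder data, hypothetical
# names; the study computed alpha in SPSS).
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: participants x items matrix of scored responses (0-2 per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_variance = items.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_variances / total_variance)

rng = np.random.default_rng(0)
demo = rng.integers(0, 3, size=(142, 22))  # placeholder EQ-Short item scores
print(round(cronbach_alpha(demo), 2))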

The Facial Expression & Emotion Database (FEED)

The Facial Expressions and Emotions Database (FEED) was created as part of the European

Union project FG-NET (Face and Gesture Recognition Research Network), by Frank

Wallhoff at the Technical University in Munich (Wallhoff, 2006). The FEED contains 18

subjects (9 male and 9 female), who each express the six basic emotions as defined by Ekman and Friesen (1971). The database seeks to provide users with non-posed, natural facial expressions; therefore emotions were evoked by the subjects watching video clips and looking at still images. The predominantly Caucasian ethnicity of the FEED is representative of the predominant ethnicity of Western Europe, where the database originates and,


indeed, where this study took place. For the purposes of this study, six subjects were chosen from the FEED based on their sex, expressionality and likeability (see Appendix 5).

Procedure

Participants accessed the study in their own time via the internet. They were asked to

carefully read the study information before agreeing to participate (see Appendix 2). After

reading the information, participants were asked to provide their written consent by creating a

unique identifier, and then provide details of their age and sex before starting the study. Once

participants had completed all parts of the study, they were taken to an embedded debrief

page and thanked earnestly for their time.

Reflecting on Methods

This study was administered online due to the large and varied sample that this method is

able to provide. In addition, it was believed that this method would be most conducive to the

ecological validity of the study. Feedback from the pilot study revealed that some instructions

needed to be clearer and were thus altered accordingly. In addition, the pilot study

highlighted that some participants struggled to watch the videos due to the capabilities of the

devices which they accessed the study from. As a result, the invitations for the study made it

clear that the study would involve watching videos streamed from YouTube, and therefore

participants would need to make sure that the device they accessed the study from was

capable of doing this. When designing the study it was decided that the word ‘empathy’

should be used minimally throughout, in order to reduce any priming or experimenter-effects.

As the study was only administered online, there is a common method variance issue; therefore future studies could benefit from administering parts of the study both online and in person.


Results

Data were analysed using SPSS. The data met parametric assumptions, as participants were randomly sampled and Levene's tests confirmed homogeneity of variance. A bivariate correlation revealed that scores on the EQ-Short (M = 23.89, SD = 9.26, 95% CI = 22.36–25.43, 5% trimmed mean = 24.07) correlated significantly with rates of accuracy (M = 22.02, SD = 4.37, 95% CI = 21.30–22.75, 5% trimmed mean = 22.33) when identifying the six emotional facial expressions: r = .245, n = 142, p = .003. The coefficient of determination revealed that scores on the EQ-Short explained 6% of the variance in accuracy scores, demonstrating a positive linear relationship between the two variables with weak-to-moderate homoscedasticity: R² = .060, p = .003 (see Scatterplot 1 below). The removal of outliers resulted in a minor increase in the shared variance, to 6.05%, between EQ-Short scores (M = 23.97, SD = 9.30, 95% CI = 22.42–25.52, 5% trimmed mean = 24.16) and accuracy scores (M = 22.31, SD = 3.67, 95% CI = 21.69–22.92, 5% trimmed mean = 22.43): r = .246, n = 140, p = .003 (R² = .0605, p = .003). Further bivariate correlations between rates of accuracy and EQ-Short scores revealed which individual emotions correlated significantly with the EQ-Short (p < .05) (see Figure 3).

Scatterplot 1: Correlation between EQ-Short scores (x-axis) and accuracy scores (y-axis); R² = .060.
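The correlation and shared-variance figures reported above can be reproduced in outline as follows; this is a sketch using synthetic data and scipy, not the study's SPSS output:

# Sketch of the correlation analysis (synthetic data; the study found
# r = .245, n = 142, p = .003 using SPSS).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
eq_short = rng.normal(23.89, 9.26, 142)                   # simulated EQ-Short scores
accuracy = 20 + 0.12 * eq_short + rng.normal(0, 4, 142)   # simulated accuracy scores

r, p = stats.pearsonr(eq_short, accuracy)
print(f"r = {r:.3f}, p = {p:.3f}, R^2 = {r**2:.3f}")  # R^2 = proportion of shared variance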


Figure 3: Correlations between the EQ-Short and accuracy scores for each of the six emotions (sadness, anger, surprise, disgust, fear, happiness); N = 142.

Post-hoc testing exploring the recognition rates for each of the six emotions revealed that happiness, surprise, disgust and sadness were identified significantly more accurately than fear and anger (see Figure 4). This was evident in the significant drop in accurate recognition rates between the fourth and fifth most recognised emotions, sadness and fear (M = 1.76, SD = 1.82), t(141) = 11.52, p < .001. The difference between the most accurately recognised emotion (happiness) and the least accurately identified emotion (anger) was therefore also significant (M = 4.24, SD = 1.45), t(141) = 34.76, p < .001 (see Figure 4). In addition, a paired-samples t-test revealed that happiness was recognised significantly more accurately than any other emotion. This was evident in the significant difference between happiness and the second most accurately identified emotion, surprise (M = .563, SD = 1.29), t(141) = 5.22, p < .001. Despite anger being the least recognised facial expression, it was the only emotion which produced a significant sex difference in accuracy rates. This was found using a one-way analysis of variance, which revealed that females (M = 1.45, SD = 1.04) were significantly more accurate than males (M = 1.08, SD = 1.18) at identifying anger: F(1, 140) = 3.92, p = .050.
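As an illustration of the paired-samples comparisons used here, the following sketch contrasts per-participant accuracy for two emotions. The data are synthetic and the variable names hypothetical; the study itself used SPSS:

# Paired-samples t-test sketch: per-participant accuracy for two emotions,
# each scored out of the six videos shown per emotion (synthetic data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
happiness = rng.binomial(6, 0.90, 142)  # accuracy out of 6 happiness videos
surprise = rng.binomial(6, 0.75, 142)   # accuracy out of 6 surprise videos

t, p = stats.ttest_rel(happiness, surprise)  # paired: same participants rated both
print(f"t({len(happiness) - 1}) = {t:.2f}, p = {p:.3f}")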


Figure 4: Mean accuracy scores when identifying each of the six facial expressions (happiness, surprise, disgust, sadness, fear, anger).

A one-way analysis of variance highlighted that females scored significantly higher (M = 25.35, SD = 9.48) than males (M = 22.02, SD = 8.70) on the EQ-Short: F(1, 140) = 4.64, p = .033. However, one-way analyses of variance also revealed that there were no significant differences between females (M = 22.48, SD = 4.41) and males (M = 21.44, SD = 4.28) in their levels of accuracy when identifying emotional facial expressions, F(1, 140) = 1.990, p = .161, nor between females (M = 17.74, SD = 13.86) and males (M = 16.76, SD = 14.66) on the resonance measure, F(1, 140) = 0.166, p = .684.
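A one-way ANOVA of this kind can be sketched as follows; the data are simulated from the reported group means and SDs, and the actual analysis was run in SPSS:

# One-way ANOVA sketch for the sex difference on the EQ-Short
# (synthetic data; the study reported F(1, 140) = 4.64, p = .033).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
females = rng.normal(25.35, 9.48, 80)  # simulated female EQ-Short scores
males = rng.normal(22.02, 8.70, 62)    # simulated male EQ-Short scores

f, p = stats.f_oneway(females, males)
print(f"F(1, {len(females) + len(males) - 2}) = {f:.2f}, p = {p:.3f}")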

Resonance (M = 17.31, SD = 14.17, 95% CI = 14.96–19.66, 5% trimmed mean = 16.60) was not significantly correlated with accuracy (r = .023, n = 142, p = .790), nor with EQ-Short scores (r = .145, n = 142, p = .084). A paired-samples t-test revealed that there was a significant difference between the resonance scores for happiness and those for the other emotions (see Figure 5). This was demonstrated by the difference in resonance scores between the most resonated-with emotion, happiness (M = 5.09, SD = 4.19), and the second most resonated-with emotion, disgust (M = 2.73, SD = 3.13): t(141) = 7.31, p < .001 (see Figure 5). Throughout the resonance measure, a pattern of high variability was demonstrated by the large standard deviations evident within the data (see Figures 5, 6, 7 & 9).


Figure 5: Mean resonance levels for the six emotions (happiness, disgust, sadness, surprise, anger, fear).

For the purposes of analysis, participants were split into three age groups (see Table 1). Differences across the three age groups in EQ-Short and accuracy scores were non-significant. In addition, an independent t-test revealed that the difference in resonance scores between the two most divergent groups, Group 3 (M = 19.34, SD = 15.86) and Group 2 (M = 13.21, SD = 15.99), was also non-significant: t(55) = 1.45, p = .152.

Group   Age range        Number of participants
1       18–30 yrs old    n = 85
2       31–43 yrs old    n = 29
3       44–70 yrs old    n = 28

Table 1: Properties of the three age groups
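Assuming a simple tabular data layout (the column names here are hypothetical), the age-group split shown in Table 1 could be produced with pandas as follows:

# Sketch of the age-group split described in Table 1 (hypothetical column
# names; synthetic ages, so the counts will not match the study's 85/29/28).
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
df = pd.DataFrame({"age": rng.integers(18, 71, 142)})  # simulated participant ages
df["age_group"] = pd.cut(df["age"], bins=[17, 30, 43, 70], labels=[1, 2, 3])
print(df["age_group"].value_counts().sort_index())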

The three female facial stimuli prompted the highest accuracy and resonance scores, for both male and female participants (see Figure 6). A paired-samples t-test revealed that the difference in responses to the male versus the female videos was significant for both accuracy, t(141) = 5.595, p < .001, and resonance, t(141) = 3.834, p < .001.


Figure 6: Differences in participant accuracy and resonance scores for the female versus the male facial stimuli.

A one-way between-groups multivariate analysis of variance was utilised to explore possible inter-sex differences in accuracy and resonance rates for the male and female participants, in relation to the male and female facial stimuli. Four dependent variables were used: accuracy with female stimuli, accuracy with male stimuli, resonance with female stimuli and resonance with male stimuli. The independent variable was sex. Checks were performed to test whether the assumptions of normality, linearity, univariate and multivariate outliers, homogeneity of variance-covariance matrices, and multicollinearity could be accepted, and no violations were discovered. There was no significant difference between males and females on the combined dependent variables: F(1, 142) = .982, p = .639; Wilks' Lambda = .634; partial eta squared = .018. Therefore, no significant inter-sex differences were found between the scores of the male and female participants in response to the male and female facial stimuli (see Figure 7). In the resonance measure, the pattern of high variability in scores is again demonstrated (see Figure 7 below).
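For illustration, a one-way MANOVA with these four dependent variables can be sketched with statsmodels as below; the data and column names are hypothetical, and the study's own analysis was conducted in SPSS:

# One-way MANOVA sketch: sex as the IV, four DVs (synthetic data,
# hypothetical column names; not the study's SPSS analysis).
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(5)
n = 142
df = pd.DataFrame({
    "sex": rng.choice(["male", "female"], n),
    "acc_female_stim": rng.normal(12, 2, n),  # accuracy, female stimuli
    "acc_male_stim": rng.normal(10, 2, n),    # accuracy, male stimuli
    "res_female_stim": rng.normal(10, 7, n),  # resonance, female stimuli
    "res_male_stim": rng.normal(8, 7, n),     # resonance, male stimuli
})

m = MANOVA.from_formula(
    "acc_female_stim + acc_male_stim + res_female_stim + res_male_stim ~ sex",
    data=df,
)
print(m.mv_test())  # reports Wilks' lambda (among others) for the sex effect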


Figure 7: Inter-sex differences in accuracy and resonance levels when identifying male vs. female facial expressions (male (M) and female (F) participant scores for each of the four measures).

There were no significant sex differences between the scores of male and female participants for each of the six FEED subjects in the accuracy measure. However, there were significant differences in accuracy scores when recognising the emotions of Female 1 (F1) (M = 4.24, SD = 1.12) (see Appendix 6a) versus Male 3 (M3) (M = 2.80, SD = 0.80) (see Appendix 6b): t(141) = 14.779, p < .001 (see Figure 8).

Figure 8: Differences in accuracy scores in relation to the six individuals featured within the videos.


A similar pattern was identified in the resonance scores in response to the six FEED subjects, demonstrated by the significant difference between the resonance scores for F1 (M = 3.23, SD = 2.84) (see Appendix 6a) and M3 (M = 2.56, SD = 2.23) (see Appendix 6b): t(141) = 4.46, p < .001 (see Figure 9).

Figure 9: Differences in resonance scores in relation to the six individuals (F1, M1, F2, M2, F3, M3) featured in the videos.

Discussion

Analysis

1. Scores from the EQ-Short will correlate positively with recognition and resonance scores.

The weak yet highly significant positive correlation between EQ-Short scores and accuracy when identifying facial expressions supports the hypothesis that empathy and facial expression recognition are inter-related. The relatively low level of variance shared between the two variables could support the analysis of the EQ-Short which suggests that it has an emphasis on measuring cognitive empathy, rather than the construct as a whole (Besel & Yuille, 2010). The EQ-Short did not correlate significantly with resonance scores, which again may be indicative of the EQ-Short being predominantly a cognitive empathy measure, as the resonance bar was better suited to measuring affective empathy (Besel & Yuille, 2010). As such, future studies could consider using either a more comprehensive scale or multiple empathy scales, in order to cover the wider construct of empathy, including both its cognitive and affective components.


2. Females will score higher than males on the EQ-Short.

3. There will be sex-differences in recognition and resonance scores.

Analysis revealed that females scored significantly higher than males on the EQ-Short. This finding supports previous research which has found that females score higher on self-report measures of empathy (Baron-Cohen, 2003; Baron-Cohen & Wheelwright, 2004; Goldenfeld et al., 2010). However, as this pattern was not echoed significantly in accuracy and resonance scores, these results could lend weight to the critique that, rather than highlighting an essential sex difference in empathic abilities, this finding may reflect a difference in gendered emotion display rules, or perhaps even that being empathic is more socially desirable for females than for males (Ickes et al., 2000; Martin et al., 1996; Riggio et al., 1989). Despite anger being the least accurately recognised emotion, it was the only one which produced a significant sex difference in accuracy rates, whereby females were significantly more accurate at identifying it than males. This anger bias is thus a finding that future studies would benefit from exploring further.

4. There will be inter-sex differences in recognition and resonance scores when viewing

male vs. female facial stimuli.

There was a significant difference in how participants related to the sex of the subjects in the videos. The female subjects' facial expressions were identified more accurately than the males' (t(141) = 5.595, p < .001). In addition, resonance scores for the female stimuli were significantly higher than those recorded for the male subjects, lending further support to the argument that men and women are not significantly different in empathic ability (Ickes et al., 2000; Martin et al., 1996; Riggio et al., 1989). Furthermore, these results may indicate that females are somehow more emotionally expressive than males, which, if accurate, would support the research by Côté and Hideg (2011), which sees superior emotional expression as a skill. These findings also suggest that both males and females may be more open to being empathic with females than with males. Thus, this is another area which would benefit from further study.


5. There will be differences in recognition and resonance scores for each of the 6 FEED

Subjects.

There were significant differences in both the accuracy and resonance scores in response to the six FEED subjects, specifically between Female 1 (F1) (see Appendix 6a), who had the highest accuracy and resonance scores, and Male 3 (M3) (see Appendix 6b), who had the lowest accuracy and resonance scores. These findings could be indicative of F1's superior skill at conveying her emotions (Côté & Hideg, 2011). Furthermore, they could also be related to her level of empathic ability: participants were watching her watch someone or something else, to which she was emoting and with which she was empathising; therefore the resonance scores may actually reflect the empathic abilities of the subjects, as well as those of the participants (Dimberg et al., 2011).

6. There will be higher accuracy and emotional resonance scores for the happiness facial

stimuli.

Happiness was identified significantly more accurately than any other emotion. This complements previous findings which have highlighted that happiness is the emotion most physiologically responded to (Donegan et al., 2003). Perhaps this physiological effect produces a resulting boost in the salience of happiness (Becker & Leinenger, 2011). Happiness, along with surprise, disgust and sadness, was identified significantly more accurately than fear and anger. These results may indicate that, at the time of participating in the study, anger and fear were less emotionally salient for the majority of participants (Becker & Leinenger, 2011). Alternatively, anger may have been recognised the least due to the diluted nature of the representation of anger expressed within the videos. Again, it is important to note that participants watched the subjects watching someone or something else. The anger stimulus that the subjects were viewing is likely to be a primary source (i.e. someone getting very angry), whereas what participants then experienced as viewers is a secondary source. In contrast, it is believed that the responses to happiness, sadness, disgust, surprise and fear would all cause a primary replication of the emotion in the subjects, thus enabling participants to recognise them more easily and accurately. As such, this again may be indicative of how empathic the subjects are, in addition to the participants.


Happiness was resonated with significantly more than the other five emotions, again supporting the finding that happy expressions produce the highest level of physiological activation in empathisers (Donegan et al., 2003). Conversely, happiness could have been resonated with most due to the social desirability attached to being happy, or rather, to being seen as happy. Alternatively, this finding could be due to the scoring system: intuitively, feeling a lot of happiness equates to '+2' on the resonance bar, and feeling a lot of sadness maps just as easily onto '-2', but it is not as easy to place surprise, fear, disgust and anger on the polarised bar. The design of this methodological construct may therefore have contributed to the high patterns of variability in levels of resonance expressed by both sexes, and across all of the age groups (see Figures 5, 6, 7 & 9). The high pattern of variability in resonance levels may also be indicative of the multi-faceted nature of each emotion, highlighting that there are different types of sadness, happiness, and so on, some of which are negative and some of which are positive. It could thus be that the Cambridge Scoring System was not equipped to account for this level of variability. This pattern of variability could also be attributed to the highly subjective and individual nature of emotion. Indeed, perhaps this variability is attributable to the different environments and conditions that participants were in when they took part in the study (Foroni & Semin, 2011; Hein & Singer, 2008; Matsumoto, 1991; Vignemont & Singer, 2006). As this was an online study, there is no way of knowing what emotional or mental states participants were in at the time they took part, further emphasising the possibility of increased variability in their emotional states and resulting levels of emotional resonance. Future studies would therefore benefit from seeking information regarding participants' emotional baselines at the beginning of each section of the study, in order to explore the effects that emotional states can have on empathic abilities (Becker & Leinenger, 2011).

7. There will be significant age-differences in the EQ-Short, accuracy and resonance scores.

Though all tests for age-related differences across the three measures were non-significant, it was, interestingly, the eldest age group which displayed the least resonance with the videos. Perhaps this could be attributable to an age-related type of 'compassion fatigue' (Armstrong, 2009); or perhaps to the elder generation being less used to watching videos on YouTube and therefore being less empathically open because the context is unfamiliar to them, especially in contrast to their younger counterparts, who are more likely to be familiar with watching videos via this medium. Their lower resonance scores could also be a product of all the subjects in the FEED videos being roughly the same age as each other (around 30 years old), which may make them more relatable to the youngest and middle-aged participants than to the eldest ones. Conversely, this lower score could be indicative of the natural decline of the MNS with age (Bastiaansen et al., 2011).

Additional Limitations and Suggestions for Future Research

Feedback from participants revealed that they heavily doubted their ability to accurately identify the facial expressions, which resulted in them experiencing anxiety and, in turn, may have interfered with their empathic abilities. Future studies could allow participants to pick multiple emotions, or to give open-ended answers instead, as this may help to alleviate insecurity and elevate accuracy rates. Secondly, as this study was only administered once per participant, it has in reality afforded only a snapshot representation of participants' potential empathic abilities. Future studies could therefore administer the experimental measures on multiple occasions, in order to gain a more rounded view of participants' abilities and to increase the study's temporal validity and reliability. Multiple administrations would also allow for the exploration of the effects of the contexts within which participants took part in the study, as these have been shown to play a significant role in empathic abilities (Foroni & Semin, 2011; Hein & Singer, 2008; Matsumoto, 1991; Vignemont & Singer, 2006).

The FEED stimuli used in this study may lack ecological validity, as the empathiser is not experiencing the emotions of another person face-to-face, nor in a real-world setting. However, it could be argued that the ubiquitous modern use of computers, televisions and smartphones means that people are now accustomed to relating to, and empathising with, others through this type of medium. An additional limitation of the FEED stimuli is that they provided only one channel of emotion – the visual. If the FEED videos had incorporated sound, this may have increased accurate recognition and resonance rates. Furthermore, the FEED featured only Caucasian subjects; this may have


produced race effects in expression recognition which cannot be accounted for, as data on participant race/ethnicity was not collected. Future studies would therefore benefit from altering the stimuli to include multiple ethnicities and could, in addition, explore race effects in empathy and expression recognition when the empathiser is of the same, and of a differing, ethnic origin.

Conclusions and Implications

The results of this study support previous findings which have highlighted a link between facial expression recognition and empathy. It is hoped that by further illuminating this connection, a deeper understanding can be achieved of conditions in which this capacity is impaired or heightened. This is especially true in educational, health and workplace settings, where people with these conditions would greatly benefit from increased levels of support, understanding and awareness of the issues that they face on a day-to-day basis. Knowledge of the connection between empathy and facial expression recognition abilities can also bring great hope to sufferers and their families, as research has shown that there are effective ways in which these two abilities can be mutually enhanced (Domes et al., 2007; Duke & Nowicki, 1994; Grinspan et al., 2003; Hopkins, 2007; Hopkins et al., 2011; Trepagnier et al., 2011). In addition, at the opposite end of the spectrum, knowledge of the hyperactive empathic responses and disturbances in facial expression recognition found in conditions such as BPD, major depression, PTSD and social anxiety can provide sufferers and their families with great comfort, by enabling sufferers to learn skills that help them modulate and manage their heightened responses to emotional stimuli (Blair, 2005; Demenescu et al., 2010; Hirstein & Ramachandran, 1997; Poljac et al., 2011; Wagner & Linehan, 1999).

Furthermore, this and other studies have highlighted how powerful seeing others experience happiness is, and reveal the profound effects it can have both physiologically and emotionally. This finding has substantial implications for understanding the positive and negative effects that the environments we inhabit, and the people we are exposed to on a daily basis, can have on our minds and bodies. In addition, this happiness effect could be utilised by mental-health professionals to design treatments to positively support and


help people suffering from disorders of empathy and facial expression recognition such as BPD, depression, PTSD and social anxiety.

Further Information

For an excellent video introduction to empathy, which explores its historical context and looks at its relevance in our modern-day world, see The Empathic Civilisation (Rifkin, 2010). In addition, for an enlightening video introduction to mirror neurons, see Menon (2011).

Acknowledgements

The author would like to thank Phil Banyard for his enduring help in bringing this project to

fruition; Frank Wallhoff for creating the FEED and allowing the use of it in this study; and

Prof. Dr Akio Wakabayashi for his part in creating the EQ-Short and for giving his permission

for it to be used for the purposes of this study.


References

App, B., McIntosh, D.N., Reed, C. L., & Hertenstein, M.J. (2011). Nonverbal channel use in

communication of emotion: how may depend on why. Emotion, 11(3), 603-617.

Armstrong, K. (2009). Let's revive the golden rule. [Web Video]. TED.com. Retrieved from

http://www.ted.com/talks/karen_armstrong_let_s_revive_the_golden_rule.html.

Averill, J. (1980). A constructivist view of emotion. In R. Plutchik & H. Kellerman (Eds.), Theories of Emotion (pp. 306-312, 318-322, 326-329). New York: Academic Press.

Aziz-Zadeh, L., Sheng, T., Liew, S.-L., & Damasio, H. (2011). Understanding Otherness: The Neural Bases of Action Comprehension and Pain Empathy in a Congenital Amputee. Cerebral Cortex. Advance online publication. doi:10.1093/cercor/bhr139.

Baron-Cohen, S. (2003). The essential difference: Men, women and the extreme male brain. London:

Penguin.

Baron-Cohen, S. (2011). Zero Degrees of Empathy: A new theory of human cruelty. London: Penguin.

Baron-Cohen, S., & Wheelwright, S. (2004). The Empathy Quotient: an investigation of adults with Asperger syndrome or high functioning autism, and normal sex differences. Journal of Autism and Developmental Disorders, 34(2), 163–175.

Baron-Cohen, S., Richler, J., Bisarya, D., Gurunathan, N., & Wheelwright, S. (2003). The

systemizing quotient: an investigation of adults with Asperger syndrome or high-functioning

autism, and normal sex differences. Philosophical Transactions of the Royal Society London B

Biological Sciences, 358, 361-374.

Baron-Cohen, S., Wheelwright, S., Hill, J., Raste, Y., & Plumb, I. (2001). The 'Reading the Mind in the Eyes' Test revised version: A study with normal adults, and adults with Asperger syndrome or high-functioning autism. Journal of Child Psychology and Psychiatry, 42, 241–251.

Barrett, L. F. (2011). Was Darwin wrong about emotional expressions? Current Directions in

Psychological Science, 20, 400-406.

Bastiaansen, J. A., Thioux, M., Nanetti, L., Gaag, C. V. D., Ketelaars, C., Minderaa, R., & Keysers,

C. (2011). Age-Related Increase in Inferior Frontal Gyrus Activity and Social Functioning in

Autism Spectrum Disorder. Biological Psychiatry, 69(9), 832-838.


Becker, M. W., & Leinenger, M. (2011). Attentional selection is biased toward mood-congruent

stimuli. Emotion, 11(5), 1248-1254. 

Besel, L. D. S. (2007). Empathy: The role of facial expression recognition. (Unpublished doctoral

dissertation). University of British Columbia, Vancouver.

Besel, L.D.S., & Yuille, J.C. (2010). Individual differences in empathy: The role of facial expression

recognition. Personality and Individual Differences, 47, 107-112.

Blair, R. J. R. (2005). Responding to the emotions of others: Dissociating forms of empathy

through the study of typical and psychiatric populations. Consciousness and Cognition,

14, 698–718.

Chakrabarti, B., Bullmore, E., & Baron-Cohen, S. (2006). Empathizing with basic emotions: Common

and discrete neural substrates. Social Neuroscience, 1(3-4), 364-384.

Cohen, S., Schulz, M. S., Weiss, E., & Waldinger, R. J. (2012). Eye of the Beholder: The Individual

and Dyadic Contributions of Empathic Accuracy and Perceived Empathic Effort to Relationship

Satisfaction. Journal of Family Psychology. Advance online publication. doi: 10.1037/a0027488.

Retrieved from http://www.apa.org/pubs/journals/releases/fam-ofp-cohen.pdf

Côté, S., & Hideg, I. (2011). The ability to influence others via emotion displays: A new dimension of emotional intelligence. Organizational Psychology Review, 1(1), 53-71.

Darwin, C. (2009). The expression of the emotions in man and animals. Hammersmith, London: Harper Perennial. (Original work published 1872).

Davis, M. H. (1983). Measuring individual differences in empathy: Evidence for a multidimensional

approach. Journal of Personality and Social Psychology, 44(1), 113-126.

Decety, J., & Hodges, S. D. (2006). The social neuroscience of empathy. In P. A. M. van Lange

(Ed.), Bridging social psychology: Benefits of transdisciplinary approaches (pp. 103-

110). Mahwah, NJ: Erlbaum.

Demenescu, L. R., Kortekaas, R., den Boer, J. A., & Aleman, A. (2010). Impaired Attribution of Emotion to Facial Expressions in Anxiety and Major Depression. PLoS ONE, 5(12), 1-5. doi:10.1371/journal.pone.0015058.


Dimberg, U., Andréasson, P., & Thunberg, M. (2011). Emotional Empathy and Facial Reactions to

Facial Expressions. Journal of Psychophysiology, 25(1), 26–31.

Dimberg, U., Thunberg, M., & Elmehed, K. (2000). Unconscious facial reactions to emotional facial

expressions. Psychological Science, 11, 86–89.

Domes, G., Heinrichs, M., Michel, A., Berger, C., & Herpertz, S. C. (2007). Oxytocin improves

"mind-reading" in humans. Biological Psychiatry, 61(6), 731-733.

Donegan, N. H., Sanislow, C. A., Blumberg, H. P., Fulbright, R. K., Lacadie, C., Skudlarski, P., Gore,

J. C., Olson, I. R., McGlashan, T. H., & Wexler, B. E. (2003). Amygdala hyperreactivity in

borderline personality disorder: implications for emotional dysregulation. Biological Psychiatry,

54(11), 1284-1293.

Duke, M. P., Jr., & Nowicki, S., Jr. (1994). Dyssemia in school: How to educate for nonverbal

comprehension. Principal's Journal, 73, 21-24.

Ekman, P. (1992). Facial expressions of emotion: New findings, new questions. Psychological Science, 3, 34-38.

Ekman, P. (1999). Basic emotions. In T. Dalgleish & T. Power (Eds.), The Handbook of Cognition and Emotion (pp. 45-60). Sussex, U.K.: John Wiley & Sons, Ltd.

Ekman, P., & Davidson, R. J. (1993). Voluntary smiling changes regional brain activity. Psychological Science, 4(5), 342-345.

Ekman, P., & Friesen, W. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124-129.

Field, T. M., Woodson, R., Greenberg, R., & Cohen, D. (1982). Discrimination and imitation of facial expression by neonates. Science, 218, 179-181.

Fisher, C. (2011). The Mirror Neuron System Slowly Develops in Autism. The Behavioral Medicine

Report. Retrieved from http://www.bmedreport.com/archives/27282

Foroni, F., & Semin, G. R. (2011). When does mimicry affect evaluative judgment?.  Emotion, 11(3),

687-690.

Gabriel, S., & Young, A. F. (2011). Becoming a vampire without being bitten: The narrative collective-assimilation hypothesis. Psychological Science, 22(8), 990-994.


Gallese, V., Gernsbacher, M. A., Heyes, C., Hickok, G., & Iacoboni, M. (2011). Mirror Neuron Forum. Perspectives on Psychological Science, 6, 369-407.

Goldenfeld, N., Baron-Cohen, S., Wheelwright, S., Ashwin, C., & Chakrabarti, B. (2010).

Empathizing and systemizing in males, females, and autism: a test of the neural competition

theory. In T. Farrow & P. Woodruff (Eds.), Empathy and Mental Illness (pp. 1-16). Cambridge:

Cambridge University Press.

Grinspan, D., Hemphill, A., & Nowicki Jr., S. (2003). Improving the Ability of Elementary School-

Age Children to Identify Emotion in Facial Expression. Journal of Genetic Psychology, 164(1),

88.

Harari, H., Shamay-Tsoory, S. G., Ravid, M., & Levkovitz, Y. (2010). Double dissociation between

cognitive and affective empathy in borderline personality disorder. Psychiatry Research, 175(3),

277-279.

Hein, G., & Singer, T. (2008). I feel how you feel but not always: the empathic brain and its

modulation. Current Opinion in Neurobiology, 18(2), 153-158.

Hirstein, W., & Ramachandran, V.S. (1997). Capgras Syndrome: A Novel Probe for Understanding

the Neural Representation of the Identity and Familiarity of Persons. Proceedings: Biological

Sciences, 264(1380), 437-444.

Hojat, M., Louis, D., Markham, F., Wender, R., Rabinowitz, C., & Gonnella, J. (2011). Physicians' Empathy and Clinical Outcomes for Diabetic Patients. Academic Medicine, 86(3), 359-364.

Hopkins, I. M. (2007). Demonstration and evaluation of avatar assistant: Encouraging social

development in children with autism spectrum disorders. (Unpublished doctoral dissertation).

The University of Alabama, Birmingham.

Hopkins, I. M., Gower, M. W., Perez, T. A., Smith, D. S., Amthor, F. R., Wimsatt, F. C, & Biasini, F.

J. (2011). Avatar Assistant: Improving Social Skills in Students with an ASD through a

Computer-Based Intervention. Journal of Autism and Developmental Disorders, 41(11), 1543-

1555.

Iacoboni, M. (2009). Imitation, empathy, and mirror neurons. Annual Review of Psychology, 60, 653-

670.


Iacoboni, M., Molnar-Szakacs, I., Gallese, V., Buccino, G., Mazziotta, J. C., & Rizzolatti, G. (2005). Grasping the intentions of others with one's own mirror neuron system. PLoS Biology, 3(3), e79. doi:10.1371/journal.pbio.0030079. Accessed from http://www.plosbiology.org/article/info:doi/10.1371/journal.pbio.0030079

Ickes, W., Gesn, P. R., & Graham, T. (2000). Gender differences in empathic accuracy: Differential

ability or differential motivation?. Personal Relationships, 7, 95–109. 

Dalai Lama (Gyatso, T.), & Ekman, P. (2008). Emotional awareness: Overcoming the obstacles to psychological balance and compassion: A conversation between the Dalai Lama and Paul Ekman. [Audio Book]. USA: Macmillan Audio. Available from http://www.paulekman.com/publications/recentbooks/.

Lawrence, E. J., Shaw, P., Baker, D., Baron-Cohen, S., & David, A. S. (2004). Measuring empathy: reliability and validity of the Empathy Quotient. Psychological Medicine, 34, 911–919.

Lynch, T.R., Rosenthal, M.Z., Kosson, D.S., Cheavens, J.S., Lejuez, C.W., & Blair, R.J. (2006).

Heightened sensitivity to facial expressions of emotion in borderline personality disorder.

Emotion, 6, 647–655.

Martin, D., Slessor, G., Allen, R., Phillips, L. H., & Darling, S. (2012). Processing orientation and

emotion recognition. Emotion, 12(1), 39-43.

Martin, R. A., Berry, G. E., Dobranski, T., Horne, M., & Dodgson, P. G. (1996). Emotion perception threshold: Individual differences in emotional sensitivity. Journal of Research in Personality, 30(2), 290-305.

Matsumoto, D. (1991). Cultural influences on facial expressions of emotion. Southern

Communication Journal, 56, 128-137.

Matsumoto, D., & Hwang, H. S. (2012). Culture and emotion: The integration of biological and

cultural contributions. Journal of Cross-Cultural Psychology, 43(1), 91-118.

Mead, M. (1975). Review of Darwin and Facial Expression. Journal of Communication, 25, 209-213.

Menon, D. (2011). Mirror Neurons Part 1 & 2 [Video Files]. In Monkey See, Monkey Do? The Role of

Mirror Neurons in Human Behavior. Association for Psychological Science. Accessed from

http://www.psychologicalscience.org/index.php/news/releases/monkey-see-monkey-do-the-role-

of-mirror-neurons-in-human-behavior.html.


Myers, D. G. (2004). Theories of emotion. In Psychology (7th ed., pp. 500-507). New York, NY: Worth Publishers.

Oberman, L. M., & Ramachandran, V. S. (2007). The simulating social mind: the role of the mirror neuron system and simulation in the social and communicative deficits of autism spectrum disorders. Psychological Bulletin, 133(2), 310-327.

Poljac, E., Montagne, B., & De Haan, E. H. F. (2011). Reduced recognition of fear and sadness in post-traumatic stress disorder. Cortex, 47(8), 974-980.

Psychological Research on the Net. (2012). Hanover College. Accessed from

http://psych.hanover.edu/research/exponnet.html

Ramachandran, V. S. (2000). Mirror neurons and imitation learning as the driving force behind "the great leap forward in human evolution". Edge: The Third Culture, 69. Retrieved from http://edge.org/conversation/mirror-neurons-and-imitation-learning-as-the-driving-force-behind-the-great-leap-forward-in-human-evolution

Reissland, N., Francis, B., Mason, J., & Lincoln, K. (2011). Do facial expressions develop before

birth?. PLoS ONE, 6(8).

Rifkin, J. (2010). The Empathic Civilisation. The Royal Society for the encouragement of Arts, Manufactures and Commerce. [Video file]. Accessed from http://www.youtube.com/watch?v=l7AWnfFRc7g&feature=related.

Riggio, R. E., Tucker, J., & Coffaro, D. (1989). Social skills and empathy. Personality and Individual

Differences, 10, 93–99.

Russell, J. A. (1994). Is there universal recognition of emotion from facial expression? A review of

cross-cultural studies. Psychological Bulletin, 115, 102-141.

Russell, J. A. (1995). Facial expressions of emotion: what lies beyond minimal universality?.

Psychological Bulletin, 118, 379-391.

Scherer, K., R. (2005). What are emotions? And how can they be measured?. Social Science

Information, 44(4), 695-729.

Shamay-Tsoory, S. G. (2011). Psychopathy, brain damage and empathy [Press Release]. University

of Haifa. Retrieved from http://newmedia-eng.haifa.ac.il/?p=4426


Shamay-Tsoory, S. G. (2011). The Neural Bases for Empathy. Neuroscientist, 17, 18-24.

Shamay-Tsoory, S. G., Aharon-Peretz, J., & Perry, D. (2009). Two systems for empathy: a double dissociation between emotional and cognitive empathy in inferior frontal gyrus versus ventromedial prefrontal lesions. Brain: A Journal of Neurology, 132(3), 617-627.

Shamay-Tsoory, S. G., Harari, H., Aharon-Peretz, J., & Levkovitz, Y. (2010). The role of the

orbitofrontal cortex in affective theory of mind deficits in criminal offenders with psychopathic

tendencies. Cortex, 46, 668-677.

Shariff, A. F., & Tracy, J. L. (2011). What are emotion expressions for?. Current Directions in

Psychological Science, 20, 395–399.

Shore, D. M., & Heerey, E. A. (2011). The value of genuine and polite smiles. Emotion, 11(1), 169-

174.

Singer, T. (2006). The neuronal basis and ontogeny of empathy and mind reading: review of literature

and implications for future research. Neuroscience and Biobehavioral Reviews, 30, 855–863.

Singer, T., & Lamm, C. (2009). The social neuroscience of empathy. Annals of the New York

Academy of Sciences, 1156, 81–96.

Sorce, J. F., Emde, R. N., Campos, J. J., & Klinnert, M. D. (1985). Maternal emotional signalling: Its effect on the visual cliff behavior of 1-year-olds. Developmental Psychology, 21, 195-200.

The Inquisitive Mind. (2012). Online Research. The Inquisitive Mind. In-Mind.org. Accessed from

http://beta.in-mind.org/online-research?page=5

Trepagnier, C. Y., Olsen, D. E., Boteler, L., & Bell, C. A. (2011). Virtual Conversation Partner for

Adults with Autism. Cyberpsychology, Behavior, and Social Networking, 14(1-2), 21-27.

Verhofstadt, L.L., Buysse, A., Ickes, W., Davis, M., & Devoldre, I. (2008).  Support provision in

marriage: The role of emotional linkage and empathic accuracy. Emotion, 8, 792-802.

Vignemont, F. D., & Singer, T. (2006). The empathic brain: how, when and why?. Trends in

Cognitive Sciences, 10 (10), 435-441.

Wagner, A.W., & Linehan, M. M. (1999). Facial expression recognition ability among women with

borderline personality disorder: Implications for emotion regulation? Journal of Personality

Disorders, 13, 329–344.


Wai, M., & Tiliopoulos, N. (2012). The affective and cognitive empathic nature of the dark triad of personality. Personality and Individual Differences. [In press]. Available online 2 February 2012. Accessed from http://dx.doi.org/10.1016/j.paid.2012.01.008.

Wakabayashi, A., Baron-Cohen, S., Wheelwright, S., Goldenfeld, N., Delaney, J., Fine, D., Smith, R., & Weil, L. (2006). Development of short forms of the Empathy Quotient (EQ-Short) and the Systemizing Quotient (SQ-Short). Personality and Individual Differences, 41, 929-940.

Wallhoff, F. (2006). Facial Expressions and Emotion Database. Technische Universität München.

Accessed from http://cotesys.mmk.e-technik.tu-muenchen.de/isg/content/feed-database.

Zoomerang. (2012). Zoomerang.com. Retrieved from http://www.zoomerang.com/


Appendix 1

1a – Email Message

Do you consider yourself to be an empathic person?

Can you identify what emotion someone is feeling from their facial expression alone?

Are these two qualities related?

This psychological experiment is designed to explore these questions.

To take part you will need to be able to watch a selection of short videos streamed directly from YouTube - so it’s important that your computer / smart phone and internet connection are up to doing this.

If you know of anyone else who you think would be willing to complete this survey, please feel free to forward the survey link on to them.

This is for my third year dissertation so I'd really appreciate you taking part.

Many thanks & best wishes,
Rebecca

1b – Facebook Message

Do you consider yourself to be an empathic person?

Can you identify what emotion someone is feeling from their facial expression alone?

Are these two qualities related?

This psychological experiment is designed to explore these questions.

To take part you will need to be able to watch a selection of short videos streamed directly from YouTube - so it’s important that your computer / smart phone and internet connection are up to doing this.

If you know of anyone else who you think would be willing to complete this survey, please feel free to invite them to this group, or forward the survey link on to them. 

https://www.zoomerang.com/Survey/WEB22EBCLEVSAW


Appendix 2

Informed Consent Form

* Please read the following information carefully before deciding whether to participate in this research:

Why are you doing this research?
I want to understand the relationship between empathy and facial expression recognition. More specifically, I want to know whether people who rate highly on an empathy scale have increased facial expression recognition and emotional resonance.

Are there any risks?
There are no known risks associated with taking part in an experiment of this kind. Please keep in mind that you are able to end the experiment at any time by simply closing the browser window.

What do I get out of it?
If you are a fellow university student you will be awarded with research credits for taking part. In order to obtain your credits you will need to email [email protected], & quote your Unique Identifier & your N-Number. For everyone else, you will be awarded with my eternal gratitude.

Will you share my information with other people?
Your information will under no circumstances be shared. Your data will be anonymised, and only identifiable through a Unique Identifier - such as 'lotusflower1', which you will need to provide below. All data will be password-protected and secure.

What if I change my mind and I don't want to do the experiment anymore, or I want to have my data removed from the study?
Your participation in this study is completely voluntary and you may exit at any point by simply closing the browser window. If you decide you want to have your data removed from the study, this is possible up until February 17th 2012. In order to do this you will need to contact either myself ([email protected]) or my Project Supervisor, Phil Banyard ([email protected]). * However, please bear in mind that a by-product of doing this is that you will lose your anonymity.

What if I have questions or something to tell you?
If you have any questions, comments or queries please use the above contact details to get in touch.

What if I have questions about my rights in this research, have a complaint, or want to report a problem to someone besides the researcher?
You can contact Phil Banyard using the above details, or alternatively you can contact Nottingham Trent University directly on: +44 (0)115 941 8418

How do I agree to participate?
By providing a Unique Identifier & pressing 'Submit' below, you are confirming that you:

1. Agree to take part in this research.
2. Understand what you are getting into.
3. Understand you are free to leave the experiment at any point.
4. Are aged 18 years old, or above.

This research has been reviewed and approved by the Nottingham Trent University School Research Ethics Committee (SREC), and follows the British Psychological Society (BPS) & Nottingham Trent University (NTU) Codes of Conduct and Ethics Policies.

* I agree to take part in this research. I feel that I understand what I am getting into, and I know that I am free to leave the experiment at any time.


Appendix 3

Appendix 3a – Intro to the EQ-Short

Part 1

On the following page there will be a list of statements. 

Please read each statement very carefully and rate how strongly you agree or disagree with it by clicking next to the appropriate answer. 

There are no right or wrong answers, or trick questions.

Appendix 3b – Intro to the Videos and Resonance Bar

Part 2

In the second part of this experiment you will watch a selection of short videos which show a group of people expressing different emotions.

After watching each video you will need to decide which emotion you think is being expressed via their facial expression.

Finally, you will take a moment after each video to rate how you are feeling, using the scale provided. This is in order to chart your emotional state as you progress through the videos.


Appendix 4

The EQ-Short

1. (1.) I can easily tell if someone else wants to enter a conversation.

2. (3.) I really enjoy caring for other people.

3. (4.) I find it hard to know what to do in a social situation. a

4. (8.) I often find it difficult to judge if something is rude or polite. a

5. (9.) In a conversation, I tend to focus on my own thoughts rather than on what my listener

might be thinking. a

6. (11.) I can pick up quickly if someone says one thing but means another.

7. (12.) It is hard for me to see why some things upset people so much. a

8. (13.) I find it easy to put myself in somebody else’s shoes.

9. (14.) I am good at predicting how someone will feel.

10. (15.) I am quick to spot when someone in a group is feeling awkward or uncomfortable.

11. (18.) I can’t always see why someone should have felt offended by a remark. a

12. (21.) I don’t tend to find social situations confusing.

13. (22.) Other people tell me I am good at understanding how they are feeling and what they are

thinking.

14. (26.) I can easily tell if someone else is interested or bored with what I am saying.

15. (28.) Friends usually talk to me about their problems as they say that I am very

understanding.

16. (29.) I can sense if I am intruding, even if the other person doesn’t tell me.

17. (31.) Other people often say that I am insensitive, though I don’t always see why. a

18. (34.) I can tune into how someone else feels rapidly and intuitively.

19. (35.) I can easily work out what another person might want to talk about.

20. (36.) I can tell if someone is masking their true emotion.

21. (38.) I am good at predicting what someone will do.

22. (39.) I tend to get emotionally involved with a friend’s problems.

Numbers in parentheses are the item numbers in the original version of the scale.

a = Reversal items.
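For illustration only, the scoring of these items can be expressed programmatically. The short Python sketch below was not part of the study's materials: the helper names are hypothetical, the reversal items are taken from the 'a' markers above, and it assumes the standard EQ scoring convention of two points for a strong response in the empathic direction and one point for a slight response (Baron-Cohen & Wheelwright, 2004), giving a range of 0-44.

# Illustrative sketch only - not the study's scoring code.
# Assumes the standard EQ convention: 2 points for a strong response in the
# empathic direction, 1 point for a slight response, 0 otherwise (range 0-44).

REVERSAL_ITEMS = {3, 4, 5, 7, 11, 17}  # the items marked 'a' above

FORWARD = {"strongly agree": 2, "slightly agree": 1,
           "slightly disagree": 0, "strongly disagree": 0}
REVERSED = {"strongly agree": 0, "slightly agree": 0,
            "slightly disagree": 1, "strongly disagree": 2}

def score_item(item_number, response):
    """Score one item (0-2 points), reversing the direction for the 'a' items."""
    table = REVERSED if item_number in REVERSAL_ITEMS else FORWARD
    return table[response]

def eq_short_total(responses):
    """Total EQ-Short score; `responses` maps item number (1-22) to a response string."""
    return sum(score_item(item, answer) for item, answer in responses.items())

# Example: strongly agreeing with item 1 (a forward item) scores 2 points, while
# strongly disagreeing with item 3 (a reversal item) also scores 2 points.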


Appendix 5

How the 36 FEED videos were coded for the six subjects and emotions

F1 – FEEDF61 = Surprise (items 26/27); FEEDF65 = Sad (28/29); FEEDF63 = Happy (30/31); FEEDF66 = Fear (32/33); FEEDF62 = Disgust (34/35); FEEDF64 = Anger (36/37)

F2 – FEEDF53 = Surprise (38/39); FEEDF55 = Sad (40/41); FEEDF51 = Happy (42/43); FEEDF54 = Fear (44/45); FEEDF56 = Disgust (46/47); FEEDF52 = Anger (48/49)

M1 – FEEDM42 = Surprise (50/51); FEEDM41 = Sad (52/53); FEEDM44 = Happy (54/55); FEEDM45 = Fear (56/57); FEEDM46 = Disgust (58/59); FEEDM43 = Anger (60/61)

M2 – FEEDM32 = Surprise (62/63); FEEDM36 = Sad (64/65); FEEDM33 = Happy (66/67); FEEDM31 = Fear (68/69); FEEDM34 = Disgust (70/71); FEEDM35 = Anger (72/73)

F3 – FEEDF22 = Surprise (74/75); FEEDF23 = Sad (76/77); FEEDF25 = Happy (78/79); FEEDF26 = Fear (80/81); FEEDF21 = Disgust (82/83); FEEDF24 = Anger (84/85)

M3 – FEEDM16 = Surprise (86/87); FEEDM15 = Sad (88/89); FEEDM14 = Happy (90/91); FEEDM13 = Fear (92/93); FEEDM11 = Anger (94/95); FEEDM12 = Disgust (96/97)
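For illustration only, the coding scheme above can be represented as a simple lookup table. The Python sketch below was not part of the study's materials: the name FEED_CODING is hypothetical, only a subset of the 36 clips is written out, and it assumes (following the two-step procedure described in Appendix 3b) that the first number of each pair is the identification item and the second the resonance item.

# Illustrative sketch only - not the study's materials.
# Each FEED clip maps to (subject, target emotion, (identification item, resonance item)).
# Assumption: the first item of each pair records the identification response and
# the second the resonance rating, per the two-step procedure in Appendix 3b.

FEED_CODING = {
    "FEEDF61": ("F1", "surprise", (26, 27)),
    "FEEDF65": ("F1", "sad",      (28, 29)),
    "FEEDF63": ("F1", "happy",    (30, 31)),
    # ... the remaining clips follow the same pattern ...
    "FEEDM12": ("M3", "disgust",  (96, 97)),
}

def items_for_emotion(emotion):
    """Return the (identification, resonance) item pairs for one target emotion."""
    return [items for (subject, emo, items) in FEED_CODING.values() if emo == emotion]

# Example: items_for_emotion("surprise") -> [(26, 27)] for the subset shown above.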


Appendix 6a

Example of Pages with Video and Resonance Bar

Example 1: F1 - Surprise


Appendix 6b

Example 2: M3 – Fear


Appendix 7

Embedded Debrief with Cute Picture as Mood Counter-Measure

Thank you for participating in this survey!

This experiment was designed to explore the relationship between an individual's self-referential feelings and beliefs about their own level of empathy, how accurate they are at identifying emotions through facial expressions, and their level of emotional resonance when doing so.

If you wish to withdraw your data from the study, you can do so up until February 17th 2012, by contacting me on: [email protected] and quoting your Unique Identifier. 

Data will be stored until August 2012.

If you would like a copy of the final report please contact me on [email protected].

There are no known risks in taking part in this type of study, although if you feel particularly uncomfortable or upset by anything in this study please contact your doctor and / or your local counselling services.

If you have any queries, please do not hesitate to contact me at [email protected] or alternatively you can contact my project supervisor, Phil Banyard: [email protected]

+44 (0)115 848 5585

IMPORTANT - NTU STUDENTS: In order to obtain your Research Credits you will need to email me on [email protected], & quote your

Unique Identifier & N-Number.
