Jonathan Haidt - The New Synthesis in Moral Psychology

Jonathan Haidt, "The New Synthesis in Moral Psychology," Science 316, 998 (2007). DOI: 10.1126/science.1137651

Updated information and services, including high-resolution figures, are available in the online version of this article at http://www.sciencemag.org/cgi/content/full/316/5827/998 (information current as of October 15, 2008). Supporting Online Material: http://www.sciencemag.org/cgi/content/full/316/5827/998/DC1. Related content: http://www.sciencemag.org/cgi/content/full/316/5827/998#related-content. This article cites 22 articles, 3 of which can be accessed for free (http://www.sciencemag.org/cgi/content/full/316/5827/998#otherarticles), has been cited by 14 article(s) on the ISI Web of Science and by 3 articles hosted by HighWire Press, and appears in the Psychology subject collection (http://www.sciencemag.org/cgi/collection/psychology). Information about obtaining reprints or permission to reproduce this article in whole or in part can be found at http://www.sciencemag.org/about/permissions.dtl. Science (print ISSN 0036-8075; online ISSN 1095-9203) is published weekly, except the last week in December, by the American Association for the Advancement of Science, 1200 New York Avenue NW, Washington, DC 20005. Copyright 2007 by the American Association for the Advancement of Science; all rights reserved. The title Science is a registered trademark of AAAS.


People are selfish, yet morally motivated. Morality is universal, yet culturally variable. Such apparent contradictions are dissolving as research from many disciplines converges on a few shared principles, including the importance of moral intuitions, the socially functional (rather than truth-seeking) nature of moral thinking, and the coevolution of moral minds with cultural practices and institutions that create diverse moral communities. I propose a fourth principle to guide future research: Morality is about more than harm and fairness. More research is needed on the collective and religious parts of the moral domain, such as loyalty, authority, and spiritual purity.


If you ever become a contestant on an unusually erudite quiz show, and you are asked to explain human behavior in two seconds or less, you might want to say "self-interest." After all, economic models that assume only a motive for self-interest perform reasonably well. However, if you have time to give a more nuanced answer, you should also discuss the moral motives addressed in Table 1. Try answering those questions now. If your total for column B is higher than your total for column A, then congratulations, you are Homo moralis, not Homo economicus. You have social motivations beyond direct self-interest, and the latest research in moral psychology can help explain why.
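The scoring rule above is simple arithmetic and can be sketched in a few lines (a minimal illustration; the function name and all dollar figures are hypothetical, not from the article):

```python
# Sketch of the Table 1 scoring rule: a respondent writes down the minimum
# payment for each action in column A (benign versions) and column B
# (morally loaded versions); a higher column B total signals moral motives.
def classify_respondent(column_a, column_b):
    """Return 'Homo moralis' if the column B total exceeds the column A total,
    otherwise 'Homo economicus'. Inputs are lists of five dollar amounts."""
    return "Homo moralis" if sum(column_b) > sum(column_a) else "Homo economicus"

# Hypothetical respondent who demands large premiums for the column B actions:
print(classify_respondent([10, 50, 20, 100, 40], [10000, 5000, 500, 2000, 1000]))
# Prints: Homo moralis
```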

In 1975, E. O. Wilson (1) predicted that ethics would soon be incorporated into the "new synthesis" of sociobiology. Two psychological theories of his day were ethical behaviorism (values are learned by reinforcement) and the cognitive-developmental theory of Lawrence Kohlberg (social experiences help children construct an increasingly adequate understanding of justice). Wilson believed that these two theories would soon merge with research on the hypothalamic-limbic system, which he thought supported the moral emotions, to provide a comprehensive account of the origins and mechanisms of morality.

As it turned out, Wilson got the ingredients wrong. Ethical behaviorism faded with behaviorism. Kohlberg's approach did grow to dominate moral psychology for the next 15 years, but because Kohlberg focused on conscious verbal reasoning, Kohlbergian psychology forged its interdisciplinary links with philosophy and education, rather than with biology as Wilson had hoped. And finally, the hypothalamus was found to play little role in moral judgment.

Despite these errors in detail, Wilson got the big picture right. The synthesis began in the 1990s with a new set of ingredients, and it has transformed the study of morality today. Wilson was also right that the key link between the social and natural sciences was the study of emotion and the "emotive centers" of the brain. A quantitative analysis of the publication database in psychology shows that research on morality and emotion grew steadily in the 1980s and 1990s (relative to other topics), and then grew very rapidly in the past 5 years (fig. S1).

In this Review, I suggest that the key factor that catalyzed the new synthesis was the "affective revolution" of the 1980s: the increase in research on emotion that followed the "cognitive revolution" of the 1960s and 1970s. I describe three principles, each more than 100 years old, that were revived during the affective revolution. Each principle links together insights from several fields, particularly social psychology, neuroscience, and evolutionary theory. I conclude with a fourth principle that I believe will be the next step in the synthesis.

Principle 1: Intuitive Primacy (but Not Dictatorship)

Kohlberg thought of children as budding moral philosophers, and he studied their reasoning as they struggled with moral dilemmas (e.g., Should a man steal a drug to save his wife's life?). But in recent years, the importance of moral reasoning has been questioned as social psychologists have increasingly embraced a version of the "affective primacy" principle, articulated in the 1890s by Wilhelm Wundt and greatly expanded in 1980 by Robert Zajonc (2). Zajonc reviewed evidence that the human mind is composed of an ancient, automatic, and very fast affective system and a phylogenetically newer, slower, and motivationally weaker cognitive system. Zajonc's basic point was that brains are always and automatically evaluating everything they perceive, and that higher-level human thinking is preceded, permeated, and influenced by affective reactions (simple feelings of like and dislike) which push us gently (or not so gently) toward approach or avoidance.

Evolutionary approaches to morality generally suggest affective primacy. Most propose that the building blocks of human morality are emotional (3, 4) (e.g., sympathy in response to suffering, anger at nonreciprocators, affection for kin and allies) and that some early forms of these building blocks were already in place before the hominid line split off from that of Pan 5 to 7 million years ago (5). Language and the ability to engage in conscious moral reasoning came much later, perhaps only in the past 100 thousand years, so it is implausible that the neural mechanisms that control human judgment and behavior were suddenly rewired to hand control of the organism over to this new deliberative faculty.

Social-psychological research strongly supports Zajonc's claims about the speed and ubiquity of affective reactions (6). However, many have objected to the contrast of "affect" and "cognition," which seems to imply that affective reactions don't involve information processing or computation of any kind. Zajonc did not say that, but to avoid ambiguity I have drawn on the work of Bargh (7) to argue that the most useful contrast for moral psychology is between two kinds of cognition: moral intuition and moral reasoning (8). Moral intuition refers to fast, automatic, and (usually) affect-laden processes in which an evaluative feeling of good-bad or like-dislike (about the actions or character of a person) appears in consciousness without any awareness of having gone through steps of search, weighing evidence, or inferring a conclusion. Moral reasoning, in contrast, is a controlled and "cooler" (less affective) process; it is conscious mental activity that consists of transforming information about people and their actions in order to reach a moral judgment or decision.

My attempt to illustrate the new synthesis in moral psychology is the Social Intuitionist Model (8), which begins with the intuitive primacy principle. When we think about sticking a pin into a child's hand, or we hear a story about a person slapping her father, most of us have an automatic intuitive reaction that includes a flash of negative affect. We often engage in conscious verbal reasoning too, but this controlled process can occur only after the first automatic process has run, and it is often influenced by the initial moral intuition. Moral reasoning, when it occurs, is usually a post-hoc process in which we search for evidence to support our initial intuitive reaction.

Evidence that this sequence of events is the standard or default sequence comes from studies indicating that (i) people have nearly instant implicit reactions to scenes or stories of moral violations (9); (ii) affective reactions are usually good predictors of moral judgments and behaviors (10, 11); (iii) manipulating emotional reactions, such as through hypnosis, can alter moral judgments (12); and (iv) people can sometimes be "morally dumbfounded": they can know intuitively that something is wrong, even when they cannot explain why (8, 13). Furthermore, studies of everyday reasoning (14) demonstrate that people generally begin reasoning by setting out to confirm their initial hypothesis. They rarely seek disconfirming evidence, and are quite good at finding support for whatever they want to believe (15).

Department of Psychology, University of Virginia, Charlottesville, VA 22904, USA. E-mail: [email protected]

18 MAY 2007 VOL 316 SCIENCE www.sciencemag.org 998

REVIEWS


The importance of affect-laden intuitions is a central theme of neuroscientific work on morality. Damasio (16) found that patients who had sustained damage to certain areas of the prefrontal cortex retained their "cognitive" abilities by most measures, including IQ and explicit knowledge of right and wrong, but they showed massive emotional deficits, and these deficits crippled their judgment and decision-making. They lost the ability to feel the normal flashes of affect that the rest of us feel when we simply hear the words "slap your father." They lost the ability to use their bodies, or at least to integrate input from brain areas that map bodily reactions, to feel what they would actually feel if they were in a given situation. Later studies of moral judgment have confirmed the importance of areas of the medial prefrontal cortex, including ventromedial prefrontal cortex and the medial frontal gyrus (17, 18). These areas appear to be crucial for integrating affect (including expectations of reward and punishment) into decisions and plans. Other areas that show up frequently in functional magnetic resonance imaging studies include the amygdala and the frontal insula (9, 11, 16). These areas seem to be involved in sounding a kind of alarm, and for then "tilting the pinball machine," as it were, to push subsequent processing in a particular direction.

Affective reactions push, but they do not absolutely force. We can all think of times when we deliberated about a decision and went against our first (often selfish) impulse, or when we changed our minds about a person. Greene et al. (19) caught the brain in action overriding its initial intuitive response. They created a class of difficult dilemmas, for example: Would you smother your own baby if it was the only way to keep her from crying and giving away your hiding place to the enemy soldiers looking for you, who would then kill the whole group of you hiding in the basement? Subjects were slow to respond to cases like these and, along the way, exhibited increased activity in the anterior cingulate cortex, a brain region that responds to internal conflict. Some subjects said "yes" to cases like these, and they exhibited increased activity in the dorsolateral prefrontal cortex, suggesting that they were doing additional processing and overriding their initial flash of horror.

There are at least three ways we can override our immediate intuitive responses. We can use conscious verbal reasoning, such as considering the costs and benefits of each course of action. We can reframe a situation and see a new angle or consequence, thereby triggering a second flash of intuition that may compete with the first. And we can talk with people who raise new arguments, which then trigger in us new flashes of intuition followed by various kinds of reasoning. The social intuitionist model includes separate paths for each of these three ways of changing one's mind, but it says that the first two paths are rarely used, and that most moral change happens as a result of social interaction. Other people often influence us, in part by presenting the counterevidence we rarely seek out ourselves. Some researchers believe, however, that private, conscious verbal reasoning is either the ultimate authority or at least a frequent contributor to our moral judgments and decisions (19–21). There are at present no data on how people revise their initial judgments in everyday life (outside the lab), but we can look more closely at research on reasoning in general. What role is reasoning fit to play?

Principle 2: (Moral) Thinking Is for (Social) Doing

During the cognitive revolution, many psychologists adopted the metaphor that people are "intuitive scientists" who analyze the evidence of everyday experience to construct internal representations of reality. In the past 15 years, however, many researchers have rediscovered William James' pragmatist dictum that "thinking is for doing." According to this view, moral reasoning is not like that of an idealized scientist or judge seeking the truth, which is often useful; rather, moral reasoning is like that of a lawyer or politician seeking whatever is useful, whether or not it is true.

One thing that is always useful is an explanation of what you just did. People in all societies gossip, and the ability to track reputations and burnish one's own is crucial in most recent accounts of the evolution of human morality (22, 23). The first rule of life in a dense web of gossip is: Be careful what you do. The second rule is: What you do matters less than what people think you did, so you'd better be able to frame your actions in a positive light. You'd better be a good "intuitive politician" (24). From this social-functionalist perspective, it is not surprising that people are generally more accurate in their predictions of what others will do than in their (morally rosier) predictions about what they themselves will do (25), and it is not

Table 1. What’s your price? Write in the minimum amount that someone would have to pay you (anonymously andsecretly) to convince you to do these 10 actions. For each one, assume there will be no social, legal, or materialconsequences to you afterward.Homoeconomicuswould prefer the option in columnB to the option in columnA for action1 and would bemore or less indifferent to the other four pairs. In contrast, a person withmoral motives would (on average)require a larger payment to engage in the actions in column B and would feel dirty or degraded for engaging in some ofthese actions for personal enrichment. These particular actions were generated to dramatize moral motives, but they alsoillustrate the five-foundations theory of intuitive ethics (41, 42).

How much money would it take to get you to...

Column A Column B Moralcategory

1) Stick a pin into your palm.

$___

Stick a pin into the palm of a child youdon’t know.

$___

Harm/care

2) Accept a plasma screen television that afriend of yours wants to give you. Youknow that your friend got the television ayear ago when the company that made itsent it, by mistake and at no charge, toyour friend.

$___

Accept a plasma screen television that afriend of yours wants to give you. Youknow that your friend bought the TV ayear ago from a thief who had stolen itfrom a wealthy family.

$___

Fairness/reciprocity

3) Say something slightly bad about yournation (which you don’t believe to betrue) while calling in, anonymously, to atalk-radio show in your nation.

$___

Say something slightly bad about yournation (which you don’t believe to betrue) while calling in, anonymously, toa talk-radio show in a foreign nation.

$___

Ingroup/loyalty

4) Slap a friend in the face (with his/herpermission) as part of a comedy skit.

$___

Slap your father in the face (with hispermission) as part of a comedy skit.

$___

Authority/respect

5) Attend a performance art piece in whichthe actors act like idiots for 30 min,including failing to solve simpleproblems and falling down repeatedly onstage.

$___

Attend a performance art piece inwhich the actors act like animals for 30min, including crawling aroundnaked and urinating on stage.

$___

Purity/sanctity

Total for column A: $___ Total for column B: $___


surprising that people so readily invent and confidently tell stories to explain their own behaviors (26). Such "confabulations" are often reported in neuroscientific work; when brain damage or surgery creates bizarre behaviors or beliefs, the patient rarely says "Gosh, why did I do that?" Rather, the patient's "interpreter module" (27) struggles heroically to weave a story that is then offered confidently to others. Moral reasoning is often like the press secretary for a secretive administration, constantly generating the most persuasive arguments it can muster for policies whose true origins and goals are unknown (8, 28).

The third rule of life in a web of gossip is: Be prepared for other people's attempts to deceive and manipulate you. The press secretary's pronouncements usually contain some useful information, so we attend to them, but we don't take them at face value. We easily switch into "intuitive prosecutor" mode (24), using our reasoning capacities to challenge people's excuses and to seek out, or fabricate, evidence against people we don't like. Thalia Wheatley and I (12) recently created prosecutorial moral confabulations by giving hypnotizable subjects a post-hypnotic suggestion that they would feel a flash of disgust whenever they read a previously neutral word ("take" for half the subjects; "often" for the others). We then embedded one of those two words in six short stories about moral violations (e.g., accepting bribes or eating one's dead pet dog) and found that stories that included the disgust-enhanced word were condemned more harshly than those that had no such flash.

To test the limiting condition of this effect, we included one story with no wrongdoing, about Dan, a student council president, who organizes faculty-student discussions. The story included one of two versions of this sentence: "He [tries to take]/[often picks] topics that appeal to both professors and students in order to stimulate discussion." We expected that subjects who felt a flash of disgust while reading this sentence would condemn Dan (intuitive primacy), search for a justification (post-hoc reasoning), fail to find one, and then be forced to override their hypnotically induced gut feeling using controlled processes. Most did. But to our surprise, one third of the subjects in the hypnotic disgust condition (and none in the other) said that Dan's action was wrong to some degree, and a few came up with the sort of post-hoc confabulations that Gazzaniga reported in some split-brain patients, such as "Dan is a popularity-seeking snob" or "It just seems like he's up to something." They invented reasons to make sense of their otherwise inexplicable feeling of disgust.

When we engage in moral reasoning, we are using relatively new cognitive machinery that was shaped by the adaptive pressures of life in a reputation-obsessed community. We are capable of using this machinery dispassionately, such as when we consider abstract problems with no personal ramifications. But the machinery itself was "designed" to work with affect, not free of it, and in daily life the environment usually obliges by triggering some affective response. But how did humans, and only humans, develop these gossipy communities in the first place?

Principle 3: Morality Binds and Builds

Nearly every treatise on the evolution of morality covers two processes: kin selection (genes for altruism can evolve if altruism is targeted at kin) and reciprocal altruism (genes for altruism can evolve if altruism and vengeance are targeted at those who do and don't return favors, respectively). But several researchers have noted that these two processes cannot explain the extraordinary degree to which people cooperate with strangers they'll never meet again and sacrifice for large groups composed of nonkin (23, 29). There must have been additional processes at work, and the study of these processes, especially those that unite cultural and evolutionary thinking, is an exciting part of the new synthesis. The unifying principle, I suggest, is the insight of the sociologist Emile Durkheim (30) that morality binds and builds; it constrains individuals and ties them to each other to create groups that are emergent entities with new properties.

A moral community has a set of shared norms about how members ought to behave, combined with means for imposing costs on violators and/or channeling benefits to cooperators. A big step in modeling the evolution of such communities is the extension of reciprocal altruism by "indirect reciprocity" (31) in which virtue pays by improving one's reputation, which elicits later cooperation from others. Reputation is a powerful force for strengthening and enlarging moral communities (as users of ebay.com know). When repeated-play behavioral economics games allow players to know each other's reputations, cooperation rates skyrocket (29). Evolutionary models show that indirect reciprocity can solve the problem of free-riders (which doomed simpler models of altruism) in moderately large groups (32), as long as people have access to information about reputations (e.g., gossip) and can then engage in low-cost punishment such as shunning.
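The logic of reputation-based shunning can be illustrated with a toy simulation. This is a sketch under simplifying assumptions, not any of the cited models: the function name, the standing-update rule (refusing someone already in bad standing is "justified" and costless), and all parameter values are choices made here for illustration.

```python
import random

def simulate(rounds=4000, n=50, seed=1):
    """Toy indirect-reciprocity model. Half the agents are cooperators, half
    free-riders. Each round a random donor may help a random recipient at a
    cost of 1, giving the recipient a benefit of 3. Agents shun (refuse to
    help) anyone in bad standing, so free-riders are cut off from help at
    negligible cost to the punisher."""
    random.seed(seed)
    cooperator = [i < n // 2 for i in range(n)]
    good_standing = [True] * n
    payoff = [0.0] * n
    for _ in range(rounds):
        donor, recipient = random.sample(range(n), 2)
        helps = cooperator[donor] and good_standing[recipient]
        # Standing update: helping keeps you in good standing; so does
        # refusing a recipient who is already in bad standing.
        good_standing[donor] = helps or not good_standing[recipient]
        if helps:
            payoff[donor] -= 1
            payoff[recipient] += 3
    coop = sum(p for p, c in zip(payoff, cooperator) if c) / (n // 2)
    free = sum(p for p, c in zip(payoff, cooperator) if not c) / (n - n // 2)
    return coop, free

coop_mean, free_mean = simulate()
```

With reputation information available, cooperators end up outearning free-riders; remove the reputation check (treat every recipient as in good standing) and free-riders collect benefits while paying no costs, which is the free-rider problem the cited models address.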

However the process began, early humans sometimes found ways to solve the free-rider problem and to live in larger cooperative groups. In so doing, they may have stepped through a major transition in evolutionary history (33). From prokaryotes to eukaryotes, from single-celled organisms to plants and animals, and from individual animals to hives, colonies, and cooperative groups, the simple rules of Darwinian evolution never change, but the complex game of life changes when radically new kinds of players take the field. Ant colonies are a kind of super-organism whose proliferation has altered the ecology of our planet. Ant colonies compete with each other, and group selection therefore shaped ant behavior and made ants extraordinarily cooperative within their colonies. However, biologists have long resisted the idea that group selection contributed to human altruism because human groups do not restrict breeding to a single queen or breeding pair. Genes related to altruism for the good of the group are therefore vulnerable to replacement by genes related to more selfish free-riding strategies. Human group selection was essentially declared off-limits in 1966 (34).

In the following decades, however, several theorists realized that human groups engage in cultural practices that modify the circumstances under which genes are selected. Just as a modified gene for adult lactose tolerance evolved

[Fig. 1 plots relevance ratings (y axis: "How relevant to moral judgment?" 1 = never, 6 = always) for the Harm, Fairness, Ingroup, Authority, and Purity foundations against political self-identification (x axis: very liberal, liberal, slightly liberal, moderate, slightly conservative, conservative, very conservative).]

Fig. 1. Liberal versus conservative moral foundations. Responses to 15 questions about which considerations are relevant to deciding "whether something is right or wrong." Those who described themselves as "very liberal" gave the highest relevance ratings to questions related to the Harm/Care and Fairness/Reciprocity foundations and gave the lowest ratings to questions about the Ingroup/Loyalty, Authority/Respect, and Purity/Sanctity foundations. The more conservative the participant, the more the first two foundations decrease in relevance and the last three increase [n = 2811; data aggregated from two web surveys, partially reported in (41)]. All respondents were citizens of the United States. Data for 476 citizens of the United Kingdom show a similar pattern. The survey can be taken at www.yourmorals.org.


in tandem with cultural practices of raising dairy cows, so modified genes for moral motives may have evolved in tandem with cultural practices and institutions that rewarded group-beneficial behaviors and punished selfishness. Psychological mechanisms that promote uniformity within groups and maintain differences across groups create conditions in which group selection can occur, both for cultural traits and for genes (23, 35). Even if groups vary little or not at all genetically, groups that develop norms, practices, and institutions that elicit more group-beneficial behavior can grow, attract new members, and replace less cooperative groups. Furthermore, preagricultural human groups may have engaged in warfare often enough that group selection altered gene frequencies as well as cultural practices (36). Modified genes for extreme group solidarity during times of conflict may have evolved in tandem with cultural practices that led to greater success in war.

Humans attain their extreme group solidarity by forming moral communities within which selfishness is punished and virtue rewarded. Durkheim believed that gods played a crucial role in the formation of such communities. He saw religion as "a unified system of beliefs and practices relative to sacred things, that is to say, things set apart and forbidden—beliefs and practices which unite into one single moral community called a church, all those who adhere to them" (30). D. S. Wilson (35) has argued that the coevolution of religions and religious minds created conditions in which multilevel group selection operated, transforming the older morality of small groups into a more tribal form that could unite larger populations. As with ants, group selection greatly increased cooperation within the group, but in part for the adaptive purpose of success in conflict between groups.

Whatever the origins of religiosity, nearly all religions have culturally evolved complexes of practices, stories, and norms that work together to suppress the self and connect people to something beyond the self. Newberg (37) found that religious experiences often involve decreased activity in brain areas that maintain maps of the self's boundaries and position, consistent with widespread reports that mystical experiences involve feelings of merging with God or the universe. Studies of ritual, particularly those involving the sort of synchronized motor movements common in religious rites, indicate that such rituals serve to bind participants together in what is often reported to be an ecstatic state of union (38). Recent work on mirror neurons indicates that, whereas such neurons exist in other primates, they are much more numerous in human beings, and they serve to synchronize our feelings and movements with those of others around us (39). Whether people use their mirror neurons to feel another's pain, enjoy a synchronized dance, or bow in unison toward Mecca, it is clear that we are prepared, neurologically, psychologically, and culturally, to link our consciousness, our emotions, and our motor movements with those of other people.

Principle 4: Morality Is About More Than Harm and Fairness

If I asked you to define morality, you'd probably say it has something to do with how people ought to treat each other. Nearly every research program in moral psychology has focused on one of two aspects of interpersonal treatment: (i) harm, care, and altruism (people are vulnerable and often need protection) or (ii) fairness, reciprocity, and justice (people have rights to certain resources or kinds of treatment). These two topics bear a striking match to the two evolutionary mechanisms of kin selection (which presumably made us sensitive to the suffering and needs of close kin) and reciprocal altruism (which presumably made us exquisitely sensitive to who deserves what). However, if group selection did reshape human morality, then there might be a kind of tribal overlay (23), a coevolved set of cultural practices and moral intuitions, that is not about how to treat other individuals but about how to be a part of a group, especially a group that is competing with other groups.

In my cross-cultural research, I have found that the moral domain of educated Westerners is narrower, more focused on harm and fairness, than it is elsewhere. Extending a theory from cultural psychologist Richard Shweder (40), Jesse Graham, Craig Joseph, and I have suggested that there are five psychological foundations, each with a separate evolutionary origin, upon which human cultures construct their moral communities (41, 42). In addition to the harm and fairness foundations, there are also widespread intuitions about ingroup-outgroup dynamics and the importance of loyalty; there are intuitions about authority and the importance of respect and obedience; and there are intuitions about bodily and spiritual purity and the importance of living in a sanctified rather than a carnal way. And it's not just members of traditional societies who draw on all five foundations; even within Western societies, we consistently find an ideological effect in which religious and cultural conservatives value and rely upon all five foundations, whereas liberals value and rely upon the harm and fairness foundations primarily (Fig. 1 and Table 1).

Research on morality beyond harm and fairness is in its infancy; there is much to be learned. We know what parts of the brain are active when people judge stories about runaway trolleys and unfair divisions of money. But what happens when people judge stories about treason, disrespect, or gluttony? We know how children develop an ethos of caring and of justice. But what about the development of patriotism, respect for tradition, and a sense of sacredness? There is some research on these questions, but it is not yet part of the new synthesis, which has focused on issues related to harm and fairness.

In conclusion, if the host of that erudite quiz show were to allow you 60 seconds to explain human behavior, you might consider saying the following: People are self-interested, but they also care about how they (and others) treat people, and how they (and others) participate in groups. These moral motives are implemented in large part by a variety of affect-laden intuitions that arise quickly and automatically and then influence controlled processes such as moral reasoning. Moral reasoning can correct and override moral intuition, though it is more commonly performed in the service of social goals as people navigate their gossipy worlds. Yet even though morality is partly a game of self-promotion, people do sincerely want peace, decency, and cooperation to prevail within their groups. And because morality may be as much a product of cultural evolution as genetic evolution, it can change substantially in a generation or two. For example, as technological advances make us more aware of the fate of people in faraway lands, our concerns expand and we increasingly want peace, decency, and cooperation to prevail in other groups, and in the human group as well.

References and Notes
1. E. O. Wilson, Sociobiology (Harvard Univ. Press, Cambridge, MA, 1975).
2. R. B. Zajonc, Am. Psychol. 35, 151 (1980).
3. R. L. Trivers, Q. Rev. Biol. 46, 35 (1971).
4. M. Hauser, Moral Minds: How Nature Designed our Universal Sense of Right and Wrong (HarperCollins, New York, 2006).
5. J. C. Flack, F. B. M. de Waal, in Evolutionary Origins of Morality, L. D. Katz, Ed. (Imprint Academic, Thorverton, UK, 2000), pp. 1–29.
6. R. H. Fazio, D. M. Sanbonmatsu, M. C. Powell, F. R. Kardes, J. Pers. Soc. Psychol. 50, 229 (1986).
7. J. A. Bargh, T. L. Chartrand, Am. Psychol. 54, 462 (1999).
8. J. Haidt, Psychol. Rev. 108, 814 (2001).
9. Q. Luo et al., Neuroimage 30, 1449 (2006).
10. C. D. Batson, Adv. Exp. Soc. Psych. 20, 65 (1987).
11. A. G. Sanfey, J. K. Rilling, J. A. Aronson, L. E. Nystrom, J. D. Cohen, Science 300, 1755 (2003).
12. T. Wheatley, J. Haidt, Psychol. Sci. 16, 780 (2005).
13. F. Cushman, L. Young, M. Hauser, Psychol. Sci. 17, 1082 (2006).
14. D. Kuhn, The Skills of Argument (Cambridge Univ. Press, Cambridge, 1991).
15. Z. Kunda, Psychol. Bull. 108, 480 (1990).
16. A. Damasio, Looking for Spinoza (Harcourt, Orlando, FL, 2003).
17. J. D. Greene, R. B. Sommerville, L. E. Nystrom, J. M. Darley, J. D. Cohen, Science 293, 2105 (2001).
18. J. Greene, J. Haidt, Trends Cogn. Sci. 6, 517 (2002).
19. J. D. Greene, L. E. Nystrom, A. D. Engell, J. M. Darley, J. D. Cohen, Neuron 44, 389 (2004).
20. D. A. Pizarro, P. Bloom, Psychol. Rev. 110, 193 (2003).
21. E. Turiel, in Handbook of Child Psychology, W. Damon, Ed. (Wiley, New York, ed. 6, 2006).
22. R. Dunbar, Grooming, Gossip, and the Evolution of Language (Harvard Univ. Press, Cambridge, MA, 1996).
23. P. J. Richerson, R. Boyd, Not by Genes Alone: How Culture Transformed Human Evolution (Univ. of Chicago Press, Chicago, IL, 2005).
24. P. E. Tetlock, Psychol. Rev. 109, 451 (2002).
25. N. Epley, D. Dunning, J. Pers. Soc. Psychol. 79, 861 (2000).
26. R. E. Nisbett, T. D. Wilson, Psychol. Rev. 84, 231 (1977).
27. M. S. Gazzaniga, The Social Brain (Basic Books, New York, 1985).

www.sciencemag.org SCIENCE VOL 316 18 MAY 2007 1001

REVIEWS



28. T. D. Wilson, Strangers to Ourselves: Discovering the Adaptive Unconscious (Belknap Press, Cambridge, MA, 2002).
29. E. Fehr, J. Henrich, in Genetic and Cultural Evolution of Cooperation, P. Hammerstein, Ed. (MIT Press, Cambridge, MA, 2003).
30. E. Durkheim, The Elementary Forms of the Religious Life (1915; reprint, The Free Press, New York, 1965).
31. M. A. Nowak, K. Sigmund, Nature 437, 1291 (2005).
32. K. Panchanathan, R. Boyd, Nature 432, 499 (2004).
33. J. Maynard Smith, E. Szathmary, The Major Transitions in Evolution (Oxford Univ. Press, Oxford, UK, 1997).
34. G. C. Williams, Adaptation and Natural Selection: A Critique of Some Current Evolutionary Thought (Princeton Univ. Press, Princeton, NJ, 1966).
35. D. S. Wilson, Darwin's Cathedral: Evolution, Religion, and the Nature of Society (Univ. of Chicago Press, Chicago, IL, 2002).
36. S. Bowles, Science 314, 1569 (2006).
37. A. Newberg, E. D'Aquili, V. Rause, Why God Won't Go Away: Brain Science and the Biology of Belief (Ballantine, New York, 2001).
38. W. H. McNeill, Keeping Together in Time: Dance and Drill in Human History (Harvard Univ. Press, Cambridge, MA, 1995).
39. V. Gallese, C. Keysers, G. Rizzolatti, Trends Cogn. Sci. 8, 396 (2004).
40. R. A. Shweder, N. C. Much, M. Mahapatra, L. Park, in Morality and Health, A. Brandt, P. Rozin, Eds. (Routledge, New York, 1997), pp. 119–169.
41. J. Haidt, J. Graham, Soc. Justice Res., in press.
42. J. Haidt, C. Joseph, in The Innate Mind, P. Carruthers, S. Laurence, S. Stich, Eds. (Oxford Univ. Press, New York, in press), vol. 3.
43. I thank D. Batson, R. Boyd, D. Fessler, J. Graham, J. Greene, M. Hauser, D. Wegner, D. Willingham, and D. S. Wilson for helpful comments and corrections.

Supporting Online Material
www.sciencemag.org/cgi/content/full/316/5827/998/DC1
Figs. S1 and S2
References
10.1126/science.1137651

Embodying Emotion

Paula M. Niedenthal*

Recent theories of embodied cognition suggest new ways to look at how we process emotional information. The theories suggest that perceiving and thinking about emotion involve perceptual, somatovisceral, and motoric reexperiencing (collectively referred to as "embodiment") of the relevant emotion in one's self. The embodiment of emotion, when induced in human participants by manipulations of facial expression and posture in the laboratory, causally affects how emotional information is processed. Congruence between the recipient's bodily expression of emotion and the sender's emotional tone of language, for instance, facilitates comprehension of the communication, whereas incongruence can impair comprehension. Taken all together, recent findings provide a scientific account of the familiar contention that "when you're smiling, the whole world smiles with you."

Here is a thought experiment: A man goes into a bar to tell a new joke. Two people are already in the bar. One is smiling and one is frowning. Who is more likely to "get" the punch line and appreciate his joke? Here is another: Two women are walking over a bridge. One is afraid of heights, so her heart pounds and her hands tremble. The other is not afraid at all. On the other side of the bridge, they encounter a man. Which of the two women is more likely to believe that she has just met the man of her dreams?

You probably guessed that the first person of the pair described in each problem was the right answer. Now consider the following experimental findings:

1) While adopting either a conventional working posture or one of two so-called ergonomic postures, in which the back was straight and the shoulders were held high and back or in which the shoulders and head were slumped, experimental participants learned that they had succeeded on an achievement test completed earlier. Those who received the good news in the slumped posture felt less proud and reported being in a worse mood than participants in the upright or working posture (1).

2) Images that typically evoke emotionally "positive" and "negative" responses were presented on a computer screen. Experimental participants were asked to indicate when a picture appeared by quickly moving a lever. Some participants were instructed to push a lever away from their body, whereas others were told to pull a lever toward their body. Participants who pushed the lever away responded to negative images faster than to positive images, whereas participants who pulled the lever toward themselves responded faster to positive images (2).

3) Under the guise of studying the quality of different headphones, participants were induced either to nod in agreement or to shake their heads in disagreement. While they were "testing" their headphones with one of these two movements, the experimenter placed a pen on the table in front of them. Later, a different experimenter offered the participants the pen that had been placed on the table earlier or a novel pen. Individuals who were nodding their heads preferred the old pen, whereas participants who had been shaking their heads preferred the new one (3).

All of these studies show that there is a reciprocal relationship between the bodily expression of emotion and the way in which emotional information is attended to and interpreted (Fig. 1). Charles Darwin himself defined attitude as a collection of motor behaviors (especially posture) that conveys an organism's emotional response toward an object (4). Thus, it would not have come as any surprise to him that the human body is involved in the acquisition and use of attitudes and preferences. Indeed, one speculates that Darwin would be satisfied to learn that research reveals that (i) when individuals adopt emotion-specific postures, they report experiencing the associated emotions; (ii) when individuals adopt facial expressions or make emotional gestures, their preferences and attitudes are influenced; and (iii) when individuals' motor movements are inhibited, interference in the experience of emotion and processing of emotional information is observed (5). The causal relationship between embodying emotions, feeling emotional states,

Centre National de la Recherche Scientifique (CNRS) and University of Clermont-Ferrand, France. E-mail: [email protected]

*Present address: Laboratoire de Psychologie Sociale et Cognitive, Université Blaise Pascal, 34 Avenue Carnot, 63037 Clermont-Ferrand, France.

Fig. 1. Two ways in which facial expression has been manipulated in behavioral experiments. (Top) In order to manipulate contraction of the brow muscle in a simulation of negative affect, researchers have affixed golf tees to the inside of participants' eyebrows (42). Participants in whom negative emotion was induced were instructed to bring the ends of the golf tees together, as in the right panel. [Photo credit: Psychology Press] (Bottom) In other research, participants either held a pen between the lips to inhibit smiling, as in the left panel, or else held the pen between the teeth to facilitate smiling (39).
