IMPLICIT BIAS TASK FORCE
TOOLBOX POWERPOINT INSTRUCTION MANUAL
ABA SECTION OF LITIGATION
Implicit Bias Task Force 2011‐2012
Chair of the Section
Ronald L. Marmer
Jenner & Block
Chicago, IL
●
Co‐Chairs of the Task Force
The Honorable Delissa A. Ridgway
U.S. Court of International Trade
New York, New York
●
Charna E. Sherman
Attorney at Law – Partner
Charna E. Sherman Law Offices Co., LPA
Cleveland, OH
Implicit Bias Task Force 2010‐2011
Chair of the Section
Hilarie Bass
Global Operating Shareholder
GreenbergTraurig, LLP
Miami, Florida
●
Co‐Chairs of the Task Force
Joanne Epps
Dean, Temple University Beasley School of Law
Philadelphia, PA
●
The Honorable Delissa A. Ridgway
U.S. Court of International Trade
New York, New York
●
Charna E. Sherman
Attorney at Law – Partner
Charna E. Sherman Law Offices Co., LPA
Cleveland, OH
●
For questions or more information:
Professor Sarah Redfield, [email protected]
INTRODUCTION TO THE MANUAL
This manual is written to enable interested attorneys and professionals to offer
training on implicit bias on their own. Doing so requires planning. The
materials here, as well as the materials offered in the rest of the Section’s
Implicit Bias Toolbox and on the Implicit Bias website, should prove helpful. Those
with questions or wanting more background should contact Professor Redfield at
207‐752‐1721 (cell) or [email protected].
TABLE OF CONTENTS FOR THE MANUAL
TOOLBOX POWERPOINT INSTRUCTION MANUAL.........................................1
INTRODUCTION TO THE MANUAL.........................................................................3
TABLE OF CONTENTS ..............................................................................................3
SCOPE AND TIME..........................................................................................................4
THE CLE POWERPOINT OVERVIEW.....................................................................4
SLIDE BY SLIDE CHECKLIST...................................................................................4
A NOTE ON CLE.........................................................................................................7
OVERALL NOTES ON USING THE POWERPOINT ................................................7
SIGNIFICANCE OF ITALICS ....................................................................................7
NOTES VIEW ...............................................................................................................7
FULL CITATION .........................................................................................................7
TIME ..............................................................................................................................8
PREPARING FOR THE PRESENTATION...............................................................8
EVALUATIONS ...........................................................................................................8
SLIDE BY SLIDE INSTRUCTIONS ...............................................................................9
Opening Slide ...............................................................................................................9
Slides #2–#8 .................................................................................................................10
Slides #9–#21 ...............................................................................................................22
Slides #20–#28 .............................................................................................................47
Concluding Slide........................................................................................................64
IMPLICIT BIAS AND DEBIASING POWERPOINT REFERENCES (JULY 25,
2011).................................................................................................................................65
SCOPE AND TIME
There are two PowerPoints:
I. A 90 minute CLE based on a PowerPoint presentation and
supplementary materials that offer a general overview of implicit bias
and debiasing. (For those interested in a longer general version, with
additional attention to employment, email Professor Redfield at
[email protected].)
II. An alternative CLE based on a PowerPoint and supplementary
materials that focus on the Implicit Association Test (IAT) and the
work of The Honorable Delissa Ridgway and Professor Jeffrey
Rachlinski.
The following materials use PowerPoint I for the 90 minute CLE. The first chart
gives a summary overview, and the second gives a detailed slide‐by‐slide
bird’s‐eye view.
THE CLE POWERPOINT OVERVIEW
     Topic                                                        Slides   Time
I    Introduction, including short culture and cultural
     group exercise for introducing participants                  1‐8
II   Implicit Bias Explained and Tested (Redfield version)        9‐21
III  Debiasing                                                    22‐28    60 minutes (I‐III)
IV   Conclusion, including Section’s video, The Science and
     Implications of Implicit Bias, and program evaluations       29       30 minutes
SLIDE BY SLIDE CHECKLIST
The following chart shows the content type for each slide and marks areas where
discussion is called for (not that you can’t have more discussion). This should
facilitate choices based on time for your session.
Topic (Transition in Red) | Change Your Venue | Discussion Questions | Handouts | Notes Info
1  Title
2  Roadmap
3  Introductions
4  Intro Facilitator
5  Objectives
6  Norms
7  Intro Ex. Cultural Groups
8  5 Circle Ex.
9  Implicit Bias
10 Schemas
11 Professor Schema
12 Shorthand Schemas
13 Implicit Social Categories/Cognition
14 Implicit Bias Defined
15 Stroop Task
16 IAT
17 IAT (Internet)
18 Implicit/Explicit Bias
19 Systemic Concerns
20 Debiasing
21 Good News Club v. Milford Cent. Sch., 533 U.S. 98 (2001)
22 Debiasing Overview
23 Education (includes Internet clip)
24 Exposure
25 Approach
26 Approach: Stare not Blink
27 Change Process
28 Notice Your Messaging
29 Conclude (include Internet) and Evaluation
A NOTE ON CLE
We strongly encourage you to apply for continuing education credits if you
conduct training sessions using this curriculum (which should qualify for both
diversity and ethics credits). It has been acknowledged that “a great amount of
racial disparity exists in how the law treats individuals. [Continuing Legal
Education] has great potential to educate legal practitioners of this disparity and
the wide array of ways it is manifested in their profession.”1 For information on
obtaining CLE credit, visit https://www.clereg.org/.2
OVERALL NOTES ON USING THE POWERPOINT
SIGNIFICANCE OF ITALICS
Slides that you may particularly want to change to reflect your venue before final
distribution are provided in italics, but it is possible to use the slide deck exactly
as is. For reference to possible changes, see the Slide Summary Checklist, supra.
NOTES VIEW
The presentation is meant to be distributed in notes version so that references
and further research information are available to the participants. The additional
instructions in the slide‐by‐slide version here are for facilitators/presenters.
(“Notes” is a choice in PowerPoint for view and print.) (Click here for printable
version of handouts and here for printable version of slide‐by‐slide
instructions.)
FULL CITATION
1It has been acknowledged that “a great amount of racial disparity exists in how
the law treats individuals. [Continuing Legal Education] has great potential to
educate legal practitioners of this disparity and the wide array of ways it is
manifested in their profession.” However, “among the forty states that require
[Continuing Legal Education], only five require coursework addressing bias and
discrimination in the profession.” This gap in training can be filled by providing
training that complies with state requirements for those who may not otherwise
be motivated to learn more. Lorenzo Bowman et al., The Exclusion of Race from
Mandated Legal Education Requirements: A Critical Race Theory Analysis, 8 Seattle J.
for Soc. Just. 229, 229 (2009).
2 This description is from ABA BUILDING TRUST at Chapter 4.
Citation to sources and references for the PowerPoint is provided with these
materials and as a separate document. (Click here for printable version of the
handouts and click here for a printable version of just the references.)
TIME
Ninety minutes is an estimated time. The amount of time will depend, in part, on
the number of participants in your program or workshop. As written, the
presentation allows for mostly lecture and some discussion. However, the
amount of time allocated for a particular topic can be shortened or lengthened by
carefully selecting from among the suggested discussion questions. If one topic
in a particular unit is more important than another (based on your assessment of
your program’s specific needs or goals), you may choose to skip a few of the
slides or otherwise shorten the time spent on topics less important to your group
in order to maximize the time available for suggested discussion questions or
exercises on other points.3 (Click here for Slide Summary Checklist.)
PREPARING FOR THE PRESENTATION
In preparing for the presentation, facilitators will want to be familiar with the
PowerPoint and the information provided in the notes view for the slides (which
are intended to be in the handouts for participants), as well as the slide‐by‐slide
instructions offered here. (Click here for printable versionof instructions and here
for printable version of notes pages for handouts. The slide‐by‐slide entries in
the instructions include both presentation tips and additional substantive
background for most slides.
EVALUATIONS
We recommend that you use a written evaluation form to elicit feedback
regarding the participants’ learning experience. A sample evaluation is
included. Please share your evaluation results with the Section to help improve
this training. Results and comments can be emailed to Professor Sarah Redfield.

3 American Bar Association, ABA Criminal Justice Section et al., BUILDING
COMMUNITY TRUST: IMPROVING CROSS‐CULTURAL COMMUNICATION IN THE
CRIMINAL JUSTICE SYSTEM, Chapter 3, available at
http://www.americanbar.org/groups/criminal_justice/pages/buildingcommunity.html
[hereinafter “ABA BUILDING TRUST”].
SLIDE BY SLIDE INSTRUCTIONS
Opening Slide
Slide #1 Title Slide
Implicit Bias & Debiasing
ABA SECTION OF LITIGATION
1
Much of this PowerPoint is taken from or modeled after the excellent work of
other American Bar Association [ABA] sections developing training curriculum
around bias and related issues in the criminal justice and court system.
With enormous respect for their work, and gratitude for their generosity in
letting us incorporate and build from their materials, these materials follow the
approach of ABA Criminal Justice Section et al., Building Community Trust:
Improving Cross‐cultural Communication in the Criminal Justice System. The
topics in the ABA Building Trust materials are: Unit 1: Introduction; Unit 2:
Culture, Cultural Competency & the Criminal Justice System; Unit 3: Implicit
Bias; Unit 4: Unearned Privilege or Advantage; Unit 5: Micro‐Messages; Unit 6:
Systemic Disparities and Community Perceptions; Unit 7: Cross‐Cultural
Communication; Unit 8: Organizational Change; all available at
http://www.americanbar.org/groups/criminal_justice/pages/buildingcommunity.
html [hereinafter “ABA Building Trust”].
Additional notes for the facilitator/presenter—
You may want to change this to reflect your program title, date, and group.
Slides #2–#8
These slides are introductory and lay out the introductions and approach for
training using this PowerPoint.
Slide #2 Roadmap of Presentation
Roadmap of the Presentation
INTRODUCTIONS
CONCLUSIONS
IMPLICIT BIAS, including Implicit Association Test (IAT)
Significance for leaders of legal profession and system of justice
DEBIASING
2
Additional notes for the facilitator/presenter—
This slide reflects topics to be covered in a 90 minute CLE, 60 minutes for
the PowerPoint and 30 minutes for the Section’s video, The Science and
Implications of Implicit Bias, plus evaluation and concluding remarks. As
previously indicated, you may want to change this to reflect particular
interests of your group.
Slide #3 Introduction
INTRODUCTIONS
Facilitator/Faculty
Objectives & Norms
Cultural group introductions, concept + individual
3
*** Connections ***
This part of the program allows for individual introductions and also
introductions to groups (ingroups and outgroups). The introductions serve the
usual function of participants’ meeting each other, plus offer an underlying
approach for the rest of the materials on implicit bias.
Additional notes for the facilitator/presenter—
This slide transitions to the introductory group of slides; it lists the three preliminary
areas for presentation: facilitator intro; objectives & norms; and cultural group intros.
Show this for just a second and ask for questions or just click by it and
leave as part of the handout for participants’ organization of their notes.
Slide #4 Introduction of facilitator(s)
Facilitator Introduction
4
Additional notes for the facilitator/presenter—
Introduce yourself and any additional faculty members and welcome
participants to the session.
For your possible use:
Sample Introduction for Faculty4
Faculty role: “My role here today is as a facilitator of learning. This
means presenting information and facilitating our discussions in
such a way as to ensure that we achieve each session’s objectives,
and that we stick to the schedule so that we cover the material that
has been planned. We all like to talk—so we’ve built in time for
discussion. Still, sometimes I will need to re‐direct or move a
discussion in a particular direction – please work with me if this
happens.”
If your group is small, you may want to ask participants to quickly (!)
introduce themselves. If you do use individual introductions, these
guidelines might be helpful:
4 Modeled on/from ABA BUILDING TRUST, Unit 1.
Be prepared to cut off anyone who tends to give an opening
statement. Tell the participants that more time for getting to
know each other will come later.
Ask each person in turn to introduce themselves by stating name,
office/employer, and one word or phrase that best describes
why they are participating in the session.
Model an introduction for the participants, e.g., “I’m Jesse
Taylor. I am a supervisor in the juvenile division of the public
defender’s office. Why I am here = ‘curious.’”
Record word / phrase each participant chooses on a flip chart.
Note initial impressions of commonalities, interests.
If there are unifying themes in the list, e.g., Debias, make note of
those as such.
Post the flip chart page where it will be visible throughout the
program/conference.
Refer back to any common interests, values, and objectives at
appropriate times during the conference to reinforce key points.
Slide #5 Objectives5
Objectives
1. Understand what implicit bias means and how it may influence our decisions.
2. Understand that being implicitly biased does not necessarily mean we act in explicitly biased ways.
3. Learn to recognize some behaviors that may suggest bias or differential treatment.
4. Learn some techniques that help debias perceptions and improve interactions.
5
Additional notes for the facilitator/presenter—
This slide is self‐explanatory, and you may want to edit it to suit your particular
group and session. These are the same objectives as listed in the Toolbox
materials on the website.
Each line appears separately as you hit spacebar or click forward.
Show this for just a second and ask for questions.
5 Approach taken from ABA BUILDING TRUST.
Slide #6 Norms
Norms
Confidentiality
Breaks / or individual leave & rejoin
Phones off
What else?
Do we all agree?
6
Additional notes for the facilitator/presenter—
This is a housekeeping slide to set particular rules that you may want for your
group and venue. You will know your group best, so will know what is or is not
useful and appropriate. Some topics may be sensitive, so you may choose to use
a confidentiality approach; if you do, this is the place to discuss these ground
rules.
You may want to touch on some or all of these points if you use this slide,
though some of them may be a bit “touchy feely” for your group. If you
decide to use the norms approach, the following points may be helpful:
Awareness. “Issues related to race and culture can bring up strong
emotions. It is important for us to come together in a safe
environment, where folks can share feelings, experiences, and
thoughts. Norms and expectations can help us do that.”
Confidentiality.
Respect. Each of us is at a different stage of learning and experience
with this subject.
Listening to understand. Don’t just listen to argue.
Taking self‐responsibility. “I—statements” – “I feel x” and “I think y.”
Explaining. If something is said that is hurtful or uncomfortable,
explain why.
Breaks are scheduled, but take as you individually need.
Slide #7 Introductory Exercise: Cultural Groups6
Introductory Exercise: Cultural Groups
“… groups of people who consciously or unconsciously share identifiable values, norms, symbols, and some ways of living that are repeated and transmitted from one generation to another.”
7
What are your cultural groups?
Examples of cultural traits that define groups include race, ethnicity, religion,
gender, sexual orientation, national origin, family or professional status, etc.
[ABA Building Trust Unit 2]
The ease with which we identify our cultural groups speaks to material that will
be discussed later when we take the Implicit Association Test and begin to see
how ingroup and outgroup preferences, sometimes referred to as intergroup
discrimination, may play out. [Dasgupta]
The purpose of this Exercise is to bring to a more conscious level participants’
thinking about attributes that might otherwise be automatically and implicitly
recognized and lead to automatic associations, attitudes, and/or stereotypes.
[ABA Building Trust Unit 2, also Mills, Banaji & Heiphetz]
Additional notes for the facilitator/presenter—
This slide defines cultural groups.
Give participants a minute to read and consider the definition before starting
the Exercise.
6 Modeled on/from ABA BUILDING TRUST, Unit 1.
Psychology researchers link culture and decision making: "Decision making is a very private thing, individualized and personal. Yet it has a cultural dimension. The human brain does not acquire language, symbolic skills, or any form of symbolic cognition without the pedagogical guidance of culture and, as a result, most decisions made in modern society engage learned algorithms of thought that are imported from culture.” [Donald, 191, also, Shepherd] Similarly, the attitudes of one’s group influence individual attitudes. When we become aware that our attitudes differ from our groups’, our attitudes tend to shift toward the norm of our peer group; this includes influence on our biases. [Dasgupta Mechanisms]
Additional background information you may want to use for discussing this slide—
The idea of culture, where our perceptions and attitudes (and biases) are
grounded, underlies much of the rest of the presentation.
As the notes indicate, we are using the Cultural Group/Five Circles Exercise
(this slide and the next slide) as an introduction for two reasons: it is an
informal and engaging way for participants to meet each other or, in a group
where most of the participants already know each other, for members to
perhaps learn something new about others there. Because the approach
launches participants’ thinking about how they define their groups (and this
relates to later work on how they implicitly perceive others, ingroups, and
outgroups), we urge you to consider using it even if it seems a bit too
“touchy‐feely.”
19
If you choose not to use the Exercise, do spend some time pointing out the
significance of group identification to work around implicit bias.
Slide #8 Five Circles Exercise: Cultural Groups7
Five Circles Exercise: CULTURAL GROUPS
8
The standard understanding of discrimination is that it stems from prejudice,
generally defined as outgroup hostility, but a revised understanding (influenced
by research on implicit bias) is that most societally significant discrimination may
be “discrimination‐in‐reverse,” the effect of ingroup favoritism. [Greenwald
AALS, also Dasgupta]
7 Modeled on/from ABA BUILDING TRUST, Units 1, 2.
Our automatic group identification is significant; it is easy to see how it can impact diverse admissions, hiring, retention, and promotion, as well as more general decision making. Research suggests that being a member of a group creates a preference for that group, the ingroup, and against the outgroup. [Dasgupta, research overview] A now classic experiment showed that this group loyalty occurred even if the factors that put you in a group are random and arbitrary, that is, the very act of categorization may be enough to create an ingroup preference. [Tajfel] When we categorize people into groups, we tend to regard members of the same group as “more similar than they actually are, and more similar than they were before they were categorized together.” The reverse is also true and can be self‐perpetuating, that is, once categorized into groups, we see the differences as inherent and remember the ingroup more and more favorably. [Fiske & Gilbert]
Ingroup bias may be influenced by the size of the group, e.g., bias in favor of members of one’s family but not a whole ethnic group. [Greenwald & Krieger] Other research suggests an inability to distinguish between members of a group once the group is created and a tendency to find our ingroups affirming of our accomplishments and to perceive outgroups as lesser. [Benforado, Dasgupta, Perdue] We also may show preference for ingroup members based on their displaying bias against outgroups. [Fiske & Gilbert] Similarly, when the ingroup perceives itself threatened by the outgroup, the ingroup holds even more negative views of the outgroup. [Fiske]
Additional notes for the facilitator/presenter—
This slide is a visual for these instructions, which you should give to your
audience:
On the Five Circles Handout (or a blank sheet of paper where
participants draw five circles), write one cultural group with which
you closely identify in each circle. For example, I might write
woman, mother, Yankee, teacher, lawyer.
Move around the room and find one or two people for each circle
with whom you share a cultural connection and add their names to
your circles. [Note: please be mindful of any participants with
disabilities in the language you use for this instruction.]
The way this will work will depend in part on the size of your group and how
well participants may already know each other.
Start by offering a very short time period of perhaps 3 minutes. Keep
track of time. Try to be sure everyone has a chance to fill in at least one
circle. If need be, extend by minute increments, keeping in mind your
overall schedule for the day.
DEBRIEF FIVE CIRCLES EXERCISE:
Ask the participants to return to their seats and facilitate a 5 minute (or
less, depending on how engaged your participants are) discussion.
Choose from these questions:
Did anyone have a hard time actually filling out all 5 circles?
How many put your role in the court/criminal justice/firm as one
of your cultural groups?
Did any of you share more than one cultural connection with the
same person? What does that suggest to you?
What struck you about the process of finding matches?
Slides #9–#21
These slides provide background and testing for implicit bias.
Slide #9 Implicit Bias
IMPLICIT BIAS
9
*** Connecting the Topics ***
The Five Circles Exercise is a self‐selected, self‐reporting of attributes. We might
have chosen not to report an invisible or hidden attribute, e.g., LGBTQ, or even to
say that a visible attribute was not something we considered to be our cultural
group. This Exercise connects to later observations by way of the Implicit
Association Test, which measures associations and attitudes on a less conscious
level.
The core of this program is its focus on Implicit Bias. [ABA Building Trust Unit 3]
As the preceding Exercise and comments suggest, our cultural awareness and
response is not necessarily explicit and not necessarily what we say when asked
to self‐report, i.e., it is implicit. The following slides explore some of the research
from the fields of sociology and psychology on these points as background for
understanding our cognitive responses, implicit and explicit, as legal
professionals.
Additional notes for the facilitator/presenter—
Expanded information on definitions and other basic background materials in the court context can be found at JERRY KANG, NAT'L CENTER FOR STATE COURTS, NCSC, IMPLICIT BIAS, A PRIMER (2009), available at http://wp.jerrykang.net.s110363.gridserver.com/wp‐content/uploads/2010/10/kang‐Implicit‐Bias‐Primer‐for‐courts‐09.pdf, as well as in the Glossary, which is separately linked on the Section’s website, xxxx.
This slide transitions from the introductory group of slides to the core materials
on implicit bias.
You may show this for just a second and ask for questions, or just click by it.
*Reminder (also in the Handouts as Notes): The Five Circles Exercise is a
self‐selected, self‐reporting of attributes. Our cultural awareness and
response is not necessarily explicit and not necessarily what we say, when
asked to self‐report, i.e., it is implicit.
Slide #10 Schemas
Schemas
DEFINITION
Mental shortcuts
Organize & categorize information
Automatic
EXAMPLE
Four equal‐sided figure
Square
10
• Schemas are sets of propositions or mental constructs for relationships; they
create generalizations and expectations about categories of objects, places,
events, activities, and people. [Bernstein]
• We use schemas in order to make sense of and navigate the incredible
volume of data and input encountered from day to day. [ABA Building Trust
Unit 3]
• The example on the slide illustrates a schema for a square. We have schemas
for many objects—think about a car, a bicycle, a flower. The unconscious
brain deals with the “mundane and routine,” while the conscious brain is the
“mediator of novelty and learning.” [Donald, 203]
• We also have schemas for ourselves and other people, and these schemas
also carry certain expectations—think back to the Five Circles Exercise; think
about a schema you might have for a law professor … (See next slide.)
Additional notes for the facilitator/presenter—
As you show this slide:
Explain to participants that the next few slides build on each other up to the
definition of implicit bias; once defined, participants will have the
opportunity to participate in a test experience with regard to implicit bias—
the Implicit Association Test.
Explain to participants that the slide illustrates the example of a schema for a
square. Also explain that we have such schemas for many objects, people,
and processes (the latter sometimes called scripts).
Additional background information you may want to use for discussing this slide—
Schemas are implicit shortcuts we all use. The concept of schema is the first
concept needed to understand implicit bias as such, and it is covered at some
length in the next three slides. Understanding the brain’s use of these implicit
shortcuts leads to understanding of the operation of implicit bias and its
consequences in terms of self and group identification.
The next slide illustrates a schema for a type of person, a professor, and the
slide after that suggests the way this may all come together in a biased
decision.
Other processes you might want to use for examples are ordering in a
restaurant (we all know what to do when we get a menu) or reading music
(once we have learned to read the music, that part becomes automatic, and we
can concentrate on the art).
Slide #11 Professor Schema
Professor Schema
Students re: Professors
Know their subjects
Prepare for and attend class
Have office hours
Give and grade assignments and exams
So Students
Rely on schema to predict and explain prof’s actions
Fill gaps if prof’s actions ambiguous
But may eventually change based on individual performance
11
This slide offers a common social categorization to illustrate how schemas may
affect our work. We don’t immediately think of each professor separately;
instead, we have an implicit general view of likely professorial attributes.
Expectancies flow from our schema and social cognitions. [Hamilton] From this
implicit schema, we tend to find explanations for contradicting behavior that go
with our schemas. For example, if a professor is late, students’ instinct is to think
some emergency arose; but, if the professor is consistently late, then students
may decide he’s a lousy professor.
Such schemas can predispose us to certain expectancies and to evaluate others in
a way that confirms our pre‐existing biases [Fiske, Darley], e.g., if we implicitly
judge men to be better leaders, we are (implicitly) quicker to find weaknesses in
female managers and to forgive or recast similar behaviors in male managers.
[Eagly & Karau, Levit, Porter, Valian Why So Slow, Valian Beyond]
In addition to impacting how we view and evaluate others, schemas affect our self‐perceptions. Research shows that our self‐schemas are influenced by stereotypes and impact performance. If the self‐schema for women, for example, is that they are not good at math, or if as a group women are told they will likely not do well on a particularly difficult math test, women are likely to test lower than if their self‐schema and gender stereotypes are the reverse. Claude Steele has labeled this a “stereotype threat.” [Steele & Aronson, Steele, Dasgupta] A stereotype threat has obvious implications, e.g., for admissions testing for diverse law students. [cf. Walters] Other research shows that our self‐images and self‐perceptions of ability can be improved through high‐quality interaction with successful members of the relevant ingroup. [Asgari & Dasgupta, Dasgupta & Asgari]
Additional notes for the facilitator/presenter—
This slide illustrates how schemas work to categorize and view people. We have
a set of expectations based on our schemas.
You may want to ask participants to offer professorial attributes before you
show the slide.
Additional background information you may want to use for discussing this slide—
An important point here is that a schema implicitly predisposes us to think a
certain way and to fit behaviors into anticipated slots. This idea is
particularly relevant with regard to employment, judicial decision making,
etc.
Slide #12 Shorthand Schemas
Shorthand Schemas
Helpful in some situations, but…
…can lead to discriminatory
behaviors, inequity, and unfairness.
12
Using schemas, the brain takes in information and processes it in connection
with its pre‐existing patterns. If I tell you that the court ruled 5‐4 in the recent
Arizona tax standing case, you likely, without thought, would categorize this to
mean the Supreme Court of the United States ruled this way. Some researchers
call the way the brain operates with schemas “unconscious cognition” or, in
regard to people or groups, unconscious or “implicit social cognition.”
[Greenwald Psychology]
• Schemas can be right or wrong, helpful or not, for example:
•Helpful, e.g., tying shoes, driving. Once learned, these are tasks we
do quickly without conscious thought or effort. [Tying shoes image
licensed from Getty Images.]
•Not helpful, e.g., some connections with race. This is a picture of
Amadou Diallo; Diallo was shot in 1999 by New York City police
who thought he was reaching for a gun when he was in fact
reaching for his wallet. (The police were subsequently acquitted on
charges of second degree murder.) [Diallo, Correll, Vedantam
Hidden Brain]
• Additional notes for the facilitator/presenter—
28
This is a critical slide in the sequence.
With the idea of square and professor schemas as background, you can
explain to participants that the shorthand way our minds work can be
helpful or not, right or not.
You may want to ask your participants to offer a few examples of helpful/
not, right/ not.
Additional background information you may want to use for discussing this slide—
The pictures on the slide, as explained in the notes, illustrate a helpful
schema (tying our shoes) and a not‐helpful schema (a race‐based police shooting).
29
Schemas set up implicit expectations. When we encounter someone for
whom we have a schema, we expect certain behavior and we try to fit that
behavior within our schema, as the late professor example on this slide
illustrates. In this sense, behavior is often self‐confirming, the psychology
version of you get what you ask for.
Slide #13 Implicit Social Categories, Cognition
Implicit Social Categories/Cognition
FROM
Parents/Families
Friends/Peers
School
Media
Direct or vicarious experiences
Positive or negative associations
13
This slide relates back to the previous slides on schemas and connects to the next
slide on implicit bias and its measurement. Social categories and stereotypes are
types of schemas. We develop our generalized social categories/characteristics
from many sources, including those listed here: parents, friends, and media.
[Dasgupta & Rivera] Our social categories can be either positive or negative, e.g.,
Asians are good at math, and professors are smart and/or absent‐minded. They
can also be accurate or not. [ABA Building Trust Unit 3, Project Implicit, Nosek
(images)]
Understanding the inherent and automatic nature of implicit social
categorization helps explain implicit bias and its potential repercussions.
30
Implicit recognition forms early and quickly. Research shows we tend to implicitly and immediately classify people by gender, race, and age. [Ramirez, Gawronski & Payne] A research study at the Children's Research Lab at the University of Texas illustrates how early we form categories and biases. The study considered early tendencies to discriminate on the basis of skin color: “Asked how many white people are mean, these children commonly answered, ‘Almost none.’ For the same question about black people, many answered, ‘Some,’ or ‘A lot.’” [Bronson, Vedantam Hidden Brain]
Additional notes for the facilitator/presenter—
This slide continues to build on the idea of schemas, now as applied to people
and groups.
As you show this slide, mention the key points on sources of social cognition,
i.e., our environment from birth including family, peers, teachers,
media…Later in the presentation there is discussion of situational and
environmental conditions that can change these inputs and change implicit
bias.
You may want to refer back to the Five Circles Exercise at this point and to
remind participants how they self‐reported on their own cultural groups.
You may want to emphasize the research point made in the notes that we
implicitly and immediately classify people by race, gender, and age.
Additional background information you may want to use for discussing this slide—
31
This slide and its notes provide information on where schemas and social
categorizations originate. The common vocabulary in the social science
literature sometimes refers to such schemas as “implicit social categories”
and “cognition.” Such categories form early from many sources, direct and
indirect.
Slide #14 Implicit Bias Defined
Implicit Bias Defined
EVERYONE HAS SCHEMA/IMPLICIT BIAS
A preference for a group (positive or negative)
often operating outside our awareness
based on stereotypes and attitudes we hold
that tend to develop early in life
and tend to strengthen over time
Attitudes: evaluative feelings that are positive or negative
Stereotypes: traits we associate with a category
14
We can think of implicit bias as a lens through which we view the world—a lens
which automatically filters how we take in and act on information, a lens that is
always present. [ABA Building Trust Unit 3, Kang Primer]
These implicit social categorizations or biases are seen to be deeply held, indeed,
even by those who are committed to civil rights in their explicit world. The
Reverend Jesse Jackson reported: “There is nothing more painful to me at this
stage in my life than to walk down the street and hear footsteps and start
thinking about robbery . . . then look around and see somebody white and feel
relieved.” [Bennett 149 (quote), Steele, Vedantam See No Bias, but see Jost
(discussing differences in test results White/Black/Conservative/Liberal)]
Implicit biases do not necessarily lead to explicitly biased decisions or behaviors,
but they may well predict discriminatory nonverbal, subtle behaviors such as
sitting further away or cutting interviews short, which behaviors are then
interpreted as biased. [Fiske] (See Slide #18 for further discussion.)
Additional notes for the facilitator/presenter—
As you show this slide, give participants a chance to read.
Ask your participants to offer examples of stereotypes and attitudes, e.g.:
32
Stereotype example: men are better at playing the tuba than women
Attitude example: we will be comfortable with someone who graduated
from the same college we did…
Point out, before leaving this slide, that just like schemas, implicit biases
are not inherently right or wrong.
Additional background information you may want to use for discussing this slide—
33
Each word on this slide has meaning: implicit bias is talking about a
preference—a preference for a group—a preference that can be positive
or negative—a preference that is based on stereotypes and attitudes we
hold—a preference for a group that tends to develop early in life.
Slide #15 Stroop Task
This is another kind of illustration of categorization. It is easier and quicker to
read the word red when it is written in red than to read the word green when it
is written in red. If we measure the speed with which you read each, we are
measuring how closely aligned the concepts are in your brain. This kind of time
measurement forms the core of the Implicit Association Test. [Stroop]
Additional notes for the facilitator/presenter—
This slide offers a quick illustration of how implicit thinking works and is
measured. Like the upcoming slide that introduces the actual Implicit
Association Test [IAT], this slide shows how response time is impacted by
implicit cognition. It takes us longer to process the dissonant terms, e.g., reading
blue when written in green.
On the first click, the first set of words comes onto the slide, blue for blue.
Ask participants to read a few aloud as a group, reading across the slide
from left to right.
On the second click, the first set disappears.
34
On the third click a second set of words comes onto the slide; now the
words are written in a color that is not the same as the word, for example,
yellow for blue.
35
Again ask the participants to read a few words aloud as a group. Their
time will be slower and/or they will make errors.
Slide #16 IAT
[IAT, Ashburn‐Nardo] Remember that, by definition, implicit biases are
those we carry without awareness or conscious direction. So how do we
know what ours are and whether they are impacting our decision making or
our interactions? One way developed by social scientists is the Implicit
Association Test or IAT. [Project Implicit]
The IAT is designed to measure associative knowledge: the associations and
links that cause one concept to be activated by another, so‐called automatic
associations of concepts (e.g., math ability) with attributes (e.g., male,
female), under tight time parameters.
The underlying theory is that we will respond more accurately and quickly
to associations that fit with our schemas and implicit social cognitions, that
is, those acquired associations that are largely involuntary. [Greenwald
Psychology, Nosek (images), Ashburn‐Nardo, Shepherd] The next few slides
offer an opportunity to test and reflect on our implicit biases.
There is a wealth of literature on the IAT generally and on its relationship to
explicit bias and its value as a predictor of same. [Banaji, Blanton, Dasgupta
Implicit, Dasgupta Mechanisms, Fiske & Gilbert, Gawronski & Lebel,
Gawronski & Payne, Greenwald & Poehlman, Greenwald et al., Hoffman,
Jost, Kang, Kang Colorblind, Lane, Pettigrew, Tetlock1, Tetlock2]
36
Additional notes for the facilitator/presenter—
This slide follows the Stroop Task and introduces the IAT, which will be taken
online.
Show the slide and explain again the basic idea of implicit cognition
where our brains make connections implicitly or automatically and
quickly (as participants saw in the Stroop Task).
If asked, you may want to be prepared to tell participants that most
researchers believe that we can’t “rig” our answers even when we know
the way the test works.
Like the Stroop Exercise, the IAT rests on the assumption that two
concepts that are closely associated will be linked more readily and
more quickly. The IAT measures this response time.
How this works: The test calculates reaction times [RT]: “Average RTs
from White+pleasant / Black+unpleasant [and] Average RTs from
White+unpleasant / Black+pleasant [and] … the difference in times
forms a score such that positive values = ingroup preference.”
[Ashburn‐Nardo]
The pictures on this slide are some of those that are used in the IAT.
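The reaction‐time arithmetic quoted above can be sketched in a few lines of Python. This is a hypothetical illustration only, not the scoring algorithm Project Implicit actually uses (the published procedure adds latency filtering and standardization); the reaction times below are invented.

```python
# Simplified illustration of the IAT's difference-score idea:
# compare average reaction times (RTs) across the two pairing blocks.

def mean(values):
    return sum(values) / len(values)

def simple_iat_score(congruent_rts_ms, incongruent_rts_ms):
    """Positive score = slower responses on the 'incongruent' pairing,
    read as an implicit preference for the congruent pairing."""
    return mean(incongruent_rts_ms) - mean(congruent_rts_ms)

# Invented RTs in milliseconds for the two blocks described in the notes,
# e.g., ingroup+pleasant (congruent) vs. ingroup+unpleasant (incongruent):
congruent = [620, 580, 640, 600]
incongruent = [780, 810, 750, 790]

print(simple_iat_score(congruent, incongruent))  # prints 172.5
```

A faster average on one pairing than the other is what the IAT reports as an automatic preference; the sign convention here follows the quoted description (positive values = ingroup preference).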
Additional background information you may want to use for discussing this slide—
This was too much detail to include in the notes research box:
37
Much has been written about the IAT. See, e.g., Susan T. Fiske, Daniel T.
Gilbert & Gardner Lindzey, eds., Handbook of Social Psychology 406‐8,
1090 (5th ed. 2010) (overview); John T. Jost, Laurie A. Rudman, Irene V.
Blair, Dana R. Carney, Nilanjana Dasgupta, Jack Glaser & Curtis D.
Hardin, The Existence of Implicit Bias Is Beyond Reasonable Doubt: A
Refutation of Ideological and Methodological Objections and Executive
Summary of Ten Studies that No Manager Should Ignore, 29 Res. in
Organizational Behav. 39 (2009) (overview), available at
http://www.psych.nyu.edu/jost/Jost,%20Rudman,%20Blair,%20Carney,%
20Dasgupta,%20Claser,%20&%20Hardin%20%282009%29.PDF; Kristin A.
Lane, Mahzarin R. Banaji, Brian A. Nosek & Anthony G. Greenwald,
Understanding and Using the Implicit Association Test: IV. What We Know (So
Far), in BERND WITTENBRINK & NORBERT S. SCHWARZ (EDS.), IMPLICIT
MEASURES OF ATTITUDES: PROCEDURES (2007) (overview); Wilhelm
Hoffmann, Bertram Gawronski, Tobias Gschwender, Huy Le & Manfred
Schmitt, A Meta‐Analysis on the Correlation Between the Implicit Association
Test and Explicit Self‐Report Measures, 31 Personality & Soc. Psychol.
Bulletin (2005), available at
http://faculty.washington.edu/agg/IATmaterials/PDFs/Hofmann%20&%2
0al%20(PSPB,2005).pdf; Anthony G. Greenwald, T. Andrew Poehlman,
Eric Luis Uhlmann & Mahzarin R. Banaji, Understanding and Using the
Implicit Association Test: III. Meta‐analysis of Predictive Validity, 97 J.
PERSONALITY & SOC. PSYCH. 17 (2009); Thomas F. Pettigrew & Linda R.
Tropp, A Meta‐Analytic Test of Intergroup Contact Theory, 90 J. PERSONALITY
& SOC. PSYCHOL. 751 (2006).
Much is also written about the brain functions the IAT measures and
many researchers find it to be a better measurement of our biases than an
explicit self‐reporting approach provides. See, e.g., Jerry Kang, Nilanjana
Dasgupta, Kumar Yogeeswaran & Gary Blasi, Are Ideal Litigators White?
Measuring the Myth of Colorblindness, 7 J. EMPIRICAL LEGAL STUDIES 886
(2010); Nilanjana Dasgupta, Implicit Ingroup Favoritism, Outgroup
Favoritism, and Their Behavioral Manifestations, 17 SOC. JUSTICE RES. 143
(2004).
Researchers have also found the IAT to be a predictor of biased behavior.
See, e.g., Greenwald Poehlman, supra this note; Anthony G. Greenwald et
al., Measuring Individual Differences in Implicit Cognition: The Implicit
Association Test, 85 J. Personality & Soc. Psychol. 1464 (1998); Kang
Colorblind, supra this note; Nilanjana Dasgupta, Mechanisms Underlying
Malleability of Implicit Prejudice and Stereotypes: The Role of Automaticity
Versus Cognitive Control in T. Nelson (Ed.), Handbook of Prejudice,
Stereotyping, and Discrimination (2009).
38
However, not all researchers agree on these points. Bertram Gawronski,
Etienne P. Lebel & Kurt R. Peters, What Do Implicit Measures Tell Us?
Scrutinizing the Validity of Three Common Assumptions, 2 Perspective on
Psychol. Sci. 2 (2007) (summary), available at
http://www.psych.utoronto.ca/users/spa/news/data/files/Gawronski_impl
icit_perspectives_paper.pdf; Bertram Gawronski & B. Keith Payne, Handbook
of Implicit Social Cognition: Measurement, Theory and Application 87, 117
(2010) (noting “somewhat low correlation”); Philip Tetlock & Gregory
Mitchell, Implicit Bias and Accountability Systems: What Must Organizations
Do to Prevent Discrimination? 29 Res. in Organizational Behav. 3 (2009);
Philip Tetlock & Gregory Mitchell, Calibrating Prejudice in Milliseconds, 71
Soc. Psychol. Q. 12–16 (2008).
Slide #17 IAT
As the Stroop Task and prior slides have indicated, this test measures and
compares response times to identify unstated biases.
The Implicit Association Tests have documented the existence of bias on a range
of dimensions of diversity (gender, race, ethnicity, age, skin tone, sexual
orientation, etc.) and in a range of cultures/contexts. [Adapted from ABA
Building Trust Unit 3, also Greenwald Poehlman]
“Implicit biases are pervasive. They appear as statistically ‘large’ effects that
are often shown by majorities of samples of Americans. …75‐80% of self‐
identified Whites and Asians show an implicit preference for racial White
relative to Black.” [IAT, quote, Dasgupta] These biases show “implicit
preference for high status groups, relative prejudice against lower status
groups, and correlated behavioral bias that preserves or exacerbates
intergroup inequalities.” [Dasgupta & Rivera, citations omitted]
39
Such biases, often distinct from our self‐reported responses to groups other
than our own, play out in many ways ranging from expectations about
behavior and ability to assessment and advancement. These implicit biases
are “particularly evident when measured unobtrusively…” [Dasgupta &
Rivera, 55; Dasgupta Mechanisms]
However, meaningful social contact can change these implicit biases. [Kang
& Banaji, Dasgupta & Rivera] (See further discussion with regard to social
contact and debiasing at Slide #24.)
Additional notes for the facilitator/presenter—
This is a chance for your audience to take the IAT. It is done as a group, so a
“score” is not all that valuable, but it will show them quite quickly that their
response times are quicker for pairings consistent with their implicit biases
than for pairings that conflict with them.
To do the test you will need an online connection to the Harvard Project Implicit
site, https://implicit.harvard.edu/implicit/. Once connected:
Click on “demonstration,”
Then “go to the demonstration test,”
Then agree to the preliminary information and click on “I wish to
proceed.”
Then choose the Race IAT.
On the next screen, “Click here to begin.”
This is all fairly straightforward. You may want to do all of these steps before the
presentation starts and minimize this window and restore it to full size when
you get to this point in your presentation.
When the test begins, the first screen asks for self‐reporting—you may
want to ask people to jot a few answers, or, in the interest of time, skip on
to the actual test. [You cannot skip this page entirely, so answer at least
one question to get to the test.]
Explain that you (or someone you tap as your assistant) will handle the
typing (either “e” or “i”) based on what you hear from the group.
[If all of the participants have their own laptops, you may want to let
them use them and take the test separately, but doing this partially as a
group and partially individually will probably not work because those
speaking out loud will be distracting.]
40
Ask participants as a group to quickly and loudly tell you what to choose:
“E” for the answer on the left as you look at the screen, “I” for the answer
on the right as you look at the screen.
After the test, there is another screen of questionnaire type questions. Answer at least one of these and then…the test will report your
automatic preferences based on your time in answering.
DEBRIEF the test.
Ask for audience participation on some or all of these three questions,
answers for which are summarized on the next slide:
Were you surprised by the results?
What does it mean if we are implicitly biased?
In what arenas of the legal profession and access to justice do you
expect this might make a difference?
Additional background information you may want to use for discussing this slide—
As the notes explain, most Americans who take the test show biases
toward white men, toward associating men with careers, women with
families, etc. Some of these numbers are laid out in the notes page.
Even civil rights researchers and activists show these tendencies.
You may want to acknowledge the sources of implicit bias again, that is,
for example, we are exposed to media that reinforce such bias on the
news, etc.
Implicit bias is NOT necessarily explicitly reported or acted upon.
Awareness and debiasing are possible.
The Harvard Project Implicit website provides the following information,
which might also be useful background for your group8 (as quoted from
their website):
41
8FAQs, Project Implicit,
https://implicit.harvard.edu/implicit/demo/background/faqs.html#faq6 (last visited July 1,
2011) (hereinafter “IAT FAQs”).
What does it mean if I get a test result that I donʹt believe describes me or, if I take the same test twice, I get
different results each time?
Answer: You may be giving the test more credit than it deserves! These tests are not perfectly accurate by any definition of accuracy. Normally, outcomes will change at least slightly from one taking to another. You may discover this if you repeat any of the tests. We encourage repeating any test for which the outcome surprises you. If the outcome repeats, the result is definitely more trustworthy than is the first result alone. If the outcome varies, it is best to average the different results. However, if the outcome varies widely from one taking to another (something that is unusual) we suggest that you just regard the set of results as 'inconclusive'. Besides normal variation in the reliability of assessment, the IAT is also known to be malleable based on differences in the social setting and recent experience. These factors will influence the consistency of measurement across occasions. For more information about reliability see Nosek, Greenwald & Banaji, in press. For more information about malleability of implicit attitudes and stereotypes see Blair, 2001.
The red Xs forced me to give responses I did not consider proper. Does that mean the test is no good for me?
42
Answer: The instructions page for each IAT lists the words, names, and/or types of pictures that appear in that test. The page also indicates the category to which each of those words belongs (For example, the page might say "good words = wonderful, beautiful, happy, joy, smile.") However, it is sometimes difficult to clearly view the pictures or to remember which category each word or name belongs to once the test begins. In laboratory versions we can make sure that each person understands the categories used in the test and the words, faces, or names that define each category. For web versions of these tests we selected items that most people would agree on their category membership, and that should work for as many people as
43
possible. If the categories that you believe best represent the IAT's words, names, or faces are treated by the applet as wrong (red X) for more than a few items, then that test will indeed not be adequate or accurate for you. We hope that you may have found something useful in the experience nevertheless.
Slide #18 Implicit/Explicit Bias
Implicit/Explicit Biases
Implicit biases differ, sometimes substantially, from stereotypes and attitudes we expressly self‐report.
Some research shows the IAT is a better predictor of behavior than explicit self‐reports.
BUT STILL DOESN’T NECESSARILY MEAN you act with your implicit biases.
18
Implicit and explicit biases are related but different mental constructs. It is likely
that neither solely offers an accurate measure of bias. [Adapted from ABA
Building Trust Unit 3, also Kang Primer] At the least, the IAT tells us to consider
our possible implicit bias and be mindful of its implications. (See further
discussion in debiasing section starting at Slide #20.)
44
Psychology research shows how implicit bias may be relevant to the composition of the legal profession – 90% white, 31% female, with prevailing male leadership in the largest firms [Bureau of Labor, ABA Commission1]: "if people have a schema about gender differences, that schema spills over into their judgments. . . . The implications for judgments of professional competence are clear. Employers faced with a man and a woman matched on the qualities relevant to success...may believe they are judging the candidates objectively. Yet, if their schema represents men as more capable. . . they are likely to overestimate the male's qualifications and underestimate the female's.” [Valian Why So Slow, Bertrand, Valian Beyond]
The experience of transgendered people offers an illustration. Research biologist Barbara Barres changed her gender and became Ben Barres. When Ben gave a presentation someone in the audience who did not know his history observed, “Ben Barres gave a great seminar today, but, then his work is much better than his sister’s.” [Vedantam Hidden Brain, 102‐03]. For further research, see citation at Slide #16 [Blanton and others discussing data and assumptions]. For a very readable general discussion, see Vedantam See No Bias.
Additional notes for the facilitator/presenter—
This slide and the next provide an overview of answers to the questions you
asked in debriefing the IAT.
As you show this slide, the important point to emphasize is that implicit
bias is likely different from self‐reported bias. If asked, I might say the
politically correct thing, that is, that I am not biased against African
Americans, but implicitly I may be. Reasons why someone may not self‐
report biases held implicitly include not knowing, fear of appearing
discriminatory, and fear of interracial confrontation.
Also note the point on the slide that implicit bias may exist, but it does
not necessarily predict or cause explicit bias in decisions or behavior.
Additional background information you may want to use for discussing this slide—
As the notes and research box on this slide and others suggest, the
psychological research on the correlation between implicit bias and self‐
reported bias and the correlation between implicit bias and biased
decisions or actions is not uniform, though research showing correlations
is on the increase. However, regardless of these correlations, awareness of
possible implicit biases and their formation and nature can only help
achieve more fairness in decision making.
45
For those who doubt we are implicitly biased, the example mentioned in the
notes, of the same person being perceived as less intelligent or competent when
female, should prove interesting.
Slide #19 Systemic Concerns and Implications
The potential influence of implicit bias on the legal profession and access to
and delivery of justice is wide‐ranging. For one example, implicit bias likely
underlies part of the reason why the practicing bar remains predominantly
white male. [Redfield 2009, 2011, also Slide #18] The employment data
reflects differences in educational qualifications, entry, retention, and success
between genders and among races and ethnicities, and demonstrates how
decisions are made that maintain the ingroup status quo. [Trix, Bertrand, Jost
Beyond Reasonable Doubt, Valian Why So Slow, National Center]
46
There is a considerable and growing body of research on the influence of implicit bias on legal decision making and justice generally. Levinson & Young (evidence); Nisbet (misremembering); Irwin (ethics); Banks (general); Guthrie (general); Jost Beyond Reasonable Doubt (overview of experiments); Kang & Lane (general); Levinson (general); Levinson et al. (general); Rachlinski 2007 (general); Guthrie Blinking (judicial decision making); Guthrie Hidden (executive branch judges); Rachlinski 2008 (trial judges); Rachlinski 2006 (bankruptcy judges); Bennett (jury); Smith (jury); Juvenile (juvenile justice); Parks (general criminal); Lynch (general criminal); MacKenzie (prosecutorial discretion); Eberhardt (sentencing); Mustard (sentencing); Ramirez (sentencing); Correll (shooter bias); Glaser (shooter bias); Plant (shooter bias/practice can change); Kang Colorblind (litigators). There is also considerable evidence on implicit bias re: employment issues in legal settings and generally. [Bertrand, Eagly & Karau, Eagly & Carli, Gawronski & Payne, Faigman, Commission]
Additional notes for the facilitator/presenter—
This slide brings together previous ideas with the reality of the legal profession.
While it is only one slide, it speaks volumes about our need to be concerned with
this topic.
Before you show this slide, ask your audience to list some of the ways
implicit bias impacts the legal system and note them on your chart paper
(or ask someone in the audience to be the writer for this).
Then show the slide as a summary to go with those noted.
Additional background information you may want to use for discussing this slide—
The legal profession remains at 90% white according to recent Bureau of
Labor statistics, and the ABA Commission on Women in the Profession
reports 31% female. So too the higher ranks of the academy and the
practice remain mostly white male.
There is research on each point listed on the slide (and more) that shows
how bias, particularly race or gender bias, plays out and impacts lives.
Slides #20–#28
47
These slides move on from recognizing implicit bias to considering techniques to
debias our behavior and decision making.
Slide #20 Debiasing (transition)
*** Connecting the Topics ***
Having considered the nature of our implicit biases and attitudes toward
ingroups and outgroups, this section of the presentation offers suggestions for
debiasing techniques. Consider the behaviors at which we should “stare”
rather than “blink.” [Gladwell, Frederick]
48
Additional notes for the facilitator/presenter—
Transition slide for organizational purposes only.
49
Taking as a given that implicit bias exists and that many of us would want to be sure our decision making is not biased, research suggests some techniques that can be useful to debias decisions. Shawn Marsh summarizes these debiasing techniques from the research: [Marsh]
1. Education around awareness that implicit bias exists;
2. Reducing cognitive load to allow more time and space for accurate reflection;
3. Encouraging high effort processing for more careful attention to information
and to one’s own thinking errors;
4. Employing checklists to assure thought at certain points;
5. Encouraging mindfulness to increase understanding of one’s own thought
processes and to watch out for thinking errors;
6. Exposing people from different groups to each other to “help counteract
biased thinking”;
7. Reducing bias‐related cues within environment;
8. Reviewing organizational behavior.
Slide #21 The good news is—
[Quote, Kang Primer]
Motivation has a significant impact on explicit expressions of bias. [Jost Beyond
Reasonable Doubt] Implicit biases are malleable and can be changed [Dasgupta
Mechanisms, Kang Colorblind] when people are pushed to change by specific
motivations, including the motivation to be fair, the motivation to not appear
prejudiced, and the motivation to avoid inter‐racial conflict [Dasgupta, Dasgupta
& Asgari] as well as by situational factors. [Shepherd] But motivation requires
that you know there is an issue. Recognizing our implicit biases and responses,
through use of the IAT, for example, can provide this motivation. [Kang Primer,
Ramirez, Blair] Implicit biases can also be changed when people “invest the
effort to practice specific strategies to avoid stereotypic or prejudicial responses.”
[Dasgupta & Asgari 643, Fiske & Gilbert] In addition to these intentional
approaches, implicit biases can be changed by changing the “social context
people inhabit rather than by directly manipulating their goals, motivation, or
effort,” with the longer the period of exposure to counterstereotypes, the greater
the decrease in stereotypes. [Dasgupta & Asgari 643‐44, see also Fiske & Gilbert
(describing impact of direct experience)]
50
Additional notes for the facilitator/presenter—
As you show this slide, emphasize its basic point—change can happen if
one is motivated to reduce implicit bias and its potential impact on
deliberation and decisions for the legal profession.
Additional background information you may want to use for discussing this slide—
Of course, good intentions are not likely to change implicit bias enough
standing alone. Mindfulness and careful thinking are called for.
Experts sometimes refer to intuitive thinking as “System 1 thinking,” and
this is popularly called Blinking, a term made popular by Malcolm
Gladwell in his book, Blink: The Power of Thinking without Thinking
(2007). More deliberate or conscious thinking is sometimes called “System
2 thinking,” occasionally popularly called “Staring.” Legal writers have
considered these approaches. Professor Rachlinski, among others, has
studied the way judges make decisions, characterizing that model as
“intuitive deliberative.”
51
While the research is not uniform, some research does conclude that a desire to be fair can change implicit attitudes through developing “chronic egalitarian values.” [Fiske & Gilbert 1110, Wellman] The desire to be fair may be more effective at impacting bias than legal coercion might be. Conditions that support this motivation include strong, unambiguous norms around bias, and leadership with positive examples. [Bartlett]
Slide #22 Overview of Debiasing
This is a summary of approaches discussed further in the slides that follow.
(See also Marsh’s list of approaches at Slide #20.)
52
Research tells us that both implicit and explicit attitudes can change. [Fiske & Gilbert] Psychologists identify these change agents:
1. Change in contextual cues (e.g., expose White participants to positive
African Americans and/or disliked Whites) [Fiske & Gilbert]
2. Addition of new information [Fiske & Gilbert, cf. Pettigrew (“simply
knowing more about the outgroup typically does not have a major
effect on reducing prejudice”)]
3. Directly experiencing objects so as to create “new strong associations
that are incompatible with existing stereotypes or prejudice” [Fiske &
Gilbert 1110]
4. Taking the perspective of someone in the outgroup [Aberson]
5. Asking questions that get at both sides [Plous]
Additional notes for the facilitator/presenter—
This slide provides an overview of takeaway debiasing techniques that can be
put into practice in a variety of settings. Some of them are quick and appear to
focus only on the surface; others offer longer and deeper opportunity for change.
This slide is mostly transitional, offering a summary of the techniques to
come.
As you show this slide, tell participants that the ideas offered in the next
few slides are research‐based and can prove helpful in confronting
implicit bias.
Slide #23 Education
As the prior slides suggest, becoming aware of implicit bias is first an
educational process. The film clip shown with this slide illustrates mindfulness.
[Simons or Davison] There are other optical illusions that show similar ideas, i.e.,
that things are not always as they seem intuitively or at first glance, for
example, A & B are the same shade.
Additional notes for the facilitator/presenter—
Not surprisingly, education and awareness are critical to confronting implicit
bias. The idea is that we take that which is implicit and often unknown and
unspoken and shine a spotlight on it. Even just this awareness can be helpful.
With this slide you may want to show a short film clip mentioned in the
notes. [For a 90‐minute session you will probably need to use one of the
two short clips, though the Lunch Date, which is 10 minutes, is richer
ground for discussion.]
To show one of these clips, you will need to stream it from the internet:
Christopher Chabris & Daniel Simons, The Invisible Gorilla (Film),
available at http://www.theinvisiblegorilla.com/videos.html (1.5
minutes)
or
Adam Davison, Lunch Date (Film 1989), available at
http://www.youtube.com/watch?v=epuTZigxUY8 (ten minutes)
If you show them, or if your participants have watched them, DEBRIEF THE FILM.
Ask participants what they saw/missed.
If time allows, ask them to offer examples of similar “misses” from their
own practices.
Additional background information you may want to use for discussing this slide—
The point of this slide, the checkerboard, and the films is that they
1) are entertaining and 2) show situations where we make assumptions
about directions or appearances and miss other things in the process.
Slide #24 Exposure
[Picture source, ABA, http://www.americanbar.org/aba.html.]
We can help ourselves to be less implicitly biased through social contact, positive
exemplars, and environmental context that expose us to other groups. [Jolls,
Chang (education), Blair, Dasgupta & Rivera (personal and media)] “A principal
mechanism for psychological change is the ‘social contact hypothesis,’ which
suggests that prejudice and stereotypes can be reduced by face‐to‐face interaction
between groups.” [Kang & Banaji, Fiske & Gilbert, Asgari, Dasgupta & Asgari]
The idea of perspective taking, where a person takes the viewpoint of a member
of the outgroup, is also useful in changing implicit bias. [Aberson] This is much
like the law school approach of arguing both sides.
Additional notes for the facilitator/presenter—
This slide offers a few examples of exposure—to other people and groups and to
the perspectives held by others.
As you show this slide, tell participants that research shows that such
exposure can change implicit and explicit bias.
Additional background information you may want to use for discussing this slide—
Social contact has long been a core idea for discussing and achieving the
values of diversity. The notes highlight two examples of this research and
also highlight that mere exposure is not likely sufficient; meaningful
exposure is needed.
The idea of perspective‐taking is another way to encourage us to think
more widely. When asked to put ourselves in another’s position or
articulate that position, we are more likely to be able to decrease our
implicit biases.
Kang and Banaji summarize the research for debiasing through social contact as requiring that people: [Kang & Banaji, Miller]
(1) be exposed to disconfirming data;
(2) interact with others of equal status;
(3) cooperate;
(4) engage in non‐superficial contact; and
(5) receive clear norms in favor of equality.
An interesting study involving rising college freshmen illustrates the power of positive exemplars and these principles. Among women who showed implicit biases favoring men before college, those who went to all‐women’s colleges and were exposed to women leaders lost that bias, while among those who went to coed institutions the bias became stronger. [Dasgupta & Asgari] Research at law schools has also shown that “As racial diversity of a law school increases, there are strong increases in endorsements of perceived diversity of ideas and strong decreases in prejudice.” [LSAC]
The idea of meaningful attention is addressed in the next slide, and the
idea of positive exemplars is illustrated in Slide #28 with regard to
noticing our messaging.
Slide #25 Approach
Additional notes for the facilitator/presenter—
This is an overview/transition slide summarizing the next two slides, which offer
more detail on these approaches.
Slide #26 Approach: Stare not Blink
Implicit bias can be limited by allowing more time to consider decisions, that is,
by decreasing your cognitive load—the amount and complexity of information
to be processed in any given time frame. For example, caseloads can be decreased
for judges, defenders, and prosecutors. [ABA Building Trust Unit 3]
Related to the time to consider is the opportunity to engage in “high‐effort
processing” of information vs. low‐effort/peripheral processing, that is, time to
“stare” or consciously consider, for example by writing out discussion
points and decisions rather than just “blinking” with off‐the‐cuff
decisions. [Secunda]
Using checklists at key decision points can encourage less biased decisions by
providing an objective framework to assess your thinking. [Arkes] For example,
in a hiring decision, it is useful to agree beforehand to the basic merit criteria (e.g., are
book smarts or street smarts more important?). This helps decrease
attention to biased characteristics that may be influencing decision making.
[Rudman, Uhlmann, Luzadis, National Center] This approach can also limit the
tendency to re‐make the criteria to fit ingroup favoritism and the “preferred”
candidate. [Greenwald AALS, Uhlmann, Gawronski & Bodenhausen]
Research has shown that these checkpoints can be effective in overcoming stereotypes. [Isaac, summary] For example, committing to the relative value of credentials before reviewing applications eliminated gender bias for those hiring a new police chief. [Uhlmann] Stereotypes are also less likely where there are more than one or two diverse candidates in the pool. [Isaac]
Additional notes for the facilitator/presenter—
This slide offers four specific ways to train ourselves to be less quick in our
thinking, as the title says, to stare not blink. These are not rocket‐science
approaches, but the research shows they can be effective in achieving more
informed and fair decision making.
As you highlight the methods and points on this slide, you may want to
ask participants for examples from their practices and/or the likelihood of
adding these techniques for various aspects of their work.
Additional background information you may want to use for discussing this slide—
All of these techniques can help us move from blink to stare, from
intuitive to deliberative thinking.
Of course, not all situations require, or should involve, moving away from
intuitive thinking: consider a parent’s intuitive response to a child’s
appearance or symptoms, or the use of our intuitive expertise to
evaluate items within that expertise. Malcolm Gladwell’s opening
example in Blink: The Power of Thinking without Thinking, in which art
experts evaluated a statue purchased by the Getty museum as a fake, is an
example; their evaluations were based not on research but on their
accumulated experience, in a blink.
The techniques that involve checklists and writing make it harder for us
to slip into our implicitly biased views and can be particularly significant
in employment and evaluation matters. For example, a checklist might
help us avoid falling into gendered descriptions in recommendations and
evaluations. The National Center for Women & Information Technology
offers these examples from its research:
Stereotypical and Grindstone Adjectives. Adjectives used
to describe both male and female applicants were often
based in gender stereotypes: men as successful and women
as nurturing. Words like “compassionate” were frequently
used for women, while words like “accomplishment” were
more often used for men. Grindstone words, adjectives
describing applicants as hard workers, were also more
often used for women than for men, implying that women
may have a strong work ethic, but men have ability.
Slide #27 Approach: Change Process
Another approach to debiasing is to use procedural changes to disrupt the link
between implicit bias and discriminatory behavior.
Similarly, adding a level of accountability to the process debiases decisional
outcomes. If we think we are being monitored or may have to explain our
decisions, we are more motivated to act in an unbiased or debiased way.
[Benforado, Ziegert] But it is important that the accountability be to a superior
who him/herself offers a clear unbiased approach. [Jost Beyond Reasonable
Doubt]
Additional notes for the facilitator/presenter—
The debiasing techniques offered here build on the previous slides on debiasing,
but are likely suggestions that require a more serious, longer‐term motivation
and commitment to change.
As you show this slide, ask participants for examples from their practices.
Some very telling research shows these points. For example, researchers have documented how using screens to provide “blind” auditions increased the number of women who won seats in orchestras. [Goldin]
Additional background information you may want to use for discussing this slide—
It is common wisdom that what you test and what you count is what is
learned.
As leaders make known their intention to count, test, and consider results
in evaluation settings, this model and its practices filter into the
organizational ethos.
The orchestra screens mentioned in the notes pages offer a clear example
of how implicit bias could and did influence decisions rather than
allowing merit to control. For those who think their decisions are based
on merit, this is instructive.
Similarly, the various research experiments where resumes are sent to
potential employers with names changed, e.g., from Jack to Jamal, and
where Jack gets far more interview requests than Jamal, illustrate the
value of more careful procedure.
The ABA Commission on Women in the Profession, Fair Measure:
Toward Effective Attorney Evaluations (2nd ed. 2008) report is an
excellent resource, which you might want to mention.
Slide #28 Notice Your Messaging
Another approach to debiasing is to focus on “micro‐messaging,” where small
messages are sent, typically without conscious thought or intent. [Valian Why So
Slow, Valian Beyond] These messages may be micro‐affirmations or micro‐
inequities or even micro‐aggressions.
To understand micro‐messaging, think about what pictures hang in your office.
When the statue of the women suffragists joined the many male statues in the
Capitol Rotunda, Senator Olympia Snowe commented that her work
environment—and doubtless that of many others—had improved. [Harvard, VA,
Architect (images), Library (images); also, Shepherd (environment)]
Also consider these examples: an associate of color shares an idea and no one
responds; or the senior partners are seen repeatedly chatting in the hallways with
certain associates, but not others ….
These are either micro‐affirmations or micro‐inequities depending on who you
are in the scenario, and they accumulate.
Research shows that small differences accumulate: “A useful concept in sociology is the accumulation of advantage and disadvantage. It suggests that ...advantages accrue, and... disadvantages also accumulate. Very small differences in treatment can, as they pile up, result in large disparities … It is unfair to neglect even minor instances of group‐based bias.” [Valian Why So Slow, Sandler] Research also shows that these “small” messages have power for insiders and outsiders. For example, when a person with higher status acknowledges someone at a meeting, that acknowledgement influences others to also think better of the acknowledged person. [Valian]
Additional notes for the facilitator/presenter—
For micro‐messaging, the notes give a few examples for consideration.
As you show this slide, offer an example, e.g., “In my office, all the past
deans are white men, so all the pictures on the walls are white men…”
and ask participants to offer an example as well.
Also then ask participants to offer a suggestion for how they can change
their messaging.
Additional background information you may want to use for discussing this slide—
This slide moves to a discussion of inclusion and making a safe and
welcoming environment.
There is research showing that such inclusion can produce greater
productivity and creativity.
The pictures on this slide are from the statuary in the U.S. Capitol
Rotunda: the Portrait Monument to Lucretia Mott, Elizabeth Cady Stanton,
and Susan B. Anthony, which was moved there in 1997, joining the many
male figures already there, including Washington, Lincoln, and Garfield,
pictured here.
Concluding Slide
Slide # 29 Conclusions
The presentation has reviewed the way our brains function, often implicitly and
without conscious thought, to categorize objects and people in ways that may be
implicitly biased. By educating ourselves and becoming aware of these implicit
biases, we can help assure that our explicit behavior is fair. The debiasing section
offers takeaways to these ends.
Additional notes for the facilitator/presenter—
At this point, it is a good time for questions. Then show the Section of
Litigation’s film on implicit bias as a good summary and send off. Be sure
to also do the evaluations and share your feedback with us.
IMPLICIT BIAS AND DEBIASING POWERPOINT REFERENCES
ABA Building Trust. AMERICAN BAR ASSOCIATION, ABA CRIMINAL JUSTICE
SECTION ET AL., BUILDING COMMUNITY TRUST: IMPROVING CROSS‐CULTURAL
COMMUNICATION IN THE CRIMINAL JUSTICE SYSTEM, available at
http://www.americanbar.org/groups/criminal_justice/pages/buildingcommunity.
html.
ABA Commission1. American Bar Association, ABA Commission on Women in
the Profession, A Current Glance at Women in the Law 2011 (updated January
2011), available at
http://www.americanbar.org/groups/women/resources/statistics.html.
ABA Commission2. AMERICAN BAR ASSOCIATION, ABA COMMISSION ON WOMEN
IN THE PROFESSION, FAIR MEASURE: TOWARD EFFECTIVE ATTORNEY EVALUATIONS
(2nd ed. 2008).
Aberson. Christopher L. Aberson, Carl Shoemaker & Christina Tomolillo, Implicit
Bias and Contact: the Role of Interethnic Friendships, 144 J. SOC.
PSYCHOL. 335 (2004), available at
http://www.humboldt.edu/psychology/fs/aberson/jsp%202004.pdf.
Architect. Architect of the Capitol, Portrait Monument to Elizabeth Cady Stanton
(1815‐1902), Susan B. Anthony (1820‐1906), Lucretia Mott (1793‐1880),
http://www.aoc.gov/cc/art/rotunda/suffrage.cfm.
Arkes. Hal R. Arkes & Victoria A. Shaffer, Should We Use Decision Aids or Gut
Feelings? in G. GIGERENZER & C. ENGEL, EDS., HEURISTICS AND THE LAW (2004).
Asgari. Shaki Asgari, Nilanjana Dasgupta & Nicole Gilbert Cote, When Does
Contact with Successful Ingroup Members Change Self‐Stereotypes? A Longitudinal
Study Comparing the Effect of Quantity vs. Quality of Contact with Successful
Individuals, 41 SOC. PSYCHOL. 203 (2010).
Ashburn‐Nardo. Leslie Ashburn‐Nardo, Assistant Professor of Psychology,
Department of Psychology, Indiana University – Purdue University Indianapolis,
Presentation, The Implicit Association Test: Its Uses (and Potential Misuses) in
Organizations (undated), available at
www.drtci.org/public/drci_iat_handouts.ppt.
Banaji & Heiphetz. Mahzarin R. Banaji & Larisa Heiphetz, Attitudes in SUSAN T.
FISKE, DANIEL T. GILBERT & GARDNER LINDZEY, EDS., HANDBOOK OF SOCIAL
PSYCHOLOGY 363‐65 (5th ed. 2010).
Banks. R. Richard Banks, Jennifer L. Eberhardt & Lee Ross, Discrimination and
Implicit Bias in a Racially Unequal Society, 94 CAL. L. REV. 1169 (2006) (focusing on
research and racially discriminatory outcomes with regard to decisions in
investigation, lethal force, and criminal sentencing).
Bartlett. Katherine Bartlett, Making Good on Good Intentions: The Critical Role of
Motivation in Reducing Implicit Workplace Discrimination, 95 VA. L. REV. 1893
(2009).
Benforado. Adam Benforado & John Hanson, The Great Attributional Divide: How
Divergent Views of Human Behavior Are Shaping Legal Policy, 57 EMORY L.J. 311,
325‐26 (2007‐2008).
Bennett. Mark W. Bennett, Unraveling the Gordian Knot of Implicit Bias in Jury
Selection: The Problems of Judge‐Dominated Voir Dire, the Failed Promise of Batson, and
Proposed Solutions, 4 HARV. L. & POL’Y REV. 149, 149 (2010).
Bernstein. DOUGLAS A. BERNSTEIN ET AL., PSYCHOLOGY 290 (8th ed. 2007).
Bertrand. Marianne Bertrand & Sendhil Mullainathan, Are Emily and Greg More
Employable than Lakisha and Jamal?, 94 AM. ECON. REV. 991 (2004).
Blair. Irene V. Blair, The Malleability of Automatic Stereotypes and Prejudice, 6
PERSONALITY & SOC. PSYCHOL. REV. 242 (2002).
Blanton. Hart Blanton et al., Decoding the Implicit Ass’n Test: Implications for
Criterion Prediction, 42 J. EXPERIMENTAL SOC. PSYCHOL. 192 (2006).
Bronson. Po Bronson & Ashley Merryman, See Baby Discriminate, NEWSWEEK
(September 5, 2009), available at www.newsweek.com/2009/09/04/see‐baby‐
discriminate.html.
Bureau of Labor. U.S. Department of Labor, U.S. Bureau of Labor Statistics,
Report 1026, Table 6, Employed people by detailed occupation, race, and Hispanic or
Latino ethnicity, annual averages, 2010, available at
http://www.bls.gov/cps/cpsaat11.pdf.
Chabris. CHRISTOPHER CHABRIS & DANIEL SIMONS, THE INVISIBLE GORILLA AND
OTHER WAYS OUR INTUITIONS DECEIVE US (2010).
Chabris test. Christopher Chabris & Daniel Simons, The Invisible Gorilla,
Film/Test, http://www.theinvisiblegorilla.com/videos.html (1.5 minutes)
Chang. Mitchell J. Chang et al., The Educational Benefits of Sustaining Cross‐Racial
Interaction Among Undergraduates, 77 J. HIGH EDUC. 430, 449 (2006).
Checkerboard. Checkerboard Illusion,
http://web.mit.edu/persci/people/adelson/checkershadow_description.html.
Correll. Joshua Correll, Across the Thin Blue Line: Police Officers and Racial Bias in
the Decision to Shoot, 92 J. PERSONALITY & SOC. PSYCHOL. 1006 (2007) (finding a
lower threshold for deciding to shoot an African American).
Darley. John M. Darley & Paget H. Gross, A Hypothesis‐Confirming Bias in Labeling
Effects, 44 J. PERSONALITY & SOC. PSYCHOL. 20 (1983), available at
http://data.psych.udel.edu/psyc467/Darley%20%20Gross/Darley.and.Gross.pdf.
Dasgupta & Asgari. Nilanjana Dasgupta & Shaki Asgari, Seeing Is Believing:
Exposure to Counterstereotypic Women Leaders and Its Effect on the Malleability of
Automatic Gender Stereotypes, 40 J. EXPERIMENTAL SOC. PSYCHOL. 642 (2004).
Dasgupta & Rivera Motivation. Nilanjana Dasgupta & Luis Rivera, From
Automatic Anti‐gay Prejudice to Behavior: The Moderating Role of Conscious Beliefs
about Gender and Behavioral Control, 91 J. PERSONALITY & SOC. PSYCHOL. 268 (2006).
Dasgupta & Rivera. Nilanjana Dasgupta & Luis Rivera, When Social Context
Matters: The Influence of Long‐term Contact and Short‐term Exposure to Admired
Outgroup Members on Implicit Attitudes and Behavioral Intentions, 26 SOC.
COGNITION 112 (2008).
Dasgupta Mechanisms. Nilanjana Dasgupta, Mechanisms Underlying Malleability
of Implicit Prejudice and Stereotypes: The Role of Automaticity Versus Cognitive
Control in T. NELSON (ED.), HANDBOOK OF PREJUDICE, STEREOTYPING, AND
DISCRIMINATION (2009).
Dasgupta. Nilanjana Dasgupta, Implicit Ingroup Favoritism, Outgroup Favoritism,
and Their Behavioral Manifestations, 17 SOC. JUSTICE RES. 143 (2004).
Davison. Adam Davison, The Lunch Date,
http://www.youtube.com/watch?v=epuTZigxUY8 (film).
Diallo. Amadou Diallo, Times Topics, N.Y. TIMES, available at
http://topics.nytimes.com/topics/reference/timestopics/people/d/amadou_diallo/i
ndex.html.
Donald. Merlin Donald, How Culture and Brain Mechanisms Interact in Decision
Making, in CHRISTOPH ENGEL & WOLF SINGER, EDS., BETTER THAN CONSCIOUS?
DECISION MAKING, THE HUMAN MIND, AND IMPLICATIONS FOR INSTITUTIONS 191
(2008).
Dovidio. John F. Dovidio, Kerry Kawakami, Craig Johnson, Brenda Johnson & Adiah
Howard, On the Nature of Prejudice: Automatic and Controlled Processes, 33 J.
EXPERIMENTAL SOC. PSYCHOL. 510 (1997), available at
http://www.psych.yorku.ca/kawakami/documents/Onthenatureofprejudice_000.pdf.
Eagly & Carli. ALICE H. EAGLY & LINDA L. CARLI, THROUGH THE LABYRINTH: THE
TRUTH ABOUT HOW WOMEN BECOME LEADERS (2007).
Eagly & Karau. Alice H. Eagly & Steven J. Karau, Role Congruity Theory of
Prejudice toward Female Leaders, 109 PSYCHOL. REV. 573 (2002).
Eberhardt. Jennifer L. Eberhardt et al., Looking Deathworthy: Perceived
Stereotypicality of Black Defendants Predicts Capital Sentencing Outcomes, 17
PSYCHOL. SCI., No. 5 (2006).
Faigman. David Faigman, Nilanjana Dasgupta & Cecilia L. Ridgeway, A Matter of
Fit: The Law of Discrimination and the Science of Implicit Bias, 59 Hastings L.J. 1389
(2008).
Fiske & Gilbert. SUSAN T. FISKE, DANIEL T. GILBERT & GARDNER LINDZEY, EDS.,
HANDBOOK OF SOCIAL PSYCHOLOGY 406‐8, 1090 (5th ed. 2010).
Fiske. SUSAN T. FISKE, SOCIAL BEINGS: CORE MOTIVES IN SOCIAL PSYCHOLOGY (2nd
ed. 2010).
Gawronski & Bodenhausen. Bertram Gawronski, Galen V. Bodenhausen &
Andrew P. Becker, I Like It, Because I Like Myself: Associative Self‐anchoring and
Post‐decisional Change of Implicit Evaluations, 43 J. EXPERIMENTAL SOC. PSYCHOL.
221 (2007).
Gawronski & Payne. BERTRAM GAWRONSKI & B. KEITH PAYNE, HANDBOOK OF
IMPLICIT SOCIAL COGNITION: MEASUREMENT, THEORY AND APPLICATION 87, 117
(2010).
Gawronski. Bertram Gawronski, Etienne P. Lebel & Kurt R. Peters, What Do
Implicit Measures Tell Us? Scrutinizing the Validity of Three Common Assumptions,
PERSPECTIVES ON PSYCHOL. SCI. 2‐2 (2007), available at
http://www.psych.utoronto.ca/users/spa/news/data/files/Gawronski_implicit_pe
rspectives_paper.pdf.
Gladwell 2007. MALCOLM GLADWELL, BLINK: THE POWER OF THINKING WITHOUT
THINKING (2007).
Goldin. Claudia Goldin & Cecilia Rouse, Orchestrating Impartiality: The Impact of
“Blind” Auditions on Female Musicians, 90 AM. ECON. REV. 715 (2000), available at
http://www.uh.edu/adkugler/Goldin&Rouse.pdf.
Greenstein. Marla Greenstein, Standing Column on Judicial Ethics: A Diverse Bench, A
Broader View, 48 JUDGES’ J. 41 (2009).
Greenwald AALS. Anthony G. Greenwald, PhD, Professor, University of
Washington, Department of Psychology, Presentation, Implicit Association Test
(IAT) in Legal Settings, AALS Annual Meeting (Jan. 6 2011).
Greenwald et al. 1998. Anthony G. Greenwald et al., Measuring Individual
Differences in Implicit Cognition: The Implicit Association Test, 74 J. PERSONALITY &
SOC. PSYCHOL. 1464 (1998).
Greenwald Poehlman. Anthony G. Greenwald, T. Andrew Poehlman, Eric Luis
Uhlmann & Mahzarin R. Banaji, Understanding and Using the Implicit Association
Test: III. Meta‐analysis of Predictive Validity, 97 J. PERSONALITY & SOC.
PSYCHOL. 17 (2009) (finding “incremental validity”).
Greenwald Psychology. Anthony G. Greenwald, PhD, Professor, University of
Washington, Department of Psychology, Presentation,
The Psychology of Blink ‐ Part 1 of 2: Understanding How Our Minds Work
Unconsciously,
http://www.youtube.com/watch?v=xRUs9Ni3Bv8&feature=related.
Guthrie Blinking. Chris Guthrie, Jeffrey J. Rachlinski & Andrew J. Wistrich,
Blinking on the Bench: How Judges Decide Cases, 93 CORNELL L. REV. 1 (2007).
Guthrie Hidden. Chris Guthrie, Jeffrey J. Rachlinski & Andrew J. Wistrich, The
ʺHidden Judiciaryʺ: An Empirical Examination of Executive Branch Justice, 58 DUKE
L.J. 1477 (2009).
Guthrie. Chris Guthrie et al., Inside the Judicial Mind, 86 CORNELL L. REV. 777
(2001).
Hamilton. DAVID L. HAMILTON, SOCIAL COGNITION: KEY READINGS IN SOCIAL
PSYCHOLOGY (2005).
Harvard. Men Still Rule on Harvard Walls, THE HARVARD CRIMSON, Feb. 27, 2002,
at 1.
Hoffman. Wilhelm Hofmann, Bertram Gawronski, Tobias Gschwendner,
Huy Le & Manfred Schmitt, A Meta‐Analysis on the Correlation Between the
Implicit Association Test and Explicit Self‐Report Measures, 31 PERSONALITY &
SOC. PSYCHOL. BULLETIN (2005), available at
http://faculty.washington.edu/agg/IATmaterials/PDFs/Hofmann%20&%20al%20(
PSPB,2005).pdf.
IAT. IAT, http://projectimplicit.net/nosek/iat/; Implicit Association Test
Information, http://www.projectimplicit.net/generalinfo.php.
Irwin. John F. Irwin & Daniel L. Real, Judicial Ethics and Accountability: At Home
and Abroad: Unconscious Influences on Judicial Decision‐Making: The Illusion of
Objectivity, 42 MCGEORGE L. REV. 1 (2010).
Isaac. Carol Isaac, Barbara Lee & Molly Carnes, Interventions That Affect Gender
Bias in Hiring: A Systematic Review, 84 ACAD. MED. 1440 (2009).
Jolls. Christine Jolls, Antidiscrimination Law’s Effects on Implicit Bias in MITU
GULATI & MICHAEL J. YELNOSKY, NYU SELECTED ESSAYS ON LABOR AND
EMPLOYMENT LAW, 3 BEHAV. ANALYSES OF WORKPLACE DISCRIMINATION 83
(2007).
Jost Beyond Reasonable Doubt. John T. Jost, Laurie A. Rudman, Irene V. Blair,
Dana R. Carney, Nilanjana Dasgupta, Jack Glaser & Curtis D. Hardin, The
Existence of Implicit Bias Is Beyond Reasonable Doubt: A Refutation Of Ideological and
Methodological Objections and Executive Summary of Ten Studies that No Manager
Should Ignore, 29 RES. IN ORGANIZATIONAL BEHAV. 39 (2009), available at
http://www.psych.nyu.edu/jost/Jost,%20Rudman,%20Blair,%20Carney,%20Dasg
upta,%20Claser,%20&%20Hardin%20%282009%29.PDF.
Jost. John T. Jost, Mahzarin R. Banaji & Brian A. Nosek, A Decade of System
Justification Theory: Accumulated Evidence of Conscious and Unconscious Bolstering of
the Status Quo, 25 POLITICAL PSYCHOLOGY 881–919 (2004).
Juvenile. Focal Point Summer 1994, Vol. 8 No. 2.
Kang & Banaji. Jerry Kang & Mahzarin Banaji, Fair Measures: A Behavioral Realist
Revision of Affirmative Action, 94 CALIF. L. REV. 1063 (2006).
Kang & Lane. Jerry Kang & Kristine Lane, Seeing Through Colorblindness: Implicit
Bias and the Law, 58 UCLA L. REV. 465 (2010) (summarizing empirical evidence
on “colorblindness” and calling on the law to take a “behavioral realist” account
of these findings).
Kang Colorblind. Jerry Kang, Nilanjana Dasgupta, Kumar Yogeeswaran & Gary
Blasi, Are Ideal Litigators White? Measuring the Myth of Colorblindness, 7 J.
EMPIRICAL LEGAL STUDIES 886 (2010).
Kang Primer. JERRY KANG, NAT’L CTR. FOR STATE COURTS, IMPLICIT BIAS: A
PRIMER FOR COURTS (2009) (hereinafter “KANG PRIMER”),
http://wp.jerrykang.net.s110363.gridserver.com/wp‐
content/uploads/2010/10/kang‐Implicit‐Bias‐Primer‐for‐courts‐09.pdf; Project
Implicit, Frequently Asked Questions,
https://implicit.harvard.edu/implicit/demo/background/faqs.html#faq2.
Lane. Kristin A. Lane, Mahzarin R. Banaji, Brian A. Nosek, and Anthony G.
Greenwald, Understanding and Using the Implicit Association Test: IV. What We
Know (So Far), in BERND WITTENBRINK & NORBERT S. SCHWARZ (EDS.), IMPLICIT
MEASURES OF ATTITUDES: PROCEDURES (2007).
Levinson & Young. Justin D. Levinson & Danielle Young, Different Shades of Bias:
Skin Tone, Implicit Racial Bias, and Judgments of Ambiguous Evidence, 112 W. VA. L.
REV. 307, 316‐23 (2010) (“Participants who implicitly associated Black and Guilty
were more likely to make harsher judgments of ambiguous evidence.”).
Levinson et al. Justin D. Levinson, Huajian Cai & Danielle Young, Guilty by
Implicit Racial Bias: The Guilty/Not Guilty Implicit Association Test, 8 OHIO ST. J.
CRIM. L. 187 (2010).
Levinson Media. Justin D. Levinson, Media, Race and the Complicitous Mind, 58 DEPAUL L.
REV. 599 (2009).
Levinson. Justin D. Levinson, Forgotten Racial Equality: Implicit Bias,
Decisionmaking, and Misremembering, 57 DUKE L. J. 345 (2007).
Levit. Nancy Levit, Confronting Conventional Thinking: The Heuristics Problem in
Feminist Legal Theory, 28 CARDOZO L. REV. 391, 394 (2006).
Levy. SHERI LEVY & MELANIE KILLEN, EDS., INTERGROUP ATTITUDES AND
RELATIONS IN CHILDHOOD THROUGH ADULTHOOD (2008).
Library. America’s Story, America’s Library, Brigham Young Settled in the Great Salt Lake Valley, http://www.americaslibrary.gov/jb/reform/jb_reform_mormon_1_e.html.
LSAC. Charles E. Daye, Abigail T. Panter, Walter R. Allen & Linda F. Wightman,
The Educational Diversity Project: Analysis of Longitudinal and Concurrent Student
and Faculty Data, LSAC GRANTS REPORT SERIES (2010), available at
http://www.lsac.org/LsacResources/Research/GR/GR‐10‐01.pdf.
Luzadis. Rebecca Luzadis, Mark Wesolowski & B. Kay Snavely, Understanding
Criterion Choice in Hiring Decisions from a Prescriptive Gender Bias Perspective, J.
MANAGERIAL ISSUES (Winter 2008), available at
http://www.entrepreneur.com/tradejournals/article/193141029.html.
MacKenzie. Wayne MacKenzie, Testimony, Racial Disparities in the Criminal
Justice System (2009),
http://judiciary.house.gov/hearings/pdf/McKenzie091029.pdf.
Marsh. Shawn C. Marsh, The Lens of Implicit Bias, Juvenile and Family Justice
Today 17‐19 (Summer 2009), available at
http://www.ncsconline.org/D_Research/ref/IMPLICIT%20BIAS%20Marsh%20Su
mmer%202009.pdf.
Miller 1984. Norman Miller & Marilynn B. Brewer, The Social Psychology of
Desegregation: An Introduction, in NORMAN MILLER & MARILYNN B. BREWER EDS.,
CONTACT: THE PSYCHOLOGY OF DESEGREGATION 1, 2 (1984).
Mills. LINDA G. MILLS, A PENCHANT FOR PREJUDICE: UNRAVELING BIAS IN
JUDICIAL DECISION MAKING (1999).
Mustard. David B. Mustard, Racial, Ethnic and Gender Disparities in Sentencing:
Evidence from the US Federal Courts, 44 J.L. & ECON. 285, 285 (2001).
National Center. National Center for Women & Information Technology,
Avoiding Unintended Gender Bias in Letters of Recommendation (Case Study 1)
available at
http://www.ncwit.org/images/practicefiles/AvoidingUnintendedGenderBiasLette
rsRecommendation.pdf.
Nisbett. Richard E. Nisbett & Timothy Decamp Wilson, Telling More than We Can
Know: Verbal Reports on Mental Processes, 84 PSYCHOL. REV. 231 (1977) (people may
not know, may not report, and, when they change their minds, may
“misremember” their earlier attitudes), available at
http://depts.washington.edu/methods/readings/Nisbett_1977.pdf.
Nosek. Brian A. Nosek et al., Pervasiveness and Correlates of Implicit Attitudes and
Stereotypes, 18 EUROPEAN REV. SOC. PSYCHOL. 36 (2007), available at
http://projectimplicit.net/nosek/stimuli/.
Parks. GREGORY PARKS, SHANE JONES & JONATHAN CARDI, EDS., CRITICAL RACE
REALISM: INTERSECTIONS OF PSYCHOLOGY, RACE, AND LAW (2008).
Perdue. Charles W. Perdue et al., “Us” and “Them”: Social Categorization and the
Process of Intergroup Bias, 59 J. PERSONALITY & SOC. PSYCHOL. 475, 478‐79, 482‐84
(1990).
Pettigrew. Thomas F. Pettigrew & Linda R. Tropp, A Meta‐Analytic Test of
Intergroup Contact Theory, 90 J. PERSONALITY & SOC. PSYCHOL. 751 (2006).
Plant. E. Ashby Plant & Michelle Peruche, The Consequences of Race for Police
Officers’ Responses to Criminal Suspects, 16 PSYCHOL. SCI. 180–183 (2005).
Plous. SCOTT PLOUS, THE PSYCHOLOGY OF JUDGMENT AND DECISION MAKING 239
(1993).
Porter. Natalie Porter & Florence L. Geis, Women and Nonverbal Leadership Cues:
When Seeing is Not Believing, in CLARA MAYO & NANCY M. HENLEY, EDS., GENDER
AND NONVERBAL BEHAVIORS 40 (1981).
Rachlinski 2006. Jeffrey J. Rachlinski, Chris Guthrie & Andrew Wistrich, Inside the
Bankruptcy Judge’s Mind, 88 B.U. L. REV. 1227 (2006).
Rachlinski 2007. Jeffrey J. Rachlinski, Heuristics, Biases, and Philosophy, 43
TULSA L. REV. 865 (2007).
Rachlinski 2008. Jeffrey J. Rachlinski, Sheri Lynn Johnson, Andrew J. Wistrich &
Chris Guthrie, Does Unconscious Racial Bias Affect Trial Judges?, 84 NOTRE DAME L.
REV. 1195 (2008‐2009).
Ramirez. Mary Kreiner Ramirez, Into the Twilight Zone: Informing Judicial
Discretion in Federal Sentencing, 57 DRAKE L. REV. 592 (2009).
Redfield. SARAH E. REDFIELD, DIVERSITY REALIZED: PUTTING THE WALK WITH THE
TALK FOR DIVERSITY IN THE PIPELINE TO THE LEGAL PROFESSION (2009).
Redfield2. Sarah E. Redfield, Hispanics and the Pipeline to the Legal Profession: aka
Lawyers Don’t Do Math (publication forthcoming HISPANIC BAR JOURNAL 2011, on
file with author).
Rudman. Laurie A. Rudman, Sources of Implicit Attitudes, 13 CURRENT DIRECTIONS
IN PSYCHOL. SCI. 79 (2004).
Sandler. BERNICE SANDLER, ASSOCIATION OF AMERICAN COLLEGES, THE CAMPUS
CLIMATE REVISITED: CHILLY FOR WOMEN FACULTY, ADMINISTRATORS, AND
GRADUATE STUDENTS (1986).
Secunda. Paul M. Secunda, Cultural Cognition at Work, 38 FLA. ST. U.L. REV. 107,
109 (2010) (discussing cultural impact in judicial decision making).
Shepherd. Hana Shepherd, The Cultural Context of Cognition: What the Implicit
Association Test Tells Us About How Culture Works, 26 SOCIOLOGICAL FORUM 121
(2011), available at http://onlinelibrary.wiley.com/doi/10.1111/j.1573-7861.2010.01227.x/pdf.
Simons. Daniel J. Simons, Selective Attention Test (1999), available at
http://viscog.beckman.illinois.edu/flashmovie/15.php.
Smith. Alison C. Smith & Edith Greene, Conduct and Its Consequences: Attempts at
Debiasing Jury Judgments, 29 LAW AND HUMAN BEHAVIOR 505 (2005), available at
http://0-www.jstor.org.cardcatalog.law.unh.edu/stable/pdfplus/4499437.pdf?acceptTC=true.
Steele & Aronson. Claude M. Steele & Joshua Aronson, Stereotype Threat and the
Intellectual Test Performance of African Americans, 69 J. PERSONALITY & SOC.
PSYCHOL. 797, 808 (1995).
Steele 2010. CLAUDE M. STEELE, WHISTLING VIVALDI AND OTHER CLUES TO HOW
STEREOTYPES AFFECT US (2010) (describing source of title as a black graduate
student who becomes non‐threatening when he whistles Vivaldi).
Stroop. Interactive Stroop Effect Experiment, available at
http://faculty.washington.edu/chudler/java/ready.html.
Tajfel. Henri Tajfel, Experiments in Intergroup Discrimination, 223 SCI. AM. 96
(1970).
Tetlock. Philip Tetlock & Gregory Mitchell, Calibrating Prejudice in Milliseconds, 71
SOC. PSYCHOL. Q. 12 (2008).
Tetlock2. Philip Tetlock & Gregory Mitchell, Implicit Bias and Accountability
Systems: What Must Organizations Do to Prevent Discrimination?, 29 RES. IN
ORGANIZATIONAL BEHAV. 3 (2009).
Trix. Frances Trix & Carolyn Psenka, Exploring the Color of Glass: Letters of
Recommendation for Female and Male Medical Faculty, 14 DISCOURSE & SOCIETY
191 (2003).
Turner. Rhiannon N. Turner & Richard J. Crisp, Imagining Intergroup Contact Reduces
Implicit Prejudice, 49 BRIT. J. PSYCHOL. 120 (2010), available at
http://www.leeds.ac.uk/lihs/psychiatry/courses/dclin/cpd/past_events/positive_psychology/intergroup.pdf.
Uhlmann. Eric Luis Uhlmann & Geoffrey L. Cohen, Constructed Criteria:
Redefining Merit to Justify Discrimination, 16 PSYCHOL. SCI. 474 (2005).
VA. Univ. of Virginia Portraits of Law School Deans Focus Attention on Need for
Diversity, JET, Mar. 3, 1997, at 32.
Valian Beyond. Virginia Valian, Beyond Gender Schemas: Improving the
Advancement of Women in Academia, 20 HYPATIA no. 3 (2005).
Valian Why So Slow. VIRGINIA VALIAN, WHY SO SLOW? THE ADVANCEMENT OF
WOMEN (1999).
Vedantam Hidden Brain. SHANKAR VEDANTAM, THE HIDDEN BRAIN (2010).
Vedantam See No Bias. Shankar Vedantam, See No Bias, WASH. POST, Jan. 23,
2005 (Magazine), at 12.
Walters. Alyssa M. Walters et al., Stereotype Threat, the Test‐Center Environment,
and Performance on the GRE General Test, ETS Research Report 34‐36 (December
2004), available at http://ftp.ets.org/pub/gre/gre‐01‐03R.pdf (not finding support
for stereotype threat in testing situations).
Weber. Elke U. Weber & Eric J. Johnson, Mindful Judgment and Decision Making,
60 ANNUAL REV. PSYCHOL. 53 (2009).
Weisbuch. Max Weisbuch & Nalini Ambady, Unspoken Cultural Influence: Exposure to and
Influence of Nonverbal Bias, 96 J. PERSONALITY & SOC. PSYCHOL. 1104 (2009), available at
http://ase.tufts.edu/psychology/ambady/pubs/2009WeisbuchJPSP.pdf.
Wellman. Justin A. Wellman, Alexander M. Czopp & Andrew L. Geers, The
Egalitarian Optimist and the Confrontation of Prejudice (2009) available at
http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2844083/.
Ziegert. Jonathan C. Ziegert & Paul J. Hanges, Employment Discrimination: The
Role of Implicit Attitudes, Motivation, and a Climate for Racial Bias, 90 J. APPLIED
PSYCHOL. 553, 556 (2005).