The Curse of Knowledge 1
Running Head: THE CURSE OF KNOWLEDGE
The Curse of Knowledge in Reasoning about False Beliefs
Susan A. J. Birch
University of British Columbia
and
Paul Bloom
Yale University
Word Count: 5743
Keywords: False Belief; Theory of Mind; Egocentrism; Hindsight Bias; Curse of
knowledge; Perspective-taking; Reasoning
Correspondence can be addressed by mail to: Susan Birch, Department of Psychology,
University of British Columbia, 2136 West Mall, Vancouver, BC, Canada, V6T 1Z4; by
e-mail to: [email protected], or by phone (604) 822-3994.
Abstract
Young children have problems reasoning about false beliefs. We propose that these problems
arise, at least in part, because children possess an exaggerated version of the same bias that exists
in adults: the 'curse of knowledge', a tendency to be swayed by one's own knowledge
when trying to appreciate a more naïve knowledge state. If the 'curse of knowledge' can interfere
with false belief reasoning, then adults should also be biased by their own knowledge when
reasoning about false beliefs, if tested using sufficiently sensitive measures. This prediction was
supported in two experiments using a modified false belief task. We also found that adults'
privileged knowledge only led to biased false belief reasoning in the presence of a perceived
justification for their response (Experiment 1) and that the curse of knowledge bias cannot be
explained by salience alone (Experiment 2). This research supports the more general claim that
there is a fundamental similarity in children's and adults' abilities, and limitations, in reasoning
about the mind.
Word Count: 168
The Curse of Knowledge in Reasoning about False Beliefs
Over the last two decades, a wealth of research has explored the development of false
belief reasoning in young children (see Wellman, Cross, & Watson, 2001 for a review). As
Dennett (1978) pointed out, it is possible to correctly predict the action of another individual
without any understanding of mental states, by simply observing the true state of the world—as
long as the other individual’s belief of the world is consistent with reality. A more valid test of
whether someone appreciates mental states involves predicting actions when the other individual’s
mental state is inconsistent with reality—thus, the interest in children’s ability to reason about
false beliefs.
Most research on false belief reasoning has utilized some variant of the ‘displacement task’
(e.g. Wimmer & Perner, 1983; Baron-Cohen, Leslie, & Frith, 1985). For example, participants are
told a story about Sally who puts her candy in a basket and leaves. In her absence, another
character moves the candy to a box. When Sally returns, where will she look for her candy? The
right answer, the basket, requires attributing a false belief to Sally. Another method is the
‘Smarties task’, developed by Perner, Leekam, and Wimmer (1987). Children are shown a closed
Smarties container and asked what is inside. When they give the answer, “Smarties”, the
experimenter opens the container to reveal that there are actually pencils inside. The box is then
closed and the child is asked what someone else, who was absent when the pencils were revealed,
will think is inside. The child is also asked what he or she originally said was inside. Again,
getting these questions right requires attributing a false belief, either to oneself or to someone else.
Four-year-olds tend to do fairly well at these tasks, but younger children tend to fail (see
Wellman et al.'s 2001 meta-analysis of 178 studies). The younger children do not answer
randomly; instead, they tend to answer in accord with their own knowledge, saying that Sally will
think the candy is in the box, that the other person will think that there are pencils in the container,
and that they themselves had originally said the container held pencils.
The source of this difficulty is a matter of considerable debate. Some researchers interpret
children’s difficulties on these tasks as reflecting a qualitative and conceptual difference in the
way young children understand the mind compared to older children and adults. For instance,
young children are sometimes said to lack a concept of belief or a concept of mental representation
more generally (e.g. Gopnik, 1993; Perner et al., 1987; Perner, 1991; Wellman, 1990; Wellman et
al., 2001; Wimmer, Hogrefe, & Sodian, 1988). An alternative view is that young children's
problems in reasoning about false beliefs might be due to more general factors such as memory
load, assumptions of rationality, conflicting representations, and processing limitations, among others,
and thus not necessarily indicative of a different concept of belief (e.g. Fodor, 1992; Leslie, 1987;
Friedman & Leslie, 2004; German & Leslie, 2000; Koos, Gergely, Csibra, & Biro, 1997; Moses,
1993; Roth & Leslie, 1998; Zaitchik, 1990; 1991; for a discussion, see Bloom & German, 2000).
We propose that children are more susceptible to a cognitive bias found in adults in which
one overextends one’s current knowledge state when trying to appreciate a more naïve state. As a
result of younger children’s greater susceptibility to this bias, they experience greater difficulties
with false belief reasoning when they themselves have specific knowledge that the other person
does not share. That is, we propose that children’s difficulties with false belief reasoning might be
accounted for, at least in part, by an exaggerated bias referred to in the adult literature as the ‘curse
of knowledge’ (Camerer, Weber, & Loewenstein, 1989). This proposal is not inconsistent with the
notion of some conceptual change across development in understanding mental states. However, it
is most consistent with the views emphasizing limitations in general processing factors because it
points to the commonalities between children's and adults' abilities to reason about mental states
(see also Birch, 2005; Birch & Bloom, 2003, 2004; Bernstein, Atance, Loftus, & Meltzoff, 2004;
Epley, Morewedge, & Keysar, 2004; Royzman, Cassidy, & Baron, 2003; Keysar, Lin, & Barr,
2003; Mitchell & Taylor, 1999; Taylor & Mitchell, 1997).
Economists and social psychologists have pointed out that adults suffer from the curse of
knowledge (Camerer et al., 1989): a tendency to be biased by one's own knowledge when
attempting to take a more naïve perspective. This bias has received different names in different
disciplines, such as 'hindsight bias' or 'creeping determinism' (e.g. Fischhoff, 1975), the
'knew-it-all-along effect' (e.g. Taylor, Esbenson, & Bennett, 1994), 'realist bias' (e.g. Taylor &
Mitchell, 1997), 'adult egocentrism' (e.g. Kelley & Jacoby, 1996), and 'epistemic egocentrism'
(e.g. Royzman et al., 2003), but the phenomenon is the same—a tendency to overextend one's
current knowledge state when attempting to appreciate a more naïve state (whether that naïve state
is someone else’s or an earlier state of one’s own). Once we know the solution to a problem, we
tend to overestimate how easy it is for someone else to solve (Kelley & Jacoby, 1996). Similarly, if
we know a company's earnings (Camerer et al., 1989), the meaning of an idiom (Keysar & Bly,
1995), or whether a statement is sarcastic (Keysar, 1994), our assessments of the judgments of a
naïve person are biased in the direction of what we know (see also Blank, Fischer, & Erdfelder,
2003; Fischhoff, 1975; Heine & Lehman, 1996; Keysar, Ginzel, & Bazerman, 1995; Nickerson,
1999; Sanna & Schwarz, 2003; see Hawkins & Hastie, 1990 for a review).
Not surprisingly, children are also susceptible to this bias (Bernstein et al., 2004; Birch &
Bloom, 2003; Taylor, Esbenson, & Bennett, 1994; Pohl & Haracic, 2005; Mitchell & Taylor,
1999). In Birch & Bloom (2003), 3- to 5-year-old children were presented with novel toys and
were told that each contained “something special” inside. One toy was presented as familiar to the
experimenter’s puppet friend Percy (i.e., he said: “I’ve played with that one before”). The other
was presented as unfamiliar to Percy (i.e., he said: “I’ve never ever seen that one before”). The key
manipulation was whether the children themselves knew what was inside the toys. For half of the
trials, the children were ‘cursed’ by showing them what was inside the toys prior to Percy’s
appearance. For the other half of the trials, the children were not shown what was inside the toys.
The results revealed a significant curse of knowledge bias: When the children themselves knew
what was inside, they were more likely to over-attribute knowledge to Percy. The opposite was
not true, however: when the children were ignorant about what was inside, they were not more
likely to under-attribute knowledge to Percy. That is, there is not an equivalent 'curse of
ignorance’, suggesting that the results cannot be accounted for by a more general egocentric bias
(see Birch, 2005 and Birch & Bloom, 2003; 2004 for discussions).
Applying the same logic to false belief tasks, it should be particularly hard to appreciate
that someone has a different perspective than your own (e.g., they hold a false belief) when you
possess specific knowledge of the actual state of affairs. Traditional false belief tasks are
‘cursed’— participants know the outcome and are asked to judge how a more naïve other (or
themselves in a previously naïve state) would think or act. Indeed, there are many parallels
between curse-of-knowledge studies with adults and the different kinds of false belief tasks given
to children.
First, Fischhoff (1975) provided adult participants with descriptions of events and told
them that these descriptions were also presented to other students. Participants who were told the
outcome of the events estimated that naïve students would assign higher probabilities to that
outcome than participants who were ignorant of the outcome. In other words, like children in the
displacement task who responded as if naïve Sally shared their outcome-knowledge, adults in
Fischhoff’s (1975) task who were told the event outcome also responded as if naïve others shared
their knowledge, though notably to a lesser magnitude.
Second, Fischhoff and Beyth (1975) asked participants on the eve of former President
Nixon’s trips to China and the USSR to estimate the probability of various possible outcomes of
the visits. After the trip’s completion, the same participants were asked to recall their original
predictions. Like children in the Smarties task who tended to report their new knowledge of what
was in the box instead of their original predictions, participants’ recollections were biased in the
direction of their knowledge of what had actually taken place. Again, adults and children
experience similar limitations but to differing degrees.
Note finally that Birch & Bloom (2003) found that the magnitude of the bias significantly
declined from age 3 to age 5—the same developmental period in which shifts occur in children's
false belief performance. Thus, any difficulty that adults have in appreciating false beliefs when
they are knowledgeable would likely be exaggerated in younger children.
The studies below explored these parallels in a more direct fashion and tested whether
adults, who undoubtedly are able to represent and conceive of false beliefs, will find reasoning
about false beliefs harder when they have specific knowledge about the outcome than when they
do not have specific knowledge. We gave adults a displacement task that differed from standard
tasks in three important ways. First, we used a more sensitive measure than the categorical
response that is typically obtained with children. Participants were asked to report the probability
that the protagonist would look in each of the containers when she returned. Second, instead of two
containers, we used four, to enable us to manipulate the participants’ knowledge of the outcome.
Either participants were told that the target object (a violin) was moved to a specific container
(the 'Knowledgeable' conditions), or they were told that the violin was moved to another
container but not which one (the 'Ignorant' condition). We predicted that knowledge of the outcome in a
false belief task like that used with children would compromise even adults’ ability to reason about
false beliefs.
Third, the containers were rearranged following displacement to allow us to manipulate the
plausibility that the protagonist would look in each of the containers. This manipulation was
included because previous research has demonstrated that plausibility can mediate the magnitude
of the curse of knowledge (see Pohl, 1998). For instance, adults are more biased in the direction of
the outcome that they know about when the outcome is brought about because of a plausibly
foreseeable reason. Wasserman, Lempert, and Hastie (1990) told subjects that the British–Gurkha
war was won by the British because of the superior discipline of the troops (i.e. what we will call
‘plausibly foreseeable’), whereas other subjects were given the same outcome, but were told that a
sudden unseasonal rainstorm was responsible for the victory (i.e. not likely foreseeable). The
magnitude of the hindsight bias (i.e. subjects’ estimates of the probability they would have
assigned to the outcome had they not heard the report on the British victory) was greatest in the
plausibly foreseeable condition. We believe the reason for the heightened magnitude in the
plausibly foreseeable condition is that it allows subjects to perceive that they could have
predicted the answer without knowing it, even if the actual probability of their predicting it is not
increased. That is, it allows subjects to justify or rationalize their biased predictions.
Here, we manipulated the perceived plausibility of the outcome by creating a
knowledgeable condition in which the violin was moved to a location that would allow subjects to
generate a plausible explanation for why the protagonist would look there for her violin. In a
‘Knowledge – Plausible’ condition, participants were told that the violin was moved to a container
that, following rearrangement, was in the same physical location (different container) as the violin
was originally—thus, it would be reasonable to assume that the character might look there by
mistake. In a ‘Knowledge – Implausible’ condition, participants were told that the violin was
moved to a container that following rearrangement was in a different location than where the violin
was originally. A ‘True Belief’ condition in which the violin was not moved was also included.
We predicted that the curse of knowledge would be stronger for adults if subjects could conceive
of a plausible rationale for their biased response.
Experiment 1
Method
Participants. Two hundred ten students enrolled in an Introductory Psychology course at
Yale University in New Haven, CT participated. Subjects completed the single-page questionnaire
as part of a larger packet of questionnaires and received course credit for their participation.1
Ninety-three were male, 112 were female, and 5 opted not to specify their gender. An additional 17
participants (7.5%) were excluded because their percentage totals did not equal 100%.
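For readers who wish to reproduce this screening step, the exclusion criterion amounts to a simple sum-to-100 check on each questionnaire. The following is an illustrative sketch, not code from the original study; the function name and example ratings are hypothetical.

```python
# Illustrative sketch of the exclusion criterion (hypothetical, not from the
# original study): a questionnaire is retained only if its four probability
# judgments total exactly 100%.

def totals_to_100(judgments):
    """Return True if the percentage judgments sum to 100."""
    return sum(judgments) == 100

# Hypothetical ratings for the blue, red, purple, and green containers:
kept = totals_to_100([68, 28, 1, 3])       # sums to 100 -> retained
excluded = totals_to_100([70, 30, 5, 5])   # sums to 110 -> excluded
```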
Materials and Procedure. Participants were randomly assigned to one of four conditions:
True Belief, Ignorant, Knowledge – Plausible, and Knowledge – Implausible. All participants
received the same stimuli—a color version of Figure 1 (without the color word labels) depicting a
girl holding a violin by a sofa and four containers. Each container was a different color: blue,
purple, red, and green. Beneath the first picture was an image of a different girl holding a violin
and the same four containers rearranged: red, green, purple, blue.
Figure 1: The Knowledge – Plausible Version of the Stimuli
1 Subjects had a limited amount of time to complete the package of questionnaires in both Experiments 1 and 2. It is an open question whether the same results reported here would obtain without time pressure.
Participants in all four conditions read, “This is Vicki. She finishes playing her violin and puts it in
the blue container. Then she goes outside to play. While Vicki is outside playing, her sister,
Denise…” At this point, the conditions differed:
True Belief: “comes into the room.”
Ignorant: “moves the violin to another container.”
Knowledge – Plausible: “moves the violin to the red container.”
Knowledge – Implausible: “moves the violin to the purple container.”
All participants then read, “Then, Denise rearranges the containers in the room until it
looks like the picture below.” This was followed by, “When Vicki returns she wants to play her
violin. What are the chances Vicki will first look for her violin in each of the above containers?
Write your answers in percentages in the spaces provided under each container.”
Results and Discussion
In the True Belief condition, participants gave a mean probability rating of 68% to the blue
container where Vicki placed the violin and a mean probability rating of 28% to the red container
(which, following rearrangement, occupied the location that the blue container previously occupied).
That is, most of the time they thought Vicki would look in the original container, but they often
thought she would look in a different container that was in the same location as the violin was
originally. They gave the remaining two containers (neither the one that held the violin nor the
one in its original location) a combined probability rating of just over 3%. Thus, this True Belief
condition serves as a baseline of how plausible subjects think it is for Vicki to look in each of the 4
containers after they have been rearranged. Subjects judged that it was quite plausible for her to
look in the red container and quite implausible for her to look in the purple container. See Table 1.
Table 1
Mean Probability Judgments that Vicki Will Look in Each of the Containers (SDs in parentheses)

Condition                                                    Blue        Red         Purple     Green
True Belief (violin “not moved”)                             68% (26%)   28% (24%)   1% (2%)    2% (6%)
Ignorant (violin moved to “another container”)               71% (26%)   23% (22%)   2% (5%)    3% (7%)
Knowledge – Plausible (moved to “the red container”)         59% (27%)   34% (25%)   3% (5%)    4% (7%)
Knowledge – Implausible (moved to “the purple container”)    73% (29%)   19% (21%)   6% (16%)   3% (5%)

Note. Blue = the container where the violin was originally; Red = the container that now occupies
the location where the violin was originally; Purple and Green = containers that now occupy
different locations than where the violin was originally.

In the Ignorant condition, probability ratings that Vicki would look for her violin in the
blue container, where she left it, did not differ from those in the True Belief condition, t (104) =
-0.59, p = .56, NS (two-tailed). That is, participants who knew that the violin was moved but did
not know where it was moved, predicted that it was as likely that Vicki would act based on a false
belief, as participants who knew the violin had never been moved.
In contrast, consider the Knowledge – Plausible condition in which participants were told
that the violin was moved to the red container, a container in a location that Vicki might plausibly
look. Here, participants reported significantly higher probabilities for the red container than
participants in the Ignorant condition, t (105) = -2.42, p < .05 (one-tailed). In addition,
participants in the Knowledge – Plausible condition reported significantly lower probabilities that
Vicki would look in the blue container than participants in the Ignorant condition, t (105) = 2.35,
p < .05 (one-tailed). In other words, participants' own knowledge of the outcome interfered with
their ability to reason about another's false belief.
However, in the Knowledge – Implausible condition, in which participants knew that the
violin was moved to the purple container, an improbable container for Vicki to look in, their
probability judgments that Vicki would look in that container were not significantly higher than in
the Ignorant condition, t (97) = -1.44, p = .154, NS (two-tailed). Also, participants' knowledge
that the violin was in the purple container did not lead them to decrease their probability judgments
that Vicki would possess, or act on, a false belief. In other words, participants reported similar
probabilities for the blue container in both the Knowledge – Implausible and Ignorant conditions, t
(97) = -0.21, p = .42, NS (one-tailed).2
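For readers who want to check how comparisons like these are computed from the summary statistics in Table 1, a pooled-variance two-sample t statistic suffices. The sketch below is illustrative, not the authors' analysis code, and the per-condition sample sizes are assumptions: the reported df of 105 implies 107 participants across the two conditions, but the exact split is not stated in the text.

```python
import math

def pooled_t(m1, s1, n1, m2, s2, n2):
    """Two-sample t statistic with pooled variance (equal-variance assumption)."""
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / df
    se = math.sqrt(pooled_var * (1 / n1 + 1 / n2))  # SE of the mean difference
    return (m1 - m2) / se, df

# Red-container ratings, Knowledge - Plausible vs. Ignorant (means and SDs
# from Table 1; the 53/54 split of the 107 participants is an assumption).
t, df = pooled_t(34, 25, 53, 23, 22, 54)  # t is roughly 2.4 with df = 105
```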
To sum up, our findings demonstrate that a person's own knowledge can interfere with
his or her ability to reason about another's false belief. However, in the false belief task we explored,
the participants’ knowledge only led to biased responses when there was a seemingly justifiable
reason to support such a biased response. Note that the fact the violin occupied the same location in
space (different container) as it originally had was not sufficient to lead to these biased responses
(i.e. this was true in all conditions). Rather, it was only when the subjects themselves knew that the
violin was in this location that they increased their predictions that Vicki would look there.
Experiment 2
What role does salience play in the previous findings? Does the participant have to believe
that the violin is in a specific container, or is merely drawing attention to that container enough to
increase participants' probability judgments for that container? One could argue that simply by
mentioning the red container, and thus drawing attention to it, participants would be more likely to
assign higher probabilities to that container. Alternatively, it is not the salience of the red
container per se that matters but the knowledge that the red container is where the object is hidden.
We predicted that merely mentioning the red container would not be sufficient to significantly
increase participants’ assessments of the likelihood Vicki would look there, nor would it
significantly bias their false belief reasoning.
Method
Participants. One hundred ten students enrolled in an Introductory Psychology course at
Yale University in New Haven, CT participated. Subjects completed the single-page questionnaire
as part of a larger packet of questionnaires and received course credit for their participation. Fifty
were male, 59 were female, and 1 person opted not to specify his or her gender. An additional 12
participants (9.8%) were excluded because their percentage totals did not equal 100%.
Materials and Procedure. Participants were given the same pictorial representation of the
displacement task utilized in Experiment 1. Participants were randomly assigned to one of three
conditions: Knowledge (Plausible), Ignorant, and Salience. The task was exactly the same as in
Experiment 1 except after reading, “This is Vicki. She finishes playing her violin and puts it in the
blue container. Then she goes outside to play. While Vicki is outside playing, her sister,
Denise…” the conditions differed as follows:
Knowledge (Plausible): “moves the violin to another container—the red one.”
Ignorant: “moves the violin to another container.”
Salience: “moves the violin to another container—not the red one.”
2 There was a significant difference between the ratings for the purple container in the True Belief and Ignorant conditions, t (104) = -2.103, p < .05 (two-tailed). We did not predict this finding, nor do we have a good explanation for it, but it is inconsequential to the main findings.
Results and Discussion
Replicating our main finding from Experiment 1, we found that participants in the
Knowledge condition reported significantly higher probabilities that Vicki would look in the
container that they knew held the violin, compared to participants in the Ignorant condition who
did not have specific knowledge about the violin’s whereabouts, t (68) = -2.52, p < .05
(one-tailed). These ‘cursed’ participants also reported significantly lower probabilities that Vicki
would look in the container she should falsely believe held her violin, compared to participants in
the Ignorant condition, t (68) = 2.13, p < .05 (one-tailed). This reproduces our finding that a
person's own knowledge can interfere with his or her ability to appreciate someone else's false
belief, even on a seemingly simple task like that used with children. See Table 2.
Table 2
Mean Probability Judgments that Vicki Will Look in Each of the Containers (SDs in parentheses)

Condition                                                         Blue        Red         Purple    Green
Knowledge (Plausible) (violin moved to “the red container”)       60% (26%)   34% (24%)   3% (6%)   3% (6%)
Ignorant (violin moved to “another container”)                    74% (28%)   19% (22%)   3% (5%)   3% (5%)
Salience (violin moved to “another container—not the red one”)    68% (24%)   25% (22%)   4% (4%)   4% (5%)

Note. Blue = the container where the violin was originally; Red = the container that now occupies
the location where the violin was originally; Purple and Green = containers that now occupy
different locations than where the violin was originally.
The main question of interest in Experiment 2 concerns the Salience condition.
Participants' probability judgments that Vicki would look in the blue container in the Salience
condition were not significantly different from those in either the Knowledge condition, t (77) =
-1.30, p = .10, NS (one-tailed), or the Ignorant condition, t (69) = .30, NS (two-tailed). However, participants in the
Salience condition reported significantly lower probabilities that Vicki would look in the red
container, than participants in the Knowledge condition, t (77) = 1.67, p < .05 (one-tailed).
Furthermore, participants in the Salience condition did not report significantly higher probabilities
that Vicki would look in the red container, compared to participants in the Ignorant condition, t
(69) = -1.05, p = .30, NS (two-tailed). These data suggest that salience cannot account for the bias
in adults’ false belief reasoning in its entirety.
General Discussion
Our findings reveal that a person’s own knowledge can contaminate his or her ability to
reason about false beliefs. In Experiment 1, we demonstrated that the subjects’ own knowledge
biased their ability to predict another’s behavior based on her false belief and that this bias is
affected by the plausibility of the outcome. When subjects knew exactly where Vicki’s violin was
moved, they significantly reduced their predictions that she would act on a false belief, compared
to subjects who did not know where the violin was moved. However, the ease with which subjects
could generate a plausible rationale for their (biased) response served to mediate the potency of the
curse of knowledge. That is, subjects’ knowledge of the outcome proved only to interfere with
their false belief reasoning if the outcome allowed for a plausible justification for their biased
responses. If a potential explanation was available to subjects for why Vicki might act in accord
with the subjects' own knowledge, rather than her own belief, subjects were biased in their false belief
reasoning. Note that this potential explanation that subjects could use to justify their responses
(i.e. that the violin was in the same location in space as it was before) was true in all conditions.
Thus, it was the combination of this fact with the subjects’ own knowledge of the violin’s true
location that led to their biased responses.
In Experiment 2, we replicated the finding that adults’ ability to attribute false beliefs can
be biased by their own knowledge. Adults who knew exactly where Vicki’s violin was moved
reported significantly lower probabilities that Vicki would act according to a false belief, than
adults who did not possess such knowledge. Experiment 2 also demonstrated that simply drawing
attention to the container is not sufficient to increase participants’ probability judgments to that
container or to interfere with their false belief reasoning. Salience may contribute to the biasing
effects of knowledge but cannot account for the bias in and of itself.
This research supports the general claim that there is a fundamental similarity in children's
and adults' abilities, and limitations, in reasoning about the mind. The data are consistent with a
growing body of literature that shows that adults’ own knowledge and belief states can bias their
mental state reasoning or 'theory of mind' (e.g. Bernstein et al., 2004; Harvey, 1992; Keysar,
1994; Keysar et al., 2003). This research is also akin to research demonstrating that adults make
similar errors to those made by children on executive-functioning tasks, just to a lesser degree (e.g.
Diamond & Kirkham, 2005). Indeed, we posit that developmental changes in executive
functioning skills (see Moses, Carlson, & Sabbagh, 2005) may be one mechanism underlying
developmental changes in the curse of knowledge (Birch, 2005; Birch & Bloom, 2004).
Consider also the work of German and Hehman (in press), who found that elderly adults (mean
age of 80) experience difficulties with false belief reasoning similar to those of younger children.
This is likely to be due, not to a conceptual deficit, but to problems with inhibitory control, the
same problems that lead to poorer performance by the elderly on tests such as the Stroop task.
More specifically, we would suggest that these problems in executive functioning make it difficult
for the elderly to override the curse of knowledge bias. Empirical support for the relationship
between inhibitory control and the curse of knowledge comes from research by Bayen, Erdfelder,
and Auer (2005), who found that elderly individuals showed an exaggerated curse of knowledge bias
compared to younger adults and that this deficit was due to inhibitory factors rather than other
factors associated with aging. These findings suggest this bias does not simply follow a declining
trend with age and experience, but more closely fits a U-shaped curve that parallels developmental
changes in processing abilities.
The findings from Experiment 1 are consistent with our proposal that the curse of
knowledge bias is mediated by an ability to provide a rationalization or justification (even if only
implicitly) for someone else sharing one’s knowledge, or acting as if they do. These findings have
implications for the study of adult social cognition and for cognitive development. Due to the
nature of the tasks that have typically been used in the adult literature, participants have usually
been 'cursed' with plausible or reasonable knowledge (e.g. the outcomes of Nixon's visits, whether
or not statements are sarcastic, solutions to problems, etc.). The data from such studies might be
substantively different if the participants were told about event outcomes that were less plausible
(see Pohl, 1998). Our findings suggest that knowledge is a more potent curse when it is in
conjunction with a potential rationale for inflating one's estimates of what others know. All
knowledge is not created equal.
The findings on the role of plausibility also suggest that manipulating the plausibility of the
various outcomes in children’s tasks could prove fruitful in furthering our understanding of
children’s developing ‘theory of mind’. If younger children have a less sophisticated
understanding of what is, and is not, plausible than older children and adults, then this may
contribute to their greater susceptibility to the curse of knowledge and exacerbate their difficulties
with mental state reasoning. Indeed, young children are often noted to be more gullible and
accepting of certain impossibilities, such as those presented in magic tricks, than adults.
In the course of exploring other hypotheses, other researchers have somewhat reduced the
‘curse of knowledge’ in false belief tasks (Wimmer & Perner, 1983; Koos, Gergely, Csibra, &
Biro, 1997). For example, the experimenter ate the candy, or the antagonist in the story baked it
into a cake.
Consistent with our findings, children did significantly better under these conditions than in the
standard task. However, the youngest children’s performance was still not significantly above
chance (Wellman et al., 2001).
There are two possible explanations for why young children’s performance improved
under such conditions but still did not exceed chance. First, it may be that even if the curse of
knowledge were completely removed young children would still experience difficulty with false
belief tasks; either because of other problems posed by the task such as mnemonic and linguistic
demands (see Bloom & German, 2000; Fodor, 1992; Leslie, 1994; Zaitchik, 1990) or due to a
genuine inability to reason about false beliefs (see Gopnik, 1993; Perner, 1991; Wellman et al.,
2001). Alternatively, it may be that in these studies the ‘curse of knowledge’ was not adequately
removed—it was simply reduced. Knowledge, after all, is graded. Knowing that the candy has
been moved to one of four containers, but not knowing which one, is less specific than knowing
that it has been eaten, but more specific than knowing it was moved to one of seven possible
containers, for example.
We propose that the more specific the knowledge, the more potent the curse. In the studies
analyzed in Wellman et al.’s (2001) meta-analysis, knowing that the candy has been eaten or
baked may still be enough of a curse to contaminate children’s false belief reasoning.
Indeed, the present results demonstrate that the bias is strongest when it is accompanied by a
plausible explanation; perhaps for young children it did not seem so implausible that the other
person would have known they ate the candy or used the chocolate to bake a cake. Perhaps
children would have performed better had the outcome seemed more implausible or surprising (see
Edgar & Birch, 2005; see also Pezzo, 2003 for a discussion of the mediating role of surprise).
Our claim that more specific knowledge creates a greater ‘curse’, or is harder to put aside,
than less specific knowledge is reminiscent of research by Zaitchik (1991), who found that
children in a displacement task who had seen the object being moved in the protagonist’s absence
were more likely to err in reasoning about false beliefs than those who had only been told that the
object had been moved. Three-year-olds who had merely been told about the object’s true
location, and had not seen the object there, were successful at attributing false beliefs. Zaitchik
(1991) proposed that seeing the object moved makes the information more salient and harder to
ignore than simply hearing about it being moved. We suspect that a similar process occurs with
more specific knowledge: the more specific and detailed the subject’s knowledge, the harder it is
to fully put aside.
Are young children always more vulnerable to this bias than adults? Perhaps not.
Mitchell et al. (1996) presented adults with stories about a character who heard a message from
someone else. In some cases, the message contradicted what the character had previously
observed. The subject’s task was to judge whether the character would believe or disbelieve the
message. Not surprisingly, adults said that the character would be less likely to believe the
message if it conflicted with prior experience. In addition, consistent with the literature reviewed
above, and with our own results, adult subjects made judgments that were contaminated by their
own knowledge. That is, when they knew the message was true they were more likely to assume
that the character would believe this information than if they knew the message was false. But
although the children they tested (one group with a mean age of 5;9, another with a mean age
of 8;9) also took into account the character’s prior experience, they differed in that they showed
no such curse of knowledge effect. As Mitchell et al. (1996) suggest, these older children might be
answering their questions by rigidly applying a rule about the effects of informational access and
subsequent belief state. Adults, in contrast, solve the task in a less mechanical way, and hence are
more vulnerable to the curse of knowledge (see also Wilson and Brekke, 1994). In other tasks,
where applying a simple ‘seeing = knowing’ rule might not have been very useful, school-aged
children did reveal a greater bias than adults (Pohl & Haracic, 2005).
The findings from both the present experiments have implications for research on
children’s and adults’ ability to reason about the mental states of others. Anytime someone is more
knowledgeable than the person whose perspective they are trying to take, they may be subject to
the curse of knowledge. In light of research demonstrating that the curse of knowledge is stronger
earlier in development (e.g. Birch & Bloom, 2003; Pohl & Haracic, 2005), younger children’s
performance on false belief tasks would be even more compromised than that of older children
and adults.
Thus, traditional false belief tasks are likely unfair assessments of any developmental differences
in false belief reasoning because they will either mask young children’s true competencies or
exacerbate their difficulties. It is important to note that although we only tested adults’ curse of
knowledge bias on displacement tasks (one of the most commonly used tasks to assess children’s
ability to reason about false beliefs) we believe the same logic can be applied to a number of
difficulties in children’s mental state reasoning and perspective-taking, such as ‘unexpected
contents’ or ‘Smarties’ tasks (e.g. Perner, Leekam, & Wimmer, 1987), appearance-reality
tasks (e.g. Gopnik & Astington, 1988), and source-of-information tasks (e.g. Taylor, Esbensen,
& Bennett, 1994).
In sum, these data demonstrate that one’s ability to reason about a more naïve mental state
can be influenced by one’s own knowledge state and by the ease with which one can generate
a potential justification for a biased response. This research supports the claim that there is a
fundamental consistency between children's and adults' ability to reason about the mind. For
children, the heightened potency of the curse typically leads to more blatant errors in mental state
attribution. But even for adults, their own knowledge can substantially contaminate their ability to
make sense of how others will think and act.
References
Baron-Cohen, S., Leslie, A. M., & Frith, U. (1985). Does the autistic child have a ‘theory of
mind’? Cognition, 21, 37-46.
Bayen, U. J., Erdfelder, E., & Auer, T. S. (2005, July). Hindsight bias and aging. Paper presented
by U. J. Bayen at the 1st International Hindsight Bias Workshop, Leipzig, Germany.
Bernstein, D. M., Atance, C., Loftus, G. R., & Meltzoff, A. N. (2004). We saw it all along: Visual
hindsight bias in children and adults. Psychological Science, 15, 264-267.
Birch, S. A. J. (2005). When knowledge is a curse: Children’s and adults’ mental state reasoning.
Current Directions in Psychological Science, 14, 25-29.
Birch, S. A. J. & Bloom, P. (2004). Understanding children’s and adults’ limitations in mental state
reasoning. Trends in Cognitive Sciences, 8, 255-260.
Birch, S. A. J. & Bloom, P. (2003). Children are cursed: An asymmetric bias in mental-state
attribution. Psychological Science, 14, 283-286.
Blank, H., Fischer, V., & Erdfelder, E. (2003). Hindsight bias in political elections. Memory, 11,
491-504.
Bloom, P. & German, T. P. (2000). Two reasons to abandon the false belief task as a test of theory
of mind. Cognition, 77, B25-B31.
Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings:
An experimental analysis. Journal of Political Economy, 97, 1232-1254.
Dennett, D. (1978). Beliefs about beliefs. Behavioral and Brain Sciences, 1, 568-570.
Diamond, A. & Kirkham, N. (2005). Not quite as grown-up as we like to think: Parallels
between cognition in childhood and adulthood. Psychological Science, 16, 291-297.
Edgar, C. & Birch, S. A. J. (2005, June). Children’s biased assessments of what others know and
believe. Poster presented by C. Edgar at the Jean Piaget Society, Vancouver, BC, Canada.
Epley, N., Morewedge, C. K., & Keysar, B. (2004). Perspective taking in children and adults:
Equivalent egocentrism but differential correction. Journal of Experimental Social
Psychology, 40, 760-768.
Fischhoff, B. (1975). Hindsight does not equal foresight: The effect of outcome knowledge on
judgment under uncertainty. Journal of Experimental Psychology: Human Perception
and Performance, 1, 288-299.
Fischhoff, B. & Beyth, R. (1975). “I knew it would happen”—remembered probabilities of
once-future things. Organizational Behavior and Human Performance, 13, 1-16.
Fodor, J. (1992). A theory of the child's theory of mind. Cognition, 44, 283-296.
Friedman, O., & Leslie, A. M. (2004). Mechanisms of belief-desire reasoning: Inhibition and
bias. Psychological Science, 15, 547-552.
German, T.P. & Hehman, J.A. (in press). Representational and executive selection resources in
'theory of mind': Evidence from compromised belief-desire reasoning in old age.
Cognition.
German, T. P. & Leslie, A. M. (2000). Attending to and learning about mental states. In P.
Mitchell & K. Riggs (Eds.), Children’s reasoning and the mind. Hove, UK: Psychology
Press.
Gopnik, A. (1993). How we know our own minds: the illusion of first person knowledge of
intentionality. Behavioral and Brain Sciences, 16, 1-14.
Gopnik, A. (1996). The scientist as child. Philosophy of Science, 63, 485-514.
Gopnik, A, & Astington, J. W. (1988). Children's understanding of representational change and its
relation to the understanding of false belief and the appearance-reality distinction. Child
Development, 59, 26-37.
Gopnik, A. & Meltzoff, A.N. (1997). Words, thoughts, and theories. Cambridge, MA: MIT Press.
Heine, S. J. & Lehman, D. R. (1996). Hindsight bias: A cross-cultural analysis. Japanese Journal
of Experimental Social Psychology, 35, 317-323.
Harvey, N. (1992). Wishful thinking impairs belief-desire reasoning: A case of decoupling failure
in adults? Cognition, 45, 141-162.
Hawkins, S. A., & Hastie, R. (1990). Hindsight: Biased judgments of past events after the
outcomes are known. Psychological Bulletin, 107, 311-327.
Kelley, C. M., & Jacoby, L. L. (1996). Adult egocentrism: Subjective experience versus analytic
bases for judgment. Journal of Memory and Language, 35, 157-175.
Keysar, B. (1994). The illusion of transparency of intention: Linguistic perspective taking in text.
Cognitive Psychology, 26, 165-208.
Keysar, B. & Bly, B. (1995). Intuitions of the transparency of idioms: Can one keep a secret by
spilling the beans? Journal of Memory and Language, 34, 89-109.
Keysar, B., Lin, S., & Barr, D. J. (2003). Limits on theory of mind use in adults. Cognition, 89,
25-41.
Keysar, B., Ginzel, L. E., & Bazerman, M. H. (1995). States of affairs and states of mind: The
effect of knowledge of beliefs. Organizational Behavior and Human Decision Processes,
64, 283-293.
Koos, O., Gergely, G., Csibra, G., & Biro, S. (1997, April). Why eating Smarties makes you
smart: Understanding false belief at the age of 3. Poster presented at the Society for
Research in Child Development Biennial Meeting, Washington, DC, USA.
Leslie, A.M. (1987). Pretense and representation: the origins of “theory of mind”. Psychological
Review, 94, 412-426.
Leslie, A.M. (1994). Pretending and believing: Issues in the theory of ToMM. Cognition, 50,
193-200.
Lillard, A.S. (1998). Ethnopsychologies: Cultural variations in theories of mind. Psychological
Bulletin, 123, 3-32.
Mitchell, P. & Taylor, L. M. (1999). Shape constancy and theory of mind: Is there a link?
Cognition, 70, 167-190.
Mitchell, P., Robinson, E. J., Isaacs, J. E., & Nye, R. M. (1996). Contamination in reasoning about
false belief: An instance of realist bias in adults but not children. Cognition, 59, 1-21.
Moses, L.J. (1993). Young children's understanding of belief constraints on intention. Cognitive
Development, 8, 1-25.
Moses, L. J., Carlson, S. M., & Sabbagh, M. A. (2005). On the specificity of the relation between
executive function and children’s theories of mind. In W. Schneider, R.
Schumann-Hengsteler, & B. Sodian (Eds.), Young children's cognitive development:
Interrelationships among executive functioning, working memory, verbal ability, and
theory of mind (pp. 131-145). Mahwah, NJ, US: Lawrence Erlbaum.
Perner, J. (1991). Understanding the representational mind. Cambridge, MA: MIT Press.
Perner, J., Leekam, S. R., & Wimmer, H. (1987). Three-year-olds’ difficulty with false belief:
The case for a conceptual deficit. British Journal of Developmental Psychology, 5,
125-137.
Pezzo, M. V. (2003). Surprise, defence, or making sense: What removes hindsight bias? Memory,
11, 421-441.
Pohl, R. (1998). The effects of feedback source and plausibility of hindsight bias. European
Journal of Cognitive Psychology, 10, 191-212.
Pohl, R. & Haracic, I. (2005). Der Rückschaufehler bei Kindern und Erwachsenen [Hindsight
bias in children and adults]. Zeitschrift für Entwicklungspsychologie und Pädagogische
Psychologie, 37, 46-55.
Roth, D. & Leslie, A.M. (1998). Solving belief problems: towards a task analysis. Cognition, 66,
1-31.
Royzman, E. B., Cassidy, K. W. & Baron, J. (2003). “I know, you know”: Epistemic egocentrism
in children and adults. Review of General Psychology, 7, 38-65.
Sanna, L. J. & Schwarz, N. (2003). Debiasing the hindsight bias: The role of accessibility
experiences and (mis)attributions. Journal of Experimental Social Psychology, 39,
287-295.
Taylor, L. M. & Mitchell, P. (1997). Judgments of apparent shape contaminated by knowledge of
reality: Viewing circles obliquely. British Journal of Psychology, 88, 653-670.
Taylor, M., Esbensen, B. M., & Bennett, R. T. (1994). Children’s understanding of knowledge
acquisition: The tendency for children to report that they have always known what they
have just learned. Child Development, 65, 1581-1604.
Wasserman, D., Lempert, R. O., & Hastie, R. (1990). Hindsight and causality. Personality and
Social Psychology Bulletin, 17, 30-35.
Wellman, H. M. (1990). The child’s theory of mind. Cambridge, MA: MIT Press.
Wellman, H. M., Cross, D., & Watson, J. (2001). Meta-analysis of theory of mind development:
The truth about false-belief. Child Development, 72, 655-684.
Wimmer, H., & Perner, J. (1983). Beliefs about beliefs: Representation and constraining function
of wrong beliefs in young children’s understanding of deception. Cognition, 13, 103-128.
Wimmer, H., Hogrefe, J., & Sodian, B. (1988). A second stage in children's conception of mental
life: Understanding informational accesses as origins of knowledge and belief. In J. W.
Astington, et al. (Eds.), Developing theories of mind (pp. 173-192). New York, NY, US:
Cambridge University Press.
Zaitchik, D. (1990). When representations conflict with reality: The preschooler’s problem with
false beliefs and “false” photographs. Cognition, 35, 41-69.
Zaitchik, D. (1991). Is only seeing really believing? Sources of the true belief in the false belief
task. Cognitive Development, 6, 91-103.