Transcript of Workshop 4: Cognitive biases – Clinical...
1
Just a reminder that the lecture notes for this session are on the website: www.clinical-reasoning.org. This session – summary of what we have learned so far, a bit about what happens on a typical MAU ward round, then we’ll get to the main bit of what today’s session is about – cognitive biases and errors.
In your clinical reasoning assessments you only get one mark for the right answer. Nearly ALL the marks are for describing what you think is going on and WHY – i.e. your reasoning. Unfortunately students are fixated on “the right answer” – which is why you often do badly on these assessments (you’re afraid of thinking out loud in case it’s “wrong”). The fact is, in medicine sometimes there is no immediate right answer; there may be a number of potential answers, so your ability to reason things through, and demonstrate your reasoning, is vital.
2
Summary so far [read slide] Remember that in the 1960s psychologists began to examine our thinking, decisions and actions scientifically, and what they found was that thinking itself is prone to error. This affects everyone. Even highly intelligent people fall into the same cognitive traps. Also, error is not randomly distributed – we systematically err in the same direction, which makes our mistakes predictable – but only to a degree.
Psychology and other disciplines support the ‘two minds hypothesis’: there are forms of cognition that are ancient and shared with other animals – where speed is often more important than accuracy – and ones that are recently evolved and distinctly human. Each ‘mind’ has access to multiple systems in the brain. So we have a fast, pattern-recognising, intuitive way of thinking – and a slow, controlled but high-effort way of thinking.
3
Psychologists estimate that we spend 95% of our daily lives engaged in Type 1 thinking – the intuitive, fast, subconscious mode of decision-making. Imagine driving a car, for example – it would be impossible to function efficiently if every decision and movement was as deliberate, conscious, slow and effortful as in our first driving lesson. With experience, complex procedures become automatic, fast and effortless. The same applies to medical practice. There is evidence that expert decision-making is well served by intuitive thinking. The problem is that although intuitive processing is highly efficient in many circumstances, in others it is prone to error.
4
Clinicians use both Type 1 and Type 2 thinking, and both types are important in clinical decision-making. When encountering a problem that is familiar, clinicians employ pattern recognition and reach a working diagnosis or differential diagnosis quickly (Type 1 thinking). When encountering a problem that is more complicated, they use a slower, systematic approach (Type 2 thinking). Both types of thinking interplay – they are not mutually exclusive in the diagnostic process. Figure 1.6 illustrates the interplay between Type 1 and Type 2 thinking in clinical practice.
Errors can occur in both Type 1 and Type 2 thinking – for example, people can apply the wrong rules or make errors in their application while using Type 2 thinking. However, it has been argued that the common cognitive biases encountered in medicine tend to occur when engaged in Type 1 thinking.
For example, imagine being asked to see a young woman who is drowsy. She is handed over to you as a ‘probable overdose’ because she has a history of depression and a packet of painkillers was found beside her at home. Her observations show she has a Glasgow Coma Score of 10/15, heart rate 100 beats per minute, blood pressure 100/60 mmHg, respiratory rate 14 per minute, oxygen saturations 98% on air and temperature 37.5°C. Already your mind has reached a working diagnosis. It fits a pattern (Type 1 thinking). You think she has taken an overdose. At this point you can stop to think about your thinking (rational override in Fig. 1.6). ‘What is the evidence for this diagnosis? What else could it be?’
On the other hand, imagine being asked to assess a patient who has been admitted with syncope. There are several different causes of syncope and a systematic approach is required to get to a diagnosis (Type 2 thinking). However, you recently heard about a case of syncope due to a leaking abdominal aortic aneurysm. At the end of your assessment, following evidence-based guidelines, it is clear the patient can be discharged. Despite this, you decide to observe the patient overnight ‘just in case’ (irrational override in Fig. 1.6). In this example, your intuition is actually availability bias (when things are at the forefront of your mind), which has significantly distorted your estimate of probability.
5
Cognitive biases should not be confused with ‘expert intuition’, which is a common way experts make decisions – but only in their domain of expertise. In 1973, two American psychologists took two groups of people – one consisting of chess masters and one consisting of novices – and showed them chessboards with 20-25 pieces on them, set up as if in the middle of a game. The subjects were shown the boards briefly and then asked to recall the positions of the pieces. The chess masters were able to recall the position of every piece on the board, but the novices could only recall four or five. The experiment was then repeated, but this time the pieces were randomly distributed on the chessboard. This time, the chess masters were no better than the novices. Chess masters, with their years of experience, could look at the chess pieces in the middle of a game and see a pattern. The chess pieces were like letters in a word, and just as readers recognise whole words, chess masters are experts in the language of chess. But if they were asked to simply look at a jumble of letters, they performed no better than everyone else.
Expert intuition is really tacit knowledge. Although it involves intuitive thinking, this is slightly different to the subconscious ‘assumptions’ to which we are all prone, experts included. The apparent effortlessness is in fact not effortless at all – it rests on 10,000+ hours of purposeful practice, feedback and coaching.
6
You all know this word says “remember” because you recognise the word – you did not have to figure it out from the individual letters. In the same way, expert chess players saw the pieces like letters in a word, and expert clinicians see individual clinical features like letters in a word – it seems effortless but the effort it took to get there is “hidden” from you. This is NOT the same as “jumping to conclusions”, which we are going to discuss in more detail later.
7
One definition of diagnostic error is this: the clinician has all the information available to get the right diagnosis and then gets the wrong diagnosis. Why does this happen?
Knowledge gaps. For example, a consultant colleague of mine told a patient her pain (which sounded really like biliary colic) could not be due to gallstones because she had had her gallbladder removed. That’s a knowledge gap – stones can remain in the bile duct after the gallbladder has been removed.
Misinterpretation of diagnostic tests.
Cognitive biases – the topic today.
8
[tell story]
9
So let’s move on to talk about cognitive errors and biases …
10
[read slide]
11
Here’s a common example – which line is the longest? A, B or C? Using a ruler takes more effort, right?
There’s another famous experiment where three lines of DIFFERENT lengths were shown to subjects. It was easy to see which was the longest line, but when subjects were placed in a group of people who all disagreed with them, they frequently chose the incorrect answer due to peer pressure!
12
Cognitive biases fall into four main groups:
Social – peer pressure, ‘halo effect’ (when someone is good at one thing so you assume they are good at everything).
Memory – hindsight bias (hindsight significantly impairs our ability to judge the quality of decision making that occurred in the past).
Decision making – confirmation bias and others in the handout coming around now [HANDOUT].
Probability/belief biases – e.g. gambler’s fallacy, availability bias – more of that in a minute!
13
The mistaken belief that if something happens more frequently than normal, then it will happen less frequently in the future (or vice versa). In situations where what is being observed is random, this belief is obviously false.
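If a worked example helps, here is a minimal Python sketch (a hypothetical fair-coin simulation, added purely as an illustration): even immediately after five heads in a row, the next flip is still 50/50.

```python
import random

# Sketch: simulate a fair coin and check the flip that follows every
# run of five heads. If the gambler's fallacy were true, tails would be
# "due" and heads would show up less than half the time. It doesn't.
random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

next_flips = [
    flips[i + 5]              # the flip right after the streak...
    for i in range(len(flips) - 5)
    if all(flips[i:i + 5])    # ...wherever five heads in a row occurred
]
print(f"P(heads | five heads just happened) ~ "
      f"{sum(next_flips) / len(next_flips):.3f}")  # prints ~0.5
```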
14
The process of inferring the CAUSES of events or behaviours (with no evidence).
15
We have probably all been a victim of this, as in the line experiment.
16
Confirmation bias is the tendency to look for confirming evidence to support a theory rather than looking for contradictory evidence to refute it, even if the latter is clearly present. Confirmation bias is common when you are seeing a patient who has already been seen by another doctor, who may be more senior than you. Actually, confirmation bias is rife in everyday life: in general, people read newspapers that already support their views, browse internet sites that mirror their own values, and hang out with like-minded people.
17
I like this one – fear of flying. Availability bias is when things are at the forefront of your mind because you have seen several cases recently or have been studying that condition in particular. There have been (what seem to be) a lot of airline crashes in the news recently. But neglect of probability (or base rate neglect) is the tendency to ignore the prevalence of something, which then distorts our reasoning.
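If it helps to see the numbers, here is a minimal Python sketch of base rate neglect (the prevalence, sensitivity and false positive rate below are hypothetical values chosen only for illustration): when a condition is rare, even a positive result from a fairly accurate test is usually a false positive.

```python
# Base rate neglect, illustrated with hypothetical numbers: when a
# condition is rare, the low base rate dominates the test's accuracy.
prevalence = 0.001          # assumed: 1 in 1,000 people have the condition
sensitivity = 0.99          # assumed: P(test positive | condition present)
false_positive_rate = 0.05  # assumed: P(test positive | condition absent)

# Bayes' theorem: P(condition | positive test)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_condition_given_positive = sensitivity * prevalence / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# Prints ~1.9% - intuition says "99% sensitive test, so probably ill",
# but the low base rate means most positives are false positives.
```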
Of course, in life AND medicine, pure logic is sometimes NOT helpful when dealing with an anxious person!
18
Anchoring bias occurs even when initial information is clearly wrong … We use anchoring all the time – whenever we have to guess something like the length of the Mississippi River, we start with something we are sure of (the anchor) and take it from there – otherwise we would just be making up a number. Unfortunately we use anchors when we don’t have to as well: it is the common human tendency to subconsciously hang on to the first piece of information given when making a decision. Lots of experiments demonstrate this. In one, estate agents were given a tour of a house and asked to estimate its value. Beforehand they were informed of a randomly generated sales price. The higher that price, the higher they valued the property. Car salesmen and other negotiators do the same thing.
19
And finally, story bias is our tendency to remember stories more than abstract facts, and also to be moved more by a human side to a situation. Charities and news media use this bias to great effect. However, stories can give us a false sense of understanding – they simplify reality and filter out things that don’t fit. But they make sense to us!
Authority bias refers to how figures of authority can exert influence on your reasoning. For example, if the consultant says this is what the diagnosis is, people subconsciously (or not) defer to authority, assume that person knows what they are doing, and tend not to interfere (a strict hierarchy, by the way, can have disastrous consequences if people are afraid to question).
20
Around 100 cognitive errors and biases are described in this book, ‘The Art of Thinking Clearly’ – easy to read and recommended. I’ve listed a few more here:
Sunk-cost fallacy – when you have already invested in something, you hold out to the bitter end, even when it makes no sense (bad movie, bad project, bad war).
Social proof – a cause of stock market panic. If suddenly everyone looks up, you automatically look up too – we instinctively follow the herd (in evolutionary terms, this was a survival strategy).
Overconfidence – the tendency to believe we know more than we actually know, placing too much faith in opinion instead of gathered evidence (?politicians).
The other ones marked with an * are in the handout and you can read about them there …
Visceral bias – negative or positive feelings towards people affect our decision making.
Sutton’s slip – going for the obvious and not considering other possibilities.
Order effects – the tendency to remember the first and last bits of information and forget the stuff in the middle.
Commission bias – tendency towards action (‘better to do something than nothing’) instead of watching and waiting, which might be the best thing to do.
Stereotyping.
21
[read slide]
22
23
24
25
Some of you may have been told about famous groupthink experiments where apparently normal, intelligent people do really strange things because everyone else is doing it too.
Examples from when I was a junior doctor:
50-year-old alcoholic man, abdo pain/vomiting, fever, ‘proteinuria’ = UTI (actually pancreatitis).
70-year-old woman, CSF leak, immunosuppressed, new confusion = consultant said ‘blood cultures if she spikes a temperature’ (actually meningitis).
26
27
Personality type and other individual characteristics influence decision making. Some people are naturally more confident (or over-confident) than others, e.g. men vs women. Decisions are also made in context. My work environment is an accident waiting to happen: characterised by noise, interruptions, multi-tasking and cognitive overload, for example.
‘Comfortably numb’ we mentioned before – it refers to mindlessly adopting strategies to conserve thinking. This leads to problems: e.g. failure to do a thorough history and exam, blindly accepting information from others, deferring to authority without question, adopting a non-skeptical approach … you must question everything!
Healthcare providers cannot afford to be comfortably numb when patient care is at stake.
28
[read slide] The areas of the brain required for system 2 processes are most affected by things like stress, cognitive overload, sleep deprivation and fatigue … All these factors combine to increase use of system 1 processes and compromise function of system 2 processes. We will learn more about that in year two when we cover ‘Human Factors’.
29
So no matter how smart we are, our brains are wired to miss things and assume things … and make errors. So what can we do about it? [read slide]
30
James Reason is a well-known psychologist who became famous for studying error in healthcare. He said this.
31
That’s it!
32
So in your PBL sessions, try to spot when you might be subconsciously filling in gaps or jumping to conclusions when really you don’t know.
33
34