
Patient Safety Training Evaluations: Reflections on Level 4 and more…

Eduardo Salas, Ph.D.
Department of Psychology & Institute for Simulation & Training
University of Central Florida
esalas@ist.ucf.edu

Purpose Today…

I. Challenge: Offer some observations & myths

II. Proposal: Time to think differently

III. Guide: Best practices

A few thoughts about the science of training…

What do we know about training? The science has evolved & matured… The past decade: an explosion of research!

- More empirical work
- Research conducted in organizations
- New, more & deeper theories and models
- More evaluations reported
- Huge military investment…
- Influence of cognitive psychology…
- Expertise

What do we know about training? Much progress in…

- Organizational needs analysis
- Cognitive task analysis
- Transfer of training
- Instructional design
- Feedback
- Training evaluation
- Simulation-based training
- Individual characteristics

Observations From the Science

- The quality and quantity of research has increased
- The cognitive and organizational concepts are revolutionizing the field
- The field is multi-disciplinary
- The influence of technology will continue
- Training is part of an organizational system
- There are more guidelines, tools and approaches for practitioners

Framework for Training Effectiveness

Myths & misconceptions about training…

The Simplistic View of Training
(Uninformed about the science; erroneous assumptions)

Unskilled Worker → Training Program → Skilled Worker

Myth

Reality: Opinions aside, training is a behavioral/cognitive event that can be subjected to empirical investigation.

- There is a science of training that should be exploited to optimize training design.
- Processes exist which, if appropriately and consistently applied, can help to ensure that effective training is designed.

Myth

Reality: Experts do not have access to their own expertise; knowledge becomes “compiled.”

- Task experts do not necessarily understand the learning process or how learning progresses.
- Task experts are crucial, but they must be paired with learning experts. Partnership!

Myth

Reality: Just because trainees are having fun doesn’t mean that they are learning anything. There is very little or no relationship.

- “Instrumentality” does seem to be a factor: it does seem to be related to learning, and it affects motivation to learn.
- Simple measures of training outcomes are insufficient to judge training quality.

Myth

Reality: Training transfer is a very complex phenomenon. Some of the factors:

- Supervisor support
- Peer support
- Climate for transfer
- Opportunity to perform/practice

Even when trainees demonstrate learning after training, it does not mean that they can or will transfer it back to the job.

Thinking Differently about Training Evaluation…

Kirkpatrick’s Model of Training Evaluation

Level 1 – Reaction: Did the participants like the training? What do they plan to do with what they learned?

Level 2 – Learning: What skills, knowledge, or attitudes changed after training? By how much?

Level 3 – Behavior / Training Transfer: Did the participants change their behavior on the job based on what they learned?

Level 4 – Results: Did the change in behavior positively affect the organization?

Level 5 – Return on Investment: Was the training worth the cost? (A worked sketch follows.)
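A minimal sketch of how the Level 5 question is usually quantified; the figures and the monetized-benefit framing are hypothetical illustrations, not from the talk:

```python
def training_roi(monetized_benefits: float, training_costs: float) -> float:
    """ROI as a percentage: net benefit relative to cost."""
    return (monetized_benefits - training_costs) / training_costs * 100

# Hypothetical example: $80,000 in monetized benefits (e.g., fewer
# adverse events) against $50,000 in training costs.
print(training_roi(80_000, 50_000))  # 60.0 -> a 60% return on the training
```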

This Model…

- Has served us well!
- Used, misused & abused!
- Created a misconception that Level 1 is all one needs
- Over-simplified evaluations
- Links among levels are weak
- Minimal impact of training on Level 4 clinical outcomes

So…

What if we reverse Kirkpatrick’s model?

Start at Level 4…

What are the outcomes/results we want out of this training?

Level 3: Given these wanted outcomes…

What behaviors do we want/need from our trainees?

Level 2: Given these needed behaviors…

What KSAs (knowledge, skills, & attitudes) do we want our trainees to have?

Level 1: Given those KSAs…

What reactions do we want our trainees to have?
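A minimal sketch of what this backward chaining might look like as an evaluation-planning structure; the class and the patient-safety entries are illustrative assumptions, not from the talk:

```python
from dataclasses import dataclass

@dataclass
class ReversedEvaluationPlan:
    """Backward-chained Kirkpatrick plan: start from results, derive the rest."""
    level4_results: list[str]     # organizational/clinical outcomes we want
    level3_behaviors: list[str]   # on-the-job behaviors that produce them
    level2_ksas: list[str]        # knowledge, skills & attitudes behind those behaviors
    level1_reactions: list[str]   # trainee reactions that support that learning

# Hypothetical patient-safety example: each level is derived from the one above.
plan = ReversedEvaluationPlan(
    level4_results=["Fewer handoff-related adverse events"],
    level3_behaviors=["Team uses a structured handoff protocol at every shift change"],
    level2_ksas=["Knows the protocol steps", "Values closed-loop communication"],
    level1_reactions=["Sees the training as directly relevant to daily work"],
)
```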

What do you get by reversing Kirkpatrick’s typology?

- Precise learning outcomes
- Better links among levels
- Better link of training to outcomes (clinically relevant)
- Hints for performance assessment/observation
- Better tailored training program
- Better accountability

Best Practices for Training Evaluation in… Healthcare, Aviation…

Best Practices

1. Even before designing your training, start backwards: Think about evaluation first.
2. Accept that effective training does not exist without effective evaluation.
3. Strive for robust experimental design in your evaluation: It is worth the headache.
4. When designing your evaluation plan and metrics, ask the experts – your frontline staff.
5. Do not reinvent the wheel; leverage existing data relevant to training objectives.

Best Practices (cont.)

6. When developing measures: Consider multiple aspects of performance.
7. When developing measures: Design for variance (see the sketch after this list).
8. Evaluation is affected by more than just training itself: Consider organizational, team, or other factors which may help (or hinder) the effects of training (and thus the outcome of your evaluation).
9. Engage socially powerful players early: Physicians, nursing & executive management are crucial to evaluation success…
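To make practices 6 and 7 concrete, here is a minimal sketch of an observation measure; the item, dimensions, and scale are hypothetical. Rating several performance dimensions on a graded scale captures more variance than a single pass/fail check:

```python
# Hypothetical observation measure: multiple performance dimensions,
# each rated on a graded 1-5 scale rather than pass/fail,
# so that scores can actually vary across trainees.
HANDOFF_MEASURE = {
    "dimensions": [
        "Uses closed-loop communication",
        "Verifies patient identity",
        "Flags outstanding tasks at handoff",
    ],
    "scale": {1: "Never", 2: "Rarely", 3: "Sometimes", 4: "Usually", 5: "Always"},
}

def score(ratings: list[int]) -> float:
    """Average the per-dimension ratings into one behavior score."""
    return sum(ratings) / len(ratings)

print(score([4, 5, 3]))  # 4.0
```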

Best Practices (cont.)

10. Ensure evaluation continuity: Have a plan for employee turnover at both the participant & evaluation-administration-team level.
11. Environmental signals before, during, and after training must indicate that the trained KSAs & the evaluation itself are valued by the organization.
12. Get in the game, coach! Feed evaluation results back to frontline providers & facilitate continual improvement through constructive coaching.
13. Report evaluation results in a meaningful way.

Conclusions

- Avoid myths!
- Training evaluation matters!
- Reverse Kirkpatrick’s typology!