A Procedure for Assessing Fidelity of Implementation
in Experiments Testing Educational Interventions
Michael C. Nelson¹, David S. Cordray¹, Chris S. Hulleman², Catherine L. Darrow¹, & Evan C. Sommer¹
¹Vanderbilt University, ²James Madison University
Purposes of Paper:
To argue for a model-based approach for assessing implementation fidelity
To provide a template for assessing implementation fidelity that can be used by intervention developers, researchers, and implementers as a standard approach.
Presentation Outline
I. What is implementation fidelity?
II. Why assess implementation fidelity?
III. A five-step process for assessing implementation fidelity
IV. Concluding points
A Note on Examples:
• Examples are drawn from our review of (mainly) elementary math intervention studies, which we are currently deepening and expanding to other subject areas
• Examples for many areas are imperfect or lacking
• Because our argument depends on good examples of the most complicated cases, we would appreciate any examples you can refer us to ([email protected])
What Is Implementation Fidelity?
Implementation fidelity is the extent to which the intervention has been implemented as expected
Assessing fidelity raises the question: Fidelity to what?
Our answer: Fidelity to the intervention model.
Background in “theory-based evaluations” (e.g., Chen, 1990; Donaldson & Lipsey, 2006)
Why Assess Implementation Fidelity?
Fidelity vs. the Black Box
The intent-to-treat (ITT) experiment identifies the effects of causes:
Assignment to Condition
→ Treatment “Black Box”: Intervention’s Causal Processes → Outcomes → Outcome Measures
→ Control “Black Box”: Business-as-Usual Causal Processes → Outcomes → Outcome Measures
Fidelity vs. the Black Box
…While fidelity assessment “opens up” the black box to explain the effects of causes:
Assignment to Condition → Intervention Component → Mediator → Outcome
(Fidelity Measure 1 assesses the intervention component, Fidelity Measure 2 the mediator, and the Outcome Measure the outcome.)
Fidelity assessment allows us to:
Determine the extent of construct validity and external validity, contributing to generalizability of results
For significant results, describe what exactly did work (the actual difference between Tx and C)
For non-significant results, explain why, beyond simply “the intervention doesn’t work”
Potentially improve understanding of results and future implementation
Limitations of Fidelity Assessment:
Not a causal analysis, but it does provide evidence for answering important questions
Involves secondary questions
The field is still developing and validating methods and tools for measurement and analysis
Cannot be a specific, one-size-fits-all approach
A Five Step Process for Assessing Fidelity of Implementation
1. Specify the intervention model
2. Identify fidelity indices
3. Determine index reliability and validity
4. Combine fidelity indices*
5. Link fidelity measures to outcomes*

*Not always possible or necessary
Step 1: Specify the Intervention Model
The Change Model
• A hypothetical set of constructs and relationships among constructs representing the core components of the intervention and the causal processes that result in outcomes
• Should be based on theory, empirical findings, discussion with the developer, and actual implementation
• Start with the change model because it is sufficiently abstract to be generalizable, yet specifies important components and processes, thus guiding operationalization, measurement, and analysis
Change Model: Generic Example
Teacher training in use of educational software (Intervention Component) → Teachers assist students in using educational software (Mediator) → Improved student learning (Outcome)
Change Model: Project LINCS
Adapted from Swafford, Jones, and Thornton, 1997
Instruction in geometry → Increase in teacher knowledge of geometry → Improved teacher instructional practice
Instruction in student cognition of geometry → Increase in teacher knowledge of student cognition → Improved teacher instructional practice
The Logic Model
The set of resources and activities that operationalize the change model for a particular implementation
A roadmap for implementation
Derived from the change model with input from the developer and other sources (literature, implementers, etc.)
Logic Model: Project LINCS
Adapted from Swafford, Jones, and Thornton, 1997
Geometry content course → Instruction in geometry → Increase in teacher knowledge of geometry
Research seminar on van Hiele model → Instruction in student cognition of geometry → Increase in teacher knowledge of student cognition
Both paths → Improved teacher instructional practice (what is taught, how it is taught, and the characteristics teachers display)
A Note on Models and Analysis:
Recall that one can specify models for both the treatment and control conditions.
The “true” cause is the difference between conditions, as reflected in the model for each.
Using the change model as a guide, one may design equivalent indices for each condition to determine the relative strength of the intervention (Achieved Relative Strength, ARS).
This approach will be discussed in the next presentation (Hulleman).
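As a minimal sketch of the ARS idea (our illustration, not the computation from the Hulleman presentation), Achieved Relative Strength can be expressed as a standardized mean difference in a fidelity index between conditions, analogous to Cohen’s d. The classroom scores below are invented for illustration.

```python
from statistics import mean, variance

def achieved_relative_strength(tx, ctrl):
    """Standardized mean difference in a fidelity index between the
    treatment and control conditions, using a pooled standard deviation."""
    n_t, n_c = len(tx), len(ctrl)
    # Pooled variance across conditions (sample variances weighted by df)
    pooled_var = ((n_t - 1) * variance(tx) +
                  (n_c - 1) * variance(ctrl)) / (n_t + n_c - 2)
    return (mean(tx) - mean(ctrl)) / pooled_var ** 0.5

# Illustrative fidelity index (proportion of core components in place) per classroom
treatment = [0.85, 0.90, 0.75, 0.80]
control = [0.30, 0.25, 0.40, 0.35]
ars = achieved_relative_strength(treatment, control)
```

A large ARS indicates that the treatment condition actually received a substantially stronger dose of the core components than control, which is the “true” cause the ITT contrast estimates.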
Steps 2 and 3: Develop Reliable and Valid Fidelity Indices and Apply to the Model
Examples of Fidelity Indices
Self-report surveys
Interviews
Participant logs
Observations
Examination of permanent products created during the implementation process
Index Reliability and Validity
Both are reported inconsistently
Report reliability at a minimum, because unreliable indices cannot be valid
Validity is probably best established from pre-existing information or side studies
We should be as careful in measuring the cause as we are in measuring its effects!
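For observation-based fidelity indices, reliability is often reported as inter-rater agreement. A minimal sketch of Cohen’s kappa for a two-rater fidelity checklist (the ratings are illustrative, not from any study cited here):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    n = len(rater_a)
    # Proportion of items on which the raters agree
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies
    count_a, count_b = Counter(rater_a), Counter(rater_b)
    expected = sum(count_a[c] * count_b[c] for c in count_a) / n ** 2
    return (observed - expected) / (1 - expected)

# 1 = component observed, 0 = not observed, for ten checklist items
rater_1 = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
rater_2 = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
kappa = cohens_kappa(rater_1, rater_2)
```

Raw percent agreement alone can look high by chance when most items are scored the same way; kappa corrects for that.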
Selecting Indices
• Guided foremost by the change model: identify core components as those that differ significantly between conditions and upon which the causal processes are thought to depend
• Use the logic model to determine fidelity indicator(s) for each change component
• Base the number and type of indices on the nature and importance of each component
Selecting Indices: Project LINCS
Adapted from Swafford, Jones, and Thornton, 1997
| Change Model Construct | Logic Model Components | Indicators | Indices |
|---|---|---|---|
| Instruction in geometry | Geometry content course | None; proposed: teacher attendance, content delivery | None; proposed: head count, observation |
| Instruction in student cognition of geometry | Research seminar on van Hiele model | None; proposed: teacher attendance, content delivery | None; proposed: head count, observation |
| Increase in teacher knowledge of geometry | None | Ability to apply geometry knowledge | Pre/post test of geometry knowledge |
| Increase in teacher knowledge of student cognition | None | Ability to describe student cognition | Pre/post test of van Hiele levels |
| Improved teacher instructional practice | What is taught | Alignment of lesson content with van Hiele levels | Observations |
| Improved teacher instructional practice | How it is taught | Particular instructional behaviors of teachers | Observations |
| Improved teacher instructional practice | Characteristics teachers display | Reflecting knowledge of student cognition in planning | Lesson plan task |
Step 4: Combining Fidelity Indices*
Why Combine Indices?
*May not be possible for the simplest models
*Depends on particular questions

Combine within a component to assess fidelity to a construct
Combine across components to assess a phase of implementation
Combine across the model to characterize overall fidelity and facilitate comparison of studies
Some Approaches to Combining Indices:
• Total percentage of steps implemented
• Average number of steps implemented

HOWEVER: These approaches may underestimate or overestimate the importance of some components!

• Weighting components based on the intervention model
• Sensitivity analysis
MAP Example
Weighting of training sessions for the MAP intervention
Cordray et al. (unpublished)
| Training Session | Month | Content | Initial Weight | Adjusted Weight |
|---|---|---|---|---|
| Session 1 | September | Administration | .25 | .10 |
| Session 2 | October | Data use | .25 | .30 |
| Session 3 | November | Differentiated instruction | .25 | .50 |
| Session 4 | May | Growth and planning | .25 | .10 |
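The contrast between the unweighted and weighted approaches can be sketched with the MAP adjusted weights above. The per-session fidelity scores are invented for illustration; only the weights come from the example.

```python
# Illustrative per-session fidelity scores (proportion of session content delivered)
session_fidelity = {"Session 1": 1.0, "Session 2": 0.8,
                    "Session 3": 0.6, "Session 4": 0.9}
# Model-based adjusted weights from the MAP example (sum to 1.0)
adjusted_weights = {"Session 1": 0.10, "Session 2": 0.30,
                    "Session 3": 0.50, "Session 4": 0.10}

# Unweighted composite: every session counts equally (the "average steps" approach)
unweighted = sum(session_fidelity.values()) / len(session_fidelity)

# Weighted composite: sessions the change model treats as core count more
weighted = sum(adjusted_weights[s] * f for s, f in session_fidelity.items())
```

Here the unweighted composite looks healthy, but the weighted composite is lower because the heavily weighted differentiated-instruction session was the one delivered with the least fidelity, which is exactly the distinction weighting is meant to surface.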
Step 5: Linking Fidelity Measures to Outcomes*
Linking Fidelity and Outcomes
• *Not possible in (rare) cases of perfect fidelity (no covariation without variation)
• *Depends on particular questions
• Provide evidence supporting the model (or not)
• Identify “weak links” in implementation
• Point to opportunities for “boosting” strength
• Identify incorrectly specified components of the model
Assessment to Instruction (A2i)
Teacher use of web-based software for differentiation of reading instruction
Professional development → Teachers use A2i recommendations for grouping and lesson planning; students use A2i → Students improve learning

Measures: time teachers were logged in, observation of instruction, pre/post reading assessment
(Connor, Morrison, Fishman, Schatschneider, & Underwood, 2007)
Assessment to Instruction (A2i)
Used hierarchical linear modeling (HLM) for the analysis
Overall effect size of .25, Tx vs. C
Pooling Tx + C, teacher time using A2i accounted for 15% of student performance
Since gains were greatest among teachers who both attended PD and were logged in more, concluded both components were necessary for the outcome
(Connor, Morrison, Fishman, Schatschneider, & Underwood, 2007)
Some Other Approaches to Linking from the Literature
• Compare results of hypothesis testing (e.g., ANOVA) when “low fidelity” classrooms are included or excluded
• Correlate overall fidelity index with each student outcome
• Correlate each fidelity indicator with the single outcome
• Calculate Achieved Relative Strength (ARS) and use HLM to link to outcomes
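The two correlational approaches above reduce to a simple bivariate correlation at the classroom level. A minimal sketch with invented data (real analyses would typically use HLM to respect the nesting of students in classrooms):

```python
from statistics import mean

def pearson_r(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

fidelity = [0.9, 0.7, 0.5, 0.8, 0.6]   # composite fidelity index per classroom
outcome = [82, 74, 65, 80, 70]         # mean student outcome per classroom
r = pearson_r(fidelity, outcome)
```

A strong positive correlation is evidence consistent with the change model; a near-zero correlation for a supposedly core component suggests a mis-specified model or a weak fidelity index.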
Concluding points
In Summary:
If we do not know what we are testing, we cannot know what the results of our tests mean.
Model-based (change and logic) assessment answers the question “Fidelity to what?”
There is a need for a systematic approach to fidelity assessment, which we describe.
The process is most useful when research designs incorporate it from the early stages.
Additional examples and refinement of measurement and analytical tools are needed.
References
Chen, H.T. (1990). Theory-driven evaluation. Thousand Oaks, CA: Sage Publications.
Cohen, J. (1988). Statistical Power Analysis for the Behavioral Sciences, 2nd ed. Hillsdale, NJ: Erlbaum.
Connor, C. M., Morrison, F. M., Fishman, B. J., Schatschneider, C., & Underwood, P. (2007). Algorithm-guided individualized reading instruction. Science, 315, 464-465.
Cook, T. (1985). Postpositivist critical multiplism. In R. L. Shotland & M. M. Marks (Eds.), Social science and social policy (pp. 21-62). Beverly Hills, CA: Sage.
Cordray, D.S. (2007) Assessing Intervention Fidelity in Randomized Field Experiments. Funded Goal 5 proposal to Institute of Education Sciences.
Cordray, D.S., Pion, G.M., Dawson, M., and Brandt, C. (2008). The Efficacy of NWEA’s MAP Program. Institute of Education Sciences funded proposal.
Donaldson, S.I., & Lipsey, M.W. (2006). Roles for theory in contemporary evaluation practice: Developing practical knowledge. In I. Shaw, J.C. Greene, & M.M. Mark (Eds.), The Handbook of Evaluation: Policies, Programs, and Practices (pp. 56-75). London: Sage.
Fuchs, L.S., Fuchs, D., and Karns, K. (2001). Enhancing kindergarteners’ mathematical development: Effects of peer-assisted learning strategies. Elementary School Journal, 101, 495-510.
Fuchs, L. S., Fuchs, D., Yazdian, L, & Powell, S. R. (2002). Enhancing First-Grade Children's Mathematical Development with Peer-Assisted Learning Strategies. School Psychology Review, 31, 569-583.
Gamse, B.C., Jacob, R.T., Megan, H., Boulay, B., Unlu, F., Bozzi, L., Caswell, L., Rodger, C., Smith, W.C., Brigham, N., and Rosenblum, S. (2008). Reading First Impact Study Final Report. Washington, D.C.: Institute of Education Sciences.
References
Ginsburg-Block, M. & Fantuzzo, J. (1997). Reciprocal peer tutoring: An analysis of teacher and student interactions as a function of training and experience. School Psychology Quarterly, 12, 1-16.
Holland, P.W. (1986). Statistics and causal inference. Journal of the American Statistical Association, 81(396), 945-960.
Hulleman, C. S., & Cordray, D. (2009). Moving from the lab to the field: The role of fidelity and achieved relative intervention strength. Journal of Research on Educational Effectiveness, 2(1), 88-110.
Hulleman, C.S., Cordray, D.S., Nelson, M.C., Darrow, C.L., & Sommer, E.C. (2009, June). The State of Treatment Fidelity Assessment in Elementary Mathematics Interventions. Poster presented at the annual Institute of Education Sciences Conference, Washington, D.C.
Institute of Education Sciences (2004). Pre-doctoral training grant announcement. Washington, DC: US Department of Education.
Knowlton, L.W. and Phillips, C.C. (2009). The Logic Model Guidebook: Better Strategies for Great Results. Washington, D.C.: Sage.
Kutash, K., Duchnowski, A. J., Sumi, W. C., Rudo, Z. & Harris, K. M. (2002). A school, family, and community collaborative program for children who have emotional disturbances. Journal of Emotional and Behavioral Disorders, 10(2), 99-107.
McIntyre, L.L., Gresham, F.M., DiGennaro, F.D., and Reed, D.D. (2007). Treatment integrity of school-based interventions with children in the Journal of Applied Behavior Analysis 1991-2005. Journal of Applied Behavior Analysis. 40, 659-672.
References
Michalopoulos, C. (2005). Precedents and Prospects for Randomized Experiments. In H.S. Bloom (Ed.) Learning More from Social Experiments, (pp. 1-36). New York, NY: Russell Sage Foundation.
Noell, G.H., Witt, J.C., Slider, N.J., Connell, J.E., Gatti, S.L., Williams, K.L., Koenig, J.L. & Resetar, J.L. (2005). Treatment Implementation Following Behavioral Consultation in Schools: A Comparison of Three Follow-up Strategies. School Psychology Review, 34(1), 87-106.
O'Donnell, C. L. (2008). Defining, Conceptualizing, and Measuring Fidelity of Implementation and Its Relationship to Outcomes in K-12 Curriculum Intervention Research. Review of Educational Research, 78(1), 33-84.
Shadish, W.R., Cook, T.D., and Campbell, D.T. (2002). Experimental and Quasi-Experimental Designs for Generalized Causal Inference. New York, NY: Houghton Mifflin Company.
Swafford, J.O., Jones, G.A., and Thornton, C.A. (1997). Increased Knowledge in Geometry and Instructional Practice. Journal for Research in Mathematics Education, 28(4), 467- 483.
Trochim, W. and Cook, J. (1992). Pattern matching in theory-driven evaluation: A field example from psychiatric rehabilitation. In H. Chen and P.H. Rossi (Eds.), Using Theory to Improve Program and Policy Evaluations (pp. 49-69). New York: Greenwood Press.