Systematic Evaluation Model to Ensure the Integrity of MTSS Implementation in Florida
2011 FASP Conference
November 4th, 2011
Kevin Stockslager
Kelly Justice
Beth Hardcastle
Advanced Organizer
• Accountability and Evaluation
• MTSS and Program Evaluation in the Schools
• Example of an MTSS Evaluation Model
• Review of Potential Data Sources
  • Surveys
  • Self-Assessments
  • Permanent Product Reviews
PS/RtI vs. MTSS
Accountability and Evaluation
What does…
• Accountability mean to you?
• Evaluation mean to you?
Accountability in Florida
• An increasing focus on accountability over the last decade
• Examples include:
  • School grading
  • AYP
  • Special education rules
  • DA
  • FEAPs & teacher evaluation systems
Impact of Accountability
Criticisms
• Lack of educator involvement
• Controversy
• Consequence driven
• Compliance driven
• Conflicting requirements
• "Duck and cover" approach

Positives
• Establishes and maintains standards for performance
• Reinforces use of data to monitor student outcomes
• Reinforces need to examine resource use
• Focus on student outcomes rather than process
• Success stories
(Hall & Hord)
Accountability & Evaluation Issues
• Compliance-driven versus informative evaluation
  • Evaluation often done to meet accountability requirements
  • Evaluation can serve to help integrate and improve school and district services
• Evaluation is fundamental to MTSS
• MTSS has the potential to:
  • Be viewed as one more thing we have to do, OR
  • Help address accountability & evaluation demands through the multi-tier framework
MTSS and Program Evaluation in the Schools
Important MTSS Evaluation Issues
• Stakeholders should be involved in all aspects of planning and carrying out the evaluation process, as well as in decision-making
• Goals identified through planning should drive the process
• Information is obtained to:
  • Determine where you currently are (needs)
  • Take ongoing looks at how things are working
  • Make decisions about what to keep doing and what to change or eliminate
MTSS Evaluation Issues cont.
• The data you collect should be driven by the evaluation questions you want to answer*
  • Are students meeting expectations? Academically? Behaviorally? Social-emotionally?
  • Are we implementing MTSS with fidelity?
  • Do we have the capacity to implement successfully?
  • Do staff buy into implementing MTSS?
*Example questions
Table Top Activity
• Brainstorm and discuss some additional evaluation questions that you might want to answer at your schools • (2-3 minutes then report out)
How Are Students Performing?
Examples of data sources
• Academics
  • FCAT
  • FAIR
  • Core K-12
  • End of Course Exams
• Behavior
  • Attendance
  • Tardies
  • Suspensions
  • Discipline referrals
• Global Outcomes
  • Graduation Rates
Are Schools Implementing MTSS with Fidelity?
Examples of data sources
• Curriculum and Instruction/Intervention
  • Principal walkthroughs
  • Lesson plans
  • Intervention Documentation Worksheets
• Components of MTSS and Data-Based Problem-Solving*
  • BOQ, PIC, BAT
  • SAPSI, Tier I & II CCCs, Tier III CCCs
* See http://flpbs.fmhi.usf.edu/ and http://floridarti.usf.edu for more information
[Figure: Tiers I & II Observation Checklist, percentage of roles/components present by problem-solving step. Roles Represented: 68%; Problem Identification: 76%; Problem Analysis: 54%; Intervention Development: 28%; Program Evaluation/RtI: 64%.]
Do We Have the Capacity to Implement MTSS with Fidelity?
Examples of data sources
• Leadership Team structure and functioning
  • Organizational charts
  • Minutes/meeting summaries
  • SAPSI, BOQ, PIC
• Staff knowledge and skills
  • FEAPs & teacher evaluation system
  • Staff development evaluations
  • Work samples
• Resources allocated to match needs
  • SIP, DIP
  • Master calendar/schedule
  • School rosters
  • Resource maps
Do Staff Buy Into Implementing MTSS?
Examples of data sources
• Leadership vision and commitment
  • SAPSI, BOQ, PIC
  • Required and non-required plans
• Staff buy-in
  • SAPSI, BOQ, PIC
  • District/school staff and climate surveys
  • Dialogue
  • Brief interviews with key personnel
[Figure: Sunshine Elementary, Self-Assessment of Problem Solving Implementation (SAPSI) Consensus Building. Bar graph comparing BOY and EOY indicator status (0 = Not Started, 1 = In Progress, 2 = Achieved, 3 = Maintaining) for five items: 1. District commitment; 2. SBLT support; 3. Faculty involvement; 4. SBLT present; 5. Data to assess commitment.]
Example of an MTSS Evaluation Model
Table Top Activity
• Mock Small-Group Planning and Problem-Solving Process
Small-Group Planning and Problem-Solving Process
1. What is our desired goal?
2. Brainstorm the resources and barriers to achieving our goal
3. Select a barrier (or group of related barriers) to address first
4. Brainstorm strategies to reduce or eliminate our selected barrier
5. Develop an action plan to reduce or eliminate our selected barrier
   • Include who, what, when (Be specific!)
6. Develop a follow-up plan for each action
   • Include who, what, when
7. Develop a plan to evaluate the reduction or elimination of our chosen barrier
8. Develop a plan to evaluate progress towards achieving our goal from Step 1
Mock Small-Group Planning and Problem-Solving
1. Goal: Develop and implement a data-based evaluation system in my school and/or district
2. Brainstorm the resources and barriers to achieving our goal
3. Select a barrier (or group of related barriers) to address first
4. Brainstorm strategies to reduce or eliminate our selected barrier
Potential Data Sources
Perceptions of RtI Skills Survey
Assessing Perceptions of Skills Integral to PS/RtI Practices
Briefly…
• Role of survey data
• Beliefs Survey
• Perceptions of Practices Survey
• Perceptions of Skills Survey
The likelihood of embracing new practices increases when:
1) Educators understand the need for the practice
2) Educators perceive they either have the skills to implement the practice or will be supported in developing the required skills
(Showers, Joyce, & Bennett, 1987)
Description and Purpose
Perceptions of RtI Skills Survey
Perceptions of Skills—Description and Purpose
• Theoretical Background:
  • Assess educators' perceptions of the skills they possess to implement PS/RtI
  • Understand perceptions of skills and how perceptions change as a function of professional development, to facilitate PS/RtI implementation
Description of Survey
• Assesses skills/amount of support needed for:
  • Applying PS/RtI practices to academic content
  • Applying PS/RtI practices to behavior content
  • Data manipulation and technology use
• 20 items; 5-point Likert scale
  • 1 = I do not have the skill at all (NS) … 5 = I am highly skilled in this area and could teach others (VHS)
Purpose of Instrument
Purpose of the Perceptions of RtI Skills Survey:
1) Assess impact of professional development
2) Identify "comfort level" with PS/RtI practices to inform PD and allocate resources
Administration Procedures & Scoring
Perceptions of RtI Skills Survey
Administration Procedures: Intended Audience
• Who should complete it?
  • SBLT members
  • Instructional staff
• Who should use the results?
  • SBLTs
  • DBLTs
Directions for Administration
• Methods for administration/dissemination
  • Completed individually
  • Anonymity
  • Opportunity for questions
• Role of school principal: explain the "why"
• Role of RtI coach/coordinator/SBLT member
• Frequency of use: resources, rationale, recommendations
Scoring
Two techniques to analyze survey responses:
1) Calculate the mean rating for each item to determine the average perceived skill level
2) Calculate the frequency of each response option selected for each item
Calculating Item Means
• Provides an overall assessment of the perceived skills of educators within a school/district
• Can be done at the domain (factor) and/or individual item level
  • Domain level: examine patterns in perceived skills regarding academic content, behavior content, and data manipulation/technology use
  • Item level: identify specific skills staff perceive they possess vs. skills in need of support
Calculating Frequency of Response Options
• Provides information on range of perceived skill levels
• Can be used to determine what percentage of staff may require little, some, or high levels of support to implement PS/RtI
• Informs professional development decisions
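The two analysis techniques described above (item means and response-option frequencies) can be sketched in code. This is a minimal illustration, assuming each item's responses are stored as a simple list of 1-5 ratings; the ratings shown are hypothetical, not actual survey data.

```python
# Sketch of the two Likert-survey analysis techniques described above.
# Assumes one list of 1-5 ratings per survey item (illustrative data only).
from collections import Counter

def item_mean(ratings):
    """Average perceived skill level for one item."""
    return sum(ratings) / len(ratings)

def response_frequencies(ratings, options=(1, 2, 3, 4, 5)):
    """Percentage of respondents selecting each response option."""
    counts = Counter(ratings)
    n = len(ratings)
    return {opt: 100 * counts.get(opt, 0) / n for opt in options}

# Hypothetical ratings from ten staff members on a single item
ratings = [2, 3, 3, 4, 4, 4, 5, 3, 2, 4]
print(item_mean(ratings))            # average perceived skill (3.4)
print(response_frequencies(ratings)) # percent per response option
```

The frequency output maps directly onto the "little, some, or high levels of support" breakdown the slide describes: low ratings indicate staff likely needing substantial support.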
Answering Evaluation Questions
• Use data to inform evaluation questions
• Use data to answer broad/specific questions
• Align analysis and data display with evaluation questions
• Consider available technology resources to facilitate analyses of data: online administration, automatic analysis, knowledge and skill of personnel
Technical Adequacy
Perceptions of RtI Skills Survey
Technical Adequacy
Content validity:
• Item set developed to represent perceived skills important to implementing PS/RtI
• Reviewed by Educator Expert Validation Panel (EEVP)
Construct validity:
• Factor analysis conducted using a sample of 2,184 educators
• Three resultant factors
Technical Adequacy (cont.)
Internal Consistency Reliability:
• Factor 1 (Perceptions of RtI skills applied to academic content): α = .97
• Factor 2 (Perceptions of RtI skills applied to behavior content): α = .97
• Factor 3 (Perceptions of Data Manipulation and Technology Use Skills): α = .94
Interpretation and Use of Data
Perceptions of RtI Skills Survey
Interpretation & Use of Data
• Three domains:
  • Perceptions of skills applied to academic content
  • Perceptions of skills applied to behavior content
  • Perceptions of data manipulation and technology use skills
• Three methodologies:
  • Calculate the mean at the domain level
  • Calculate the mean at the item level
  • Calculate the frequency/percentage of respondents who selected each response option
• Identify specific skills/skill sets for PD/support
[Figure: Perceptions of RtI Skills Survey, Item Response Data, Factor Three (Data Manipulation Skills). Stacked bar graph of the percentage of total responses at each skill level (No Skill at All, Minimal Skills, Some Support Necessary, Highly Skilled, Very Highly Skilled) for each constituent item and the overall factor. Items: 14a. Graph target student data; 14b. Graph benchmark data; 14c. Graph peer data; 14d. Draw an aimline; 14e. Draw a trendline; 15. Interpret graphed PM data to determine student RtI; 19. Disaggregate data by various demographic factors; 20a. Access intervention resources via the Internet; 20b. Use PDAs to collect data; 20d. Use the SWIS for PBS; 20e. Graph and display student and school data; 21. Facilitate a Problem-Solving Team meeting.]
Interpretation & Use of Data (cont.)
• Sharing data with stakeholders:
  • DBLTs, SBLTs, instructional staff
• Use data to:
  • Develop/adjust PD goals
  • Design training/coaching activities
  • Facilitate consensus-building discussions regarding rationale for PD, patterns, and barriers
Facilitating Discussions
Sample guiding questions…
• To what extent do you believe your school possesses the skills to use school-based data to evaluate core instruction (Tier 1)? Supplemental instruction (Tier 2)?
• Based on what staff has learned about data-based decision-making, how consistent are those skills with PS/RtI practices (i.e., to what degree do teams evaluate the effectiveness of core and supplemental instruction)?
Table Top Activity
With a partner, examine the data graph for Alligator Elementary:
1) After the BOY administration of the Skills survey, what conclusions could be made about the level of support needed by staff to apply PS/RtI practices to behavior content?
2) Compare changes in skill levels from BOY to EOY. Then compare items suggesting substantial growth to items suggesting little or no growth. What decisions could be made regarding professional development and needed support?
Implementation Integrity
What is "Integrity" and why is it important?
• Integrity is the degree to which something was done the way it was intended to be done.
• When a process or procedure lacks “integrity,” few if any assumptions can be made about the outcome or impact of that process or procedure.
Tools to Measure Implementation Integrity
• SAPSI
• Tier I and II Critical Components Checklist
• Tier III Critical Components Checklist
• Tier I and II Observation Checklist
• Problem-Solving Team Meeting Checklists (Initial and Follow-up)
Monitoring implementation to guide systemic decision-making
SAPSI (SELF-ASSESSMENT OF PROBLEM SOLVING IMPLEMENTATION)
Description and Purpose
SAPSI
SAPSI: Theoretical Background
• Assesses the extent to which schools are making progress toward full implementation
• Implementation is a gradual process
• Many reform efforts fail due to: LACK OF IMPLEMENTATION (Sarason, 1990)
• Implementation integrity must be examined
SAPSI: Description
• Self-report measure
• Organized around a systems change model: consensus, infrastructure, implementation
• Collaboratively completed by SBLTs (School-Based Leadership Teams)
• Response options: (N) Not Started, (I) In Progress, (A) Achieved, (M) Maintaining
Florida’s Change Model
Consensus
Infrastructure
Implementation
Change Model
• Consensus
  • Belief is shared
  • Vision is agreed upon
  • Implementation requirements understood
• Infrastructure Development
  • Regulations
  • Training/Technical Assistance
  • Tier I and II intervention systems
    • E.g., K-3 Academic Support Plan
  • Data Management
  • Technology support
  • Decision-making criteria established
• Implementation
SAPSI: Purpose
Two-fold:
1. Assess current level of implementation
   • In what areas do we need to take action in order to facilitate implementation?
2. Progress monitor implementation
   • How successful have our actions been? What systemic needs still exist?
Administration Procedures & Scoring
SAPSI
SAPSI: Intended Audience
• School-Based Leadership Team (SBLT)
  • 6-8 member multi-disciplinary team
  • Leadership role in facilitating implementation
  • Trained on PS/RtI and systems change
  • Roles and responsibilities (i.e., facilitator, time-keeper, data coach, recorder, content-area expertise)
SAPSI: Directions for Administration
STEP 1: Ensure understanding
• Facilitator ensures content and format are understood
• SBLT receives information regarding purpose, what is measured, how data are used, and completion procedures
SAPSI: Directions for Administration
(cont.)
STEP 2: Individual preview
• Approximately one week prior to completion, distribute a copy to each SBLT member
• Members complete it individually
• Members record their perspective and prepare to contribute to discussion
SAPSI: Directions for Administration (cont.)
STEP 3: Group completion
• Facilitator guides discussion and records group responses
• Consensus reached on each item
• Completion takes 30 min. – 2 hours
SAPSI: Directions for Administration (cont.)
“N” – Not started = occurs less than 24% of time
“I” – In progress = occurs approx. 25-74% of time
“A” – Achieved = occurs approx. 75-100% of time
“M” – Maintaining = rated “achieved” last time and continues to occur 75-100% of time
SAPSI: Scoring
OPTIONS FOR ANALYSIS:
1. Average activity level across domains or by item
   • What are the general patterns of change?
   • To what extent are staff engaging in specific activities?
2. Frequency of each response option
   • What percentage of schools are engaged in specific activities?
SAPSI: Scoring (cont.)
AVERAGE ACTIVITY LEVEL by domain
• Examine general patterns in consensus, infrastructure, and implementation
• A domain score is calculated for each of the three domains:
  Domain score = (sum of the ratings of items in the domain) ÷ (total number of items) = average activity level for the domain
SAPSI: Scoring (cont.)
Calculating Average Activity Level (domain level)
SAPSI: Scoring (cont.)
Items That Comprise Each Domain
• Domain 1 (Consensus): Items 1-5• Domain 2 (Infrastructure): Items 6-20• Domain 3 (Implementation): Items 21a-27
SAPSI: Scoring (cont.)
RESPONSE OPTION VALUES:
"N" – Not started = 0
"I" – In progress = 1
"A" – Achieved = 2
"M" – Maintaining = 3
SAPSI: Scoring Example
Domain 1 (Consensus):
1. N = 0
2. I = 1
3. I = 1
4. A = 2
5. A = 2
Sum = 6; 6 ÷ 5 = 1.2 (average activity level for Domain 1). On average, the school is "in progress" with consensus building.
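The domain-score calculation above can be sketched in a few lines of code, using the response-option values the slides define (N = 0, I = 1, A = 2, M = 3). This is an illustrative sketch, not part of the SAPSI materials themselves.

```python
# Sketch of the SAPSI average-activity-level calculation described above.
# Response-option values per the scoring slides: N=0, I=1, A=2, M=3.
VALUES = {"N": 0, "I": 1, "A": 2, "M": 3}

def domain_average(ratings):
    """Average activity level = sum of item values / number of items."""
    total = sum(VALUES[r] for r in ratings)
    return total / len(ratings)

# Domain 1 (Consensus) example from the slide: N, I, I, A, A
print(domain_average(["N", "I", "I", "A", "A"]))  # 6 / 5 = 1.2
```

The same function works at the item level across schools: pass one school's rating per item instead of one item's rating per school.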
SAPSI: Scoring (cont.)
AVERAGE ACTIVITY LEVEL by item
• Identify extent to which educators are engaging in specific activities
• Identify activities that need to be addressed systemically
• Does NOT provide information re: variability among schools for each activity
SAPSI: Scoring (cont.)
FREQUENCY OF EACH RESPONSE OPTION
• Range of activity levels across schools
• Determine the percentage of schools engaged in specific activities
• Gauge the magnitude of the problem (All schools? Some? Few?)
QUESTIONS DRIVE YOUR ANALYSES
What are the general trends in change stages across the district over time?
(avg. activity level by domain)
Which activities should be systematically addressed through PD or district policy?
(avg. activity level by item)
How can we most efficiently deploy resources to address school needs?
(frequency of response option)
Technical Adequacy SAPSI
SAPSI: Content Validity
• Evidenced by careful identification and definition of the measured content:
  A. As reflected in the systems change literature
  B. Based on a review of instruments that purport to measure the identified domains
• Adapted from IL-ASPIRE SAPSI v. 1.6
• Matched to Florida's systems change model
• Modified to align with Florida's PS/RtI Model
SAPSI: Internal Consistency Reliability
• Computed separately for each of the three domains
• Utilized SAPSIs administered to 34 pilot schools in Winter 2010
• Cronbach's alpha coefficients:
  • Consensus: α = .64
  • Infrastructure: α = .89
  • Implementation: α = .91
Interpretation and use of data
SAPSI
Examining the Broad Domains
• Examine the three broad domains first (i.e., consensus, infrastructure, implementation)
• Graphs used to examine levels
• Previously mentioned scoring methods used
• Frequency of response option often used by the Project to examine aggregate pilot school data
[Figure: Aggregate Florida PS/RtI Project Pilot Schools, Self-Assessment of Problem Solving Implementation (SAPSI), Consensus. Stacked bar graph of the percent of schools at each status (Not Started, In Progress, Achieved, Maintaining) at each administration (Year 1 BOY through Year 3 EOY) for five items: 1. District commitment; 2. SBLT support; 3. Faculty involvement; 4. SBLT present; 5. Data to assess commitment.]
[Figure: School-Level Data, Self-Assessment of Problem Solving Implementation (SAPSI), Consensus. Line graph of item status (0 = Not Started, 1 = In Progress, 2 = Achieved, 3 = Maintaining) from Year 1 BOY through Year 4 EOY for five items: 1. District commitment; 2. SBLT support; 3. Faculty involvement; 4. SBLT present; 5. Data to assess commitment.]
Identification of Specific Needs
• Graph items to identify trends and which activities are engaged in more/less frequently
• Consider various factors when examining levels of activity engagement (e.g., training, length of implementation, SEA/LEA policies)
• Self-report data are valuable but positively biased; compare with other implementation data
SAPSI: Sharing Data with Stakeholders
• Scale-up practices should include a plan for dissemination, analysis and discussion
• Identify key stakeholders [e.g., instructional staff, SBLT, DBLT (District-based Leadership Team)]
• Share data quickly and frequently
SAPSI: Sharing Data with Stakeholders (cont.)
• SBLTs – use data to strategize, develop/alter goals, update instructional staff
• DBLTs – use data to inform district level support and policy
• Stakeholder support/input is critical for effective action planning
SAPSI: Using Guiding Questions
• What are the patterns?
  • What patterns are evident among each of the individual items on the checklist and across all data sources?
  • What steps of the problem-solving process are occurring more frequently? Less frequently?
  • Are there any current indicators that show a zero or low level of implementation? Why?
    • Have these been targeted in the past?
    • Do barriers exist with consensus or infrastructure?
    • Other priorities?
    • Meetings not happening or not focusing on implementation?
SAPSI: Using Guiding Questions (cont.)
• How have you progressed in implementing the Problem-Solving Model with fidelity?
  • Looking across all fidelity measures (CCC, SAPSI, and Observations), what are the general levels of implementation? What are the general trends?
• Do the data from the Critical Components Checklist and Observations support what is evident in SAPSI items 22a-22i?
  • Are there discrepancies among the different sources of data on use of the Problem-Solving Model?
  • How might these discrepancies be interpreted?
School-wide Data Example
Table Top Activity
• Think about one school that you are currently serving
• Take a few minutes to complete the “Consensus” section of the SAPSI
• Share out
Tiers I and II Critical Components Checklist
Description and Purpose
Tiers I and II Critical Components Checklist
Theoretical Background
• Implementation of new practices is a gradual process that occurs in stages, not a one-time event (Fixsen, Naoom, Blase, & Wallace, 2005)
• Because many educational reform initiatives fail due to lack of implementation (Sarason, 1990), it is critical to examine implementation integrity
• Several methods for examining implementation integrity exist (Noell & Gansle, 2006):
  • Self-report
  • Permanent product reviews
  • Observations
Description
• Permanent product review
• Measures the extent to which components of the PS/RtI process are evident in permanent products from data meetings addressing Tier I and/or Tier II content
• 11 items organized around the 4-step problem-solving process:
  1. Problem identification
  2. Problem analysis
  3. Intervention development and implementation
  4. Program evaluation/RtI
• Response options: 0 = Absent, 1 = Partially present, 2 = Present (N/A for some items)
Problem-Solving Process
[Figure: the four-step problem-solving cycle.]
1. Define the Problem: What do we want the student(s) to know and be able to do? (Validate the problem)
2. Problem Analysis: Identify variables that contribute to the problem; develop a plan
3. Implement Plan: Implement as intended; progress monitor; modify as necessary
4. Evaluate Response to Intervention (RtI)
Tiered Model of School Supports & the Problem-Solving Process
ACADEMIC and BEHAVIOR SYSTEMS
• Tier 1: Core, Universal Instruction & Supports. General instruction and support provided to all students in all settings.
• Tier 2: Targeted, Strategic Interventions & Supports. More targeted interventions and supplemental support in addition to the core curriculum and school-wide positive behavior program.
• Tier 3: Intensive, Individualized Interventions. Individual or small-group intervention.
Revised 10.07.09
Purpose
• To provide stakeholders with a practical methodology for evaluating the extent to which educators implement PS/RtI practices in data meetings addressing Tier I and/or Tier II content
• Permanent product reviews are typically more reliable than self-report, but more resource-intensive
Administration Procedures and Scoring
Tiers I and II Critical Components Checklist
Who should complete the checklist?
• The person completing Tiers I and II CCC should have expertise in PS/RtI model and conducting permanent product reviews• Specifically, the 4 steps of the problem-solving
process
• PS/RtI Coaches, school psychologists, literacy specialists, etc.
Who should use the results for decision-making?
• School-Based Leadership Team (SBLT)
  • SBLT should take a leadership role in implementing PS/RtI in their school
  • SBLT should have representation across staff
  • SBLT members should receive training in PS/RtI
• District-Based Leadership Team (DBLT)
Directions for Administration
• Step 1
  • Identify the content areas and grade levels being targeted by the school(s) for which the Tiers I and II CCC is being completed
  • It is recommended that the checklists be completed from products derived from Tier I and II data meetings related to the goals of the school
Directions for Administration (cont.)
• Step 2
  • Identify when Tier I and II data meetings occur and who is involved in the meetings
  • Examples of common meetings include leadership team meetings, grade-level meetings involving teachers, team meetings, and meetings during which small-group interventions are planned
  • Meetings focused on Tier I instruction typically occur 3-4 times per year, more frequently for Tier II instruction
  • The Tier I and II CCC is not completed for meetings in which individual student problem-solving occurred
Directions for Administration (cont.)
• Step 3
  • Find out whom to contact for permanent products that come from the identified meetings, and what products will likely be available
Directions for Administration (cont.)
• Step 4
  • Gather any relevant documents for the period of time for which the checklists are being completed
  • Reviewers may choose to complete the Tier I and II CCC to align with universal screening windows
    • Example: if universal screening data are collected 3 times per year, then the Tier I and II CCC could be completed from the products derived from each data meeting
  • Once the time frame is identified, permanent products from Tier I and II data meetings can be reviewed
Directions for Administration (cont.)
• Step 5
  • Complete the checklists using the Tier I and II CCC Standard Scoring Rubric
  • The rubric provides criteria for how to score each item
  • Recommended to complete a checklist for each target area and grade level the school is targeting
  • It is important that those completing the checklist have knowledge of the problem-solving process
Directions for Administration (cont.)
• Step 6
  • Complete inter-rater procedures when applicable
  • Ensuring that permanent product reviews are completed accurately is critical to data collection
  • Periodically, have two reviewers complete a Tier I and II CCC using products from the same meeting and compare results
  • Allow reviewers to compare notes and discuss differences
  • The frequency of inter-rater procedures depends on the time and resources of reviewers
Frequency of Use
• Consider the resources available, including:
  • Time needed to complete the instrument
  • Time needed to enter, analyze, graph, and disseminate data
  • Personnel available to support data collection
  • Additional data collection activities SBLT members and school staff participate in
• General recommendations
  • Align data collection with the school's target content areas and grade levels
  • Align with the frequency of universal screening and progress monitoring data collection
Scoring
• Examples of two data analysis techniques:
  1. Calculate the mean rating for each item
  2. Calculate the frequency distribution of each response option selected (i.e., Absent, Partially present, and Present)
• Four domains:
  1. Problem Identification (Items 1-3)
  2. Problem Analysis (Items 4-5)
  3. Intervention Development and Implementation (Items 6a-7c)
  4. Program Evaluation/RtI (Items 8-11)
Scoring (cont.)
1. Calculating the mean rating for each item
   • Provides an overall impression of implementation
   • Allows for examination of general patterns of implementation
2. Frequency distribution of each response option
   • Provides information on the range of implementation
   • Can be used to determine what percentage of schools or grade levels implemented, partially implemented, or did not implement components of PS/RtI
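Both CCC analysis techniques above can be combined in one small summary function. This is an illustrative sketch, assuming each item's ratings across schools are stored as a list of 0/1/2 values; the data shown are hypothetical.

```python
# Sketch of the two Tier I/II CCC analysis techniques described above.
# Ratings per item: 0 = Absent, 1 = Partially present, 2 = Present.
from collections import Counter

def implementation_summary(ratings):
    """Mean rating and percent of schools at each level for one item."""
    n = len(ratings)
    counts = Counter(ratings)
    labels = {0: "Absent", 1: "Partially present", 2: "Present"}
    pct = {labels[k]: 100 * counts.get(k, 0) / n for k in labels}
    return sum(ratings) / n, pct

# Hypothetical ratings for one item across eight schools
mean, pct = implementation_summary([2, 2, 1, 2, 0, 1, 2, 2])
print(mean)  # overall impression of implementation (1.5)
print(pct)   # % of schools at each implementation level
```

Running the summary per item and per domain supports the graphs the Project uses, e.g. mean item responses across school years.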
[Figure: Tier I/II Critical Components Checklist, Mean Item Response Data, 2005-2006 through 2008-2009. Bar graph of the average level of implementation (0 = Absent, 1 = Partially Present, 2 = Present) for each item, grouped by domain (Problem Identification, Problem Analysis, Intervention Development and Implementation, Program Evaluation/RtI). Items: 1. Data used to determine effectiveness of core; 2. Decisions made to modify core or develop interventions; 3. Universal screening used to identify groups needing intervention; 4. Team uses hypotheses to identify reasons for not making benchmark; 5. Data used to determine hypotheses for not making benchmark; 6a-6c. Modifications made to core instruction (plan, support, and implementation documented); 7a-7c. Supplemental instruction developed or modified (plan, support, and implementation documented); 8. Criteria for positive RtI were defined; 9. Progress monitoring data scheduled/collected; 10. Decision regarding student RtI was documented; 11. Plan to continue, modify, or terminate interventions provided.]
Technical Adequacy
Tiers I and II Critical Components Checklist
Content Validity
• Review of relevant literature, presentations, instruments, and previous program evaluation projects to develop an item set representative of the critical components of PS/RtI implementation
Inter-Rater Agreement
• The ability of reviewers to provide reliable data has been supported by inter-rater agreement among PS/RtI Project Coaches completing the instrument
  • Inter-rater agreement = number of agreements ÷ total number of items
  • Average inter-rater agreement = 91.16%
Interpretation and Use of the Data
Tiers I and II Critical Components Checklist
Examination of Broad Domains
• Start by examining the broad domains to evaluate the extent to which permanent products indicate PS/RtI practices are being implemented
• Examining the data graphically allows educators to determine the extent to which the major steps of problem-solving are occurring
• Examine implementation levels at each time point, as well as trends over time
Identification of Specific Needs
• The Tiers I and II CCC can be used to identify which components of problem-solving are more vs. less evident
• Consider what training educators have received and how long implementation efforts have been occurring
• Stakeholders can use these data to identify components of the problem-solving process that require additional support to be implemented
  • Professional development
  • Policies and procedures
• It is important to consider all aspects of the school/district system that might contribute to implementation
Dissemination to Stakeholders
• Important to disseminate implementation data to key school and district stakeholders as quickly and frequently as possible
• Allow for stakeholders to discuss implementation levels, develop/alter implementation goals, and design strategies to increase implementation
Dissemination to Stakeholders (cont.)
• Guiding questions
  • What are the patterns?
    • What patterns are evident among each of the items?
    • What steps of the PS process are occurring more/less frequently?
    • Are there indicators that show zero implementation? Why?
      • Have these been targeted in the past?
      • Do barriers exist with consensus or infrastructure?
      • Other priorities?
      • Meetings not happening or not focusing on implementation?
Dissemination to Stakeholders (cont.)
• Guiding questions (cont.)
  • How have you progressed in implementing the PS model with fidelity?
    • Looking across all fidelity measures (CCC, SAPSI, Observations), what are the general levels of implementation? What are the general trends?
  • Do the data from the CCC and Observations support what is evident in SAPSI items 22a-22i?
    • Are there discrepancies among different data sources on use of the PS model?
    • How might these discrepancies be interpreted?
Table Top Activity
• Discuss the critical pieces of information that you would want to communicate to your SBLT and administrators related to evaluation • (2-3 minutes then report out)
Discussion
• What are you currently doing to examine these areas in your district or school?
  • What are the critical questions you ask?
  • What data sources do you have to answer them?
  • What questions do you already have that you cannot answer with available data?
  • How do you use the data you collect to inform decisions?
• What areas need to be addressed as you return to your districts to plan? What are the priorities?
  • What critical questions do you need to start asking?
  • What data sources do you need?
  • How can you better use the data to inform decisions?
Additional Resources
floridarti.usf.edu
flpbs.fmhi.usf.edu
Thank you!!!
• Kevin Stockslager: kstocksl@usf.edu
• Beth Hardcastle: hardcast@usf.edu
• Kelly Justice: justice@usf.edu