RTI Implementer Webinar Series: Implementing Response to Intervention (RTI)
Facilitating PS/RtI Capacity: Tools, Skills, and Strategies for Practitioners

Transcript of Facilitating PS/RtI Capacity: Tools, Skills, and Strategies for Practitioners
Facilitating PS/RtI Capacity:
Tools, Skills, and Strategies for Practitioners
NASP 2013 Mini-Skills Presentation
February 14th, 2013
Amanda March, Amber Brundage
Clark Dorman, Jose Castillo
Kevin Stockslager
University of South Florida
Purpose
• To enhance practitioners’ understanding of empirically supported systems-change procedures, detailing systems-theory and principles identified as critical for success of PS/RtI initiatives.
Learning Objectives
• Discussion of critical components of PS/RtI implementation & scale-up
• Review of a comprehensive three-phase systems-change model
  – Consensus
  – Infrastructure
  – Implementation
• Presentation, discussion, & practice of various tools to evaluate & progress monitor PS/RtI practices
• Participants will leave with knowledge, skills, & tools to facilitate PS/RtI implementation & evaluation in their local settings
Advanced Organizer
• PS/RtI & MTSS
• MTSS and Program Evaluation in the Schools
• Example Tools & Practice Activities
  – Beliefs on RtI Scale
  – Perceptions of RtI Skills Survey – Revised
  – Tier I and II Critical Components Checklist
• Discussion
• Resources & References
Introductions
• Who are we?
• Who are you?
– Clicker Activities
PS/RtI & MTSS
PS/RtI Model of Service Delivery
• RtI is the practice of providing high quality instruction matched to student needs and utilizing data to make educational decisions about students (Batsche et al., 2005)
• Components of a PS/RtI Model
  ▫ Integrated multi-tier model of service delivery
  ▫ Problem-solving method
  ▫ Integrated data collection and assessment system
MTSS Defined
• Evidence-based model of educating students that uses data-based problem-solving to integrate academic and behavioral instruction and interventions;
• Integrated instruction/interventions are delivered to students in varying intensities (multiple tiers) based on student need;
• Need-driven decision making seeks to ensure that the allocation of resources (district, school, classroom) is based on student need at the appropriate levels to accelerate the performance of all students to achieve and/or exceed proficiency.
Context
Florida’s MTSS
Changing Systems
Multi-Tiered System of Support Model in Education
Systems Approach:
• System – “The orderly combination of two or more individuals whose interaction is intended to produce a desired outcome.” (Curtis, Castillo, & Cohen, 2008)
• Principles of Systems Change:
  • Shared Mission, Beliefs, and Values
  • Key Stakeholder Involvement
  • Effective, Committed Leadership
  • Systems Perspective – “Big” Picture
  • Structured Planning and Problem-Solving
  • Data-Based Decision Making
  • Knowledge and Skills to Build Capacity Through Professional Development
Three-Phase Change Model
Florida’s Change Model
Consensus
Infrastructure
Implementation
Three-Phase Change Model
• Consensus
  – Belief is shared
  – Vision is agreed upon
  – Implementation requirements understood
• Infrastructure Development
  – Regulations
  – Training/Technical Assistance, Professional Development
  – Model (e.g., Standard Protocol)
  – Tier I and II intervention systems
    • E.g., K-3 Academic & Behavioral Support Plan
  – Data Management
  – Technology support
  – Decision-making criteria established
• Implementation
MTSS and Program Evaluation in the Schools
PS/RtI Evaluation Tool Technical Assistance Manual - Revised
www.floridarti.usf.edu
Florida PS/RtI Evaluation Tools
• Tools for Progress Monitoring PS/RtI
  – Self-Assessment of Problem-Solving Implementation (SAPSI)
• Tools for Examining Consensus Development
  – Beliefs on RtI Scale
  – Perceptions of Practice Survey
• Tools for Examining Infrastructure Development
  – Perceptions of RtI Skills Survey – Revised
  – Coaching Evaluation Survey – Revised
• Tools for Examining Integrity of PS/RtI
  – Tier I and II Observation Checklist
  – Tier I and II Critical Components Checklist
  – Problem-Solving Team Meeting Checklists – Initial & Follow-Up
  – Tier III Critical Components Checklist
Today’s Featured Tools
• Tools for Examining Consensus Development
  – Beliefs on RtI Scale
• Tools for Examining Infrastructure Development
  – Perceptions of RtI Skills Survey – Revised
• Tools for Examining Integrity of PS/RtI
  – Tier I and II Critical Components Checklist
MTSS Evaluation Issues
• The data you collect should be driven by the evaluation questions you want to answer
  – Are we implementing MTSS with fidelity?
  – Do we have the capacity to implement successfully?
  – Do staff buy into implementing MTSS?
Answering Evaluation Questions
• Use data to inform evaluation questions
• Use data to answer broad/specific questions
• Align analysis and data display with evaluation questions
• Consider available technology resources to facilitate analyses of data—online administration, automatic analysis, knowledge and skill of personnel
Beliefs on RtI Scale: Assessing Beliefs Integral to PS/RtI Practices
Description and Purpose
Beliefs on RtI Scale
The Beliefs of Educators
Research suggests:
• Educators’ beliefs about student learning styles, styles of teaching, and instructional strategies impact their willingness to implement new practices (Fang, 1996; Sparks, 2002)
• Educational reform occurs when educators understand the need for change and embrace this need as a moral imperative (Fullan, 2010; Sharratt & Fullan, 2009)
• Beliefs of leaders communicated to stakeholders influence the climate for successful implementation of new practices (Sharratt & Fullan, 2009)
Description of Survey
• Assesses educators’ beliefs regarding:
  – Student learning
  – The role of data in decision-making
  – Expectations for the effectiveness of instruction
• 19 items; 5-point Likert scale
  • 1 = Strongly Disagree … 5 = Strongly Agree
Purpose of Instrument
The purpose of the Beliefs on RtI Scale is to measure and inform consensus development in two ways:
1) Assess the impact of professional development on educator beliefs about PS/RtI
2) Identify commonly held beliefs that will likely facilitate or hinder implementation efforts
Administration procedures & scoring
Beliefs on RtI Scale
Administration procedures-Intended Audience
• Who should complete?
  – SBLT members
  – Instructional staff
• Who should use results?
  – SBLTs
  – DBLTs
Directions for Administration
• Methods for administration/dissemination
  – Completed individually
  – Anonymity
  – Opportunity for questions
• Role of school principal—explain the “why”
• Role of RtI coach/coordinator/SBLT member
• Frequency of use: resources, rationale, recommendations
Scoring
Two techniques to analyze survey responses:
1) Mean rating for each item calculated to determine average perceived belief level
2) Frequency of each response option selected calculated for each item
Calculating Item Mean
• Overall assessment of reported beliefs of educators within a school/district
• Can be done at domain (factor) and/or individual item level
  – Domain level: examine patterns in reported beliefs regarding 1) academic ability and performance of SWD, 2) data-based decision-making, and 3) functions of core and supplemental instruction
  – Item level: identify specific beliefs staff report vs. beliefs in need of exploration and support
Calculating Frequency of Response Options
• Provides information on the range of belief levels
• Can be used to determine what percentage of staff may require little, some, or high levels of support to implement PS/RtI
• Informs professional development decisions
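The two analysis techniques described above can be sketched in a short script. This is a minimal illustration only; the item names and ratings below are hypothetical, not actual Beliefs Survey data.

```python
# Minimal sketch of the two Beliefs Scale scoring techniques for
# 5-point Likert data (1 = Strongly Disagree ... 5 = Strongly Agree).
# Item names and ratings are hypothetical illustrations.
from collections import Counter

responses = {
    "Item A": [4, 5, 3, 4, 4, 2, 5],
    "Item B": [3, 3, 2, 4, 3, 3, 4],
}

for item, ratings in responses.items():
    # Technique 1: mean rating per item (average reported belief level)
    mean_rating = sum(ratings) / len(ratings)
    # Technique 2: percentage of respondents selecting each response option
    counts = Counter(ratings)
    percentages = {opt: 100 * counts.get(opt, 0) / len(ratings)
                   for opt in range(1, 6)}
    print(item, round(mean_rating, 2), percentages)
```

The percentage view is what supports the "little, some, or high levels of support" judgment: it shows how respondents spread across the response options rather than collapsing them into a single average.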
Interpretation and Use of Data
Beliefs on RtI Scale
[Figure: Castillo Elementary, SBLT Members — Beliefs Survey: Response Data by Factor. Mean factor scores (scale 1.00–5.00) for Factors 1–3 at Y1 BOY, Y1 EOY, Y2 EOY, Y3 EOY, and Y4 EOY.]

[Figure: Castillo Elementary, Spring 2012 SBLT Beliefs Survey — Factor Two (Data-Based Decision-Making). Percentage of total responses (Strongly Disagree through Strongly Agree) by item, including: use of flexible instructional practices; staff support resulting in more flexible interventions; general education intervention producing success for more students; prevention/early intervention reducing referrals and SPED placements; student RtI determining severity of academic and behavioral problems; student data being more accurate than teacher judgment; student RtI being more effective than "test" scores; prioritizing students below benchmarks; graphing data to facilitate data-based decisions; early parent involvement; parental involvement increasing student RtI; and assessment serving to increase effective instruction.]
Interpretation & Use of Data (cont.)
• Sharing data with stakeholders:
  – DBLTs, SBLTs, instructional staff
• Use data to:
  – Develop/adjust consensus-building goals
  – Design training/coaching activities
  – Facilitate consensus-building discussions re: rationale for PS/RtI, patterns and changes in beliefs over time
Practice Activity
Beliefs on RtI Scale
Perceptions of RtI Skills Survey: Assessing Perceptions of Skills Integral to PS/RtI Practices
Perceptions of Skills
The likelihood of embracing new practices increases when:
1) Educators understand the need for the practice
2) Educators perceive they either have the skills to implement the practice or will be supported in developing the required skills
Description and Purpose
Perceptions of RtI Skills Survey
Perceptions of Skills—Description and Purpose
• Theoretical Background:
  – Assess educators’ perceptions of the skills they possess to implement PS/RtI
  – Understand perceptions of skills and how perceptions change as a function of professional development to facilitate PS/RtI implementation
Description of Survey
• Assesses skills/amount of support needed for:
  – Applying PS/RtI practices to academic content
  – Applying PS/RtI practices to behavior content
  – Data manipulation and technology use
• 50 items; 5-point Likert scale
  • 1 = I do not have the skill at all (NS) … 5 = I am highly skilled in this area and could teach others (VHS)
Purpose of Instrument
Purpose of the Perceptions of RtI Skills Survey:
1) Assess the impact of professional development
2) Identify “comfort level” with PS/RtI practices to inform PD and allocate resources
Administration procedures & scoring
Perceptions of RtI Skills Survey
Administration & Scoring
• Same procedures as Beliefs on RtI Scale
Interpretation and Use of Data
Perceptions of RtI Skills Survey
Interpretation & Use of Data
• Three domains:
  – Perceptions of skills applied to academic content
  – Perceptions of skills applied to behavior content
  – Perceptions of data manipulation and technology use skills
• Three methodologies:
  – Calculate mean at domain level
  – Calculate mean at item level
  – Frequency/percentage of who selected each response option
• Identify specific skills/skill sets for PS/support
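The domain-level methodology extends the item-level one: one common approach is to average the item means within each domain. A minimal sketch, using item names from the survey's data-manipulation factor but hypothetical ratings:

```python
# Minimal sketch: domain (factor) mean computed as the average of the
# item means within the domain. Ratings are hypothetical illustrations.
domain_items = {
    "Data manipulation and technology use": {
        "14a. Graph target student data": [3, 4, 2, 5],
        "14b. Graph benchmark data": [4, 4, 3, 5],
        "14d. Draw an aimline": [2, 3, 2, 4],
    }
}

for domain, items in domain_items.items():
    item_means = {name: sum(r) / len(r) for name, r in items.items()}
    domain_mean = sum(item_means.values()) / len(item_means)
    print(domain, round(domain_mean, 2))
```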
[Figure: Perceptions of RtI Skills Survey — Item Response Data, Factor Three (Data Manipulation Skills). Percentage of total responses (No Skill at All, Minimal Skills, Some Support Necessary, Highly Skilled, Very Highly Skilled) for constituent items: graphing target student data, benchmark data, and peer data; drawing an aimline and a trendline; interpreting graphed PM data to determine student RtI; disaggregating data by various demographic factors; accessing intervention resources via the Internet; using PDAs to collect data; using the SWIS for PBS; graphing and displaying student and school data; and facilitating a Problem-Solving Team meeting.]
Practice Activity
Perceptions of RtI Skills Survey – Revised
Implementation Integrity
What is “Integrity” and why is it important?
• Integrity is the degree to which something was done the way it was intended to be done.
• When a process or procedure lacks “integrity,” few if any assumptions can be made about the outcome or impact of that process or procedure.
3 Ways to Assess Implementation Integrity
• Self-Report
  – Most efficient
  – Least reliable
• Permanent Product Reviews
  – Moderately efficient
  – Moderately reliable
• Observations
  – Least efficient
  – Most reliable
Tiers I and II Critical Components Checklist
Description and Purpose
Tiers I and II Critical Components Checklist
Theoretical Background
• Implementation of new practices is a gradual process that occurs in stages, not a one-time event (Fixsen, Naoom, Blase, & Wallace, 2005).
• Since many educational reform initiatives fail due to lack of implementation (Sarason, 1990), it is critical to examine implementation integrity.
• Several methods for examining implementation integrity exist (Noell & Gansle, 2006):
  – Self-report
  – Permanent product reviews
  – Observations
Description
• Permanent product review
• Measures the extent to which components of the PS/RtI process are evident in permanent products from data meetings addressing Tier I and/or Tier II content
• 11 items organized around the 4-step problem-solving process
  1. Problem identification
  2. Problem analysis
  3. Intervention development and implementation
  4. Program evaluation/RtI
• Response options: 0 = Absent, 1 = Partially Present, 2 = Present (N/A for some items)
Purpose
• To provide stakeholders with a practical methodology for evaluating the extent to which educators implement PS/RtI practices in data meetings addressing Tier I and/or Tier II content
• Permanent product reviews are typically more reliable than self-report, but more resource-intensive
Administration Procedures and Scoring
Tiers I and II Critical Components Checklist
Who should complete the checklist?
• The person completing the Tiers I and II CCC should have expertise in the PS/RtI model and in conducting permanent product reviews
  – Specifically, the 4 steps of the problem-solving process
• PS/RtI Coaches, school psychologists, literacy specialists, etc.
Directions for Administration
1. Identify content areas/grade levels being targeted by the school
2. Identify when Tier I/II data meetings occur and who is involved in the meetings
3. Find out who to contact for permanent products that come from identified meetings and what products will likely be available
4. Gather any relevant documents for the period of time for which the checklists are being completed
5. Complete the checklists using the Tier I and II CCC Standard Scoring Rubric
6. Complete inter-rater procedures when applicable
Frequency of Use
• Consider resources available, including:
  – The time needed to complete the instrument
  – The time needed to enter, analyze, graph, and disseminate data
  – Personnel available to support data collection
  – Additional data collection activities SBLT members and school staff participate in
• General recommendations
  – Data collection aligned with the school’s target content areas and grade levels
  – Aligned with the frequency of universal screening and progress monitoring data
Scoring
• Examples of two data analysis techniques
  1. Calculate the mean rating for each item
  2. Frequency distribution of each response option selected (i.e., Absent, Partially Present, and Present)
• Four domains
  1. Problem Identification (Items 1–3)
  2. Problem Analysis (Items 4–5)
  3. Intervention Development and Implementation (Items 6a–7c)
  4. Program Evaluation/RtI (Items 8–11)
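The same two techniques apply on the checklist's 0/1/2 scale. A minimal sketch; the item ratings are hypothetical (one value per set of reviewed meeting products), not actual checklist data:

```python
# Minimal sketch of the two CCC analysis techniques on the 0/1/2 scale.
# Ratings are hypothetical illustrations, one per reviewed data meeting.
from collections import Counter

LABELS = {0: "Absent", 1: "Partially Present", 2: "Present"}

item_ratings = {
    "Item 1": [2, 1, 2, 0, 2],
    "Item 4": [1, 1, 2, 1, 0],
}

for item, ratings in item_ratings.items():
    # Technique 1: mean rating (average level of implementation, 0-2)
    mean_rating = sum(ratings) / len(ratings)
    # Technique 2: frequency distribution of each response option
    counts = Counter(ratings)
    dist = {LABELS[k]: counts.get(k, 0) for k in LABELS}
    print(item, round(mean_rating, 2), dist)
```

Items scored N/A would simply be excluded from an item's ratings list before computing either statistic.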
Interpretation and Use of the Data
Tiers I and II Critical Components Checklist
Examination of Broad Domains
• Start by examining broad domains to evaluate the extent to which permanent products indicate PS/RtI practices are being implemented
• Examining the data graphically allows educators to determine the extent to which the major steps of problem-solving are occurring
• Examine implementation levels at each time point, as well as trends over time
[Figure: Tier I/II Critical Components Checklist — Mean Item Response Data, 2005–2006 through 2008–2009. Average level of implementation (0 = Absent, 1 = Partially Present, 2 = Present) by item: 1. data used to determine effectiveness of core; 2. decisions made to modify core or develop interventions; 3. universal screening used to identify groups needing intervention; 4. team uses hypotheses to identify reasons for not making benchmark; 5. data used to determine hypotheses for not making benchmark; 6a–6c. modifications made to core instruction (plan, support, and implementation documented); 7a–7c. supplemental instruction developed or modified (plan, support, and implementation documented); 8. criteria for positive RtI defined; 9. progress monitoring data scheduled/collected; 10. decision regarding student RtI documented; 11. plan to continue, modify, or terminate interventions provided. Items grouped by domain: Problem Identification, Problem Analysis, Intervention Development and Implementation, Program Evaluation/RtI.]
Identification of Specific Needs
• Tiers I and II CCC can be used to identify which components of problem-solving are more vs. less evident
• Consider what training educators have received and how long implementation efforts have been occurring
• Stakeholders can use these data to identify components of the problem-solving process that require additional support to be implemented
  – Professional development
  – Policies and procedures
• Important to consider all aspects of the school/district system that might contribute to implementation
Dissemination to Stakeholders
• Important to disseminate implementation data to key school and district stakeholders as quickly and frequently as possible
• Allow for stakeholders to discuss implementation levels, develop/alter implementation goals, and design strategies to increase implementation
Dissemination to Stakeholders (cont.)
• Sample guiding questions
  – What are the patterns?
  – Are there indicators that show zero implementation? Why?
  – How have you progressed in implementing the PS model with fidelity?
Practice Activity
Tier I & II Critical Components Checklist – Domain I (Problem Identification)
Tier I & II CCC Practice Activity Materials
• Tier I & II CCC Scoring Rubric
  – Domain 1
  – Questions 1–3
Practice Activity: Tier I & II CCC
• Domain 1 – Problem Identification
Discussion
• What are you currently doing to examine these areas in your district or school?
  – What are the critical questions you ask?
  – What data sources do you have to answer them?
  – What questions do you already have that you cannot answer with available data?
  – How do you use the data you collect to inform decisions?
• What areas need to be addressed as you return to your districts to plan? What are the priorities?
  – What critical questions do you need to start asking?
  – What data sources do you need?
  – How can you better use the data to inform decisions?
Additional Resources
• MTSS Implementation Components: Ensuring Common Language & Understanding
  – http://www.floridarti.usf.edu/resources/format/pdf/mtss_q_and_a.pdf
Additional Resources, cont.
• Implementing a Multi-Tiered System of Support for Behavior: Recommended Practices for School and District Leaders
  – http://flpbs.fmhi.usf.edu/pdfs/RtIB%20guide%20101811_final.pdf
• Guiding Tools for Instructional Problem-Solving (GTIPS)
  – http://www.florida-rti.org/_docs/GTIPS.pdf
• Florida’s PS/RtI Evaluation Tool Technical Assistance Manual – Revised (2012)
  – http://www.floridarti.usf.edu/resources/program_evaluation/ta_manual_revised2012/index.html
Floridarti.usf.edu
Flpbs.fmhi.usf.edu
• Amber Brundage– [email protected]
• Jose Castillo– [email protected]
• Clark Dorman– [email protected]
• Amanda March– [email protected]
• Kevin Stockslager– [email protected]