MiBLSi Schools’ Implementation Process and Student Outcomes


MiBLSi Schools’ Implementation Process and Student Outcomes

Anna L. Harms, Michigan State University

MiBLSi State Conference 2009 1

Agenda

• Reasons for studying implementation and ways to do it
• Linking research to our schools’ data
• Next steps
• Questions and feedback


The Status of Research

• Primary focus has been on developing and identifying practices . . .
– National Reading Panel Reports
– What Works Clearinghouse
– Florida Center for Reading Research Reviews
– OJJDP Model Programs
– Center for the Study and Prevention of Violence Model Programs


What determines the evidence base for a practice?


• Independent randomized control trial is the gold standard

• Effect size (Cohen, 1988):
– Large: .80
– Moderate: .50
– Minimal/Weak: .20
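These benchmarks can be made concrete with a short sketch (the helper names are hypothetical; it uses the standard pooled-SD formula for Cohen's d with two independent groups):

```python
import math

def cohens_d(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd

def benchmark(d):
    """Classify |d| against Cohen's (1988) benchmarks."""
    d = abs(d)
    if d >= 0.80:
        return "large"
    if d >= 0.50:
        return "moderate"
    if d >= 0.20:
        return "minimal/weak"
    return "below Cohen's smallest benchmark"

# A treatment group averaging 105 vs. a control averaging 100,
# both with SD 10 and n = 30, yields d = 0.5 ("moderate").
d = cohens_d(105, 100, 10, 10, 30, 30)
```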

Efficacy vs. Effectiveness (Christensen, Carlson, & Valdez, 2003)

• Efficacy
– Controlled conditions
– Conducted by innovation developers

• Effectiveness
– External to the developers of an innovation
– Replication
– Under different conditions


RESEARCH → IMPLEMENTATION → PRACTICE

Greenberg, Domitrovich, Graczyk, & Zins (2005)


PLANNED INTERVENTION + PLANNED IMPLEMENTATION SUPPORT SYSTEM
↓
PROGRAM AS IMPLEMENTED = ACTUAL INTERVENTION + ACTUAL IMPLEMENTATION SUPPORT

NIRN/SISEP

• Framework for Implementation
• Stages of Implementation
• Core Implementation Components
• Multi-level Influences on Successful Implementation


Effective Intervention Practices
+ Effective Implementation Strategies
_______________________________
= Positive Outcomes for Students

SISEP, 2009

Getting into the Habit of Collecting, Analyzing, and Acting Upon Data


Problem-solving cycle, with DATA & DOCUMENTATION at its center:

• Problem Identification
• Problem Analysis
• Plan Selection
• Plan Implementation
• Plan Evaluation

Response to I________

• Intervention?
• Instruction?
• Implementation of evidence-based practices


Reasons for Studying and Monitoring Implementation

• Effort evaluation
• Quality improvement
• Documentation
• Internal validity
• Program theory
• Process evaluation
• Diffusion
• Evaluation quality


Greenberg, M. T., Domitrovich, C. E., Graczyk, P. A., & Zins, J. E. (2005).

What tools can we use to measure implementation of school-wide systems?


Tier 1 Implementation Tools

READING:
• Planning and Evaluation Tool
• Effective Reading Supports Team Implementation Checklist
• Observational protocol: Principal’s Reading Walkthrough

BEHAVIOR:
• Effective Behavior Supports Team Implementation Checklist
• Effective Behavior Supports Self Assessment Survey
• Observational protocol: School-wide Evaluation Tool
• Documents: Benchmarks of Quality, School Climate Survey


Tier 2 & 3 Implementation Tools

READING:
• Intervention Validity Checklists
• IEP Implementation Validity Checks

BEHAVIOR:
• Checklist for Individual Student Systems
• IEP Implementation Validity Checks


MiBLSi Mission Statement

“to develop support systems and sustained implementation of a data-driven, problem solving model in schools to help students become better readers with social skills necessary for success”


Our Data

• MiBLSi’s existing data
• Elementary schools (any combination of K-6)

COHORT   START DATE      SCHOOLS*   YEARS OF DATA AVAILABLE
1        January 2003    15         4.5
2        February 2005   27         3.5
3        January 2006    50         2.5
4.1      January 2007    65         1.5
4.2      March 2007      27         1.3
4.3      June 2007       11         1

* Refers to the number of elementary schools included in this study.

Purpose of the Study

• To systematically examine schools’ process of implementing school-wide positive behavior supports and a school-wide reading model during participation in a statewide RtI project.

• To systematically examine the relation between implementation fidelity of an integrated three-tier model and student outcomes.


Conceptual Framework


PLANNED INTERVENTION
• School-wide Positive Behavior Supports
• Response to Intervention for Reading

ACTUAL IMPLEMENTATION
• Submission of implementation checklists
• Scores on implementation checklists

STUDENT OUTCOMES
• Office Discipline Referrals
• Performance on curriculum-based literacy measures
• Performance on state-wide standardized test in reading

(Chen, 1998; Greenberg et al., 2005)

Measuring Implementation

• Effective Behavior Support Self Assessment Survey (EBS-SAS)
– Spring of each school year
– Total % implementation by building location

• Effective Behavior Support Team Implementation Checklist (EBS-TIC)
– 4x per school year (quarterly)
– Total % implementation

• Planning and Evaluation Tool for Effective Reading Supports-Revised (PET-R)
– Fall of each school year
– Total/overall % implementation
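Each of these tools reports a total percent implementation, i.e. points earned over points possible. A minimal sketch of that calculation, assuming a hypothetical rubric in which each checklist item is scored 0 (not started), 1 (in progress), or 2 (achieved); the actual rubrics vary by instrument:

```python
def percent_implementation(item_scores, max_per_item=2):
    """Total % implementation = points earned / points possible * 100.
    Assumes each item is scored 0..max_per_item (hypothetical rubric)."""
    possible = max_per_item * len(item_scores)
    return 100.0 * sum(item_scores) / possible

# 11 items earning 17 of 22 possible points -> about 77.3% implementation
scores = [2, 2, 1, 2, 0, 2, 1, 2, 2, 1, 2]
pct = percent_implementation(scores)
```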


THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


Systems Implementation Research

• Expect 3-5 years for full implementation (Fixsen, Naoom, Blase, Friedman & Wallace, 2004; OSEP Center on Positive Behavioral Interventions and Supports, 2004; Sprague et al., 2001)

• Studies often split up implementation and outcomes (Reading First--U.S. Department of Education, 2006)

• View implementation at one point in time (McCurdy, Mannella, & Eldridge, 2003; McIntosh, Chard, Boland, & Horner, 2006; Mass-Galloway, Panyan, Smith, & Wessendorf, 2008)

• A need for systematic research


THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


Process and Progress

• Just as we measure student progress, we should also measure our progress toward implementation efforts.

• What is our current level of implementation?
• What is our goal?
• How do we get from here to there?


How do scores vary by year of implementation?


THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


How long does it take?

2-5 years


At each year of implementation, what % of schools attain criterion levels of implementation?


PET-R: COHORT 3 (N=50)

[Chart: time to first attaining criterion, binned from 0-5 mo. through 4:6-4:11 (years:months). 24 schools (48%) and 1 school (2%) attained criterion scores; 25 schools (50%) did not attain criterion scores.]


EBS-SAS: COHORT 3 (N=50)

[Chart: time to first attaining criterion, binned from 0-5 mo. through 5:0-5:5 (years:months). 13 schools (26%), 2 schools (4%), and 14 schools (28%) attained criterion scores; 21 schools (42%) did not attain criterion scores.]


EBS-TIC: COHORT 3 (N=50)

[Chart: time to first attaining criterion, binned from 0-5 mo. through 5:0-5:5 (years:months). 6 schools (12%), 1 school (2%), and 30 schools (60%) attained criterion scores; 13 schools (26%) did not attain criterion scores.]

THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


Sustainability

• Think and work:
– Up
– Down
– Out


What percent of schools that attain criterion levels of implementation are able to maintain or improve their score in all subsequent years?


PET-R: COHORT 3 (N=50)

[Chart: schools attaining criterion on the PET-R that maintained or improved their score in all subsequent years. Legible values: bins 6-11 mo. and 1:6-1:11, 1 school (2%), count 1.]


EBS-SAS: COHORT 3 (N=50)

[Chart: schools attaining criterion, binned from 0-5 mo. through 5:0-5:5: 13 (26%), 2 (4%), and 14 (28%); of these, counts maintaining or improving their scores in all subsequent years: 2, 12, and 2.]


EBS-TIC: COHORT 3 (N=50)

[Chart: schools attaining criterion, binned from 0-5 mo. through 5:0-5:5: 6 (12%), 1 (2%), and 30 (60%); counts maintaining or improving their scores in all subsequent years partially legible: 15, 0, 1.]

Another way of looking at implementation. . .


What % of implementation data do schools submit for each year of implementation?


% of Schools Submitting PET-R Data Each Year

YEAR    1      2      3      4      5      6
C1      --     --     93%    80%    73%    60%
C2      --     78%    89%    78%    --     --
C3      --     90%    94%    --     --     --
C4.1    --     97%    --     --     --     --
C4.2    --     96%    --     --     --     --
C4.3    91%    --     --     --     --     --


% of Schools Submitting EBS-SAS Data Each Year

YEAR    1      2      3      4      5      6
C1      --     --     60%    60%    47%    53%
C2      70%    --     74%    63%    67%    --
C3      84%    --     70%    78%    --     --
C4.1    95%    --     86%    --     --     --
C4.2    89%    --     81%    --     --     --
C4.3    --     82%    --     --     --     --


% of Schools Submitting EBS-TIC Data Each Year


YEAR    1      2      3      4      5      6
C1      --     --     47%    53%    73%    53%
C2      74%    --     78%    70%    56%    --
C3      60%    --     80%    58%    --     --
C4.1    77%    --     80%    --     --     --
C4.2    56%    --     48%    --     --     --
C4.3    --     45%    --     --     --     --

THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


Is the % of behavior checklist data submitted each year related to student behavior outcomes for that year?


Is the % of reading checklist data submitted each year related to student reading outcomes for that year?


Are scores on the behavior implementation checklists related to student behavior outcomes for that year?


Are scores on the reading implementation checklist for each year of implementation related to student reading outcomes for that year?


THE PROCESS / HOW LONG / SUSTAINABILITY / ASSOCIATED STUDENT OUTCOMES
BEHAVIOR + READING


What is the impact on student outcomes when schools meet criteria on none, some, or all of the implementation checklists?


Limitations

• Self-report implementation measures
• Limited number of schools in earlier cohorts
• We don’t know what specific factors have impacted implementation


Remember. . .

• More data is not necessarily better.

• Data should have a purpose:
– It should help us make well-informed decisions that will improve outcomes for students.
