MO SPDG Data Report


Transcript of MO SPDG Data Report

Page 1: MO SPDG Data Report


The contents of this presentation were developed under a grant from the US Department of Education to the Missouri Department of Elementary and Secondary Education (#H323A120018). However, these contents do not necessarily represent the policy of the US Department of Education, and you should not assume endorsement by the Federal Government.


School Implementation Scale
Team Functioning Survey
OSEP Annual Performance Report
Collaboration Survey
Quarterly Reports
Evidence-Based PD Components Rubric

MO SPDG Data Report


Page 3: MO SPDG Data Report


Evaluation Overview

Evaluation data collected for many purposes:
Reporting to OSEP
Continuous improvement for schools
Continuous improvement for RPDC providers
Reporting to DESE
Monitoring implementation compliance

Page 4: MO SPDG Data Report


School Implementation Scale

42-item online survey
Assesses implementation in five domains:
Leadership & Empowering Culture
Evidence-Based Instruction, Assessment, & Curriculum
Ongoing Professional Development
Collaborative Data Teaming Processes

Page 5: MO SPDG Data Report


School Implementation Scale

31 items on a five-point Likert scale (1 = Not at all true of me now, 3 = Somewhat true of me now, 5 = Very true of me now)
Six Yes/No items relating to Collaborative Data Teams

Page 6: MO SPDG Data Report


School Implementation Scale

3,126 respondents in 206 schools

Comparisons with 2013 data based on 2,487 responses from that year

Respondents' roles: General Education Teacher, 2,260; Other Certified Staff, 333; Noncertified Staff, 59; Special Education Teacher, 375; Administrator, 102

Page 7: MO SPDG Data Report

School Implementation Scale
Average percent of respondents who answered “Agree” or “Strongly Agree” on items in each domain:

Leadership & Empowering Culture: 71.9% (2013), 71.7% (2014)
Evidence-Based Instruction, Assessment & Curriculum: 85.5% (2013), 86.8% (2014)
Ongoing Professional Development: 77.2% (2013), 79.5% (2014)
Collaborative Data Teaming Processes: 69.8% (2013), 79.4% (2014)

Page 8: MO SPDG Data Report

School Implementation Scale

Leadership & Empowering Culture (2013 and 2014 results)

1. I feel that my administrators are committed to implementing evidence-based instructional practices. (2013: 90.0%, 2014: 89.5%)
9. I have a clear understanding of the State Standards for my grade/subject. (2013: 87.1%, 2014: 87.9%)
17. I have the time necessary to analyze student data and problem-solve with my colleagues. (2013: 43.1%, 2014: 43.1%)
19. I receive school-wide academic and behavioral data in usable and understandable formats. (2013: 65.3%, 2014: 67.9%)
21. I can summarize my school's shared vision/mission. (2013: 77.6%, 2014: 79.3%)
22. I think my school has an effective process in place to identify available resources (e.g., materials, technology, people). (2013: 73.1%, 2014: 72.2%)
23. I am involved in action planning school-wide improvements with the other staff and administrators. (2013: 67.8%, 2014: 68.0%)
28. I have the technology and resources that I need to provide effective instruction. (2013: 66.3%, 2014: 63.5%)
31. I think that the current school initiatives are improving education for students in my school. (2013: 76.9%, 2014: 73.9%)
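The domain averages on the previous slide are simple means of these item-level percentages; for example, the nine Leadership & Empowering Culture items above average to 71.9% for 2013 and 71.7% for 2014. A minimal sketch of that roll-up in Python, assuming the item-level percent-agreement values have already been computed (the variable and function names are illustrative, not part of the MO SPDG tooling):

    # Illustrative roll-up of item-level "Agree"/"Strongly Agree" percentages
    # into a domain average, as reported on the domain-summary slide.
    def domain_average(item_percents):
        """Mean of the item-level percent-agreement values for one domain."""
        return sum(item_percents) / len(item_percents)

    # 2013 Leadership & Empowering Culture item percentages (the nine items above)
    leadership_2013 = [90.0, 87.1, 43.1, 65.3, 77.6, 73.1, 67.8, 66.3, 76.9]
    print(round(domain_average(leadership_2013), 1))  # 71.9, matching the domain bar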

Page 9: MO SPDG Data Report

School Implementation Scale

STRONG: Administrator commitment (Item 1)
WEAK: Time needed to analyze data and problem-solve (Item 17)
ACTION ITEM: Work with administrators to increase meeting time allocated to data teams.

STRONG: Administrator commitment (Item 1)
WEAK: Technology and resources needed for effective instruction (Item 28)
ACTION ITEM: Work with administrators to obtain new technology and make resources available to instructional staff.
ACTION ITEM: Provide training to help instructional staff identify and effectively utilize existing technology/resources.

Page 10: MO SPDG Data Report

School Implementation Scale

Evidence-Based Instruction, Assessment, & Curriculum (2013 and 2014 results)

2. I am able to differentiate instruction according to student needs while addressing the State Standards. (2013: 87.7%, 2014: 85.5%)
5. I adapt the environment, curriculum, and instruction based on my students' behavioral data. (2013: 89.5%, 2014: 89.5%)
6. I adapt the environment, curriculum, and instruction based on my students' academic data. (2013: 93.7%, 2014: 93.1%)
7. I modify my instructional practices based on students' common formative assessment data. (2013: 86.3%, 2014: 88.5%)
8. Based on assessment results, I re-teach information that students have not mastered. (2013: 90.4%, 2014: 91.9%)
10. My instruction intentionally addresses the State Standards for my grade/subject. (2013: 90.0%, 2014: 93.7%)
12. I monitor each of my students' progress toward meeting the State Standards for my grade/subject. (2013: 83.9%, 2014: 85.9%)
14. I review formative assessment data for every student that I support. (2013: 82.5%, 2014: 86.4%)
15. When I'm concerned about a student's academic progress, I collaborate with colleagues to identify interventions. (2013: 91.6%, 2014: 91.4%)
16. When I'm concerned about a student's behavioral progress, I collaborate with colleagues to identify interventions. (2013: 92.5%, 2014: 92.3%)
18. I am involved in meetings where data results are discussed. (2013: 77.4%, 2014: 80.1%)
20. I evaluate the effectiveness of my instruction based on common formative assessment data. (2013: 78.9%, 2014: 83.1%)
30. I review universal screening data at least three times a year for every student that I support. (2013: 66.4%, 2014: 66.4%)

Page 11: MO SPDG Data Report

School Implementation Scale

STRONG: Adapting environment, curriculum, and instruction based on student data (Item 6)
WEAK: Evaluating the effectiveness of instruction based on common formative assessment data (Item 20)
ACTION ITEM: Ensure that common formative assessment data is being collected at least three times per year.
ACTION ITEM: Work with administration to ensure that each teacher is administering common formative assessments.
ACTION ITEM: Teach instructional staff how to use common formative assessment data to adapt instruction more effectively.

Page 12: MO SPDG Data Report

School Implementation Scale

Ongoing Professional Development (2013 and 2014 results)

3. I participate in professional development where I learn to improve my instructional practices. (2013: 89.8%, 2014: 91.2%)
4. I receive coaching/mentoring to implement evidence-based instructional practices. (2013: 68.5%, 2014: 70.2%)
11. I participate in professional development where I learn how to develop curricular plans that address the State Standards. (2013: 76.1%, 2014: 79.0%)
13. I participate in professional development where I learn how to monitor students' progress. (2013: 74.2%, 2014: 77.7%)

Page 13: MO SPDG Data Report

School Implementation Scale

STRONG: Participation in professional development to improve instructional practices (Item 3)
WEAK: Receive coaching/mentoring to improve instructional practices (Item 4)
ACTION ITEM: Use post-test results to identify areas for follow-up support.
ACTION ITEM: Provide in-district coaching to instructional staff to improve instructional practices.

Page 14: MO SPDG Data Report

School Implementation Scale

Collaborative Data Teaming Processes: percent of respondents answering "Yes" (2013 and 2014 results)

32. My building has Collaborative Data Teams (CDT) that meet regularly (at least one time per month). (2013: 81.4%, 2014: 89.8%)
33. The CDT structure in my building includes representatives from all teaching roles (i.e., regular education, special education, special classes [music, art, PE], etc.). (2013: 64.6%, 2014: 69.9%)
34. The CDTs in my building have been trained to collect and analyze data to inform instruction. (2013: 71.8%, 2014: 84.2%)
35. I participate regularly on one or more CDTs in my building. (2013: 64.0%, 2014: 74.2%)
36. My school has identified at least three effective teaching practices to implement in classroom instruction. (2013: 75.4%, 2014: 82.0%)
37. All teachers have been trained to implement the identified effective teaching practices. (2013: 62.7%, 2014: 73.1%)
38. The CDTs in my building develop and administer Common Formative Assessments (CFAs) and use the results to inform instruction. (2013: 69.0%, 2014: 82.7%)

Page 15: MO SPDG Data Report

School Implementation Scale

STRONG: Collaborative Data Teams are established and meet at least once per month (Item 32)
WEAK: CDTs include representatives from all teaching roles (Item 33)
ACTION ITEM: Invite additional staff to participate on the Collaborative Data Team, making sure to include general education teachers, special education teachers, and teachers of special classes.

STRONG: Three effective practices are identified (Item 36)
WEAK: All teachers have been trained to implement the identified effective practices (Item 37)
ACTION ITEM: Identify which staff members have not been trained, then provide additional training to these individuals.

Page 16: MO SPDG Data Report

School Implementation Scale

Quickly glance through the data. What are your first impressions?
Does the number/role of survey participants adequately represent our schools?
Celebrate successes: Which items or essential elements show high levels of implementation? What processes, professional development, etc. are in place that support these high levels of implementation?
How do the results from the School Implementation Scale align with other school-level data? Is additional data needed?
Prioritize needs: Which essential elements show low levels of implementation? Which survey items highlight areas that could be improved over the next year?
Next steps: How do the results influence our action planning for next year?

Blog post: List three action items you might suggest to schools in your region.

Page 17: MO SPDG Data Report


Team Functioning Survey

17-item online survey
Assesses team functioning in three domains: Structure, Communication, Focus

Page 18: MO SPDG Data Report


Team Functioning Survey

17 items on a Likert scale

Page 19: MO SPDG Data Report


Team Functioning Survey

2,927 respondents in 203 schools

Comparisons with 2013 data based on 2,472 responses from that year

Respondents' roles: General Education Teacher, 2,112; Other Certified Staff, 312; Noncertified Staff, 49; Special Education Teacher, 350; Administrator, 104

Page 20: MO SPDG Data Report

Team Functioning Survey
Average percent of respondents who answered “Agree” or “Strongly Agree” on items in each domain:

Structure: 72.5% (2013), 75.8% (2014)
Communication: 75.7% (2013), 79.8% (2014)
Focus: 73.5% (2013), 78.8% (2014)

Page 21: MO SPDG Data Report

Team Functioning Survey

Structure (2013 and 2014 results)

Multiple meeting roles assigned prior to the meeting (e.g., facilitator, note-taker). (2013: 59.4%, 2014: 65.2%)
Meeting starts and ends on time as scheduled. (2013: 79.7%, 2014: 81.1%)
Nearly all team members attend regularly. (2013: 86.4%, 2014: 87.6%)
Agenda developed and available prior to meetings. (2013: 74.9%, 2014: 77.4%)
Minutes/notes taken during meeting and distributed to all team members after the meeting. (2013: 62.2%, 2014: 67.7%)

Page 22: MO SPDG Data Report

Team Functioning Survey

STRONG: Nearly all members attend regularly
WEAK: Multiple roles assigned prior to meeting
ACTION ITEM: Use the CDTs learning package to provide training on how to assign meeting roles.
ACTION ITEM: Work with school teams to create a system of rotating meeting roles (e.g., facilitator, note-taker).

STRONG: Agenda created and available
WEAK: Minutes/notes taken and distributed
ACTION ITEM: Use the CDTs learning package to provide training on how to take and distribute notes.
ACTION ITEM: Follow up with school teams to be sure notes are being distributed after meetings.

Page 23: MO SPDG Data Report

Team Functioning Survey

Communication (2013 and 2014 results)

High level of engagement from all team members (e.g., verbal input, attention, willingness to complete tasks). (2013: 75.2%, 2014: 79.8%)
Discussions stay on track; no sidebar conversations. (2013: 69.7%, 2014: 71.7%)
Team members communicate effectively (e.g., speak directly, ask questions, express support, restate ideas). (2013: 79.7%, 2014: 84.6%)
Disagreements/conflicts are addressed (e.g., problem solving, respect, listening). (2013: 74.4%, 2014: 79.4%)
Members value each other's roles and contributions. (2013: 79.1%, 2014: 82.6%)
All viewpoints shared and given adequate time prior to decision-making (e.g., discussion of options and consequences). (2013: 75.8%, 2014: 80.6%)
Shared decision-making with balanced influence of team members (e.g., voting on decisions, discussion of options). (2013: 75.8%, 2014: 79.7%)

Page 24: MO SPDG Data Report

Team Functioning Survey

STRONG: Team members communicate effectively
WEAK: Discussions stay on track; no sidebar conversations
ACTION ITEM: Use the CDTs learning package to provide training on the roles of Facilitator and Norms Minder.
ACTION ITEM: Keep track of how time is spent during a team meeting, including time on task and time spent in sidebar conversations. Review the results with team members and brainstorm strategies to cut down on sidebar conversations.

Page 25: MO SPDG Data Report

Team Functioning Survey

Focus (2013 and 2014 results)

Meeting has clear purpose, which is communicated in advance. (2013: 78.2%, 2014: 81.5%)
Data drives decision-making (i.e., relevant data is reviewed and discussed, decisions clearly influenced by data). (2013: 77.9%, 2014: 83.0%)
Status of action items from last meeting is reviewed. (2013: 62.7%, 2014: 69.3%)
Clear action items (e.g., deadlines, person responsible). (2013: 73.7%, 2014: 80.5%)
Meetings are productive; continual progress focused on purpose. (2013: 75.0%, 2014: 79.7%)

Page 26: MO SPDG Data Report

Team Functioning Survey

STRONG: Data drives decision-making
WEAK: Status of action items from last meeting is reviewed
ACTION ITEM: Use the CDTs learning package to provide training on creating agendas.
ACTION ITEM: Be sure each agenda includes time to review the action items from the previous meeting.
ACTION ITEM: Use data on the progress made on action items from the previous meeting to determine whether further action is required.

Page 27: MO SPDG Data Report

Team Functioning Survey

What is your team most proud of regarding team functioning?
Name one area in which your team needs to improve.
Why is it important to improve this aspect of team functioning?
What will your team do to improve this aspect of team functioning?
Finish this phrase: It is beneficial to look at team functioning because…

Blog post: List three steps you can take to help building-level teams improve their team functioning.

Page 28: MO SPDG Data Report


ACTIVITY

Page 29: MO SPDG Data Report


SPDG Annual Performance Report
Submitted annually to OSEP
Three main program measures:

Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

Program measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

Program measure 3: Projects use SPDG professional development funds to provide follow-up activities designed to sustain the use of SPDG-supported practices.

Page 30: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- High-Quality Professional Development Checklist

High-Quality Professional Development Checklist results, by subject. For each domain (Preparation, Introduction, Demonstration, Engagement, Evaluation, Mastery), the table shows the percent of trainings missing one or fewer indicator; the final columns give the percent of trainings found to be high quality (HQ) and the number of submitted checklists.

Subject | Avg % indicators met | Preparation | Introduction | Demonstration | Engagement | Evaluation | Mastery | % HQ | Checklists
Assessment Capable Learners | 95.8% | 100.0% | 100.0% | 100.0% | 100.0% | 91.7% | 91.7% | 91.7% | 12
Collaborative Work Overview | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 1
Common Formative Assessment | 96.4% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 15
Data-Based Decision Making | 96.6% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 4
Feedback | 94.7% | 100.0% | 100.0% | 100.0% | 100.0% | 83.3% | 91.7% | 83.3% | 12
Reciprocal Teaching | 97.5% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 9
Spaced v. Massed | 98.5% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 100.0% | 3
Overall | 97.1% | 100.0% | 100.0% | 100.0% | 100.0% | 96.4% | 97.6% | 96.4% | 56
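A plausible reading of these columns (an assumption, not stated on the slide) is that a training counts toward a domain column when its checklist misses no more than one indicator in that domain, and it is counted as HQ when that holds in every domain; the reported numbers are consistent with that rule. A minimal Python sketch of such a tally, with an invented data layout used only for illustration:

    # Hypothetical tally for an HQPD checklist summary. The <=1 missed-indicator
    # threshold and the per-checklist data layout are assumptions used only to
    # illustrate how percentages like these could be computed.
    DOMAINS = ["Preparation", "Introduction", "Demonstration",
               "Engagement", "Evaluation", "Mastery"]

    def summarize(checklists):
        """checklists: list of dicts mapping domain -> count of missed indicators."""
        n = len(checklists)
        per_domain = {d: 100 * sum(1 for c in checklists if c[d] <= 1) / n
                      for d in DOMAINS}
        hq = 100 * sum(1 for c in checklists if all(c[d] <= 1 for d in DOMAINS)) / n
        return per_domain, hq

    # 11 clean checklists plus one missing two Evaluation and two Mastery
    # indicators, roughly mirroring the 91.7% cells above (11/12 = 91.7%).
    example = [dict.fromkeys(DOMAINS, 0) for _ in range(11)]
    example.append({**dict.fromkeys(DOMAINS, 0), "Evaluation": 2, "Mastery": 2})
    per_domain, hq = summarize(example)
    print(round(per_domain["Evaluation"], 1), round(hq, 1))  # 91.7 91.7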

Page 31: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Pre/post-tests

Overall averages by learning package (Pre Avg, Post Avg, Difference):

Assessment-Capable Learners: Pre 62.1%, Post 80.1%, Difference 18.0%
Collaborative Data Teams: Pre 46.2%, Post 40.0%, Difference -6.2%
Common Formative Assessment: Pre 51.4%, Post 76.7%, Difference 25.3%
Data-Based Decision Making: Pre 46.7%, Post 75.2%, Difference 28.6%
Feedback: Pre 60.0%, Post 76.5%, Difference 16.5%
Reciprocal Teaching: Pre 69.9%, Post 90.0%, Difference 20.1%
Spaced vs. Massed Practice: Pre 71.0%, Post 91.8%, Difference 20.8%
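The Difference column is simply the post-test average minus the pre-test average for each learning package; for example, 80.1% less 62.1% gives 18.0% for Assessment-Capable Learners, and the negative value for Collaborative Data Teams reflects a post average below the pre average. A minimal Python sketch of that calculation, using the charted averages (small rounding differences are possible because the inputs are already rounded):

    # Pre/post averages by learning package, copied from the chart above.
    averages = {
        "Assessment-Capable Learners": (62.1, 80.1),
        "Collaborative Data Teams": (46.2, 40.0),
        "Common Formative Assessment": (51.4, 76.7),
        "Data-Based Decision Making": (46.7, 75.2),
        "Feedback": (60.0, 76.5),
        "Reciprocal Teaching": (69.9, 90.0),
        "Spaced vs. Massed Practice": (71.0, 91.8),
    }

    for package, (pre, post) in averages.items():
        print(f"{package}: {post - pre:+.1f} percentage points")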

Page 32: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Pre/post-tests

2013-14 Shared Learning pre/post results:

June 2013: Pre 78.1%, Post 93.3%, Difference 15.2%
Oct 2013: Pre 64.4%, Post 83.5%, Difference 19.1%
Jan 2014: Pre 60.0%, Post 79.2%, Difference 19.3%

Page 33: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Satisfaction surveys

Satisfaction Survey quantitative responses:

1. The presenter(s) was/were knowledgeable about this subject. 94.1%
2. The workshop materials were clear and well-organized. 84.4%
3. The instructional/presentation skills were effective and appropriate. 85.2%
4. The ideas, skills, and strategies will be useful in improving student learning. 88.1%
5. The information and/or strategies presented will impact my teaching and/or leadership role. 88.1%
6. I will recommend this presenter/workshop to others. 71.1%
7. The overall program was worthwhile. 80.0%

Page 34: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Collaboration Survey

Average level of RPDC consultants' collaboration with external entities (0 = No Contact, 1 = Networking, 2 = Cooperation, 3 = Coordination, 4 = Coalition, 5 = Collaboration):

Department of Elementary and Secondary Education: Pre-CW 2.01, Present Day 2.29, Difference 0.28
Other Regional Professional Development Centers: Pre-CW 2.24, Present Day 2.36, Difference 0.12
CW Schools: Pre-CW 2.70, Present Day 3.64, Difference 0.94
State Implementation Specialist(s): Pre-CW 1.73, Present Day 3.61, Difference 1.87

Page 35: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Collaboration Survey

Percent of respondents reporting each direction of change in their levels of collaboration with entities:

Department of Elementary and Secondary Education: 12.9% negative, 54.1% neutral, 32.9% positive
Other Regional Professional Development Centers: 8.3% negative, 65.5% neutral, 26.2% positive
CW Schools: 9.4% negative, 40.0% neutral, 50.6% positive
State Implementation Specialist(s): 2.6% negative, 34.2% neutral, 63.2% positive
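These direction-of-change percentages come from comparing each consultant's pre-CW rating with their present-day rating on the 0-5 scale and counting decreases, no-change responses, and increases. A minimal Python sketch of that classification; the paired ratings shown are invented for illustration, not actual survey records:

    # Classify each respondent's change in collaboration level with one entity,
    # using the slide's 0-5 scale (0 = No Contact ... 5 = Collaboration).
    def change_directions(pairs):
        """pairs: list of (pre_cw, present_day) ratings for one entity."""
        n = len(pairs)
        negative = sum(1 for pre, now in pairs if now < pre)
        neutral = sum(1 for pre, now in pairs if now == pre)
        positive = sum(1 for pre, now in pairs if now > pre)
        return tuple(round(100 * count / n, 1) for count in (negative, neutral, positive))

    # Hypothetical ratings for one entity.
    ratings = [(2, 3), (1, 1), (4, 3), (2, 4), (3, 3)]
    print(change_directions(ratings))  # (20.0, 40.0, 40.0)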

Page 36: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 1: Projects use evidence-based professional development practices to support the attainment of identified competencies.

- Collaboration Survey

Percent of respondents reporting an "Emerging" or "Proficient" level of confidence with each learning package:

Collaborative Data Teams: 94.7%
Common Formative Assessment: 87.4%
Data-Based Decision Making: 89.1%
Effective Teaching/Learning Practices: 88.0%
Assessment-Capable Learners: 83.3%
Feedback: 92.7%
Reciprocal Teaching: 74.5%
Spaced vs. Massed Practice: 75.7%

Page 37: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

- School Implementation Scale
- Team Functioning Survey
- Student data
  - Attendance
  - State assessment proficiency rates
  - Inclusion data
  - Discipline data
- Response rate data

Page 38: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

- Student data

2.g. Performance measure (PROJ): The attendance rate for students with IEPs in buildings participating in SPDG professional development will increase.
Target: 89/100. Actual performance data: 93.93/100 (93.93%).

2.i. Performance measure (PROJ): The percentage of students with IEPs in Collaborative Work buildings who meet or exceed proficiency on state assessments in Communication Arts will increase.
Target: 58/100. Actual performance data: 24.76/100 (24.76%).

2.m. Performance measure (PROJ): The percentage of students with IEPs within Collaborative Work buildings who were in the regular education classroom greater than 79% of the school day will increase.
Target: 60/100. Actual performance data: 64.7/100 (64.7%).

Page 39: MO SPDG Data Report

SPDG Annual Performance Report
Program measure 2: Participants in SPDG professional development demonstrate improvement in implementation of SPDG-supported practices over time.

- Response rate data

2.n. Performance measure (PROJ): All (100%) Collaborative Work buildings are represented by respondents on the School Implementation Scale.
Target: 100/100. Actual performance data: 221/360 (61.4%).

2.r. Performance measure (PROJ): All (100%) Collaborative Work buildings are represented by respondents on the Team Functioning Survey.
Target: 100/100. Actual performance data: 216/360 (60.0%).
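Each response-rate figure is the number of Collaborative Work buildings represented by at least one respondent divided by the total number of CW buildings: 221/360 = 61.4% for the School Implementation Scale and 216/360 = 60.0% for the Team Functioning Survey. A minimal worked check in Python (the function name is illustrative):

    # Building-level response rate: buildings represented / total CW buildings.
    def response_rate(represented, total):
        return round(100 * represented / total, 1)

    print(response_rate(221, 360))  # 61.4 (School Implementation Scale)
    print(response_rate(216, 360))  # 60.0 (Team Functioning Survey)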

Page 40: MO SPDG Data Report


SPDG Evidence-Based Professional Development Components Rubric

Management team rates the project’s performance annually

Results are reported to OSEP in the APR
Domains: Selection, Training, Coaching, Performance Assessment, Facilitative Administrative Support/Systems Intervention

Page 41: MO SPDG Data Report

SPDG Evidence-Based Professional Development Components Rubric

Prof Dev Domain: B(3) Training. Designed with relevance and application practice incorporated.
Specification: Describes how training is skill-based; participant behavior rehearsals to criterion with an expert observing.
Rating levels: Exemplar Description (=4), Good Description (=3), Barely Adequate Description (=2), Inadequate Description (=1).

Exemplar Description (=4):

1. 90% of training meets the criteria for behavior rehearsals and reflection as observed by an outside evaluator or as reported by participants in training evaluation.

2. 90% of the training provides opportunities to plan for initial and continued implementation as observed by an outside evaluator or as reported by participants in training evaluation.

3. 80% of participants track and report the use of new skills as monitored through fidelity measures built into the training packages.

4. Of the 80% of participants tracking and reporting the use of new skills, 90% receive coaching and feedback.

5. 100% of participating buildings submit a Common formative assessment developed by the collaborative data teams monthly to the SPDG data portal.

6. 100% of the submitted Common Formative Assessments are vetted by the RPDC consultants and posted to the public access area of the SPDG data portal.

Good Description (=3):

1. 80% of the training meets the criteria for behavior rehearsals and reflection as observed by an outside evaluator or as reported by participants in training evaluation.

2. 80% of the training provides opportunities to plan for initial and continued implementation as observed by an outside evaluator or as reported by participants in training evaluation.

3. 70% of participants track and report the use of new skills as monitored through fidelity measures built into the training packages.

4. Of the 70% of participants tracking and reporting the use of new skills, 80% receive coaching and feedback.

5. 90% of participating buildings submit a Common formative assessment developed by the collaborative data teams monthly to the SPDG data portal.

6. 90% of the submitted Common Formative Assessments are vetted by the RPDC consultants and posted to the public access area of the SPDG data portal.

Barely Adequate Description (=2):

1. 60% of training meets the criteria for behavior rehearsals and reflection as observed by an outside evaluator or as reported by participants in training evaluation.

2. 60% of the training provides opportunities to plan for initial and continued implementation as observed by an outside evaluator or as reported by participants in training evaluation.

3. 50% of participants track and report the use of new skills as monitored through fidelity measures built into the training packages.

4. Of the 50% of participants tracking and reporting the use of new skills, 70% receive coaching and feedback.

5. 80% of participating buildings submit a Common formative assessment developed by the collaborative data teams monthly to the SPDG data portal.

6. 80% of the submitted Common Formative Assessments are vetted by the RPDC consultants and posted to the public access area of the SPDG data portal.

Inadequate Description (=1):

1. Less than 60% of training meets the criteria for behavior rehearsals and reflection as observed by an outside evaluator or as reported by participants in training evaluation.

2. Less than 60% of the training provides opportunities to plan for initial and continued implementation as observed by an outside evaluator or as reported by participants in training evaluation.

3. Less than 50% of participants track and report the use of new skills as monitored through fidelity measures built into the training packages.

4. Of the participants tracking and reporting the use of new skills, less than 70% receive coaching and feedback.

5. Less than 80% of participating buildings submit a Common formative assessment developed by the collaborative data teams monthly to the SPDG data portal.

6. Less than 80% of the submitted Common Formative Assessments are vetted by the RPDC consultants and posted to the public access area of the SPDG data portal.

Page 42: MO SPDG Data Report


ACTIVITY

Page 43: MO SPDG Data Report


Review Collaboration Survey results

What results did you expect to see? Does the data match your expectations? How does it differ?

Which entities does your RPDC do a good job of collaborating with?

Which entities could your RPDC do a better job of collaborating with?

What are some of the barriers you encounter when collaborating with these entities? How could those barriers be overcome?

Blog post: List two entities you would like to improve collaboration with and explain how you plan to accomplish this.

Page 44: MO SPDG Data Report


Review Quarterly Reports (pre/post-test results & satisfaction survey results)

What results did you expect to see? Does the data match your expectations? How does it differ?

What do the pre/post-test results tell you about what content areas need to be re-taught?

What do the pre/post-test results tell you about how consistently the pre- and post-tests are administered? How could you improve distribution of these tests?

What do the satisfaction survey results tell you about the efficacy of content delivery? How could content delivery be improved?

Blog post: Describe your plan for improving administration of pre/post-tests.

Page 45: MO SPDG Data Report


Pattie [email protected]

Jennifer [email protected]

Research Collaboration

(785) 864-0517

Page 46: MO SPDG Data Report


Blog about your data findings

Blog post: Based on your results from the School Implementation Scale, list three action items you might suggest to schools in your region.

Blog post: Based on your results from the Team Functioning Survey, list three steps you can take to help building-level teams improve their team functioning.

Blog post: Based on your results from the Collaboration Survey, list two entities you would like to improve collaboration with and explain how you plan to accomplish this.

Blog post: Based on your results from the quarterly reports, describe your plan for improving administration of pre/post-tests.