Transcript of April 19, 2012 SBE Presentation on Performance Evaluations.

Page 1

April 19, 2012

SBE Presentation on Performance Evaluations

Page 2

Context

There are three major routines that comprise DDOE’s management of districts’ Race to the Top/Success plans:

Routine: Progress Reviews
Purpose: Assess district progress on plan activities and identify opportunities to improve
DDOE staff involved: Delivery Unit (DU) Chief Performance Officer, DU Deputy Officer, District Liaison
District staff involved: Chief, RTTT manager, others as desired by the Chief
Location: On-site at districts
Frequency: 1-3 times a year, depending on grant size and performance

Routine: Performance Evaluations
Purpose: Assess district performance on plan measures and identify opportunities to improve
DDOE staff involved: Secretary of Education, Deputy Secretary, Chief Performance Officer, District Liaison
District staff involved: Chief, Board Rep., Teacher Rep., RTTT manager, others as desired by the Chief
Location: DDOE Cabinet Room
Frequency: 1-2 times a year, depending on grant size and performance

Routine: Chiefs’ Workshops
Purpose: Discuss RTTT data and initiatives in “PLCs” and identify opportunities to improve
DDOE staff involved: DDOE Leadership Team, DDOE Content Experts as needed
District staff involved: Chief, plus 1-2 additional district leaders (as determined by Chief)
Location: DDOE Collette Center
Frequency: Monthly during the school year

Page 3

Performance Evaluations

Agenda

▪ The mid-year performance evaluation conversation focused on:

▪ Initial thoughts for what is driving the district’s strengths and challenges

▪ How the district will dig deeper to really understand what is going on, and

▪ How the district will then replicate its strengths and address its challenges

Purpose of evaluations

▪ The purpose of performance evaluations is to assess the impact of district plans and district performance overall, and to identify opportunities to improve performance before final funding decisions are made in June

What we heard

▪ Most districts had already begun to drill down into their data at the school and grade level

▪ Most districts had clear hypotheses for the drivers of their strengths

▪ Across many districts, effective PLCs, RTTT-funded specialists, and extended learning time programs were cited as drivers of district strengths

▪ Most districts felt that they would need further analysis to understand the root causes of their challenges

▪ Across many districts, “initiative overload” was cited as a potential cause for district challenges

Page 4

Performance Dashboards – Purpose, Status and Use

Status and use of dashboards

▪ For the February 2012 performance evaluations, all data is formative and only marks progress towards LEAs’ RTTT goals (which begin in Spring of 2012)

▪ All 19 districts will have performance evaluations in June; 14 of the districts had an additional mid-year performance evaluation at the end of February (based on grant size and/or performance to date)

▪ The dashboards are draft/for internal use only – please see the Performance Evaluation Overview for more information on this classification

Purpose of dashboards

▪ The dashboards were the primary focus of LEAs’ performance evaluations

▪ Performance evaluation dashboards provide a picture of LEAs’ performance against their Race to the Top goals, key state performance measures, and LEA-specific performance measures

Page 5

Performance Dashboards – Guide to Understanding

State Example – DCAS Measures (dashboard page 1)

Measure: Reading Proficiency - Grades 3-5
Winter (SY '12): 52 (Where are we in winter 2012?)
Δ in Winter (SY '11 to SY '12): -1 (Where are we vs. last winter?)
Fall to Winter Growth (SY '12): 12 (What was our fall-to-winter growth?)
Δ in Fall to Winter Growth (SY '11 to SY '12): -2 (What was our fall-to-winter growth vs. last year?)
Goal for Spring 2012: 67 (What is our Spring 2012 goal?)
Additional Students to Meet 2012 Goal: 4,491 (How many more students are needed to meet the goal?)
Goal for Spring 2015: 81 (What is our Spring 2015 goal?)

Colors are based on district performance vs. the state (green = above the state; red = below the state)

Arrows are based on district performance this year vs. the previous year (up = performance has improved; neutral = performance has stayed within 1 percentage point; down = performance has declined)

Goals are based on reducing non-proficiency by 50% by 2015 – a similar methodology as was used in the ESEA Flexibility Application

The “additional students to meet goal” calculation is based on the number of students who took the winter test, so it may not be “exact”
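The slide describes the goal methodology only in words. As a minimal sketch of the arithmetic, assuming the baseline is the Spring 2011 proficiency rate (about 62% for this example), that annual targets are interpolated linearly between the baseline and the 2015 goal, and that roughly 29,940 students took the winter reading test, the following reproduces the figures in the example row. These assumptions are illustrative and are not taken from the presentation.

```python
# Illustrative sketch of the dashboard arithmetic described in the notes above.
# Assumptions (not stated on the slide): the baseline is Spring 2011 proficiency,
# yearly targets are interpolated linearly to 2015, and ~29,940 students took the
# winter reading test. Values are chosen so the output matches the example row.

def goal_2015(baseline_pct):
    """Reduce non-proficiency by 50% by 2015, i.e., close half the gap to 100%."""
    return baseline_pct + (100.0 - baseline_pct) / 2.0

def annual_goal(baseline_pct, year, baseline_year=2011):
    """Assumed linear interpolation of the yearly target between baseline and 2015."""
    final = goal_2015(baseline_pct)
    return baseline_pct + (final - baseline_pct) * (year - baseline_year) / (2015 - baseline_year)

def additional_students(current_pct, goal_pct, winter_test_takers):
    """Students who must become proficient to reach the goal, based on winter test takers."""
    return max(0, round((goal_pct - current_pct) / 100.0 * winter_test_takers))

def color(district_pct, state_pct):
    """Green if the district is above the state, red otherwise (ties are not specified)."""
    return "green" if district_pct > state_pct else "red"

def arrow(this_year_pct, last_year_pct):
    """Up/down for year-over-year change; neutral within 1 percentage point."""
    delta = this_year_pct - last_year_pct
    if abs(delta) <= 1.0:
        return "neutral"
    return "up" if delta > 0 else "down"

baseline = 62.0  # assumed Spring 2011 reading proficiency, grades 3-5
print(goal_2015(baseline))                      # 81.0 -> matches the Spring 2015 goal
print(round(annual_goal(baseline, 2012)))       # 67   -> matches the Spring 2012 goal
print(additional_students(52.0, 67.0, 29_940))  # 4491 -> matches "additional students"
```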

Page 6

State Data

Please see your handout for an overview of statewide trends based on the data.

Page 7

District Data

Each of the 14 districts with scheduled performance evaluations received an overview with the following components:

•Plan highlights (from the plan submitted in June 2011)

•Progress review strengths (from the progress review conducted in October/November 2011)

•Performance strengths (from the dashboard generated in February 2012)

•Opportunities to strengthen performance (from the dashboard generated in February 2012)

•Additional relevant trends/data points (from the dashboard generated in February 2012)

All district-specific overviews were shared with the Innovation Action Team.

Page 8

Next Steps

Stakeholder Communications

▪ DDOE shared the state and district dashboards with all of the stakeholder groups that comprise the Innovation Action Team

▪ DDOE provided the opportunity for each stakeholder group to schedule an individual overview of the performance evaluation process and findings

Public Communications

▪ DDOE publicly released the state dashboard, state summary, and district-specific strengths

▪ DDOE will use existing communication opportunities (e.g., the Governor’s Rotary Club meetings) to highlight the performance evaluation process and findings

▪ DDOE will publicly release end-of-year district dashboards in summer 2012

▪ If the state’s ESEA flexibility application is approved, DDOE will align and disseminate communications regarding the RTTT performance dashboards and the new accountability changes – the two methodologies are very similar, with some differences.

Further Analysis

▪ DDOE used the February and March Chiefs’ meetings to further discuss district data and initiatives

▪ DDOE is in the process of conducting further data analysis to identify district strengths, coupled with on-site visits in April to help understand the connection between district initiatives and performance data