
Evaluating School Principal Effectiveness

Why We Need to Evaluate Principals and Use Principal Evaluation as a Tool for Professional Improvement

October 4, 2011

Webinar Logistics

Everyone is muted

Use the chat function to make a comment or ask a question

You may chat privately with individuals on your team

If you have problems, you may send William Bentgen a message via the chat function or an email at williamb@ccsso.org


Welcome

Janice Poda, CCSSO

Initiative Director, Education Workforce


Moderator

Mary Canole

School Leadership Consultant, Council of Chief State School Officers


Purpose

To provide an objective, research-based overview of what an effective principal evaluation system should include.

To provide SCEE teams with the Framework for Principal Evaluation tool.

Framework for Principal Evaluation


Framework for Principal Evaluation: Key evaluation elements and considerations
Developed by Margaret Terry Orr, Bank Street College of Education, New York (morr@bnkst.edu), October 4, 2011

The tool is laid out as a table with four columns: Elements, Considerations, Current state policy, and Decisions to be made; the last two columns are left blank for teams to fill in. The elements and their considerations are:

Who is assessed: principals only, or also other school and district leaders; differentiation based on years of experience, level, and responsibilities; differentiation based on context.

The purposes of assessment: personnel management to make consequential decisions; leadership development for growth and improved practice; organizational change.

What is assessed: leadership practices; teacher effectiveness and organizational conditions; student outcomes; context.

What sources of evidence are used: judgments; observations, classroom visits, and site visits; documents and other evidence; portfolios and artifacts.

How the assessment is conducted: frequency and timing; use of surveys, interviews, or focus groups.

Presenters

Margaret Terry Orr

Bank Street College of Education

Jean Satterfield

Assistant State Superintendent for the Maryland Division of Certification and Accreditation

Sarah Brown Wessling

National Teacher of the Year 2010, English Teacher, Johnston High School, Johnston, Iowa

Research on conventional practice for principal evaluation

Wide variation in principal evaluation scope, instruments, and practices

Few psychometrically rigorous evaluation rubrics or rating systems

Movement:

away from assessing leadership traits

toward use of standards

toward a focus on the relationship between leadership practices and student achievement

Essential content elements of principal evaluation system:

Who is assessed

The purposes of assessment

What is assessed

What sources of evidence are used

Essential organizational elements of principal evaluation system:

How the assessment is conducted

How evidence is valued

Psychometric qualities

Implementation, organization, and support of evaluation

Evaluation of the system’s effectiveness

Considerations of who is assessed

How “principal” is defined

To include all school building leaders, or just principals

To include district leaders or not

To differentiate based on years of experience, time in current building assignment, and levels of responsibility

Purposes of the evaluation

Summative—for consequential decisions

Formative—for professional growth

Organizational change—cohesive system

Evaluation systems differ based on which purposes are incorporated and to what degree.

Poll

How much emphasis does your state give to each of the 3 purposes of leader evaluation?

Summative: a) No emphasis b) Minimal emphasis c) Moderate emphasis d) Great emphasis

Formative: a) No emphasis b) Minimal emphasis c) Moderate emphasis d) Great emphasis

Organizational change: a) No emphasis b) Minimal emphasis c) Moderate emphasis d) Great emphasis


Poll Results


Summative: a) No emphasis (0%) b) Minimal emphasis (5%) c) Moderate emphasis (13%) d) Great emphasis (16%)

Formative: a) No emphasis (0%) b) Minimal emphasis (5%) c) Moderate emphasis (16%) d) Great emphasis (13%)

Organizational change: a) No emphasis (0%) b) Minimal emphasis (16%) c) Moderate emphasis (13%) d) Great emphasis (5%)

What is assessed?

Leadership practices

Teacher capacity and effectiveness

Student achievement gains

Other student outcomes

Organizational capacity and effectiveness

Other school outcomes

School, community, district and state context

Leadership Development

Leadership practices

National standards

District priorities for practice (e.g. teacher evaluation practices)

Span of authority and control over whether leaders can perform the practices

Teacher and organizational capacity and effectiveness

Indirect influence on student achievement through influence on:

teacher instructional practices

distributed leadership

school culture and climate

teacher and school use of data

community engagement

working conditions

schoolwide improvement goals

Student and other outcomes

Student achievement progress

Progress on other student outcomes, such as graduation rates and reduced dropout rates

Progress on other broader school effectiveness goals, such as improved learning for ELLs and special education students

Improved safety and security

Context

Resources

Challenges

Parent and community expectations

Other district and state policies

What types of evidence are collected?

Observations

Documentation

Principal reports

Perceptions of actions and behaviors

Perceptions of working conditions, school climate

Student performance data

Whose judgments?

Principal

Subordinate staff (teachers, other professionals, support staff)

Peers (other principals)

Supervisors (central office and superintendent)

Students

Families

Community partners

Considerations in selecting types of evidence to include

Psychometric considerations

Validity of measures

Validity of combining measures

Representation of scope and depth of principal work

Reliability

Balance between direct observation of principal practice, evidence and impact

Evaluator skill

Time

When are measures made and how are they interpreted?

How often is measurement made? Initial, interim, and final, or annual only?

How are results interpreted? What is used to make judgments? Rubrics and rating forms? Are results disaggregated?

Who makes the judgments in reviewing the evidence?

How measures are valued:

Dimension               Rating   Weight   Score
Development                3      20%      0.60
Behavior                   4      20%      0.80
Intermediate outcomes      3      30%      0.90
School outcomes            2      30%      0.60
Total                                      2.90

See: Principal Score Card (Milanowski, 2009)
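To make the arithmetic in the example score card concrete, the sketch below multiplies each dimension's rating by its weight and sums the results. It only illustrates the sample values in the table above; the dimension names, ratings, and weights are not prescribed by the framework.

```python
# Illustrative sketch of the weighted-score arithmetic from the example table above.
# Dimensions, ratings, and weights restate the sample values; they are not prescribed.

dimensions = {
    "Development":           (3, 0.20),
    "Behavior":              (4, 0.20),
    "Intermediate outcomes": (3, 0.30),
    "School outcomes":       (2, 0.30),
}

total = 0.0
for name, (rating, weight) in dimensions.items():
    score = rating * weight              # weighted score for this dimension
    total += score
    print(f"{name:<22} rating={rating} weight={weight:.0%} score={score:.2f}")

print(f"Total composite score: {total:.2f}")  # 2.90 for this example
```

Choosing different weights changes how much each dimension contributes to the composite score, which is the valuation decision this element asks evaluation-system designers to make.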

Evaluating the evaluation system

New field

Test out:

Measures

Tools

Processes

Implementation

Evaluate the underlying theory of action

Theory of action of principal evaluation as a lever of change

Principal Evaluation System → Leader practices → Teacher and organizational effectiveness → Student and school outcomes

Making evaluation system design decisions

Start with purpose

Build in an evaluation of the system from the start

Involve critical stakeholders to engage, educate and create buy-in

Keep it simple, easy to use, and easy to understand

Framework for Principal Evaluation: Key evaluation elements and considerations

Columns: Elements, Considerations, Current state policy, Decisions to be made

Elements:
The purposes of assessment
Who is assessed
What is assessed
What sources of evidence are used
How the assessment is conducted
How evidence is valued
What psychometric qualities are maintained
How the assessment system is implemented and operates


Jean Satterfield

Assistant State Superintendent for the Maryland Division of Certification and Accreditation


7 MD Districts Pilot the Model Teacher & Principal Evaluation System

2011-2012: 7 districts run a pilot to identify ways to measure student growth in all subject areas and for all teachers

Student growth will account for 50% of teacher and principal evaluations

2012-2013: Results and feedback from the pilot year inform the no-fault, statewide pilot

Fall 2013: Model fully operational statewide


Pilots Underway…

Baltimore City

8 principal volunteers with 300+ teachers in 8 schools begin 1st cycle in December


Baltimore County

Instrument aligns to the Danielson Model

11 principals self-selected to participate [with 80+ teachers]

Data systems and measures in place

MD District Pilots

Charles County: 7 pilot school principals & 56 teachers now working with teacher leaders to complete a pilot evaluation tool.

Kent County: All 7 schools (2 teachers per school)

Completed internal restructuring

Migrated to a new student data management system


Pilots (continued)

Prince George's County: Aligned with the Danielson model. All principals & 100 teachers in 38 schools. Data systems and measures are progressing.

Queen Anne's County: 7 principals & 126 teachers are exploring cost-effective methods for aligning data, validating student growth measures, and delivering PD.


Pilots (continued)

St. Mary’s County:

Five principals, 11 assistant principals, 235 teachers

Implemented the Danielson model for the past 10 years


Data collection system in place to identify PD needs of teachers, principals and the system

Sarah Brown Wessling

National Teacher of the Year 2010, English Teacher, Johnston High School, Johnston, Iowa

Evaluation Discussion Group

Join the Evaluation Discussion Group

http://scee.groupsite.com/page/teacher-evaluation

On the Collaboration Site Home Page select Evaluation

If you are not already a member, request an invitation


Upcoming Webinars

NEW DATE: November 1, 2:00 EDT

Continuing the Conversation About Educator Evaluation: Next Steps After the SCEE Topical Meeting

Save the date for our December webinar

December 13, 2:00 EDT


30 Minute Q&A

Participants respond to questions regarding the framework tool—we’ll pose three questions

Participants ask questions of the experts

We will post the Q&A on the webinars page at the conclusion of this event

http://scee.groupsite.com/page/webinars


Using the Chat

Find the Chat in the bottom right side of your screen.

To make the Chat appear larger on your screen, click on the triangle next to the Participants list to minimize it.

Questions and comments sent to All Participants are visible to everyone.

To offer an anonymous question or comment privately, click on Circe Stumbo’s name in the list of Chat recipients or email her at circe@westwinded.com.

For technical assistance find William Bentgen in the Chat box or email him at williamb@ccsso.org.


Chat with other SCEE members…

1. Which elements of the Framework for Principal Evaluation generated the most discussion with your team?

Example:

In Maryland, the framework element most discussed was the difference between how to measure "highly effective" and "effective."


Chat with other SCEE members…

2. If you have a Principal Evaluation Model in place, who are you evaluating (“Who is assessed”)?

Example:

In Maryland, principals are included in the evaluation/assessment; we are discussing whether the same model could be used for all levels of administrators, e.g., assistant principals and supervisors.


Chat with other SCEE members…

3. Which elements of the Framework for Principal Evaluation should be the highest priority for SCEE to attend to with future technical assistance (TA)?

Example:

In MD, we would like TA to address validity, reliability, and how to use student growth data.


Please complete the webinar evaluation that you will receive by email.

Thank You


Resources

Brown-Sims, M. (2010). Evaluating School Principals. Tips & Tools. Washington, DC: National Comprehensive Center for Teacher Quality.

Calabrese, R. L., & Zepeda, S. J. (1999). Decision-making assessment: Improving principal performance. The International Journal of Educational Management, 13(1), 6.

Catano, N., & Stronge, J. H. (2006). What are principals expected to do? Congruence between principal evaluation and performance standards. NASSP Bulletin, 90(3), 221-237.

Goldring, E., Porter, A. C., Murphy, J., Elliot, S. N., & Cravens, X. (2007). Assessing learner-centered leadership: Connections to research, professional standards and current practices. Nashville, TN: Vanderbilt University.

Hessel, K., & Holloway, J. (2001). School leaders and standards: A vision for leadership. Princeton, NJ: Educational Testing Service.

Leithwood, K., & Jantzi, D. (2008). Linking leadership to student learning: The contributions of leader efficacy. Educational Administration Quarterly, 44(4), 496-528.

Marzano, R. J., Waters, T., & McNulty, B. A. (2005). School leadership that works: From research to results. Alexandria, VA: Association for Supervision and Curriculum Development.

Resources (cont.)

McREL. (2010). McREL's Principal Evaluation System.

Milanowski, A., & Schuermann, P. (2009). Principal evaluation (PowerPoint slides), Teacher Incentive Fund Grantee Meeting. Bethesda, MD: Center for Educator Compensation Reform.

Murphy, J., Elliott, S. N., Goldring, E., & Porter, A. C. (2006). Learning-centered leadership: A conceptual foundation. Nashville, TN: Vanderbilt University.

Porter, A. C., Goldring, E., Murphy, J., Elliot, S. N., & Cravens, X. (2006). A framework for the assessment of learning-centered leadership. Nashville, TN: Vanderbilt University.

Portin, B., Feldman, S., & Knapp, M. S. (2006). Purposes, uses, and practices of leadership assessment in education. Seattle, WA: Center for the Study of Teaching and Policy, University of Washington.

Reeves, D. B. (2004). Assessing educational leaders. Thousand Oaks, CA: Corwin Press.

Rhode Island Department of Education. (2010, November 9). Working draft: Rhode Island model building administrator professional practice framework. Providence, RI: Rhode Island Department of Education.

Robinson, V. M. J., Lloyd, C. A., & Rowe, K. J. (2008). The impact of leadership on student outcomes: An analysis of the differential effects of leadership types. Educational Administration Quarterly, 44(5), 635-674.