Transcript of Data Interpretation II Workshop 2008 Writing Assessment for Learning.

Page 1: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

Data Interpretation II Workshop

2008 Writing Assessment for Learning

Page 2: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

2

Purposes for the Day – p. 2

• Deepen understanding about the writing assessment project results;

• Initiate reflection and discussion among division-level staff members related to the writing assessment results;

• Provide a range of tools and processes to support division-level staff in their work throughout the system related to school improvement; and,

• Provide opportunity to discuss and plan around the data in the context of school improvement.

Page 3: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

3

Agenda

Opening

Assessment for Learning – Writing Assessment
– Conceptual Framework
– Comparators
– The Reports

Here’s What
– The Data
– Processes to Support School/Division Improvement
– Changing Contexts
– Building Capacity

So What?
– Analysis of Data and Support Structures
– Role of Central Office in Supporting School Improvement
– Sustainability

Now What?
– Using Goals to Inform Planning
– Monitoring & Assessing Progress
– Linking Goals and Assessment Data
– Identifying Interrelationships
– Evidence of Implementation

Closure

Page 4: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

4

Magnetic Quotes

• In various locations around the room are statements regarding data use and school improvement.

• Take a moment to read each and then go to the sign with the statement that resonates most for you.

• Create a pair or trio with your colleagues and discuss why you connect with the statement and what it means to you.

Page 5: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

“Quantitative information – the numbers – takes us out of the realm of assumption, feeling, guesswork, gut instinct, intuition, and bias, into the realm of reliable fact based on measurable evidence.”

Stephen Few, 2004

A goal is a statement that changes how students learn in their classrooms.

Goals should lead to plans that teachers administer and control.

Goals shouldn’t place the solution in someone else’s hands.

O’Neill, J. & Conzemius, A. (2006). The power of SMART goals: Using goals to improve student learning. Bloomington, IN: Solution Tree.

Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable.

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Data are simply information. Individuals and groups create meaning by organizing, analyzing and interpreting data. Interpretation is subjective; data are objective. Frames of reference, the way we see the world, influence the meaning we derive from the data we collect and select. (Wellman & Lipton, 2004)

Planning for the future on the basis of events past is the school planning problem Stephen White (2005) identified as the Rearview Mirror Effect.

Page 6: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

6

Assessment for Learning is a Snapshot

• Results from a large-scale assessment are a snapshot of student performance.
  – The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
  – The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)

Page 7: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

7

Depth and Specificity of Knowledge

From Saskatchewan Learning. (2006). Understanding the numbers.

[Figure: a continuum of assessments – Individual, Classroom, School, Division, Provincial, National, International – ranging from in-depth knowledge of specific students (individual and classroom assessments) to little knowledge of specific students but in-depth knowledge of systems (provincial, national and international assessments).]

Page 8: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

8

Provincial Writing Assessment: Conceptual Framework – p. 3

• Colourful Thoughts
  – As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to think about your reading:

    Wow! I agree with this.
    Hmm! I wonder . . .
    Yikes!

Adapted from Harvey, S. & Goudvis, A. (2007). Strategies that work.

Page 9: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

9

Comparators: Types of Referencing

• Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation, and for planning intervention to improve student writing. (Detailed rubrics, OTL rubrics and test items can be sourced at www.education.gov.sk.ca)

• Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention. (Figure .2b, .3c, .4a, .6b, .7b and .8b.)

Page 10: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

10

Comparators: Types of Referencing

• Experience or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data or to the standards set by the panel.)

• Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., tables comparing the school, division and province.) A small sketch contrasting criterion- and norm-referenced views follows below.
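
To make the two views concrete, here is a minimal sketch, with invented numbers, of how a school might compute a criterion-referenced rate and set it beside a division figure for a norm-referenced comparison. The data, the "met the criterion" flag, and the division rate are illustrative assumptions, not values from the AFL reports.

```python
# Illustrative sketch with invented numbers; not data from the AFL reports.
# Each entry records whether a student reached a specific rubric criterion.
school_results = [True, True, False, True, False, True, True, False]  # hypothetical school
division_rate = 0.62                                                  # hypothetical division rate

# Criterion-referenced view: what share of this school's students met the criterion?
school_rate = sum(school_results) / len(school_results)
print(f"School: {school_rate:.0%} of students met the criterion")

# Norm-referenced view: how does the school compare with the division?
print(f"Division: {division_rate:.0%} (difference: {school_rate - division_rate:+.0%})")

# As the slide cautions, with small groups of students such comparisons
# say little about how to improve instruction.
```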

Page 11: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

11

Comparators: Types of Referencing

• Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)

Page 12: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

12

Opportunity-to-Learn Elements as Reported by Students

• Propensity to Learn
  – Using resources to explore models, generate ideas and assist the writing process
  – Motivation, attitude and confidence
  – Participation, perseverance and completion
  – Reflection

• Knowledge and Use of Before, During and After Writing Strategies

• Home Support for Writing and Learning
  – Encouragement and interaction
  – Access to resources and assistance

Page 13: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

13

Opportunity-to-Learn Elements as Reported by Teachers

• Availability and Use of Resources
  – Teacher as key resource
    • Teacher as writer
    • Use of curriculum
    • Educational qualifications
    • Professional development
  – Time
  – Student resources

• Classroom Instruction and Learning
  – Planning focuses on outcomes
  – Expectations and criteria are clearly outlined
  – Variety of assessment techniques
  – Writing strategies explicitly taught and emphasized
  – Adaptation

Page 14: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

14

Student Performance Outcome Results

Demonstration of the writing process
  – Pre-writing
  – Drafting
  – Revision

Quality of writing product
  – Messaging and content
    • Focus
    • Understanding and support
    • Genre
  – Organization and coherence
    • Introduction, conclusion, coherence
  – Language use
    • Language and word choices
    • Syntax and mechanics

Page 15: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

15

Standards

To help make meaningful longitudinal comparisons in future years, three main processes will be implemented.

1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.

2. The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.

3. Standards will be set for each of the assessment items, so that any differences in difficulty between the two assessments are accounted for by varying the standards for the two assessments.

Page 16: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

16

Opportunity-to-Learn and Performance Standards

• In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one for each assessed grade), consisting of teachers from a variety of settings and post-secondary academics, including Education faculty.

• The panelists studied each genre from the 2008 assessment in significant detail and established expectations for writing process, narrative products and expository products as well as opportunity to learn.

Page 17: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

17

Thresholds of Adequacy and Proficiency

Beginning – Developing – Adequate – Proficient – Insightful

Page 18: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

18

Thresholds of Adequacy and Proficiency

[Figure: the rating scale with two cut points – a Threshold of Adequacy at 1.87, above which performance is Adequate, and a Threshold of Proficiency at 3.92, above which performance is Proficient & Beyond. A small illustrative sketch follows below.]
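
For readers who find a worked example helpful, the sketch below applies the two cut points to a mean rubric score. It is only an illustration: the band labels and the rule of treating a score at or above a threshold as meeting it are assumptions, not the program's documented scoring procedure.

```python
# Illustrative sketch only: classifying a mean rubric score against the two
# thresholds shown on this slide (1.87 and 3.92). The >= comparison rule and
# the band labels are assumptions for the example.

ADEQUACY_THRESHOLD = 1.87
PROFICIENCY_THRESHOLD = 3.92

def classify(mean_rubric_score: float) -> str:
    """Place a mean rubric score into one of three broad bands."""
    if mean_rubric_score >= PROFICIENCY_THRESHOLD:
        return "Proficient & Beyond"
    if mean_rubric_score >= ADEQUACY_THRESHOLD:
        return "Adequate"
    return "Below Adequate"

scores = [1.5, 2.4, 4.1]
print([classify(s) for s in scores])
# ['Below Adequate', 'Adequate', 'Proficient & Beyond']
```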

Page 19: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

19

Cut Scores

• On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels:

  – Opportunity-to-Learn Elements: Excellent Standard; Sufficient Standard (5 Level Scale)
  – Performance Component: Proficient Standard; Adequate Standard (Process – 3 Level Scale; Product – 6 Level Scale)

Page 20: Data Interpretation II Workshop 2008 Writing Assessment for Learning.
Page 21: Data Interpretation II Workshop 2008 Writing Assessment for Learning.
Page 22: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

22

Four Major Categories of Data: Demographics

• Local Data
  – Descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.
  – Can disaggregate other data by demographic variables (a small sketch follows below).

• AFL
  – Opportunity-to-Learn Data
    • Family/Home support for student writing
      – Encouragement and interaction
      – Access to resources

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.
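
By way of illustration only, the sketch below shows what disaggregating a result by demographic variables might look like. The data frame, column names and the "met_standard" flag are invented for the example; they are not fields from the AFL reports.

```python
# Illustrative sketch only: disaggregating a local result by demographic variables.
import pandas as pd

students = pd.DataFrame({
    "grade":        [5, 5, 8, 8, 11, 11],
    "gender":       ["F", "M", "F", "M", "F", "M"],
    "met_standard": [True, False, True, True, False, True],
})

# Percentage meeting the (hypothetical) standard, broken out by grade and gender.
rates = (students
         .groupby(["grade", "gender"])["met_standard"]
         .mean()
         .mul(100)
         .round(1))
print(rates)
```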

Page 23: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

23

Four Major Categories of Data: Student Learning

• Local Data
  – Describes outcomes in terms of standardized test results, grade averages, etc.

• AFL
  – Readiness Related Opportunity-to-Learn Data
    • Using resources to explore writing
    • Student knowledge and use of writing strategies (before, during, after)
  – Student performance outcomes
    • Writing 5, 8, 11 – Narrative and Expository
    • Writing process, writing product & categories within

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 24: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

24

Four Major Categories of Data: Perceptions

• Local Data
  – Provides information regarding what students, parents, staff and community think about school programs and processes.
  – This data is important because people act in congruence with what they believe.

• AFL
  – Readiness Related Opportunity-to-Learn Data
    • Commitment to learn
      – Using resources
      – Motivation & attitude
      – Confidence
      – Participation
      – Perseverance & completion
      – Reflection
    • Knowledge and use of writing strategies

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 25: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

25

Four Major Categories of Data: School Processes

• Local Data
  – What the system and teachers are doing to get the results they are getting.
  – Includes programs, assessments, instructional strategies and classroom practices.

• AFL
  – Classroom Related Opportunity-to-Learn Data
    • Instruction and learning
      – Planning and reflection
      – Expectations and assessment
      – Focus on writing strategies
      – Adaptations
    • Availability and use of resources
      – Teacher
      – Time
      – Resources for students and teachers

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 26: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

26

Examining the Report

• Take a few minutes to look through the entire AFL report.

• Use the section on data in the chart in your handout package to guide your thinking and conversation.

• Note three or four areas of strength and areas for improvement.

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 27: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

27

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 28: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

28

A Changing Context
Hargreaves & Fink (2006)

Old Basics
• Literacy
• Numeracy
• Obedience
• Punctuality

New Basics
• Multiliteracy
• Creativity
• Communication
• IT
• Teamwork
• Lifelong Learning
• Adaptation & Change
• Environmental Responsibility

Page 29: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

29

Saskatchewan’s Changing Context

Old Paradigm
• Content-based curricula
• Teaching from activities
• Assessment sorts and selects
• Streaming
• Those who can, learn

Evolving Paradigm
• Balance between content and process
• Outcome-based learning
• Assessment supports learning
• Adaptive Dimension
• Everyone learns

Page 30: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

30

Judith Warren Little

In large numbers of schools, and for long periods of time, teachers are colleagues in name only. They work out of sight and hearing of one another, plan and prepare their lessons and materials alone, and struggle on their own to solve most of their instructional, curricular and management problems. Against this almost uniform backdrop of isolated work, some schools stand out for the professional relationships they foster among teachers. These schools, more than others, are organized to permit the sort of reflection . . . that has been largely absent from professional preparation and professional work in schools. For teachers in such schools, work involves colleagueship of a more substantial sort.

Page 31: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

Professional Learning Communities

– Transform knowledge
– Shared inquiry
– Evidence informed
– Situated certainty
– Local solutions
– Joint responsibility
– Continuous learning
– Communities of practice

Performance-Training Sects

– Transfer knowledge
– Imposed requirements
– Results driven
– False certainty
– Standardized scripts
– Deference to authority
– Intensive training
– Sects of performance

Hargreaves, A. (2003). Teaching in the knowledge society: Education in the age of insecurity. New York, NY: Teachers College Press.

Page 32: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

32

Building Capacity for Success: Learning by Trial and Evidence.

• In the article, note there are many examples of schools that have engaged in school improvement through a process of trial and evidence.

• In all cases the focus was on the questions:
  – What will students learn?
  – How will teachers best support student learning?
  – What are indicators of success?
  – How will we measure the success of the practices?
  – What can we do to support students not meeting expectations?

Page 33: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

33

Building Capacity for Success: Learning by Trial and Evidence.

• Characteristics of improving schools:
  – A focus on achievement.
  – Built-in monitoring and measuring.
  – Leadership.
  – Involvement of all partners.
  – Consideration of all students’ needs.

Page 34: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

34

Read an Example: Building Capacity for Success: Learning by Trial and Evidence

• Find a partner; decide who is A and who is B.

• Both partners read to the end of the questions on p. 7 of the text, then stop.
  – A summarizes the reading.
  – Pairs craft examples or non-examples from their experiences.

• Both partners continue reading up to the end of p. 8.
  – B summarizes the reading.
  – Pairs craft examples (or non-examples).

• Read to the end of p. 9.
  – A summarizes the reading.
  – Pairs craft examples or non-examples from their experiences.

Page 35: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

35

Reflection Questions

• What key lessons can be taken from this reading and applied to your context?

• In what ways are you gathering evidence of promising practices within your schools and school divisions?

• In what ways do goals reflect your school or division’s core values and beliefs?

• In what ways are you building community with your schools and school divisions?

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 36: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

36

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 37: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

37

Double-Loop Exercise – p. 10

• As a table group, use the provided double-loop to clarify the connections between the professional structures supporting school improvement and the information you are getting from the AFL data.

• In the top circle write out the current structures (PLCs, catalyst teachers, PD) and initiatives (literacy) already in place in your division.

• In the bottom circle write down 3-5 significant (strengths & areas for improvement) indicators from the AFL data.

• Draw arrows from the items in the bottom circle that are connected to or could be supported by items in the top circle.

Lezotte, L. W. & McKee, K. M. (2006). Stepping up: Leading the charge to improve our schools. Okemos, MI: Effective Schools Products, Ltd.

[Example diagram – top circle: PLCs, Catalyst Teachers, Reading Group, Writing Strategies; bottom circle: Majority of students scored at proficient in narrative writing; Students aren’t reporting use of writing processes.]

Page 38: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

38

Double-Loop Exercise

Once you have completed the diagrams, use the following questions to guide discussion at your table:

• What current structures support areas where student performance was strong?

• What current structures could meet the needs identified for improvement?

• What new structures/initiatives may need to be considered?

Page 39: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

39

Please return at 12:50

I’d trade, but peanut butter sticks to my tongue stud.

Page 40: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

40

The Role of Central Office

• Focus on Alignment
  – Equip staff with the knowledge and skills for aligning school improvement processes.
  – Provide multiple opportunities for staff to collaborate around current literature and best practices.
  – Model and encourage reflective practice; aligning improvement efforts requires time for reflection.

Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

Page 41: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

41

The Role of Central Office

• Supporting Alignment Initiatives
  – Link data to the goals and strategies already in place.
  – Keep improvement goals front and center.
  – Engage staffs in discussion about improvement initiatives.

Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

Page 42: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

42

The Role of Central Office

• Getting to Goal
  – Recognize and address alignment problems.
  – Support all schools in aligning improvement plans.
  – Sustain the plan until . . .

Adapted from Mooney, N. J. & Mausbach, A. T. (2008). Align the design: A blueprint for school improvement. Alexandria, VA: ASCD.

Page 43: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

43

Sustainability

• Sustainability is the capacity of a system to engage in the complexities of continuous improvement consistent with deep values of human purpose. (Fullan, 2004)

• Sustainability does not simply mean whether something can last. It addresses how particular initiatives can be developed without compromising the development of others in the surrounding environment, now and in the future. (Hargreaves & Fink, 2000)

Page 44: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

44

Challenges to Sustainability – p. 11

• In your handout package is a template with five common challenges to sustainable collaborative work.

• In groups of 3-6 brainstorm possible solutions for each challenge.

Page 45: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

45

Supporting Improvement

• So what meaning are you making about this information about data and school improvement?

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 46: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

46

Goals to Inform Planning – p. 12

• What are your school or division goals?

• What structures and supports do you have in place to support sustainable improvement towards that goal?

• What kinds of data are you gathering to inform decision making and progress?

[Diagram: DATA (Here’s What, So What?, Now What?) feeding into School Improvement.]

Page 47: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

47

Progress Measure Areas – p. 13

Assessing Progress
• Student Data
  – Short-Term
  – Medium-Term
  – Long-Term
• Evidence of Implementation

Goal Types
• Improvement Goals
• Proficiency Goals

From Boudette, City, & Murnane (2005) and Holcomb (2004).

Page 48: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

48

Improvement and Proficiency

GROWTH
• Improvement refers to students’ growth on a given assessment within a specified period of time.
• A student or group of students may experience great growth but still fall short of set proficiency goals.

COMPETENCE
• Proficiency refers to how many students will achieve a certain level of performance within a specified period of time.
• Proficiency goals don’t measure student growth – they measure how many have reached a set standard or benchmark.

(A small sketch contrasting the two measures follows below.)

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.
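
The sketch below is a minimal illustration of the distinction between improvement and proficiency goals. The pre/post scores are invented, and using 3.92 (the proficiency threshold from the earlier slide) as the benchmark is an assumption made for the example.

```python
# Illustrative sketch only: contrasting an improvement (growth) measure with a
# proficiency measure for the same group of students. Scores and the cut point
# are invented/assumed for the example.

pre_scores  = [1.5, 2.0, 2.8, 3.0, 3.6]   # earlier assessment
post_scores = [2.1, 2.6, 3.1, 3.9, 4.2]   # later assessment
PROFICIENCY_CUT = 3.92                     # assumed benchmark

# Improvement goal: how much did the group grow, on average?
mean_growth = sum(b - a for a, b in zip(pre_scores, post_scores)) / len(pre_scores)

# Proficiency goal: what share of students reached the cut point?
proficient_share = sum(s >= PROFICIENCY_CUT for s in post_scores) / len(post_scores)

print(f"Average growth: {mean_growth:.2f} rubric points")
print(f"Proficient: {proficient_share:.0%} of students")
# The group shows clear growth (improvement), yet most students still fall short
# of the proficiency benchmark - the two goal types answer different questions.
```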

Page 49: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

49

Improvement and Proficiency

• Attending to both improvement and proficiency ensures that students grow academically and achieve degrees of competence in their studies.

• Thinking of growth and competence compels us to consider in what ways all students will grow (weak, average, and gifted) and what levels of competence are desired for all.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 50: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

50

Goal Types

• In what ways do your school division goals reflect improvement?

• In what ways do your school division goals reflect proficiency?

Page 51: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

51

Monitoring and Assessing Progress

• An integral part of every action plan is the detailed plan to ensure that progress is being made.

• It is important to gather data from a variety of sources that clearly demonstrate the plan is being implemented, change in instruction is occurring, and student learning is improving.

Page 52: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

52

Short-, Medium-, and Long-Term Data

• Short-Term Data
  – Gathered daily or weekly via classroom assessments and/or observations.

• Medium-Term Data
  – Gathered at periodic intervals via common department, school, or division assessments. These are usually referred to as benchmark assessments.

• Long-Term Data
  – Gathered annually via standardized provincial, national, or international assessments.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 53: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

53

Short-, Medium-, and Long-Term Assessments – p. 14

• Your school or school division has set a goal to improve students’ quality of writing, particularly as it relates to organization and coherence.

• Teachers’ in-class assessment strategies provide formative feedback to students in these areas – writing effective introductions and conclusions, as well as transitions.

• Writing prompts are developed for each grade level in the school and administered at the end of each reporting period. Teachers collaboratively grade the papers using the rubrics from the Assessment for Learning program and analyze the results together.
  – Following the common assessment, students who have not achieved the set benchmark receive additional instruction and formative assessment as they work towards the goal.

• In 2010 students are again assessed on their writing with the provincial AFL program.

Page 54: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

54

Short- and Medium-Term Assessments

• Referring to your identified goals, indicate what types of short- and medium-term assessments would best measure the progress of students as they work toward the goal.

• It may be useful to plan the medium-term assessments first to provide a framework within which short-term assessments would fit.

• Use the provided Short- and Medium-Term Assessment Planning template to plan when these might be administered.

• You will also want to consider Long-Term Assessments at some point.

Page 55: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

55

Short- and Medium-Term Assessments

• In what ways might you support staff as they design their learning improvement plans? Consider sustainability as you identify the types of assessments needed to provide data to support decision making.

Page 56: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

56

To Consider . . .

• Using the frame of short-, medium-, and long- term data, where do improvement and proficiency goals fall?

• What would be the nature of the assessments used to address improvement goals?

• What would be the nature of the assessments used to address proficiency goals?

Page 57: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

57

Identifying Starting Points: Interrelationship Diagramming

(Wellman & Lipton, 2003)

• Interrelationship diagrams reveal critical relationships among the elements in a system. The intent is to coordinate decision-making to determine choices and starting points for improvement plans.

• This tool is most effective when implemented with a group with a variety of roles and experiences in order to examine multiple perspectives.

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Page 58: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

58

Interrelationship Diagramming

• First, the group identifies all of the issues related to a process, area or goal they are working on.

• For example—improving student writing—what are the related issues and how might they be categorized?
  – Teacher Knowledge About Writing
  – Teacher Instructional Practices
  – Student Interest/Motivation
  – Resources to Support Writing
  – Etc.

• Brainstorm a list of issues related to a goal you are working on in your school division.

• Categorize the issues into 6-8 broad categories. Place the titles of categories on sticky notes.

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Page 59: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

59

Creating an Interrelationship Diagram

Take the sticky notes and place them in a circle, with the issue as a header above.

[Diagram: category sticky notes arranged in a circle around the goal.]

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Page 60: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

60

Identifying Effects and Drivers: Interrelationship Diagramming

• Then the group needs to separate drivers (causes) from effects.
  – For example, in the area of student writing:

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Arrows moving away from boxes indicate drivers, while arrows moving into boxes indicate effects.

[Diagram: an arrow drawn from "Teacher Instructional Practices" (driver) to "Student Interest" (effect).]

Page 61: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

61

Creating an Interrelationship Diagram

• Select one category as a starting point.

• Ask two-way questions to determine whether this category is a driver or an effect of each of the other categories. For example, “Does teacher content knowledge influence (or drive) teacher instructional practices (effect)? Or do teacher instructional practices influence (or drive) teacher content knowledge?” Draw arrows from the drivers to the effects.

• Continue in this way through each of the categories.

• NO two-headed arrows! Decide which category dominates the other.

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.

Arrows moving away from boxes indicate drivers, while arrows moving into boxes indicate effects.

Page 62: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

62

Creating an Interrelationship Diagram

• DRIVERS

– Drivers are indicated by the number of arrows going away from a category.

– Count the number of arrows going away from each category.

– Rank the drivers from highest to lowest.

• EFFECTS

– Effects are indicated by the number of arrows pointing towards a category.

– Count the number of arrows pointing towards each category.

– Rank the effects from highest to lowest. (See the counting sketch below.)

Wellman, B. & Lipton, L. (2004). Data-driven dialogue. Mira Via, LLC.
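
Purely as an illustration, the counting and ranking step described above can also be expressed in a few lines of code. The categories and arrows below are invented to echo the student-writing example; they are not a prescribed part of the process.

```python
# Illustrative sketch only: counting and ranking drivers and effects once the
# group has drawn its arrows. Categories and arrows are invented for the example.
from collections import Counter

# Each (driver, effect) pair is one arrow drawn on the chart paper.
arrows = [
    ("Teacher Knowledge About Writing", "Teacher Instructional Practices"),
    ("Teacher Instructional Practices", "Student Interest/Motivation"),
    ("Resources to Support Writing",    "Teacher Instructional Practices"),
    ("Teacher Instructional Practices", "Student Writing Quality"),
    ("Student Interest/Motivation",     "Student Writing Quality"),
]

outgoing = Counter(driver for driver, _ in arrows)   # arrows going away = driver strength
incoming = Counter(effect for _, effect in arrows)   # arrows coming in  = effect strength

print("Drivers (ranked):", outgoing.most_common())
print("Effects (ranked):", incoming.most_common())
# The top-ranked drivers are the candidate intervention points for the action plan.
```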

Page 63: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

63

Creating Action Plans Based on Drivers

• Select the top 3-5 drivers.

• The drivers are the effective intervention points and thus provide the best opportunities to align all the elements of school/division improvement:
  – goals,
  – professional development,
  – assessment,
  – data collection

• You will want to consider the role of research and literature in providing background information for this activity or its role as follow up to the activity.

Page 64: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

64

Evidence of Implementation – p. 15 & 16

• Once teachers have decided on teaching/ learning strategies to use to improve student learning, it is important to identify implementation indicators.

• On the following slides are two examples of ways to gather evidence of implementation.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 65: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

IMPLEMENTATION INDICATORS

Writing Process: Prewriting – What we will see and hear in classrooms

Teachers
• Teachers will teach strategies of effective brainstorming.
• Teachers will bridge the prior knowledge of students to the writing prompt.
• Teachers engaging students in discussion around ideas.

Students
• Students will be using a variety of graphic organizers.
• Students will work collaboratively to brainstorm.
• Students will be sorting and selecting ideas.
• Students will be making revisions to their prewriting.

Classrooms
• Noise, as students collaborate.
• Samples of graphic organizers.
• Information on brainstorming as a prewriting strategy.

Student Work
• Prewriting will be extensive, relevant and organized.
• Drafts are connected to prewriting.
• Revisions include meaningful changes.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 66: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

66

Indicators of School Based Application of Assessment for Learning

• How are teachers checking to see what has been learned and what needs to be learned next?

• How are teachers ensuring that students have access to specific and descriptive feedback, in relation to criteria that are focused on improvement?

• How are teachers finding ways to reduce evaluative feedback?

• How are teachers involving students – the people most able to improve the learning – deeply in the assessment process?

Davies, A., Herbst-Luedtke, S. & Parrot Reynolds, B. (2008). Leading the way to make classroom assessment work. Courtenay, B. C.: Connections Publishing.

Page 67: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

67

Indicators of Classroom Application using Assessment for Learning

• Consider a division goal within a curricular area (e.g., literacy):
  – Descriptions of success – learning destinations within the course of study – are posted in the classroom or handed out to students and parents. These descriptions reflect the standards or learning outcomes and express them in simple terms that everyone can understand.

Davies, A., Herbst-Luedtke, S. & Parrot Reynolds, B. (2008). Leading the way to make classroom assessment work. Courtenay, B. C.: Connections Publishing.

Page 68: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

68

Indicators of Classroom Application of Assessment for Learning

• Look for:
  – Students are able to answer the question – ‘What do you need to know to be successful?’ – by articulating the important ideas (or referring to a handout which does so) and describing how this knowledge or set of skills will be useful outside of school.

Davies, A., Herbst-Luedtke, S. & Parrot Reynolds, B. (2008). Leading the way to make classroom assessment work. Courtenay, B. C.: Connections Publishing.

Page 69: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

69

Indicators of Classroom Application of Assessment for Learning

• Look for:
  – Teachers are able to summarize the learning destination and explicitly describe how the activity, assignment, or range of activities and assignments help all students learn. Furthermore, teachers can show plans for how student evidence or proof of learning will account for all the standards or outcomes.

Davies, A., Herbst-Luedtke, S. & Parrot Reynolds, B. (2008). Leading the way to make classroom assessment work. Courtenay, B. C.: Connections Publishing.

Page 70: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

70

Indicators of Classroom Application of Assessment for Learning

• Look for:
  – In response to the question – ‘What does quality look like?’ – students will refer to models, exemplars, or criteria.

Davies, A., Herbst-Luedtke, S. & Parrot Reynolds, B. (2008). Leading the way to make classroom assessment work. Courtenay, B. C.: Connections Publishing.

Page 71: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

71

Evidence of Implementation

• Using either template provided, discuss and write down what data would need to be gathered as evidence that each of the indicators is being actualized in an effective manner, i.e., the strategies are being used as designed rather than as loose interpretations of the strategy.
  – Who will collect the data?
  – How will it be collected?
  – When will it be collected?

Page 72: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

72

Advancing Assessment Literacy Modules

• 17 modules designed to facilitate conversations and work with data for the improvement of instruction.

• www.spdu.ca
  – Publications
    • Advancing Assessment Literacy Modules

• Download a PDF of a PowerPoint and an accompanying lesson plan for use by education professionals in schools.

Page 73: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

Advancing Assessment Literacy - Module Selection Matrix The intended audience of each module is indicated below. Modules marked Informational Activity are to demonstrate the processes to groups that will not be making school level decisions.

Module – Total Time* – Intended Audience
(Audience columns in the original matrix: Teachers; In-School Administration; School Community Council; School Boards; Central Office Staff)

Setting the Stage:
  I Engaging Stakeholders – 3 hours – ● ● ● ● ●
  II Understanding Data Purposes and Uses – 3 hours – ● ● ● ● ●
  III Freedom of Information and Protection of Privacy Act for Central Office – 2 hours 20 minutes
  III Freedom of Information and Protection of Privacy Act for General Audience – 45 minutes – ● ● ● ●
  IV Building Learning Communities – 3.5 – 6 hours – ● ● ●

Data Gathering:
  I Establishing Outcomes – 3 – 3.5 hours – ● ● Informational Activity, Informational Activity
  II Creating Questions – 1 hour – ● ● Informational Activity, Informational Activity
  III Identifying and Valuing Different Types of Data – 1.5 – 2 hours – ● ● Informational Activity, Informational Activity
  IV Collecting and Collating Data – 1 – 1.5 hours – ● ●

Data Analysis:
  I Summarizing, Representing and Sharing Data – 1.5 hours – ● ● ● ● ●
  II Examining and Interpreting Data – 2.5 hours – ● ●
  III Extending the Assessment – 1 – 1.5 hours – ● ●

Data Informed Decision Making:
  I Building a Collaborative Culture – 2 hours – ● ● ● ● ●
  II Goal Setting – 2 hours – ● ●
  III Creating Action Plans – 2.5 – 3 hours – ● ●
  IV Monitoring and Assessing Progress – 2.5 – 3 hours – ● ● ●

Designing Interventions – 3 – 3.5 hours – ● ● ●
Continuing the Conversation – 3 – 3.5 hours – ● ● ● ● ●

* All modules are broken into smaller units of time so that the material can be delivered in a variety of formats – staff meetings, PLC meetings, half-day, full-day, etc.

Page 74: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

74

Reflection

Partner Interviews:

• Select a one-word summary for today.

• Why did you choose that word?

• What commitments are you making to yourself?

Page 75: Data Interpretation II Workshop 2008 Writing Assessment for Learning.

75

Evaluation

• Deepen understanding about the writing assessment project results;

• Initiate reflection and discussion among division-level staff members related to the writing assessment results;

• Provide a range of tools and processes to support division-level staff in their work throughout the system related to school improvement; and,

• Provide opportunity to discuss and plan around the data in the context of school improvement.