Data Interpretation I Workshop


Transcript of Data Interpretation I Workshop

Page 1: Data Interpretation I Workshop

Data Interpretation I Workshop

2008 Writing Assessment for Learning

Page 2: Data Interpretation I Workshop

2

Purposes for the Day

• Bring context and meaning to the writing assessment project results;

• Initiate reflection and discussion among school staff members related to the writing assessment results;

• Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;

• Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,

• Provide an opportunity to discuss and plan around the data.

Page 3: Data Interpretation I Workshop

3

Agenda

• Understanding data—sources, categories & uses
• Provincial Writing Assessment
– Conceptual Framework
– Comparators
– Student Performance Data
– Opportunity to Learn Data
– Standards and Cut Scores
• Predicting
• Categories of Data
• Action Planning
– Linking Data, Goals and Intervention
• Closure

Page 4: Data Interpretation I Workshop

4

Synectics

• Please complete the following statement: “Data use in schools is like . . . because . . .”

Data use in schools is like molasses because it is slow and gets slower as it gets colder.

Data use in schools is like molasses because it is sticky and can make a big mess!

Page 5: Data Interpretation I Workshop

5

A Data-Rich Environment

Wellman & Lipton (2004) state:

Schools and school districts are rich in data. It is important that the data a group explores are broad enough to offer a rich and deep view of the present state, but not so complex that the process becomes overwhelming and unmanageable.

Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

Page 6: Data Interpretation I Workshop

6

International Data Sources

• Programme for International Student Assessment (PISA)


Page 7: Data Interpretation I Workshop

7

National Data Sources

• Pan-Canadian Assessment Program (PCAP)

• Canadian Test of Basic Skills (CTBS)

• Canadian Achievement Tests (CAT3)

Page 8: Data Interpretation I Workshop

8

Provincial Data Sources

• Assessment for Learning (AFL)
– Opportunity-to-Learn Measures
– Performance Measures

• Departmentals


Page 9: Data Interpretation I Workshop

9

Division Data Sources

• Division level rubrics

• Division benchmark assessments


Page 10: Data Interpretation I Workshop

10

Local Data Sources

• Cumulative folders

• Teacher designed evaluations

• Portfolios

• Routine assessment data

Page 11: Data Interpretation I Workshop

11

Nature of Assessment Data

From Saskatchewan Learning. (2006). Understanding the numbers.

[Continuum diagram: assessment data run from definitive to indicative as the level of aggregation moves from individual, classroom, school, and division to provincial, national, and international; data at the individual end serve student evaluations, while data at the system end serve system evaluations.]

Page 12: Data Interpretation I Workshop

12

Depth and Specificity of Knowledge

From Saskatchewan Learning. (2006). Understanding the numbers.

[Continuum diagram: moving from individual and classroom assessments through school, division, and provincial assessments to national and international assessments, knowledge shifts from in-depth knowledge of specific students to in-depth knowledge of systems, with little knowledge of specific students.]

Page 13: Data Interpretation I Workshop

13

Using a Variety of Data Sources

• Thinking about the data sources available, their nature and the depth of knowledge they provide, how might the information in each impact the decisions you make?
– What can you do with this data?
– What is its impact on classrooms?

Page 14: Data Interpretation I Workshop

14

Using a Variety of Data Sources

Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | |

Page 15: Data Interpretation I Workshop

15

Using a Variety of Data Sources

Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level |

Page 16: Data Interpretation I Workshop

16

Using a Variety of Data Sources

Data Sources | Uses | Impact on Classroom
Provincial: AFL, Departmental | AFL data can be used as a snapshot of achievement at the school, school division and provincial level to inform planning at each level | Long-term impact on planning as teachers work to capitalize on areas of strength and address areas for improvement

Please refer to the “Using a Variety of Data Sources” template on p. 3 in your handout package as a guide for your discussion.

Page 17: Data Interpretation I Workshop

17

Assessment for Learning is a Snapshot

• Results from a large-scale assessment are a snapshot of student performance.
– The results are not definitive. They do not tell the whole story. They need to be considered along with other sources of information available at the school.
– The results are more reliable when larger numbers of students participate and when aggregated at the provincial and division level, and should be considered cautiously at the school level. Individual student mastery of learning is best determined through effective and ongoing classroom-based assessment. (Saskatchewan Learning, 2008)

Page 18: Data Interpretation I Workshop

18

Provincial Writing Assessment: Conceptual Framework – p. 4 & 5

• Colourful Thoughts
– As you read through the information on the Provincial Writing Assessment, use highlighters or sticky notes to think about your reading:

Wow! I agree with this.

Hmm! I wonder . . .

Yikes!

Adapted from Harvey, S. & Goudvis, A. (2007). Strategies that work.

Page 19: Data Interpretation I Workshop

19

Comparators: Types of Referencing – p. 6

• Criterion-referenced: Comparing how students perform relative to curriculum objectives, level attribution criteria (rubrics) and the level of difficulty inherent in the assessment tasks. If low percentages of students are succeeding with respect to specific criteria identified in rubrics, this may be an area for further investigation, and for planning intervention to improve student writing.(Detailed rubrics, OTL rubrics and test items can be sourced at www.education.gov.sk.ca)

• Standards-referenced: Comparing how students performed relative to a set of professionally or socially constructed standards. Results can be compared to these standards to help identify key areas for investigation and intervention.(Figure .2b, .3c, .4a, .6b, .7b and .8b.)

Page 20: Data Interpretation I Workshop

20

Comparators: Types of Referencing

• Experience or self-referenced: Comparing how students perform relative to the assessment data gathered by teachers during the school year. Where discrepancies occur, further investigation or intervention might be considered. It is recommended that several sources of data be considered in planning. (E.g., comparing these results to current school data or to the standards set by the panel.)

• Norm-referenced: Comparing how students in a school performed relative to the performance of students in the division, region or project. Note cautions around small groups of students. Norm-referenced comparisons contribute very little to determining how to use the assessment information to make improvements. (E.g., tables comparing the school, division and province.)

Page 21: Data Interpretation I Workshop

21

Comparators: Types of Referencing

• Longitudinal-referenced: Comparing how students perform relative to earlier years’ performance of students. Viewed across several years, assessment results and other evidence can identify trends and improvements. (This data will not appear until the next administration of this assessment.)
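To see how these comparator types differ in practice, here is a minimal sketch (not from the workshop materials); all of the numbers are invented, and only the distinction between comparison targets comes from the slides above.

```python
# Illustrative sketch contrasting standards- and norm-referenced
# comparisons from the slides above. All numbers are invented.

school_pct_adequate = 72.0    # hypothetical: % of a school's students at or above adequate
division_pct_adequate = 78.0  # hypothetical division result (norm-referenced comparator)
panel_standard_pct = 75.0     # hypothetical panel-set standard (standards-referenced comparator)

# Standards-referenced: compare the school result to a professionally
# constructed standard.
print("vs. standard:", school_pct_adequate - panel_standard_pct)     # -3.0

# Norm-referenced: compare the school result to the division. Note the
# slide's cautions: small groups make this unreliable, and it says little
# about how to improve.
print("vs. division:", school_pct_adequate - division_pct_adequate)  # -6.0
```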

Page 22: Data Interpretation I Workshop

22

Opportunity-to-Learn Elements as Reported by Students

• Propensity to Learn
– Using resources to explore models, generate ideas and assist the writing process
– Motivation, attitude and confidence
– Participation, perseverance and completion
– Reflection

• Knowledge and Use of Before, During and After Writing Strategies

• Home Support for Writing and Learning
– Encouragement and interaction
– Access to resources and assistance

Page 23: Data Interpretation I Workshop

23

Opportunity-to-Learn Elements as Reported by Teachers

• Availability and Use of Resources
– Teacher as key resource (teacher as writer; use of curriculum; educational qualifications; professional development)
– Time
– Student resources

• Classroom Instruction and Learning
– Planning focuses on outcomes
– Expectations and criteria are clearly outlined
– Variety of assessment techniques
– Writing strategies explicitly taught and emphasized
– Adaptation

Page 24: Data Interpretation I Workshop

24

Student Performance Outcome Results

Demonstration of the writing process
– Pre-writing
– Drafting
– Revision

Quality of writing product
– Messaging and content (focus; understanding and support; genre)
– Organization and coherence (introduction, conclusion, coherence)
– Language use (language and word choices; syntax and mechanics)

Page 25: Data Interpretation I Workshop

25

Standards

To help make meaningful longitudinal comparisons in future years, three main processes will be implemented.

1. Assessment items will be developed for each assessment cycle using a consistent table of specifications.

2. The assessment items will undergo field-testing, one purpose of which is to inform the comparability of the two assessments.

3. A process will be used to set standards for each of the assessment items, so that any differences in difficulty between two assessments are accounted for by varying the standards for the two assessments.

Page 26: Data Interpretation I Workshop

26

Opportunity-to-Learn and Performance Standards

• In order to establish Opportunity-to-Learn and Performance standards for the 2008 Writing Assessment, three panels were convened (one for each assessed grade), consisting of teachers from a variety of settings and post-secondary academics including Education faculty.

• The panelists studied each genre from the 2008 assessment in significant detail and established expectations for writing process, narrative products and expository products as well as opportunity to learn.

Page 27: Data Interpretation I Workshop

27

Thresholds of Adequacy and Proficiency

Beginning | Developing | Adequate | Proficient | Insightful

Page 28: Data Interpretation I Workshop

28

Thresholds of Adequacy and Proficiency

Threshold of Adequacy: 1.87 (scores at or above are Adequate)

Threshold of Proficiency: 3.92 (scores at or above are Proficient & Beyond)
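As a minimal illustration of how these two cut points partition the five-level scale, the sketch below (not part of the original workshop materials) classifies a mean rubric score; the threshold values come from the slide above, while the function and the label for scores below the adequacy threshold are assumptions.

```python
# Minimal sketch: classify a mean rubric score on the 1-5 scale against
# the two panel-set thresholds shown above. The threshold values come
# from the workshop; the label for scores below 1.87 is assumed.

THRESHOLD_OF_ADEQUACY = 1.87
THRESHOLD_OF_PROFICIENCY = 3.92

def classify(mean_score: float) -> str:
    """Map a mean rubric score to a reporting category."""
    if mean_score >= THRESHOLD_OF_PROFICIENCY:
        return "Proficient & Beyond"
    if mean_score >= THRESHOLD_OF_ADEQUACY:
        return "Adequate"
    return "Below Adequate"  # assumed label for scores under the adequacy threshold

for score in (1.5, 2.6, 4.1):
    print(score, "->", classify(score))
```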

Page 29: Data Interpretation I Workshop

29

Cut Scores

• On page 4 of the detailed reports you will find the cut scores detailing the percentage correct required for students to be classified at one of two levels:

– Opportunity-to-Learn Elements (5-level scale): Sufficient Standard and Excellent Standard

– Performance Component (Process: 3-level scale; Product: 6-level scale): Adequate Standard and Proficient Standard

Page 30: Data Interpretation I Workshop
Page 31: Data Interpretation I Workshop
Page 32: Data Interpretation I Workshop

32

Page 33: Data Interpretation I Workshop

33

Predicting Card Stack and Shuffle

• Individually: As you refer to the cut scores on page 4, create a stack of cards with some of your predictions about student outcomes in Narrative and Expository writing – consider each separately.
– Writing Process (pre-writing, drafting, revising)
– Writing Product (message, organization and language choices)

• E.g. I predict 85% of our Gr. 8s will meet the adequate standard or higher in Propensity to Learn and of those, 20% will be proficient or higher because our students are very comfortable with writer’s workshop processes, which we have emphasized for the last three years.

• E.g. I predict 90% of our Gr. 5s will score adequate or higher on demonstration of writing process in narrative writing because of our whole school emphasis on writing, especially with respect to narrative writing.

Page 34: Data Interpretation I Workshop

34

Predicting Card Stack and Shuffle

• As you complete each card, place it in the center of the table.
• As a group, shuffle the cards.
• In turn, each group member picks a card to read aloud to the table group. The group engages in dialogue or discussion about the items.
• Guiding questions:
– With what parts of this prediction do you agree? Why?
– With what parts of this prediction do you disagree? Why?
– To what extent is this prediction generalizable to all the classrooms in your school?

Page 35: Data Interpretation I Workshop

35

Predictions

• Considering all of the predictions, are there any themes or patterns emerging upon which you can all agree?
– Why might this be?

Page 36: Data Interpretation I Workshop

36

Comparisons

The completed tables are on page 7.
– What are you noticing about the data?
– What surprised you?

• Which of your predictions were confirmed?
• Which of your predictions were not confirmed?
• Consider your assumptions as you discuss the results.

Wellman, B. & Lipton, L. (2004). Data driven dialogue. Mira Via, LLC.

Page 37: Data Interpretation I Workshop

37

Examining the Report

• Take a few minutes to look through the entire AFL report. Use the chart below to guide your thinking and conversation.

[Guiding chart: rows for Performance Data and OTL Data; columns for “What pops out? Strengths, Areas of Improvement” and “Questions”.]

Page 38: Data Interpretation I Workshop

38

Please return at 12:40


Page 39: Data Interpretation I Workshop

39

Local Level Sources of Data

While international, national and provincial sources of data can provide direction for school initiatives, it is the data collected at the local level that provide the most detailed information regarding the students in classrooms.

Page 40: Data Interpretation I Workshop

40

Four Major Categories of Data: Demographics – p. 7

• Local Data
– Descriptive information such as enrollment, attendance, gender, ethnicity, grade level, etc.
– Can disaggregate other data by demographic variables.

• AFL
– Opportunity-to-Learn Data: family/home support for student writing (encouragement and interaction; access to resources)

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.
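To make “disaggregate other data by demographic variables” concrete, here is a minimal pandas sketch; it is not from the workshop, and the column names and values are hypothetical.

```python
# Minimal sketch of disaggregating achievement data by demographic
# variables, assuming a DataFrame with hypothetical columns "grade",
# "gender" and "writing_score".
import pandas as pd

records = pd.DataFrame({
    "grade":         [5, 5, 8, 8, 8, 11],
    "gender":        ["F", "M", "F", "M", "F", "M"],
    "writing_score": [3.2, 2.8, 3.9, 3.1, 4.0, 2.6],
})

# Mean writing score disaggregated by grade and gender.
summary = records.groupby(["grade", "gender"])["writing_score"].mean()
print(summary)
```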

Page 41: Data Interpretation I Workshop

41

Four Major Categories of Data: Student Learning

• Local Data
– Describes outcomes in terms of standardized test results, grade averages, etc.

• AFL
– Readiness-Related Opportunity-to-Learn Data: using resources to explore writing; student knowledge and use of writing strategies (before, during, after)
– Student performance outcomes: Writing 5, 8, 11 – Narrative and Expository (writing process; writing product)

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 42: Data Interpretation I Workshop

42

Four Major Categories of Data: Perceptions

• Local Data
– Provides information regarding what students, parents, staff and community think about school programs and processes.
– This data is important because people act in congruence with what they believe.

• AFL
– Readiness-Related Opportunity-to-Learn Data
• Commitment to learn: using resources; motivation & attitude; confidence; participation; perseverance & completion; reflection
• Knowledge and use of writing strategies

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 43: Data Interpretation I Workshop

43

Four Major Categories of Data: School Processes

• Local Data
– What the system and teachers are doing to get the results they are getting.
– Includes programs, assessments, instructional strategies and classroom practices.

• AFL
– Classroom-Related Opportunity-to-Learn Data
• Instruction and learning: planning and reflection; expectations and assessment; focus on writing strategies; adaptations
• Availability and use of resources: teacher; time; resources for students and teachers

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 44: Data Interpretation I Workshop

44

What Data are Useful and Available? – p. 8

• Think about the goals/priorities set within your school or school division regarding student writing.

• Using the supplied template, begin to catalogue the data you already have and the data you need in order to better address the goals that have been set.

• An example follows on the next slide.

Page 45: Data Interpretation I Workshop

Goal: Students will consciously use writing strategies for all genres.

Questions | What data do you have to answer the questions? | What other data do you need to gather?
Demographics | Grade levels teaching writing strategies | Number of teachers teaching writing skills
Perceptions | Student journals regarding writing habits | Parent feedback; student perception of their success
Student Learning | Division writing benchmarks; AFL for Gr. 5, 8, 11 | Student use of writing skills; common writing assessments at all grades
School Processes | Current instructional practice in teaching writing | Writing skills explicitly taught in each subject

Bernhardt, V. L. (2004). Data analysis for continuous school improvement, 2nd Edition. Larchmont, NY: Eye on Education.

Page 46: Data Interpretation I Workshop

46

Designing Interventions

• Assumptions must be examined because our interventions will be based on them.

• We must strive to correctly identify the causal factors.

• Don’t fall in love with any theory until you have other data.

• Use a strength-based approach to interventions.

Page 47: Data Interpretation I Workshop

47

Team Action Plan

• Please turn to page 9 in your handout package.
• What are some areas of strength indicated within your data?
• What are some areas for improvement indicated within your data?
• Please consider all aspects of the report including the Opportunity to Learn Measures.

Page 48: Data Interpretation I Workshop

48

Fishbone Analysis: Strengths – p. 10

• At your table, analyze one strength and consider all contributing factors that led to that strength.

[Example fishbone: strength = Writing Process; contributing factors: all classrooms using Writers’ Workshop; PLC read Strategies that Work; majority of PD focused on writing; teachers explicitly teaching pre-writing strategy in all subjects.]

Page 49: Data Interpretation I Workshop

49

Fishbone Analysis: Area for Improvement – p. 11

• Identify one area for improvement.
• What elements from your area of strength could contribute to improvement in this area?
– E.g. We did well in the process of writing because all teachers are explicitly teaching pre-writing across the curriculum with every writing activity.
– So, we need to explicitly teach how to write introductions, conclusions, and transitions in all subject areas.

Page 50: Data Interpretation I Workshop

50

Setting a Goal – p. 12

• Based on your previous discussions regarding strengths and areas for improvement, write a goal statement your team will work on over the coming year.

• E.g. For the 2010 AFL in Writing, all students will score at level 4 and above with respect to their use of before, during and after writing strategies.

• Write your goal on the provided bubble map. This is a template – add more bubbles if you need them! You do not have to fill in all the bubbles.

• Brainstorm possible strategies for meeting that goal. You may need to use different strategies at different grade levels.

Page 51: Data Interpretation I Workshop

51

Research Instructional Strategies – p. 13

• Once you have completed brainstorming strategies, you will want to conduct some research on the effectiveness of those strategies.

• Available resources could include a variety of websites, the professional collection at the Stewart Resources Centre and the McDowell Foundation (www.stf.sk.ca).

Page 52: Data Interpretation I Workshop

52

Impact/Feasibility – p. 14

• Once you have completed your research, conduct an impact/feasibility analysis of the strategies you have identified.

• Impact refers to the degree to which a strategy will make a difference in the learning of students. A high impact strategy will make the greatest difference in learning for the broadest population of students.

• Feasibility refers to the practical supports that need to be in place such as time, funding, scheduling, etc.

Strategy | Impact | Feasibility
Activate prior knowledge before writing new texts | High | High
Adopt new curriculum materials | Medium | Low

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

When done, choose the strategy that will have the greatest impact and is most feasible to implement.
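If a team wanted to tabulate this prioritization, a minimal sketch might look like the following; the numeric scoring of High/Medium/Low ratings is an illustrative assumption, and the strategies echo the example table above.

```python
# Minimal sketch of ranking strategies by impact, then feasibility.
# The numeric scoring of High/Medium/Low ratings is an assumption.
RATING = {"Low": 1, "Medium": 2, "High": 3}

strategies = [
    ("Activate prior knowledge before writing new texts", "High", "High"),
    ("Adopt new curriculum materials", "Medium", "Low"),
]

# Sort highest impact first, breaking ties on feasibility.
ranked = sorted(strategies,
                key=lambda s: (RATING[s[1]], RATING[s[2]]),
                reverse=True)

for name, impact, feasibility in ranked:
    print(f"{name}: impact={impact}, feasibility={feasibility}")
```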

Page 53: Data Interpretation I Workshop

Data-Driven Decision Making Improvement Cycle – p. 16

1. Find the data – “Treasure Hunt”

2. Data Analysis and Strength Finder

3. Needs Analysis

4. Goal Setting and Revision

5. Identify Specific Strategies to Achieve Goals

6. Determine Results Indicators

7. Action Plan, Schedule, REVIEW

(White, 2005)

Page 54: Data Interpretation I Workshop

54

Four Tasks of Action Planning – p. 17

1. Decide on strategies for improvement.

2. Agree on what your plan will look like in classrooms.

3. Put the plan down on paper.

4. Plan how you will know if the plan is working.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 55: Data Interpretation I Workshop

55

Put the Plan Down on Paper

• By documenting team members’ roles and responsibilities and specifying the concrete steps that need to occur, you build internal accountability for making the plan work.

• Identifying the professional development time and instruction your team will need and including it in your action plan lets teachers know they will be supported through the process of instructional improvement.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 56: Data Interpretation I Workshop

56

Writing Out The Plan – p. 18

• Using the supplied “Action Plan” template, begin to draft the details of the plan as you work toward achieving your goal.

• The supplied template is only a suggestion – you may use another format or create one of your own design.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 57: Data Interpretation I Workshop

57

Plan How You Will Know if the Plan is Working

• Before implementing your plan, it is important to determine what type of data you will need to collect in order to understand whether students are moving towards the goal.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 58: Data Interpretation I Workshop

58

Different Lenses – p. 20

• What types of data might be required to gain a clearer picture of how specific groups of students are doing?
– Consider the four categories of data available – demographics, perceptions, student learning and school processes – as you explore what types of data you need.

Page 59: Data Interpretation I Workshop

59

Short-, Medium-, and Long-Term Data

• Short-Term Data
– Gathered daily or weekly via classroom assessments and/or observations.

• Medium-Term Data
– Gathered at periodic intervals via common department, school, or division assessments. These are usually referred to as benchmark assessments.

• Long-Term Data
– Gathered annually via standardized provincial, national, or international assessments.

Boudette, K., City, E. A., & Murnane, R. J. (2005). Data wise: A step-by-step guide to using assessment results to improve teaching and learning. Cambridge, MA: Harvard Education Press.

Page 60: Data Interpretation I Workshop

60

Short- and Medium-Term Assessments

• Referring to your action plan, identify what types of short- and medium-term assessments would best measure the progress of students as they work toward the goal.

• It may be useful to plan the medium-term assessments first to provide a framework within which short-term assessments would fit.

• Use the provided Short- and Medium-Term Assessment Planning template to plan when these might be administered.

Page 61: Data Interpretation I Workshop

61

Short-, Medium-, and Long-Term Assessments

• Your school or school division has set a goal to improve students’ quality of writing, particularly as it relates to organization and coherence.

• Teachers’ in-class assessment strategies provide formative feedback to students in these areas – writing effective introductions and conclusions, as well as transitions.

• Writing benchmark prompts are developed for each grade level in the school and administered at the end of each reporting period. Teachers collaboratively grade the papers using the rubrics from the Assessment for Learning program and analyze the results together.
– Following the common assessment, students who have not achieved the set benchmark receive additional instruction and formative assessment as they work towards the goal.

• In 2010 students are again assessed on their writing with the provincial AFL program.

Page 62: Data Interpretation I Workshop

62

Advancing Assessment Literacy Modules – p. 21

• 17 Modules designed to facilitate conversations and work with data for improvement of instruction.

• www.spdu.ca
– Publications
– Advancing Assessment Literacy Modules

• Download a PDF of a PowerPoint and accompanying Lesson Plan for use by education professionals in schools.

• The PPT of this workshop will also be available on the same site.

Page 63: Data Interpretation I Workshop

Advancing Assessment Literacy - Module Selection Matrix

The intended audience of each module is indicated below. Modules marked “Informational Activity” are to demonstrate the processes to groups that will not be making school-level decisions.

Module | Total Time* | Teachers | In-School Administration | School Community Council | School Boards | Central Office Staff

Setting the Stage:
I Engaging Stakeholders | 3 hours | ● | ● | ● | ● | ●
II Understanding Data Purposes and Uses | 3 hours | ● | ● | ● | ● | ●
III Freedom of Information and Protection of Privacy Act for Central Office | 2 hours 20 minutes | | | | | ●
III Freedom of Information and Protection of Privacy Act for General Audience | 45 minutes | ● | ● | ● | ● |
IV Building Learning Communities | 3.5 – 6 hours | ● | ● | | | ●

Data Gathering:
I Establishing Outcomes | 3 – 3.5 hours | ● | ● | Informational Activity | Informational Activity |
II Creating Questions | 1 hour | ● | ● | Informational Activity | Informational Activity |
III Identifying and Valuing Different Types of Data | 1.5 – 2 hours | ● | ● | Informational Activity | Informational Activity |
IV Collecting and Collating Data | 1 – 1.5 hours | ● | ● | | |

Data Analysis:
I Summarizing, Representing and Sharing Data | 1.5 hours | ● | ● | ● | ● | ●
II Examining and Interpreting Data | 2.5 hours | ● | ● | | |
III Extending the Assessment | 1 – 1.5 hours | ● | ● | | |

Data Informed Decision Making:
I Building a Collaborative Culture | 2 hours | ● | ● | ● | ● | ●
II Goal Setting | 2 hours | ● | ● | | |
III Creating Action Plans | 2.5 – 3 hours | ● | ● | | |
IV Monitoring and Assessing Progress | 2.5 – 3 hours | ● | ● | | | ●

Designing Interventions | 3 – 3.5 hours | ● | ● | | | ●
Continuing the Conversation | 3 – 3.5 hours | ● | ● | ● | ● | ●

* All modules are broken into smaller units of time so that the material can be delivered in a variety of formats – staff meetings, PLC meetings, half-day, full-day, etc.

Page 64: Data Interpretation I Workshop

64

Reflection

• What did you discover today that surprised you?

• What will you take with you from today?

Page 65: Data Interpretation I Workshop

65

Evaluation

• Bring context and meaning to the writing assessment project results;

• Initiate reflection and discussion among school staff members related to the writing assessment results;

• Encourage school personnel to judiciously review and utilize different comparators when judging writing assessment results;

• Model processes that can be used at the school and division level for building understanding of the data among school staff and the broader community; and,

• Provide an opportunity to discuss and plan around the data.