
Unpacking Assessment Literacy Intentionally Assessing Learning in Your School

National Association of Elementary School Principals

Presented by Lindsey Allard Agnamba, Laura Bornfreund, and Kimberly Pearson Cooke

Objectives: Principals will

• Have the opportunity to reflect on assessment literacy as it relates to their schools and their leadership

• Learn how to help teachers use assessment data to inform and differentiate their instruction to meet individual student needs

• Develop an action plan to promote more intentional, non-duplicative assessments

Agenda

• Defining assessment literacy

– Voices of district and school leaders

• The context for assessment of young children

• Case study and discussion questions

• Develop an action plan

Research suggests that teachers spend from 1/4 to 1/3 of their professional time on assessment-related activities. Almost all do so without the benefit of having learned the principles of sound assessment.

What is Assessment Literacy?

It helps first to consider the definition of assessment: a process of gathering and documenting information about the achievement, skills, abilities, and personality variables of an individual. Assessment literacy, then, is knowledge of the basic principles of sound assessment practice, including terminology, development, administration, analysis, and standards of quality.

-- Northwest Evaluation Association (NWEA)

Key terms: planning for districts and schools

• Age Appropriate refers to the characteristics of the skills taught, the activities and materials selected, the assessment items used, and the language level employed; each should reflect the chronological age of the student.

• Alignment is the process of linking content and performance standards to assessment, instruction, and learning in classrooms.

• Disaggregation is the reporting of data for a particular student group (e.g., Asian American students or students receiving special education services) rather than for the population as a whole.

• Reliability is the degree to which the results of an assessment are dependable and consistently measure particular student knowledge and/or skills.

• Validity is the extent to which a test measures what it was designed to measure. Multiple types of validity exist.

Key Terms: administering assessments

• Accommodations are changes in the administration of an assessment, such as setting, scheduling, timing, presentation format, or response mode, including any combination of these, that do not change the construct intended to be measured by the assessment or the meaning of the resulting scores.

• Documentation is the process of keeping track of and preserving children’s work as evidence of their progress.

• Modifications are changes made to the test itself, such as a reduced number of distractors or fewer items.

• Standardization is a consistent set of procedures for designing, administering, and scoring an assessment.

Key Terms: data analysis

• Assessment System is the combination of multiple assessments into a comprehensive reporting format that produces credible, dependable information upon which important decisions can be made about students, schools, districts, or states.

• Baseline Data are the initial measures of performance against which future measures will be compared.

• Body of Evidence is the information or data that establish that a student can perform a particular skill or has mastered a specific content standard. The evidence must be either produced by the student or collected by someone who is knowledgeable about the student.

• A data-literate educator continuously, effectively, and ethically accesses, interprets, acts on, and communicates multiple types of data from various sources to improve outcomes for students, in a manner appropriate to the educator’s professional role and responsibilities.

• Measurement error refers to inconsistencies in scores across various “instances of measurement,” or across multiple assessments.

Large group activity

• Visit at least 3-5 stations around the room:

– How to choose, create, and identify quality assessments

– Types of assessments

– Useful state resources

– Barriers and solutions to increase assessment literacy

– Student Learning Objectives (SLOs)

• Jot down your key takeaways from each table

• Discuss with your neighbor

• Share with whole group

PreK-3rd grade teacher evaluation and SLOs

• Three components:

– Goal

– Target

– Assessment, rubric, or other measure of progress toward goal

• Other names: SGOs, TAS goals

Ensuring Purposeful and Systematic Assessment

• Planning how data will be used

• Who should have access to the data?

• In what decisions will the data play a role, and what do stakeholders need to know about them?

• Ideally, any assessment activity benefits children by providing information that can be used to inform their caregivers and teachers, to improve the quality of their care and educational environments, and to identify child risk factors that can be remedied.

Voices from the field

Major themes in Early Childhood Assessment:

• Purposeful Assessment

• Instructionally Aligned Assessment

• Beneficial Assessment

Challenges in Assessment:

• How do we account for developmental variability?

• What gets measured?

• How should we assess?

• Other challenges?

Case Study: Doreen Jones @ PS 167

• Spend 3-5 minutes reading and reflecting

• Meet with someone new to discuss the guiding questions

• Share back with the group

Developing an action plan

More resources, questions

Closing and Reflections

The principal goal of education in the schools should be creating men and women who are capable of doing new things, not simply repeating what other generations have done.

– Jean Piaget

FOCUS: Assessment for student learning

NOT Assessment as measurement