New England Common Assessment Program
Fall 2004 Pilot Test Administration Workshop
September-October 2004
Workshop Program
Welcome and introductions
1. Overview of the assessment program Questions
2. Description of the fall 2004 pilot test Questions
3. Pilot test logistics Questions
4. What’s Next Questions
An Emerging Vision
Cabot School, Vermont, Web Project Artwork
New England
Common Assessment Program
How Did We Get Here?
It began with No Child Left Behind –
but it became a shared vision of high standards and quality assessment.
Each state must assess students every year in each of grades 3-8, and one grade at high school beginning in the 2005-2006 academic year.
Each state is responsible for developing expectations for student achievement in mathematics and reading/language arts in each of grades 3-8.
To meet this challenge…
On January 8, 2002, No Child Left Behind was signed into law.
The New England Compact was instituted in 2002 by the Commissioners of Education of Maine, New Hampshire, Rhode Island, and Vermont.
The New England Compact provides a forum for the states to explore ideas, build a collective knowledge base, and establish cross-state activities that benefit each state.
From this collaboration emerged the New England Common Assessment Program
www.necompact.org/
What are the Advantages of Collaboration?
Developing a customized test at off-the-shelf prices
Expanded resources and improved quality:
Teacher involvement x 3
Test coordinators x 3
Content experts x 3
Technical Advisory Committee x 3
Bias review x 3
Commitment to and experience with item development and review x 3
Key Challenges in the Design of the New England Common Assessments
Create a common set of Grade Level Expectations that fairly and validly represent the standards of all three states
Reach agreement and shared vision on how to measure the GLEs
Allow schools, districts and the states to maintain unique approaches to curriculum and instruction
Develop common test standards and cut points that will work in each state’s unique accountability system
Provide accessibility to the assessment for the maximum number of students possible
Emerging Principles
Based on a year and a half of work with Grade Level Expectation Teams, Content Teams, Item Review Teams, and Bias Review Committees, we believe a large-scale common assessment can and should:
Be linked to state and local content standards
Provide information valued at the classroom level by teachers who use this data to change instruction - in other words, be INSTRUCTIONALLY RELEVANT
Support the continuum of assessment from classroom to state levels
Meet tough standards of reliability and validity
Be maximally accessible
Overview of Test Design
Who?
The assessment includes public school students in grades 3-8 in New Hampshire, Rhode Island, and Vermont.
Through explicit planning during test construction and the use of accommodations, the tests will be accessible to all but a very few students.
The common assessment does NOT include each state’s high school assessment, science assessment, alternate assessment or English language proficiency assessment programs.
Overview of Test Design: What?
The content, skills, and depth of knowledge contained in the Grade Level Expectations (GLEs) developed jointly by the three states expressly for this assessment program.
Reading and Mathematics tests at grades 3 through 8. Writing tests at grades 5 and 8.
At each grade level, the tests will measure end-of-grade GLEs for the previous year.
Each test will be designed to measure a range of student achievement across four performance levels.
Overview of Test Design: When?
A “full-scale” pilot test will be administered October 26, 27, and 28, 2004 for Reading and Mathematics, and in January 2005 for Writing.
Tests will be administered in the fall rather than the spring.
Operational testing will begin in October 2005. Testing will occur during a 3-week window at the beginning of October.
Overview of Test Design: Why fall testing?
Assessment results will be returned in December (January 2006 in the initial year), followed later by accountability results.
Allows time for interpretation and use of the assessment results for curriculum and instruction improvement during the spring and summer
Allows us to get the results back to the teacher who gave the test
Minimizes impact on instructional time
Allows us to truly test end-of-grade standards
Provides measurement of long-term learning
Improves compliance with NCLB accountability requirements
WHAT ARE TEST SPECIFICATIONS?
Type of items: multiple-choice, extended response, etc.
Length of test: hours and sessions
Number of test items and points
Distribution of emphasis
Depth of knowledge
Overview of Test Specifications
Each test will include a variety of item types:
Multiple-choice
Constructed-response
Short answer (mathematics and writing)
Extended writing (writing)
Form Follows Function
The New England Common Assessment uses a mixed common and matrix design.
Common items: items that are the same for ALL students. Scores are based on these items.
Matrix items: items used for equating and field testing. Each form is different. These items do not “count” toward student scores.
Overview of Test Specifications
What is Depth of Knowledge?
Levels are focused on the complexity of the item, not on how different students interact with the item
Descriptors in each discipline guide item development and classification
Levels help define the upper limits and range of items that are “fair game” for an assessment of a given GLE
Overview of Test Specifications
Depth of Knowledge Levels
Level 1 Recall Level 2 Skill/Concept Level 3 Strategic Thinking Level 4 Extended Thinking
From the work of Norman Webb
Overview of Test Specifications
Example: Depth of Knowledge Applied to Math
Level 1 involves recall, or the use of a procedure, solving an equation, or applying an algorithm or formula.
Level 2 involves more than one step, demonstrating conceptual understanding through models and explanations, classifying information, and interpreting data from a simple graph.
Overview of Test Specifications
Math Example (continued)
Level 3 involves reasoning, planning, or using evidence.
Level 4 requires complex reasoning, planning, and thinking over extended periods of time. In mathematics, Level 4 Depth of Knowledge will not be assessed on the state grade-level assessments.
Overview of Test Specifications
MATHEMATICS TEST SPECIFICATIONS:
ITEM TYPES:
Multiple-choice (1 point)
Short answer (1 point)
Short answer (2 points)
Constructed response (4 points)
NUMBER OF POINTS: 66 total
Multiple-choice (32 points)
Short answer (18 points)
Constructed response (16 points)
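As a quick sanity check, the three published point subtotals should account for the full 66-point test. A minimal sketch in Python, using only the subtotals stated above (the slides do not give the exact mix of 1- and 2-point short-answer items, so item counts are not assumed):

```python
# Published point subtotals for the NECAP mathematics test.
subtotals = {
    "multiple-choice": 32,       # one-point items
    "short answer": 18,          # mix of one- and two-point items
    "constructed response": 16,  # four-point items
}

total = sum(subtotals.values())
print(total)  # 66, matching the stated test total
```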
MATHEMATICS TEST SPECIFICATIONS (cont.)
Test Length: three testing sessions of approximately 1 hour each.
Depth of Knowledge: Levels 1-3 are measured on the assessment.
Distribution of Emphasis for Mathematics Assessment:
2(3) indicates end of grade 2 tested beginning of grade 3
Approximate Distribution of Emphasis

Content Cluster                   2(3)   3(4)   4(5)   5(6)   6(7)   7(8)
Number and Operations             55%    50%    50%    45%    30%    20%
Geometry and Measurement          15%    20%    20%    25%    25%    25%
Algebra and Functions             15%    15%    15%    15%    30%    40%
Data, Statistics, & Probability   15%    15%    15%    15%    15%    15%
Total                             100%   100%   100%   100%   100%   100%
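Each grade column of the emphasis distribution should sum to 100%. A small check of the table values:

```python
# Approximate distribution of emphasis by content cluster.
# Columns run from 2(3) through 7(8): end-of-grade GLEs
# tested at the beginning of the next grade.
emphasis = {
    "Number and Operations":           [55, 50, 50, 45, 30, 20],
    "Geometry and Measurement":        [15, 20, 20, 25, 25, 25],
    "Algebra and Functions":           [15, 15, 15, 15, 30, 40],
    "Data, Statistics, & Probability": [15, 15, 15, 15, 15, 15],
}

# Sum each grade column across the four content clusters.
column_totals = [sum(col) for col in zip(*emphasis.values())]
print(column_totals)  # [100, 100, 100, 100, 100, 100]
```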
READING TEST SPECIFICATIONS:
ITEM TYPES:
2 long passages, each with 8 multiple-choice items and 2 constructed-response items
2 short passages, each with 4 multiple-choice items and 1 constructed-response item
4 stand-alone multiple-choice items
READING TEST SPECIFICATIONS (cont.):
NUMBER OF POINTS: 52 total
Multiple-choice (28 points)
Constructed response (24 points)
Test Length: three sessions of about 1 hour each.
Depth of Knowledge: Levels 1-3
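The reading point totals follow from the item counts, assuming (consistent with the stated totals, though not spelled out in the slides) that each multiple-choice item is worth 1 point and each constructed-response item 4 points:

```python
# Item counts per the reading test specifications.
LONG_PASSAGES, SHORT_PASSAGES, STANDALONE_MC = 2, 2, 4
MC_PER_LONG, CR_PER_LONG = 8, 2
MC_PER_SHORT, CR_PER_SHORT = 4, 1

# Assumed point values: 1 per multiple-choice item,
# 4 per constructed-response item.
mc_points = (LONG_PASSAGES * MC_PER_LONG
             + SHORT_PASSAGES * MC_PER_SHORT
             + STANDALONE_MC) * 1
cr_points = (LONG_PASSAGES * CR_PER_LONG
             + SHORT_PASSAGES * CR_PER_SHORT) * 4

print(mc_points, cr_points, mc_points + cr_points)  # 28 24 52
```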
Distribution of Emphasis for Reading Assessment
(values are the balance of representation; each column indicates end-of-grade GLEs tested at the beginning of the next grade)

Reading Content Cluster                      2/Begin 3  3/Begin 4  4/Begin 5  5/Begin 6  6/Begin 7  7/Begin 8
Word Identification                          20%        15%        -          -          -          -
Vocabulary                                   20%        20%        20%        20%        20%        20%
Initial Understanding, Literary              20%        20%        20%        20%        15%        15%
Initial Understanding, Informational         20%        20%        20%        20%        20%        20%
Analysis and Interpretation, Literary        10%        15%        20%        20%        25%        25%
Analysis and Interpretation, Informational   10%        10%        20%        20%        20%        20%
TOTALS                                       100%       100%       100%       100%       100%       100%
Overview of Test Design: Improved Accessibility through Universal Design
Inclusive assessment population
Precisely defined constructs
Accessible, non-biased items
Amenable to accommodations
Simple, clear, and intuitive instructions and procedures
Maximum readability and comprehensibility
Maximum legibility
2003-2005 Assessment Development Timeline

Activity                                        When                 Who
Develop GLEs and Test Specs                     1/03 - 9/03          DOEs, Committees, NCIEA
Issue RFP and Select an Assessment Contractor   10/03 - 1/04         DOEs
Project Planning                                1/04 - 2/04          DOEs, MP
Preliminary Item Development                    2/04 - 3/04          MP, DOEs
First Item Review Committee Meetings            3/1/04               Committees, DOEs, MP
Reading/Writing Preliminary Passage Selection   3/04                 MP, DOEs
Bias/Sensitivity Committee Review of Passages   3/04                 DOEs, Committee, MP
Second Item Review Committee Meetings           4/27/04 - 4/28/04    Committees, DOEs, MP, NCIEA
Item Development                                5/04 - 6/04          MP, DOEs
Third Item Review Committee Meetings            7/6/04 - 7/9/04      Committees, DOEs, MP, NCIEA
Bias/Sensitivity Committee Review Meeting       7/14/04 - 7/15/04    Committee, DOEs, MP
Final Item Review & Selection                   7/04 - 8/04          DOEs, MP
Field Test Forms Construction/Production        7/04 - 9/04          DOEs, MP
Ship Materials to Schools                       By 10/20             MP
Reading & Math Field Test Administration        10/26/04 - 10/28/04  All Schools, DOEs, MP
Writing Field Test Administration               January 2005         Schools, DOEs, MP
Fall 2004 NECAP Pilot Testing: Purpose
Provide an opportunity to field-test all of the items
Further refine the item sets
Guide selection of items for operational tests
Try out all the planned testing procedures, manuals, shipping/receiving, etc.
Give schools an opportunity to experience the assessment prior to the 1st operational administration in October 2005
Give schools an opportunity to provide feedback (via student and teacher questionnaires)
Fall 2004 NECAP Pilot Testing: When?
Reading and math will be piloted on October 26, 27, and 28, 2004
Writing will be piloted in January 2005
Pilot testing materials will be shipped via UPS on October 15th and delivered to schools by October 20th
Completed tests and materials should be packed for UPS pick-up by 8:00 AM on Monday, November 1st. Schools DO NOT need to contact UPS
Fall 2004 NECAP Pilot Testing: Who?
All public schools in NH, RI, and VT with any of the grades 3 through 8 will participate
Each grade in a school will be assigned one content-area test
Schools selected to pilot the writing test WILL NOT administer a test at grades 5 and 8 in October
Fall 2004 NECAP Pilot Testing: Who?
Because the results of the pilot will be used to judge the accessibility of the assessment for all students, it is important to include as many students as possible in the NECAP Pilot Assessment, including students with disabilities and English Language Learners
Fall 2004 NECAP Pilot Testing: Who?
All students enrolled in grades 3 through 8 as of October 26 must participate, with the following exceptions:
Students who would normally participate in an alternate assessment
English Language Learners enrolled in a US school for less than one year
Students who are absent for the entire pilot testing window
Students whose extraordinary personal circumstances prevent them from participating
DO NOT contact the DOE to report or request exceptions
Fall 2004 NECAP Pilot Testing: Who?
Students who are Blind or Visually Impaired:
Large print forms WILL be available for the pilot
Braille forms WILL NOT be available for the pilot
Students who need accommodations:
Approved Accommodations list is available in the Principal/Test Coordinator Manual
Available to all students based on individual need
Informal decision by educational team, consistent with past practice or current needs
Fall 2004 NECAP Pilot Testing: What?
“We have a field trip scheduled for October 26th. What should we do?”
You may administer two sessions of the pilot test on the same day, one in the AM and one in the PM
Make sure to give the sessions in the correct order
DO NOT administer any sessions prior to October 26th
Fall 2004 NECAP Pilot Testing: What?
What should we do if students can’t finish in the stipulated time?
Testing times are estimated. Allow students who are working productively to continue, up to 100% additional time.
Because testing times are estimated, don’t schedule testing right before lunch, recess, or dismissal
Fall 2004 NECAP Pilot Testing: What?
What should we do if a student is absent?
Make-up sessions should be scheduled only when an entire class misses a scheduled pilot testing session.
In the event of individual student absences, it is not necessary to schedule a make-up session.
DO NOT administer any sessions prior to October 26th
Fall 2004 NECAP Pilot Testing: What?
What scores will we receive from the pilot tests?
No school or student scores will be generated from pilot testing
The purpose of the pilot is to gather information about the items and administration procedures
We are testing the test, not the students
The purpose of this workshop is to ensure:
That everyone understands the administration procedures for the pilot tests
That NECAP is administered in a comparable way in all locations across the three states
That the information collected is of high quality
That quality control is maintained in returning materials
Important Contact Information

Amanda Smith, NECAP Program Assistant
Phone: 1-603-749-9102 ext. 2259
E-mail: [email protected]

Monica Frattaroli, NECAP Program Manager
Phone: 1-603-749-9102 ext. 2162
E-mail: [email protected]

Harold Stephens, NECAP Program Director
Phone: 1-603-749-9102 ext. 2235
E-mail: [email protected]

Timothy Crockett, Assistant Vice President
Phone: 1-603-749-9102 ext. 2106
E-mail: [email protected]

Measured Progress Service Center
1-877-632-7774
Important Dates
October 20: Receive/inventory materials
October 26-28: Pilot test administration (Reading and Mathematics)
November 1: UPS pickup of test materials for return to Measured Progress (materials must be ready at 8 AM)
Checklist for Principals and Test Coordinators (ii)
Before Testing During Testing After Testing
Test Coordinator’s Responsibilities (4)
Primary responsibilities are to:
Serve as contact person with Measured Progress
Coordinate all test-related activities
Prepare teachers for administration
Oversee the inventory, distribution, collection, and return of all test materials
Ensure test security and compliance with administration procedures
Test Security (5)
All test items and responses to those items in the NECAP are secure material and may not be copied or duplicated in any way or retained in the school after testing is completed.
Test Security (5)
Breaches in Test Security
Any concern about breaches in test security must be reported immediately to the test coordinator and/or principal.
The test coordinator and/or principal is responsible for immediately reporting the concern to the state director of assessment at the department of education.
Before Testing (5)
Preparation for Test Administration (5)
Read the Principal/Test Coordinator and Test Administrator Manuals
Student Participation and Accommodations Who Should Be Tested Determining How Students Will Participate
Using Accommodations Other Accommodations Document Accommodations
Preparation for Test Administration (6)
Scheduling Test Sessions Prior to Test Administration
Designating Test Administrators
Briefing Test Administrators
Test Materials (10)
Inventory Test Materials
Quantities of Test Booklets and Response Booklets
Quantities of Other Materials
Ordering Additional Materials
Storing Test Materials
Test Materials (13)
School Materials
Providing Necessary Equipment and Materials
Equipment and Materials Prohibited During Test Administration
Summary of Test Materials
During Testing (14)
Test Administration (14)
Distributing Test Materials Monitoring Test Administration
After Testing (15)
Preparation of Test Materials for Return (15)
Collecting Materials after Testing
Completing Student Information on Student Response Booklets
Special Education
Completing the Principal’s Certification of Proper Test Administration Form
Return of Materials
QUESTIONS & ANSWERS
BUILDING THE TEST: What’s Next?
Review data from pilot tests
Construct forms for October 2005 testing
Develop and revise additional items
Develop a practice test in reading, writing, and mathematics
Develop additional support materials
Design report formats
Develop scoring rubrics
Set standards after first administration
Finalize details of fall testing
Make accountability decisions
Questions, Comments, Suggestions:
Tim Kurtz, Director of Assessment
NH Department of Education
(603) 271-3846

Mary Ann Snider, Director of Assessment and Accountability
Rhode Island Department of Elementary and Secondary Education
(401) 222-4600 ext. 2100

Michael Hock, Director of Educational Assessment
Vermont Department of Education
(802) 828-3115