Dr. Deborah A. Brady
Ribas Associates, Inc.
Please create a name tag or a “name tent” with your first name and school or department.
Read the Table of Contents on page 1. Respond to the Do Now on page 2 of your handout.
*The materials are online if you want to follow along and add notes at http://tinyurl.com/l7287z9
PLANNING
By the end of this session, participants will:
1. Understand the timeline, expectations, and implications of District Determined Measures for all Marshfield educators.
2. Leave with a year-long plan for developing their department's, school's, or team's DDMs for this year's pilot and next year's full implementation.
3. Have the tools and resources to begin the process of developing and implementing at least one DDM.
Timeline revisions (by school year):

ORIGINAL
• SY 2014: Collect data (year 1)
• SY 2015: Collect data; issue Student Impact Ratings (year 2)
• SY 2016: Collect data; issue Student Impact Ratings (year 3 and on)

APRIL Revision
• SY 2014: Pilot 5 DDMs; research others; due in February: a district plan for assessing all teachers
• SY 2015: Collect data for all teachers (year 1)
• SY 2016: Collect data; issue Student Impact Ratings (year 2)

July 19 Revision
• SY 2014: Pilot 5 DDMs; December: Implementation Extension Request Form; due in June: final plan for assessing all teachers with at least 2 DDMs
• SY 2015: Possible one-year extensions of implementation of specific grade/course/subject DDMs, as long as they are based on the final plan and data collection continues; collect data for all teachers
• SY 2016: Issue Student Impact Ratings for all except waived grades/courses/subjects; this is their first year
DESE is still rolling out the evaluation process and District Determined Measures:
Sample DDMs in the five required pilot areas (last Friday).
Technical Assistance and Networking sessions on September 19th across the state
Technical Guide B (in this PowerPoint) addresses the practical application of assessment concepts to piloting potential DDMs and measuring student growth.
Model collective bargaining language will be available.
An ongoing Assessment Literacy webinar series continues.
Guidance on constructing local growth scores and growth models will be released
Guidance on determining Student Impact Rating will be released.
(A work in progress with changes along the way)
Additional Model Curriculum Units, which include curriculum-embedded performance assessments (CEPAs)
Guidance on the use of CEPAs as part of a DDM strategy.
Professional development for evaluators on how to focus on shifts embedded in the new ELA and math Curriculum Frameworks during classroom observations.
Professional development for evaluators on how to administer and score DDMs and use them to determine high, moderate or low growth, focused on the five required DDM pilot areas.
A Curriculum Summit in November
Take advantage of a no-stakes pilot year to try out new measures and introduce educators to this new dimension of the evaluation framework.
Districts are strongly encouraged to expand their pilots beyond the five required pilot areas.
Fold assessment literacy into the district's professional development plan to stimulate dialogue among educators about the comparative benefits of different potential DDMs the district could pilot.
Consider how contributing to the development or piloting of potential DDMs can be folded into educators' professional practice goals.
From the Commissioner:
“Finally, let common sense prevail when considering the scope of your pilots. I recommend that, to the extent practicable, districts pilot each potential DDM in at least one class in each school in the district where the appropriate grade/subject or course is taught. There is likely to be considerable educator interest in piloting potential DDMs in a no-stakes environment before year 1 data collection commences, so bear that in mind when determining scope.”
PILOT YEAR SY2014
SEPTEMBER: provide DESE a tentative plan for:
Early grade literacy (K-3)
Early grade math (K-3)
Middle grade math (5-8)
High school “writing to text” (PARCC multiple texts)
PLUS one more non-tested course, for example: Fine Arts, Music, PE/Health, Technology, Media/Library, or other non-MCAS growth courses including grade 10 Math and ELA, and Science
DECEMBER—Implementation Extension Request Form for specific courses in the JUNE PLAN
BY JUNE: the plan for all other DDMs must be ready for implementation in year 2 (SY2015), with at least two measures per educator, at least one of them “local” (non-MCAS).
2014: The scores will not count for those who pilot DDMs in 2014.
SY 2015: All professional personnel will be assessed with 2 DDMs, at least one of which will be locally determined and one of which will be MCAS growth scores, when available: all teachers, guidance counselors, principals, assistant principals, speech therapists, school psychologists, and nurses, EXCEPT those waivered by DESE based on a case-by-case decision process.
The scores will count as the first half of the “impact score,” with the waivered courses as the only exception.
SY2016
“Impact Ratings” will be given to all licensed educational personnel and sent to DESE.
• Two measures for each educator
• At least one locally determined measure for everyone
• Some educators will have two locally determined measures
• The locally determined measure can be a standardized test such as the DRA, MAP, Galileo, etc.
• The MCAS can be only one measure
• The rating uses the average of two years' scores and a two-year trend of those two scores
“Impact Ratings” are based upon two years' growth scores for two different assessments, at least one of them a non-MCAS score that is locally determined.
Every educator earns two ratings:
Summative Performance Rating: Exemplary, Proficient, Needs Improvement, Unsatisfactory
Impact Rating on Student Performance: High, Moderate, Low
*Most districts will not begin issuing Impact Ratings before the 2015-16 school year.
Impact Rating on Student Performance
MCAS can serve as one score (for ELA, Math, Science)
One or two locally developed assessments; some educators may have three
DESE Exemplars for the required piloted areas will be available in August 2013
The MA Model Units Rubrics can be used
Galileo; BERS-2 (Behavioral Rating Scales); DRA (Reading); Fountas and Pinnell Benchmark; DIBELS (Fluency); ???; MCAS-Alt; MAP; AP
On Demand (timed and standardized)
Mid-Year and End-of-Year exams
Projects Portfolios Capstone Courses Unit tests Other
Formats can include: multiple choice, constructed response, performance (oral, written, acted out)
Use school-wide growth measures: use MCAS growth measures and extend them to all educators in a school
Use “indirect measures” such as dropout rates, attendance, etc.
Use Student Learning Objectives (SLOs) or team-based SLOs, or create measures. A pre- and post-test are generally required to measure growth, except with normed assessments.
Example MCAS results (4503699): 244 / 25 SGP; 230 / 35 SGP; 225 / 92 SGP
GROWTH SCORES for educators will need to be tabulated for all locally developed assessments, as the MCAS SGP already provides.
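To make that tabulation concrete, here is a minimal sketch, in Python, of one way a district might compute SGP-like growth scores from a locally developed pre/post assessment. It is illustrative only, not DESE's prescribed method; the 35/65 cut points, function names, and sample scores are all assumptions for the example.

```python
# Illustrative sketch only: one way to tabulate growth on a locally developed
# pre/post assessment in an SGP-like fashion. Each student's gain is converted
# to a percentile rank among all students who took the assessment, and an
# educator's result is summarized by the median of those percentiles.
# The 35/65 cut points and the sample scores are hypothetical, not DESE policy.
from statistics import median

def growth_percentiles(pre, post):
    """Percentile rank (0-100) of each student's gain among all gains."""
    gains = [p2 - p1 for p1, p2 in zip(pre, post)]
    n = len(gains)
    if n < 2:
        return [50.0] * n
    return [100 * sum(other < g for other in gains) / (n - 1) for g in gains]

def educator_growth_category(percentiles, low_cut=35, high_cut=65):
    """Classify median student growth as low, moderate, or high."""
    med = median(percentiles)
    if med < low_cut:
        return "low", med
    if med > high_cut:
        return "high", med
    return "moderate", med

# Hypothetical pre- and post-test scores for one educator's students
pre_scores = [12, 18, 15, 20, 10, 16]
post_scores = [20, 22, 25, 24, 19, 18]

percentiles = growth_percentiles(pre_scores, post_scores)
print(educator_growth_category(percentiles))  # ('moderate', 40.0) for this sample
```

The design simply mirrors the MCAS SGP idea at a much smaller scale: growth is judged relative to peers who took the same local assessment, and an educator's roster is summarized by a median percentile rather than by raw gains.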
Is the measure aligned to content?
Is the measure informative?
Is the measure aligned to content? Does it assess what is most important for students to learn and be able to do?
Does it assess what the educators intend to teach?
Is the measure informative? Do the results of the measure inform educators about curriculum, instruction, and practice?
Does it provide valuable information to educators about their students?
Does it provide valuable information to schools and districts about their educators?
1. Measure growth
2. Employ a common administration procedure
3. Use a common scoring process
4. Translate these assessments to an Impact Rating
5. Assure comparability of assessments (rigor, validity).
Comparable within a grade, subject, or course across schools within a district: identical measures are recommended.
Comparable across grade or subject level district-wide: Impact Ratings should have a consistent meaning across educators; therefore, DDMs should not have significantly different levels of rigor.
Pre-Test/Post-Test, Repeated Measures, Holistic Evaluation, Post-Test Only
Description: The same or similar assessments administered at the beginning and at the end of the course or year
Example: Grade 10 ELA writing assessment aligned to College and Career Readiness Standards at beginning and end of year
Measuring Growth: Difference between pre- and post-test.
Considerations: Do all students have an equal chance of demonstrating growth?
Description: Multiple assessments given throughout the year.
Example: running records, attendance, mile run
Measuring Growth: graphically, ranging from the sophisticated to the simple
Considerations: less pressure on each administration; authentic tasks
[Sample graph: # of errors plotted by date of administration]
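As a concrete illustration of graphing a repeated measure, the short sketch below plots running-record errors by date of administration with a simple linear trend line, so a declining error count reads as growth. The dates and error counts are hypothetical sample values, and the sketch assumes matplotlib and numpy are available.

```python
# Minimal, illustrative plot of a repeated measure (running-record errors)
# by date of administration, with a simple linear trend line.
from datetime import date
import matplotlib.dates as mdates
import matplotlib.pyplot as plt
import numpy as np

dates = [date(2013, 10, 1), date(2013, 11, 1), date(2013, 12, 1),
         date(2014, 2, 1), date(2014, 4, 1), date(2014, 6, 1)]
errors = [14, 12, 11, 8, 6, 4]

x = mdates.date2num(dates)                   # numeric dates for the trend fit
slope, intercept = np.polyfit(x, errors, 1)  # least-squares linear trend

fig, ax = plt.subplots()
ax.plot(dates, errors, "o-", label="# of errors")
ax.plot(dates, slope * x + intercept, "--", label="trend")
ax.set_xlabel("Date of administration")
ax.set_ylabel("# of errors")
ax.legend()
fig.autofmt_xdate()
plt.show()
```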
Description: Assess growth across student work collected throughout the year.
Example: Tennessee Arts Growth Measure System
Measuring Growth: Growth Rubric (see example)
Considerations: Option for multifaceted performance assessments
Rating can be challenging & time consuming
Growth rubric for “Details” (levels 1-4):

Level 1: No improvement in the level of detail. One is true:
* No new details across versions
* New details are added, but not included in future versions
* A few new details are added that are not relevant, accurate, or meaningful

Level 2: Modest improvement in the level of detail. One is true:
* There are a few details included across all versions
* Many added details are included, but they are not included consistently, or none are improved or elaborated upon
* There are many added details, but several are not relevant, accurate, or meaningful

Level 3: Considerable improvement in the level of detail. All are true:
* There are many examples of added details across all versions
* There is at least one example of a detail that is improved or elaborated in future versions
* Details are consistently included in future versions
* The added details reflect relevant and meaningful additions

Level 4: Outstanding improvement in the level of detail. All are true:
* On average, there are multiple details added across every version
* There are multiple examples of details that build and elaborate on previous versions
* The added details reflect the most relevant and meaningful additions
Example taken from Austin, a first grader from Anser Charter School in Boise, Idaho. Used with permission from Expeditionary Learning. Learn more about this and other examples at http://elschools.org/student-work/butterfly-drafts
Description: A single assessment or data that is paired with other information
Example: AP exam
Measuring Growth (where possible): use a baseline, or assume an equal beginning.
Considerations: may be the only option for some indirect measures; what is the quality of the baseline information?
Portfolios: measuring achievement vs. growth
Unit Assessments: looking at growth across a series
Capstone Projects: may be a very strong measure of achievement
Piloting District Determined Measures
Piloting: Test, Analyze, Adjust, Repeat
Being strategic and deliberate: Collaboration, Iteration, Information
1. Prepare to pilot: build your team; identify the content to assess; identify the measure (aligned to content, informative); decide how to administer and score
2. Test: administer and score
3. Analyze
4. Adjust
Is the measure fair to special education students?
Are the variations in scores due to the rater?
Is growth equal across the scale?
Each DDM should have:
1. Directions for administering
2. Student directions
3. Instrument (the assessment)
4. Scoring method
5. Scoring directions
Existing: ESE staff; Part VII of the Model System; Technical Guide A; Assessment Quality Checklist and Tracking Tool; Assessment Literacy Webinar Series; materials from Technical Assistance sessions; Commissioner's Memorandum; Technical Guide B
What's Coming: Exemplar DDMs (August 30th); other supporting materials
See page 5-6 of Handout for DESE recommendations
Table or Partner Talk
Pages 5-6 in Handout
Some options: Writing to text 9-12? K-12? (NEASC); Research K-12?; Specialist coordination opportunities; Support for Art, Music, PE, Health; Math with one focus K-12? (e.g., fractions); Are there present assessments that might be modified slightly?
Consider all of the options, concerns, initiatives, possibilities as you look at what the next step for your school and district should look like.
Be ready to share this very basic “first think” on DDMs.
After this, you will be given tools that support assessing the quality, rigor, and alignment of tasks and curricula.
Page 2 (The DO Now)
Process with a partner. Why might Elmore’s idea be germane to your planning? What can educators learn from DDMs?
http://edworkspartners.org/expect-success/2012/09/21st-century-aligned-assessments-identify-develop-and-practice-2/
Tools to assess Alignment
Tools to assess Rigor
Tools to assess the quality of student work
Quality Tracking Tool
Assess the Quality of your inventory of assessments
Also use Lexicon of Quality Tracking Tool Terms (in packet)
On DESE website
http://www.doe.mass.edu/edeval/ddm/
Educator Alignment Tool (temporarily removed from the DOE web site): an interactive database for all educators and the possible assessments that could be used for each.
Checklist Tracker
Alignment: alignment to the Common Core, PARCC, and the District Curriculum
Shifts for the Common Core have been made: complex texts; multiple texts; argument, informational, and narrative writing; math practices; depth over breadth
Rigor
For Assessing Rigor and Alignment
1. Daggett's Rigor/Relevance Scale
2. DESE's Model Curriculum (Understanding by Design)
3. DESE's Model Curriculum Rubrics (a destination)
4. PARCC's Task Description
5. PARCC's Rubrics for writing
6. Protocols for Calibration (to use with teacher groups)
7. Writing to Text Wikispace: http://tinyurl.com/l7287z9
“Task Complexity Continuum” (1 to 5): 1. MCAS ORQ (ELA); 2. MCAS Composition / Math ORQ; 3. PARCC multiple texts; 4. CC-aligned classrooms; 5. Authentic tasks (simple/complex)
School | MEPID | Last Name | First Name | Grade | Subject | Course | Course ID | Potential DDM1 | Potential DDM2 | Potential DDM3
HS 07350 Smith Abagail 10 ELA Grade 10 ELA 01051 MCAS ELA10
HS 07351 Smith Abagail 9 ELA World Studies 01058 Writing to text
HS 07351 Smith Abagail 9 ELA Grade 9 ELA 01051
HS 07352 Smith Brent 10 Math IMM 2
HS 07352 Smith Brent 10 Math IMM 1 MCAS MATH10
HS 07353 Smith Cathy 11 Social Science World Studies 04053
HS 07354 Smith Deb 12 Engineering Engineering
HS 07355 Smith Emily 12 Science Science
HS 07356 Smith Frank k-4 PE PE
HS 0736 Smith Gus K Math Math
ELEM 07357 Smith Gus K ELA ELA
ELEM 07358 Smith Heidi 1 Math Math 1
ELEM 07358 Smith Jane 4 All Math MCAS Math 4
ELEM 07359 Smith Jane 4 All ELA MCAS ELA 4
ELEM 07360 Smith Karen 3 Technology Technology 3
ELEM 07361 Smith Linda 4 Science Science 4
ELEM 07362 Smith Mary 5 PE PE 5
MS 07363 Smith Nora 6 Math Math 6
MS 07364 Smith Oscar 6 ELA ELA 6 MCAS ELA 6
MS 07365 Smith Pam 6 Social Science Social Science 6
MS 07366 Smith Quentin 6 Science Science 6
MS 07367 Smith Rob 7 Math Math 7 MCAS MATH 7
MS 07368 Smith Sandy 7 ELA ELA 7 MCAS ELA 7
MS 07369 Smith Tim 7 Social Science Social Science 7
The Next Step? The 2011 MA Frameworks shift to the Common Core: complex texts, complex tasks, multiple texts, increased writing.
A Giant Step? An increase in cognitive load; Mass Model Units (PBL with performance-based assessments, CEPAs); PARCC assessments require matching multiple texts.
The PARCC Research Assessments
Two tasks: Literary Simulation and Research Simulation (either assessment may be given).

Grade Bands
• Literary Simulation: 3-5 (E), 6-8 (MS), 9-11 (HS)
• Research Simulation: 3-5 (E), 6-8 (MS), 9-11 (HS)

Duration
• Both: 2 days, mid-year; computer-based (paper option in the first years)

Texts
• Literary Simulation: E, MS, HS: one extended and one short literary text (digital also); HS: the extended text could be literary non-fiction
• Research Simulation: a “suite” of interrelated texts (E: 250-800 words; MS: 400-1000 words; HS: 500-1500 words); E, MS, HS: one extended and three shorter informational texts (digital also)

Comprehension Questions (note the higher level of challenge in PARCC)
• Literary Simulation: E: some multiple-choice comprehension questions; MS, HS: 4-6 multiple-choice comprehension questions
• Research Simulation: E, MS, HS: 6-9 multiple-choice comprehension questions

Writing Task 1
• Literary Simulation: E, MS, HS: narrative about one selection
• Research Simulation: E: summary of one text; grade 6: summary without opinions or judgments; grades 7, 8, and HS: objective summary of one text

Writing Task 2
• Literary Simulation: E, MS, HS: analytic essay on one or both texts
• Research Simulation: E, MS, HS: analytic essay on the anchor text plus three other shorter texts (may include video, podcast, photo, etc.)
Students carefully consider two literary texts worthy of close study.
They are asked to answer a few new and more sophisticated multiple-choice questions about each text to demonstrate their ability to do close analytic reading and to compare and synthesize ideas.
(e.g., Which of the six claims from question 1 can be made after reading paragraphs 3 and 11? There may be more than one answer.)
Students write a literary analysis about the two texts.
Use what you have learned from reading “Daedalus and Icarus” by Ovid and “To a Friend Whose Work Has Come to Triumph” by Anne Sexton to write an essay that provides an analysis of how Sexton transforms Daedalus and Icarus.
As a starting point, you may want to consider what is emphasized, absent, or different in the two texts, but feel free to develop your own focus for analysis.
Develop your essay by providing textual evidence from both texts. Be sure to follow the conventions of standard English.
Thus, comprehension of the two texts and the author's craft are being assessed, along with the student's ability to craft a clear argument with substantiation from both texts.
Creating Curriculum Units That Support (or Surpass) the Cognitive Complexity of PARCC

Interrelated Texts on a Clearly Defined and Refined Focus
• Literature: one central anchor text plus other clearly related texts
• Informational: one central anchor text plus other clearly related texts

Step 1: Scaffolds (Socratic Seminars, Get the Gist, Interactive Notebooks, simpler exemplars, graphic organizers)
• Literature: read the anchor text for clear comprehension: literal comprehension; literary techniques; author's craft (why did the author use these techniques to create his message?)
• Informational: read the anchor text for clear comprehension: literal comprehension; structure and literary and rhetorical techniques; author's craft (why did the author use these techniques to create his message?); relate the anchor text to the essential questions or theme

Step 2: Going Deeper (Cognitive Complexity)
• Literature: relate the anchor text to the essential questions or theme; read and comprehend additional texts and compare and contrast them to the anchor text as they relate to the essential questions or theme
• Informational: read and comprehend additional texts and compare and contrast them to the anchor text as they relate to the essential questions or theme

Step 3: Going Deeper
• Analyze, organize, cite claims; write an argument and present it (project, essay, museum, debate, etc.); reflect for metacognitive awareness of the whole process
State assessments
Massachusetts Model Assessments (PBL/Performance Assessments)
Quality Performance Assessments (Capstones; units)
New PARCC question and task prototypes http://www.parcconline.org/samples/item-task-prototypes
Writing to Text on Wikispaces
My collected resources; many address the “giant step” of pairing texts, an increased cognitive load, and PARCC’s standard as well as curriculum models. http://tinyurl.com/l7287z9
Scoring levels 1-6 for each criterion:

Topic development (the writing and artwork identify the habitat and provide details):
1: Little topic/idea development, organization, and/or details; little or no awareness of audience and/or task
2: Limited or weak topic/idea development, organization, and/or details; limited awareness of audience and/or task
3: Rudimentary topic/idea development and/or organization; basic supporting details; simplistic language
4: Moderate topic/idea development and organization; adequate, relevant details; some variety in language
5: Full topic/idea development; logical organization; strong details; appropriate use of language
6: Rich topic/idea development; careful and/or subtle organization; effective/rich use of language

Evidence and Content Accuracy (writing includes academic vocabulary and characteristics of the animal or habitat with details):
1: Little or no evidence is included and/or content is inaccurate
2: Use of evidence and content is limited or weak
3: Use of evidence and content is included but is basic and simplistic
4: Use of evidence and accurate content is relevant and adequate
5: Use of evidence and accurate content is logical and appropriate
6: A sophisticated selection and inclusion of evidence and accurate content contribute to an outstanding submission

Artwork (identifies special characteristics of the animal or habitat, to an appropriate level of detail):
1: Artwork does not contribute to the content of the exhibit
2: Artwork demonstrates a limited connection to the content (describing a habitat)
3: Artwork is basically connected to the content and contributes to the overall understanding
4: Artwork is connected to the content of the exhibit and contributes to its quality
5: Artwork contributes to the overall content of the exhibit and provides details
6: Artwork adds greatly to the content of the exhibit, providing new insights or understandings
Stage 1: Desired Results
ESTABLISHED GOALS (G): Standards from MA Frameworks
TRANSFER (T): Students will be able to independently use their learning to…
MEANING
UNDERSTANDINGS (U): Students will understand that…
ESSENTIAL QUESTIONS (Q)
ACQUISITION
Students will know… (K)
Students will be skilled at…
Stage 2: Evidence
Evaluative Criteria / Assessment Evidence: criteria for success (describe expectations captured in the rubric)
CURRICULUM EMBEDDED PERFORMANCE ASSESSMENT (PERFORMANCE TASKS) (PT)
OTHER EVIDENCE (OE)
Stage 3: Learning Plan
Summary of Key Learning Events and Instruction
Adapted from Understanding by Design 2.0 © 2011 Grant Wiggins and Jay McTighe. Used with permission, July 2012.
Developing rubrics
Developing exemplars
Calibrating scores
Looking at Student Work (LASW)
http://Nsfharmony.org/protocol/a_z.html
Sample for Developing Rubrics from an assessment
Rather than first focusing on the work's quality, these processes often ask teachers to suspend judgment and describe its qualities, bringing multiple perspectives to bear on what makes students tick and how a school can better reach them.
This protocol reduces the initial anxiety of competition, since everyone will have an example of each level.
New groups can use this as a beginning protocol to explore and develop shared expectations for student learning and performance.
Each teacher brings in two or three examples of high, medium, and low level work for a specific task, test, or prompt.
A simple protocol to monitor improvement:
Teachers identify patterns in student work
Teachers create an action plan based on the patterns
Or teachers develop rubrics as descriptors of levels of quality of student work
Looking at Student Work (LASW): High-Medium-Low (H-M-L) Protocol*
Purpose: Teachers often use this protocol as a beginning point to monitor improvement in student work. After each LASW session teachers identify patterns in student work and then form an action plan to help students improve in the area(s) identified. New groups can use this as a beginning protocol to explore and develop shared expectations for student learning and performance.

2 min: 1. Before looking at the student work: choose a facilitator, a timekeeper, and a recorder. (Process support: refer to role descriptions for individual responsibilities. Facilitator: note the time at which reflection must begin.)
10 min: 2. High-Medium-Low sorting: Without consulting other group members, each person sorts student work into High, Medium, and Low piles. After everyone has sorted the work, each member briefly shares general observations about student performance.
5 min: 3. Developing a rubric (or a general list of criteria): Each member writes general descriptors for each level: H, M, and L.
15 min: 4. Share descriptors and agree on a group rubric (or, less formally, a simple list of criteria that participants used to sort the work into piles). (Recorder: record the rubric or criteria on chart paper for your group.)
10 min: 5. Summarize your findings about student work: What was notable or surprising about the criteria your group used to sort the work? What was notable to you about the students' understanding? Individually reflect and write (5 minutes); record on chart paper (5 minutes).
15 min: 6. What are the next steps for teaching the students in the High, the Medium, and the Low groups? Develop an action plan for each level of student. Record on chart paper.
10 min: 7. Reflect on the protocol (record your ideas on the Reflection Matrix): What did you gain by using this protocol? In what ways did the structure of this protocol help you and your group understand student thinking? How could using this protocol to look at student work improve student learning, your classroom practice, and your work with peers? (Facilitator: make sure that all group members record their thoughts in the Reflection Matrix.)

*This protocol is an adaptation of the High-Medium-Low Protocol to be used in a workshop environment.
Timeline for Piloting
Timeline for Developing DDMs for all educators
Required for Piloting 2013-14
Selected Assessments worksheet
Columns: Grade | Name | Source | Type of Assessment | Item Type(s) | How long in use in District? | Why selected to pilot?
Example: HS | Writing to Text | District | On Demand | Constructed Resp. | New 2013 | Like PARCC Research Simulation
1. Early grade literacy (K-3)
2. Early grade math (K-3)
3. Middle grade math (5-8)
4. HS Writing to Text
5. Traditionally non-tested grade
Other Possible Pilots
Elementary
Middle School
High School
Indirect Measures
Specialists
Art
Music
September, October, November, December, January, February, March, April, May, June
Convene District Committee
Create School-Wide Committees
Begin work on piloted DDMs
Submit proposal for piloted DDMs by September 30
Explain DDMs and Growth to all educators
Continue to develop pre- and post-tests as grade-levels, departments, schools, teams
Administer the pre-test early to allow time to support students' success
Teach students and collaborate on research-based methods that support student growth
Administer the post-test
Score assessments
Assess low, average, and high growth of students
Assess quality of assessment (Quality Tool)
Develop DDMs for every educator to be implemented in SY2015
At least 2 per educator, one MCAS when available (4-8)
Waiver applications due
Continue refining DDM components based on Quality Tool components
Plan for 2015 administration for all sections of a course (unlike pilots)
Continue refinement of components and plan
DDM Final Plan due
On Page 3 of agenda
Quality Alignment Tool: alignment to content; alignment to rigor. If the assessment passes these criteria: then validity and reliability; then instructions, procedures for assessment, etc.
“Don’t let perfection get in the way of good.”
“We are all in this together”
“It will not be perfect”
“We will be making mistakes along the way.”
“We need your help to make the process better.”
Focus on Growth, not Gotcha
Form Joint Committees
Conversations are critically important
Communication is essential; using student growth as a measure of educator effectiveness can be unsettling
Joint meetings with union leaders and members and administration
Engage the School Committee
John Doherty