PAIU Curriculum Coordinators Meeting, November 2015: PVAAS Update

PVAAS Statewide Team for PDE (pdepvaas@iu13.org)

PVAAS Update Topics
- Transition of State Assessments/PSSA and PVAAS
- PVAAS Accounts and Access
- PVAAS and PEERS Reporting Releases
- Performance and Quintile Diagnostic Resource Document
- PVAAS Fall Professional Development
- ASA and AERA Statements
- Spring 2016 Roster Verification
- Questions/Feedback/Follow-up Needed?

Transition of State Assessments: Impact on PVAAS?

Transition of State Assessments/PSSA: PVAAS
Is the group of students (indicated by the yellow star) at the same RELATIVE position in the distribution of statewide scores from SY13-14 to SY14-15? Yes = Green on PVAAS.
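To make the "same relative position" idea concrete, here is a minimal sketch, not the actual PVAAS/EVAAS methodology, that compares a group's average percentile rank within the statewide distribution across the two years. All scores below are hypothetical.

```python
# Hypothetical statewide score distributions; the SY14-15 PSSA used a new,
# harder scale, so raw scores and proficiency dropped statewide.
state_1314 = [1050, 1120, 980, 1200, 1005, 1310, 890, 1150, 1075, 990]
state_1415 = [910, 1020, 860, 1130, 905, 1240, 790, 1060, 975, 880]

# The same (hypothetical) group of students tested in both years.
group_1314 = [1120, 1005, 1150]
group_1415 = [1020, 905, 1060]

def mean_percentile(group, statewide):
    """Average percentile rank of the group within the statewide distribution."""
    def pct(score):
        return 100.0 * sum(s <= score for s in statewide) / len(statewide)
    return sum(pct(score) for score in group) / len(group)

print(f"SY13-14 mean percentile: {mean_percentile(group_1314, state_1314):.1f}")
print(f"SY14-15 mean percentile: {mean_percentile(group_1415, state_1415):.1f}")
# If the group holds roughly the same relative position in the new distribution,
# growth is "met" (Green on PVAAS), even though raw scores fell on the new test.
```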

Transition of State Assessments/PSSA: PVAAS - Pennsylvania
Distribution of school value-added levels over the past two years, SY13-14 and SY14-15, for PSSA Math in grades four through eight. Despite the changes in proficiency over the past two years, the distribution of PVAAS color indicators has remained fairly constant (Red is the lowest and Dark Blue is the highest).

Transition of State Assessments/PSSA: PVAAS - Pennsylvania
Distribution of school value-added levels over the past two years, SY13-14 and SY14-15, for PSSA Reading/ELA in grades four through eight. Despite the changes in proficiency over the past two years, the distribution of PVAAS color indicators has remained fairly constant (Red is the lowest and Dark Blue is the highest).

Transition of State Assessments/PSSA: PVAAS
New document with these illustrations will be available soon on the PVAAS login page!

A new version will replace it by the end of next week!

PVAAS User Accounts and Access

PVAAS User Accounts
- PVAAS user accounts should not be shared. LEA staff should provide each staff member who will access PVAAS with the appropriate type of account and the appropriate access.
- IU staff should not be using another user's account from an LEA. LEAs may choose to provide the IU staff with their own account for the LEA.
- Let the LEA contact log in with his/her own PVAAS user account.
- Audit trail: a record of who logged in (PVAAS username) and what actions were taken.

Protect yourself, and encourage your LEAs to protect their staff and the confidentiality of their data. For help managing PVAAS accounts and access, click Contact Us at the top right of any PVAAS page!

PVAAS Reporting Access
Access for IU Consultants, PATTAN, and PDE Staff
- State User accounts, with access to reporting for multiple LEAs and no student names
- Access was provided on November 6
- The PVAAS Statewide Team manages account changes (creating, deactivating) for your IU consultants
Note that these State User accounts, with access to multiple LEAs within the IU region and NO student names, are managed by the PVAAS Statewide Team. Other accounts at your IU, for your IU as an LEA, are managed locally. Accounts with your IU as an LEA have access to reporting for only your IU, not other LEAs in your region.

PVAAS and PEERS Reporting Releases

SY14-15 Reporting

Password Protected Site: Currently Enrolled Projections, GIEP
- Currently enrolled data were released November 5.
- LEAs who submitted complete data into PIMS as of October 16, COB, can now access the PVAAS Student Projection Reports and PVAAS Projection Summary Reports for students who have changed grade levels, have changed schools (elementary to middle school, and middle to high school), and/or who are new to the LEA.
- The GIEP flag will be added in PVAAS with the next release/update in December, using the November PIMS Enrollment file.

Password Protected Site: Currently Enrolled and New Projections
PVAAS projections are available for the following assessments:
- Math, ELA, and Science (4, 8): students enrolled in grades 4-8
- Keystone Algebra I, Literature, Biology: students enrolled in grade 6 and higher
- PSAT: students enrolled in grades 6-11
- ACT, SAT: students enrolled in grades 9-12
- Advanced Placement (AP) exams: students enrolled in grades 6-12 (AP Biology, AP Calculus AB, AP Statistics, AP English Literature, AP English Language, AP Government and Politics, AP US History, AP Psychology)

PVAAS Public Site
- Anticipated date of release: November 19, 2015
- Includes reporting ACROSS GRADES by subject/Keystone content area. Reporting is ONLY included on the PVAAS Public Site if more than one grade level of students is represented in the PVAAS score for the specific subject/Keystone content area. (This is defined in Act 104.)
- An email announcing the release of the PVAAS Public Site will be sent, when the site is released, from EVAAS Support to: District Admin account holders (Supt or designee/1 per LEA), District User account holders, and School Admin account holders (Principal/1 per school building). Intermediate Unit, PATTAN, and PDE account holders will be included in this email announcement.

PEERS
Update with SY14-15 FINAL SPP score. When the SY14-15 score is available in PEERS, an e-mail notification will be sent (from EVAAS Support) to: District Admin account holders (Supt or designee/1 per LEA), District User account holders, and School Admin account holders (Principal/1 per school).

Performance and Quintile Diagnostic Resource Document

PVAAS Diagnostic Reports: Performance Diagnostic and Quintile Diagnostic

How are they different? How can each be used?
Updated documents shared with PAIU CC and IU PVAAS Contacts (via e-mail) on October 30. Please share this information with your LEAs! Documents will also be posted on the PVAAS login page soon!


PVAAS Fall Professional Development: Follow-up and Common Questions

Fall 2015 PVAAS Sessions: Face-to-Face Admin Sessions

35 sessions completed; 12 more sessions to go! Thoughts? Feedback?

Coach Videoconference Sessions

- 9 VCs completed; VC archive link
- 6 follow-up webinars; link will be sent to all VC participants; archived
- Thoughts? Feedback?

Reporting of LEA Participation at IU Sessions
Information needed for the report to PDE. Please provide the information in spreadsheet or table format (MS Excel or Word; please, no PDF files or scanned sign-in sheets). Include information about the PVAAS session(s) at your IU:
- Date, time, and title of session
- Participant name
- Participant e-mail address
- Participant LEA
- Participant building (if available)
- Participant role (if available)
The file doesn't need to be pretty! If possible, pull it right from your registration system. Include whatever fields of information you collect; we'll use the ones above and delete the others. When sending final sign-in sheets to us, as a record of who was actually in the session, please do so IN ADDITION TO the above spreadsheet/table. Thanks! A sample layout is sketched below.
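As a purely hypothetical illustration of the layout being requested (every name, address, session detail, and the file name below are invented), a file with one row per participant could look like this:

```python
# Build a sample participation file with Python's standard csv module.
# Column names follow the fields listed above; all values are made up.
import csv

rows = [
    {
        "Session Date/Time/Title": "2015-10-20 9:00 AM, PVAAS Reporting Overview",
        "Participant Name": "Pat Example",
        "Participant E-mail": "pexample@example-lea.org",
        "Participant LEA": "Example Area SD",
        "Participant Building": "Example Elementary",
        "Participant Role": "Principal",
    },
]

with open("pvaas_iu_session_participation.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
```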

1:1 Sessions with Some LEAs
- Auditor General report on PDE support to struggling schools in PA
- Larger LEAs and lower-achieving LEAs find it difficult to leave the district/school for all needed trainings, so we are offering 1:1 sessions.
- 1:1 sessions completed in 10 LEAs so far (approximately 70 1:1 sessions): a 1:1 session with each principal, plus a district debrief.
- Example: Erie Public Schools; 2 consultants; 2 days (Thurs 7:30 AM-5:00 PM and Fri 7:30 AM-3:00 PM); 18 schools; 90 minutes per school; district admin debrief session; ARL; CRO; IU debrief being scheduled.
- MANY other LEAs are still on the invite/to-be-scheduled list.
- I will email the IU PVAAS Contact regarding outreach from our team: which LEAs, and your IU's participation/debrief.

PVAAS Fall Professional Development: Common Questions

Common Questions/Topics We're Hearing in Training Sessions

What does the 0-3 mean on the Teacher List by School Report?
- The 0-3 score (last column) is NOT the AGI.
- The 0-3 score for the Composite indicates a conversion of the AGI to what the score would be IF it were used on the 82-1 rating tool/evaluation.
- It is intended for people to see the score along the way to a 3-year rolling average.
- The 0-3 score is NOT to be used yet unless it comes from a 3-year rolling average.
(A small illustrative sketch of this kind of conversion follows after the next question.)

How can the Composite be yellow/red but all of the single measures green? MORE evidence!
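On the 0-3 conversion: a minimal sketch of the general idea, mapping an AGI onto a 0-3 band, is shown below. The cut points used here are hypothetical placeholders, not PDE's actual conversion table; the point is only that the 0-3 value is a re-expression of the AGI, not a separate measure.

```python
# Hypothetical AGI-to-0-3 conversion, for illustration only.
# The real conversion used on the 82-1 rating tool is defined by PDE;
# these cut points are made up.
def agi_to_0_3(agi, low_cut=-2.0, high_cut=2.0):
    """Clamp the AGI to [low_cut, high_cut] and rescale it linearly onto 0-3."""
    clamped = max(low_cut, min(high_cut, agi))
    return 3.0 * (clamped - low_cut) / (high_cut - low_cut)

for agi in (-2.5, -1.0, 0.0, 1.0, 2.5):
    print(f"AGI {agi:+.1f} -> 0-3 score {agi_to_0_3(agi):.2f}")
# AGI -2.5 -> 0.00, -1.0 -> 0.75, 0.0 -> 1.50, +1.0 -> 2.25, +2.5 -> 3.00
```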

How can 2 teachers be green, but 1 teacher red, if they taught the same students? The 2 teachers had 35 students each for only 20% of the time (a small amount of instructional responsibility, IR), but the other teacher had all 70 students combined and for 80% of the time (a larger amount of IR). That one teacher had twice as many students and a much higher %IR (more data, more evidence). MORE evidence! (See the quick calculation below.)
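A rough way to see "more data, more evidence" in that scenario is to use students multiplied by %IR as a simplified proxy for the amount of evidence attached to each teacher; the actual EVAAS teacher model weights evidence more carefully than this.

```python
# Simplified evidence proxy: number of linked students times % instructional
# responsibility. The numbers come from the scenario described above.
def evidence_weight(n_students, pct_ir):
    return n_students * pct_ir

teacher_a = evidence_weight(35, 0.20)  # one of the two "green" teachers
teacher_b = evidence_weight(35, 0.20)  # the other "green" teacher
teacher_c = evidence_weight(70, 0.80)  # the teacher linked to all 70 students

print(teacher_a, teacher_b, teacher_c)  # 7.0 7.0 56.0
# Teacher C carries roughly eight times the evidence of either other teacher,
# so the same students' results can move Teacher C's estimate away from the
# default while the other two teachers' estimates stay close to it.
```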

Response: the model for PVAAS Teacher-Specific Reporting assumes that the group of students is GREEN, UNLESS there is enough data/evidence in the students' assessment results to indicate otherwise and pull the estimate toward the other colors (dark blue, light blue, yellow, red).
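That "assume green unless the evidence says otherwise" behavior is a form of statistical shrinkage. The toy function below only illustrates the general idea; it is not the EVAAS model itself, and the prior-strength constant K is a made-up value.

```python
# Toy shrinkage example: a teacher's raw mean student gain is pulled toward 0
# (the "expected growth" default, i.e., green) when there is little evidence.
K = 30  # hypothetical prior strength, in student-equivalents of evidence

def shrunken_gain(raw_mean_gain, evidence_n):
    return raw_mean_gain * evidence_n / (evidence_n + K)

# Same raw gain, different amounts of evidence (the weights from the example above):
print(round(shrunken_gain(-5.0, 7), 2))   # -0.95: little evidence, stays near the default
print(round(shrunken_gain(-5.0, 56), 2))  # -3.26: more evidence moves the estimate away
```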

Common Questions/Topics We're Hearing in Training Sessions

What should we address with principals when they ask how a teacher can go from dark blue to red, or from red to dark blue? Is it a red flag if they say they didn't do anything different? If they didn't do anything different, what does that mean?

Short Response: DIG deeper! Also look at the prior achievement of the students; the teacher/admin may indicate the teacher did not do anything different, but the teacher may have had students at a different achievement level.

Common Questions/Topics We're Hearing in Training Sessions

What does GREEN mean? Green DOES mean growth!

ASA and AERA Statements: SAS EVAAS Response

ASA and AERA: Value-Added Modeling
- American Statistical Association (ASA) Position Statement: Technical Considerations
- American Educational Research Association (AERA) Statement: Technical and Policy Considerations

SAS EVAAS' response to this most recent AERA statement and their statistical/technical considerations:AERA's statement on the use of value-added models includes several technical and policy recommendations. SAS agrees with many of the technical points and believes they would be true of any measure of schooling and teaching effectiveness, such as classroom observations or student surveys. Not all value-added models are the same, and the EVAAS models address many of AERA's recommendations, such as using multiple years of student data, reporting multiple years of growth estimates with their standard error, and adopting a robust approach to accommodating different tests. We view the policy recommendations as suggestions to educators and policymakers for implementing value-added reporting within education systems rather than limitations of value-added modeling itself.

As some of you may recall, there was a statement released previously from the American Statistical Association (ASA) with statistical/technical considerations about value-added models. The headlines and use of the ASA information were "reprinted" by a range of groups/organizations, some stating ASA's position and the details less accurately than ASA did. SAS EVAAS (the PVAAS provider) had a response to ASA's statement at that time. SAS EVAAS' statement basically said that they agreed with ASA's position paper regarding value-added modeling, as not all models have/use the same statistical safeguards.

In a statement released yesterday (Nov 11), the American Educational Research Association (AERA) again advised those using or considering the use of value-added models (VAM) about the scientific and technical limitations of these measures for evaluating educators and the programs that prepare teachers. The statement, approved by the AERA Council, cautions against the use of VAM for high-stakes decisions regarding educators. SAS EVAAS' response to this most recent AERA statement and its statistical/technical considerations is quoted above.

In case you are asked about this most recent statement from AERA, I wanted to make sure all of you received the accurate, non-edited version of this information. If at any point anyone would like to discuss the technical/statistical considerations with Dr. John White/SAS EVAAS, he would be more than happy to discuss the information and answer any questions. If you read the AERA requirements below (even without any additional details), you can see that these are the kinds of technical/statistical considerations addressed by EVAAS/PVAAS.

AERA's 8 technical requirements that must be met for the use of VAM to be accurate, reliable, and valid:
1. VAM scores must only be derived from students' scores on assessments that meet professional standards of reliability and validity for the purpose to be served.
2. VAM scores must be accompanied by separate lines of evidence of reliability and validity that support each claim and interpretative argument.
3. VAM scores must be based on multiple years of data from sufficient numbers of students.
4. VAM scores must only be calculated from scores on tests that are comparable over time.
5. VAM scores must not be calculated in grades or for subjects where there are not standardized assessments that are accompanied by evidence of their reliability and validity.
6. VAM scores must never be used alone or in isolation in educator or program evaluation systems.
7. Evaluation systems using VAM must include ongoing monitoring for technical quality and validity of use.
8. Evaluation reports and determinations based on VAM must include statistical estimates of error associated with student growth measures and any ratings or measures derived from them.

PVAAS Roster Verification
Are you reeaaady to rosteeerrr?!

Roster Verification Phases: Spring 2016
Please note the adjustment for the opening and closing of each phase: each phase will close on a Friday at 11:59 pm and roll into the next phase Saturday at midnight. E-mail notifications to the appropriate users for the next phase, informing them of the opening of the phase, will be sent on the Monday of each phase. The change was made to reflect when the work of verifying rosters is generally completed: during the work week (M-F).

SY15-16 Updated RV Resources Posted!

https://pvaas.sas.com

RV VLM Part I Posted: "What is RV?"

Questions? Feedback? Follow-up needed? pdepvaas@iu13.org