Transcript of Facilitating Continuous Improvement through Reporting Capabilities

  • Facilitating Continuous Improvement through Reporting Capabilities
    David Rock, Kaye Pepper, Blake Adams
    University of Mississippi
    Smitty Wood
    University of Southern Arkansas

  • Issue:
    • Importance of data collection and analysis as a means of measuring accountability and effective program improvement practices (Orland, 2015)
    • Increased pressure for use of Electronic Assessment Systems (EAS) to demonstrate candidate success (Norman & Sherwood, 2015)
    • Kirchner & Norman (2014) found that an EAS developed in-house provides more control over design, implementation, use, and ability to change the system to meet the users’ specific needs

  • System Background & Development
    • Use of various data collection techniques:
    – prior to 2002 - Spreadsheets
    – 2002 - Open-source application
    – 2006 - Commercial vendor
    – 2008 - Access database
    – 2011 - MySQL web-based format
    » collects data on key assessments
    » facilitates the field placement process
    » ability to self-generate a variety of reports
    – 2015 - Redesign of current system in process

  • Purpose of the presentation: to describe the use of reporting capabilities in our in-house developed EAS to facilitate continuous improvement
    The participant will:
    • realize the value of developing an EAS to address the specific needs of their EPP in facilitating continuous improvement.
    • analyze their own assessment system to determine whether it adequately meets their specific needs.

  • Types of Reports
    • Self-generated aggregate reports by stakeholders on results of:
    – Key assessments - by program area, content area, evaluator, formative/summative administration, or regional campus
    – Admission/graduation data - by total numbers, GPA, test scores, licensure
    – Enrollment - by degree program
    – Graduating student and follow-up surveys
    Comparisons can be made across program areas, by level of evaluator, and by regional campus.
    These comparisons help develop an understanding of program and EPP strengths and areas that need improvement.
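The aggregate reports described above are, in effect, group-by summaries over key-assessment scores. Below is a minimal sketch of that idea in Python; the record layout, campus names, and numbers are invented for illustration and are not taken from the presenters' system.

```python
# Illustrative sketch: a self-generated aggregate report over key-assessment
# scores, grouped by whichever dimension a stakeholder chooses (program area,
# evaluator role, formative/summative administration, or regional campus).
# All field names and sample data below are hypothetical.
from collections import defaultdict
from statistics import mean

scores = [
    {"program": "Elementary Ed", "evaluator": "supervisor", "admin": "formative",
     "campus": "Oxford", "item": "Lesson Planning", "score": 3.5},
    {"program": "Elementary Ed", "evaluator": "mentor", "admin": "summative",
     "campus": "Tupelo", "item": "Lesson Planning", "score": 2.8},
    {"program": "Secondary Math", "evaluator": "supervisor", "admin": "summative",
     "campus": "Oxford", "item": "Lesson Planning", "score": 3.9},
]

def aggregate(records, group_by):
    """Return the count and mean score for each value of the chosen dimension."""
    groups = defaultdict(list)
    for r in records:
        groups[r[group_by]].append(r["score"])
    return {key: {"n": len(vals), "mean": round(mean(vals), 2)}
            for key, vals in groups.items()}

# Compare program areas, then evaluator roles, on the same assessment data.
print(aggregate(scores, "program"))
print(aggregate(scores, "evaluator"))
```

Re-running the same summary with a different group_by dimension is what allows stakeholders to compare program areas, evaluator levels, or regional campuses side by side.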

  • Types of Reports (cont.)
    • Synopsis of Student Work – provides complete information about each candidate as they progress through the program
    – allows instructors, supervisors, and coordinators to:
    • review the progress of candidates assigned to them
    • determine areas of strength and areas of need
    • work together with other faculty to develop a plan of intervention, if needed
    – provides information for aggregate reports

  • Types of Reports (cont.)
    • Field Placement Reports – ensure candidates have placements in diverse settings
    – MS Dept of Ed supplies demographics on each school site
    – System categorizes schools based on % of students on free/reduced lunch and % of minority student population
    – Field Placement Coordinator sees previous placements for candidates and selects new placements that ensure diversity
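The placement logic described on this slide can be illustrated with a short example. This is a hypothetical sketch only: the percentage thresholds, category labels, school names, and selection rule are assumptions made for illustration, not the rules used by the MS Dept of Ed data or the actual system.

```python
# Sketch of the placement-diversity idea: schools are bucketed by poverty and
# minority-enrollment percentages, and a new placement is preferred from a
# category the candidate has not yet experienced. Thresholds and data are invented.
def categorize(school):
    """Bucket a school by % free/reduced lunch and % minority enrollment."""
    poverty = "high-poverty" if school["pct_frl"] >= 50 else "low-poverty"
    diversity = "high-minority" if school["pct_minority"] >= 50 else "low-minority"
    return f"{poverty}/{diversity}"

schools = [
    {"name": "School A", "pct_frl": 82, "pct_minority": 67},
    {"name": "School B", "pct_frl": 31, "pct_minority": 22},
    {"name": "School C", "pct_frl": 55, "pct_minority": 18},
]

def suggest_placement(previous_school_names, schools):
    """Prefer a school whose category the candidate has not been placed in before."""
    seen = {categorize(s) for s in schools if s["name"] in previous_school_names}
    for s in schools:
        if s["name"] not in previous_school_names and categorize(s) not in seen:
            return s["name"]
    return None  # the coordinator decides manually if no new category is available

print(suggest_placement({"School B"}, schools))  # -> "School A", a new category
```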

  • Types of Reports (cont.)
    • Low-Score Report – generated when a candidate’s performance falls below expected levels
    – When a candidate receives a low score on an assessment item, an email is automatically generated to the student and to the program coordinator
    – This process allows for immediate feedback and assistance to the candidate focused on improved performance
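A minimal sketch of this notification flow follows, assuming a hypothetical score threshold, placeholder email addresses, and a stand-in notify function rather than the system's real email mechanism.

```python
# Sketch of the low-score alert: when a score falls below an expected level,
# messages go to both the candidate and the program coordinator.
# Threshold, addresses, and the send mechanism are illustrative assumptions.
LOW_SCORE_THRESHOLD = 2.0  # hypothetical cut-off on a 1-4 rubric scale

def notify(recipient, subject, body):
    # Placeholder for the system's actual email mechanism (e.g., server-side SMTP).
    print(f"To: {recipient}\nSubject: {subject}\n{body}\n")

def record_score(candidate, coordinator, assessment_item, score):
    """Store the score (storage omitted here) and alert both parties if it is low."""
    if score < LOW_SCORE_THRESHOLD:
        msg = (f"{candidate['name']} scored {score} on '{assessment_item}', "
               f"below the expected level. Please follow up.")
        notify(candidate["email"], "Low score on a key assessment", msg)
        notify(coordinator["email"], "Low-score report", msg)

record_score({"name": "J. Doe", "email": "candidate@example.edu"},
             {"name": "Program Coordinator", "email": "coordinator@example.edu"},
             "Classroom Management", 1.5)
```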

  • Assessment System Notification Emails to Candidates

  • Report for System Administrator

  • Types of Reports (cont.)
    • Disposition Assessment Report – for individual students and across programs
    – Course Instructors/Supervisors enter instances of candidate disposition infractions into the assessment system
    – The first infraction serves as a warning. If a candidate receives a second infraction, he/she must meet with a faculty committee
    – This process assists in improving professional ethics and behavior of candidates
    – The procedures are clearly outlined in the candidate handbook
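The escalation rule on this slide (first infraction is a warning, second triggers a faculty committee meeting) amounts to a small per-candidate counter. A sketch with invented identifiers and messages:

```python
# Sketch of the infraction-escalation rule; counts, identifiers, and wording are
# illustrative assumptions, not the actual system's implementation.
from collections import Counter

infractions = Counter()  # candidate id -> number of recorded disposition infractions

def record_infraction(candidate_id):
    infractions[candidate_id] += 1
    if infractions[candidate_id] == 1:
        return "warning issued to candidate"
    return "candidate referred to a faculty committee meeting"

print(record_infraction("cand-001"))  # first infraction: warning
print(record_infraction("cand-001"))  # second infraction: committee referral
```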

  • Report for System Administrator

  • Types of Reports (cont.)
    • Late Score Report – generated when evaluators have not entered data by preset deadlines
    • All evaluators who have not met the deadline receive an email requesting that they enter assessment results for the candidate in their K-12 classroom or course
    • In Fall 2015, data were collected on 610 candidates (1,832 total assessments) – 99% of data collected
    • Ensures quality of data used for analysis
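A sketch of how such a late-score check might work: compare the assessments the system expects by the deadline against those actually entered, and remind the evaluators with missing entries. The names, dates, addresses, and reminder mechanism below are illustrative assumptions only.

```python
# Sketch of the late-score report: after the deadline, evaluators with missing
# assessment entries are identified and reminded. All data here is hypothetical.
from datetime import date

expected = [  # (evaluator email, candidate, assessment) the system is waiting on
    ("mentor.teacher@example.edu", "J. Doe", "Student Teaching Assessment"),
    ("supervisor@example.edu", "A. Smith", "Student Teaching Assessment"),
]
entered = {("mentor.teacher@example.edu", "J. Doe", "Student Teaching Assessment")}
deadline = date(2015, 12, 4)  # hypothetical preset deadline

def late_score_report(today):
    """Return the missing entries and send a reminder for each one."""
    if today < deadline:
        return []
    missing = [e for e in expected if e not in entered]
    for evaluator, candidate, assessment in missing:
        print(f"Reminder to {evaluator}: please enter {assessment} "
              f"results for {candidate}.")
    return missing

late_score_report(date(2015, 12, 7))
```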

  • Report for System Administrator

  • Data-Driven Decisions
    • Educator Preparation Provider (EPP)
    – A review of aggregate results of disposition reports across the EPP at the annual Assessment Retreat facilitated the development of a common disposition assessment for all programs in the unit
    – As a result of reviewing graduated candidates’ and employers’ perceptions on follow-up surveys, additional training for faculty on emerging technology and ways to utilize these new tools in teaching is being provided

  • Data-Driven Decisions (cont.)
    • Programs
    – A review of results for the student teaching assessment revealed a lack of consistency between ratings by university supervisors and P-12 mentor teachers. Online training modules were developed for the assessment instrument, and now all junior-level candidates, P-12 mentor teachers, and supervisors complete the online training
    – As a result, all stakeholders have a better understanding of the assessment items and expectations for providing evidence of mastery

  • Data-Driven Decisions (cont.)
    • Programs
    – After reviewing candidate and P-12 mentor teacher responses on items related to university supervisors, a feedback component was added to the assessment system to provide an additional avenue to conference with and mentor candidates

  • Data-Driven Decisions (cont.)
    • Programs
    – A review of several years of master’s comprehensive exam results led to a change in the requirements for completion of the master’s program to a more project-based approach

  • Data-Driven Decisions (cont.)
    • Supervisory Personnel
    – The results of the Field Experience Survey completed by the candidates, P-12 mentor teachers, and university supervisors provide guidance for planning training activities for the mentor teachers/supervisors.
    – A comparison of the Field Experience Survey results over several semesters guides decisions about the personnel who work with our candidates.

  • Data-Driven Decisions (cont.)
    • Candidates:
    – The assessment system monitors the number of infractions a candidate receives. A report is sent to the program coordinator when a candidate receives 2 infractions.
    – The coordinator calls a meeting with that candidate and a faculty committee. After a hearing, the committee may develop an intervention plan to assist the candidate in improving, or it may recommend dismissal.
    – An aggregated report across a program of the specific dispositions most often violated assists in planning orientation sessions with new candidates.

  • Data-Driven Decisions (cont.)
    • Candidates:
    – A report is sent to the program coordinator when a candidate scores below expectation on an item on a key assessment. This provides the opportunity for the candidate to get immediate feedback and assistance for improving their performance.
    – If several candidates consistently score low on a particular assessment item, program faculty can make adjustments to course instruction.

  • Does your assessment system adequately meet your specific needs and facilitate continuous improvement?
    • Please complete the survey
    • Discussion

  • Questions?
    Kaye Pepper
    Director of Assessment
    [email protected]
    Blake Adams
    Systems Analyst II
    [email protected]
    David Rock
    Dean
    [email protected]
