
Page 1: TSPC MEETING JULY 20-22,2011

ACCREDITATION SITE VISITS

Page 2: TSPC MEETING JULY 20-22,2011

DIVISION 010 – SITE VISIT PROCESS

DIVISION 017 – UNIT STANDARDS

DIVISION 065 – CONTENT STANDARDS

Page 3: TSPC MEETING JULY 20-22,2011

Teams were selected from higher education peers and K-12 educators.

Institutions would present evidence at the TSPC office for review.

Teams would review evidence and visit the institution.

Team evaluated evidence based on standards.

The purpose of the site visit was to determine compliance with standards.

Page 4: TSPC MEETING JULY 20-22,2011

Programs were approved by the Commission and reapproved as part of the unit site visit.

Criticisms of the process:

1. The process was subjective.

2. Evaluations were inconsistent.

3. Teams made recommendations based on site visit findings.

4. The culture of evidence was undefined.

5. There was no program review process.

Page 5: TSPC MEETING JULY 20-22,2011

Move the purpose of the process from compliance to continuous improvement.

Change the definition of culture of evidence:

1. Define required assessment systems.

2. Define required categories of data to demonstrate candidate competencies.

3. Define processes for use of data for program improvement.

Page 6: TSPC MEETING JULY 20-22,2011

Create a rigorous program review process as part of the accreditation process.

1. Emphasis on assessments, rubrics, and scoring guides

2. Emphasis on quality of data for purposes of continuous improvement

3. Use of data in the continuous improvement process

Page 7: TSPC MEETING JULY 20-22,2011

Key standards for accreditation:

1. Candidate competencies evidenced by data

2. Assessment systems

3. Field experiences

4. Cultural competency/diversity and inclusion

5. Faculty qualifications

6. Unit resources and governance

Page 8: TSPC MEETING JULY 20-22,2011

Site teams use rubrics to determine whether standards are met.

This allows standards to be met while Areas for Improvement (AFIs) are still identified.

Page 9: TSPC MEETING JULY 20-22,2011

New process in accreditation: evidence is used to demonstrate the validity of candidate competency data during the unit site visit.

The program review process is virtual in nature, based on electronic exhibits.

Program reviews are conducted six months prior to unit site visits.

Page 10: TSPC MEETING JULY 20-22,2011

The Commission has adopted a template for the program review process associated with site visits, major program modifications, and new endorsement programs.

The intent is to provide clear direction on the requirements for program review, addition, and modification. Electronic submission of materials is required for easier review by commissioners and site team members.

 

Page 11: TSPC MEETING JULY 20-22,2011

PRINCIPLES TO FOLLOW FOR DATA COLLECTION

· Candidates' ability to impact student learning

· Knowledge of content

· Knowledge of content pedagogy

· Pedagogy and professional knowledge

· Dispositions as defined by state standards or the unit's conceptual framework

· Technology

Page 12: TSPC MEETING JULY 20-22,2011

The following rubric will be used when considering whether the program meets state standards.

Acceptable: The program is aligned to the state program standards. Assessments address the range of knowledge, skills, and dispositions stated in the standard or by the unit. Each assessment is consistent with the complexity, cognitive demands, and skill required by the standard it is designed to measure, and measures what it purports to measure. The assessments are clearly defined. The assessments and scoring guides are free of bias.

Page 13: TSPC MEETING JULY 20-22,2011

Assessment instruments provide candidates or supervisors with guidance as to what is being sought. Assessments and scoring guides allow levels of candidate proficiency to be determined. The assessments address candidate content knowledge, content pedagogy, pedagogy and professional knowledge, student learning, and dispositions. Field experiences meet the requirements of the standards. There is evidence that data has been summarized and analyzed. The data has been presented to the consortium. Syllabi clearly align with and clearly address the program standards.

Page 14: TSPC MEETING JULY 20-22,2011

AFI Example: Key assessments do not provide candidates or supervisors with substantive guidance as to what is being sought.

Rationale: Scoring guides use simple words (e.g., unacceptable, emerging, proficient, or exemplary) and are left to broad interpretation.

Page 15: TSPC MEETING JULY 20-22,2011

AFI Example: Instruments and scoring guides do not allow for levels of candidate proficiency to be determined.

Rationale: Data demonstrates little or no distribution of candidates across the scoring guide scale. All candidates receive predominantly the same score.
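As an illustration only (not part of the rule language or the TSPC process), the sketch below shows one way a program might check its own rubric data for this kind of AFI. The score labels, the 90% dominance threshold, and the function names are assumptions made for the example.

```python
from collections import Counter

def score_distribution(scores):
    """Count how many candidates received each rubric level."""
    return Counter(scores)

def lacks_distribution(scores, dominance_threshold=0.9):
    """Flag a potential AFI: nearly all candidates receive the same score,
    so the instrument does not differentiate levels of proficiency."""
    counts = score_distribution(scores)
    top_share = counts.most_common(1)[0][1] / len(scores)
    return top_share >= dominance_threshold

# Hypothetical data: 28 of 30 candidates scored "proficient".
scores = ["proficient"] * 28 + ["exemplary", "emerging"]
print(score_distribution(scores))
print("Potential AFI:", lacks_distribution(scores))  # True
```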

Page 16: TSPC MEETING JULY 20-22,2011

State Program Review Results Report: The State Program Review Results Report is the document that will be submitted by the program review site team to the Commission for review at the meeting prior to the submission of the unit's Institutional Report.

Page 17: TSPC MEETING JULY 20-22,2011

The program review site team will make recommendations to the Commission regarding whether the Commission should extend full state recognition to the program(s), grant recognition with conditions, or deny recognition of the program(s). {See Division 010 for the levels of program review recognition.}

Page 18: TSPC MEETING JULY 20-22,2011

Small group activity:

Question: Does the acceptable level in the rubric clearly define expectations for program review and approval?

Question: Should teams review syllabi against program standards?

Question: Should teams evaluate assessments and data for quality?

Page 19: TSPC MEETING JULY 20-22,2011

Small Group (cont.)

Question: Should programs provide evidence of consortium review of data?

Question: National standards require three years of data. What should Oregon's standard be?

Question: At what point should conditions be imposed? At what point should recognition be denied?