Increasing faculty investment in program review and program assessment:
a view from the trenches
CIHE Assessment Forum, December 6, 2012
Increasing faculty investment in program review and program assessment
Committee Composition
• Internal Members
• External Voices
Committee Dynamic I
Academic Sparring . . .
OR:
Committee Dynamic II
Deliberative Dialogue?
From Status Determination to Continuous Improvement
So we applied for and received a grant from the Davis Educational Foundation to improve program review and program assessment in the VSC.
Our plan was to . . .
• Create a steering committee
• Educate ourselves about best practices in PR and PA
• Research local faculty attitudes/perceptions
• Develop resources to support faculty conducting reviews
• Improve the process at PR meetings
But then, faculty began speaking.
• Many hated the PR process.
• They saw it as risky and potentially punitive.
• Thus, reports were guarded; those attending PR meetings often were defensive.
• Faculty of accredited programs, particularly, resented the additional work.
In addition, we found other problems.
• Who wrote the reports?
• Who knew what they said?
• Who knew the outcome of the PR process?
• How were the reports used
– By the faculty?
– By the president and dean?
So, what was the point?
Why were we doing all this?
It became clear that past PR efforts had few if any benefits and that faculty approached them from only a “compliance” point of view.
So we revised our objectives, raised our aspirations, and redirected the grant.
• We wanted faculty to see that these activities could benefit their programs and students.
• For that, we needed to remove the punitive elements of the policy.
• Indeed, if at all possible, we needed to sever the connection to the old policy.
• So we sought to change and rename the policy.
• And we did.
Our first win: the low-hanging fruit
• The Board agreed that accredited programs should not have to undergo the PR process (11/4/10).
• Importantly, this convinced many faculty
– that our commitment to positive change was serious and
– that the Board was willing to listen to faculty.
Then we moved on to the harder work of changing the policy as it affects the rest of our academic programs.
This required surgery.
• Eliminating a much-disputed cost-revenue analysis
• Eliminating the requirement that faculty address such issues as
– “competitive advantages and disadvantages,”
– institutional recruiting strategies,
– etc.
More surgery
• And, most important, eliminating the stipulation that PR could result in Board decisions that included—Gasp!— “Termination.”
But we needed to do this . . .
. . . while beefing up honest self-evaluation of program effectiveness.
Finally, April 28, 2011:
Program Review and Continuous Improvement Process (PReCIP) adopted.
What was the process of our two-year effort?
The process.
• Ten day-long steering committee (SC) retreats
• Fall of year 1 was devoted to learning how faculty viewed the process and what was needed to improve it.
• Each college held faculty focus group meetings.
• We administered a faculty survey.
The process, continued.
• Took steps to change Board policy
• With the chancellor’s support, advanced the “modest proposal” regarding accredited programs.
• The SC developed a proposal for a comprehensive policy change as well as substantial changes to the self-study “template.”
The process, continued.
• In year 2, work shifted to increasing faculty expertise in PA.
• This took many forms.
– VTC’s dean and department chairs read Walvoord’s Assessment Clear and Simple; chairs took turns facilitating discussions of each chapter.
– JSC’s dean focused several of the chairs’ meetings on assessment; chairs took turns presenting and getting feedback on their educational objectives, assessment plans, and resulting data.
The process, continued.
– Castleton and CCV faculty attended NEEAN workshops and conferences.
– JSC’s SC members prepared an RFP, inviting faculty to request support for an assessment-related project.
– Several colleges brought in consultants, e.g., Peggy Maki and Martha Stassen.
The process, continued.
• May 2012 system-wide retreat held for faculty of programs that were scheduled for 2013 review.
• Five SC faculty developed an on-line PR and PA manual to support faculty writing self-studies and working on program assessment.
VERMONT STATE COLLEGES ASSESSMENT GUIDE
RELATED TO PROGRAM REVIEW AND CONTINUOUS IMPROVEMENT PROCESS (PReCIP) REPORTS
TABLE OF CONTENTS
1. PURPOSE AND USE OF THIS GUIDE
2. BASICS OF ASSESSMENT
3. THINGS TO KNOW BEFORE YOU START
4. TROUBLESHOOTING PROBLEMS YOU MIGHT ANTICIPATE
5. ASSORTED BEST PRACTICES AND WISDOM
6. GLOSSARY OF TERMS AND INDEX
7. APPENDIX 1 - ILLUSTRATIVE EXAMPLES AND GUIDANCE
The process, continued.
• VSC deans developed statements regarding the desired qualities of outside members and their role.
• The SC developed three instruments to evaluate PReCIP products and processes.
The role of steering committee faculty
• We were very fortunate to have selected outstanding faculty for the steering committee.
– All were highly respected.
– Most were tenured senior faculty; but the SC also included a few early-/mid-career faculty.
– They represented a broad range of disciplines.
– They had a broad range of experience with assessment.
– All were willing to play leadership roles among their peers.
The role of steering committee faculty, continued
• At most colleges, SC faculty became the principal spokespersons and public advocates for PR/PA and its importance/value.
• Most became coaches/mentors of faculty who were going through the process.
• CCV’s faculty member periodically writes articles on assessment for the dean’s monthly newsletter.
• All worked closely with their deans to assess how the process was being conducted and to seek ways to improve it.
The role of the deans
• Often in a role secondary to SC faculty, deans helped explain the new policy and process.
• Often with the assistance of SC faculty, deans helped faculty develop expertise in program assessment.
• Deans obtained essential resources.
• Deans play a critical role in raising institutional awareness of the significance of PR and PA at the college. Towards these ends:
– JSC’s strategic plan now includes a priority related to the continuous improvement of academic programs.
– In what is also an annual budget meeting, each VTC academic department presents its educational objectives and outcomes to the president’s Cabinet.
The faculty experience: Rounds 1 and 1.5
Lyndon’s Round 1: 2011-2012
Three departments, five programs
Three different approaches
Everybody else watching (or refusing to look)
One campus-wide initial response:
One common response:
First Round Departments
• Computer Info Systems – one full-time faculty
• Mountain Rec Mgmt – four full-time faculty
• Natural Sciences – four full-time faculty*
*including one SC member
Strategy:
Fall 2011 – campus-wide meeting
• Reports due May 2012, May 2013: all department faculty
• Reports due subsequent years: 1 department representative
Preparation
Info-sharing, brainstorming responses across disciplines
Strategy:
Spring 2012 – initial internal deadlines
• spaced evenly throughout term
• “easiest” sections due first
Implementation
Reality: You want these when??
• three departments, three experiences
• common threads emerge
Common Threads
Concerns
• PReCIP felt like the old Policy 101
– first round = starting from the ground up
– may remain a problem for whole first cycle
• Very difficult to complete during semester
– faculty time constraints
Discoveries
• Despite problems, this process is far more productive than Policy 101
Experiences
• Degree of collaboration during writing
– CIS
– MRM
– NS
• Writing process
– Reflective
– Entire document meaningful
– Reflection obviously helpful for faculty work
• Summer Meetings
– Uniformly beneficial
Moving Forward Towards the Next Cycle
• Due date later in the summer to acknowledge difficulties of the undertaking during the term.
• Establish the most productive way to move towards writing the next report
• Pass the word!
• Participants actively encourage departments reporting in successive years
• Helps to break down silos
Round 1: Most important lessons
• Improve departmental collaborative writing process
• Extend due date farther into the summer
• Summer review meetings are really helpful!
– excellent atmosphere for constructive criticism
– new collaborations
• Implement continual preparation for next review
Progress for Round 1.5
• Departments have begun active planning
• Some skepticism remains
• Another SC member directly involved
– distinct benefit
• Generally on track
• Truly reflective
• A useful summary
– guides assessment reflection
• Supports much better preparation for the next report
• Increased buy-in after first round
New process much improved!
Lessons learned
• Recruiting the right faculty leaders to this initiative was absolutely critical.
• Listening and responding to faculty were critical, even if it meant changing the plan.
• Early positive action, including by the chancellor and Board, allowed otherwise skeptical faculty to suspend disbelief.
• It proved wise to allow each college to pursue its own path while maintaining common system-wide goals.
Lessons learned, continued
• Resources are needed to support these efforts, but they are not out of reach, even in the underfunded VSC.
• Presidents/deans will have to embed PR and PA results in college strategic plans and frequently will have to give these matters “air time.”
• Affecting faculty attitudes and institutional culture will take years. This is a particular challenge when one can count on presidential, deanly, and faculty turnover.
And especially:
• We must “practice what we preach,” that is, we have to strive continuously to improve this “continuous improvement” process, and we need to publicize those ongoing improvement efforts to our faculty.
Questions, reactions, observations?
And discussion