Assessment & Review of Graduate Programs -
Doctoral
Duane Larick, NC State University
Michael Carter, NC State University
Margaret King, NC State University
Council of Graduate Schools Pre-Meeting Workshop
December 7, 2005
Guidelines for This Presentation
Please feel free to raise questions at any time during the presentation
We have included discussion questions along the way
We are very interested in your participation through questions and sharing of experiences from your campus
We will also leave time at the end for general discussion.
Agenda
Introduction and Objectives
Overview of Graduate Program Review
Reasons for Graduate Assessment
General Process of Program Review
Process or Processes for Development of a Program Review Procedure
External program review
Outcome-based, continuous and ongoing review
Comparative Data Sources
Summary and Discussion
Objectives
Discuss various motivators for undertaking graduate assessment
Increase overall awareness of recent trends in Graduate Program Review
Demonstrate practical experience/knowledge gained related to development and implementation of external reviews and outcome-based continuous and ongoing procedures for Graduate Program Review
Illustrate examples of data and managerial tools developed/utilized to improve the efficiency of the process
Background Information About Our Audience
How many of you are responsible for graduate program review at your institutions?
How many of you have this as a new responsibility?
How many of you have recently (or are considering) changing your procedure?
Why Assess Graduate Programs?
The primary purpose should be to improve the quality of graduate education on our campuses
By creating a structured, scheduled opportunity for a program to be examined, program review provides a strategy for improvement that is well-reasoned, far-seeing, and as apolitical as possible
Why Assess Graduate Programs?
External Drivers: To help satisfy calls for accountability
Especially at the State level
State Mandated Evaluation of New Programs
All new degree program proposals must include an evaluation plan that includes:
The criteria to be used to evaluate the quality and effectiveness of the program
Measures to be used to evaluate the program
Expected levels of productivity of the proposed program for the first four years of operation (number of graduates)
A plan and schedule to evaluate the proposed new degree program prior to the completion of its fifth year
State-Mandated 5th-Year Review - Issues
Statewide Productivity Assessment of Graduate Programs
Capacity in Relation to Student Demand
Capacity in Relation to Occupational Demand
Centrality in Relation to Instructional Mission
Success of Graduates
Program Costs
Low Productivity Analysis - Elements of Statewide Analysis for Each Program Area to be Reviewed
Trends in enrollment and degrees granted
Student characteristics
Program costs
Occupational demand
Recommendations for expansion or elimination of programs on a statewide basis
Why Assess Graduate Programs?
External Drivers: Requirement for regional accreditation, licensure, etc.
Regional Accreditation Agencies
Southern Association of Colleges and Schools
Western Association of Schools and Colleges
Northwest Association of Schools and Colleges
North Central Association
New England Association of Schools and Colleges
Middle States Commission on Higher Education
SACS Principles of Accreditation
Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”
SACS Criterion for Accreditation
Section 3 – Comprehensive Standards - #16: “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”
Western Association of Schools & Colleges
Accreditation Standards
1.2. Educational objectives are clearly recognized throughout the institution and are consistent with stated purposes. The institution has developed indicators and evidence to ascertain the level of achievement of its purposes and educational objectives.
4.4. The institution employs a deliberate set of quality assurance processes at each level of institutional functioning, including new curriculum and program approval processes, periodic program review, ongoing evaluation, and data collection. These processes involve assessments of effectiveness, track results over time, and use the results of these assessments to revise and improve structures and processes, curricula, and pedagogy.
Intent of Accreditation Agency Effort
The intent of the regional accrediting agencies is to “encourage” institutions to create an environment of planned change for improving the educational process
Other Accreditation Agencies
Education, Architecture, Engineering, etc.
Often focused on minimum standards required
Department approach to development of the self-study and the review is focused on demonstration of achievement of those standards – not necessarily program improvement
Why Assess Graduate Programs?
Internal Drivers:
Meet short-term (tactical) objectives or targets – enrollment growth & funding
Meet long-term (strategic) institutional or departmental goals – funding allocation/reallocation
Understand sources of retention/attrition among students and faculty
Funded project evaluation (GAANN, IGERT)
Discussion Questions
What other external and internal drivers exist on your campuses?
So The Questions We Need To Ask Ourselves Are
What are we currently doing?
Why are we currently doing it?
Is what we are currently doing accomplishing the external goals described above?
Is what we are currently doing accomplishing the internal goals described above?
Is there a better way? Who defines better?
Procedure(s) for Review of Doctoral Graduate Programs
External program review conducted on a 5–10 year cycle – standard practice at most institutions
Outcome-based continuous and ongoing program review – being implemented by many in response to regional and state accreditation requirements and institution needs
Key Characteristics of External Program Reviews
Program review is evaluative, not just descriptive – more than merely a compilation of data, it requires academic judgment of the data
Review of graduate programs is forward-looking – it is directed toward improvement of the program, not simply assessment of its current status
Key Characteristics of External Program Reviews - continued
Programs should be reviewed on the basis of academic strengths and weaknesses, not on their ability to generate funding
Finances and funding should be relevant only as they affect the quality of the academic program
To the extent possible, program review should be an objective process
Key Characteristics of External Program Reviews - continued
Graduate program review should be an independent process, distinct from any other review
Efficiency can be gained by incorporating graduate program review with other internal or external reviews, but, to be effective, graduate program review must lead to its own set of conclusions and direct its recommendations to the faculty and administrators who have the power to improve the graduate program
Key Characteristics of External Program Reviews - continued
Most importantly, program review MUST result in action
Based on the self-study, reviewers’ comments and recommendations, and faculty and administrator response to the review report, the institution develops and agrees on a plan to implement the desired changes
This plan must be linked to the institution’s planning and budget process
Successful Graduate Program Review Answers the Following Questions
Is the program advancing the state of the discipline or profession?
Is its teaching and training of students effective?
Does the program meet the institution’s goals?
Does it respond to the profession’s needs?
How is it assessed by experts in the field?
Operational Procedures: 5–10 Year Review Cycle
Components:
Internal self-study report
External team review
Review team’s report
Program’s response
Administrative meeting
General Process for External Reviews
Issues to be Resolved Before Beginning Program Reviews
Locus of Control – Administration of Review Process
Comprehensive reviews are often coordinated by the office of the college or school dean or the chief academic officer
Graduate program reviews are often coordinated by the graduate dean
Issues to be Resolved Before Beginning Program Reviews - continued
Regardless of who controls the review, the following principles should apply:
All reviews should involve the college or school administration
The graduate dean should play a major leadership role in all graduate reviews
The essential participants in any graduate program review are the chief academic officer, college administration, graduate dean, department chair, graduate program administrator, graduate program faculty, review team(s), and graduate students in the program
Issues to be Resolved Before Beginning Program Reviews - continued
Counting – and Paying – the Costs
A realistic estimate of the costs must be made and an agreement must be reached regarding who will pay them
Costs include:
Travel, accommodations, and meals for reviewers; honoraria for reviewers; etc.
Costs for developing and reproducing review documents, etc.
Issues to be Resolved Before Beginning Program Reviews - continued
Graduate Versus Overall Program Review? Advantages to graduate-only review:
Allows for a thorough, in-depth review of the graduate program
Attention focused on quality indicators unique to graduate education
No risk of the graduate program review being “overwhelmed” by the size of the undergraduate program
Graduate Versus Overall Program Review? Advantages to comprehensive review:
Potential savings in time and money
Does not subject departments to multiple separate reviews
Graduate and undergraduate programs, as well as research and outreach activities, are interdependent
Matters like faculty teaching loads, program and departmental budgets, facilities, and quality of teaching and research experience may be more adequately addressed
Issues to be Resolved Before Beginning Program Reviews - continued
Scheduling Reviews
More well-meaning plans for graduate program review have foundered on an unworkable timetable than on any other obstacle!
Recommendation is a 5–7 year cycle; this depends on the number of programs and resources available
Programs may be grouped by department, college, etc. for review; the review “unit” should be established prior to scheduling
Issues to be Resolved Before Beginning Program Reviews - continued
Scheduling Reviews – continued
Factors to consider in determining the order of programs for review:
Length of time since last review
Compelling financial problems or resource needs
Major proposals for curricular change
Upcoming accreditation or external reviews
Faculty or administration desire for review
Issues to be Resolved Before Beginning Program Reviews - continued
Scheduling Reviews – continued
The schedule MUST be published far in advance
Programs generally need 9–12 months to prepare the self-study, etc.
Once established, every effort should be taken to maintain the schedule, BUT things happen!
Issues to be Resolved Before Beginning Program Reviews - continued
Coordination With Accreditation Reviews
Graduate program reviews should be a separate process from accreditation reviews, but much can be gained by conducting them in tandem, sequentially, or at least in the same academic year:
Efficiency of data collection
Graduate program review team can benefit from the expertise and report of the accreditation team
When done in tandem, it is extremely important that the accreditation team acknowledge the difference(s) in the nature of the two reviews
Issues to be Resolved Before Beginning Program Reviews - continued
Master’s Versus Doctoral Programs
Whether it leads to a doctoral program or not, a master’s degree should have its own academic integrity
At those institutions with research-oriented master’s and doctoral programs in the same department, programs at both levels should be reviewed simultaneously
The institution should examine the unique characteristics of each master’s program and develop criteria of evaluation appropriate for that program
Issues to be Resolved Before Beginning Program Reviews - continued
Research-Based Versus Practitioner Graduate Program Reviews
Traditional research-based and practitioner programs often exist within the same department
Despite the differences in their educational goals, they should be reviewed together
It is essential that they be reviewed using different criteria
Should not rely on the use of professional accreditation review in place of internal review
Issues to be Resolved Before Beginning Program Reviews - continued
Interdisciplinary Programs
Truly interdisciplinary programs cause special problems for review
Faculty and students are often arranged into academic departments
Those academic departments often control resources, faculty hiring, student admissions, course offerings, etc.
In spite of the administrative convenience of working through existing departments, interdisciplinary programs should be reviewed independently
Issues to be Resolved Before Beginning Program Reviews - continued
Integration of Formal Review with Continuous Outcomes Assessment
It is important that formal review and continuous and ongoing assessment be seen as part of the same whole, with a common goal of improving graduate education
To accomplish this, they should somehow be coordinated and integrated
We will discuss how we do that at NC State later in the presentation
Discussion Questions
What other issues have you had to resolve on your campuses?
How have you resolved them?
Key Elements of a Successful Program Review
Clear, Consistent Guidelines
These guidelines should describe:
The purpose of graduate program review
The process to be followed
Guidelines for materials to be included in each phase
A generic agenda for the review
The use to which results will be put
These guidelines should be posted on the Graduate School or Academic Affairs web page
Key Elements of a Successful Program Review
Administrative Support
Adequate staffing and general administrative support are vital to the success of any program review
Departments can provide their own support for the self-study
The larger review process should be staffed centrally
Key Elements of a Successful Program Review
Administrative Support - continued
Successful reviews depend on accurate institutional data
These data should be developed and maintained centrally but should be reviewed and evaluated by the program faculty
A standard report format using a single set of definitions should be developed in advance
The best information often comes from a combination of central and departmental sources
Managerial Tools Created for Program Review
[Screenshots: Graduate Program Review website and program profile data pages]
Key Elements of a Successful Program Review
Departmental Self-Study
The self-study is prepared by the faculty and is descriptive, evaluative, and aspirational
It is the department’s opportunity to scrutinize itself, publicize its accomplishments, examine its flaws, and focus on future directions
Key Elements of a Successful Program Review
Key Self-Study Components
Departmental mission & organization
Program purpose
Program assessment plan
Department size – faculty, staff, students, budgets, etc.
Faculty profile
Faculty accomplishments – research & scholarly activity, contributions to graduate program
Key Elements of a Successful Program Review
Key Self-Study Components - continued
Student profile
Professional development opportunities – faculty and students
Financial support for graduate students
Facilities
Curriculum
Student productivity
Key Elements of a Successful Program Review
Key Self-Study Components - continued
Programmatic climate
Collateral support – interaction with other programs
Profile of graduates
Future directions
Overall evaluation of program – strengths, weaknesses, national reputation, etc.
Key Elements of a Successful Program Review
Surveys/Questionnaires
Surveys of current students, alumni, and employers can provide valuable information
Factors to be considered:
Time and expense to develop, distribute & collect responses
Likely response rate
Uniqueness of information to be gained
It is generally preferable to have such surveys developed and administered at the institutional level
Key Elements of a Successful Program Review
Student Participation
Graduate students should participate in the program review process
Serve on the review committee
Be interviewed collectively and individually by the review committee
Key Elements of a Successful Program Review
Review Team Make-up
On-Campus Representation
Often a Graduate School and/or Graduate Faculty representative
If possible, they should be from fields that give them some understanding of the program(s) being reviewed
One or more off-campus external experts
Depends on scope of program(s) being reviewed
Will add to expense – honorarium plus travel expenses
Selection process can vary – programs can have input but should not make the final decision
Key Elements of a Successful Program Review
Review Team Report
Generally includes some form of analysis of the strengths, weaknesses, opportunities, and needs of the graduate program from the perspective of its peers
Should include recommendations for change and improvement
Key Elements of a Successful Program Review
Program Faculty’s Response
It is important to keep the program faculty informed about the findings and to give them a chance to comment on the evaluation
This gives the faculty a chance to correct any factual errors and to reply to any specific criticisms or recommendations
This also gives faculty a chance to outline their proposed actions as a result of the findings
Key Elements of a Successful Program Review
Implementation
The most important step in program review is not to produce the report but to implement its recommendations!
Turning recommendations into actions involves the following steps:
One or more meetings of key administrators (department, college, graduate school, and university administration) to discuss the recommendations
An action plan or memorandum of understanding drawn up and agreed on by all participants
Discussion of the recommendations with program faculty for implementation
Integration of the action plan into the institution’s long-range planning and budget process
Key Elements of a Successful Program Review
Follow Up
Since most improvements take time, it is essential to establish a procedure to monitor progress towards completion of the action plan
This is generally done at one- or two-year intervals
Discussion Questions
What other key elements are missing from the process of formal (external) reviews I described?
Discussion Questions continued
How many of your institutions have a graduate program review process similar to what was just described?
What are some of the variations that exist?
How often or what is the frequency of review – remember the words “continuous improvement”
Graduate Program Review at NC State – External Review
Until 2002–03, we basically followed the formal review process described
Beginning in 2002–03, we started to develop and implement a continuous and ongoing, outcome-based review process to complement our external reviews
Motivations For Change
Growing culture of program improvement on our campus – general education, undergraduate, graduate
Undergraduate Student Affairs had implemented an outcomes-based review program that was operational
SACS was just around the corner
SACS Principles of Accreditation
Core requirement #5: “The institution engages in ongoing, integrated, and institution-wide research-based planning and evaluation processes that incorporate a systematic review of programs and services that (a) results in continuing improvement and (b) demonstrates that the institution is effectively accomplishing its mission.”
SACS Criterion for Accreditation
Section 3 – Comprehensive Standards - #16: “The institution identifies outcomes for its educational programs and its administrative and educational support services; assesses whether it achieves these outcomes; and provides evidence of improvement based on analysis of those results.”
Questions We Began to Ask Ourselves
Does each of our degree programs have clearly defined outcomes?
Are they measurable?
Do our programs gather data to assess the achievement of program outcomes?
Do they use assessment results to improve programs?
Do we document that we use assessment results to improve programs?
Ultimate Question for NC State Became
How could we create a hybrid that combined the benefits of periodic program review and outcomes-based program assessment?
Accomplish administrative goals regarding evaluation of quality related to funding and institutional goals
Accomplish graduate school goals related to program improvement
The ultimate goal is to improve educational programs, not fill out reports to demonstrate accountability
Studying & Revising the Process
Graduate Dean appointed a Task Force
Made up of stakeholders
Relied on on-campus expertise
Focus groups with administrators, faculty, students, etc.
Could not utilize Undergraduate Program Review personnel – workload issue, new perspectives
Bottom Line – The opportunity for change is at the faculty level, so we want the process to address improvement at that level.
What We Decided to Do
Continue the traditional external review program on an 8-year schedule
Continue to partner with external reviews already conducted for accreditation or other purposes
Emphasize development of program-specific outcomes and assessment procedures to determine if they are being achieved
What We Decided to Do-continued
In addition to the external program review, we will require each program to:
Develop program-specific objectives and outcomes
Develop an assessment plan outlining the assessment activities they will conduct
Collect and analyze data on a regular basis
Complete biennial assessment reports that are submitted online
What We Decided to Do -continued
Provide the training and support necessary for programs to implement these changes
Phase I: Creating Assessment Plans
Identify diverse programs for pilot
Work with pilot programs to create assessment plans
Offer all DGPs workshops based on pilot materials
Provide support for creating assessment plans (individual work, workshops, online management tool)
Phase II: Implementing Assessment Plans
Identify at least one pilot program in each college
Work with programs to collect, analyze, and report data
Offer all DGPs workshops based on pilot materials
Provide support for implementing assessment plans
What We Decided to Do -continued
Increase efforts relative to follow-up after the graduate program review – assess progress on recommendations
Tie the annual assessment and biennial reports to the external review by incorporating the changes made as a result of assessment into the self-study
Emphasize an “Action Plan” agreed upon by University, Graduate School, College, and Department administration
What is Outcomes-Based Assessment?
It entails a shift in emphasis from inputs to outcomes
It is continuous rather than periodic
It involves regular reports of program assessment to the institution
Its results are used by the program and institution for gauging improvement and for planning
What is Outcomes-Based Assessment?
It is a process that engages program faculty in asking three questions about their programs:
What are our expectations for the program?
To what extent is our program meeting our expectations?
How can we improve our program to better meet our expectations?
It is a process that provides program faculty the means to answer these questions:
By creating objectives and outcomes for their program
By gathering and analyzing data to determine how well the program is meeting the objectives and outcomes
By applying the results of their assessment toward improving their program
Potential Benefit of Assessment Planning Process
It is a faculty-driven process that:
Gives faculty a voice in defining the program and thus a greater stake in the program
Gives faculty an investment in assessing the program
Provides faculty-approved indicators for gauging and improving the effectiveness of the program
What Are Objectives?
Program objectives are the general goals that define what it means to be an effective program.
Three Common Objectives
Developing students as successful professionals in the field
Developing students as effective researchers in the field
Maintaining/enhancing the overall quality of the program
What Are Outcomes?
Program outcomes are specific faculty expectations for each objective that define what the program needs to achieve in order to meet the objectives.
Example for Objective 1 – Professional Development
1. To enable students to develop as successful professionals for highly competitive positions in industry, government, and academic departments, the program aims to provide a variety of experiences that help students to:
a. achieve the highest level of expertise in XXXX, mastery of the knowledge in their fields and the ability to apply associated technologies to novel and emerging problems
b. present research to local, regional, national, and international audiences through publications in professional journals and conference papers given in a range of venues, from graduate seminars to professional meetings
c. participate in professional organizations, becoming members and attending meetings
d. broaden their professional foundations through activities such as teaching, internships, fellowships, and grant applications
Example for Objective 2 – Effective Researchers
2. To prepare students to conduct research effectively in XXXX in a collaborative environment, the program aims to offer a variety of educational experiences that are designed to develop in students the ability to:
a. read and review the literature in an area of study in such a way that reveals a comprehensive understanding of the literature
b. identify research questions/problems that are pertinent to a field of study and provide a focus for making a significant contribution to the field
c. gather, organize, analyze, and report data using a conceptual framework appropriate to the research question and the field of study
d. interpret research results in a way that adds to the understanding of the field of study and relates the findings to teaching and learning in science
Etc.
Example for Objective 3 – Quality of Program
3. To maintain and improve the program’s leadership position nationally and internationally, the program aims to:
a. continue to be nationally competitive by attracting high-quality students
b. provide effective mentoring that encourages students to graduate in a timely manner
c. place graduates in positions in industry and academics
d. maintain a nationally recognized faculty that is large enough and appropriately distributed across XXXX disciplines to offer students a wide range of fields of expertise
Four Questions for Creating an Assessment Plan
1. What types of data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?
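Before turning to each question in detail, a minimal sketch may make the plan concrete. The Python below is purely illustrative: the record fields and the example outcome labels, data types, and sources are our assumptions about how a program might tabulate its answers to the four questions, not an actual NC State schema or tool.

    from dataclasses import dataclass

    @dataclass
    class OutcomeAssessment:
        """One row of an assessment plan: the four questions answered for one outcome."""
        outcome: str           # which outcome is being assessed
        data_types: list[str]  # Q1: what types of data to gather
        sources: list[str]     # Q2: where the data come from
        frequency: str         # Q3: how often the data are collected
        report_year: int       # Q4: cycle year in which the data are analyzed/reported

    # Hypothetical entries, echoing the example outcomes shown earlier
    plan = [
        OutcomeAssessment(
            outcome="1b: students present research to professional audiences",
            data_types=["student CVs", "faculty activity reports"],
            sources=["students", "faculty"],
            frequency="annually",
            report_year=1,
        ),
        OutcomeAssessment(
            outcome="3b: students graduate in a timely manner",
            data_types=["time-to-degree statistics"],
            sources=["University Planning and Analysis"],
            frequency="annually",
            report_year=3,
        ),
    ]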
Types of Data Used
1. Take advantage of what you are already doing
Preliminary exams
Proposals
Theses and dissertations
Defenses
Student progress reports
Student course evaluations
Faculty activity reports
Student exit interviews
Types of Data Used
2. Use Resources of Graduate School and Your Institutional Analysis Group
Enrollment statistics
Time-to-degree statistics
Student exit data
Ten-year profile reports
Alumni surveys
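As a hedged illustration of the kind of figure such central offices supply, the toy computation below derives a median time-to-degree and an annual degree count from hypothetical (admit year, graduation year) records; the record layout is an assumption made only for this example.

    from statistics import median

    # Hypothetical records: (student, admit_year, graduation_year)
    records = [("A", 1998, 2004), ("B", 1999, 2004), ("C", 2000, 2005)]

    times = [grad - admit for _, admit, grad in records]
    print(f"Median time-to-degree: {median(times)} years")
    print(f"Degrees granted in 2004: {sum(1 for _, _, g in records if g == 2004)}")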
Types of Data Used
3. Use your imagination to find other types of data
Dollar amount of support for faculty
Student CVs
Faculty surveys
Data: Two Standards to Use in Identifying Data
1. Appropriateness: Data should provide information that is suitable for assessing the outcome
2. Accessibility: Data should be reasonably attainable (time, effort, ability, availability, resources)
Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?
Sources of Data
Students
Faculty
Graduate School
Graduate Program Directors
Department Heads
Registration and Records
University Information Technology
University Planning and Analysis
Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?
Frequency of Data Collection
Every semester
Annually
Biennially
When available from individual graduate students:
At the preliminary exam
At the defense
At graduation
Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?
Creating a Timeline for Analyzing Assessment Data
According to objective: year 1 – objective 1; year 2 – objective 2; year 3 – objective 3; year 4 – objective 1; etc. (3-year cycle; see the sketch after this list)
More pressing outcomes earlier and less pressing ones later
Outcomes easier to assess earlier and outcomes requiring more complex data gathering and analysis later
Approximately the same workload each year of the assessment cycle
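The objective-by-year rotation in the first option above reduces to simple modular arithmetic. A small sketch, illustrative only and assuming three objectives:

    def objective_for_year(year: int, n_objectives: int = 3) -> int:
        """Return which objective is analyzed in a given cycle year."""
        return (year - 1) % n_objectives + 1

    for year in range(1, 7):
        print(f"Year {year}: analyze objective {objective_for_year(year)}")
    # Year 1 -> objective 1, Year 2 -> 2, Year 3 -> 3, Year 4 -> 1, and so on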
Creating a Timeline for Reporting Assessment Data
Standard practice appears to be to call for a short annual or biennial assessment report
Longer cycles lose the impact of the continuous and ongoing nature of assessment
When possible, combine with a pre-existing external review program; including assessment reports as part of the self-study is recommended
Four Questions for Creating an Assessment Plan
1. What data should we gather for assessing outcomes?
2. What are the sources of the data?
3. How often are the data to be collected?
4. When do we analyze and report the data?
Questions to Guide Biennial Assessment Report
What outcomes did you plan to assess for the most recent reporting period?
What outcomes assessments were completed? What data did you collect and from what sources?
What did you learn about your program and/or your students from your assessments?
As a result of your assessment, what initiatives, if any, did you implement or do you propose to implement to address areas of concern? How will you measure the success of these initiatives?
What outcomes assessments are you planning for the upcoming reporting period?
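For programs that want a starting point, the sketch below turns these guiding questions into a fill-in report skeleton. It is a minimal illustration under our own assumptions; the function and the placeholder program name are hypothetical, not the Graduate School's actual online submission tool.

    QUESTIONS = [
        "What outcomes did you plan to assess for the most recent reporting period?",
        "What outcomes assessments were completed? What data did you collect and from what sources?",
        "What did you learn about your program and/or your students from your assessments?",
        "What initiatives, if any, did you implement or propose as a result, and how will you measure their success?",
        "What outcomes assessments are you planning for the upcoming reporting period?",
    ]

    def report_template(program: str) -> str:
        """Return a fill-in-the-blanks biennial assessment report for one program."""
        lines = [f"Biennial Assessment Report: {program}", ""]
        for i, question in enumerate(QUESTIONS, start=1):
            lines += [f"{i}. {question}", "   [response]", ""]
        return "\n".join(lines)

    print(report_template("XXXX doctoral program"))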
Training Workshops Provided
Graduate Program Review – Where we are, where we are headed, and why?
Assessing the Mission of Doctoral Research Universities (a workshop on outcomes-based assessment put on by outside experts)
Creating Outcomes and Objectives
Creating an Assessment Plan
Utilizing the Graduate School Managerial Tools
Developing an Institutional Database for Assessment of Graduate Programs – to be developed
Managerial Tools Created for Program Review
[Screenshots: website, profile data, and review document management pages]
Revised Review Process Implemented at NC State
Initial Year 1 (Start-Up)
•Development of objectives, outcomes, and assessment tools
•Identification of data sources and beginning of data collection
Cycle Year 2 (also 4 and 6)
•Ongoing assessment & self-study by graduate faculty
•Programmatic changes
•Brief biennial assessment report
Cycle Year 3 (also 5 and 7)
•Continued data collection pertinent to outcomes and assessment measures
Cycle Year 8 (program review)
•Self-study report
•External review
•Review report
•Program response
•Action plan
[Diagram also links the review cycle to Compact Initiatives]
What We Have Learned/ Discussion Points
The process of change takes time
We have been at this for almost four years (since the start of the Task Force) and have just started collecting the first biennial reports
Communication is the key to success
Clearly communicated goals and expectations are important
It is important to pilot assessment processes before taking them to all graduate programs
What We Have Learned/ Discussion Points continued
This kind of review process must be ground-up (faculty), not top-down (administration)
Even then, faculty may be skeptical about workload versus value – they must be able to see that the process is both meaningful and manageable
This kind of review process requires significant human resources
Training, data collection, analysis, interpretation, etc.
A key to our success is how much of this can be institutionalized
Discussion Questions
How many of your institutions have an outcomes-based graduate program review process?
How many of you are considering implementing such a review program?
What do your programs (in place or under consideration) look like?
What are some of the variations that exist across universities?
Discussion Questions continued
What kinds of faculty training have you provided? How successful is it?
What kinds of accountability have you instituted? If reports, how often are they due?
What are some of the problems you have encountered, or fear that you will encounter, in establishing outcomes-based assessment?
What has been the level of campus buy-in?