Update on ABET Assessment Work
Dr. Wasfi G. Al-Khatib
Information and Computer Science Department
Emerging Information Regarding ABET Assessment
• Two major workshops related to ABET accreditation were given in the past two months.
• Major findings from these workshops:
– The assessment process must be useful for assessing program outcomes, yet simple enough to be carried out on a regular basis.
– ABET does NOT require:
• Course-level learning-outcome evaluation, although KFUPM as a school may require it (see the PAC guidelines).
• Mapping each course learning outcome to a program outcome.
Current Direct Measure
• The direct measure based on course learning outcomes, as advocated earlier by the ABET steering committee, raises the following concerns:
– ABET strongly discourages the use of course grades for program-outcome evaluation (guess why?).
– The system is complex and may not sustain periodic evaluation, which is essential for this and subsequent ABET visits (with all those weighted averages!).
– Individual course evaluations are not enough to substantiate the claim that students have fulfilled the program outcomes.
• Q: What student work is available for the evaluators to verify our achievement claims when they come to visit us?
• A: Only best and worst samples.
What to do??!!
• Introduce and use “rubrics” in addition to the current direct measure.
– A rubric is a descriptive rating scale with several distinct, observable levels of performance for each performance criterion being assessed.
– Each performance level is described and assigned a numeric score.
• For example, 4 = exemplary, 3 = good, 2 = marginal, 1 = unacceptable.
• Why 4 and not 5? To keep it simple and reduce variation in evaluation.
– Each performance level describes the level of cognition or skill the outcome requires in order for student work to be classified at that level.
• A rubric score is a program-outcome measure, not a grade!
– Reporting the percentage of students who score at each level (or an average, for example) provides data that are linked directly to the intended program outcome and focuses the evaluation on finding strategies for improvement, if needed.
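The per-level percentages and cohort average described above can be sketched in a few lines of Python. The scores and level labels below are made-up illustration data, not actual department results:

```python
from collections import Counter

# Hypothetical rubric scores (1-4) collected for one program outcome
# across a cohort of student reports.
scores = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]

levels = {4: "exemplary", 3: "good", 2: "marginal", 1: "unacceptable"}

counts = Counter(scores)
total = len(scores)

# Percent of students scoring at each performance level.
for level in sorted(levels, reverse=True):
    pct = 100 * counts[level] / total
    print(f"{level} ({levels[level]}): {pct:.0f}%")

# Cohort average for the outcome.
average = sum(scores) / total
print(f"average: {average:.2f}")
```

Both views are useful: the per-level breakdown shows how many students fall below the acceptable threshold, while the average gives a single trend number to track across semesters.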
Rubrics for Computer Science
• Senior Project option:
– Senior Project:
• Final report
• Final presentation
– Summer Training
– Maybe the database course
• Coop option:
– Coop course:
• Final report
• Employer input
– Maybe the database course
Rubric Example from Computer Science (Early Draft)
Program Outcome: Modeling

4 points:
• Overall architecture is present and well defined.
• The design of the system, broken into modules, is present and consistent.
• Details of each module are present and seem accurate.

3 points:
• Overall architecture is present and well defined.
• The design of the system, broken into modules, is present and mostly consistent.
• Details of up to 50% of the modules are missing, but those present seem accurate.

2 points:
• Overall architecture is present and well defined.
• The design of the system, broken into modules, is present but incomplete or inconsistent.
• Details of the modules seem neither accurate nor complete.

1 point:
• The overall architecture may be present, but the design of the system is either missing or lacking major components, with little or no detail.
Rubric Example from Computer Science (Early Draft)
Program Outcome: Written Communication

4 points:
• Report is well organized.
• Report is grammatically sound.
• Report is spell-checked.

3 points:
• Report is well organized.
• Report is mostly grammatically sound.
• Some words are misspelled.

2 points:
• Report is well organized.
• Report contains major grammatical mistakes and/or some misspelled words.

1 point:
• Report is not well organized.
Rubric Example from Computer Science (Early Draft)
Program Outcome: Oral Communication

4 points:
• Clear delivery.
• Well-formed sentences.
• Confident.
• Maintains proper eye contact.

3 points:
• Clear delivery.
• Most sentences are well formed.
• Confident.
• Maintains proper eye contact.

2 points:
• Delivery is not as clear.
• No more than 50% of the sentences are ill-formed.
• Confident.

1 point:
• Delivery is not clear and/or many sentences are ill-formed.
Beyond Rubric-Attached Courses
• Employer evaluation, in the form of a questionnaire, is an excellent direct measure for program outcomes that do not map easily to student work. For example:
– Ability to design a system/component to meet desired needs
– Oral communication
– Teamwork and professional/ethical responsibility
– Contemporary issues
• This can apply to Coop, Summer Training, and the Senior Project.
• This means we need to change the current employer/senior-project evaluation forms.
Beyond Rubric-Attached Courses
• Current Summer Training/Coop Form
Conclusions
• The department has to come up with:
– A set of courses for which a group of faculty members will perform program evaluation.
• Senior Project, Coop, Summer Training, and [maybe] at most two other junior/senior-level courses.
• This will remove the reliance on “grades”.
– A set of rubrics for each “student work”.
• This will standardize, for each department, the scale on which program outcomes are measured.
• This will reduce the variation in evaluation across faculty members.
• Remember that the student reports and employer questionnaires used can be displayed for each student, for the visiting team to examine and evaluate.
– This holds whether you choose to assess all senior projects or take a random sample.
• This needs to be done every semester.
– A standing committee has been established in each department to carry out the assessment every semester.
• We have to keep the current direct measure.
– We will evaluate it against the rubrics and possibly keep the more appropriate direct measure for future assessments.
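The random-sample option mentioned in the conclusions can be sketched as follows. The project IDs and sample size here are hypothetical; in practice the IDs would come from the department's records for that semester:

```python
import random

# Hypothetical senior-project IDs for one semester.
projects = [f"SP-{i:03d}" for i in range(1, 25)]

def pick_sample(items, k, seed=None):
    """Return a random sample of k items, or all items if k >= len(items)."""
    if k >= len(items):
        return list(items)
    rng = random.Random(seed)
    return rng.sample(items, k)

sample = pick_sample(projects, 6, seed=2024)
print(sample)
```

Fixing a seed makes the draw reproducible, so the standing committee can document exactly which projects were selected for a given semester's assessment.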
Thank You