Rethinking UMBC’s Course Evaluation Mechanism: A Preliminary Report
Faculty Affairs Committee – Task Force on Course Evaluation
Marie desJardins, Kimberly Moffitt, Zhiyuan Chen, and Matt Baker
December 1, 2011
Executive Summary

Since at least 2001, UMBC has repeatedly discussed and debated improving and updating our course evaluation mechanism [Demorest 2003, Dillon 2008, UMBC UFRC 2010], but has not acted on any of the resulting recommendations. The SCEQ survey, which has been in place for over 30 years, has never been scientifically validated, and is widely regarded as an inadequate mechanism for evaluating teaching and course delivery. In tenure evaluation and other personnel decisions, inordinate weight is typically placed on a single question (Question 9, “How would you grade the overall effectiveness”). Moreover, the mechanisms for distributing, collecting, and analyzing SCEQ data are costly and problematic. The Faculty Affairs Committee has been asked to develop recommendations for a new course evaluation mechanism. In particular, we have identified three key questions to be answered:
1. Should UMBC move to a new course evaluation survey to replace the SCEQ questions, and if so, what instrument should be used?
2. Should UMBC consider a new mechanism for administering the course evaluations, and specifically, should we adopt an online-only or hybrid online-paper model rather than continuing the current paper-only model?
3. What other mechanisms, beyond end-of-semester student course evaluations, should departments be using for promotion decisions, ongoing faculty review, and continuous assessment and improvement of teaching and curriculum?
The first two questions are expanded and discussed in this report. The third question is outside the scope of this report, but the FAC is investigating best practices for teaching evaluation and will make recommendations to the Faculty Senate in Spring 2012. The purpose of this report is to familiarize senators and departments with some of the issues and research on course evaluation, and to suggest preliminary recommendations. Senators are asked to bring this report back to their departments and begin a constructive discussion about the best way for UMBC to establish standardized, useful methods for gathering and analyzing student viewpoints on the quality and effectiveness of teaching at UMBC.
To summarize our preliminary findings and recommendations:
1. By March 2012, UMBC should formally adopt a new course evaluation survey from among the existing validated course instruments that are available as commercial or open-source products. Our preliminary recommendation, based on our investigations and discussions to date, is that UMBC should adopt the SIR II survey created by the Educational Testing Service.
2. UMBC should create an online administration mechanism for the adopted course evaluation instrument, to be piloted in the 2012-13 academic year and made permanent and campus-wide effective in Fall 2013. Online vs. paper options, and ways to maximize student response rates, are discussed further in the body of the report.
Selecting a Course Evaluation Instrument

In May 2010, the UFRC made the following recommendation to Provost Hirshman:
We recommend that a committee be constituted to examine measures for teaching assessment and effectiveness that can be used across Colleges, measures that DP&TCs and Deans can refer to in a more consistent way in retention and promotion procedures…. There is almost universal agreement that the SCEQ survey is seriously flawed, yet substantial weight is given to SCEQ scores in the promotion cases. The UFRC is very concerned about the use, and misuse, of SCEQ scores in promotion and tenure proceedings. The language in departmental reports and Dean letters makes statements about the significance of score differences that we feel cannot be supported on either statistical grounds or legal grounds. [UMBC UFRC 2010]
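The UFRC’s statistical concern can be made concrete with a small illustration. The sketch below uses hypothetical ratings (the class sizes, means, and standard deviations are assumptions for illustration, not actual SCEQ data): with per-item standard deviations near 1 point on a 5-point scale and classes of about 25 students, the 95% confidence interval around a mean rating is roughly ±0.4, so differences of a few tenths of a point between instructors are well within noise.

```python
import math
import random

random.seed(1)  # reproducible illustration

def mean_ci(scores, z=1.96):
    """Mean of the ratings and the half-width of an approximate
    95% confidence interval (normal approximation)."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)
    return mean, z * math.sqrt(var / n)

# Two hypothetical instructors whose "true" ratings differ by 0.2 points
# on a 1-5 scale, each rated by a 25-student class.
class_a = [min(5, max(1, round(random.gauss(4.0, 1.0)))) for _ in range(25)]
class_b = [min(5, max(1, round(random.gauss(4.2, 1.0)))) for _ in range(25)]

for name, scores in [("A", class_a), ("B", class_b)]:
    m, hw = mean_ci(scores)
    print(f"Instructor {name}: mean rating {m:.2f} +/- {hw:.2f}")
```

With samples this small, the two confidence intervals will typically overlap heavily, so ranking the instructors on these means alone would not be statistically defensible; this is precisely the kind of over-interpretation the UFRC memo warns against.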
Raoul Arreola, in his book Developing a Comprehensive Faculty Evaluation System: A Guide to Designing, Building, and Operating Large-Scale Faculty Evaluation Systems, identifies the “homemade” nature of many surveys (as with the SCEQ) as a primary reason why faculty believe that student rating forms are not valid or reliable [Arreola 2007]. Nevertheless, extensive research has shown that student ratings can be a valid, reliable, and useful way to evaluate faculty instruction. Arreola surveys the relevant literature in this area, reaching the conclusion that “properly constructed, appropriately administered, and correctly interpreted student ratings can be valid and reliable measures indicating the quality of teaching.” Arreola does identify some areas in which the design and interpretation of student rating systems are especially critical. Upper-level courses are generally rated more favorably than lower-level courses, which should be taken into account when interpreting results. Similarly, required courses are rated lower than electives, and STEM courses are rated lower than non-STEM courses. As a rule, single general items (such as Question 9 on the SCEQ) should not be used as a sole measure of performance, since these items tend to correlate more highly with instructor gender, student status, and required vs. elective status than either specific items or subscale sets of questions. Some concerns that have been raised (at UMBC and elsewhere) do not appear to be supported by the research findings. In particular, for well-designed, specific questions and subscales, significant correlations have not been shown between student ratings and expected grades, instructor’s gender, class size, time of day, whether the student is majoring in the subject, or the rank/title of the instructor.1 It is clear that the existing SCEQ—particularly the emphasis on Question #9 for teaching evaluation—is neither well designed nor likely to be valid (in fact, to our knowledge, no effort has ever been made to scientifically validate the SCEQ or correlate it with desired instructional behaviors or student outcomes). However, developing and validating a new student rating instrument would be a time-consuming and challenging endeavor. We are therefore recommending that an existing validated instrument be adopted and used in place of the SCEQ as UMBC’s new student rating survey, effective in the Fall 2012 semester.

The Faculty Affairs task force that was charged with investigating course survey instruments considered several alternatives, listed below. Most are available from their providers in both paper and online formats and provide statistical analyses of the results; the SEEQ is a public-domain survey that could be adopted but would have to be analyzed and administered internally (or through an outside firm hired for that purpose). All are normed and validated based on previous student data; all provide subscale statistics (through grouped questions) as well as individual-question statistics. All permit the introduction of additional questions (either selected from a large set of optional items, or designed by the university; note that validity cannot be maintained if questions are dropped).
Samples of each of these surveys are provided at the end of this document.
1. Aleamoni Course-Instructor Evaluation Questionnaire (CIEQ)
• http://www.cieq.com/
• The website claims 500 clients (though it is not clear whether this refers to current clients or also includes past clients: the instrument has been in use for 40+ years, though it seems to have been updated at least once, in 1992). The list provided by Aleamoni includes many large public universities, including UMCP, UMass, UMichigan, and UNC.
• Committee members felt the wording of some questions (e.g., “Some things were NOT explained very well”) was overly informal and sometimes confusing.
• It is not clear whether results are provided in electronic (spreadsheet) format that could readily be used for further analysis.
1 Note, however, that with an improved survey instrument, and ready access to the data it provides, UMBC would be better able to determine whether, and to what extent, these and other factors do affect student ratings at UMBC specifically.
• It is not clear whether open-ended questions are included, or whether open-ended questions can be added.
2. IDEA Student Ratings of Instruction
• http://www.theideacenter.org/
• This instrument is based on “20 teaching methods and 12 learning objectives.” The survey includes items about “possible learning objectives” that seem to be irrelevant for many courses. The committee was generally unenthusiastic about this survey.
• It is not clear whether open-ended responses are possible.
• A list of clients is not provided, but the testimonials seem to be mostly from smaller regional schools. The website claims that the form is used in 200,000 classes at 340 colleges and universities.
• Results are provided in paper, PDF, and spreadsheet format.

3. SIR II (Student Instructional Report)
• http://www.ets.org/sir_ii/about
• Based on “eight dimensions of college instruction” (e.g., course organization and planning, faculty communication, faculty/student interaction).
• Students can write comments that are (anonymously) forwarded to the instructor.
• No list of clients is provided, but the website indicates that they have “comparative data from nearly one million students in 107,000 two-year and more than 117,000 four-year courses nationwide.”
• Results are provided in paper and electronic reports, including subscale and item-level statistics per course and across the institution. Extensive comparative data from other doctoral-granting universities is available, broken down by instructor rank; faculty status (full/part time); class format, size, and level; and department/discipline.
4. SEEQ (Students’ Evaluations of Educational Quality)
• http://onlinelibrary.wiley.com/doi/10.1111/j.2044-8279.1982.tb02505.x/abstract
• Based on nine components of teaching effectiveness.
• The SEEQ is quite long, and the wording of some of the questions is rather odd. Some of the questions in the “breadth” topic may not apply to some classes/disciplines.
• Adopted by a number of universities (e.g., University of Wisconsin Whitewater, University of Saskatchewan, Fordham, University of Manitoba), though no definitive list is available. (One source indicates that the SEEQ “has been used by more than a million students in 50,000 courses worldwide” [MAU 2011].)
• Public domain (i.e., the SEEQ is not copyrighted and is freely available, but no support is provided for administering and analyzing the data, and no current comparative/normed data is available).
• Citations: [Marsh 1982, Coffey & Gibbs 2001]

5. Instructional Assessment System (IAS)
• http://www.washington.edu/oea/services/course_eval/index.html
• Developed and in use at the University of Washington for 20+ years, and also used at 60 other institutions.
• Specialized forms for different instructional formats.
• Certain questions (esp. 2, 3, 13) seem oddly worded and potentially difficult for students to interpret.
• Various report formats are available, using institutional norms and item-level responses. (Grouped subscales do not appear to be provided.)
• Both paper and online forms are available. Pricing varies; the mechanism and rates for the online option are not clear.
Administering the Course Evaluation Survey

As of 2005, approximately 33% of colleges and universities administered online surveys for all or some of their courses [Anderson et al. 2006]. This number was climbing rapidly at that time, so the percentage is likely higher today, though we were not able to find national statistics. A recent survey of our institutional peers showed that four of them use paper surveys, three use online surveys exclusively, and three use a combination of both. Of the aspirational peers on which we were able to obtain data, two use paper-based surveys and two use online surveys.

Response Rate. Concerns about potentially lower response rates are legitimate, but there is no evidence that the actual ratings from online surveys are statistically different from those of paper surveys [Arreola 2007, Carini et al. 2003]. Nulty [2008] surveyed eight published studies and concluded that, on average, the response rate for online surveys was 33%, compared to 56% for paper surveys.2 Nulty’s main conclusion, though, is that the more steps that are taken to increase the response rate, the higher it will be. Among the reported surveys, the university that used repeated reminder emails to both students and faculty, as well as prizes for respondents awarded through a lottery, had the highest response rate (47%). An NJIT report indicates that
Columbia has experienced response rates of 85% or more in their web-based course evaluations. The success is due to a combination of technology-mediated communications, incentive packages, and internal marketing strategies. The Columbia system allows them to monitor response rates during the survey administration period and target emails to both faculty and students where incremental urging is required. Columbia also provides incentives: palm pilot give-aways, pizza parties associated with completing surveys in a designated computer lab.
2 In the one study that Nulty provides for online courses, there was no difference between paper and online response rates, so it appears that it is the face-to-face administration of the survey that makes the difference, rather than the mode of response.
NJIT itself reported that in the three years since implementing online course evaluations in distance-learning courses, response rates had risen from 47% to 61%, whereas face-to-face response rates had remained roughly stable at 60%. (Another interesting observation is that UCLA’s medical school achieves a 100% response rate because students receive incomplete grades if they do not complete the evaluations.) UMBC’s pilot study of online course evaluations, conducted in Summer 2001 and Spring 2002, found a quite low response rate (roughly 25%, vs. 72% for the same courses the previous year) – but the mean ratings were not statistically different from the paper-based ratings. Moreover, it is not clear how well publicized that study was, or exactly how faculty asked the students to participate. It is quite clear from the literature that familiarity and experience, systematic campus-wide reminders, publicity and encouragement from the university and student organizations, and incentives will all increase the response rate significantly. Another issue to consider is that students spend significantly more time online now (and are significantly more likely to have a personal computer) than they did in 2002. Students are more comfortable with online interactions, and the use of technology is no longer in and of itself a barrier to student participation in online course evaluations. UMBC’s Assured Access Initiative, which went into effect in Fall 2001, ensures that all of our students have ready access to a computer, either their own or those available in the library and engineering laboratories. In fact, Donovan et al. [2007] found that most students, when offered a choice, preferred the online option.
Some of the reasons given for this preference are increased privacy, the ability to spend more time on the survey, a belief that students who answer surveys online will be more truthful (as opposed to rushing to complete the survey in order to leave class), and a dislike of “bubble sheets.” Students who did not prefer online surveys expressed concerns about security, concerns about the professor being able to see the results before grades are submitted, and a belief that students would be more likely to complete the survey if it were administered in class. Johnson [2002] found that the percentage of students who gave written comments in the open-ended sections of the course survey increased significantly, from 10% for paper-based surveys to 63% for online surveys. Interestingly, this finding is the reverse of the overall reduction in response rate with online surveys, and may indicate that students are more likely to provide thoughtful, detailed feedback when given the time and opportunity through an online survey.

Mechanisms for Increasing Response Rate. By far the most common mechanism mentioned in the research literature for increasing response rate is to limit early access to grades to students who have completed the course surveys. We strongly recommend that this incentive be adopted at UMBC. The other extremely important factor has to do with communication: Johnson found that response rate increased
from 20% (for professors who didn’t mention the online survey at all) to 32% when students were encouraged, but not explicitly assigned, to complete the survey. The response rate climbed again, to 77%, when students were explicitly assigned to complete the survey (although no points were given and the instructor did not actually have access to the responses). Instructors who did give points for completing the survey (again, without access to the responses themselves) increased the response rate in their classes to 87%. Faculty engagement and student engagement are two of the most essential components of a successful course evaluation process. Regardless of the mechanism or process, students are more likely to complete a survey, and more likely to be truthful and forthcoming in their responses, if they perceive that the instructor, department, and administration care about the student feedback—and that they will use the feedback for course and instructor mentoring and improvement. While no systematic study of student perception in this area has been done at UMBC, anecdotal discussions with students during the writing of this report suggest a general belief among students that few if any faculty or departments actually read or care about the survey responses. If this is not the case—i.e., if we do care (beyond just having some data to “plug into” the P&T process)—then we should make a point of communicating to students the ways in which their responses are used to work toward continual improvement of our educational programs.

Cost. Based on the analysis that has previously been done of the SCEQ administration, it appears that the cost of printing, distributing, scanning, and analyzing the SCEQ data is on the order of $30/section (the estimate in the OIR report is $80K for administration to 2700 course sections).
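As a rough check on the quoted figures, the per-section costs can be reproduced with a short calculation (a sketch only; the 28-student average section size and the per-student rates are those quoted in the SIR II pricing discussion below, and actual negotiated pricing could differ):

```python
# Per-section survey costs implied by the figures quoted in this report.
SECTION_SIZE = 28  # assumed average section size, per the SIR II pricing note

def per_section_cost(per_student_rate, students=SECTION_SIZE):
    """Cost of surveying one section at a flat per-student rate."""
    return per_student_rate * students

sceq = 80_000 / 2_700                 # OIR estimate: $80K across 2,700 sections
sir2_online = per_section_cost(0.90)  # SIR II online: $0.90 per student
sir2_paper = per_section_cost(1.40)   # SIR II paper: $1.40 per student

print(f"SCEQ (current paper process): ${sceq:.2f}/section")
print(f"SIR II online:                ${sir2_online:.2f}/section")
print(f"SIR II paper:                 ${sir2_paper:.2f}/section")
```

These reproduce the roughly $30, $26, and $40 per-section figures cited in this section, with the IS department’s $15/section outside-vendor arrangement as a further comparison point.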
The IS department, which uses an outside provider to administer its course surveys, pays approximately $15/section.3 The SIR II pricing information on the website indicates that online surveys cost 90 cents per student, for an average of $26 per 28-student section. (Paper surveys are approximately $1.40 per student, or $40 per section, if the same reports as those available with the online forms are included. This pricing information does not include any potential discounts that could be negotiated given the size of our student population.)

Timeline. We expect that this report will generate lively and constructive discussion on the topic of course surveys among the faculty and administration. Our intended timeline for the conversation and next steps is as follows:
• December 2011 – Distribute preliminary report and begin discussion.
3 The IS survey is a non-validated adaptation of the SCEQ survey, so the price that the outside vendor charges does not include any licensing fee for the survey instrument itself, only the collection and provision of data through an external website. Our proposal would be that the campus-wide online surveys be administered through SA but processed at the commercial provider, to permit ready access (and connection with the UMBC Alert and password system) but maintain privacy and anonymity of student responses.
• February 2012 – Continue discussion at the February Faculty Senate meeting and request concrete inputs and suggestions.
• March 2012 – Present motions for the adoption of a new course survey instrument, and for the adoption of a new survey administration mechanism.
• April 2012 – November 2012 – Conversion period during which logistics, pricing, and processes for administering the new survey will be finalized by the appropriate units.
• December 2012 – First administration of the new survey (potentially in paper and online form by different courses/departments).
• Fall 2013 – Full adoption of online surveys (potentially with an option to continue with a paper-based model on a departmental level).
Transition Process. For junior faculty moving towards promotion, who will have some SCEQ results and some results under the new system, we recommend that departments be directed to explicitly address how they are interpreting the old and new evaluations (and in particular, that they be directed not to combine ratings from the old and new systems, which may not be directly comparable). Because we recognize that some faculty and departments are especially reluctant to move to an online system, we suggest that in the first year of the transition to a new survey instrument, departments be given the option of paper or online surveys. The choice of whether to use paper or online surveys could be made at the departmental or the individual level. After the first year, we recommend that departments choosing to continue with paper surveys be asked to bear the additional cost of that administration method, though they should retain the option to do so.
References

Joan Anderson, Gary Brown, and Stephen Spaeth, “Online Student Evaluations and Response Rates Reconsidered,” Journal of Online Education 2(6), August/September 2006.

Raoul A. Arreola, Developing a Comprehensive Faculty Evaluation System: A Guide to Designing, Building, and Operating Large-Scale Faculty Evaluation Systems (3/e). Anker Publishing Company, 2007.

Robert M. Carini, John C. Hayek, George D. Kuh, John M. Kennedy, and Judith A. Ouimet, “College Student Responses to Web and Paper Surveys: Does Mode Matter?” Research in Higher Education 44(1): 1-19, February 2003.

Martin Coffey and Graham Gibbs, “The Evaluation of the Student Evaluation of Educational Quality Questionnaire (SEEQ) in UK Higher Education,” Assessment & Evaluation in Higher Education 26(1): 89-93, doi 10.1080/02602930020022318, 2001.

Marilyn Demorest, “Report: Online SCEQ Project,” UMBC internal memorandum, November 2003.

Michael Dillon, “Administration of Student Course Evaluations,” UMBC internal memorandum, September 2008.

Judy Donovan, Cynthia Mader, and John Shinsky, “Online vs. Traditional Course Evaluation Formats: Student Perceptions,” Journal of Interactive Online Learning 6(3): 158-180, Winter 2007.

Trav Johnson, “Online Student Ratings: Will Students Respond?” Conference of the American Educational Research Association, New Orleans, 2002.

H. W. Marsh, “SEEQ: A Reliable, Valid, and Useful Instrument for Collecting Students’ Evaluations of University Teaching,” British Journal of Educational Psychology 52: 77-95, doi 10.1111/j.2044-8279.1982.tb02505.x, February 1982.

Mount Allison University, “Teaching & Learning: Seeking Useful Feedback from Students,” http://www.mta.ca/pctc/TONI_SEEQ/abbrev_history.htm, accessed November 2011.

New Jersey Institute of Technology, “On-Line Course Evaluation” (white paper), http://www.njit.edu/about/pdf/online-student-evaluation-supportingmaterial-2008-10-13.pdf (accessed November 2011), October 2008.

Duncan D. Nulty, “The Adequacy of Response Rates to Online and Paper Surveys: What Can Be Done?” Assessment & Evaluation in Higher Education 33(3): 301-314, June 2008.

UMBC UFRC, “Recommendations from the UFRC,” UMBC internal memorandum, May 2010.
SURVEY FORM – STUDENT REACTIONS TO INSTRUCTION AND COURSES
(Sample IDEA survey form. Copyright © IDEA Center, 1998. Form TF5903, printed in U.S.A. Each item is answered on a 1–5 bubble scale.)

Institution: ____  Course Number: ____  Instructor: ____  Time and Days Class Meets: ____

Your thoughtful answers to these questions will provide helpful information to your instructor.

The Instructor: Describe the frequency of your instructor’s teaching procedures, using the following code: 1 = Hardly Ever, 2 = Occasionally, 3 = Sometimes, 4 = Frequently, 5 = Almost Always.

1. Displayed a personal interest in students and their learning
2. Found ways to help students answer their own questions
3. Scheduled course work (class activities, tests, projects) in ways which encouraged students to stay up-to-date in their work
4. Demonstrated the importance and significance of the subject matter
5. Formed “teams” or “discussion groups” to facilitate learning
6. Made it clear how each topic fit into the course
7. Explained the reasons for criticisms of students’ academic performance
8. Stimulated students to intellectual effort beyond that required by most courses
9. Encouraged students to use multiple resources (e.g., data banks, library holdings, outside experts) to improve understanding
10. Explained course material clearly and concisely
11. Related course material to real life situations
12. Gave tests, projects, etc. that covered the most important points of the course
13. Introduced stimulating ideas about the subject
14. Involved students in “hands on” projects such as research, case studies, or “real life” activities
15. Inspired students to set and achieve goals which really challenged them
16. Asked students to share ideas and experiences with others whose backgrounds and viewpoints differ from their own
17. Provided timely and frequent feedback on tests, reports, projects, etc. to help students improve
18. Asked students to help each other understand ideas or concepts
19. Gave projects, tests, or assignments that required original or creative thinking
20. Encouraged student-faculty interaction outside of class (office visits, phone calls, e-mail, etc.)

Twelve possible learning objectives are listed below, not all of which will be relevant in this class. Describe the amount of progress you made on each (even those not pursued in this class) by using the following scale: 1 = No apparent progress; 2 = Slight progress (I made small gains on this objective); 3 = Moderate progress (I made some gains on this objective); 4 = Substantial progress (I made large gains on this objective); 5 = Exceptional progress (I made outstanding gains on this objective).

Progress on:
21. Gaining factual knowledge (terminology, classifications, methods, trends)
22. Learning fundamental principles, generalizations, or theories
23. Learning to apply course material (to improve thinking, problem solving, and decisions)
24. Developing specific skills, competencies, and points of view needed by professionals in the field most closely related to this course
25. Acquiring skills in working with others as a member of a team
26. Developing creative capacities (writing, inventing, designing, performing in art, music, drama, etc.)
27. Gaining a broader understanding and appreciation of intellectual/cultural activity (music, science, literature, etc.)
28. Developing skill in expressing myself orally or in writing
29. Learning how to find and use resources for answering questions or solving problems
30. Developing a clearer understanding of, and commitment to, personal values
31. Learning to analyze and critically evaluate ideas, arguments, and points of view
32. Acquiring an interest in learning more by asking my own questions and seeking answers

The Course: On the next three items, compare this course with others you have taken at this institution, using the following code: 1 = Much Less than Most Courses, 2 = Less than Most Courses, 3 = About Average, 4 = More than Most Courses, 5 = Much More than Most Courses.

33. Amount of reading
34. Amount of work in other (non-reading) assignments
35. Difficulty of subject matter

Describe your attitudes and behavior in this course, using the following code: 1 = Definitely False, 2 = More False Than True, 3 = In Between, 4 = More True Than False, 5 = Definitely True.

36. I had a strong desire to take this course.
37. I worked harder on this course than on most courses I have taken.
38. I really wanted to take a course from this instructor.
39. I really wanted to take this course regardless of who taught it.
40. As a result of taking this course, I have more positive feelings toward this field of study.
41. Overall, I rate this instructor an excellent teacher.
42. Overall, I rate this course as excellent.

For the following items, blacken the space which best corresponds to your judgment: 1 = Definitely False, 2 = More False Than True, 3 = In Between, 4 = More True Than False, 5 = Definitely True.

43. As a rule, I put forth more effort than other students on academic work.
44. The instructor used a variety of methods--not only tests--to evaluate student progress on course objectives.
45. The instructor expected students to take their share of responsibility for learning.
46. The instructor had high achievement standards in this class.
47. The instructor used educational technology (e.g., Internet, e-mail, computer exercises, multi-media presentations, etc.) to promote learning.

EXTRA QUESTIONS: If your instructor has extra questions, answer them in the space designated below (questions 48-67).

Comments: Use the space below for comments (unless otherwise directed). Note: Your written comments may be returned to the instructor. You may want to PRINT to protect your anonymity.
Student Evaluation of Educational Quality (SEEQ) Developed by Dr. Herbert W. Marsh
Paper Version Please read each question very carefully. Make sure you understand what is being asked. Use this scale and circle the number that is closest to your rating for that item: Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 Not applicable LEARNING You find the course intellectually challenging and stimulating. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 You have learned something which you consider valuable. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 Your interest in the subject has increased as a consequence of this course. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 You have learned and understood the subject materials in this course. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 ENTHUSIASM Instructor is enthusiastic about teaching the course. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 Instructor is dynamic and energetic in conducting the course. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 Instructor enhances presentations with the use of humor. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5 Instructor's style of presentation holds your interest during class. Very Poor Poor Moderate Good Very Good N/A 1 2 3 4 5
ORGANIZATION
Instructor's explanations are clear.
Course materials are well prepared and carefully explained.
Proposed objectives agree with those actually taught, so you know where the course is going.
Instructor gives lectures that facilitate taking notes.

GROUP INTERACTION
Students are encouraged to participate in class discussions.
Students are invited to share their ideas and knowledge.
Students are encouraged to ask questions and are given meaningful answers.
Students are encouraged to express their own ideas and/or question the instructor.

INDIVIDUAL RAPPORT
Instructor is friendly towards individual students.
Instructor makes students feel welcome in seeking help/advice in or outside of class.
Instructor has a genuine interest in individual students.
Instructor is adequately accessible to students during office hours or after class.

BREADTH
Instructor contrasts the implications of various theories.
Instructor presents the background or origin of ideas/concepts developed in class.
Instructor presents points of view other than his/her own when appropriate.
Instructor adequately discusses current developments in the field.

EXAMINATIONS
Feedback on examinations/graded materials is valuable.
Methods of evaluating student work are fair and appropriate.
Examinations/graded materials test course content as emphasized by the instructor.

ASSIGNMENTS
Required readings/texts are valuable.
Readings, homework, etc., contribute to appreciation and understanding of the subject.
OVERALL
How does this course compare with other courses you have had at Mount Allison?
How does this instructor compare with other instructors you have had at Mount Allison?
Do you have any comments to add about your OVERALL EVALUATION of the course, especially with respect to items not mentioned in the above questions?

STUDENT AND COURSE CHARACTERISTICS
Please circle your answer to each question.
Course difficulty, relative to other courses, is: very easy / easy / medium / hard / very hard
Course workload, relative to other courses, is: very light / light / medium / heavy / very heavy
Course pace, relative to other courses, is: too slow / slow / about right / fast / too fast
Hours per week required outside of class: 0 to 2 / 3 to 5 / 6 to 8 / 9 to 11 / over 11
Your level of interest in the subject prior to this course: very low / low / medium / high / very high
Your overall grade point average at MtA: below 2.5 / 2.5-2.9 / 3.0-3.4 / 3.5-3.7 / above 3.7
Your expected grade in the course: A / B / C / D / F
Your reason for taking the course: required / distribution credit / elective / personal interest
Your year in school: 1 / 2 / 3 / 4
Instructional Assessment System (IAS) - Form A
Instructor: ____   Course: ____   Date: ____   Section: ____

Fill in bubbles darkly and completely. Erase errors cleanly.
Completion of this questionnaire is voluntary. You are free to leave some or all questions unanswered.

Items 1-22 are rated on this scale: Excellent / Very Good / Good / Fair / Poor / Very Poor
1. The course as a whole was:
2. The course content was:
3. The instructor's contribution to the course was:
4. The instructor's effectiveness in teaching the subject matter was:
5. Course organization was:
6. Clarity of instructor's voice was:
7. Explanations by instructor were:
8. Instructor's ability to present alternative explanations when needed was:
9. Instructor's use of examples and illustrations was:
10. Quality of questions or problems raised by instructor was:
11. Student confidence in instructor's knowledge was:
12. Instructor's enthusiasm was:
13. Encouragement given students to express themselves was:
14. Answers to student questions were:
15. Availability of extra help when needed was:
16. Use of class time was:
17. Instructor's interest in whether students learned was:
18. Amount you learned in the course was:
19. Relevance and usefulness of course content were:
20. Evaluative and grading techniques (tests, papers, projects, etc.) were:
21. Reasonableness of assigned work was:
22. Clarity of student responsibilities and requirements was:

Items 23-27 are rated relative to other college courses you have taken: Much Higher / Average / Much Lower
23. Do you expect your grade in this course to be:
24. The intellectual challenge presented was:
25. The amount of effort you put into this course was:
26. The amount of effort to succeed in this course was:
27. Your involvement in this course (doing assignments, attending classes, etc.) was:

28. On average, how many hours per week have you spent on this course, including attending classes, doing readings, reviewing notes, writing papers, and any other course-related work?
(Under 2 / 2-3 / 4-5 / 6-7 / 8-9 / 10-11 / 12-13 / 14-15 / 16-17 / 18-19 / 20-21 / 22 or more)
29. From the total average hours above, how many do you consider were valuable in advancing your education?
(Same hour ranges as question 28.)
30. What grade do you expect in this course?
(A (3.9-4.0) / A- (3.5-3.8) / B+ (3.2-3.4) / B (2.9-3.1) / B- (2.5-2.8) / C+ (2.2-2.4) / C (1.9-2.1) / C- (1.5-1.8) / D+ (1.2-1.4) / D (0.9-1.1) / D- (0.7-0.8) / E (0.0) / Pass / Credit / No Credit)
31. In regard to your academic program, is this course best described as:
(In your major? / In your minor? / A distribution requirement? / A program requirement? / An elective? / Other?)

ADDITIONAL ITEMS (use only if directed): extra items are answered on an A-G scale.

©1995, University of Washington - Office of Educational Assessment