
SIGNALS AND SYSTEMS ASSESSMENT: COMPARISON OF RESPONSES TO MULTIPLE-CHOICE CONCEPTUAL QUESTIONS AND OPEN-ENDED FINAL EXAM PROBLEMS

Kathleen E. Wage
George Mason Univ., ECE Department, Fairfax, VA 22030 USA
[email protected]

John R. Buck
Univ. of Massachusetts Dartmouth, ECE Department, N. Dartmouth, MA 02747 USA
[email protected]

Margret A. Hjalmarson
George Mason Univ., Graduate School of Education, Fairfax, VA 22030 USA
[email protected]

Jill K. Nelson
George Mason Univ., ECE Department, Fairfax, VA 22030 USA
[email protected]

    ABSTRACT

The validity of the Signals and Systems Concept Inventory (SSCI) was evaluated by comparing students' performance on the SSCI to open-ended final exam problems. An assessment instrument is said to be valid to the extent that it measures what it was designed to measure. The SSCI was designed to measure students' understanding of core concepts in undergraduate signals and systems (S&S) courses through 25 multiple-choice questions. The SSCI scores and final exam scores for more than 150 students in four sections of S&S at two schools were found to have a statistically significant correlation. A more detailed analysis was conducted with a pool of over 60 students at both schools. This second analysis compared detailed coding of students' responses on the final exam problems to their answers for specific SSCI questions assessing the same topic. This analysis found statistically significant correlations between SSCI questions and final exam problems for some convolution and Fourier transform problems. Results were mixed for the problem on Bode plots.

Index Terms: Assessment, signals and systems, concept inventory

    1. INTRODUCTION

Engineering problem solving requires both conceptual and procedural knowledge. For example, signals and systems students must understand both the fundamental concept of convolution and the procedure for computing the convolution integral. Student understanding of convolution and other important topics can be assessed in various ways. One way to test conceptual understanding is to administer a concept inventory, which is a standardized multiple-choice exam. A concept inventory (CI) is designed so that the incorrect choices (distractors) represent common misconceptions. Ideally, CIs emphasize conceptual understanding rather than rote calculation. Procedural knowledge, such as how to compute a convolution integral, is often measured using problem-solving exercises. Analysis of student responses to open-ended problems reveals whether they understand how to implement the convolution operation.

This study compares student responses to a concept inventory with their responses to open-ended final examination questions for a signals and systems course. A key motivation for this work is the need to validate the concept inventory. An exam is said to be valid if it measures what it was intended to measure [1]. There are a number of different aspects of validity; see the article by Moskal et al. for a summary [2]. This study investigates the content validity of the questions by examining whether student responses to the inventory accurately reflect their understanding of the underlying concept. It also looks at criterion-related evidence for validity by correlating the inventory scores with other measures, such as final exam scores. Steif et al.'s analysis of the Statics Concept Inventory is an example of the type of validation study required [1, 3].

(Work funded by NSF Grants DUE-0512430 and DUE-0512636.)

The focus of this study is the Signals and Systems Concept Inventory (SSCI) [4]. The SSCI is a 25-question exam designed to test knowledge of the core concepts in the undergraduate signals and systems curriculum taught in electrical and computer engineering. Development of the SSCI began in 2001, and as of 2010, 30 instructors have given the SSCI to more than 2600 students. The project website (http://signals-and-systems.org) provides a complete list of SSCI-related publications. Instructors can request a password to access the latest copies of the inventory. This paper is a follow-on to an initial study presented at the Frontiers in Education Conference in 2007 [5]. The 2007 study compared the SSCI and final examination results for a single class of students at one university. The present study analyzes a larger data set obtained from four classes at two universities.

The rest of the paper is organized as follows. Section 2 describes the data set and provides relevant details about the courses where data were collected. Section 3 examines the correlation between students' SSCI scores and their scores on the final examination. Following that, Section 4 compares student responses to three open-ended final exam problems with their responses to related questions on the SSCI. Section 5 summarizes the paper.

    2. DATA SET

This study focuses on data from four undergraduate signals and systems classes taught at George Mason University (GMU) and the University of Massachusetts Dartmouth (UMD) between 2006 and 2009. Table 1 summarizes the course information, number of students, instructor, and textbook for each of the classes in the data set. Three of the classes were sections of ECE 220 taught by the first author at GMU in three different semesters. The remaining class was ECE 321, taught by the second author at UMD. Both ECE 220 and ECE 321 focus on continuous-time linear signals and systems. ECE 220 is open to both sophomores and juniors, and is often taken concurrently with courses in circuits and differential equations. ECE 321 is taken primarily by second-term juniors who have already taken a discrete-time signals and systems course, two semesters of circuits courses, and a differential equations course.

Both ECE 220 and ECE 321 were scheduled for two 75-minute lectures per week. ECE 220 also had one 50-minute recitation and one 100-minute laboratory session each week.

Table 1. Summary of classes in the data set.

Class   Course        N    Instructor   Textbook
1       GMU ECE 220   40   Wage         [8]
2       GMU ECE 220   39   Wage         [8]
3       GMU ECE 220   49   Wage         [9]
4       UMD ECE 321   26   Buck         [9]

Table 2. SSCI results for the four classes.

Class   SSCI version   Pre-test mean (std)   Post-test mean (std)   Gain
1       v3.2           39.3 (11.0)           67.4 (12.9)            0.46
2       v3.2           41.5 (9.5)            67.4 (11.9)            0.44
3       v4.11          45.3 (10.6)           74.0 (11.7)            0.53
4       v4.11          53.3 (9.0)            75.4 (10.2)            0.47
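The paper does not state how the Gain column is computed, but the values are consistent with the normalized gain g = (post - pre) / (100 - pre) commonly used with concept inventories. A quick Python check of that assumption against the Class 1 row:

    # Assumed (not stated in the paper) normalized-gain formula,
    # checked against the Class 1 means from Table 2
    pre, post = 39.3, 67.4
    gain = (post - pre) / (100 - pre)
    print(round(gain, 2))   # 0.46, matching Table 2; the other rows agree to within rounding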

The laboratory assignments consisted of Matlab exercises. ECE 321 does not have a separate laboratory session, but students worked on Matlab projects in small groups outside of class. Both ECE 220 and ECE 321 were taught using active and collaborative learning (ACL) methods, such as those described by Buck and Wage [6]. Specifically, class periods consisted of short lecture segments interspersed with in-class problems that students worked in small groups. See the ECE 220 course webpages on Kathleen Wage's website [7] for more information on how ACL methods were implemented and for sample in-class problems.

The SSCI was administered as a pre-test and post-test in all four classes. The pre-test took place during the first lecture, and the post-test was administered during the first hour of the final examination period. The second part of the final exam consisted of standard open-ended analytical problems. Final exams for all classes were closed book. During the SSCI portion of the exam period, students could not use notes or calculators. Once the SSCI was complete, students could use three sheets of notes and a calculator for the second part of the final. Both the SSCI post-test and the open-ended final problems counted towards the final grade. The SSCI was worth 5-7% of the grade, and the final was worth 13-19% of the grade.

Table 2 shows the SSCI results for the four classes. Note that the first two classes used version 3.2 of the CT-SSCI, while the last two classes used version 4.11. There were 20 common questions between versions 3.2 and 4.11. An ANOVA test indicates there is no statistically significant difference in the four class means for the subtest containing only the common questions. Figure 1 shows the post-test difficulty indexes (percentage correct) for the 20 SSCI questions that are common to versions 3 and 4. While there are some differences between the four classes, the general behavior of the difficulty index curves is quite similar. The ANOVA test and the difficulty indexes indicate that the student populations in these four classes are similar enough that it is reasonable to compare data from these different semesters and schools.
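As a rough illustration of the ANOVA comparison described above, the following Python sketch applies scipy.stats.f_oneway to per-student subtest scores. The paper's raw data are not published, so the arrays below are hypothetical placeholders generated to resemble the class sizes and post-test means in Table 2:

    import numpy as np
    from scipy.stats import f_oneway

    # Hypothetical per-student subtest scores (percent correct); the
    # means/sizes loosely mirror Table 2 but are NOT the study's data.
    rng = np.random.default_rng(0)
    classes = [rng.normal(loc=m, scale=12, size=n).clip(0, 100)
               for m, n in [(67, 40), (67, 39), (74, 49), (75, 26)]]
    f_stat, p_value = f_oneway(*classes)
    # A large p_value would indicate no statistically significant
    # difference among the four class means, as the paper reports.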

In addition to the SSCI data, classes 1 and 4 shared three questions on the problem-solving section of their final exams. The students' responses to these questions were coded and linked to the SSCI results using anonymous study ID numbers. A five-level coding scheme was used: 4 = correct, 3 = minor errors, 2 = major errors, 1 = wrong, and 0 = no answer. Three of the authors coded the results independently. Differences were discussed and resolved, so that the final coding represents the group consensus. Section 4 compares the coded results for these final exam problems to related questions on the SSCI.


Fig. 1. Comparison of post-test difficulty indexes (percent correct) for the four classes. The plot shows results for the 20 CT-SSCI questions shared by versions 3 and 4; the question numbers refer to version 3.

Table 3. Correlation between CT-SSCI and final exam scores.

Class          1       2       3       4
Correlation    0.637   0.427   0.610   0.760
Significance   0.000   0.007   0.000   0.000


3. CORRELATION BETWEEN SSCI SCORE AND FINAL EXAM SCORE

Figure 2 shows scatter plots of scores on the CT-SSCI versus scores on the final exam for each of the four classes. Based on these plots, the SSCI scores appear positively correlated with the final exam (problem-solving) scores. Table 3 contains the correlation coefficients and the associated significance values for the SSCI/final exam comparison. The correlation varies between 0.42 and 0.76 for these four classes, and the analysis indicates these correlations are statistically significant at the 1% level (p < 0.007 for all classes).
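The quantities in Table 3 are standard Pearson correlations with two-sided significance values. A minimal Python sketch of the computation, assuming paired per-student score lists (the numbers below are hypothetical, not the study's data):

    from scipy.stats import pearsonr

    # Hypothetical paired percent scores for one class
    ssci  = [72, 56, 88, 64, 80, 48, 76, 92, 60, 68]   # SSCI post-test
    final = [85, 60, 90, 70, 78, 55, 72, 95, 58, 74]   # final exam problems
    r, p = pearsonr(ssci, final)
    # r is the correlation coefficient and p its two-sided significance,
    # the two quantities reported per class in Table 3.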

These results indicate that, for this population, students who have greater conceptual understanding (as measured by the SSCI) perform better on open-ended exam problems. The correlation is not equal to 1, nor would we expect it to be, since conceptual understanding and procedural understanding are not necessarily correlated. For example, Mazur found that student scores on a series of conceptual physics problems were uncorrelated with scores on a set of paired conventional problems [10]. The conventional problems could be solved using procedural knowledge (a standard recipe), but the conceptual problems could not be solved by rote.

4. COMPARISON OF OPEN-ENDED FINAL EXAM PROBLEMS TO SELECTED SSCI QUESTIONS

This section compares student responses to three final exam problems (F1, F2, and F3) with their performance on SSCI questions that assess related concepts. Classes 1 and 4 are used in this analysis.



Fig. 2. Scatter plots of final exam score versus SSCI score for the four classes in the study. Classes 1 and 2 took version 3.2 of the CT-SSCI; Classes 3 and 4 took version 4.11.

    4.1. Problem F1: Convolution

Figure 3 shows final exam Problem F1, which assesses students' understanding of the fundamental topic of convolution.

Problem F1

System 1 is a linear time-invariant (LTI) system with the impulse response h1(t) shown below:

[sketch of h1(t): a short rectangular pulse]

Determine and sketch the output y(t) of the system when the input is the signal x(t) shown below:

[sketch of x(t): two non-overlapping rectangular pulses]

Fig. 3. Final exam Problem F1 (one part of a longer problem).

Problem F1 is linked with SSCI v3.2 Question 8 (Q8) and SSCI v4.11 Questions 13 (Q13) and 15 (Q15). Problem F1 asks students to compute the output of an LTI system when the input is two non-overlapping rectangles and the impulse response is a short square pulse. The correct answer is a tall triangular pulse followed by a shorter trapezoidal signal. SSCI question Q8 asks students to identify the output of an LTI system when the impulse response is a unit-amplitude rectangular pulse and the input signal is a square pulse. Students must first recognize that they need to convolve these signals, and then identify the correct result of that convolution. The distractors for Q8 vary substantially in the level of misconception they represent. Two distractors probe whether students understand what convolution means, since they give the results of adding and multiplying x(t) with h(t). The third distractor has the right trapezoidal shape, but the wrong extent. Students choosing this distractor would recognize that convolving the given signals should produce a trapezoid, but lack sufficient understanding to identify which trapezoid.

Table 4. Class 1, F1 versus CT-SSCI Question 8.

                          Problem F1
SSCI Q8                   correct   minor errors   major errors   wrong   zero
correct                   19        6              5              1       0
wrong (shape ok)          0         1              5              2       0
wrong (add sigs)          0         0              0              1       0
wrong (multiply sigs)     0         0              0              0       0

Table 5. Class 1, contingency table for F1 vs. Q8.

            Problem F1
SSCI Q8     correct   incorrect
correct     25        6
incorrect   1         8


Feedback from the SSCI Design Team and student interviews indicated that the distractors for Q8 on version 3 mixed gross misconceptions (e.g., adding instead of convolving) with subtle misconceptions (getting the right shape but the wrong time extent) too inconsistently. Based on this feedback, version 4 of the SSCI replaced Q8 with two new questions, Q13 and Q15, designed to probe different levels of misconception about convolution. Q13 was designed to be the more basic of the two. The question statement for Q13 is identical to Q8 from version 3.2; however, in this case the correct answer is the only trapezoid among the choices. As with Q8, two of the distractors represent the addition and the multiplication of x(t) and h(t). The third distractor is now a rectangular pulse with the same extent and location as the correct output. This final distractor probes for students who remember how to find the starting and ending times for the convolution of two finite signals, but don't recall how to actually compute a convolution. The design goal for Q13 was to probe whether students have the basic understanding that they need to convolve the input and impulse response to find the output, and what shape that convolution will have.

Question Q15 was designed to probe for a deeper understanding of convolution than Q13. This problem also asks students to specify the output of an LTI system given the input and impulse response. In this case, both the input and the impulse response are unit-amplitude rectangular signals, but with different lengths. Again, the correct answer is a trapezoid; however, all three distractors have the same region of support on the time axis as the correct trapezoid. The distractors are a trapezoid with the wrong extents for the linear and constant regions (i.e., the slope is wrong), a trapezoid with the wrong amplitude for the constant region, and a triangle. Obtaining the correct answer to Q15 requires a deeper level of understanding about convolution than Q13 requires.
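The trapezoid answers discussed above follow directly from convolving two unit-amplitude rectangles of different lengths. A short numerical sketch (not from the paper) that confirms the shape, using an assumed length-1 input and length-3 impulse response:

    import numpy as np

    dt = 0.001
    t = np.arange(0.0, 4.0, dt)
    x = ((t >= 0) & (t < 1)).astype(float)   # unit-amplitude rectangle, length 1
    h = ((t >= 0) & (t < 3)).astype(float)   # unit-amplitude rectangle, length 3
    y = np.convolve(x, h) * dt               # Riemann-sum approximation of CT convolution
    # y ramps up on [0, 1], is flat at height 1 on [1, 3], and ramps down
    # on [3, 4]: the trapezoid that is the correct answer to Q15.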

Table 4 compares the coded responses of Class 1 students to Problem F1 with their answers to SSCI Q8. As the table shows, all students who were able to compute the correct answer to the open-ended problem selected the correct answer to Q8. The majority of students who made minor errors on the open-ended problem were also able to choose the correct answer to the conceptual question. Some students who made major errors or were completely wrong on Problem F1 were able to answer Q8 correctly, but others chose the answer with the right shape but the wrong time extent. Note that the only student who selected the SSCI distractor indicating he would add the input and the impulse response to obtain the output was also completely unable to solve the open-ended convolution problem.


Table 6. Class 4, F1 versus CT-SSCI Question 13.

                               Problem F1
SSCI Q13                       correct   minor errors   major errors   wrong   zero
correct                        5         8              7              4       1
wrong (equal to h(t))          0         0              0              1       0
wrong (add sigs)               0         0              0              0       0
wrong (shape/right length)     0         0              0              0       0

Table 7. Class 4, F1 versus CT-SSCI Question 15.

                               Problem F1
SSCI Q15                       correct   minor errors   major errors   wrong   zero
correct                        5         6              4              3       1
wrong (slope is incorrect)     0         0              0              1       0
wrong (max height wrong)       0         1              3              1       0
wrong shape (triangle)         0         1              0              0       0


Assuming that students who made minor errors had a correct understanding of convolution, whereas those who made major errors did not, Table 4 reduces to the 2 x 2 contingency table shown in Table 5. This contingency table can be analyzed using Fisher's exact test [11]. Fisher's test indicates that the results for Q8 and Problem F1 are statistically significantly correlated (p = 0.0003).
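Fisher's exact test on a 2 x 2 table is available in standard statistics libraries. A minimal Python sketch using the counts from Table 5 (scipy is an implementation choice here, not the paper's stated tool):

    from scipy.stats import fisher_exact

    # 2 x 2 contingency table from Table 5: rows are Q8 correct/incorrect,
    # columns are F1 correct/incorrect after collapsing the five-level codes.
    table = [[25, 6],
             [1,  8]]
    odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
    # The paper reports p = 0.0003 for this table.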

Tables 6 and 7 compare the coded responses of Class 4 to the open-ended Problem F1 with their responses to the two new convolution questions on v4.11 of the SSCI. As Table 6 shows, all but one student in Class 4 answered Q13 correctly. The student who answered Q13 incorrectly also produced a completely wrong answer to Problem F1. The 25 students who chose the correct answer to Q13 gave responses to F1 that ranged from correct to completely wrong. Given the distribution of responses in Table 6, it is not surprising that Fisher's exact test of the corresponding 2 x 2 table rejects the hypothesis that student responses to Q13 and F1 are correlated. Table 7, comparing F1 and Q15, shows similar results. Most of the students were able to select the correct answer to Q15, but their responses to Problem F1 varied from correct to completely wrong. Fisher's exact test for the 2 x 2 table corresponding to Table 7 rejects the hypothesis that the results for Q15 and Problem F1 are correlated.

There are several possible reasons why Q8 correlates well with Problem F1 while Q13 and Q15 do not. First, the data set for Class 4 is relatively small, and very few students got the conceptual questions wrong. A larger data set may be required to fully sample the distractor space. Second, SSCI Questions 13 and 15 are more focused on specific aspects of convolution than Q8. While the general five-level coding scheme ("correct," "minor errors," "major errors," "wrong," and "zero") worked well for the general Q8, it may not be appropriate for these more specific questions. For example, since all of the Q15 answers have the same time extent, a coding scheme that ignores time-axis errors might work better. Third, as noted in the conclusions (Section 5 below), it is possible that the cognitive level of Problem F1 is significantly higher than that of Questions 13 and 15. This mismatch in level could explain the lack of correlation of the results.

Problem F2

A causal linear time-invariant system has the transfer function H(s) given below:

H(s) = (s + 10) / [(s + 1)(s + 100)]

Sketch the Bode magnitude plot for this system.

Fig. 4. Final exam Problem F2 (one part of a longer problem).

Table 8. Class 1, F2 versus CT-SSCI Question 22.

                          Problem F2
SSCI Q22                  correct   minor errors   major errors   wrong   zero
correct                   10        11             6              0       0
wrong (pole at 10)        1         2              1              2       1
wrong (zero at 100)       2         0              1              0       0
wrong (-40dB offset)      1         1              1              0       0


    4.2. Problem F2: Bode Frequency Response Plots

Final exam Problem F2 and CT-SSCI Q22 (v3.2) and Q20 (v4.11) are also closely related. All of these questions require students to work with Bode frequency response plots. Problem F2, shown in Figure 4, asks students to sketch the Bode magnitude plot for a system function H(s) with two poles (at s = -1 and s = -100) and one zero (at s = -10). Q22 and Q20 are the same basic conceptual question, differing only in formatting details. The question is designed to assess whether students understand how the introduction of a new pole modifies the Bode plot. Q22/Q20 provides a Bode magnitude response for a system H(jω) and asks the student to identify the magnitude response of a new system obtained by multiplying H(jω) by an additional pole. The distractors include one that adds a zero instead of a pole, one that changes the DC value of the Bode plot, and one that puts the pole at the wrong frequency.
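For reference, the frequency response that Problem F2 asks students to sketch can be generated numerically. A short Python sketch for the F2 transfer function, using scipy.signal (an implementation choice for illustration, not part of the paper):

    import numpy as np
    from scipy import signal

    # H(s) = (s + 10) / ((s + 1)(s + 100)); denominator expands to s^2 + 101 s + 100
    sys = signal.TransferFunction([1, 10], [1, 101, 100])
    w = np.logspace(-2, 4, 500)                # frequencies in rad/s
    w, mag_db, phase_deg = signal.bode(sys, w)
    # mag_db starts near -20 dB (DC gain 0.1), breaks downward at the pole
    # at w = 1, flattens at the zero at w = 10, and breaks downward again
    # at the pole at w = 100.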

Tables 8 and 9 summarize the results for the comparison of the open-ended Bode plot problem and the corresponding SSCI question. The results for Class 1, shown in Table 8, indicate that most students can correctly answer the conceptual question, but the coded results indicate that these same students exhibit varying levels of ability when it comes to sketching a Bode magnitude plot given the system function. Fisher's exact test for the corresponding 2 x 2 table rejects the hypothesis that the results for Problem F2 and SSCI Q22 are correlated. The results for Class 4, shown in Table 9, exhibit a similar mixture of coded responses for the large majority of students who get Q20 correct. Again, the hypothesis that the open-ended Bode plot problem and the SSCI question are correlated is rejected by Fisher's test. As noted in the previous section, there are several possible explanations for the lack of correlation between the conceptual question and the open-ended problem. It may be useful to consider a different coding scheme that is tailored to represent the distractors in Q22/Q20.


Table 9. Class 4, F2 versus CT-SSCI Question 20.

                          Problem F2
SSCI Q20                  correct   minor errors   major errors   wrong   zero
correct                   4         5              4              5       1
wrong (pole at 10)        2         0              1              2       1
wrong (zero at 100)       0         0              0              0       0
wrong (-40dB offset)      0         0              0              1       0

    4.3. Problem F3: Fourier Transform

Final exam Problem F3, shown in Figure 5, is related to a collection of five questions on the CT-SSCI dealing with Fourier transform properties. On CT-SSCI v3.2, these are Questions 9-11, 15, and 16, while on CT-SSCI v4.11 the relevant questions are 10-12, 21, and 22. The content and distractors for these five questions were essentially unchanged between the two versions of the exam (one distractor on Q15 was changed based on a review of the version 3 data and feedback from the SSCI Development Team), although the formatting and wording were adjusted to make the questions easier to read with a cleaner layout. Part (a) of Problem F3 asks students to do a rote calculation of the inverse Fourier transform of a square pulse, while part (b) requires students to apply Fourier transform properties to analyze a new system.
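The rote calculation in part (a) has a simple closed form: for P(ω) equal to 1 on |ω| < B and 0 elsewhere, the inverse-transform definition gives p(t) = (1/(2π)) ∫ from -B to B of e^{jωt} dω = sin(Bt)/(πt), a sinc. A short numerical check in Python (the value of B below is an arbitrary assumption):

    import numpy as np

    B = 2 * np.pi                             # assumed bandwidth in rad/s
    w = np.linspace(-B, B, 20001)             # support of P(w), where P(w) = 1
    t = 0.7                                   # arbitrary nonzero time
    p_num = np.trapz(np.exp(1j * w * t), w) / (2 * np.pi)  # inverse FT definition
    p_closed = np.sin(B * t) / (np.pi * t)    # closed form from the integral
    # p_num.real and p_closed agree to numerical precision.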

For this study we compare the results of F3a and F3b with scores on an SSCI subtest consisting of five Fourier-transform-related conceptual questions. (For the earlier study [5] the subtest contained six questions, but one of those questions was removed between versions 3 and 4 of the SSCI.) Figures 6 and 7 show scatter plots of the Fourier subtest scores versus the coded responses for Problem F3 for Classes 1 and 4, respectively. The size of the dots on these plots indicates the number of students with that response. The correlation coefficient and the statistical significance are shown in the title of each plot. Figure 6 indicates that for Class 1, the results for F3b have a correlation of 0.497 with the Fourier subtest, significant at better than the 1% level (p = 0.001), while the correlation of F3a with the Fourier subtest is not significant. The results for Class 4, shown in Figure 7, exhibit similar behavior: the correlation of F3b with the Fourier subtest is 0.46, with significance level p = 0.018. These results are consistent with the idea that F3a tests students' procedural knowledge, so its results should not necessarily be correlated with the conceptual question results. On the other hand, F3b requires conceptual understanding to deconstruct a rather complex problem into a set of subproblems. It is therefore reasonable that performance on F3b is statistically significantly correlated with the SSCI subtest on Fourier transform concepts.

    5. CONCLUSIONS

An assessment of validity is crucial for any concept inventory. This paper evaluated the validity of the SSCI by comparing student performance on the inventory questions to their performance on a set of related open-ended exam problems. Analysis of a population of more than 150 students in four classes at two universities indicated a statistically significant correlation between SSCI scores and associated final exam scores. A more detailed analysis of the final exam problems for two classes revealed significant correlations between the coded scores for open-ended problems and the relevant SSCI questions. Results for other open-ended problems, notably the Bode plot problem, were not statistically significant.


Problem F3

The signal p(t) has the Fourier transform P(ω) shown below.

[sketch of P(ω): a unit-height rectangle on -B ≤ ω ≤ +B]

(a) Determine and sketch p(t). Use the definition of the inverse Fourier transform to determine this result. (You may use the Fourier transform table to check your result, but you will not receive full credit unless you prove your result using the definition.)

(b) The signal p(t) is the input to the following system. Note that the frequency response H(ω) is shown below the system.

[block diagram omitted: the diagram labels intermediate signals s(t), x(t), y(t), and r(t), includes a cos(2Bt) term, and shows H(ω) sketched over -2B ≤ ω ≤ 2B]

Determine and sketch the Fourier transform of the signals at each point in this system. In other words, sketch S(ω), X(ω), Y(ω), and R(ω). Show your sketches and any other work below.

Fig. 5. Final exam Problem F3.


The results from Section 4 raise the possibility that the SSCI questions were easier for many students in this pool than the open-ended problem-solving questions. Specifically, in Tables 4, 6, and 7, all of the students who answered the open-ended problem correctly also got the SSCI question correct, but many of the students who correctly solved the SSCI question had serious errors on the open-ended problem. This suggests that it might be revealing to examine the SSCI and final exam questions from the perspective of Bloom's taxonomy to see if some of the open-ended problems require the students to function at a higher cognitive level than the paired conceptual questions.

    6. ACKNOWLEDGMENTS

We thank the National Science Foundation (NSF) for its support of the SSCI project through grants DUE-0512686 and DUE-0512430 under the Assessment of Student Achievement program. NSF also supported the initial development of the SSCI through grant EEC-9802942 to the Foundation Coalition. In addition, we thank the members of the SSCI Development Team for their input on the design and development of the SSCI.

Fig. 6. Scatter plots of the results for Problems F3a and F3b for Class 1. (F3a: Corr = 0.201, Sig = 0.213; F3b: Corr = 0.497, Sig = 0.001.)

Fig. 7. Scatter plots of the results for Problems F3a and F3b for Class 4. (F3a: Corr = 0.382, Sig = 0.054; F3b: Corr = 0.460, Sig = 0.018.)

    7. REFERENCES

[1] Paul S. Steif and John A. Dantzler, "A Statics Concept Inventory: Development and Psychometric Analysis," Journal of Engineering Education, pp. 363-371, October 2005.

[2] Barbara M. Moskal, Jon A. Leydens, and Michael J. Pavelich, "Validity, reliability, and the assessment of engineering education," Journal of Engineering Education, pp. 351-354, July 2002.

[3] Paul S. Steif and Mary Hansen, "Comparisons between performances in a statics concept inventory and course examinations," International Journal of Engineering Education, vol. 22, pp. 1070-1076, 2006.

[4] Kathleen E. Wage, John R. Buck, Cameron H. G. Wright, and Thad B. Welch, "The Signals and Systems Concept Inventory," IEEE Trans. on Educ., vol. 48, no. 3, pp. 448-461, August 2005.

[5] John R. Buck, Kathleen E. Wage, Margret A. Hjalmarson, and Jill K. Nelson, "Comparing student understanding of signals and systems using a concept inventory, a traditional exam and interviews," in Proceedings of the 37th ASEE/IEEE Frontiers in Education Conference, Milwaukee, WI, October 2007, pp. S1G-1 to S1G-6.

[6] John R. Buck and Kathleen E. Wage, "Active and Cooperative Learning in Signal Processing Courses," IEEE Signal Processing Magazine, vol. 22, no. 2, pp. 76-81, March 2005.

[7] ECE 220 course materials, http://ece.gmu.edu/kwage/teaching.html.

[8] B. P. Lathi, Linear Systems and Signals, Oxford University Press, Oxford, England, 2005.

[9] A. V. Oppenheim and A. S. Willsky with H. Nawab, Signals and Systems, Prentice Hall, Englewood Cliffs, NJ, second edition, 1997.

[10] Eric Mazur, Peer Instruction: A User's Manual, Prentice Hall, Englewood Cliffs, NJ, 1997.

[11] Jerrold H. Zar, Biostatistical Analysis, Prentice Hall, Upper Saddle River, NJ, fourth edition, 1999.
