Imh/RR/2004/RR-04.24-M.BUFFA.pdf




LABORATOIRE

INFORMATIQUE, SIGNAUX ET SYSTÈMES DE SOPHIA ANTIPOLIS

UMR 6070

INTERACTIVE QUIZZES IN E-LEARNING. DESIGN, IMPLEMENTATION AND USEFULNESS

Juliette Mainka, Michel Buffa, Peter Sander

Projet MAINLINE

Rapport de recherche (diploma thesis)
ISRN I3S/RR–2004-24–FR

Septembre 2004 (Déc.2003)

LABORATOIRE I3S: Les Algorithmes / Euclide B – 2000 route des Lucioles – B.P. 121 – 06903 Sophia-Antipolis Cedex, France – Tél. (33) 492 942 701 – Télécopie : (33) 492 942 898

http://www.i3s.unice.fr/I3S/FR/


RÉSUMÉ: Electronic multiple-choice quizzes (QCMs) are an important element in the e-learning process. This report surveys the state of the art in the field and presents the study of a particular case (through a tool developed within the Mainline team, the "eX@m" quiz server).

MOTS CLÉS: E-Learning, QCM, quiz, interactive questionnaires

ABSTRACT: Quizzes, as the assessment component of e-learning, must be realized in order to fully adapt education to this new field. This paper deals with general issues of quizzes as well as with a practical case.

The theoretical part explains how quizzes differ from e-learning in general, and exactly which changes arise from adapting them to the e-learning environment. An analysis of the current situation describes the existing standards and tools, and further examines the issue of realizing quizzes as real exams.

The practical case describes a quiz server, "eX@m", developed by the Mainline team.

KEY WORDS: E-Learning, MCQ, quiz, interactive questionnaires


Technische Universität Braunschweig

Institut für Betriebssysteme und Rechnerverbund

in cooperation with

Université de Nice Sophia-Antipolis

Interactive Quizzes in E-Learning –

Design, Implementation and Usefulness

Diploma Thesis

Conceptual formulation and supervision:

Prof. Dr. S. Fischer (Braunschweig)

M. Buffa, MdC (Nice)

Braunschweig, 1.12.2003

cand. Wirt.-Inf. Juliette Mainka
Schlegelstr. 13, 38104 Braunschweig
Matr.-Nr. 2503138


Interactive Quizzes in E-learning

Erklärung/Assertion

I hereby affirm that I wrote the present paper independently, using only the specified resources.

Braunschweig, 1.12.2003

Juliette Mainka


Abstract

Quizzes, as the assessment component of e-learning, must be realized in order to fully adapt education to this new field. This paper deals with general issues of quizzes as well as with a practical case.

The theoretical part explains how quizzes differ from e-learning in general, and exactly which changes arise from adapting them to the e-learning environment. An analysis of the current situation describes the existing standards and tools, and further examines the issue of realizing quizzes as real exams.

The practical case describes the enhancement of an existing quiz server by an import/export functionality and the specialization of an editor to further handle the resulting files.

Kurzfassung

Realizing assessments as part of e-learning is important in order to transfer education completely to this new field. The task of this thesis is to address general questions and relationships concerning quizzes in e-learning, and to present a practical case on the subject.

The theoretical part clarifies how quizzes can be realized in comparison with e-learning in general, and which changes arise in comparison with traditional exams. The current situation of quizzes in e-learning is also analyzed with respect to standards and tools, with particular attention to the question of how far such quizzes can be used as official exams.

The practical part describes the extension of an existing quiz server by an import/export module, as well as the ability to further process the resulting XML files.


Braunschweig, 1 June 2003

Task description for the diploma thesis

Interactive Quizzes in eLearning – Design, Implementation and Usefulness

assigned to cand. wirtsch. inform. Juliette Mainka

Matr.-Nr. 2503138, Email: [email protected]

Background

Survey, state of the art in the field of interactive exercises/quizzes

As an introduction to the domain, Juliette Mainka will first research material (papers, articles, web…) in the field of interactive exercises and quizzes in order to gain an overview of the state of the art.

Juliette Mainka will also investigate under which circumstances "quizzes" can be used as real exams. For that, she will have to describe the requirements on exams and evaluate how these requirements can be met in online environments.

Our research group has previously defined its own XML DTD for quizzes, and we have developed some simple tools for manipulating these quizzes. Since this initial work, many other solutions have been proposed by the community and some standards have emerged. Juliette Mainka will compare and evaluate quiz-generation standards and tools, in particular the XML standards proposed by IMS (www.ims.org) for XML quiz definition. IMS is a web source for XML standards, many relating to quizzes or exercises in general. To evaluate existing work, she will define evaluation criteria such as simplicity, extensibility, features, etc., and will propose solutions (adopt or adapt an existing standard, define a new standard…?)
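As a rough illustration of what such an XML quiz definition can look like (the group's actual DTD is not reproduced in this task description, so all element and attribute names below are assumptions made for illustration only), a minimal quiz document might be:

```xml
<!-- Hypothetical quiz instance; element and attribute names are illustrative
     assumptions, not taken from the group's actual DTD or from the IMS standards. -->
<quiz title="J2EE basics">
  <question type="multiple-choice">
    <statement>Which component model does J2EE use for server-side business logic?</statement>
    <answer correct="true">Enterprise JavaBeans (EJB)</answer>
    <answer correct="false">Applets</answer>
    <answer correct="false">Cascading Style Sheets</answer>
  </question>
</quiz>
```

A DTD constrains which elements may nest where; evaluation criteria such as simplicity and extensibility can then be judged by how easily new question types fit into such a structure.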

Develop a WYSIWYG editor for quizzes

Once the XML standard has been chosen, Juliette Mainka will develop tools for manipulating quizzes in the chosen format: edit/create/import quizzes described with the given XML standard, and export quizzes to different target formats such as PDF or HTML.

The approach that has been proposed involves using an open source XML editor called GenDoc (http://gendiapo.sourceforge.net). GenDoc is able to support different DTDs and provides ways to manipulate XML documents via an interactive WYSIWYG interface. This software is extensible through Java plugins. Juliette Mainka will write a set of plugins supporting the chosen XML standard; in particular, GenDoc will need a plugin supporting an editable WYSIWYG styled view of the document. GenDoc also supports XSL stylesheets for exporting documents to several output formats. Juliette Mainka's work, in case the targeted solution is feasible, would be to add WYSIWYG support for editing XML quizzes and exporting them to HTML and PDF formats.

Integrating these tools into a quiz server

Our group also has a prototype "online quiz server" that uses J2EE technology for creating, displaying, taking, and correcting quiz exams. The core of the server is written using the JBoss application server and EJBs, while the GUI is provided by JSP/servlets. The data are stored in a PostgreSQL database. This prototype so far does not rely on any XML standard and is currently under development.

Juliette Mainka will join the development team (composed of two students in the first year of their master's) in order to plug the tools she develops on her side into the quiz server. In particular, she must provide features for importing/exporting quizzes created by her WYSIWYG editor to/from the quiz server. In that case, a teacher would be able to create quizzes using the WYSIWYG tools (more ergonomic than web pages) and add them to the server database. Students can then take exams online that will be automatically corrected, providing statistics and the like. Using these tools, existing quizzes or quizzes that have been created using online web pages could be imported into Juliette Mainka's tools, then exported to PDF, or simply edited. Possible solutions are:

I. Bypass the EJB layer of the application deployed on the JBoss application server and talk directly to the database, using JDBC or JDO. In that case, Juliette will have to develop a tool that gets the XML quizzes, parses them, and translates them into JDO or JDBC calls.

II. Write some EJBs, providing methods that take the XML quizzes as parameters and perform all database operations (entity beans).

III. Write a Java client that talks to the EJBs by performing method calls, in the same way the actual JSP/servlet clients do. The client should read/parse the XML quizzes, connect to the EJB server, and perform the method calls.
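Solution III can be sketched as follows. All names here are assumptions: the `question` element, the `QuizStore` interface and its `addQuestion` method are illustrative stand-ins, and a real client would obtain the remote EJB interface through a JNDI lookup on the JBoss server rather than the local stub used here.

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;
import java.io.ByteArrayInputStream;
import java.util.ArrayList;
import java.util.List;

public class QuizImportClient {

    // Stand-in for the remote EJB interface; in a real deployment this
    // would be the remote interface looked up via JNDI on the JBoss server.
    interface QuizStore {
        void addQuestion(String text);
    }

    // Parse a quiz XML document and forward each question to the server
    // interface via method calls; returns the number of questions imported.
    static int importQuiz(String xml, QuizStore store) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes("UTF-8")));
        NodeList questions = doc.getElementsByTagName("question");
        for (int i = 0; i < questions.getLength(); i++) {
            store.addQuestion(questions.item(i).getTextContent().trim());
        }
        return questions.getLength();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<quiz><question>What does EJB stand for?</question>"
                   + "<question>Name one J2EE application server.</question></quiz>";
        List<String> received = new ArrayList<>();
        int imported = importQuiz(xml, received::add);
        System.out.println(imported + " questions imported"); // prints "2 questions imported"
    }
}
```

The point of this variant over solution I is that all database logic stays behind the EJB layer: the client only parses XML and performs method calls, and never needs to know the database schema.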

Duration: 6 months

The IBR guidelines for carrying out student projects and diploma theses must be observed (see http://www.ibr.cs.tu-bs.de/lehre/arbeiten-howto/).

Task description and supervision:
Prof. Dr. S. Fischer ____________________________

Juliette Mainka ____________________________


Table of Contents

1. Introduction............................................................................................................................1

1.1 Motivation and task description.......................................................................................1

1.2 Structure of the paper......................................................................................................1

2. Basic information concerning interactive quizzes in E-learning ...........................................3

2.1 Overview on E-learning ..................................................................................................3

2.1.1 Definitions................................................................................................................3

2.1.2 Areas of use..............................................................................................................5

2.1.3 Advantages and disadvantages.................................................................................6

2.1.3.1 Advantages.......................................................................................................6

2.1.3.2 Disadvantages....................................................................................................8

2.2 Quizzes in e-learning .......................................................................................................9

2.2.1 Types of assessment.................................................................................................9

2.2.2 Use in e-learning.....................................................................................................10

2.2.3 Advantages and disadvantages ..............................................................................12

2.2.3.1 Advantages.....................................................................................................12

2.2.3.2 Disadvantages.................................................................................................13

2.2.3.3 Conclusion on the advantages and disadvantages..........................................15

2.3 Question types in e-learning...........................................................................................16

2.3.1 Types of questions..................................................................................................16

2.3.2 Differentiation of the question types regarding their use in e-learning...................18

2.3.3 Explaining the preferences in the use of question types.........................................24

3. The actual situation on quizzes in e-learning........................................................................25

3.1 Quizzes as real exams ...................................................................................................25

3.1.1 Requirements on the use of quizzes as real exams.................................................25

3.1.1.1 Regulations concerning exams.......................................................................25

3.1.1.2 Resulting requirements for real exams............................................................27

3.1.2 Fulfillment of the requirements in an online environment.....................................28

3.1.2.1 Changes in the requirements when adapting to the e-learning environment...28

3.1.2.2 Evaluation of how these requirements can be met in online environments.....30

3.1.2.3 Outlook on general problems of exams..........................................................35

3.1.2.4 Further important points to consider..............................................................36


3.2 Quiz Standards...............................................................................................................38

3.2.1 General information on standards..........................................................................38

3.2.1.1 Definition........................................................................................................38

3.2.1.2 Development of standards and specifications.................................................39

3.2.1.3 The role of specifications................................................................................40

3.2.1.4 The use of XML in standards and specifications............................................40

3.2.2 The actual situation concerning standards and specifications in e-learning............41

3.2.2.1 The acting groups...........................................................................................41

3.2.2.2 Actual quiz standards/specifications...............................................................41

3.2.2.3 Annotations regarding the future of quiz standards ......................................43

3.2.3 Evaluation of the quiz specifications .....................................................................43

3.2.3.1 Evaluation Criteria for Quiz Standards (in XML)..........................................43

3.2.3.2 Annotation about modularity concerning the criteria.....................................45

3.2.3.3 The evaluation results.....................................................................................46

3.3 Quiz tools.......................................................................................................................48

3.3.1 Evaluation criteria for quiz tools............................................................................48

3.3.2 Evaluation of the state of the art of quiz tools ......................................................51

3.3.2.1 Differences in quiz-creating tools...................................................................51

3.3.2.2 The choice of tools to evaluate.......................................................................51

3.3.2.3 Reasons for the choice....................................................................................52

3.3.2.4 Synthesis of the evaluation results..................................................................53

4. Practical case: a Quiz Server extension ...............................................................................56

4.1 Definition of the choices taken for the project ..............................................................56

4.1.1 Initial situation............................................................................................................56

4.1.2 The taken choices and their reasons.......................................................................57

4.1.2.1 The choice of a standard.................................................................................57

4.1.2.2 The choice of a quiz-generating tool..............................................................58

4.2 The Quiz DTD...............................................................................................................59

4.2.1 Design....................................................................................................................59

4.2.2 Implementation ......................................................................................................60

4.3 The GenDoc plugins .....................................................................................................65

4.3.1. Overview about GenDoc and its plugins...............................................................65

4.3.2 Editing plugin ........................................................................................................66


4.3.2.1 Design ............................................................................................................66

4.3.2.2 Implementation: explaining important details................................................68

4.3.3 Publishing plugin ...................................................................................................72

4.3.3.1 Design.............................................................................................................72

4.3.3.2 Implementation...............................................................................................73

4.4 Connection to “eXam” Quiz Server...............................................................................78

4.4.1 Design ...................................................................................................................78

4.4.2 Implementation ......................................................................................................79

4.5 Use case: combining all developed components............................................................84

5. Assessment/Evaluation of the developed components..........................................................86

5.1 The Quiz DTD ..............................................................................................................86

5.2 The GenDoc Plugins......................................................................................................87

5.2.1 The editing plugin...................................................................................................87

5.2.2 The publishing plugin.............................................................................................87

5.3 The connection program................................................................................................88

6. Synopsis and future prospects.............................................................................................90

6.1 Summarization...............................................................................................................90

6.2 Future prospects............................................................................................................92

Bibliography..............................................................................................................................94

APPENDIX A: Interaction of groups involved in e-learning standards...................................97

Appendix B: Results of Quiz Specification Evaluation.............................................................99

Appendix C: Results of the Quiz Tools Evaluation................................................................100

Appendix D: Data Mapping eXam / Quiz1.dtd.....................................................................108

Appendix E: Use instructions of required programs...............................................................113


Table of Figures

Figure 1: Automatic correction with "Solution Editor"

Figure 2: The standards evolution process

Figure 3: The 3 views in the GenDoc GUI

Figure 4: Example for both imported and typed code parts

Figure 5: Example of a BlanksText

Figure 6: Publishing dialog for Quizzes

Figure 7: GUI of the Import/Export module

Figure 8: Components and actions concerning quiz data processing


1. Introduction

1.1 Motivation and task description

With the rise and continuous enhancement of new technologies such as the Internet, the use of new media is expanding over more and more areas of life. One area particularly influenced by this is education [QOL]. E-learning, as the known result of this, is especially interesting because of the new possibilities it offers to education, e.g. through the use of additional media as well as through the possible decentralization and desynchronization of learning processes. Most work on e-learning these days deals with teaching, but about the other component of the learning process, the assessment of what was learned, little has been said yet. This paper addresses this latter area of e-learning, that is to say: interactive quizzes in e-learning.

To get an in-depth look at this field, the following issues will be discussed:

• With a focus on quizzes, what does e-learning mean in the context of education?

• How does it change the way education can be done, regarding e-learning in general and, in comparison, quizzes in particular?

• What characterizes quizzes in e-learning as opposed to traditional assessment?

• What might hinder their use as an adaptation of the traditional methods?

• Concerning the current situation, which trends and developments exist regarding the state of the art, as well as regulations, for quizzes in e-learning?

• And how well do the latter fit the actual needs?

This view on quizzes in e-learning will be completed by further describing a practical case from this area that was undertaken within the scope of this project.

1.2 Structure of the paper

The structure of this paper, which should provide answers to the above questions, is as follows:

Chapter 2 gives the necessary basic information to introduce the reader to this area and to facilitate further understanding of the paper. Chapter 2.1 provides an overview of e-learning, followed by a focus on its assessment part in chapter 2.2, which treats in particular the different types of quizzes in e-learning by comparing them to traditional assessment. The last part, chapter 2.3, furthermore zooms in on question types, including an evaluation of how well the different types are suited for interactive use in e-learning.

Chapter 3 further develops the discussion of quizzes by analyzing the present situation. The first section (chapter 3.1) examines the use of quizzes as real exams, covering the requirements (such as existing regulations) and their fulfillment in an e-learning environment. The second part (chapter 3.2) then examines existing quiz standards, while the third (chapter 3.3) analyzes the state of the art of quizzing tools. Both of the latter parts include evaluations of their respective areas of research.

After the theoretical approach to quizzes in e-learning in the two chapters above, the following two focus on a practical case:

Chapter 4 describes all steps undertaken in the development of the project. It starts (chapter 4.1) with a description of the goals and the initial situation, and explains the resulting choices made for the project. The following chapters (4.2, 4.3 and 4.4) each present the main points of the design and implementation of the major components of the developed concept. The final part of this chapter (4.5) concludes with a use case showing how all components could be used in one context.

Chapter 5 closes the practical part by presenting an evaluation of the developed parts, covering their strengths and weaknesses and including ideas on how the disadvantages could be addressed in the future.

The last chapter of this paper, chapter 6, summarizes the findings throughout the paper and finally presents some future prospects in the area of quizzes in e-learning.


2. Basic information concerning interactive quizzes in E-learning

This chapter provides an introduction to quizzes in e-learning by defining and explaining the basic information needed for further understanding of the paper. Beginning with e-learning in general, it will in the second part zoom in on quizzes as a part of it, and then further on the different question types. In each part, definitions are provided, followed by the actual use with its advantages and disadvantages compared to the traditional way of education. This continuous zooming is intended to show how each specific area differs from the more general one.

2.1 Overview on E-learning

2.1.1 Definitions

One frequently mentioned problem concerning definitions in the e-learning literature (see previous research in [T02]) is the lack of precision: definitions are often unclear or not sufficiently well defined, which leads to inconsistency and therefore confusion in the use of the word/concept. This is reinforced by a "jumble of terms" [K01] arising from the many word creations that accompanied the development of this new area. These are sometimes used synonymously in different contexts (like e-learning, Internet-learning, virtual classroom or virtual university) and are not based on any definition. To avoid confusion, some important terms in the area of e-learning will be explained here, followed by an overview of the different implementations of e-learning in use.

First of all, let's take a look at the definition of e-learning itself. The "e" in e-learning stands for "electronic", but with the meaning of "electronically supported learning" (as [DE01] attempts to clarify). E-learning therefore describes all learning processes that use electronic media, such as computers, as the learning environment for their activities. Some definitions explicitly mention that e-learning is not only the delivery of pedagogical content via electronic media, but also integrates an interactive dimension which makes it possible to evaluate the assimilation of knowledge [N02]. A closer look at assessment in e-learning follows in the next chapter, concerning quizzes in e-learning.

Often, e-learning is associated with the use of networks, mostly the Internet. This opens a lot of additional possibilities for e-learning activities, which will also be discussed later.


Nevertheless, networks are not a necessary part of the general definition of e-learning. Similarly, e-learning does not necessarily imply that the whole learning process is done in an electronically supported way; partial use, mixed with traditional methods, is also possible. This of course broadens the term "e-learning" and could create confusion when it is used in different contexts without further distinction. A possible solution would be to talk instead about the degree to which e-learning is implemented in a specific learning environment/situation.

An alternative definition [D03] sees e-learning as the opposite of the traditional learning process regarding time, place and content (further underlining the idea of possible degrees of implementation): in the traditional way, teaching occurs in a determined place, synchronously at a fixed time, and concerns one specific object of learning content to be taught. In e-learning these parts are potentially distributed: places (increasingly, but not exclusively, enabled via the Internet), time (not necessarily synchronous) and learning activities (not all participants necessarily have the same additional information, such as exercises). "Potentially" means in this context that not all three parts have to be distributed; this depends on the degree of e-learning implementation used. In the simplest case (closest to traditional teaching and learning), this could even include computer-assisted learning. One example is students working at the same time in the same room on the same set of exercises, but using a computer to do so, thus causing distribution by the fact that each student faces the exercises on their own [D03].

In this paper this broader approach to e-learning is used, as it helps to see the whole spectrum of possibilities, especially in the later chapter about quizzes as a part of e-learning. For a better understanding of this definition, an overview of the different already-implemented usages of e-learning (with different degrees) is given next, while at the same time some terms often used in this context are explained.

Before starting with this, one more basic fact should be mentioned to clarify the following information. It concerns the parties involved in the learning process (and therefore also in the assessment). From a more theoretical point of view, these can be generalized into two groups of persons by the role they take: the learner and the learning assistant. The latter is usually the role of the teacher or tutor, but in e-learning it could also be played by the e-learning program itself, which of course was developed based on the ideas of a real tutor.


2.1.2 Areas of use

One major area of use for e-learning is traditional education, where it is used in many different degrees to support the teaching and learning process. At the moment its biggest importance is in the area of higher education, where it is often used synonymously with the term “virtual university”. Still, this is another vague term, which includes everything from the online presentation of some courses with the possibility to contact the teacher by email, over the delivery of exercises and solutions over the net, to the implementation of whole programs of study online [K01]. This again shows that there are several ways in which this can be implemented, depending on the extent to which the shift towards online teaching is adopted. A recent study in this area found that there are four types of virtual universities in development which will be used in higher education in the future [K01]. These are:

1. replacing the real universities, thus taking place only in the “virtual world”. This includes both synchronous training (e.g. by video conference), which can be seen as an electronic reconstruction of the traditional “classroom situation” [K01], and the asynchronous way (downloadable videos of the lectures, discussions by email), which is a form of distance education. Both types usually require the Internet as channel, which also underlines the profound influence of the Internet on this type of learning [QOL].

2. complementing the traditional learning offers of real universities, where e-learning is offered in parallel to “presence studies”. When there is a clear distinction between phases of physical presence (as in traditional education) and virtual phases (e-learning), this mix is usually called “blended learning” [RT03].

3. cooperation of real universities within an educational network, with the goal of sharing the creation and deployment of online content, which saves resources (e.g. www.winfoline.de, a cooperation of business informatics departments in Germany). This also includes shared Internet-based courses or seminars, which are already often undertaken.

4. one last form, with a more economical view, which includes complementary partners from private industry and shared education brokerage, focusing on the fact that learning content is developed partially or completely outside the universities.

These last two types do not exactly fit the scheme of differentiating by the degree of e-learning implemented, as they focus more on the content and structural aspects of the education, but they show that cooperation has an increasing importance in the learning environment.

Although e-learning is mostly associated with the idea of higher education and learning at universities, it should not be forgotten that it is also used in companies. There it is generally used for the further education of employees, either in courses with a human learning guide or through self-guided training with the help of computer-based assistance. It is mostly employed to learn specialized knowledge in the form of additional skills, especially in the area of software skills [RT03]. Still, for now it is most often found in larger companies that have the necessary know-how and infrastructure.

2.1.3 Advantages and disadvantages

2.1.3.1 Advantages

The above paragraphs are about different implementations of e-learning (who applied it or where it was applied, which degree of e-learning was used and how it was realized), but one important question remains: why is it used? The reasons lie in its advantages. Before providing an overview of the main possible advantages of e-learning, it is important to note that the advantages shown here only apply depending on the degree to which e-learning is used (like the distribution of time, place and content).

The overall goal of e-learning is to improve the learning process [O96], which is logical since it was created as an enhancement of the traditional way of learning. The advantages of e-learning therefore arise from the differences with traditional teaching and learning as described earlier, mainly resulting from the use of the computer as a new medium, or of electronic media in general, with possibilities to connect to a network, in the best case the Internet.

• The main one is the gain of flexibility. Through the use of e-learning, all involved persons (learners as well as learning assistants) can become independent of the time (being able to choose the time and also the speed of learning [QOL]), place (learning even in remote or mobile situations [SW+03]) and content (which could be made adaptable to the interests and needs of the learner) of the teaching/learning activity. This especially opens new perspectives for people who want to pursue further education to acquire extra skills but hold a full-time job at the same time.

• The advantage described above also brings simplification without losing the aspect of communication and interaction with others (via electronic media such as the Internet) compared to the traditional way of learning [K01].


• Also, the independence of place reduces costs for students (e.g. for traveling to the learning place).

• Immediate feedback on the learning state can be given by automatic evaluation through a program. For students this is helpful to the learning process [O96], and it removes the need for a learning assistant to do it. Especially when learning alone, the interactivity with the program enhances the learning process significantly. For the teacher/learning assistant (if involved in the learning process) it saves time by omitting this task, the feedback about the knowledge status of the students (individually or as a group) helps to better decide how to adapt the course, and automatically generated statistics make it possible to compare results more easily [O96]. This feedback could also be used in assistant-independent studies to ensure that the goals of a learning unit were reached before going on to the next unit, which is especially important when new concepts build directly upon previous ones [O96].

• Due to the electronic media used, it is also easier to repeat learning content, so that learners can use exercises several times if they want. It can even give the possibility of dynamic automatic adaptation of the learning content by the system to the needs of the user, depending on the learner's level of knowledge as determined through evaluation. This automatic adaptation of learning pace and content brings the learner more motivation and less frustration, because the difficulty level is better matched. Although this is not easy to implement and still under research ([RTL02] and others), it will be an important improvement in the future.

• Another important advantage of the electronic form of content is that teachers gain the possibility to distribute their knowledge homogeneously and fast [K01]. This also facilitates the reusability and sharing of content (lectures or exercises), which reduces the time needed for content creation. Although this approach seems to open up a vast amount of content for easy reuse by teachers, it will often be hindered by differences in content styles and the use of different standards regarding formats or quality assurance across multiple authors and institutions [W03]. An important point is therefore interoperability.

• More types of media can be used to impart knowledge [K01], generally all kinds of computer-based formats, like video, sound etc., that bring interactivity to the coursework [QOL]. These new types of media can improve the quality of the learning process because they offer the possibility to use the most appropriate way to explain certain learning content, or even several ways in parallel from which the learner can choose depending on his preferences. They also facilitate learning in general by combining different learning styles/types through varying channels, such as acoustic or visual (the latter covering both pictures and words).

Of course it should not be forgotten that the use of additional media (like pictures) by itself does not lead to a higher quality of learning. Their right usage is important: their didactical function should be to challenge the learner to mentally process the learning content. Their usage should also be well balanced: too much additional media can numb the learner, which is the opposite of the intended goal. What matters is the right strategy in the usage of media in general [S95].

• Another aspect regarding e-learning in the area of further education is its use within companies. This can reduce their costs by saving time for teaching and explaining, and improve the motivation of workers by letting them learn more independently, as already mentioned above. The positive result is more highly qualified personnel obtained by simpler means [RT03].

• Finally, computer-based learning programs in e-learning can have a positive effect on the learning process: because of a mix of self- and program-guidance, the learner stays in an activated state. This is due to the continuous choice of the learning path he has to make, the possibility to try out what he has learned in small exercises, and the evaluation and control of the learning progress through quizzes [S95].
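The automatic evaluation behind the immediate-feedback advantage described above can be sketched in a few lines. This is a minimal illustration with invented names (`evaluate_answer` and the feedback strings are assumptions, not part of any system discussed here):

```python
# Minimal sketch of automatic evaluation with immediate feedback.
# The helper name and feedback strings are invented for illustration.

def evaluate_answer(given, correct):
    """Compare a learner's answer with the reference answer and return
    (is_correct, feedback) so the program can respond immediately."""
    if given == correct:
        return True, "Correct - you can go on to the next unit."
    return False, f"Not quite - the expected answer was {correct!r}. Try again."

# A learner answers a single question and gets feedback at once,
# with no learning assistant involved.
ok, message = evaluate_answer(given="True", correct="True")
```

The point of the sketch is only that the feedback loop closes without human intervention; a real system would also record the result for the statistics mentioned above.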

2.1.3.2 Disadvantages

Although e-learning has a lot of advantages compared to the traditional teaching and learning process, it should not be forgotten that it also brings new problems, mainly arising from the use of computers.

• Another disadvantage (in addition to the ones already described alongside the advantages) is that the creation of online content is usually much more time-consuming, especially when creating content in the interactive formats described above.

• These formats also cause disadvantages from a technological point of view, as the multimedia-based creation and deployment of learning content is usually very resource-intensive [K01]. The learning systems have to be powerful enough to handle them.

• Another technological point is the importance of having a supported and safe system on which the learning process is delivered, which can also be very time- and resource-intensive.

• The computer as a medium can itself scare learners, especially older ones who are perhaps not familiar enough with it [EL]. This reinforces the general need to create easy-to-use content presented in an easy-to-learn program.

• Finally, content providers, who could take some of the difficult work off the learning assistants, are still rare because of the high investment costs (one estimate puts a single e-learning module at about 1 million euros [N02]) and the market risk. One solution could be the cooperation between universities and companies discussed above, which would bring advantages to both sides.

Overall, however, the advantages generally outweigh the disadvantages, meaning that e-learning can enhance most learning processes. Whether, and how, this is possible will of course have to be checked separately for each case.

2.2 Quizzes in e-learning

This chapter provides an overview of quizzes in e-learning for a better understanding of the following chapters, where some specific aspects of quizzes are treated more thoroughly. Throughout this paper, quizzes are understood as all types of assessment, here especially in the area of e-learning education as described before, and therefore carried out on computers. To better understand quizzes in e-learning, it is useful to start from assessment in general and see how the concepts have been adapted to the e-learning context.

2.2.1 Types of assessment

The main goal of evaluation in general is to provide feedback about the state of an observed object. In the case of learning, it serves to check the knowledge of a learner. Learning and the evaluation of learning therefore belong closely together. Without feedback through assessment, there is no means to evaluate the knowledge acquired and therefore no guidance on how to continue or improve, especially since negative evaluation results can also reflect shortcomings in the learning content [EL]. Together, these two parts form the control cycle for effective learning.


The feedback can be obtained through different types of assessment, which differ in the circumstances they are used in and the didactical purposes [EL] they serve for the different groups involved in the learning and assessment process. The three main types are:

1. Self-assessment: usually for exercising purposes, where the result only serves the learner to get information about his own level of understanding. The evaluation can be done either by the learner himself or by another person.

2. Formative assessment: a non-grade-giving informative check (see [W03], [H02], [MH 03]), carried out in the form of pretests/quick tests and mock exams. Although both examples are conducted in a similar way, they have a slightly different emphasis regarding which group they serve: pretests/quick tests are mainly useful to the learning assistant to better guide the teaching as described above (which is the main purpose of this type of assessment), whereas mock exams are also useful to the learner to get to know the conditions of real exams (an extra purpose).

3. Summative assessment: a formal assessment carried out as a “post-test” [EL] after a study unit ([W03], [H02], [MH 03]), needed by the learning assistant to determine, in an often regulated way, the level of knowledge at the end of a study unit (in this paper often referred to as “real exams”, which includes all grade-giving exams for official purposes in education).

2.2.2 Use in e-learning

All the types of assessment described above can also be used in the e-learning context [EL]. The following paragraphs explain how this can be done and which constraints it implies regarding the degree of independence in time, place and content.

Self-tests for self-controlled evaluation of studies can be applied in any kind of e-learning studies, whether in a self-education program or in parallel to an offline education where this kind of assessment is used for exercising purposes. There are no constraints regarding time, place and content, as this is a voluntary task where the learner alone chooses the conditions and content of the test.

Formative assessments can also be applied in all types of e-learning, as they serve to guide the learning process depending on the knowledge of the learner. The adaptation is done by the learning assistant, which can be either a real person (teacher) or the learning program itself. Even so, there are more constraints than in self-tests, depending a lot on the goals: the time of the test is fixed loosely by the fact that its purpose is to check the knowledge at a certain point of the studies (for example at the beginning of the teaching or after a study unit) where decisions about how to continue teaching have to be taken. For a more exact comparison of the levels of a group of learners at a certain point it is better to fix the time. By the same reasoning, the content has to be determined to serve the given purpose. An exception could be the case of dynamically adapted questions in an individual software-based study program. This new concept also shows an additional purpose of this kind of test in the context of e-learning: to personalize the learning process for the learner [EL]. The place should nevertheless usually be free to choose in any kind of e-learning studies, but can in some cases (depending on the requirements of the teacher) also be fixed.

Summative assessments are the strictest of the described assessments. The reason is simple: their purpose is to give an (official) grade to the learner for his level of knowledge in a specific area of studies at the end of a determined study unit with a fixed time. For this reason (based on the idea of fairness between learners), the equality of the conditions of the exam and also the identity of the learner have to be guaranteed. As a logical conclusion, time, place and content should be determined. Nevertheless, there are real-life cases where grade-giving quizzes are taken without restrictions of place (at UNK, a US university, some exams could in some cases be taken from home at a fixed time) and sometimes also without strict time settings (in a pharmacological course of studies, students could take online quizzes for extra credit points by answering multiple-choice questions about extra articles [H02]). How this strictness is defined in the regulations of higher education and how these strict requirements can be met in an online environment will be explained in detail in the following two chapters.

The distribution of use of the different assessment types was examined, among other things, in a recent survey about the use of computer-aided assessment (CAA) in higher education in the UK [W03]. It showed that summative assessment is still on top with 45%, followed by formative assessment with 30% and self-assessment with 20%. The last 5% come from an emerging kind called diagnostic assessment, where the focus is, as the name suggests, on the diagnosis of a case. An interesting fact is that there appears to be a definite shift towards formative use: less than half of the 2003 tests were summative, whereas in 1999 most tests were.

Although this finding concerns higher education in the UK, similar results are likely to be found for higher education in general, because the goals and methodology are by and large comparable among different countries.

Nevertheless, the above results only consider assessments that are already undertaken on computers. They do not show that, on the other hand, computer-based summative assessments in the form of exams are still the exception compared to the traditional way. In most educational institutions that already deliver a large part of their teaching [O96], or even the whole education, in the e-learning way (like most virtual universities), exams are (contrary to the other assessment types) still mostly taken the traditional way. A further discussion of this issue follows in the corresponding chapter about “quizzes as real exams”.

2.2.3 Advantages and disadvantages

2.2.3.1 Advantages

The advantages of quizzes in the e-learning context arise, analogously to e-learning, from their differences compared to their traditional use. As assessment is a part of e-learning, these are mostly the same points as those mentioned before for e-learning in general. Some assessment-specific characteristics were also already discussed in this context under the topic of “immediate feedback”. In a 2003 survey in higher education in the UK [W03], this point was even rated as the most important benefit of computer-aided assessment (CAA). But there are also some changes regarding the strengths and weaknesses of assessment in an e-learning context that should be discussed here. First, a look at the improvements:

Especially compared to the way quizzes are done in traditional education, further simplifications exist in addition to the ones already mentioned. First of all, computer-based quizzes eliminate the need for paper, which reduces costs and the waste of paper as a resource, and is thus more ecological. The electronic form of quizzes also makes their handling much easier. This of course includes automatic correction, but also the possibility to have question banks. These were already found to be in use in a number of higher-education institutions [W03], and the high interest will probably lead to further growth. This is only logical given the simplifications they bring, like the time saved by reusing questions.
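The time saving from question banks comes from storing each question once, with some metadata, and assembling quizzes out of the stored pool. A hypothetical sketch (the field names and the `draw_quiz` helper are assumptions for illustration, not the API of any real system):

```python
import random

# A toy question bank: each question is stored once with topic metadata
# and can be reused in any number of quizzes (field names are invented).
bank = [
    {"id": 1, "topic": "html", "text": "HTML is a markup language.", "answer": True},
    {"id": 2, "topic": "html", "text": "HTML is a programming language.", "answer": False},
    {"id": 3, "topic": "xml", "text": "XML documents must be well-formed.", "answer": True},
]

def draw_quiz(topic, n):
    """Assemble a quiz by reusing n stored questions on the given topic."""
    pool = [q for q in bank if q["topic"] == topic]
    return random.sample(pool, min(n, len(pool)))

quiz = draw_quiz("html", 2)  # two reused questions, no new authoring time
```

Drawing a random subset also makes it cheap to give different learners different but comparable quizzes from the same pool.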

More important are the improvements in readability for learners and learning assistants due to typing textual answers instead of handwriting them. The use of other simple input methods, like mouse clicks, provides unambiguous results as well, leading to the same advantage. Again, this is at the same time necessary for automatic correction. Another advantage for the learner is the ability to easily edit answers several times during the exam [TPP03]. But typing could also be a problem, because it requires a sufficient level of typing skill and speed from the learner and could cause a time problem for slow typists. It is therefore necessary to check that basic typing skills are present, and to give enough time to take the quiz. Clicking is in this case also more advantageous, as it is faster and easier to do, but it does not fit all question types (see below). In a real-time test of a remote exam at the Open University, UK [TP+01], it was shown that typing answers does not hinder students from producing answers in the given time. Most participating students also perceived typing as an advantage over handwriting, independently of their general opinion about remote compared to normal examination [TPP03]. However, the pressure of time due to typing continues to be a major cause of anxiety for students [TPP+02].

2.2.3.2 Disadvantages

The above observations show that an aspect that creates an improvement can also raise new problems at the same time. The same goes for the following points:

The use in an e-learning environment also brings a gain of flexibility to quizzes. But this does not apply to the same extent for all types of assessment, due to the restrictions that come along with them. Self-tests, having fewer limitations, can benefit from total flexibility regarding time, place and content. In contrast, real exams are very restricted because of their goal and the requirements to ensure equality between learners and to avoid fraud, and therefore cannot allow much flexibility. The requirements on real exams will be treated later (in chapter 3.1).


Automation of correction due to the electronic medium is without doubt a big advantage, especially for quizzes. First, it incomparably speeds up correction, which was identified as one of the four main benefits of CAA in the survey described before. Also, it reduces the risk of human errors during correction [SW+03], thus making exams more reliable. It can also create more fairness and equality: during tests at the Open University (OU) it was found that there is greater variation between human markers than between an automatic marker and the average human mark [ME]. Finally, the possibility of automatic data analysis, in the form of statistics generated from the correction results, is even seen as another key success factor of CAA [SW+03]. On the downside, automatic correction is not always easy to implement. It depends a lot on the question type, which will be examined later (chapter 2.3).
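The automatic data analysis mentioned above can be as simple as per-question success rates computed from the corrected results. A sketch, with invented data and helper name:

```python
# Per-question success rates over a group of learners, computed from
# automatically corrected results (True = question answered correctly).
# The data and the helper name are invented for illustration.
results = [
    [True, False, True],   # learner 1
    [True, True, False],   # learner 2
    [True, False, False],  # learner 3
]

def question_success_rates(rows):
    """Fraction of learners who answered each question correctly."""
    return [sum(col) / len(col) for col in zip(*rows)]

rates = question_success_rates(results)
# A rate of 1.0 suggests a question everyone mastered (or one that is too
# easy); low rates point at hard questions or at gaps in the teaching.
```

Such statistics fall out of automatic correction essentially for free, which is what makes comparing results across a group so much cheaper than with paper exams.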

Although the time saving due to automatic correction is obviously real, opinions diverge regarding the overall time saving from the use of computer-based quizzes in general, as found in the survey described earlier [W03]. The additional time spent results from the previously mentioned time needed to create the quizzes and organize their delivery, which was also reported as one of the main disadvantages of CAA [W03]. Still, the majority of the survey respondents who reported productivity benefits did not see a corresponding shift in workload to other staff. The differences in perceived time spent and saved therefore do not only result from the existence and quality of a question bank, but probably also in large part from the quality of the quiz creation and deployment tool used in the specific case, along with the author's personal skill in using it.

Connected to the problem of more time spent on the creation of quizzes is the difficulty of writing good questions, which was also reported as one of the main disadvantages of CAA [W03]. This is partly due to the already mentioned use of more, and possibly interactive, media types, which are more difficult to create. The second and more important reason for the difficulty of creating “good” questions in e-learning [EL] is connected to the question types themselves and their validity for checking the right, and sometimes complex, learning goal. This problem will be discussed further in the following chapter.

One major weakness of quizzes in e-learning, especially when used as real exams, is their strong dependence on the technical systems they are based on. It is therefore no wonder that the key success factors of computer-based exams [SW+03] are mostly technical issues concerning both hardware and software.

The two main points, also listed among the main disadvantages of CAA, are the accessibility and reliability of the whole system (including both hardware and software components). Reliability is especially important to counter the common concerns about quizzes in e-learning, which are usually connected to technological failures. Software reliability was found to be an important requirement during tests of remote exams at the Open University [TPP+02]. Concerns that participants have about reliability also further increase their stress regarding electronic exams: in addition to the ever-present anxiety about rigorous time limits, the students also felt more responsible for ensuring that their answers were submitted on time [TP+01].

The aspect of security, another key success factor, is especially important for quizzes as real exams with regard to the requirement of avoiding fraud. It is obvious that electronically taken exams require more security checks, and only system vendors try to create the impression that they can offer greater security than paper exams [SW+03]. How security can be ensured for quizzes used as real exams will be examined later in a separate chapter.

One last success factor concerning the technological aspects is the technical usability of the software used. This has already become a basic requirement in the software market in general, but as seen in the paragraph about time saving, it should be improved further for quiz-specific tools to facilitate their use.

2.2.3.3 Conclusion on the advantages and disadvantages

As for e-learning in general, computer-based quizzes can bring a lot of improvements for both learners and learning assistants. Weaknesses, on the other hand, arise almost entirely from the use of quizzes as real exams and the restrictions this brings, and the difficulties are mostly technological. A whole chapter is therefore dedicated to answering the question of how to realize real exams in an e-learning context.


2.3 Question types in e-learning

Up to now the discussion treated quizzes in general as entities, without considering the most important parts they are made of, namely the questions. This chapter therefore focuses on the differences between question types and their suitability for use in an e-learning environment.

2.3.1 Types of questions

Different types of questions are used in computer-based learning programs, and they are surveyed now. The term “question type” has to be understood in its broadest meaning of exercise/task type, as it covers not only the act of asking the question but also, more importantly, how and in which form the answer has to be given, e.g. in textual or graphical form. These types are usually differentiated by the way the learning goals are checked, which explains the naming of the different groups after the forms of answering described above: choosing, matching, order, estimate, and numerically or textually entered answers.

Choosing questions consist of selecting the answer(s) from a given list of proposed answers. Further differentiation is done by the number of choices offered and by the number of right answers. “Single choice” signifies that the possible answers come as a pair like Yes/No or True/False (this question type is therefore often abbreviated as T/F), so that there is only one choice to make: true or false. In contrast, “multiple choice” generally means that more than two possible answers are proposed. Depending on the number of right answers, these are further specified as multiple choice questions (MCQ), which have exactly one correct answer, or multiple response questions (MRQ), which can have more than one, as the name implies. [S95] A special case, where the choosing is done graphically, is the so-called “HotSpot” exercise, where the choice is made by clicking on a part of a given picture.

Matching questions are also based on choosing items from a given set, but the focus is on matching them to other objects or groups of objects. This is usually implemented graphically, so that objects are assigned by moving them with the mouse. Examples are “Pick-and-Place” and “Connect the points” tasks.


Order questions ask the learner to bring objects into the right logical order. They are especially well suited for checking the order of activities. The implementation is often graphical, as drag-and-drop helps to visualize and thereby simplify the task.

Estimate questions can be used, for example, to estimate a percentage of a given total quantity. Again, this is easy to present graphically, so that the task is to mark a point or area on a scale or slider.

The last two groups both concern answers provided by typing. The first is specifically about numerically entered answers, which can be simple numbers or ranges of values. The second group handles textually entered answers, which can be words, terms, or even complex text entries; these are further differentiated by the length of the entered text.

Simple text answers, also called short-answer questions, ask for words and terms only. So-called “Fill in the Blanks” (FIB) questions are a subtype of this subgroup, the only difference being that the answer is typed directly at a place in a given text. Supplying a list of possible answers, on the other hand, changes the focus of the question type, so that this case could be seen as a special kind of multiple choice question in which the answer has to be typed in. [TP+01]

The second subtype are complex text entries, often called “essay questions”, where the answer consists of whole sentences up to a long passage of text. These correspond most closely to questions in traditional written exams.
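The taxonomy described above can be summarized in a simple data model. The following is only an illustrative sketch; the type and field names are my own and do not come from any quiz standard:

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class QuestionType(Enum):
    # the "choosing" group
    TRUE_FALSE = auto()        # single choice from a pair
    MCQ = auto()               # multiple choice, exactly one correct answer
    MRQ = auto()               # multiple response, one or more correct answers
    HOTSPOT = auto()           # graphical choice on a picture
    # the other groups
    MATCHING = auto()          # associate objects with other objects/groups
    ORDER = auto()             # bring objects into a logical order
    ESTIMATE = auto()          # mark a point/area on a scale or slider
    NUMERICAL = auto()         # typed number or range of values
    SHORT_ANSWER = auto()      # typed words/terms, includes FIB
    ESSAY = auto()             # typed sentences up to a long text

@dataclass
class Question:
    qtype: QuestionType
    prompt: str
    options: list[str] = field(default_factory=list)   # empty for typed answers
    keys: list[str] = field(default_factory=list)      # the correct answer(s)

q = Question(QuestionType.MRQ, "Which of these are question types?",
             options=["MCQ", "HTML", "FIB"], keys=["MCQ", "FIB"])
print(q.qtype.name)   # MRQ
```

Such a model makes the later distinction between automatically and manually correctable types explicit: the `keys` list is unambiguous for choosing, ordering and numerical types, but only approximate for short answers and essays.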

Although the classification described above is used throughout this paper, it is of course not the only one possible. Another approach, used in the IMS QTI (a quiz specification), is to group questions by the type of input for the answer: logical identifier (clickable buttons or sliders), xy-coordinates, string (textual entries), numeric, and logical group (graphically grouped items) [QTIa]. This developer-side approach is logical, as the IMS QTI is a standard used in quiz creation (more on this later in chapter 3.2 on “Quiz Standards”).


2.3.2 Differentiation of the question types regarding their use in e-learning

Having introduced the different question types, it is of interest to find out how well they lend themselves to use in e-learning. First, it is important to note that there are no “good” or “bad” question types; it always depends on how they are used and how well they are suited to check the given learning goals in a specific situation. [S95] The following observations will therefore not give an absolute grade to the different question types, but take a closer look at some characteristics that are important to consider when discussing their use in e-learning. The results also help to explain why certain question types are currently used more than others. One example: in the 2003 survey discussed earlier, MCQs were still on top (as in a similar 1999 survey) with 55%, followed by MRQs with 17%, so that the “choice” group constitutes the majority of use. [W03]

Some important aspects that reflect the differences, and that are examined here, can be expressed through the following question: how do question types differ in their use in e-learning regarding...

1. ease of creation and use

2. possibility (including the possible degree) of automatic correction and marking

3. suitability to check complex learning goals/content

Ease of creation and use

The ease of creation depends of course on the tool used, but in general it seems a little easier to create textually presented questions than graphical ones, given the media they use (mentioned in the description of the question types). This holds both for the author of the quiz and for the creator of the tool, where a shift of effort towards the tool developer takes some load off the author. It should of course be noted that not all tools support graphics to the same extent, and thereby not all kinds of question types.

The ease of use, on the other hand, depends on the kind of input used (typing, mouse, ...), which is connected to the question type (as also described before). Generally, all computer-based input methods have the advantage of not being very difficult, as the devices are simple to use. Still, using the mouse can simplify the answering tasks a little more compared to typing.

In conclusion on both points, the differences between question types are not particularly significant. All computer-based question types are advantageous in their use, whereas ease of creation strongly depends on the features and quality of the tool used.

Automatic correction and marking

Two approaches are possible for conducting correction (and therefore also marking): manual and automatic. Although automatic correction is not new (e.g. Computer Marked Assignments (CMAs) using mark sense cards have been used at the Open University since 1971 [TP+01]), it is of course one of the main advantages of assessment in the e-learning context (as mentioned earlier). One problem is that it is not equally applicable to all question types.

Automatic correction is easily possible for all question types that consist of choosing, use the mouse as input, or consist of entering numbers. The reason is that in all these cases the answers (complete ones as well as parts of the complete solution, like one option of an MRQ or one object in an order question) can be unambiguously evaluated as right or wrong.
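For these unambiguous types, correction reduces to comparing the given answer with the stored solution. A minimal sketch (the function names are illustrative, not from any tool):

```python
def correct_choice(selected: set[str], keys: set[str]) -> bool:
    """An MRQ (or an MCQ, with a single key) is right iff exactly the keys were chosen."""
    return selected == keys

def correct_order(given: list[str], solution: list[str]) -> bool:
    """An order question is right iff the sequence matches exactly."""
    return given == solution

def correct_numeric(value: float, low: float, high: float) -> bool:
    """A numerical answer may be accepted within a range of values."""
    return low <= value <= high

print(correct_choice({"a", "c"}, {"a", "c"}))                           # True
print(correct_order(["plan", "do", "check"], ["plan", "do", "check"]))  # True
print(correct_numeric(3.14, 3.1, 3.2))                                  # True
```

Each comparison is a pure equality or range test, which is exactly why these types pose no problem for automatic correction.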

All textual answers (without a given choice of answers, which would then be treated as choosing questions) are a little more difficult. Automatic correction is theoretically possible for all kinds, but becomes increasingly difficult to implement as the complexity of the text grows.

Short answers consist of entering words and terms, but in comparison to numerical questions, which also accept only one short entry, the problem here is that textual answers are not necessarily unambiguous and can have more than one right answer. On the one hand, this results from possibly allowing orthographic mistakes, which then have to be recognized by the system through a certain fuzziness. On the other hand, and more importantly, there can be synonymous words and different correct spellings. The easiest solution is to supply a list of similar words that count as right answers. [S95] Still, it can sometimes be difficult and time-intensive (e.g. when the information is gathered by polling many people) to obtain a sufficiently good list.
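The combination of a synonym list and a certain fuzziness can be sketched with standard string similarity; the 0.8 threshold below is an arbitrary illustration, not a recommended value:

```python
from difflib import SequenceMatcher

def correct_short_answer(entry: str, accepted: list[str], fuzz: float = 0.8) -> bool:
    """Accept an answer if it is close enough to any entry in the accepted list.

    `accepted` holds the right answer plus synonyms and alternative spellings;
    `fuzz` tolerates orthographic mistakes (1.0 would mean exact match only).
    """
    entry = entry.strip().lower()
    return any(SequenceMatcher(None, entry, a.lower()).ratio() >= fuzz
               for a in accepted)

accepted = ["colour", "color"]                   # alternative correct spellings
print(correct_short_answer("Color", accepted))  # True  (list of allowed words)
print(correct_short_answer("colur", accepted))  # True  (orthographic mistake)
print(correct_short_answer("shape", accepted))  # False
```

The quality of the `accepted` list, not the matching itself, is the hard part, exactly as noted above.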

Essay questions are the most difficult, as the answer is given in the form of a long text that can range from a whole sentence up to a complex passage. To find out whether the answer is correct (or to which degree), the text can be checked for the expected statements and connections between facts. It is usually not sufficient to check only for the existence of keywords; combinations of possible allowed keywords must also be looked for. [S95]
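Checking for combinations of allowed keywords, rather than single keywords, can be sketched as follows. This is a toy illustration of the idea only, not any tool's actual mark scheme:

```python
def contains_statement(text: str, keyword_groups: list[list[str]]) -> bool:
    """A statement counts as present if, for every group, at least one of its
    allowed keywords occurs in the text; the groups together form the required
    combination (e.g. a subject and a property that must be linked)."""
    lowered = text.lower()
    return all(any(kw in lowered for kw in group) for group in keyword_groups)

# expected statement: "MRQ" must appear together with "several/multiple answers"
required = [["mrq", "multiple response"],
            ["several", "multiple", "more than one"]]
print(contains_statement("An MRQ can have several correct answers.", required))  # True
print(contains_statement("An MRQ has options.", required))                       # False
```

Even this simple scheme already rewards the combination of facts rather than the presence of a single keyword, though it cannot judge whether the facts are actually connected correctly in the sentence.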

When the answer consists of only a very few sentences, this can still be done automatically. An example: at the Open University in the UK, automatic correction is used for mock exams taken over the Internet [ECI]. For the tool to automatically correct answers consisting of only one or two sentences, the author of the exam has to give a “stylised specimen solution and mark scheme“ for each question. The tool for this, called “Solution Editor”, can be seen below in Figure 1.

Still, the task of automatically correcting complex texts usually exceeds the capabilities of learning tool technology. The reason complex texts can hardly be checked well by machine lies in the non-logical structure of language and the fact that the required interpretation is an extremely difficult task. [TP+01] The usual solution depends on the learning case. For self-assessment, or learning without a person as learning assistant, a proposed solution (a kind of sample solution or standard answer) is usually given as feedback so that the learner can compare his result to it. When there is a person acting as learning assistant, the comparison and grading have to be done manually by that person. [S95]

Figure 1: Automatic correction with "Solution Editor" (source: [ECI])

In e-learning quizzes that use many different kinds of questions, a mixed-mode correction is often possible: long-answer questions are marked manually, and all other types are corrected automatically (including answers consisting of only one sentence, if the tool provides that possibility). [TP+01]

Suitability to check complex learning goals/content

Another key success factor of CAA identified in the corresponding survey [SW+03] is test validity and reliability. How reliably a quiz assesses learning achievements depends on the suitability of its questions for checking complex learning goals, like understanding connections between facts, which is usually required in summative assessments. The question types described before are suited to this in different degrees.

Starting with the “choosing” questions: the simplest kind in this group, T/F questions, are not useful for checking complex learning goals. [S95] The main reason is the simple binary choice explained before, which offers no possibility to check more complex relationships. Also, the chance of answering correctly by guessing is too high (discussed in more detail later).

In contrast, MCQs and especially MRQs can be used to check complex learning goals. As already mentioned, they are a popular method of computer-assisted assessment due to their obvious advantages, like the simplification of creation, use and evaluation. However, their introduction into real exams, which traditionally consist of essay and problem-type questions [OB03], usually raises concerns about their reliability [MH 03]. This is due to a generally negative association with these types in practice, where MCQs and MRQs are often said to be dull or even dumb. The reproach is that they usually test recall or recognition rather than knowledge. A quotation that reflects this well is "A doctor does not get five choices" [MAW03].

Still, this does not result from the question type itself, but from the way such questions are usually implemented. An important didactical technique is to avoid testing factual knowledge and instead check applied knowledge or practical behavior.

Another fear regarding the use of these question types is that of simplification through the possibility of answering by chance, as the answers no longer have to be “found” through the student's own work, but are “given” (although bundled with wrong answers). This seems to simplify the assessment too much and raises concerns about maintaining the standard of education.


Experiments have already been undertaken to find out how large the guess factors of questions are. These were mainly done for MRQs, comparing them with other questions that have so-called “objective formats” such as T/F and MCQ [MH 03]. The data assumed for the analysis, concerning both the number of right answers and the total number of possible answers, were based on a preceding survey in which a large number of questions in real use were analyzed. It showed that the most popular combination of keys and options (covering 40% of the questions tested) was 3 from 6 [QOL]. The final result was that the chance factor for MRQs is still quite high compared to MCQ and T/F (“MRQ chance factors peaked at 0.5 accounting for 57.8% of the questions” [MH 03]), but as an advantage MRQs allow more dispersed result possibilities, because there are more options and combinations to choose from.
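To make the intuition concrete, the probability of guessing an entire question right under simple uniform-guessing assumptions can be computed as follows. Note that these illustrative numbers are not the "chance factors" of [MH 03], whose definition differs; this is only a sketch of why MRQs disperse results more than T/F and MCQ:

```python
from math import comb

def guess_probability(options: int, keys: int, known_key_count: bool = True) -> float:
    """Probability of guessing the exact set of correct options at random.

    Assumes the guesser picks uniformly at random. If the number of correct
    answers (keys) is revealed to the learner, only sets of that size are
    candidates; otherwise any non-empty subset could be chosen.
    """
    if known_key_count:
        return 1 / comb(options, keys)      # e.g. 3 from 6 -> 1/20
    return 1 / (2 ** options - 1)           # any non-empty subset

print(guess_probability(2, 1))   # 0.5   T/F: one choice out of two
print(guess_probability(5, 1))   # 0.2   MCQ with 5 options, one key
print(guess_probability(6, 3))   # 0.05  popular MRQ format, key count revealed
# same MRQ when the learner is NOT told how many answers are right:
print(round(guess_probability(6, 3, known_key_count=False), 4))  # 0.0159
```

The last two lines also anticipate one of the countermeasures discussed next: simply not revealing the number of right answers already lowers the guessing probability noticeably.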

Some further ideas on how to decrease the risk of "answering by chance" for MRQs are explained now:

• The assessment of a question can be handled as “all or nothing”, meaning that no points are given for partial results, so that only the correct combination of right answers counts as passed. The possibility of getting no points at all gives the learner extra motivation to ensure that the answer is indeed correct. This system is already used for the written theoretical part of the driver's license examination in Germany, and is thus proven in practice.

• A similar but stricter version of the above is to give minus points for choosing a wrong answer. The possible negative result drastically reduces the number of answers that are right only by guessing. It is generally used to make MCQs in particular more difficult, since these cannot adopt the first scheme, there being just one right answer. Use with MRQs is also possible, but it should preferably not be combined with the “all or nothing” scheme, as this would be much too difficult and could scare learners to the point that they do not answer questions when they are not 100% sure.

• A very simple improvement for MRQs, which is usually applied, is not to tell the learner the number of right answers of an MRQ question.

• A last possibility, concerning the content of the possible answers, is to make them more difficult by providing very similar options for a question, so that the learner really has to think about the problem. Of course this is only a very simple proposition and does not at all solve the problem that many authors find MRQs in particular difficult to create, because of the multitude of case differentiations combined with the compactness of the presentation of the answers. [S95]
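The first two scoring schemes above can be sketched as follows; the point values are illustrative only:

```python
def score_all_or_nothing(selected: set[str], keys: set[str], points: int = 1) -> int:
    """Full points only for exactly the right combination; no partial credit."""
    return points if selected == keys else 0

def score_with_penalty(selected: set[str], keys: set[str],
                       per_right: int = 1, per_wrong: int = 1) -> int:
    """Points per correctly chosen key, minus points per wrongly chosen option;
    the possible negative contribution discourages guessing."""
    right = len(selected & keys)
    wrong = len(selected - keys)
    return right * per_right - wrong * per_wrong

keys = {"a", "c"}
print(score_all_or_nothing({"a", "c"}, keys))  # 1
print(score_all_or_nothing({"a"}, keys))       # 0
print(score_with_penalty({"a", "b"}, keys))    # 0  (one right, one wrong)
print(score_with_penalty({"a", "c"}, keys))    # 2
```

Whether a negative total is clamped to zero, and how the penalty weight is chosen, are exactly the didactical decisions discussed above.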

“Matching” as well as “order” questions, which are quite similar in their goals and methods, have also both been found to be well suited for checking complex learning goals, with order questions being especially good for checking practical knowledge. [S95] The reason is the same for both: these types are well suited to check whether a concept was really understood by the learner, since more complex relationships (even more than for MRQs) can be simulated, as multiple objects have to be associated with possibly multiple groups.

Regarding the two question types with numbers as answers: “Estimate” questions are not really useful for checking complex learning goals. [S95] This is mainly due to the purpose of this question type, checking factual knowledge, along with its fuzzy view of the result.

In contrast, questions with numerically entered answers are well suited, for example for checking the results of difficult calculations. Still, the fact that only the result of a complex task is evaluated is problematic, especially concerning feedback: if the given answer is wrong, it is difficult to find out WHY, or where the mistakes lie [S95].

Within the group of textually entered answers, distinctions have to be made. Simple text answer questions requiring only words and terms as entry (including FIB) can hardly be used to check complex learning achievements, especially applied knowledge. Again, the reason lies in the fact that only factual knowledge can be checked with them. In contrast, essay questions consisting of complex text are logically suited to do so, as they are widely used in traditional assessment. More difficult to classify are textual answers consisting of only one or two sentences, which therefore lie in complexity between short and long answers. Some analyses of the validity of this kind of short answer were conducted at the Open University [TP+01], where the test exams contained both short and long textual answers, all graded manually.

In the first test exam, questions were evenly split between the two types. Analysis of the results showed that students' results on the short-answer questions were a very good predictor of their final grades. [TP+01] The weighting between the question types was therefore changed in the second test exam to 70% short and 30% long answers, while at the same time the long-answer questions were made more demanding. The outcome was still that the short answers predicted the final grade extremely well. This shows that this kind of question is also suitable for assessing complex learning goals, but ultimately the design of the questions is the main issue.

2.3.3 Explaining the preferences in the use of question types

The question of why some question types are used more than others can now be answered easily:

• MCQs and MRQs currently hold the majority in use, as they perform well in all the categories described above. This number does not include T/F questions, which, although also part of the “choosing” group, are not sufficiently suited to checking complex learning goals.

• The next places in the 2003 survey [W03] were shared by graphical HotSpot and text input questions, both at 4%. For HotSpot, the reason for the much lower use compared to MCQs is the complexity of creation: it is a new type with a very specific area of use and probably not supported by all quiz tools. For textual entries the result is surprising and can only be explained by the fact that short answers are not suited to complex content checking either, while essay questions can rarely, if at all, be corrected automatically.

• The remaining 20% consists of all other types, which individually do not reach the 4 percent mark.

Finally, it should not be forgotten that the design of the questions is most important to the success of the assessment. [TP+01] Yet precisely this is the major difficulty of the much-used MCQs and MRQs [OB03], which still does not seem to significantly reduce their popularity.


3. The current situation of quizzes in e-learning

This chapter continues the presentation of quizzes in e-learning started in chapter 2, but, in contrast to that chapter, it specifically analyzes the current situation.

The relevant topics to be discussed are the use of quizzes as real exams, current quiz standards, and the state of the art of quiz tools.

3.1 Quizzes as real exams

This area of use for quizzes is the most interesting because it has (as seen in chapter 2.2) the most restrictions. With respect to current regulations, this chapter examines whether quizzes can be used as real exams, and which requirements would have to be, and could be, fulfilled.

3.1.1 Requirements on the use of quizzes as real exams

3.1.1.1 Regulations concerning exams

As said before, summative assessments are the most difficult to realize in e-learning because of the strictness of their requirements. This results from their purpose of giving an (official) grade that reflects the knowledge level in a defined area of study, which could be higher education or any kind of specialized knowledge. The grade is used to show qualification for a certain task or confers the right to do something (like working in a particular job or driving a car). For general acceptance of the degree, and for the possibility of treating results from similar institutions as equivalent without further comparison, there have to be official regulations that specify the exact meaning of the degree and how it has to be measured and assessed. In most cases, assessment is done by some kind of exam (oral, written, or a mix of both). Before adapting e-learning to this kind of exam, it is necessary to check which requirements the regulations place on exams, and then how these can be met, to find out if and how real exams can be conducted in an e-learning environment.

The requirements are checked here by examining the regulations in the field of higher education, because of its high importance, given that the resulting degree is a first job-qualifying degree ([DPO], § 1), and because of its importance in terms of scale of usage. An example is taken from the area of computer science, which is interesting because it is usually the subject in higher education that adopts computer-aided assessment (CAA) first, as found in a recent survey. [W03]

Each country has general precepts regarding studies and exams, containing general rules that have to be adhered to by all institutions of higher education in that country [OB03]. These are usually general statements about the goals of studies and exams, like the precept in the UK saying that “higher education institutions must ensure that (...) assessment policies and practices are responsive and provide for the effective monitoring of the validity, equity and reliability of assessment.” [OB03] Similar precepts exist in most countries, but obviously these are not specific prescriptions, as they usually only define “what” has to be done, not “how”. More specific regulations are defined by the universities themselves, usually separately for each area of study. As an example, let us take a closer look at the parts relevant to quizzes of the diploma examination regulations (“Diplomprüfungsordnung”) for the area of study “Business Informatics” at the Technical University of Braunschweig, Germany:

The opening statement says that “requirements on this exam (the final diploma examination) are necessary to ensure the standard of the education” ([DPO], § 1), giving another reason why fixed requirements are important. Accordingly, the structure of the final examination is laid down in various paragraphs, specifying the areas of the partial exams, their length, and the role of supervising personnel. Still, the responsibility for and choice of the content of the exam is left to the examiner (defined in § 8, 5.). As possible forms of exam, only the written and oral forms are mentioned (in § 8, 1.), but it is not prescribed at all which medium has to be used. This means that these regulations do not hinder computer-based exams, as these can be seen as a special form of written exam, in which case the regulations concerning written exams have to be applied. In the given example, they are as follows:

“In a written exam the student has to prove that he can, within a given time, with limited resources and under surveillance, recognize a problem and find ways to a solution by applying the commonly used familiar methods of this subject.” ([DPO], § 8, 3.)

One last important aspect of exams defined in the regulations is how cheating should be handled. They state that any attempt to cheat by deception or by the use of unauthorized resources results in the exam in question being counted as “failed” (§ 10, 3).


3.1.1.2 Resulting requirements for real exams

As a summary, the requirements that can be placed on exams in general are now listed. They can be divided according to the part of the exam they concern: either the content or the proceedings of an exam.

Content

The main requirement on the content is to assess the learning level of the exam-taker (which is of course THE goal of any assessment). Generally, the choice of and responsibility for the content is left to the examiner (as also mentioned in the regulations above). This also concerns the form in which the content is presented, thus the question types and the problem of their differing suitability, as discussed earlier. One more task that lies in the responsibility of the author of the exam is to ensure unambiguous comprehension of the content of the questions and of the whole exam.

Proceeding

The group of requirements concerning the exam-taking proceedings serves two main purposes. The first is to ensure equality of conditions, with the goal of comparability of results. The second concerns the correctness of the exam-taking, which also includes avoiding fraud, with the goal of ensuring the quality of education. Most of the requirements serve both purposes at the same time, as most measures taken to ensure one overlap with those that ensure the other.

The necessary requirements on the proceedings are as follows:

1. set and ensure the same exam conditions for all participants:

a) regarding time, this concerns the length of the exam-taking time, as well as the date and hour when the test is taken, and also when the results are returned;

b) regarding information, this includes having the same questions and the same additional resources (like books) as the others, but also limiting these information sources by generally preventing access to information other than that allowed (including non-allowed communication with others) to ensure autonomous work.

2. ensure unambiguous comprehension of the instructions on how to proceed in the exam (which is generally also done by the author and can therefore be treated in the same way as the content of the exam, as described above)

3. set and ensure equal grading conditions: this consists, on the one hand, of having unambiguous, clearly right answers that are compared with the results from the exam to determine their correctness; on the other hand, the answers should be corrected (at least per question) by the same person.

4. authenticate the participants: this part of exam supervision, apart from the monitoring already mentioned to ensure equality, has the goal of guaranteeing that the knowledge of the “right” person is assessed (the one the person claims to be), thus serving to prevent fraud.

To offer quizzes as real exams, all these requirements have to be met in the e-learning environment as well. How this can be done, and whether and how it is possible to realize, is examined in the following chapter.

3.1.2 Fulfillment of the requirements in an online environment

3.1.2.1 Changes in the requirements when adapting to the e-learning environment

Before starting the analysis of how the different possibilities for exams in e-learning fulfill the necessary requirements, it should first be checked which requirements are relevant in an e-learning environment.

First, it is not necessary to check the requirement on the content when transferring quizzes to another context, as the content generally stays the same in any case. Again, all decisions and responsibilities concerning the creation of the quiz are left to the author, though it should be noted once more that this task does not necessarily become easier in the e-learning environment (see the findings on the disadvantages of quizzes in e-learning and the problems in creating questions of different types discussed earlier).

The second group of requirements, on the other hand, concerning the exam-taking proceedings, should be examined in more detail for changes in their fulfillment. The reason is that the adaptation of quizzes to e-learning brings with it the use of a new medium to carry out the exam, which offers both new possibilities and new problems.

Finally, the use of the computer as a new medium also brings a new group of technical requirements that need to be ensured (for exams as well as for all kinds of assessment). These include the following points, already mentioned earlier as key success factors of quizzes in e-learning (see chapter 2.2.3 on the advantages and disadvantages of quizzes in e-learning):

• Accessibility: to guarantee equality of access speed and prevent breakdown of connection

to the server

• Reliability of the system: to ensure that it is stable and prevent information loss in case of

problems. This also includes persistence of the transactions, so that problems arising from

failure during the transmission of answers can be avoided. [TP+01]

• Security: to ensure that no one other than authorized persons can access sensitive information about the quizzes (such as the questions or even their solutions) at any time, or even take the exam (again to avoid fraud)

• Technical usability: although this generally includes the ease of creating and using the quiz tool, only the ease of taking the exam is of concern here. It is mostly based on the possibilities of the quiz program used, and is also partly realized through clear and unambiguous content and layout of the exam, as described above in the second proceedings requirement. The creation of an exam, in contrast, which serves the author and not the exam-taker, depends on the possibilities of the creation tool. This will be further examined at a later point.

All these technical requirements are extremely important to ensure because of the importance of the exam itself, and solutions have to be devised to catch all kinds of problems that could arise, in order to ensure the best possible conditions for the exam-taking. As these requirements are tightly connected to the quiz-taking system used in a specific situation, they will have to be checked in the given context. This means that they will not be examined further here, as the following analysis only deals with general possibilities on a more abstract, system-independent level.

One last additional and very general requirement when adapting quizzes to the e-learning

environment is that the whole exam-taking process should bring advantages to both examiner


and participant of the exam. This includes, for example, that the grading of the exam should NOT become more difficult than in the traditional way, which generally means manual grading, possibly supported by some technology, as is already possible for exams that only contain "choice" questions.

3.1.2.2 Evaluation of how these requirements can be met in online environments

In general there are two possibilities for exam-taking on computers, which differ principally in the concept of the place where the exam is taken (as time and content always have to be fixed for exams, as said before, the place is the only relatively free option): exams can take place either in specified places chosen by the studies administration or in unspecified remote places chosen by the learners themselves. In this chapter, these two methods will be examined with regard to how they meet the necessary requirements for real exams, which will determine whether or not they are suited to this use, and if not, which problems cause this and what would have to be changed to solve them.

I. Adaptation of the traditional method

The first method consists of a rather strict adaptation of traditional exam-taking (resulting in fixing the place in addition to the time and content), with the difference that the exam is delivered and taken electronically, using a computer or terminal connected to a network as the medium. The exam-taking place is monitored and can of course also be realized as several distributed places (likewise predefined and monitored), analogous to real exams (e.g. the baccalaureate in France). Both possibilities for the distribution of places will be handled in parallel here, as this does not change the general proceedings of this type of exam-taking and only differs in some implementation details of the technological aspects, which will be mentioned at the given point.

How the requirements on the proceeding of the exam can be met will now be analyzed by

checking them one by one:

1. a) Equality of time is logically given, as the time is fixed as in traditional exams, so that all participants take the exam synchronously. All measures for ensuring equal time (date and length of


exam-taking as well as the time when results are returned) can nevertheless be greatly simplified by (partly or fully) assigning these tasks to the computer system itself. The time until the results are given back can be shortened as well when automatic correction is used. Still, with this type of assessment the results have to be presented at a time after the end of the exam (and not only because some question types require manual correction), in contrast to self-assessments, where results can be given immediately after each question.

b) Equality of resources is given as well, since all participants have the same questions and additional resources and communication are controlled, just as in traditional exams. Simplification is also possible here by sharing the monitoring task between a monitoring person for the real environment (as in traditional exam-taking) and the computer system for the virtual environment (e.g. by blocking unwanted ways of information retrieval, and by monitoring/logging all transactions and checking them for disallowed ones). This is especially useful as, in the case of computer-based exams, the possibilities for cheating are potentially enlarged by the additional possibilities offered by the medium used, especially when connected to a network or even the Internet.

2. The problem concerning unambiguous comprehension of the instructions generally stays the same as in the traditional model, as it is still left up to the author of the exam. The only possible change is due to the use of software to create the quiz. Depending on the tool used, there is sometimes the choice between using the instructions written by the author of the quiz and using predefined instructions adapted to the different question types supported by the program. A mix of both that simplifies the work for the author is the possibility of letting the author enter the content of these instructions once per question type, to be displayed automatically with the corresponding question in the exam. Creation tools will be discussed in more detail later.

3. Equality of grading conditions can also be achieved more easily and equitably by letting the system perform automatic corrections, although there may always remain some question types that require manual correction. Still, this is an enhancement over the traditional way. Another advantage, important when the exam-taking places are distributed, is the fact that the electronic transmission, and sometimes also the distribution of the system, can ensure that the correction of all exams is done in the same period of time, compared to the problem that occurred in the traditional way when the answers first had to be sent by mail to the correcting person. On the other hand, automation requires an even higher degree of unambiguity and clearness of the solution. But as explained earlier in chapter 2.3


concerning question types, this is ensured by the author (as in the traditional way) in combination with the quiz creation tool used and, depending on the type of question, the clearness is sometimes implied by the form of the answer.
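Such automatic correction of "choice" questions can be sketched as follows: one fixed answer key is applied to every submission, so identical grading conditions hold by construction. The data layout and names below are illustrative only, not taken from any particular quiz tool:

```python
# Illustrative answer key: question id -> set of correct option ids.
ANSWER_KEY = {
    "q1": {"b"},
    "q2": {"a", "c"},  # a multiple-response question
    "q3": {"d"},
}

def grade(submission):
    """Score one participant's choice answers against the fixed key.

    A question scores 1 point only if the selected options match the
    key exactly; since every submission is graded by the same rule,
    equality of grading conditions is guaranteed.
    """
    return sum(
        1
        for qid, correct in ANSWER_KEY.items()
        if set(submission.get(qid, ())) == correct
    )
```

A real system would add per-question weights and partial credit policies, but the point stands: the grading rule is data, applied uniformly to all participants.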

4. More problematic is the authentication of the exam participants, which can hardly be done by the computer system alone: passwords or chip cards alone, for example, are not sufficient for reliable authentication of the person, and more complex methods like biometric control are more resource- and cost-intensive. But as this kind of exam is done in monitored places, this task could again either be performed by the supervising person alone (as traditionally), or additionally in combination with the computer system as an extra means of authentication.

Conclusion on I.

Although the computer system can enhance or even completely take over most of the traditional tasks, it is more difficult to replace those of a monitoring person. This is due to the fact that a person is still more useful and economical for monitoring real-world information exchange and authentication, which could otherwise only be done with large efforts and investments in electronic equipment. This monitoring problem turns out to be the justification for fixed places in real exams. Indeed, such exams are already widely used in education, independently of the degree of e-learning implemented. Usually they are done as simple computer-aided assessments in one fixed place, as described earlier. One example that implements this solution to the problem of having distributed places is found at the Open University, UK, which since 1997 has also offered courses to students outside the UK, together with the possibility of taking exams. These are done in local centers in the corresponding countries, under the monitoring of qualified local invigilators, who also supervise the correctness of the data transmission. The questions and the students' answers are transmitted electronically to ensure equality of time in quiz-taking and also correction in one central place. [TP+01] To avoid cheating, the participants were supervised during the exam and their Internet connection was disabled.

Generally this method is not difficult to implement and brings enhancements to both students and examiners, although the advantages for the examiner outweigh those for the students. This is different for the second possibility, discussed now.

II. Remote exams in unspecified places (take-home exams)

As already mentioned above, the main difference compared to the first described possibility of


computer-based exam-taking is the fact that the place for this second approach is not specified

but can be freely chosen by the participant. With high probability, this will often be the participant's home, for reasons of convenience, which explains the alternative name "take-home exams". Without fixed places, the role of the supervising person also ceases to exist.

How this affects the fulfillment of the requirements will be explored now:

The requirements regarding both unambiguous comprehension of the instructions and equal grading conditions (2. and 3.) stay the same as for exam-taking in specified places, as the choice of the place does not influence these parts.

This is different for the remaining requirements, which all involve monitoring:

The equality of time (1.a) becomes the task of the system alone. The system has to be safe against all kinds of manipulation, such as attempts to avoid, stop or reset the timer, e.g. by logging off and logging back on later. So the technical requirements are even more important in this kind of exam-taking.
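One way to make such a timer safe is to keep it entirely server-side and anchor it to an absolute deadline fixed at the first start of the exam, so that logging off and back on changes nothing. A minimal sketch, with all names illustrative and not taken from any particular quiz system:

```python
import time

# Server-side store of exam deadlines, keyed by participant id.
# Persisting the absolute deadline (rather than a countdown) means
# that logging off and logging back on cannot reset or pause the timer.
_deadlines = {}

EXAM_DURATION = 90 * 60  # 90 minutes, in seconds

def start_or_resume(participant_id, now=None):
    """Return the remaining time; the deadline is fixed on first call."""
    now = time.time() if now is None else now
    if participant_id not in _deadlines:
        _deadlines[participant_id] = now + EXAM_DURATION
    return max(0.0, _deadlines[participant_id] - now)

def accept_answer(participant_id, now=None):
    """Reject answers submitted after the deadline."""
    return start_or_resume(participant_id, now) > 0
```

In practice the deadline store would live in a database so that even a server restart cannot be exploited to regain time.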

For ensuring the equality of resources (1.b), the whole task is, as for time, transferred to the system. This means in particular that it has to supervise not only the virtual but also the real environment. Although monitoring of the computer-based part is possible as described earlier, it is not as simple as in specified places, because control over the participants' computers is not automatically given. Due to the multitude of differences between these computers, some extra tests have to be done before the exam itself, to check that the computers of all participants are properly configured and the needed tools correctly installed. [TP+01] Also, the participant has to give authorization for his computer to be controlled by the quiz system administrators to ensure the correct proceeding of the exam. To generally avoid problems related to technological issues, the use of a less tightly timed exam is another approach. [TP+01]

As described before, it is difficult for the system to monitor everything outside the virtual world; doing so requires additional soft- and hardware and therefore complicates the whole process. One example: a very straightforward idea, directly deduced from the traditional way of monitoring, could be to use webcams to supervise the participants. Still, one webcam alone cannot see everything that happens in the remote place, so it cannot guarantee that no additional information is used. As a result, a large number of cameras, as well as microphones to catch the sound in the room, would have to be used. But this becomes quite bulky and therefore unrealistic to implement for a large number of


participants.

The requirement of authentication of the participants (4.) encounters similar problems. Generally it shows the same problems as for the exam in specified places, but without the simple solution of a monitoring person. Some ideas on how the identity can be checked: Again, the most direct way is to have a visual connection to the participant and to compare this to a previously taken picture of the person, or even a copy of a certified ID card. Of course, in this case there must be a way to check whether the person in front of the webcam is also the one typing the answers. Other propositions for checking whether the person taking the exam is who they claim to be involve asking the person questions about what they have just written in their answers. This can either be done during the exam (in written or audio form) or afterwards as a kind of re-audit. [TP+01] Also, a style analysis could be performed to compare some of the student's homework to the exam answers and detect possible discrepancies. Whenever any of these checks raises a doubt about the identity of the writer, further investigations could be undertaken. [TP+01] Still, these approaches are all complicated to realize or not safe enough.

Conclusion on II.

This way of exam-taking is generally not well suited for real exams, as the required level of control can only be assured technically with great efforts and investments in electronic equipment. Therefore remote assessments are for now generally just used for homework or personal evaluation tests, as these work on the basis of trust and the willingness of the participants to evaluate themselves, because whoever really wants to cheat could find a way to do it. Another possibility are remote mock exams. There, supervision is not necessary either, because they serve only revision purposes for a following real exam and do not count for credit. As it is up to the student to use the opportunity to gain experience of sitting an examination under controlled conditions, cheating is pointless and therefore usually does not occur.

As shown, the final hurdle will be to allow students to take an examination in their own home unsupervised. Some experiments have already been undertaken with the goal of finding a way to conduct real exams in the remote way described.

Again at the OU, another remote mock exam with more closely simulated exam conditions (timed and with some remote monitoring) was carried out for this purpose. [TP+01] Security


issues were no problem, but the biggest challenge remained the authentication and integrity of the students. The invigilation measures undertaken were the use of Microsoft NetMeeting, which allowed a two-way audio connection and additionally webcam use, as well as a remote desktop sharing facility that made it possible to check what each student sees on his screen or does with the mouse. For authentication, the students were interrogated twice at random intervals to verify their identity. After the exam, the majority of the participants rated this way of examination at least as good as a conventional written one, thus showing that the additional methods used did not really hinder them. [TPP+02]

But as said before, the major obstacle remains the need to ensure that cheating does not take place even with all these measures taken. One important argument is that even with personal monitoring, there can usually never be 100% certainty that the participants did not cheat. An alternative approach is to simply design the exam in such a way that access to materials is not significant (a so-called open-book exam). [TP+01] Still, this does not totally eliminate the main issue in remote exams: cheating by having someone else answer the questions. Another approach against cheating is to raise awareness of detection in the students' minds. [TP+01] This is best done in a discussion with them before the exam, where they are asked to negotiate a number of hurdles that could detect cheating at different levels of certainty, as in the examples given earlier. In this way the students learn which measures will be taken and that these give a reasonable chance of being caught cheating. The examination system thus becomes an "honour system". [TP+01]

3.1.2.3 Outlook on general problems of exams

The problems that exams in e-learning have to face can be traced back to the general problems of exams as a form of assessment. From a provocative point of view, the only real use of exams seems to be the verification of the identity of the participant, and therefore the detection of cheating. This could raise the question of the need for conventional exams at all. Their only merit could be seen in the fact that they are a relatively inexpensive way of assessing large numbers of students. Indeed, this form of assessment is more and more considered as inherently flawed [TP+01]. Wiggins (1990) found this to be due to the fact that exams by design rely on indirect or proxy "items", defined as "efficient, simplistic substitutes from which we think valid inferences can be made about the student's performance at those valued challenges." (Wiggins 1990, in [H02]) This also explains the lack of realism expressed towards


some question types of exams in the sentence cited before: "A doctor does not get five choices" [MAW03]. To overcome this inherent problem, some educational institutions are moving toward an "authentic" kind of assessment, similar to the already mentioned diagnostic approach, in which student performance is directly examined on worthy intellectual tasks [H02].

3.1.2.4 Further important points to consider

To conclude the discussion about the requirements on quizzes as real exams, an overview

should be given of some other new points of concern that also arise with the adaptation to an

electronic environment. These are not fixed by any regulations and are usually left up to the decision of the examiners, but they are important to consider before any implementation of quizzes

in e-learning.

A first point is the use of graphical entries as the answer to a question. This type was not listed among the question types explained earlier, probably because it is not easy to adapt to the electronic medium. Nevertheless, it could be seen as a possibility of "application of familiar

commonly used methods” (as prescribed in the [DPO]), like for example making sketches or

schemes. If the author of the exam still wants to include this instead of simply testing the knowledge through another supported question type, this requires that the quiz tool used supports such tasks, e.g. via a simple drawing tool. The concern that the use of this drawing tool is too time-consuming and stressful for the participants can be minimized in several ways [TP+01]: provide sufficient familiarization with the tool before the exam, facilitate the drawing task itself by requiring only simple sketches, like line-diagrams, as the answer format, and provide templates. One disadvantage of this answer type is that automatic correction for it is seldom implemented yet. [TPP03]

Another important field of discussion concerns new options in the proceeding of a quiz in an electronic environment. Problems arise here because some of these options facilitate certain requirements on exams but run counter to the goal of keeping the exams close to the traditional way.

Some possibilities, enabled by the electronic medium, for making real exams more independent of monitoring by reducing fraud opportunities are:


• to display and therefore treat the questions one by one rather than all on one page

• to restrict the time per question (in combination with the above option), so that the

question counts as not answered when the time is up before answering

• to shuffle the same questions and/or the possible answers per question (for choice questions), again in combination with the sequential display to keep the effect
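Per-participant shuffling of this kind can be realized deterministically, e.g. by seeding the random generator with the participant's identity, so that every participant receives exactly the same content in an individually fixed order. A small sketch with illustrative names, not taken from any particular tool:

```python
import random

def shuffled_exam(questions, participant_id):
    """Return the same exam content in a per-participant order.

    `questions` is a list of (text, options) pairs. Seeding with the
    participant id keeps equality of content intact: everyone gets
    identical questions and options, merely reordered, which makes
    copying from a neighbour's screen harder. The same participant
    always sees the same order, even after reconnecting.
    """
    rng = random.Random(participant_id)
    exam = []
    for text, options in questions:
        opts = list(options)
        rng.shuffle(opts)          # shuffle answers within the question
        exam.append((text, opts))
    rng.shuffle(exam)              # shuffle the question order itself
    return exam
```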

These options reduce fraud by making cheating more difficult. An additional idea is to give each participant different questions drawn at random from a question base, which could help make cheating in home exams more difficult [TP+01], but this is not possible due to the requirement of equality of content among the participants. Similarly, the shuffling of the questions could cause a legal problem if the changed order can be interpreted as a different resulting exam.

Still, all the above options are contrary to real-world exam conditions, where the time is only determined for the whole exam and participants can choose the order of answering and even have the possibility of revisiting already answered questions to change their answers. This is especially important as a study found a strong correlation between cross-checking and score in quiz results [P01]. Also, participants can then submit the complete exam at the end instead of doing so for each question.

General conclusion

But which implementation should then be preferred? The decision is generally left up to the examiner, and also depends somewhat on the use and goal of the exam. Both types described above have already been used successfully. All the different forms are therefore valid as long as the conditions of the exam stay the same for all participants, which also means that they may differ from those in the real world. Disadvantages could only arise if traditional and interactive forms are offered as options for the same exam, as the equality can then never be established because this is perhaps technically not possible to realize.


3.2 Quiz Standards

The goal of this chapter is to provide an overview of the standards existing for quizzes in e-

learning. In the first part, general information about e-learning standards will be provided,

followed by an analysis of the current situation with more detailed descriptions of existing quiz-specific standards. The second part consists of an evaluation of these quiz standards against some criteria that will also be defined there.

3.2.1 General information on standards

3.2.1.1 Definition

Before starting the discussion about standards, the question of what a standard is should be answered. The International Organization for Standardization (ISO) defines standards as:

"Documented agreements containing technical specifications or other precise criteria to be

used consistently as rules, guidelines, or definitions of characteristics, to ensure that materials,

products, processes and services fit their purpose".

They further state that their goal is to help “raising levels of quality, safety, reliability,

efficiency and interchangeability - as well as in providing such benefits at an economical cost”

[ISO]. These advantages also explain why it is useful to adopt standards and norms especially in expanding technologies like the Internet, where the crucial factors for further growth have to be recognized and used in different domains. [WG02] Standards are important for e-

learning as well, where they bring advantages to the whole e-learning market (users, content

developers and tool vendors).

The advantages regarding learning content in e-learning can be listed as follows [EL]:

• Accessibility (easy remote access and delivery, e.g. via a browser)

• Interoperability (platform-independent use)

• Adaptability (tailoring the learning process to specific needs, helped e.g. by the use of metadata)

• Re-usability (a component-based approach allows multiple and diverse reuse)

• Durability (robustness against technology changes)

• Affordability (reduced time and costs through reusability)


Standards in e-learning therefore guarantee these advantages by giving specific definitions of how to implement certain details, in both technical and methodological aspects. Being future-proof, they reduce the risk of further problems for all persons who create or use learning content or tools compliant with a standard. Still, it is important to say that these standards do not mean a unification of the products, as they only concern the way the information is coded numerically, structured and handled. [WG02]

3.2.1.2 Development of standards and specifications

Often the terms "standard" and "specification" are confused and used interchangeably, although there are some significant differences between them. These can best be distinguished by taking a look at the creation process of standards.

According to the definition of C. Simard [WG02], there are different phases to pass through in the development of standards, as can also be seen in Figure 2 below. It usually starts with the necessity of a sector to create norms, followed by rough descriptions of what they should contain. Some organizations then develop specifications, which evolve by testing and enhancing them step by step. These organizations are non-accredited committees, like the IETF, W3C or OMG. Once successful and stable, these de facto standards usually impose themselves on the industry as models to follow. At this point some organizations start to specialize in certifying the conformity of products to these standards, which thereby become legal specifications. The last phase, once the specifications have matured, is to reevaluate them and finally turn them into "real" standards. This standardization is a very complicated and time-consuming process that can take up to 10 years [EL]. It is done by legally accredited national or international standardization organizations or committees, such as the ISO, IEEE or CEN.


3.2.1.3 The role of specifications

Specifications are, as could be seen above, a step before standards and are therefore more

rough and experimental in their description and not 100% fixed as well [EL]. Still they are

already used until the long process of standard creation is done. Especially in the growing area

of e-learning, some kind of regulations are needed to hold onto, as explained earlier. The use

of specifications therefore solves the temporary problem and helps to ensure the above

described possibilities given through standards before these ones are mature.

[Figure 2: The standards evolution process (source: http://www.adlnet.org/index.cfm?fuseaction=scormpres)]

3.2.1.4 The use of XML in standards and specifications

Both standards and specifications have in common that they are usually defined in some kind of meta-language that describes information in a standardized structure. One often used is XML (Extensible Markup Language, www.w3.org/XML/), which is both flexible and powerful.

[WG02] Being an industry standard itself, it supports interoperability of content

(import/export from/to different systems), which is especially important for Internet-enabled

and distributed applications. Standards and specifications defined in XML therefore promote

the widest possible adoption [QTIa].

The legal structure and elements of XML documents of a certain type can be specified in a DTD (Document Type Definition) or an XML Schema (www.w3.org/XML/Schema). XML standards and specifications are therefore usually built this way, as they only concern the permissible structure and its possible content, not specific information.
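As an illustration of this approach, a hypothetical DTD for a minimal quiz document type (invented for this example, not taken from any real specification) might constrain the structure like this:

```xml
<!-- Hypothetical DTD: constrains structure, not concrete questions -->
<!ELEMENT quiz     (question+)>
<!ELEMENT question (text, option+)>
<!ELEMENT text     (#PCDATA)>
<!ELEMENT option   (#PCDATA)>
<!ATTLIST question id      ID           #REQUIRED>
<!ATTLIST option   correct (true|false) "false">
```

Any document valid against such a DTD can be exchanged between tools that agree on it, which is exactly the interoperability benefit described above.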

3.2.2 The actual situation concerning standards and specifications in e-learning

3.2.2.1 The acting groups

The three groups of actors in the development of standards are [WG02]:

1. specification developers

2. specification users (who apply the standards in their context)

3. standardization organizations/committees

The way these groups interact in the area of e-learning standards is described in the tables of

Appendix A. As a conclusion, it can be seen that all types of actors work in an interconnected way on and with the different e-learning standards. It is also worth noting that most of the e-learning product developing groups mentioned above implement the IMS standard (in collaboration with IMS), while IMS in turn adopts ideas from others (like LD and SCORM). The standards groups therefore seem to be converging [AICC]. The ADL Initiative even sets this as a goal for its SCORM specification [ADL], which is built on the work of many of the standards creators mentioned above.


3.2.2.2 Current quiz standards/specifications

In the context of the present paper, the main e-learning specification developers were analyzed to find out which ones had specifications regarding quizzes. The result (detailed in Table ## in the Appendix) was that only three organizations were found to address these, namely IMS, EML and DocBook. Their specifications differ in how specialized they are in quizzes, as the following short overview of the three organizations and their specifications will show. A more detailed look at the features they include will be provided in the next subchapter, through an evaluation of these specifications against some selected criteria.

IMS QTI

IMS Global Learning Consortium, Inc. (IMS) has developed a very detailed specification concerned solely with quizzes, called IMS QTI (Question and Test Interoperability). It describes in XML a general structure for quizzes which supports, apart from every imaginable kind of question type, the possibility to describe the rough design structure of each question and its answers, as well as to add (optional) information about scoring and feedback. The latter is recorded in results reports that correspond to either a question (in QTI: item) or a test (assessment) [QTIb].
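To make this concrete, the following sketch builds a single multiple-choice item modeled loosely on the QTI vocabulary (item, presentation, resprocessing). The exact element and attribute names here are simplified illustrations, not normative QTI markup.

```python
import xml.etree.ElementTree as ET

# Sketch of a single multiple-choice item, modeled loosely on the
# IMS QTI 1.x vocabulary (item / presentation / resprocessing);
# element and attribute names are simplified, not normative QTI.
item = ET.Element("item", ident="Q1", title="XML basics")
pres = ET.SubElement(item, "presentation")
ET.SubElement(pres, "mattext").text = "XML stands for?"
for ident, label in [("A", "eXtensible Markup Language"),
                     ("B", "eXtra Modern Language")]:
    ET.SubElement(pres, "response_label", ident=ident).text = label

# Optional scoring section, as QTI allows (here: 1 point for choice "A"):
resp = ET.SubElement(item, "resprocessing")
ET.SubElement(resp, "setvar", action="Set").text = "1"

print(ET.tostring(item, encoding="unicode"))
```

The separation between the presentation of the question and the response processing mirrors QTI's own split between content and scoring.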

EML

Developed by the Open University of the Netherlands, the Educational Modelling Language (EML) codifies units of study (e.g. courses, course components and study programmes) in an integral fashion. It describes not just the content of a unit of study (texts, tasks, tests, assignments) but also the roles, relations, interactions and activities of students and teachers. It was created to allow users to model a variety of pedagogies for education (including quizzes). The major EML implementation is in XML as well [EML].

EML has recently become Learning Design in cooperation with IMS, which approved it as the IMS LD specification [LD]. The original EML website no longer exists (probably for this reason).

DocBook

DocBook [DB] is a large DTD, available both in XML and SGML (another meta-language). It is currently maintained by the DocBook Technical Committee of OASIS.


Although general-purpose, it is especially well suited for books and papers in the area of computing. It is not specialized in quizzes, as its capabilities for handling quiz-specific information are limited to a Question&Answer set. Its further evaluation as a quiz specification was nevertheless undertaken (see the results in the following section), although strictly speaking it is not necessary. Being open source, it could be adapted to this purpose by adding the needed tags.
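For illustration, a DocBook Question&Answer set looks roughly as follows; the qandaset element names come from the DocBook DTD, while the content of the fragment is invented.

```python
import xml.etree.ElementTree as ET

# DocBook's only quiz-related construct is the Question&Answer set
# (qandaset). The element names below are from the DocBook DTD;
# the content is illustrative.
docbook_fragment = """
<qandaset>
  <qandaentry>
    <question><para>What does DTD stand for?</para></question>
    <answer><para>Document Type Definition</para></answer>
  </qandaentry>
</qandaset>
"""

root = ET.fromstring(docbook_fragment)
print(root.find("qandaentry/question/para").text)
```

Note that nothing in this structure records scoring, distractors or feedback, which is exactly why DocBook falls short as a quiz specification.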

3.2.2.3 Annotations regarding the future of quiz standards

To summarize, only two real specifications concerning quizzes exist in e-learning at this moment: IMS QTI and EML. Their developers are already cooperating, mainly through the adoption of IMS QTI by LD into their specification and the approval of EML as the IMS LD specification [LD]. All this seems to point towards convergence on a single, comprehensive quiz standard by IMS. This impression is reinforced by the fact that many other e-learning specification developers, like ALIC and ADL (see the tables in Appendix A), are planning to adopt IMS QTI in their specifications. What impact this will have on quizzes in the future (positive or negative) remains to be seen.

3.2.3 Evaluation of the quiz specifications

This second part consists of the two steps in the evaluation of the existing quiz specifications: first the criteria by which the specifications will be evaluated are explained and listed, followed by the actual evaluation and a summary of the significant results.

3.2.3.1 Evaluation Criteria for Quiz Standards (in XML)

The criteria listed below, which were used for the evaluation, should show the wide range of possible features that a specification for quizzes could contain (e.g. specified in its DTD). This does not mean that all of them are always required in all cases (for instance, for different types of assessment). To differentiate their importance, they are given values from 1 to 3, defined as follows:

(3) = These criteria are required for an absolute minimum useful quiz (in exam style). They


should therefore be included in the DTD, and their use required.

(2) = These criteria are good extra features, but not needed in the minimum useful quiz. Thus they should be in the DTD, but their use is optional.

(1) = These criteria are specific extra features that depend strongly on the use of the quiz and therefore do not need to be in every DTD. They can be added to the DTD later when needed.
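The weighting scheme above can be sketched as a simple tally; the criterion names and which of them are met are purely illustrative.

```python
# Hypothetical scoring sketch: tally how many criteria of each importance
# level (3 = required, 2 = optional, 1 = specific) a specification meets.
criteria = {
    "minimum question types": 3,
    "pictures": 3,
    "right answers stored": 3,
    "points per answer": 2,
    "shuffle questions": 1,
}

# Illustrative (made-up) evaluation result for some specification:
met = {"minimum question types", "pictures", "right answers stored",
       "points per answer"}

for level in (3, 2, 1):
    total = [c for c, w in criteria.items() if w == level]
    fulfilled = [c for c in total if c in met]
    print(f"level {level}: {len(fulfilled)}/{len(total)} criteria met")
```

A specification missing any level-3 criterion would fail the "absolute minimum useful quiz" bar, whatever its score at the other levels.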

The actual evaluation criteria shown below are divided into two groups (called “basics” and “extensions” in the following), depending on the use of the quiz. These will be explained at the appropriate point.

A. Basics

The following basic criteria have to be met when the quiz is used in a static environment (that is, the quiz is only printed or displayed on the web, but not taken interactively). The criteria are shown with their degree of importance (as defined above). In some cases this degree changes when the quiz is used in a dynamic environment, which is then explained.

1. Supporting different sorts of questions:

1.1 Minimum question types (most important and easy to realize) (3): true/false (T/F), multiple choice (MC), multiple response (MR), short answer incl. fill-in-the-blank (FIB, with or without a set of answers to choose from) and numerical entries, essay questions (in dynamic use: (2), because of the difficulty of automatic correction);

1.2 Other question types (1): like hot spot, drag&drop, order, connect the points, etc.

2. Including static media:

2.1 Pictures (3)

2.2 Others (1), except when needed in the area of use, like equations in maths (3) or code import (a function supported by XML) for computer science (3)

3. Saving/memorizing the right answer(s) per question (3)

4. Saving/memorizing the points possible per answer (for right and wrong ones) (2)

5. Option to shuffle the order of the questions in one quiz (1)

6. Option to shuffle the order of the possible answers for a question (1)

7. Possibility to have different "views" on the data:

7.1 Text parallel in multiple languages (1)


7.2 Views for different groups of users (2)

8. Metadata for questions and quizzes, for reusability and easier searching

8.1 ID (static (2), dynamic (3) to facilitate database operations)

8.2 Keywords (1)

9. Accompanying text for questions and the whole quiz, apart from the actual questions (like titles, explanations/instructions and question-specific tips/hints)

9.1 Specific to a question/quiz (2)

9.2 Repetitive, general information (static (2), dynamic (1) because it could be done by the presentation program)

10. (optional: allow different types of presentation / layouts of the questions, described as a

rough structure => rest/most done by presentation program) (1)
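Criteria A.5 and A.6 (shuffling questions and answers) can be sketched as follows; the data structure and its contents are purely illustrative.

```python
import random

# Sketch of criteria A.5/A.6: shuffling question order and, per question,
# the order of its possible answers (quiz structure is illustrative).
quiz = [
    {"text": "2 + 2 = ?", "answers": ["4", "3", "5"]},
    {"text": "Capital of France?", "answers": ["Paris", "Lyon"]},
]

rng = random.Random(42)          # fixed seed for a reproducible presentation
rng.shuffle(quiz)                # A.5: shuffle the questions
for q in quiz:
    rng.shuffle(q["answers"])    # A.6: shuffle the answers per question
    print(q["text"], q["answers"])
```

As discussed later under "modularity", the same shuffling could equally well live in the presentation layer instead of the quiz definition itself.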

B. Extensions

The following extension criteria should be included in addition to the basic ones when the quiz is used in a dynamic environment (that is, as an interactive quiz taken through a specific program):

1. Set a time interval for quiz taking, with date and time (either as start and end points, or start and length; default: anytime and unlimited; also useful for blocking feedback and results during that time) (3)

2. Fix a time limit for answering each question (default: unlimited) (2)

3. Feedback given per question, depending on the answer chosen (not for exam situations!). Some possibilities:

3.1 full right answer/solution (with explanations) (2)

3.2 optional: give hints after a wrong choice ((1); when used for self-evaluation (2))

4. Include dynamic media ((1), except when needed in the area of use, like sound for a music exam (2-3))

5. Maximal answering attempts per question (default = 1) ((1); when used for self-evaluation (2))

6. Are all necessary default values (for 1, 2 and 5) predefined/given? (2)
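Extension criterion B.1 can be sketched as a small check of the quiz-taking window; the function name, parameters and defaults below are assumptions for illustration only.

```python
from datetime import datetime, timedelta

# Sketch of extension criterion B.1: a quiz-taking window defined either
# by start/end or by start/length, defaulting to "anytime, unlimited".
def within_window(now, start=None, end=None, length=None):
    if start and length and end is None:
        end = start + length          # start + length form
    if start and now < start:
        return False                  # quiz not yet open
    if end and now > end:
        return False                  # quiz already closed
    return True                       # defaults: no bounds, always open

start = datetime(2003, 12, 1, 9, 0)
print(within_window(datetime(2003, 12, 1, 9, 30), start,
                    length=timedelta(hours=1)))  # True
print(within_window(datetime(2003, 12, 1, 11, 0), start,
                    length=timedelta(hours=1)))  # False
```

The same check can also serve to block feedback and results while the window is open, as the criterion suggests.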


3.2.3.2 Annotation about modularity concerning the criteria

Some of the criteria described above, which a quiz standard is said to have to meet (or should be required to include), could also be handled separately by the presentation or grading function/program, and in some cases this would even be preferable.

This modularity was chosen for some candidate criteria that were taken out of the list, where it was obviously the better approach. These were:

– in the Basics: the possibility to show the quiz with or without right answers (as one more option of the different views described in A.7) => to be done by the presentation part

– in the Extensions:

– as an addition to the feedback (B.3), the possibility to state whether a question has been answered right, wrong or not at all => better done by the presentation program

– the possibility to save information about the participant of a quiz (about the person, like the name, but also about the results (points) per question after the quiz has been taken) => this information does not belong in the Quiz DTD, as it does not concern the content of the quiz but the proceedings and results of a specific participant.

In other cases the decision is ambiguous and depends on the point of view (for example, on the system used for the quizzes). Examples from the criteria listed above:

– A.5+6: shuffle options could also be fixed later in the presentation layer

– A.8: IDs could be added/calculated automatically in a special quiz management/organizing layer

– A.9: repetitive text could likewise be handled automatically in another layer

– B.1+2: time limits could be set in another layer that is concerned with the details of the quiz taking

– B.5: max. attempts per question could be set in a quiz management/organizing layer

All these points have to be kept in mind when looking at the actual results of the evaluation of the quiz specifications by the above criteria, as presented in Appendix B.


3.2.3.3 The evaluation results

The table in Appendix B shows the results of evaluating the quiz specifications against the given criteria. The only relevant information for now are the results concerning IMS QTI and EML, as the two existing quiz specifications. The table also contains the results for DocBook, which are not further relevant, as explained before, as well as the results for Peter Sander's quiz DTD, which will be used in later parts of this paper.

Synthesis of the quiz standards evaluation

With respect to the given evaluation criteria, both quiz specifications (IMS QTI and EML) show similar results and meet almost all requirements. Both meet the required minimum criteria by having similar basic quiz-related tags. This seems rather logical, as the rough structure of a quiz cannot be expressed in very different ways and therefore comes out roughly the same. In addition, IMS QTI has many more extra functions/details, like pre-layout and extra question types that are also more sophisticated (e.g. hotspot, slider, sound and video use). The reason for this is simple: it was created specifically for quizzes, whereas EML is a more general specification that describes all different kinds of learning objects, with quizzes being only a small part, as described earlier. These results about IMS QTI further reinforce the impression that the use of an enhanced IMS QTI version as a future standard is very probable and near.

Still, the results show that most, but not all, criteria are met. Some of the missing ones could be implemented by another layer/program (as already described under the aspect of “modularity” above). But others, mostly those needed for specific purposes like code import or multi-language support, cannot be handled differently. In these cases an extension of a standard is a possible solution to the problem. An example of this is the above-mentioned quiz-specific DTD from Peter Sander (more information later).


3.3 Quiz tools

This chapter aims to give an overview of the state of the art in quiz tools, with a special focus on their creation part. The reasons for this focus are, on the one hand, to also cover this side of quizzes, as the preceding discussions were mostly oriented towards the presentation part (as in the chapter about “quizzes as real exams”). On the other hand, it will help build the basic understanding that will be necessary for the practical part of the project, which mainly concerns quiz creation.

In the first part of this chapter, the criteria against which the tools will be checked are established. In the second part, quizzing tools in general, and the evaluated tools in particular, are presented, followed by a synthesis of the evaluation results.

3.3.1 Evaluation criteria for quiz tools

As for the standards, the criteria here were chosen to show all the different possible features a quiz tool can and should have, with a focus on the creation part. They therefore reflect the requirements on quiz tools in e-learning regarding the following areas: functionality, interoperability, ergonomics, extensibility and the quality of the resulting quiz. The importance attached to them of course depends strongly on the type of assessment the quizzes are used for.

1. Functionality: The criteria in this field should help to find out whether it is possible to create quizzes with all the options the creator needs. It is important to note that the following criteria are already implicitly included when the tool uses a quiz standard that supports these options (see also the “Evaluation criteria for quiz specifications” described earlier). Of course this affects not only the criteria described here that are specific to quiz creation, but also the options for quiz presentation, as also described before.


The criteria used here are:

• different sorts of questions:

• minimum: all “choice” questions and questions with textual entries (short and essay)

• optional: other types (as described before regarding quiz specification criteria)

• including pictures and other media (e.g. sound for musical exam)

• saving/memorizing the right answer(s)

• optional: depending on the tool, there is either the possibility to manipulate the quiz layout during creation (pre-layout options for the questions, described as a rough structure; the rest is better done by the presentation program, as discussed earlier regarding “modularity”), or the layout is fixed/predefined/automatically created and cannot be controlled by the author of a quiz (see also export format and result quality)

2. Interoperability: Checking this is especially important considering that support for interoperability was rated as the second most popular advantage in a recent survey about computer-aided assessment (CAA) [W03]. This seems only logical when recalling the advantages of standards from the last chapter. The criteria checked in this field were:

• supported import formats

• export of different formats (e.g. XML)

• support for standards/norms (e.g. IMS QTI, SCORM, ...) (not required in all cases, but a very useful extra feature)

3. Ergonomics: This important area (the aspects “support” and “ease of use” shared 6th place in the above survey [W03]) covers the ease of both using the tool and learning to use it. The points to check here are therefore:

• easy to use (e.g. through self-explanatory names and icons, logic and guidance in the steps to follow, a clear, easy-to-survey environment, ...)

• easy to learn (e.g. by good documentation, examples, tutorials, ...)

4. Extensibility: The main question here is whether the complexity of the tool is adaptable to the needs of the user. In the survey cited above, flexibility was even perceived as the most important advantage of CAA, with scalability in the shared 6th place [W03]. But scalability both


includes extensibility, if more complexity is needed, and the possibility to reduce complexity. This other side, wanting less complexity, is usually given implicitly, but raises another aspect to take care of: is it still easy to make a simple quiz? The answer to this is covered under the topic "ergonomics" above. As extensibility is not always needed, usually only when a tool does not (yet) offer all possible features, the following criteria to check are optional depending on the case:

• additional options on functionality (e.g. question types, layout options, ..)

• additional functions on interoperability (e.g. import/export formats, ...)

5. Quality of the resulting quiz: This last field concerns the presentation part of the program more than the creation of the quiz. It also mostly overlaps with the other areas covered above, but focuses directly on the resulting quiz:

• ergonomics of the resulting quiz

• platform independence?

• formats (overlaps with the export formats, but also includes the option of the presentation part to have at least a computer-based and a printable version, depending strongly on the testing environment used)

Scope of the above criteria

As mentioned above, the focus here is on the quiz creation part. The criteria therefore do not consider the following tasks, which are (generally) not included in the creation of quizzes (or only as extra options; usually these tasks belong to the presentation layer, as discussed earlier concerning “modularity”):

• administrate questions or exams

• conduct/take exams (including timing, presentation of questions, etc.)

• evaluate (correct and grade) exams

• layout of the questions and also the whole quiz

The last point concerning layout is problematic, as it does not necessarily belong to the presentation layer (only when the presentation is fixed/predefined by the tool), but can also be included somewhere in the quiz creation process. Although this task is to be seen apart from the creation of the content of the quiz and its questions, it was partially included in


the formats at "quality of resulting quiz" and in the optional layout option at "functionality".

Nevertheless, it should not be forgotten that layout of course plays a much smaller role than the content of a quiz, which somewhat reduces the importance of this point.

3.3.2 Evaluation of the state of the art of quiz tools

3.3.2.1 Differences in quiz-creating tools

A wide range of different quiz-generating tools exists today. Before looking at some specific examples, the main points describing their differences are listed here:

• stand-alone programs vs. tools based on an e-learning/course-management system or on the web

• costs: free vs. commercial tools

• simple vs. complex creation tools (i.e. whether they offer minimum features or many extra ones)

• function: basic (creation only) vs. additional (also presentation, administration etc. of the quizzes)

• offline application vs. online tool (concerning both the creation and the testing environment [EL]; of course both could also be combined, e.g. downloading the test, working on it offline, and sending the results back over the Internet [TP+01])

• general creation tools (e.g. for XML, with quizzes being only one application) vs. tools specialized in quizzes

• area of quiz: specialized in a subject (like maths, languages, informatics, geography, ...) or general use

• purpose of created quiz: real exam vs. self-evaluation

• target group (all, schools, universities, companies)

3.3.2.2 The choice of tools to evaluate

To show the state of the art, and also to provide some ideas for the implementation in the practical part of this project, some tools that offer quiz generation were tested. This includes a differentiation by the characteristics above, and an evaluation against the criteria described before (see Appendix C for more information, including details on the chosen tools).


The tools checked were:

1. QuestionMark's Perception

2. Respondus 2.0

3. HotPotatoes

4. Canvas Learning Author

5. Auto QCM

6. Gen Doc

Some further tools were examined more superficially for additional information, but were not evaluated in this project:

1. Mind on site

2. Ready Go

3. Tactic

4. Project MALTED (Malted2 is the EU's application of HotPotatoes)

5. Test IT (by www.ariadne-eu.org)

6. QUIZSTAR (quiz.4teachers.org/index.php3)

7. QUIA (www.quia.com)

8. QUIZ CENTER ( from discoveryschool.com)

9. Macromedia Flash MX learning Interactions (included in the Macromedia e-learning suite)

10. MS Class Server (includes a quiz tool)

3.3.2.3 Reasons for the choice

The choice of the tools that were actually evaluated was made somewhat at random, deliberately taking different types of quiz tools (with regard to the characteristics described above) to get a better overview of the state of the art. It therefore includes, among others, both a market leader like Question Mark [TP+01] and the generic tool GenDoc that will be needed in the practical part. The exclusion of Mind on site, Ready Go, Tactic and Project MALTED (all four of which are e-learning systems containing a quiz function) is explained by their earlier evaluation in a student project of the MIAGE at the University of Nice, France [MIA]. Since those results concern tools other than the ones chosen here, they could also be used to further extend the overall results.


3.3.2.4 Synthesis of the evaluation results

The results of the evaluation are presented here as a synthesis condensing the findings gained by comparing the evaluations of the different quiz tools, which can be found in the tables of Appendix C, together with some results from the MIAGE student project mentioned before (except in the area of “functionality” concerning quizzes) about the four quiz tools evaluated there. The following conclusions, grouped by general characteristics as well as by the areas described in the criteria, could be drawn:

General characteristics

• The range goes from simple home-brewed quiz tools that could be used for simple exams

in schools, to sophisticated ones for universities, often combined with a learning platform.

This reflects the very divergent need for quizzes in different areas of use.

• The tools can mostly be assigned to two major clusters:

1. free tools, often simple, standalone and mostly web-based, mainly used for exercises in the education area

2. commercial tools, usually more complex (more extra features), which in the education area are often used in connection with an e-learning system or CMS (course management system).

This second group also reflects the findings on the dominant use of commercial CAA systems in higher education (namely Perception, Blackboard and WebCT) in the CAA survey already cited [W03].

• Most tools include the creation as well as the presentation of the quizzes (or, when working in cooperation with a CMS, the presentation is done through the latter).

• Only a few tools offer/include the possibility to use a quiz as a real exam (which of course depends more on the presentation part used).

• Educational institutions often pay less or nothing when using commercial tools. In fact, this often serves the purpose of tying an educational institution to the specific CMS on which the tool is based. It also shows that, although the broad use of e-learning quizzes is still in its beginnings (described as still the “province of 'early adopters'” in a recent survey on CAA [W03]), there is already strong competition in this growing, and therefore interesting, market. Its expansion also seems to be positively influenced by the wide adoption of Virtual Learning Environments (VLE, another term for e-learning systems). Concerning these, the same survey also found that actual use is split between


commercial and open/free systems [W03]. The main disadvantage of commercial systems is still seen in their cost, which logically leads back to the strategy of reducing prices in the educational market to compete with free systems, hence the above finding of the evaluation.

Functionality

All tools (except GenDoc, which cannot be evaluated in this area as it is a generic tool) offer the basic (minimum) question types and the possibility to memorize the right points per answer. Regarding the other criteria in this area, the tools show different results. For these, they can be divided into three main groups that reflect the tools' different attitudes and goals towards quiz creation and use, as shown in the table below.

Tool group:        “Basic”              “Serious”               “Fancy”
                   (simple, free,       (simple, commercial,    (complex, both commercial
                   used in education)   mostly in education)    and free, all uses)

Example tool:      Auto QCM             Respondus               HotPotatoes (+ most others)

additional
question types:    no                   no                      yes (a lot)

media inclusion:   no                   yes                     yes

layout during
creation:          fixed                fixed                   semi-free or predefined
                                                                (rest by publisher tool)

Table 1: Possible clusters for quiz tools regarding functionality criteria

Interoperability

The import formats used by the tools are quite mixed. Ordered by frequency of use (multiple formats per tool are possible), they are:

• XML (supported by most tools)

• text formats like .txt and .rtf (by some tools)

• tool-specific formats (their exclusive use reduces interoperability to zero)

• none (in some “basic” tools as described above)


The export formats are often connected to the import format used by a tool, so usually the same formats are used here as well, again with multiple formats per tool possible. Additional formats are often HTML or other kinds of export to web or print formats (like PDF).

Quiz specifications are rarely supported for the moment; in simple free tools they are never used. But a fair number of commercial quiz tools already support IMS QTI, or plan to do so in the future, either directly or through the CMS used.
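Such an export step can be sketched as a minimal XML-to-HTML conversion; the element names of the source quiz format below are hypothetical.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of an XML -> HTML export step, as many tools provide
# (the source format's element names are hypothetical).
quiz_xml = """<quiz><question><text>T or F: XML is a meta-language.</text>
<answer correct="yes">T</answer><answer>F</answer></question></quiz>"""

root = ET.fromstring(quiz_xml)
html = ["<ol>"]
for q in root.findall("question"):
    html.append(f"  <li>{q.find('text').text}")
    html.append("    <ul>" +
                "".join(f"<li>{a.text}</li>" for a in q.findall("answer")) +
                "</ul>")
    html.append("  </li>")
html.append("</ol>")
print("\n".join(html))
```

A print export (e.g. to PDF) would follow the same pattern, only with a different target vocabulary; the quiz content itself stays untouched.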

Ergonomics

The overall ergonomics, an important point for the choice of a tool and the satisfaction of the user, is good in all tested tools. This probably reflects the fact that ergonomics has already become a basic requirement in software development.

Some main conclusions about ergonomics:

• All tools guide the user well, through fill-in forms, wizards and/or a WYSIWYG view.

• The explanations and additional assisting material are usually proportional to the complexity of the tool.

Extensibility

Only a few tools offer the possibility to add or even create extensions. Commercial tools especially are usually not extensible, which is probably intended by the vendor, but often also not really necessary, because these tools already contain a large number of extra options.

Quality of resulting quiz

This generally depends on the export/publish and presentation function or platform used, and rarely on the editor. The range here goes from sober text quizzes to fancy multimedia-supporting ones, and is usually acceptable.

Overall result on quiz tool evaluation

Although some generalizations about different aspects were found helpful for clustering the market, the actual quiz tools come in a multitude of varieties. This results both from the different purposes of use that exist and are demanded, and from the different goals


of the quiz tool creators. The decision for a particular tool therefore depends strongly on the basic conditions of the specific case.


4. Practical case: a Quiz Server extension

After providing a broad overview of the necessary background and the current situation concerning quizzes in the e-learning environment in the more theoretical first half of this paper (chapters 2 and 3), the second half (chapters 4 and 5) will, building on this, take a closer look at a practical implementation of a quiz system undertaken within the scope of the present project. This chapter proceeds as follows:

First the choices made for the project are explained, starting with an analysis of the initial situation, the goals set and the resulting needs. This leads to the reasons for the actual choices and to the setting of some requirements on the implementation. These are then described in a design phase, done separately for each of the main components, followed by an implementation phase that focuses on its important details. The chapter closes with a use case that includes all the developed components.

An evaluation of the developed concept regarding the advantages and disadvantages of the

solution (by components), including an outlook on how the disadvantages could be solved in

the future, situated in the following chapter 5, will conclude the practical part.

4.1 Definition of the choices taken for the project

4.1.1 Initial situation

One of the near-future plans at the ESSI (Ecole Superieure en Sciences Informatiques, www.essi.fr/) is to have interactive, computer-based exams, usable both as real exams for different courses and for self-testing by the students. As a first approach to this goal, a quiz server named eXam was created earlier this year by two German exchange students during their project at the ESSI. It is now maintained and tested in the framework of the MIAGE project (University of Nice, www.unice.fr/) by Prof. Michel Buffa and is ready for use at the ESSI, where some first trials have already been undertaken since the beginning of the fall semester 2003. (For further information about eXam, see [EX].)

Actual features of eXam

eXam is an all-in-one system by itself. It can be used to create quizzes (via a web interface and easy-to-use forms), which are saved in the connected database, to administer all


quizzes in the database as well as all assigned users, and to carry out exams. For now, however, it offers no way to import and export quizzes. This leads to the project described in this paper.

Needs

A way to import into and export from eXam is needed. The quizzes outside of eXam would best be saved in an XML structure that also supports a quiz specification, to make them more easily reusable and portable. This opens up to eXam and its quiz-creating users a larger horizon of already existing quiz content.

Along with a tool for the import and export, another one to edit quizzes in the given structure with a WYSIWYG-like supportive interface seems useful. If possible, both should even be combined in one tool.

4.1.2 The taken choices and their reasons

At this point, at the beginning of the project, two main decisions had to be taken:

1. Which quiz specification or (more generally) which structure of the XML file (either fixed in a DTD or in an XML Schema) should be used as the basis for all handled quiz files in XML?

2. Which tool(s) should be used, adapted or newly created to realize the import/export and the supportive quiz editing in XML?

4.1.2.1 The choice of a standard

The requirements towards quizzes at the ESSI have to be reflected in the structure of all XML files and therefore defined and fixed in a corresponding DTD or XML Schema. In addition to the criteria marked as highly important in the quiz standard evaluations, they consist of the following main points:

1. The possibility to import pictures and code parts from external files

2. Internationalization of the quiz, by editing a quiz in one or more languages at a time (while recording in which language each text part is written)

There are also two requirements on the structure itself: it should be compact (not containing too many tags) and simple to understand and use (reducing work for users, such as typing repeated words, possibly in different languages). This last aspect amounts to the omission


of those words that can be automatically included later, during the publication/presentation of the quiz on the quiz server.

The choice taken here was to create a small DTD based on the concepts of the already existing Quiz DTD by Prof. Peter Sander mentioned before. This is justified by the fact that it is an extension of the EML specification that includes most of the wanted quiz-specific extra features that EML did not have. (See also the column concerning the evaluation of this DTD in the table of Appendix B.) It is therefore already very close to the finally wanted DTD and thus a good starting base, where only some further enhancements had to be made. One reason why this existing DTD was not used as-is is that it is too large: being an extension of the EML specification, only a very small part of it is actually used for structuring quizzes. The new DTD therefore takes just this small subset as its starting base. Another reason is that not all of the features additional to EML are fully implemented in the way or to the extent needed here. These and other important details of the new DTD will be explained further in the following chapter.

The last reason why this choice was taken instead of adapting a specification is that, as said before, only a small subset of all the possible tags given in the standards is needed for quizzes at the ESSI, so it does not matter whether EML or IMS QTI is chosen as a base. As the resulting DTD will contain a very basic quiz structure that is also present in all quiz specifications, it remains possible to later map the resulting XML quizzes to a quiz specification. Mapping could become useful in the future to allow wider interoperability, especially to IMS QTI, as it seems to be developing towards a standard.

4.1.2.2 The choice of a quiz-generating tool

The solution chosen for this project is based on the previously mentioned GenDoc, a generic XML editor written in Java that can be extended by plugins, both to adapt it to a DTD and to add the extra features wanted here (described above, such as the code import function and the export to different formats and to the eXam quiz server). Additional reasons were that GenDoc is free software (under the GPL license) and comes from a partner research team at the University of Grenoble. More details about the design and implementation of the additional plugins that provide the wanted functions will be given in the two chapters after the next one.


4.2 The Quiz DTD

4.2.1 Design

The reason why a DTD had to be used in the first place is that GenDoc does not, for now, support XML Schema, which is much more complex and better suited to describing more complicated contexts in greater detail. Still, the idea of using a Schema was taken up again in the component concerning the import/export function of the eXam quiz server (see chapter 4.4 for more details).

The requirements on the needed DTD consist of the most relevant specification evaluation

criteria (all those having an importance of 3, and most of those with 2) and the extra features

wanted at the ESSI as described above. For the actual project they can be listed as follows:

• minimum question types (T/F, MC, MR, FIB, short answer, essay)

• include pictures

• import code parts from external file

• specify the right answer(s) of question

• save possible points per answer (in the form of weight points, which, being independent of the scoring system, are more advantageous for further reusability of the quiz [OB03])

• text parallel in multiple languages (internationalization)

• some meta-data, including ID for both quiz and questions

• accompanying text for quiz and questions: specific repetitive words in questions or the whole quiz, as described above

• time interval for quiz taking (date and time)

• solution of question (with explanations)

• hints (for self-evaluation use)

• ensure necessary default values

The above requirements are specifically chosen for the DTD of this project. They therefore do not include some of the criteria that also have an importance of 2, because these tasks are either handled by the quiz server (e.g. time limit per question), not needed for now, or not yet supported by the quiz server (e.g. inclusion of dynamic media).


4.2.2 Implementation

After setting the requirements for the DTD, their actual implementation is now presented. The explanations focus on some important details; the complete DTD (and also the other developed components) can be found on the CD provided at the end of the paper.

Question types

The question types mentioned as "minimum requirement" obviously belong to the two types "choice" and "text entries" (see the chapter "Question types" for details). This can be used to simplify the DTD (compared to EML) and to make it clearer, by deriving the questions from these two basic types, called "multiple-choice-question" and "text-question" in the DTD.

<!ELEMENT quiz-item (metadata?, (multiple-choice-question | text-question))>

In keeping with the idea of simplification, the different subtypes of choice questions (T/F, MCQ and MRQ) are not defined explicitly in the DTD, but are differentiated by their number of given possible answers. It is later the task of the publishing function for quizzes constructed with this DTD to interpret which type is meant. The differentiation goes as follows: if a question has only one correct answer it is of type MCQ; if it has more, it is logically an MRQ. The remaining difficulty is to distinguish between an MCQ with two answers and a T/F question, especially without enlarging the DTD with further elements or attributes. The idea used here is that a T/F question needs only one answer, which will be treated as the "True" part. The reason is that, since the correctness of the "False" part is always the opposite of the one chosen for "True", this element is not explicitly needed. The content of the "True" answer should be left empty in the XML file and will be filled in later by the publishing function (the implementation used here will be described later). This has some advantages:

• the content is the same for all T/F questions, consisting of one element of the word pair "true/false" or similar. Filling it in at publishing time spares the quiz author from typing these words over and over again (they thus fall into the category of repetitive words, explained before, that are better included automatically).

• it also becomes much easier to exchange the word pair for a similar one, which is especially useful for the internationalization of the quiz, also required here, and for publishing it in different languages (see the implementation later).
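The differentiation rules above can be made concrete with a small sketch. This is illustrative Python, not code from eXam or GenDoc; the element and attribute names follow the Quiz DTD, while the function name is ours.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch (not the actual eXam/GenDoc code): derive the
# concrete choice-question subtype from the number of <answer> elements
# and their "correct" attributes, following the rules described above.
def choice_question_type(question_xml):
    answers = ET.fromstring(question_xml).findall("answer")
    if len(answers) == 1:
        # A single answer is treated as the "True" part of a T/F question;
        # the implied "False" part is filled in at publishing time.
        return "T/F"
    correct = [a for a in answers if a.get("correct") == "true"]
    return "MCQ" if len(correct) == 1 else "MRQ"

print(choice_question_type(
    '<multiple-choice-question><answer correct="true"/></multiple-choice-question>'
))  # T/F
```

A question with several answers, exactly one of them marked correct, would yield "MCQ"; several correct answers yield "MRQ", exactly as the publishing function is expected to interpret the DTD.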


For text questions, which here can also be used for numeric entries, it is important to state that an explicit differentiation is made only between FIB questions and all others, based on the structure of the question: either the answer is to be included within a text (thus FIB), or it is to be entered separately from the question statement (as for all non-FIB question types, such as numeric, short answer or essay (see also chapter 2.2.2.2), which in this DTD are not further differentiated and are handled in the same way). In the latter case the solution is straightforward, as it consists of a question and one or more answer elements (see more details in the following point). In the former case, it consists of a so-called "blanks-text" (along with some optional elements), which is simply a text passage that contains the left-out words in "blank" element(s) included at the right place in the text.

<!ELEMENT blanks-text (#PCDATA | blank)* >
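As an illustration, the following sketch (illustrative Python, not part of the project code) shows a small blanks-text instance and how a presentation step could turn it into a gapped text; rendering the gaps as underscores is our assumption, since the actual presentation is left to the publishing function.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: render a <blanks-text> as a gap-fill question by
# replacing each <blank> element (which holds a left-out word) with
# underscores, collecting the blanks' contents as the expected answers.
def render_blanks_text(xml_fragment):
    root = ET.fromstring(xml_fragment)
    parts, solutions = [root.text or ""], []
    for blank in root:
        solutions.append(blank.text or "")
        parts.append("_____")
        parts.append(blank.tail or "")
    return "".join(parts), solutions

text, solutions = render_blanks_text(
    "<blanks-text lang='en'>XML stands for e<blank>X</blank>tensible "
    "<blank>Markup</blank> Language.</blanks-text>")
print(text)       # XML stands for e_____tensible _____ Language.
print(solutions)  # ['X', 'Markup']
```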

Right answers

The task of specifying the right answer(s) of a question is logically different for the two main question groups, as the "choice questions" already include one or more right answers among the proposed possible answers. This results in two separate elements in this DTD, used for different purposes:

• The element "answer" describes the possible answers of choice questions. It carries the attribute "correct", which serves to mark whether a possible answer is right or not. Note also that an answer can contain content other than text (e.g. pictures; see the DTD for more details).

<!ELEMENT answer (%content;)* >
<!ATTLIST answer correct (true | false) #REQUIRED >

• The second element, "instructors-answer", serves two functions. First, it is simply used to hold the answer of text questions. An exception is of course the FIB question, where the right answers are specified in the "blank" elements as described before. Although the non-FIB questions are not explicitly differentiated in the DTD, a stricter format of the instructors-answer could be required to represent these different question types in the later use of the data (as in the components described in the following chapters). As a second function, further reflecting the concept of showing what "one" right answer is (being exemplary, as described for essay questions in the chapter on "question types"), it can also be used for any additional answer that contains more detailed information, in both text and


choice questions. This also explains why it, like "answer", can contain various kinds of content as well.

Hints

Hints (for self-evaluation use) are implemented here by an element of the same name, "hint". Like answer and instructors-answer, it can contain all kinds of content, but serves the purpose of giving tips for solving a certain question.

Picture and code import

The inclusion of pictures is realized (similarly to HTML) by a special "picture" element that has an attribute "href" containing the path to the picture's source. The physical inclusion is only done later, at presentation time.

<!ELEMENT picture EMPTY >
<!ATTLIST picture href CDATA #REQUIRED >

The import of code parts from external files is conceptually identical to the above procedure for pictures, although the actual "code-elem" element also has another function compared to "picture": it serves to distinguish typed-in code from regular text, which is saved in "paragraph" elements. The attribute "type" further helps to specify differences between the included code parts, where the default "block" is used for a stand-alone block of code, and "code"/"keyword"/"file" optionally specify the layout of code parts included in the text.

<!ELEMENT code-elem (#PCDATA) >
<!ATTLIST code-elem type (code | keyword | file | block) "block" href CDATA #IMPLIED >

This way of importing code differs from the one used in Peter Sander's DTD. There, it was not specifically defined in the DTD itself; instead, the source paths of the code parts were included manually in the Document Type Declaration at the beginning of the XML file (see also "Publishing plugin" later).
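The intended consumer-side behaviour of "code-elem" can be sketched as follows (illustrative Python, not the actual GenDoc or eXam code): if "href" is set, the code is read from the external file; otherwise the typed-in content is used.

```python
import os
import tempfile
import xml.etree.ElementTree as ET

# Illustrative sketch of how a consumer of the Quiz DTD could resolve a
# <code-elem>: code is read from the external file named by "href" when
# that attribute is present, and from the inline PCDATA otherwise.
def resolve_code_elem(elem):
    href = elem.get("href")
    if href:
        with open(href, encoding="utf-8") as f:
            return f.read()
    return elem.text or ""

# Inline, typed-in code:
inline = ET.fromstring('<code-elem type="keyword">printf</code-elem>')
print(resolve_code_elem(inline))  # printf

# Imported code (a temporary file stands in for the external source):
with tempfile.NamedTemporaryFile("w", suffix=".c", delete=False) as f:
    f.write("int main(void) { return 0; }")
    path = f.name
imported = ET.fromstring(f'<code-elem type="block" href="{path}"/>')
print(resolve_code_elem(imported))  # int main(void) { return 0; }
os.remove(path)
```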

Weight points

Saving possible points per answer is done in the attribute "points" of the "quiz-item" element, which stands for what is usually understood as a "question" in a quiz (containing the question statement and the possible answer(s)). These points represent the weight of


the question compared to the others, not the grade points per question. In terms of grade points, this corresponds to the maximum points possible for this question in relation to the maximum points of the quiz. Using weight points was found to be better than including grade points, as it significantly simplifies reusability [OB03].
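The relation between weight points and grade points can be illustrated by a small sketch (our own example; the actual scoring is done by the quiz server): grade points only come into existence once a concrete maximum score for the quiz is chosen.

```python
# Illustrative sketch: weight points stored per quiz-item are converted
# to grade points only when a concrete scoring scale is chosen, which
# keeps the stored quiz independent of any particular grading system.
def grade_points(weights, max_quiz_points):
    total = sum(weights)
    return [w / total * max_quiz_points for w in weights]

# Three questions with weights 1, 2 and 1, graded on a 20-point scale:
print(grade_points([1, 2, 1], 20))  # [5.0, 10.0, 5.0]
```

The same quiz could be reused on a 100-point scale without touching the stored XML, which is exactly the reusability argument made above.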

Metadata and IDs

In this DTD, IDs are not included, because these are handled internally by the quiz server. Still, it includes other, optional metadata available for both the quiz and its questions. For now these are: author, date of creation, an optional short description, keyword(s), user group(s), and an optional difficulty level.

Automatic inclusion of words

Specific fixed or repetitive words in questions or the whole quiz, as used here, are for example some short instructions (like "Read these instructions carefully" before a list of instructions, or "A possible solution may be:" before the instructors-answer described above), but also simple words like "Question" or "True" and "False" as mentioned before. In the actual solution, taken from Peter Sander's DTD, they are not included in the DTD but must be defined in an extra XML file, so that they can be read out during publication via XSL transformation (see the "publication plugin" of GenDoc in the following chapter) or exchanged with the quiz server via the import/export function (see the corresponding chapter that follows after the next one).

Internationalization

Concerning parallel text parts in multiple languages, a distinction has to be made regarding the type of text:

• For regular text parts, which in the DTD are contained in "paragraph" elements, and also for the "blanks-text" described above, this is done through an attribute of these elements:

<!ATTLIST paragraph lang CDATA "fr">
<!ATTLIST blanks-text lang CDATA "fr">

• For automatically included words, an extra XML file per supported language has to be

provided that contains (in the corresponding tags) the words to use in the chosen language.
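A presentation step could select the text parts for a requested language roughly as sketched below. This is illustrative Python; in particular, the structure of the per-language words file is our assumption, as the report does not show it.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: pick the paragraph matching the requested language
# (the DTD's default language is "fr"), and look up an automatically
# included word from a per-language words file. Element names in the
# words file are our assumption; the report only states that one such
# XML file exists per supported language.
def paragraph_for(question, lang):
    for p in question.findall("paragraph"):
        if p.get("lang", "fr") == lang:
            return p.text or ""
    raise LookupError(f"no paragraph for language {lang!r}")

question = ET.fromstring(
    "<question>"
    "<paragraph lang='fr'>Quelle est la sortie ?</paragraph>"
    "<paragraph lang='en'>What is the output?</paragraph>"
    "</question>")
print(paragraph_for(question, "en"))  # What is the output?

# Hypothetical words file for English, holding automatically included words:
words_en = ET.fromstring("<words><true>True</true><false>False</false></words>")
print(words_en.findtext("true"))  # True
```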


Time

The time interval for quiz taking is simply implemented by the two elements “date” and

“time” that are both contained in the quiz header.

Default

The necessary default values are ensured by the mechanisms of the XML language; in the case of attributes, by explicitly fixed default values (as for quiz-item points, code-elem type, list type, or paragraph language).

Additional features

In addition to implementing the requirements listed above, some more tags were added to the DTD because of the use of GenDoc as the editor. Some are just a useful addition seen in GenDoc examples, like list prefixes/types as an attribute of the "list" element.

<!ATTLIST list type (none | bullet | arabic | upperAlpha | lowerAlpha | upperRoman | lowerRoman) "bullet">

Others are specifically needed for use with GenDoc, like the HTML-like styledString elements that support styling text parts and are used e.g. in "paragraph".

<!ENTITY % styledString "#PCDATA | strong | emphasis | underline | sup | sub | a | code |path">
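As an illustration of what the publication step does with such styled strings, the following Python sketch maps them to HTML tags. The real GenDoc publishing plugin does this via XSLT stylesheets; the mapping shown here is only a simplified stand-in of our own.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of what the XSLT-based publication does for styled
# strings: map the DTD's styling elements to HTML tags. The tag mapping
# below is our own simplified stand-in, not the project's stylesheet.
HTML_TAG = {"strong": "b", "emphasis": "i", "underline": "u",
            "sup": "sup", "sub": "sub", "code": "code"}

def to_html(elem):
    out = [elem.text or ""]
    for child in elem:
        tag = HTML_TAG.get(child.tag, "span")
        out.append(f"<{tag}>{to_html(child)}</{tag}>")
        out.append(child.tail or "")
    return "".join(out)

p = ET.fromstring(
    "<paragraph lang='en'>Mark <strong>all</strong> correct answers.</paragraph>")
print(to_html(p))  # Mark <b>all</b> correct answers.
```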

But all of these additional tags are optional, so the DTD can also be used without GenDoc.


4.3 The GenDoc plugins

4.3.1 Overview of GenDoc and its plugins

GenDoc is (as already mentioned before) an XML editor written in Java 2; it was developed in the ARCADE team of the CLIPS research laboratory of the Université Joseph Fourier in Grenoble. It was based on an existing project (MerlotXML), from which it kept its general structure.

GenDoc's graphical user interface consists of three views (as can be seen in the screenshot below, Figure 3):

• a tree view (on the left) that reflects the structure of the actual elements in the XML document,

• an attribute view (on the lower part of the right side) showing the attributes of the current element, and

• a "styled view" (above the attribute view) that represents the elements of the document and their contents with a more "visual aspect" (http://gendiapo.sourceforge.net/).

The styled view is also the biggest improvement over MerlotXML and will be described further in the following part.

GenDoc simplifies the editing of XML files for the user in two ways: by presenting the structure and its data in the different views described above, and by using simple context menus to build the structure. Further information can be found in the GenDoc User Manual (as specified in the "Use Instructions" in Appendix E).


Figure 3 : The 3 views in the GenDoc GUI


Editing XML documents in GenDoc requires that a valid DTD be provided. In general, the styled view shows a basic layout, which can be customized by implementing an "editing plugin" for a specific DTD (therefore sometimes also called a "DTD plugin"). Some plugins concerning pedagogic objects are already provided with the available GenDoc version, as GenDoc was primarily designed to support the creation of such objects by teaching personnel. The implementation of further plugins is possible for all interested developers, as GenDoc is open source under the General Public License (GPL), with the source code provided on its website (see [GEN]). The creation of a plugin for quizzes at the ESSI will be explained further in the following chapter.

Apart from its main function, GenDoc also offers the possibility to publish XML documents in HTML or PDF format. For this, a "publishing plugin" (alternatively called an "action plugin") has to be implemented for each specific DTD. The publication is done by transforming the XML document via the XSLT/XSL-FO technology (www.w3.org/Style/XSL/). Again, some plugins (corresponding to the provided editing plugins) are included in the actual GenDoc version. More details on the transformation, along with the actual implementation for the Quiz1.DTD developed for the ESSI, will be provided in chapter 4.3.2.2.

4.3.2 Editing plugin

4.3.2.1 Design

The goal of the GenDoc editing plugin required in this project is to adapt to the created Quiz DTD with all the required features defined in its elements (see the previous chapter for details), while further simplifying the editing task for the user. In the following, the necessary steps towards fulfilling this are explained.

The first and most general step is to adapt the plugin to the given Quiz DTD. To simplify this task, a model plugin is provided on the GenDoc website [GEN]. The advantage is that the required structure is already in place; the remaining task is then to adapt this model further to the DTD. The general changes to make are described in the documentation that is also available there [GEN], whereas more specific features have to be developed from scratch.


The second step therefore concerns customization of the editor, especially of the styled view. The enhancements of this already visual editor should further help the user by showing a view of the document closer to what it could look like later. The best-known way to do this are so-called "What You See Is What You Get" (WYSIWYG) editors. In the case of XML documents, this kind of editor is impossible to implement, because the resulting document is first of all still an XML structure whose purpose is to structure data, not to present it. As it does not itself contain any kind of layout definition, the final look of the same document can differ depending on the form chosen for publication. In GenDoc, this is specified in the publishing plugin (as will be described later in chapter 4.3.3). Still, some adaptations of the editing view, concerning more general aspects of how the document could look, are possible and can thereby help the user's imagination. Regarding the Quiz DTD, this includes the following points:

1. Include and display external information, like:

   1. pictures

   2. code parts taken from external files

2. Enhance the appearance of text parts so that they look more like the resulting text. This can be done through:

   1. adding styling possibilities for text (like bold, underline, ...)

   2. hiding some labels/markers for elements with self-explanatory content fields (e.g. for the "paragraph" element, but also for the above styling elements)

   3. displaying chosen list-item prefixes

   4. displaying a blanks-text in a more self-explanatory way

Some of these options already exist in the plugins provided with GenDoc, and will therefore be included in the plugin and adapted to the DTD. These are:

1. the image import function for the "picture" element

2. the "styledString" elements described earlier (see chapter 4.2), which are used as in a text editor by marking a text part and clicking the corresponding button in the menu bar, whereupon the text part changes; tips on how to adapt these are also included in the adaptation instructions mentioned before

3. list prefixes displayed before each "list-item" element

4. hiding the paragraph marker


The remaining features (the code import and the representation of the blanks-text) will nevertheless be created. The implementation of all the features described here is explained in the following section.

4.3.2.2 Implementation: explaining important details

The changes to the "model" plugin mainly consist in adapting the structure and the properties to the elements of the DTD used. The most important file here is "plugin.xml". As the main file of the editing plugin, it contains the necessary information about the chosen DTD and paths to default classes and required files, which have to be adapted. More importantly, it offers the possibility to specify layout options for objects of the DTD (elements and attributes) that should differ from the default view.

The three most relevant options should be mentioned here to simplify further explanations. They consist of:

• specifying that an attribute has a panel other than the default attribute panel (e.g. one with a browse function, as for the picture attribute "href"), implemented by a class other than the default one

• hiding styling elements in the styled view (here: all styledString elements)

• declaring "actions", which are styling elements that are inserted by clicking a corresponding button in the menu bar (done for the styledStrings)

More details regarding the required changes and the possible options can be found in the previously mentioned adaptation documentation for this plugin [GEN].

One more thing should be explained here, as it is not exactly covered in the documentation but is important for understanding the following part: the View classes used to display the elements in the styled view. All elements that use a view other than the default one have to be assigned to a specialized View class, which has to be included in the plugin. The assignment is done in the PluginQuizViewFactory (see the source code for details). Some of the special View classes already existing in the model plugin were used to fulfill some of the required features, as follows:


• ImageDummyView for "picture": it either provides a dummy picture if no source has been specified yet, or imports and displays the picture (thus forming the second part necessary for the required picture import, after the browse function described above).

• ListNodeView for "list": displays the list prefix chosen in the corresponding attribute "type" before the marker of each "list-item" element of this list.

• ItemNodeView for "paragraph": its name results from its former use in existing plugins; here it is used to show the content of a paragraph element without a marker named "paragraph" before it, i.e. hiding the marker. This, as well as the option to hide the remaining whitespace, both have to be specified in "plugin.xml" (see the documentation for details).

As can be seen, adapting the DTD objects to these existing View classes already implements many of the required features. Two important functions that remain to be added are the code import from an external file and the creation and presentation of blanks in the blanks-text.

Code import

The code import needed by the "code-elem" element should work in a similar way to the picture import. In a first step, the panel that contains the attribute "href" of "code-elem" was adapted analogously to the one for "picture", to provide a browse function. The second step, concerning the view, is a little more difficult to realize, because the element "code-elem" can contain either code that was typed in or code imported from a file whose source is specified in the attribute "href", as explained before (see chapter 4.2).

To solve this, it is first important to understand the general relationship between the elements of the underlying XML file and the views created and displayed in GenDoc. To each XML document, which is parsed into a DOM tree structure by GenDoc, corresponds a Swing document (interface Document in javax.swing.text), created by the GenDoc class GenDiapoDocumentFactory. In this class, each XML element is converted into one or more Swing elements. In the create method of the class ViewFactory (also in the javax.swing.text package), each of these Swing elements is then associated with a View in the styled view of GenDoc.


For the "code-elem" elements in the XML document, as for all elements, more than one Swing element is actually created and displayed by default views (as described earlier) if nothing else is specified in the PluginQuizViewFactory. This fact was used to solve the problem of having to hide either the imported code or the default text field (PCDATA) view while showing the other, by using two separate views, one per possible content. Accordingly, two View classes were implemented: one that shows a PCDATA text entry field in the editor, and another that displays imported code parts once a source has been chosen.

The first one, CodeElemPCDataView, handles the text field for the PCDATA part belonging to the “code-elem” element in the XML document. In contrast to the default class used to display PCDATA, this new view class behaves like the default PCDataView when no source is defined in the “href” attribute, and “disappears” from the styled view of the editor when code is imported (as can be seen in the screenshot below). That last function is implemented by constantly checking whether a source has been entered and, if so, reducing the view's height to zero and repainting it. This also has the advantage of restoring the normal size when the source becomes empty again, so that the normal PCDataView behaviour is recovered, including any previously typed-in text.

The second class created here, CodeElemDummyView, displays the code from the source specified in the “href” attribute. Here again, it has to be checked continuously whether the source is empty. When a source has been given (by typing it or browsing for it), the view tries to import its content; this fails with an error message if the path or name is wrong or the file does not contain text. If the file is valid, its text is imported and displayed in the view, whose size is recalculated to fit the imported text.
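The import-or-error behaviour described above can be condensed into a small sketch (the class name, method name and error text are illustrative, not the actual CodeElemDummyView code):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;

public class CodeImportSketch {

    /**
     * Tries to read the file named in the "href" attribute.
     * Returns the file content, or an error message when the path is
     * wrong or the file is unreadable, mirroring the behaviour
     * described for CodeElemDummyView.
     */
    static String importCode(String href) {
        if (href == null || href.isEmpty()) {
            return "";   // empty source: nothing to import
        }
        try {
            return new String(Files.readAllBytes(Paths.get(href)));
        } catch (IOException e) {
            return "Error: could not import code from '" + href + "'";
        }
    }
}
```

In the real view, the returned text (or error message) would then be laid out and the view's size recalculated to fit it.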

The Swing element belonging to “code-elem” that is used to implement the above functions, by changing its default view into the CodeElemDummyView described above, is a so-called “VOID” element. This little auxiliary element generally only serves to create whitespace after the marker name, and can be deleted when needed, as described earlier.

The imported text in the resulting view is nevertheless read-only, in contrast to PCDATA text fields. The purpose is to only show the content of the file, not to modify it. This prevents inconsistencies between the imported code and the code in the file, and avoids introducing typing errors into the file, which, especially for programming code, could cause serious problems.

The following screenshot (Figure 4) shows the result of both classes: the code in the marked Code element was imported from the file specified in the attribute panel, and the code in the Answer elements below was typed in.

Blanks text

The implementation of the blanks in the “blanks-text” element is relatively simple to realize, as these can be handled similarly to the “styledStrings”: by clicking a corresponding button, a selected part of the blanks-text is transformed into a blank, which is represented graphically in a way that differentiates it from the rest of the text. The adaptations needed for the blank element are therefore analogous to those for styledStrings, and the blanks-text is handled like the paragraph; the instructions for this can be found in the adaptation documentation. This also includes adding HTML mark-up that specifies the layout in which the blank will be graphically represented in the editor. In the actual plugin, blanks are marked in red, as can be seen in the screenshot below. The original idea was to surround the text part of the blank with a box, which would better represent a blank, but this is unfortunately not supported by GenDoc.
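The transformation triggered by the button can be sketched as a plain string operation (a minimal sketch; the red <span> mark-up is a stand-in for whatever HTML mark-up the plugin actually inserts):

```java
public class BlankMarkupSketch {

    /**
     * Turns the selected region [start, end) of a blanks-text into a
     * blank by wrapping it in mark-up that renders it in red,
     * mirroring the styledString-like transformation described above.
     */
    static String markBlank(String text, int start, int end) {
        return text.substring(0, start)
             + "<span style=\"color:red\">" + text.substring(start, end) + "</span>"
             + text.substring(end);
    }
}
```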

An example of the result can be seen in the following screenshot:


Figure 4 : Example for both imported and typed code parts


4.3.3 Publishing plugin

After showing the design and implementation of the editing plugin for the Quiz DTD, this

chapter will describe in an analogous way the creation of the second kind of plugin, the

publishing plugin.

4.3.3.1 Design

The general goal of the publishing plugin is, as its name already suggests, to publish the XML file of a quiz based on the Quiz DTD of this project, which would now usually be edited using the editing plugin. Specifically, the quiz should be published in either HTML or PDF format, with a choice of further publishing options, as for the publication of the examples provided with GenDoc.


Figure 5 : Example of a BlanksText


The publishing options to choose from for the new plugin are:

• the already mentioned format (HTML or PDF)

• the language

• whether to show or hide the following information:

  • the right answers (needed for quiz taking)

  • hints (only for self-evaluation)

  • a more detailed answer, as defined in “instructors-answer”, when this is not used for the answer statement (see chapter 4.2)

The publishing should of course also include the code import from external files, which was not done during the editing of the XML file (as described earlier in chapter 4.2.2), and therefore has to happen at this point.

As for the editing plugin, some parts can be adapted from the existing plugins, while others have to be added. As a starting point, there is again a provided model plugin [GEN] to be adapted to the DTD. This again covers the structure, but also many general classes and methods that can be reused, since the existing plugins also handle HTML and PDF transformation, although with different parameters as options. The following section explains the changes and additions made for this plugin.

4.3.3.2 Implementation

General adaptations

As for the editing plugin, the major plugin-specific changes have to be made in the “plugin.xml” file. Compared to the editing plugin, this file is much smaller here, and there are just a few details to adapt, concerning general information about the DTD used and the plugin itself (names, version, paths, etc.), as well as specifications concerning the launch of the transformation process (menu name, Java class to launch, ...). (For more information, see [GEN].)
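Such a descriptor might look roughly as follows (a purely hypothetical sketch: the element and attribute names are invented, not GenDoc's actual plugin.xml schema — see [GEN] for the real format):

```xml
<plugin name="QuizPublish" version="1.0">
  <!-- which DTD the plugin applies to -->
  <dtd name="Quiz1.dtd" path="dtd/Quiz1.dtd"/>
  <!-- how the transformation is launched from the editor -->
  <action menu="Export/Quiz/Publication"
          class="org.merlotxml.merlot.plugins.action.xslQuiz.PluginQuiz"/>
</plugin>
```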


Adaptations in the three main classes

The main adaptations in the code had to be done in the three classes of the

org.merlotxml.merlot.plugins.action.xslQuiz package concerning specific details of the

publishing action:

• PluginQuiz: This class is called to start the transformation and therefore has to be specified in the plugin.xml file. The two following classes are called from it. It normally does not have to be changed, but in this case it was slightly altered, because the way the parameters (read in by the class PublishUI and passed on to the class Publish) are transferred was changed: they were previously handed over in an XML file, and are now passed in a hashtable.

• PublishUI: This class manages the dialog that opens when choosing “Export → Quiz → Publication” in the “File” menu for a quiz currently open in the editor. The dialog serves to choose the publishing format, the language, the optionally shown additional information (that is, the right answers, hints and the extra details on the answers, as mentioned above), and the location and name of the resulting file. In this case, it had to be rewritten (by adapting some existing methods and writing new ones for the remaining tasks) to handle the necessary parameters. The resulting dialog can be seen in the screenshot below.


Figure 6 : Publishing dialog for Quizzes


Some elements used by this class deserve further explanation.

GenDoc can display the instructions and the selectable parameters in different languages, chosen automatically depending on the default language of the computer system in use. For this, all displayed text has to be specified in “.properties” files (e.g. ui.properties and fr_ui.properties, holding the same information in English and French respectively), which are used in the PublishUI class. This separation of functionality and displayed text also has the advantage that the text can be changed more easily when necessary. All these '.properties' files have to be placed in the org.merlotxml.merlot.plugins.action.xslQuiz.resource directory.
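The lookup mechanism can be sketched with Java's standard ResourceBundle machinery (a self-contained sketch: ListResourceBundle classes stand in for the actual .properties files, the keys are invented, and note that the standard naming convention used here, Ui_fr, differs from GenDoc's fr_ui.properties prefix scheme):

```java
import java.util.ListResourceBundle;
import java.util.Locale;
import java.util.ResourceBundle;

// Stand-in for ui.properties (base language: English).
class Ui extends ListResourceBundle {
    protected Object[][] getContents() {
        return new Object[][] { { "publish.format", "Format" },
                                { "publish.language", "Language" } };
    }
}

// Stand-in for fr_ui.properties.
class Ui_fr extends ListResourceBundle {
    protected Object[][] getContents() {
        return new Object[][] { { "publish.format", "Format" },
                                { "publish.language", "Langue" } };
    }
}

public class LocalizationSketch {
    // Looks up a UI string for the given locale, as PublishUI might.
    static String label(String key, Locale locale) {
        return ResourceBundle.getBundle("Ui", locale).getString(key);
    }
}
```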

To filter the right files while browsing for a path and name for the resulting file, PublishUI

uses Filter classes located in the same package, namely HTMLFileFilter and PDFFileFilter.

• Publish: This class launches the actual transformation process, to either PDF or HTML depending on the chosen format. The actual call is made in the doItPDF or doItHTML method respectively, which hands over the parameters (passed in a hashtable, as explained earlier) and the necessary XSL stylesheets to GenDoc's transformation method. It is possible to perform more than one XSL transformation on the given quiz file by specifying successive calls of the transformation method with the XSL files to apply. A closer examination of the XSL stylesheets used here is provided below.
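The hand-over of the parameters in a hashtable and the format dispatch might be sketched like this (the key names and the dispatch method are assumptions, not the actual PluginQuiz/Publish code):

```java
import java.util.Hashtable;

public class PublishSketch {

    // Collects the options read in by the PublishUI dialog, as they
    // might be handed from PluginQuiz to Publish in a hashtable.
    static Hashtable<String, String> collectParams(String format, String lang,
                                                   boolean showAnswers) {
        Hashtable<String, String> params = new Hashtable<>();
        params.put("format", format);          // "html" or "pdf"
        params.put("lang", lang);              // e.g. "en" or "fr"
        params.put("show-answers", Boolean.toString(showAnswers));
        return params;
    }

    // Dispatches to the right transformation, mirroring doItHTML/doItPDF.
    static String dispatch(Hashtable<String, String> params) {
        return "pdf".equals(params.get("format")) ? "doItPDF" : "doItHTML";
    }
}
```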

XSL stylesheets

Logically, two XSL stylesheets (one per format) are needed for this plugin: Quiz.xsl for the XSLT transformation to HTML, and QuizFO.xsl for the XSL-FO transformation to PDF. Both stylesheets had to be created specifically for the Quiz DTD. The basic layout is adapted from the stylesheets that already existed for Peter Sander's DTD, but the functionality was completely reconstructed (in the form of templates, one for each element, with the corresponding functionality) to fit the DTD.

The transformations defined in these stylesheets also use the remaining parameters passed to the transformation process (as described before). These influence the content of the resulting file both by showing or hiding the optional information and by presenting the quiz in the chosen language. The language part involves two tasks. First, publishing only those text parts that are either neutral or specified in the chosen language, which can be done by comparing the language parameter to the “lang” attribute of the concerned elements. Second, including the right fixed or repetitive words, which are defined in special XML files (as explained earlier in the context of the DTD development). This is simplified by the fact that the file names contain the abbreviation of the language, which is the same as the passed parameter, so that the file and the words it contains can be accessed from the XSL file like this (where “lang-file” is a concatenation of the language abbreviation and the ending “.xml”):

<xsl:variable name="Instructions" select="document($lang-file)/local-stuff/question"/>

The language-specific files used in this plugin hold replacement text for instructions, question, answer, true, false, hint, instructors-answer (as extra details about an answer) and FIB. These files, as well as all XSL files, have to be placed in the org.merlotxml.merlot.plugins.action.xslQuiz.xsl directory.
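The first of the two tasks, publishing only text parts that are neutral or in the chosen language, might be expressed in a template roughly like this (a hypothetical sketch, not taken from Quiz.xsl; the element name “statement” is illustrative):

```xml
<!-- $lang holds the language abbreviation passed to the transformation -->
<xsl:template match="statement">
  <!-- publish the text only when it is neutral (no lang attribute)
       or written in the chosen language -->
  <xsl:if test="not(@lang) or @lang = $lang">
    <xsl:apply-templates/>
  </xsl:if>
</xsl:template>
```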

Code Import

One piece of functionality that could not be handled in these two main transformation processes is the code import from external files. The reason lies in how this inclusion works. As explained earlier (see chapter 4.1.2), the import in Peter Sander's DTD was realized by declaring the names of the picture and code files as entities in the DOCTYPE declaration at the beginning of the quiz's XML file, and placing references to these entities in the corresponding XML elements. The content of these files is then included automatically by the XML processor during a transformation. To use the same idea in this plugin, the DOCTYPE declaration at the beginning of the XML file to be published, as well as the entity references, have to be created and inserted before the actual transformation; otherwise the implicit inclusion will not work.

For this, another XSL stylesheet called codeimport.xsl had to be created, which has to be executed before the main one described above. Its function can be explained as follows: the transformation does not actually copy the needed information into the initial quiz XML file, as this is not possible. Still, it achieves the same result by creating a new XML file with the same name that contains copies of all the content and tags of the original file, but in addition also carries the DOCTYPE declaration and the references to the entities.

The DOCTYPE with the entities is included by adding text at the root of the resulting XML file, and is therefore done within a template matching the root:

<xsl:template match="/">

There, for each “code-elem” whose “href” attribute is not empty, meaning that code has to be imported, an entity is created with the following pattern:

<!ENTITY name SYSTEM "path">

where path is the path to the file given in the “href” attribute, and name is a unique name for this specific entity, which will be referenced in the content of the concerned “code-elem”. This referencing is done in the second step, when the complete XML file is copied, by a special template that copies all “code-elem” elements and includes the entity reference in those that again have a non-empty “href” attribute. One problem here is that the name defined in the entity and in its reference must be identical, and also unique. The solution chosen is to build the entity name by concatenating the word “entity” with the position of the current “code-elem” among all “code-elem” elements in the file; this always yields the same number for the same “code-elem” and is therefore unique. The number is simply returned by <xsl:number level="any"/> called for the currently treated element.
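Put together, the entity-generating part of codeimport.xsl might look roughly like this (a sketch reconstructed from the description above, not the actual stylesheet; the root element name “quiz” is an assumption):

```xml
<xsl:template match="/">
  <!-- emit the DOCTYPE by writing it out as unescaped text -->
  <xsl:text disable-output-escaping="yes">&lt;!DOCTYPE quiz [&#10;</xsl:text>
  <xsl:for-each select="//code-elem[@href != '']">
    <!-- one entity per code-elem with a non-empty href;
         xsl:number yields a unique, stable name -->
    <xsl:text disable-output-escaping="yes">&lt;!ENTITY entity</xsl:text>
    <xsl:number level="any"/>
    <xsl:text> SYSTEM "</xsl:text>
    <xsl:value-of select="@href"/>
    <xsl:text disable-output-escaping="yes">"&gt;&#10;</xsl:text>
  </xsl:for-each>
  <xsl:text disable-output-escaping="yes">]&gt;&#10;</xsl:text>
  <!-- then copy the rest of the document -->
  <xsl:apply-templates/>
</xsl:template>
```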

A note on both developed plugins

The final versions of both new plugins, containing all the extra features described above, are currently being tested at the ESSI. A first evaluation of the advantages and disadvantages, or strengths and weaknesses, of this implementation is given in a later chapter on the evaluation of the implementations.


4.4 Connection to “eXam” Quiz Server

4.4.1 Design

The goal of the last practical part of the project is to make it possible to import quizzes created with the above-described plugin into the eXam quiz server, and also to export quizzes from it and save them in an XML file validated against Quiz1.dtd (as described in chapter 4.2). The resulting program therefore has to establish a connection between the XML files and the server, and then exchange the corresponding data.

The connection established by the program, which consists of accessing the needed data on one side (XML file or eXam) and handing it over to the other, depends on the technology used on either side:

1. In eXam, access to the data in the database is handled through Enterprise JavaBeans (EJB). These already provide methods for creating new quizzes and accessing existing ones, which can simply be called by the connection program.

2. To access data in XML files, the JAXB technology (Java Architecture for XML Binding) can be used, which automates the mapping between XML documents and Java objects (http://java.sun.com/xml/jaxb/). It also supplies tools for creating new XML files. The steps necessary to implement this for the given Quiz DTD are explained in the following implementation section.

The second step to enable correct data exchange is to ensure a correct mapping of information from one side to elements of the other. As the quizzes in eXam and those based on Quiz1.dtd do not have exactly the same structure, this mapping also has to be implemented in the connection program.


4.4.2 Implementation

JAXB

The first step of the implementation consists of creating the Java classes that JAXB needs for the automated mapping and that will be accessed by the connection program. The creation of these classes, which are bound to a specific XML structure (hence called the binding process), has to be done once, by compiling an XML schema into one or more Java classes. To use JAXB in this project, the Quiz DTD therefore first had to be transformed into the required XML Schema, which was done with the free converter tool Trang (http://www.thaiopensource.com/relaxng/trang.html). The classes resulting from the binding are all located in the quiz.binding package used in the program; the Javadoc files generated along with them were added to the attached CD. More detailed information and instructions on the binding process can be found in the Java Web Services Tutorial [WST].

Important parts of the connection program

This section provides specific details on the most important parts of the program itself, to clarify the parts that need explanation. The complete source code can again be found on the CD.

The main parts of the program that will be further described are:

1. the GUI needed to connect and get the necessary parameters for the following processes

2. the two classes that perform the actual import and export

The GUI

The two tasks of the GUI are first to connect to eXam and then to either import or export quizzes. For the connection to eXam, the name of the host server on which eXam is running, as well as a valid user name and password for this server, have to be entered. Once this has succeeded, the parameters necessary for the chosen process (import or export) have to be provided. These are, depending on the process, as follows:


For the import:

• an existing XML document to be imported into eXam, which can be located using the browse function. A preview of the chosen file helps to ensure that the correct quiz will be imported.

• the choice of one of the languages supported by the Quiz DTD and the plugins (for now just French and English). This is needed to specify which text elements should be taken, as the XML document supports text parts in multiple languages in parallel for one element, as mentioned before (see chapter 4.2).

• a user group for which the test will be made available, chosen from a list of all user groups existing on the chosen eXam server.

For the export:

• the list of all current quizzes on eXam (returned from a request to it), from which the one to be exported has to be chosen. As a help, a preview of the resulting XML document is displayed in a panel; the document is only saved when requested by clicking the corresponding button.

• again the language, this time needed to set the “lang” attribute of the text elements in the XML file to the correct language.

• the name and location (either existing or new) of the resulting XML document.


The classes performing the import and export processes

The two classes that perform the actual import and export processes are ImportQuiz and ExportQuiz. Their task consists mainly of getting the information from one side and mapping/setting it to elements on the other side. Accessing the data in the XML files is in both cases done through the binding classes described earlier, which correspond to the elements. To handle information about quizzes in eXam, existing methods of the EJBs in eXam's middle layer can be used to write these data to and read them from the database in which the quizzes and all their elements are stored. (More details about the EJB tier of eXam can be found in the report on that part of the eXam project, referenced in the Use Instructions in Appendix E under “Installation of eXam”.)

For the import into eXam, methods of the ProfManager session bean created when logging in (see GUI above) can be used to set the data of the new quiz. These are createQuestionary (to create the quiz), addQuestionToActualQuestionary (to add one question to the quiz) and addLineToQuestionOnActualQuestionary (to add a “Line” to a question). A Line specifies either the answer of a text question, one possible answer of a choice question, or a blank of the blanks text; these last question types can therefore have more than one Line.

For the export from eXam, the needed data of the chosen quiz can be taken from the fields of the Value Object belonging to its Entity Bean. This object also contains a list of its QuestionValueObjects, which in turn hold, apart from other information, a list of their LineValueObjects. The data can therefore easily be accessed through the corresponding get methods of these objects.

Figure 7 : GUI of the Import/Export module

The mapping between the elements in eXam and those of the XML file, for both import and export, can be found in the mapping tables in Appendix D. In most cases, however, the mapping could not be realized by simply handing over the data; the data had to be further processed, for the following reasons:

• The simplest cases were differences in type (solved by conversion) or in the structure or format of the data (for example, the Date and Time elements of the XML file are Strings in the formats YYYYMMDD and HH:MM-HH:MM, while the corresponding time elements in eXam are saved as java.util.Date-specific Long values).

• Other information is defined explicitly in eXam but only implicitly in the DTD, as for the question types (defined as the “int” variable “qtype” in the QuestionValueObject). For the import, these types therefore had to be determined by case checking, for example by counting the correct answers of a choice question (see also the explanations in chapter 4.2.2).

For use with eXam, non-FIB text questions can be further specialized when edited with GenDoc: to create a text question that will be recognized as a short-answer question during the import to eXam, the first instructors-answer element should contain “the” correct short answer, which should not be longer than 30 characters (including whitespace).

• An important problem to solve was the treatment of all elements that can be contained in a part of text. As eXam supports HTML tags in its text parts (such as the question statement), these have to be transformed (when possible) into StyledString and Content elements of the Quiz DTD during export. This is done in the ExportQuiz class by parsing the Strings passed over from the ValueObjects, in the methods parseContent and parseIntoStyledString. All other tags are skipped and included (with escaped brackets) in the actual text part. During the import process, this transformation has to be done in reverse, by recomposing each text part from all its elements and placing the right HTML tags around the corresponding elements. This is done in the methods getStyledTextAsString and getContentAsString of the ImportQuiz class.

• The blanks text, as a special case of the text parts described above, is treated similarly. In eXam, it consists of a text containing “???” placeholders that represent the blanks, whereas the contents of the blanks are saved separately in Lines. This differs strongly from its representation in the Quiz DTD, where the blanks are included in the text (see chapter 4.2). The import process therefore has to decompose the BlanksText element, which is recomposed again on export.

• One last necessity, rather simple to solve, is to fill in default values where needed. This again shows that the overlap of the two data sets is not complete. Here, the absence of the needed values (feedback texts during import, and instructions during export) results from the different goals of the XML file and of the data in eXam: the XML file serves to store data about the quiz itself and therefore lacks most of the information needed to take the quiz in a computer-based quiz-taking environment, while the eXam database does not contain any instructions, as these are set in the software itself and are universal for all quizzes taken on it. Overall, the lack of these data is therefore only of minor importance.
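Two of these conversions can be sketched in isolation (an illustrative sketch: the method names are invented, the [blank] markers stand in for the DTD's blank elements, and only the format logic described above is reproduced, not the actual ImportQuiz/ExportQuiz code):

```java
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.List;

public class MappingSketch {

    /** Converts a YYYYMMDD date string from the XML file into the
     *  java.util.Date-specific long value that eXam stores
     *  (returns -1 when the string does not parse). */
    static long dateToLong(String yyyymmdd) {
        try {
            return new SimpleDateFormat("yyyyMMdd").parse(yyyymmdd).getTime();
        } catch (ParseException e) {
            return -1L;
        }
    }

    /** Decomposes a blanks text with inline [blank]...[/blank] markers
     *  into eXam's form: a text with "???" placeholders, plus the blank
     *  contents collected separately as Lines. */
    static String decomposeBlanksText(String text, List<String> lines) {
        StringBuilder out = new StringBuilder();
        int pos = 0;
        while (true) {
            int open = text.indexOf("[blank]", pos);
            if (open < 0) { out.append(text.substring(pos)); break; }
            int close = text.indexOf("[/blank]", open);
            out.append(text, pos, open).append("???");
            lines.add(text.substring(open + 7, close)); // blank content -> Line
            pos = close + 8;
        }
        return out.toString();
    }
}
```

The export direction would perform the inverse, substituting each "???" with the content of the corresponding Line.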


4.5 Use case: combining all developed components

The overall goal is for all the components described above to be used together with eXam in one context, where each part undertakes specific tasks concerning the processing of quiz data, as can be seen in the following schema.

The use case

One possible use case involving all three tools is described now, to give an idea of how they can be used. General information on installing, running and using these tools is given in Appendix E (Use Instructions).

Step 1: Create a quiz in eXam

This is done with the help of online forms, as described in the eXam user manual mentioned

before. The quiz is then available to be taken with the chosen parameters or, with the new

functionality, to be exported.

Step 2: Export it in XML format with the developed Connection Program

The export simply transforms the data of the chosen quiz into an XML file based on the quiz DTD (as described in chapter 4.4).

Figure 8 : Components and actions concerning quiz data processing (components: eXam, the Connection Program, and GenDoc with its editing and publishing plugins; actions: create, edit, publish, export, import, take quiz)

Step 3: Modify the XML file of the quiz in GenDoc

The exported quiz can be edited with the developed quiz editing plugin, which is specialized for files based on Quiz1.dtd. The inclusion of additional data, such as images and code from external files, is especially interesting here, as this is not supported by the quiz editing process of eXam.

Step 4: Publish the modified quiz to HTML or PDF

With the developed publishing plugin, the quiz can furthermore be published. Both proposed formats can be of interest for instructors:

• the PDF version makes it possible to obtain a print version of a quiz that could be used for non-computer-based real exams (see the discussion in chapter 3.1).

• the HTML version could serve to present the quiz, with the right answers and further explanations about them, to the participants after the quiz has been returned.

Step 5: Import the quiz to eXam

Finally, the modified quiz (or any other quiz created based on Quiz1.dtd) can be re-imported with the Connection Program. It can then be made available again for further quiz-taking (whether for self-evaluation or exams), or first be modified within eXam to fine-tune its quiz-taking information.


5. Assessment/Evaluation of the developed components

As announced earlier, this chapter presents an evaluation of the developed concepts described in chapter 4. For each component, the advantages and disadvantages of the chosen solution are discussed, including ideas on how the disadvantages could be overcome in the future.

5.1 The Quiz DTD

The advantage of the developed DTD is that it is customized to the needs of quizzes at ESSI, with features such as the code import that do not exist in any given quiz specification. On the downside, being specialized to this field of use, it is less portable and reusable outside the described context than an established quiz specification would be. A solution could be the earlier-mentioned mapping to a quiz specification. A mapping to IMS QTI could prove useful in the future, as this specification seems to be developing towards a standard (see also the remarks on the future of quiz standards, chapter 3.2). The mapping should generally be feasible, because the specifications are much broader (regarding the elements they contain) than the actual Quiz DTD. Still, due to the generalization that would take place, loss of information cannot be ruled out. This especially concerns the additional features added to the DTD, which, among other reasons, led to the choice of creating a custom DTD instead of adapting a standard (see chapter 4.1).
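The mapping itself could be sketched as a small XML transformation. In the following Python sketch, both the source element names and the QTI-like target structure are illustrative assumptions, not the actual vocabulary of the Quiz DTD or of IMS QTI:

```python
import xml.etree.ElementTree as ET

# Hypothetical quiz item; the element and attribute names are assumptions,
# not the actual Quiz1.dtd vocabulary.
quiz_xml = """<quizitem type="MCQ" weight="2">
  <question>Which tag closes an XML element?</question>
  <answer correct="yes">&lt;/tag&gt;</answer>
  <answer correct="no">&lt;tag/&gt;</answer>
</quizitem>"""

def to_qti_like(src: str) -> ET.Element:
    """Map one quiz item onto a simplified, QTI-flavoured structure."""
    item = ET.fromstring(src)
    qti = ET.Element("item", title=item.findtext("question"))
    choices = ET.SubElement(qti, "response_lid")
    for i, ans in enumerate(item.findall("answer")):
        label = ET.SubElement(choices, "response_label", ident=f"A{i}")
        label.text = ans.text
        if ans.get("correct") == "yes":
            qti.set("correct", f"A{i}")   # record which label is right
    return qti

qti = to_qti_like(quiz_xml)
print(ET.tostring(qti, encoding="unicode"))
```

Note that a real mapping would also have to carry over weights, feedback and the code-import feature, which is exactly where the loss of information discussed above would occur.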

One further improvement of the DTD in the future could be to extend the metadata element (which for now contains only the most important items) with those defined, e.g., by the Dublin Core Metadata Initiative (DCMI) (http://dublincore.org/), which as a standard brings further advantages regarding portability and reusability.
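As a sketch, such an extension could look like the following hypothetical DTD fragment; the element names follow DCMI, but the content model is an assumption, not the actual Quiz1.dtd:

```dtd
<!-- Hypothetical sketch: the metadata element extended with some Dublin
     Core element names. Existing content and models are assumptions. -->
<!ELEMENT metadata (title, creator?, date?, language?, subject*, rights?)>
<!ELEMENT creator  (#PCDATA)>  <!-- dc:creator -->
<!ELEMENT subject  (#PCDATA)>  <!-- dc:subject -->
<!ELEMENT rights   (#PCDATA)>  <!-- dc:rights  -->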


5.2 The GenDoc Plugins

5.2.1 The editing plugin

As for the DTD, the main advantage of the editing plugin lies in its specialization. The plugin provides graphical support for the features of the Quiz DTD and therefore helps users in the editing process of quizzes. In comparison to other GenDoc plugins, it also offers new functionality such as the code import.

Nevertheless, its graphical possibilities are restricted by the underlying concept of GenDoc: major changes to the presentation (and to anything in the basic concept of GenDoc) cannot be made in the plugins, even though these are defined as the parts of GenDoc that should implement extensions. Desirable changes in this category are, for example, control over the (sometimes quite capricious) opening and closing behaviour of the attribute panel, or the automatic inclusion of all required sub-elements when a new element is created, which would further simplify the user's work. A way out lies in the fact that GenDoc is completely open source: even general changes could be realized by modifying the source code, if necessary, although this breaks the concept that changes should be confined to the plugins.

Another useful improvement to simplify the editing of text parts would be to globally set the default language of the document while editing, or, better still, the language for all subsequently created text parts, so that changing it only affects elements created later. This feature could perhaps be implemented by the plugin itself, but this would have to be verified first, and would in any case require excellent knowledge of the GenDoc source.

Some general improvements may also arrive in the future, as GenDoc is still under development. At the moment, a web version is planned, which will also implement the above-mentioned automatic creation of sub-elements.

Nevertheless, most of the mentioned disadvantages of this plugin concern general problems of GenDoc that limit the plugin's possibilities. Conversely, strengths of GenDoc become advantages of the plugin: for example, since GenDoc is written in Java, the portability of GenDoc, including the quiz-specific plugins, is assured.

5.2.2 The publishing plugin

The strength of this plugin lies undoubtedly in the added functionality to import code during


publication, as described earlier. One of its weaknesses for now is that the appearance of the resulting publications is rather plain (which, for the actual use, is at the same time an advantage), but also always the same for each quiz publication. This can easily be solved in the future by writing additional stylesheets that define the layout, and by adding the possibility to choose a stylesheet as a parameter in the publication dialog. Creating new stylesheets is quite simple if the structure of the existing ones is copied, so that only the styling has to be adapted.
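The stylesheet parameter could be wired up as a simple lookup from the name chosen in the publication dialog to a stylesheet file; the registry and file names below are purely illustrative assumptions:

```python
from pathlib import Path

# Hypothetical registry of publication stylesheets; file names are assumptions.
STYLESHEETS = {
    "neutral": Path("quiz-neutral.xsl"),   # the current plain default look
    "print":   Path("quiz-print.xsl"),
    "essi":    Path("quiz-essi.xsl"),
}

def resolve_stylesheet(name: str = "neutral") -> Path:
    """Return the stylesheet chosen in the publication dialog,
    falling back to the neutral default for unknown names."""
    return STYLESHEETS.get(name, STYLESHEETS["neutral"])

print(resolve_stylesheet("print"))
print(resolve_stylesheet("fancy"))   # unknown name falls back to the default
```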

5.3 The connection program

The connection program fulfills its required functions of importing quizzes into eXam and exporting them back into an XML format as desired. Although the mapping between the information required on either side already solves the major problems arising from the imperfect overlap (see chapter 4.4.2), some details could still be enhanced in the future. These minor flaws in the exported or imported data, which for now have to be corrected manually with the corresponding editor when needed, but could possibly be solved differently, are described below.

• The first one concerns the intersection of the data, which is currently handled by default values inserted in the corresponding places (see 4.4.2). This solution does not offer any customization of the default values, and could therefore be further enhanced by allowing the user to enter default values if desired (e.g. as additional parameters for the import or export process handed over from the GUI).
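Such customizable defaults could, for instance, simply be merged over the built-in ones; the field names here are illustrative assumptions, not the actual eXam fields:

```python
# Built-in fallback values used when the XML quiz omits information the
# server requires; the field names are illustrative assumptions.
BUILTIN_DEFAULTS = {"duration_minutes": 60, "attempts": 1, "language": "fr"}

def import_settings(user_defaults=None):
    """Combine user-supplied defaults (e.g. handed over from the GUI)
    with the built-in ones; user values win where both are given."""
    merged = dict(BUILTIN_DEFAULTS)
    merged.update(user_defaults or {})
    return merged

print(import_settings({"duration_minutes": 90}))
```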

• A more important flaw concerns the different uses of points per quiz item: in eXam they represent the actual points per question, whereas in the Quiz DTD they represent the weighting of the quiz item compared to others (with a default value of 1), rather than points (see chapter 4.2.2 for further details). This by itself is not a problem, but it can become one regarding the points per line that have to be set during the import of a quiz: for question types that in eXam have more than one line and also more than one correct answer (i.e. MRQ and FIB), the total points of the quiz item have to be split evenly over the correct answers, as their implicitly calculated sum defines the total points of the question in eXam. The problem arises from the fact that these points per line are defined as whole numbers, which the values calculated from the weight points of the XML quiz usually are not. To


solve this problem for now, the partial points are rounded up during import. This is not perfect either, as it often results in an overweighting of such questions compared to others, because the recalculation of the question points sums up all the rounded values. A rather simple measure that does not solve the problem, but mitigates it, consists of multiplying all weight points by a (customizable) factor: rounding up generally larger per-line values reduces the relative deviation between the question points. A final solution can nevertheless only result from general changes in eXam (where this problem was already identified but not yet fixed) or from further adapting the Quiz DTD to eXam.
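The rounding effect and its mitigation by a weight factor can be illustrated with a small calculation; this is a sketch of the behaviour described above, not the actual import code:

```python
import math

def points_per_line(total_points: float, correct_answers: int) -> int:
    """eXam stores whole points per correct line, so the evenly split
    share is rounded up (the behaviour described above)."""
    return math.ceil(total_points / correct_answers)

def question_total(weight: float, correct_answers: int, factor: int = 1) -> int:
    """Question total as eXam recomputes it after import: the rounded
    per-line points summed over all correct lines. `factor` scales the
    weight points first (the proposed mitigation)."""
    return points_per_line(weight * factor, correct_answers) * correct_answers

# A weight-1 MRQ with 3 correct answers: each line rounds up to 1 point,
# so the question re-imports at 3 points instead of 1 (overweighting).
print(question_total(1, 3))          # 3
# Multiplying all weights by 100 first keeps the relative error small:
print(question_total(1, 3, 100))     # 102, close to the exact 100
```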

Further enhancements that aim at better adapting both sides to each other will, similarly to the last point above, have to be made outside the connection program, as they concern changes in the basic concepts of either the DTD or eXam.

Enhancements concerning eXam:

• The source of the code import, as well as hints (apart from links), are currently lost during import, but could be preserved somewhere instead (e.g. in additional elements).

• The inclusion of images in eXam, which for now has to be done manually by including the

necessary HTML code in the text, could be simplified, as done in the GenDoc editing

plugins (see chapter 4.3.2).

Enhancements concerning the Quiz DTD:

The main flaws of the Quiz DTD when used together with eXam lie in its implicitly defined use of information and in the PCDATA fields that generalize the element content. Both cause problems for ensuring the correctness of the entered data, which for now is left to the author of the quiz: the generalized content does not by itself enforce the right formats (e.g. of date and time), and implicit information cannot always guarantee the correct use of the defined distinctions (such as the question type). A partial solution to the latter problem could be to turn the implicit information into explicit information. The former, concerning formats, could eventually be solved by adapting the quiz editing plugin to only allow the required formats, as DTDs are limited in this respect and XML Schema, which would solve it, is not yet supported by GenDoc.
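Such plugin-side checks could be as simple as pattern matching on the entered text before it is saved; the expected date and time formats below are assumptions, not the formats actually prescribed by the Quiz DTD:

```python
import re

# Illustrative format checks a plugin could apply before saving, since a
# DTD cannot constrain #PCDATA content; the formats are assumptions.
DATE_RE = re.compile(r"^\d{4}-\d{2}-\d{2}$")   # e.g. 2004-09-30
TIME_RE = re.compile(r"^\d{2}:\d{2}$")         # e.g. 14:30

def valid_date(text: str) -> bool:
    return bool(DATE_RE.match(text))

def valid_time(text: str) -> bool:
    return bool(TIME_RE.match(text))

print(valid_date("2004-09-30"), valid_time("14:30"))   # True True
print(valid_date("30.09.2004"))                        # False
```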


6. Synopsis and future prospects

This chapter concludes the paper by first summarizing the most important findings of the preceding chapters, followed by future prospects concerning further research needed in the field of interactive quizzes in e-learning.

6.1 Summary

General findings on quizzes in e-learning

A first interesting result from chapter 2.2 is that, in contrast to e-learning in general (as described in chapter 2.1), interactive quizzes cannot be applied as simply or as well to all the fields of application that traditional assessment has (see especially the discussion of quizzes as real exams in chapter 3.1). Nevertheless, they enable new areas of use (like self-evaluation) that draw their strength from the differences compared to the regular way assessments are done, namely the sophisticated automation of different steps of the assessment process, such as feedback and correction of the results.

Differences in adaptability to the e-learning environment also occur at a more detailed level, concerning the questions of a quiz. As described in chapter 2.3, the available question types are suited to varying degrees to use in e-learning, regarding simplicity of creation and use, automatic correction and marking, and suitability for checking complex learning goals. An interesting finding on that last point is that the often devalued, but also widely used, multiple choice questions (both MCQ and MRQ) were found to be capable of checking complex goals, on the condition that they are created well.

Use of quizzes as real exams

An important issue discussed in chapter 3.1 concerns the use of quizzes in e-learning as real exams in education. This area of use has the most restrictions concerning time, content and place, which are generally defined in the special regulations about exams of educational organizations (e.g. universities). As a result of the analysis done here, quizzes as exams can for now only be realized in a way similar to traditional ones, that is, taking place in locations monitored by persons, with the only change being that they are taken on a computer. This is due to the required authentication, which (for now) cannot be assured well enough by means of


computer-based invigilation alone, in otherwise unsupervised places chosen by the exam participants. Still, unmonitored distributed quizzes continue to be tested for now in the form of mock exams. But before using them as real exams in the future, the technology needed for computer-based invigilation has to develop further and, more importantly, the regulations about exams (and with them the philosophy behind them, as discussed at the end of chapter 3.1.2.3) have to change in general. As this last point is not likely to happen very fast, it might be another reason why the future of e-learning is at the moment often seen in “blended learning” [RT03].

Standards

At the moment, no standards exist yet for interactive quizzes in particular (see chapter 3.2). Still, their development has already begun, with many organizations involved in the process. As the actual state of affairs, only two specifications concerning quizzes exist for now: EML and IMS QTI. EML is the more general one, defining a wide variety of e-learning content, whereas IMS QTI concerns only quizzes. As the research showed, it seems probable that the latter could become the quiz standard in the future, as most existing efforts are converging into it and its importance in general is already growing.

Tools

The evaluation of the state of the art of quiz tools in chapter 3.3 showed that quite powerful tools already exist, but the market in general is still rather heterogeneous, offering a wide variety of tools with different levels of detail concerning included features. This results from the multitude of different demands in this sector, which are served by the companies that create these tools. The choice of a tool therefore ultimately depends on the situation it will be used in, which leads to the project undertaken at ESSI.

The project

The practical part described in chapter 4 of this paper has the main goal of enhancing the existing eXam quiz server with an import/export facility. This reflects the finding, already described in chapter 2.2.3, that interoperability is perceived as more and more important. Handling quizzes outside the server in XML format ensures their broad usability, e.g. publication in different formats, as implemented in the publication plugin (chapter 4.3.2). The editing plugin developed for the generic XML editor GenDoc (chapter 4.3.1), which is adapted to


the structure used for the quizzes, thereby improves the editing of those quizzes in XML. To determine the structure of the quizzes, a DTD was developed instead of using an existing specification; this could be remedied later by mapping the DTD to a standard. Also, using a quiz specification would not have helped to solve the problem of matching these elements to those of the eXam quiz server, as those are for now not based on any specification either, though this could change in the future.

The main problems (as described in chapter 5) of this otherwise useful solution arise from two things. On the one side, the trade-off between generalization and specialization of the developed components (DTD and plugins), which can be seen as an advantage or a disadvantage depending on the point of view (thus on the actual situation). On the other side, concerning the import/export program, the main difficulty is, as already said, the mapping between two data structures that do not overlap completely. Although in the actual case the mismatch is only minimal, it underlines the interest of using standards on both sides in the future.

6.2 Future prospects

As could be seen throughout the paper, the area of quizzes in e-learning is not yet fully explored and exploited. Further research is therefore needed to develop and enable more fields of application in the future.

One possible field of research concerns the further enhancement of automated correction and marking. This is of specific interest for those question types that at the moment have to be corrected manually. Essay questions, in particular, are already being treated with great interest, as can be seen from their popularity among the topics of this year's Computer Assisted Assessment conference (http://www.caaconference.com/). The final goal would be to completely automate the marking of quizzes, including those question types that are rarely used for now because of their problematic correction, such as diagrams [TPP03].

Further research is also needed to finally enable exams in remote unmonitored locations. As mentioned earlier, the main hurdle to overcome remains the problem of secure authentication. To solve this, a safe way of remote invigilation has to be found. Another enabling factor is securing the information transmitted between the distributed places. Even more security (and, in the case of exams, invigilation) technology will be needed


if quizzes are eventually also allowed to be taken in mobile situations in the future [SW+03], for greater convenience and independence of the participant.


Bibliography

[ADL] “About ADL SCORM”, http://www.adlnet.org/index.cfm?fuseaction=scormabt&cfid=28250& cftoken=15947985

[AICC] “AICC Subcommittee Working Group Meeting Minutes, June 11 - 15, 2001, Pittsburgh, PA”, http://www.aicc.org/docs/meetings/11jun2001/minutes.htm

[D03] C. Desmoulins: “Definition on e-learning”, at ARCADE - Laboratoire CLIPS, Fédération IMAG, Grenoble, by personal correspondence (email), October 2003.

[DB] “What is DocBook?”, http://www.docbook.org/oasis/intro.html

[DE01] H. Dichanz, A. Ernst: “E-learning – Begriffliche, psychologische und didaktische Überlegungen zum 'electronic learning'”, June 2001, http://www.medienpaed.com/00-2/dichanz_ernst1.pdf

[DPO] TU Braunschweig, FB 10: “Diplomprüfungsordnung für den Studiengang Wirtschaftsinformatik an der Technischen Universität Braunschweig”, http://www.tu-braunschweig.de/Medien-DB/documents/AFS-rkrick_2003-09-23_neue_DPO_WiInfo.pdf, September 2003.

[ECI] “Examinations in Computing over the Internet”, http://mcs.open.ac.uk/eap/Examinations%20in%20Computing%20over%20the%20Internet.doc

[EL] “E-learning Site.com”, relevant pages: Tools and Standards, http://www.e-learningsite.com/

[EML] “Introduction to EML”, http://eml.ou.nl/introduction/explanation.htm

[EX] “eXam : un serveur de QCM pour e-miage”, http://miageprojet.unice.fr/twiki/bin/view/eXamQCM/WebHome

[GEN] “GenDoc information and sources for developers”, http://gendiapo.sourceforge.net/dev/

[H02] N. Hanna: “Effective Use of a Range of Authentic Assessments in a Web-Assisted Pharmacology Course”, http://ifets.ieee.org/periodical/vol_3_2002/hanna.htm

[ISO] “Why standards matter”, http://www.iso.org/iso/en/aboutiso/introduction/index.html#one

[K01] W. Kraemer: “Chance: Ist E-Learning das Bildungsmodell für das 21. Jahrhundert?”, in: Absatzwirtschaft, issue No. 9/2001, p. 158-161, Düsseldorf, September 2001.


[LD] “IMS Learning Design Best Practice and Implementation Guide, Version 1.0 Final Specification”, http://www.imsproject.org/learningdesign/ldv1p0/imsld_bestv1p0.htm

[MAW03] T. Mitchell, N. Aldridge, W. Williamson: “Computer Based Testing of Medical Knowledge”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/mitchel_t.ppt

[ME] “Initial Report on the automatic marker for the M881 April 2002 Mock Exam”, http://mcs.open.ac.uk/eap/Initial%20Report%20on%20the%20automatic%20marker.doc

[MH03] M. McAlpine, I. Hesketh: “Multiple Response Questions - Allowing for chance in authentic assessments”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/mcalpinem.pps

[MIA] “Etude des solutions auteurs”, February 2002, http://miageprojet.unice.fr/twiki/pub/Bordeaux/UlysseRapport/rapport.doc

[N02] C. Normand: “Dossier: Prescription universitaire et elearning”, in: Livres Hebdo, issue No. 460, p. 85-87, Paris, March 2002.

[O96] B. Oakley II: “A Virtual Classroom Approach to Learning Circuit Analysis”, http://ewh.ieee.org/soc/es/Aug1996/002/cd/ms.htm

[OB03] D. O'Hare, A. Boyle: “Assuring quality Computer-Based Assessment development in UK Higher Education”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/ohare_d.ppt

[P01] D. W. Petr: “Cross-checking and good scores go together: Students shrug”, in: 31st ASEE/IEEE Frontiers in Education Conference, Reno, 2001, http://fie.engrng.pitt.edu/fie2001/papers/1160.pdf, or also http://www.google.com/search?q=cache:xnVrM5rWZ2EJ:fie.engrng.pitt.edu/fie2001/papers/1160.pdf+&hl=en&ie=UTF-8

[QOL] Institute for Higher Education Policy: “Quality On the Line: Benchmarks for Success in Internet-Based Distance Education”, http://www.ihep.com/Pubs/PDF/Quality.pdf

[QTIa] “IMS Question & Test Interoperability: ASI Best Practice & Implementation Guide, Final Specification Version 1.2”, http://www.imsglobal.org/question/qtiv1p2/imsqti_asi_bestv1p2.html

[QTIb] “QTI White Paper from IMS Document: IMSWP-1, Version A”, October 2000, http://www.imsproject.org/question/whitepaper.pdf


[RT03] K. Rebmann, W. Tenfelde: “Erfahrungen mit E-Learning in der Hochschullehre”, in: Berufsbildung, Heft 80/2003, p. 20-22, Seelze, 2003.

[RTL02] M. Rubens, P. Trigano, D. Lenne: “Learner Evaluation in Web-Based Learning Environments”, in: Lecture Notes in Computer Science, vol. 2363, p. 1007, Berlin/Heidelberg, 2002.

[S95] F. Schanda: “Computer-Lernprogramme”, p. 77-84, Beltz-Verlag, 1995.

[SW+03] C. Sealy, J. Winkley, P. Humphries, D. Reppert: “'At the Coal Face' – Experiences of Computer-based Exams”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/sealey_c.ppt

[T02] P. Twining: “Exploring descriptions of computer use”, in: “Enhancing the Impact of Investments in 'Educational' ICT”, ch. 4, November 2002, http://kn.open.ac.uk/public/getfile.cfm?documentfileid=2178

[TP+01] P. Thomas, B. Price, et al.: “Experiments with Electronic Examinations over the Internet”, Fifth International Computer Assisted Assessment Conference, Loughborough University, Loughborough, UK, 2001. Formerly found at http://mcs.open.ac.uk/eap/CAA5Paper.html

[TPP+02] P. Thomas, B. Price, C. Paine, M. Richards: “Remote Electronic Examinations: student experiences”, published in British Journal of Educational Technology (BJET), volume 33, No. 5, pp. 539-552, 2002. Abstract found at http://mcs.open.ac.uk/eap/BJETPaper.htm

[TPP03] P. Thomas, C. Paine, B. Price: “Student experiences of remote computer based examinations”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/thomas_p.ppt

[W03] B. Warburton: “CAA in UK Higher Education: The State of the Art”, Presentation at the 7th International Computer Assisted Assessment Conference (CAA), Loughborough, UK, July 2003, http://www.lboro.ac.uk/service/ltd/flicaa/conf2003/ppts/warburton_b.ppt

[WG02] Working group on Norms and Standards in Online Education, SCTIC, CREPUQ: “Les normes et standards de la formation en ligne - État des lieux et enjeux”, September 2002, http://profetic.org/file/norm-0210-d-RAPPORT.pdf, or also miageprojet.unice.fr/twiki/pub/Bordeaux/OutilsAuteursStandards/norm-0210-d-RAPPORT.pdf

[WST] “Java Web Services Tutorial”, http://java.sun.com/webservices/docs/1.3/tutorial/doc/index.html


APPENDIX A: Interaction of groups involved in e-learning standards

Table 1: Organizations involved in the development and use of specifications in the e-learning field

Developers of specifications for structuring numeric information (therefore concerning e-learning), and the organizations using each specification:

• IMS: ADL SCORM, MERLOT, ARIADNE, CanCore, EdNA, GESTALT, OKI-MIT, LRN, ULF
• IEEE: GESTALT, LRN, ULF
• DCMI: EdNA
• AICC: LRN
• ADL SCORM: ULF
• ECTS: (none mentioned)
• EML: (none mentioned)
• ALIC: (none mentioned)

(created with information taken from [WG02], p. 17-18)

Table 2: Standardization organizations concerned by e-learning and their cooperations

Standardization organizations concerned by e-learning, and the organizations cooperating with them on their project:

• ISO - JTC 1 - SC36: IEEE - LTSC, CEN/ISSS, AICC, ARIADNE, IMS, ALIC, ADL, DCMI
• IEEE - LTSC: ISO - JTC 1 - SC36
• CEN/ISSS: IEEE - LTSC
• W3C: (none mentioned)

(created with information taken from [WG02], p. 19)

Table 3: Standard/specification developers and whether they provide a quiz specification

• IMS: yes (IMS QTI)
• DCMI: no, only metadata specifications
• ECTS: no, it is only a credit transfer system for exchange students
• AICC: no; concerning e-learning, it provides only guidelines that promote the interoperability of web-based CMI systems
• EML (now Learning Design (LD), see below): yes (EML)
• Learning Design (approved/specified by IMS as IMS LD [LD]): LD does not contain a quiz specification yet, but will integrate IMS QTI
• IEEE P1484: no, currently no working group focused on quizzes (1)
• ADL: no, but a quiz specification is wanted in later versions of SCORM, probably by adopting IMS QTI (1)
• ALIC: no; for now, their on-line testing working group has only translated and commented on IMS QTI (2)
• OASIS (DocBook TC): DocBook is a general-purpose XML DTD, not specifically made for quizzes (it contains only a question & answer specification; see the evaluation for more details)

(created with data found by personal research and analysis of information from the developers' web pages:
(1) = information taken from http://www.adlnet.org/index.cfm?fuseaction=scormfuture and [QTIa], p. 9
(2) = from http://www.alic.gr.jp/eng/activity/2000/iop/iop_index.htm)


Appendix B: Results of Quiz Specification Evaluation


Evaluation of the quiz specifications. Column order in each row: IMS QTI / EML / P. Sander's DTD / DocBook; the importance of each criterion is given in [brackets] (x = supported, - = not supported, na = not applicable).

A. Basic criteria for static environments

1 Question types:
1a minimum: T/F, MC, MR, FIB, short answer, essay [3]: x / x (FIB by short answ.) / x (see EML) / - (just Q&As)
1b other types [1]: x (many!) / x (sequence, match) / x (see EML) / -
2 Including static media:
2a minimum: pictures [3]: x / x / x / -
2b others, esp. code import [1, area: 3]: - (not found) / - (only codeline) / x (code import added) / perhaps
3 Save right answer(s) of question [3]: x / x / x / na
4 Save possible points per answer (also wrong ones) [2]: x / - (no points used) / x (added!) / na
5 Shuffle order of questions in quiz [1]: x / x / x / na
6 Shuffle order of possible answers per question [1]: x / - / - / na
7 Views:
7a text parallel in multiple languages [2]: - / - (not found) / x (added!) / probably
7b views for different groups of users [1]: x ("objectives") / na (evtl. roles) / na (see EML) / na
8 Metadata for question and quiz:
8a ID [static: 2, dynamic: 3]: x / x / x / probably
8b keywords [1]: x / x / x / probably
9 Accompanying text for quiz and questions:
9a specifically for a question/quiz [2]: x / x (in metadata) / x (in metadata) / probably
9b repetitive, general information / instructions [static: 2, dynamic: 1]: x / x (in metadata) / x (in metadata) / probably
10 Describe rough structure of layout (optional) [1]: x (a lot) / - (not found) / - / na

B. Extension criteria for dynamic environments

1 Time interval for quiz taking (date, time) [3]: - / - / x (added: only date) / perhaps
2 Time limit for answering a question [2]: x / x (in extra-meta) / x (see EML) / -/na
3 Possible feedback:
3a solution of question (with explanations) [2]: x / x / x (see EML) / x
3b hints [1, self-eval.: 2]: x ("hintswitch") / x / x (see EML) / -/na
4 Include dynamic media [1, area: 2-3]: x (audio, video, applet, ...) / x (audio, video) / x (see EML) / -/na
5 Maximal answering attempts per question [1, self-eval.: 2]: x / - (just for whole quiz) / - (see EML) / -/na
6 Necessary default values given? (for 1, 2, 5) [2]: - x x / none / none (see EML) / -/na


Appendix C: Results of the Quiz Tools Evaluation

Description and evaluation of the chosen tools

Annotation: for easier readability/fitting on a page, both result tables were divided in two partswith each 3 tools, with the goal to have all data about one tool in one column

Part 1: Description and evaluation of “GenDoc”, “Respondus” and“Hotpotatoes”

Generaldescription

of the tools(part 1)

GenDoc Respondus 2.0 HotPotatoes

Author /Developer /Owner

Project of theUniversity J. Fourrierof Grenoble, France

http://gendiapo.sourceforge.net/

Respondus Inc. www.respondus.com

Half-baked Software Inc.www.halfbakedsoftware.com/created by team ofUniversity of Victoria,USA:http://web.uvic.ca/hrd/hotpot/

Free /commercialversion

free Commercial, with a 30days trial version

Not freeware (licence topay), but free of chargefor non-profit educationalusers who make theirpages available on theweb

stand-alone /based on e-learningsystem

Stand-alone written inJava (therefore Javaenvironment needed)

Both: print quiz stand-alone, or publish throughuser environments for e-learning systemsBlackboard, eCollege,WebCT or any otherCMS compliant to IMSQTI

Standalone versions forMac and Windowsavailable, to use with awebbrowser that supportsJavaScript

Simple(minimum) /complex(extras)creation tool

Simple (general XMLeditor)

many extra features (spellcheck, dictionaries,archive, ...)

Authoring tool forwebbased interactivequizzes, can create a widerange of interactivequestion types

101

Page 114: Imh/RR/2004/RR-04.24-M.BUFFA.pdf · the XML standards proposed by IMS () for XML quizz definition. IMS is a web source for XML standards, many relating to quizzs or exercices in general.

Interactive Quizzes in E-learning

Generaldescription

of the tools(part 1)

GenDoc Respondus 2.0 HotPotatoes

basic (onlycreation) /additionalfunction (alsopresention /administration / ... of thequizzes)

Basic Also administration ofquizzes and questions

Administration,(presentation viawebbrowser)

Online /offline

offline offline Offline

Toolspecialized inquizzes ?

General XML creatingtool

specialized specialized

Subjectspecialized ?

no no Originally created for usein language teaching, sospecialized on this, butcan be used for othertypes, too

Quiz as realexam / self-evaluation

Not defined all Designed to be used asexercises

Target group(all, schools,universities,companies)

All All using the supported e-learning platforms(mainly education butalso companies)

Mainly schools

Evaluation of the tools by using the criteria (part 1):

| Criterion | GenDoc | Respondus 2.0 | HotPotatoes |
|---|---|---|---|
| *1. functionality* | | | |
| different sorts of questions: | | | |
| minimum: T/F, MC, MR, FIB, essay | Not defined | all | MR (implies MC and T/F), essay, FIB |


Evaluation of the tools by using the criteria (part 1), continued:

| Criterion | GenDoc | Respondus 2.0 | HotPotatoes |
|---|---|---|---|
| other types | Not defined | no | jumbled sentence, crossword, matching/ordering |
| including pictures and other media | Not defined | Graphics, audio, video, HTML, links to multimedia content, math symbols (via Equation Editor) | Graphics, links, audio and video |
| saving/memorizing the right answer(s) | Not defined | Yes (results in point values) | Yes (results only in % of right answers / attempts) |
| layout of quiz: possibility to manipulate during creation, or fixed/predefined/automatically created? | Layout done in the export plugin by use of XSL and CSS | Layout restricted to word-processor options; the general structure is predefined/fixed | Textual layout (titles, button descriptions, ...) and colors can easily be manipulated in the configuration menu; the general structure is predefined/fixed and can only be changed by adapting the source files (JavaScript, HTML) |
| *2. compatibility / interoperability* | | | |
| import formats | XML files | .rtf and .txt files that need to be organized in the required Respondus standard format, and formats of the supported e-learning systems | Opens only HotPotatoes-specific exercise formats |
| export formats (including at least a printable and an interactive version) | XML file, print and publish versions (have to be specialized with a plugin) | Print version, .rtf and .txt files, and formats of the supported e-learning systems | Create a web page; export for printing (copies the quiz in a basic format into the clipboard, to then paste it into any word processor); export to WebCT; saves only HotPotatoes-specific exercise formats |


Evaluation of the tools by using the criteria (part 1), continued:

| Criterion | GenDoc | Respondus 2.0 | HotPotatoes |
|---|---|---|---|
| supporting quiz standards / norms | Possible by importing the DTD of the standard | IMS QTI (through a specific user environment) | Not yet (future project: adaptation of IMS QTI) |
| *3. ergonomy* | | | |
| easy to use (self-explaining names and icons, logic and guidance in the steps to follow, clear GUI, preview of quiz/question) | Very easy (few options in the menu), self-explaining names and icons | Clear menus, good guidance by showing the possible choices at each point, easy fill-in form when creating questions, question-wise preview | Easy fill-in form when creating questions, easy and clear menus and icons, no preview option |
| easy to learn (e.g. by documentation, examples, tutorials, ...) | Self-explaining GUI, good documentation (unfortunately not updated), examples of some possible results on the website | Detailed and easy-to-understand documentation (user guide), online help, demo movies (introduction and some special actions) | Tutorials (available in many languages, online and for download), Viewlets, video examples, contributed by users in different countries |
| *4. extensibility* (in case not all possible/required options are available (yet)): question types, layout, import/export formats, ... | All (by creating a DTD and an export plugin) | Not extensible | Program parts have been developed so that personal adaptations of the appearance and functionality of the exercises can be made easily using JavaScript and HTML, as done by users in different countries, e.g. teaching-tools.de.vu |
| *5. quality of resulting quiz* | | | |


Evaluation of the tools by using the criteria (part 1), continued:

| Criterion | GenDoc | Respondus 2.0 | HotPotatoes |
|---|---|---|---|
| *5. quality of resulting quiz* (ergonomy; platform-independent?; formats already mentioned under export formats) | Depends on the created export plugin/XSL file | Layout depends on the chosen platform; print version; online version via the platform | Good; needs a browser to view (different ones possible) |

Part 2: Description and evaluation of "Auto QCM", Questionmark's "Perception" and Canvas Learning's "Author"

General description of the tools (part 2):

| Criterion | Auto QCM | QM's Perception | Canvas Learning Author |
|---|---|---|---|
| author / developer / owner | e-eleves.com, www.e-eleve.com | Questionmark, www.questionmark.com/perception | Canvas Learning, www.canvaslearning.com |
| free / commercial version | free | Commercial, with a 30-day trial version | Commercial, with a 30-day trial version (price otherwise: normal 649 GBP, educational 99 GBP) |
| stand-alone / based on e-learning system | stand-alone | stand-alone | stand-alone |
| simple (minimum) / complex (extras) creation tool | quite simple | Complex (many extra features) | Complex (extra features concerning the XML code and IMS QTI features) |
| basic (only creation) / additional functions (also presentation / administration / ... of the quizzes) | Creation and presentation of quizzes | Creation of questions and assessments, their presentation and administration | Only quiz creation (program "Player" for delivery) |


General description of the tools (part 2), continued:

| Criterion | Auto QCM | QM's Perception | Canvas Learning Author |
|---|---|---|---|
| online / offline | online | offline | offline |
| tool specialized in quizzes? | yes | yes | no |
| subject-specialized? | No (but only in the French language) | no | no |
| quiz as real exam / self-evaluation | Self-exams or small exams, no real ones | Self-exam, survey and real exam | Could be (use not defined in the design tool) |
| target group (all, schools, universities, companies) | Mainly schools | Mainly companies | All (education and companies, also professional content developers) |

Evaluation of the tools by using the criteria (part 2):

| Criterion | Auto QCM | QM's Perception | Canvas Learning Author |
|---|---|---|---|
| *1. functionality* | | | |
| different sorts of questions: | | | |
| minimum: T/F, MC, MR, FIB, essay | Yes, by 3 basic types: answers as radio button, check box or text field | All except T/F | All ("MC" usable as T/F, MR, MC) |
| other types | no | Hotspot, order, pull-down menu, ... | Drag & drop, drag to order, fruit machine, slider, ... |
| including pictures and other media | no | Yes: graphics, maps, logos, diagrams | Yes (pictures, video, audio, and others via extensions) |
| saving/memorizing the right answer(s) | Yes (+/− points) | Yes, also nearly right answers (partial correctness in FIB) (points as % of right ones) | Yes (and points per answer) |


Evaluation of the tools by using the criteria (part 2), continued:

| Criterion | Auto QCM | QM's Perception | Canvas Learning Author |
|---|---|---|---|
| layout of quiz: possibility to manipulate during creation, or fixed/predefined/automatically created? | No, fixed and automatically created | Design predefined (with a wallpaper to choose), but the structure of the question presentation is chosen when publishing | Not in the author tool (fixed simple version) |
| *2. compatibility / interoperability* | | | |
| import formats | none | ASCII, QML, XML | XML (but opens also other/all types) |
| export formats (including at least a printable and an interactive version) | HTML code (for web or browser use) | QML, XML, HTML (also possible to export to Qpack, ODBL and Access) | Save: XML, and also all others |
| supporting quiz standards / norms | no | No (only XML) | IMS QTI |
| *3. ergonomy* | | | |
| easy to use (self-explaining names and icons, logic and guidance in the steps to follow, clear GUI, preview of quiz/question) | Yes: few options (easy to survey), self-explaining steps (choose per field/checkbox/radio button/drop-down menu, then click and fill in), preview option | Big menus, a little difficult to survey, but the help function is okay; Windows-program look and feel; wizards for creation | + Clear menus; drag & drop and resizing of elements directly in the WYSIWYG view; wizards for creating questions; preview and author modes; advanced editing features (tree view for showing and editing IMS QTI, editing the XML code by hand); preview with testing of the quiz. − Icons a bit confusing |


Evaluation of the tools by using the criteria (part 2), continued:

| Criterion | Auto QCM | QM's Perception | Canvas Learning Author |
|---|---|---|---|
| easy to learn (e.g. by documentation, examples, tutorials, ...) | Few options, self-explaining steps, linked words with explanations and guidance | Wizards for creation, help, online manual, lots of "try it out" examples on the homepage | + Question creation by wizards for all question types; introductory documentation and demos about question creation on the homepage. − Knowledge of IMS QTI needed for more detailed/specific changes in the structure |
| *4. extensibility* (in case not all possible/required options are available (yet)): question types, layout, import/export formats, ... | none | no | Yes (claimed on the homepage, but not explained further) |
| *5. quality of resulting quiz* (ergonomy; platform-independent?; formats already mentioned under export formats) | Good, needs a browser | Look okay; via web browser or Windows-based | Ergonomy of the preview version (the real look depends on the presentation program): explanations of how to use it need to be added (not always clear), simple look (preview!); platform-independent because of the XML code |


Appendix D: Data Mapping eXam / Quiz1.dtd

1. During export: matching elements in the DTD are needed for the fields in QuestionaryValueObject, QuestionValueObject and LineValueObject, as follows:

| Fields in QuestionaryValueObject | Corresponding elements in Quiz1.dtd |
|---|---|
| private int questionarynr; | none (unique number for each questionary in eXam) |
| private String title; | quiz.quiz-header.title |
| private Long creationtime; | quiz.metadata.date-of-creation |
| private Long lastchangedtime; | none (optionally in quiz.metadata.date-of-creation instead of creationtime) |
| private Long starttime; | date + first part of time |
| private Long endtime; | second part of time |
| private String groupe; | quiz.metadata.user-group |
| private int maxpoint; | none |
| private int random; | none |
| private boolean examen; | none |
| private String prof; | quiz.metadata.author |
| private QuestionValueObject[] questions; | all elements of quiz.item-group.quiz-item or quiz.quiz-item of the actual quiz |

| Fields in QuestionValueObject | Corresponding elements in Quiz1.dtd |
|---|---|
| private int questionserialnr; | none (unique number for each question in eXam) |
| private int questionnr; | none (this number signifies the order of the questions in a quiz when displayed) |
| private String title; | quiz-item.question (the actual question statement) |


| Fields in QuestionValueObject (continued) | Corresponding elements in Quiz1.dtd |
|---|---|
| private int qtype; (1 = MCQ, 2 = MRQ, 3 = numeric question, 4 = short answer, 5 = T/F, 6 = FIB, 7 = essay) | if 1, 2, 5: quiz-item.multiple-choice-question; if 3, 4, 6, 7: quiz-item.text-question |
| private int minpoint; | none |
| private int maxpoint; | quiz-item@points |
| private String rightfeedback; | none (optionally instructors-answer) |
| private String wrongfeedback; | depending on question type: essay: instructors-answer; others: none |
| private String timeupfeedback; | none |
| private String link; | in hint, with "path"- or "a"-tags around it! |
| private Long time; | quiz-item@time |
| private LineValueObject[] line; | (see table below) |
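The qtype-to-element rule above can be sketched in Java. This is a hypothetical illustration, not code from the eXam or connection-program source; the class and method names are invented for the example:

```java
// Hypothetical sketch of the qtype -> Quiz1.dtd element mapping described above.
public class QtypeMapping {
    // eXam question type codes, as listed in the table.
    static final int MCQ = 1, MRQ = 2, NUMERIC = 3, SHORT = 4, TF = 5, FIB = 6, ESSAY = 7;

    /** Returns the Quiz1.dtd child element of quiz-item used for a given qtype. */
    static String dtdElementFor(int qtype) {
        switch (qtype) {
            case MCQ: case MRQ: case TF:                      // qtypes 1, 2, 5
                return "multiple-choice-question";
            case NUMERIC: case SHORT: case FIB: case ESSAY:   // qtypes 3, 4, 6, 7
                return "text-question";
            default:
                throw new IllegalArgumentException("unknown qtype: " + qtype);
        }
    }
}
```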

| Fields in LineValueObject | Corresponding elements in Quiz1.dtd |
|---|---|
| private int linenr; | none (continuously numbered for each questionary, starting with "1") |
| private String text; | The answer (proposition) text, depending on question type: 1. choice questions (1, 2): answer; 2. T/F (5): no text!; 3. (3, 4, 7): no text!; 4. blank (6): none |


| Fields in LineValueObject (continued) | Corresponding elements in Quiz1.dtd |
|---|---|
| private String response; | States whether the answer is correct, or none, or the answer itself, depending on question type: 1. choice types (1, 2): 1 if correct, 0 otherwise; 2. T/F (5): "true" or "false" = means empty answer field!; 3. short numeric or text answer (3, 4): ins-answer; 4. FIB (6): blank; 5. essay has no line! (wrong, but it is empty!) |
| private int rightpoints; | none |
| private int wrongpoints; | none |

Annotations about the incomplete matching of values:

1. Values from eXam without a corresponding part in the DTD ("none" in the tables): all of these are quiz-taking-specific information and are therefore not available in the DTD (as explained in the chapter about standards and the DTD in the paper).

2. Required Quiz1.dtd elements without corresponding fields/values in eXam need default values, as follows:
   - attribute lang: hand over the language in the GUI, or set the default to "fr" (as in the DTD, too)
   - element quiz-header.instructions: set either empty or a default text

2. During import: the data required for the EJB methods used has to be taken from the DTD elements as follows (annotation: if no DTD element corresponds, a default value has to be given):

• createQuestionary(String title, Long starttime, Long endtime, String groupe)

| Needed values for "createQuestionary" | Taken from these elements in Quiz1.dtd |
|---|---|
| String title; | quiz.quiz-header.title |
| Long starttime; | "date" + first part of "time" (if time exists, otherwise a default), transformed into java.util.Date and Calendar values |
| Long endtime; | similar to above; only change: take the second part of "time" |
| String groupe; | none: get as a parameter from the GUI (optionally from metadata.user-group, if it corresponds to an eXam group) |
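The combination of the quiz "date" with the first or second part of "time" into Long values can be sketched as follows. This is a hypothetical illustration, not the actual import code; in particular, the "HH:mm-HH:mm" range format and the method names are assumptions for the example:

```java
import java.util.Calendar;

// Hypothetical sketch: build the Long start/end times from a quiz date
// and a time range such as "09:00-10:30" (format assumed, not specified here).
public class QuizTimes {
    /** Combines a day (year, month 1-12, day) with an "HH:mm" string into epoch milliseconds. */
    static long toMillis(int year, int month, int day, String hhmm) {
        String[] p = hhmm.split(":");
        Calendar c = Calendar.getInstance();
        c.clear();  // zero out seconds and milliseconds
        c.set(year, month - 1, day, Integer.parseInt(p[0]), Integer.parseInt(p[1]));
        return c.getTimeInMillis();
    }

    /** starttime uses the first part of the time range, endtime the second part. */
    static long[] startAndEnd(int year, int month, int day, String range) {
        String[] parts = range.split("-");
        return new long[] { toMillis(year, month, day, parts[0]),
                            toMillis(year, month, day, parts[1]) };
    }
}
```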


• addQuestionToActualQuestionary(String title, int questionType, int minpoint, String rightfeedback, String wrongfeedback, String timeupfeedback, Long time, String link)

| Needed values for "addQuestionToActualQuestionary" | Taken from these elements in Quiz1.dtd |
|---|---|
| String title; | quiz-item.question (the actual question statement) |
| int questionType; (as one of the following numbers: 1 = MCQ, 2 = MRQ, 3 = numeric question, 4 = short answer, 5 = T/F, 6 = FIB, 7 = essay) | If quiz-item.multiple-choice-question: 1. count the lines: if 1 line = T/F, otherwise continue; 2. count the right answers: if only one = MCQ, otherwise MRQ. If quiz-item.text-question: 1. if it contains blankstext = FIB, otherwise continue; 2. look at ins-answer: short (max. 30 characters) = short answer; longer = essay; numeric not supported yet! |
| int minpoint; | none: set default = 0 |
| String rightfeedback; | none: default = "" (optionally instructors-answer) |
| String wrongfeedback; | For the essay type this must be given (the actual answer statement); otherwise none: default = "" (optionally instructors-answer) |
| String timeupfeedback; | none |
| Long time; | quiz-item@time (in seconds) |
| String link; | check whether "path" or "a" in "hint" exists and contains data; otherwise none: "" |
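The question-type inference rules listed above can be sketched in Java. This is a hypothetical illustration of the stated rules, not the actual import code; the class and method names are invented for the example:

```java
// Hypothetical sketch of the question-type inference rules described above.
// Type codes follow the eXam convention: 1=MCQ, 2=MRQ, 4=short answer, 5=T/F, 6=FIB, 7=essay.
public class TypeInference {
    /** quiz-item.multiple-choice-question: one line means T/F, one right answer MCQ, else MRQ. */
    static int inferChoiceType(int lineCount, int rightAnswerCount) {
        if (lineCount == 1) return 5;              // T/F
        return rightAnswerCount == 1 ? 1 : 2;      // MCQ vs. MRQ
    }

    /** quiz-item.text-question: blanks mean FIB, else decide by the instructor-answer length. */
    static int inferTextType(boolean containsBlanks, String insAnswer) {
        if (containsBlanks) return 6;              // FIB
        return insAnswer.length() <= 30 ? 4 : 7;   // short answer vs. essay (numeric not supported yet)
    }
}
```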

• addLineToQuestionOnActualQuestionary(int questionnr, String text, String response, int rightpoints, int wrongpoints)

| Needed values for "addLineToQuestionOnActualQuestionary" | Taken from these elements in Quiz1.dtd |
|---|---|
| int questionnr; | none: being a continuous number, it is returned for a new question by the "addQuestion..." method above |


| Needed values for "addLineToQuestionOnActualQuestionary" (continued) | Taken from these elements in Quiz1.dtd |
|---|---|
| String text; | The answer statement, depending on question type: 1. choice questions (1, 2): answer; 2. others: empty/no text |
| String response; | Correctness of the answer, with differences depending on question type: 1. choice questions (1, 2): @correct with "1" if correct, "0" otherwise; 2. T/F: "true" or "false"; 3. short answer: the answer text; 4. FIB: one blank |
| int rightpoints; | For wrong answers: 0. For right ones, depending on question type: MRQ and FIB: proportionate part of the total points for this question; all others: always all points (= maxpoints), as there is just one answer |
| int wrongpoints; | none: default = 0 points |
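The rightpoints rule above can be sketched in Java. This is a hypothetical illustration, not the actual import code; the names are invented, and the proportionate share is simplified to integer division:

```java
// Hypothetical sketch of the rightpoints rule above: wrong answers get 0 points;
// for MRQ and FIB each right answer gets a proportionate share of the question's
// total points; for all other types the single right answer gets all points.
public class RightPoints {
    /**
     * @param proportional true for MRQ and FIB (points shared among right answers)
     * @param rightAnswerCount number of right answers in the question (>= 1)
     */
    static int rightPoints(boolean correct, boolean proportional, int maxpoints, int rightAnswerCount) {
        if (!correct) return 0;
        return proportional ? maxpoints / rightAnswerCount : maxpoints;  // integer division as a simplification
    }
}
```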


Appendix E: Usage instructions for the required programs

The following instructions should help persons who want to use the developed components described in this paper. They provide information on how to install, start and use each of the three programs involved (the generic XML editor GenDoc with the new plugins, the eXam quiz server, and the Import/Export program), or refer to already existing manuals.

1. GenDoc

A general user manual covering the installation, launch and use of GenDoc can be found at http://gendiapo.sourceforge.net/download/normal.php. Although this is the manual of an older version of the editor, it is still mostly valid with respect to the general functionality. In the following, some additional information on using GenDoc with the developed quiz-specific plugins is provided:

1. Installation: after downloading and installing GenDoc (download the needed archive from http://gendiapo.sourceforge.net/download/normal.php and unzip it at the desired place), place both jar files of the quiz-specific plugins (Quiz.jar and XSLQuiz.jar, found on the attached CD) into the "plugins/" directory of the main directory of the current GenDoc version (e.g. GenDoc-1.0-beta2/). The "plugins/" directory also contains all other plugins that are provided with GenDoc.

2. Running the program: the general launch command has to be modified to ensure that the publication process for quizzes works correctly, depending on the version of Java used:

• when using Java 1.3.x: launch with the regular command "GenDoc.sh" (Linux) or "GenDoc.bat" (Windows), both situated in the main directory of the current GenDoc version.

• when using Java 1.4.x: launch GenDoc with "java -Xbootclasspath/p:lib/saxon.jar -jar GenDoc-1.0-beta2.jar", again from the main directory of the current GenDoc version.

3. Use: the new plugins are used in the same way as all others, as described in the GenDoc manual. The additional features are generally self-explanatory (at editing time: using the code import by browsing for a file, and adding a blank by clicking the "blank" button while having a blank selected; at publication time: choosing the parameters and the resulting file), and are otherwise explained in greater detail in the paper.

2. The eXam quiz server

The eXam quiz server already has documentation that contains the necessary information:

• The installation of the necessary software is described in chapter 5 (Implementation details) of the final report on the EJB tier of eXam, situated at: http://miageprojet.unice.fr/twiki/pub/Projets2002/ProjetsQcmIUP2MoritzDaniel/Studienarbeit.htm

• The use of eXam through the web-based GUI (including information on the creation and administration of quizzes and user accounts, as well as the quiz-taking itself) can be found in the basic user manual at: http://miageprojet.unice.fr/twiki/pub/Projets2002/ProjetsQcmIUP2MoritzDaniel/eXamUserManual.doc

3. Connection program (Import/Export client)

This program, which was developed in this project, serves as an Import/Export extension for the eXam quiz server. It therefore requires the installation and launch of the programs described above, and has the same general system requirements. Some more specific details on installation, launch and use follow:

• Installation: the program can be found on the CD in the form of a jar file named client.jar. For now, it has to be installed on the same machine as the server.

• Launch: before launching, the following have to be added to the classpath:
  • the same JBoss-specific dependencies as described for eXam (see the user manual)
  • the JAXB archives as described in the JAXB chapter of the JWSDP tutorials [WST], namely the following ones from the JWSDP home directory: jaxb/lib/*.jar, jwsdp-shared/lib/*.jar and jaxp/lib/**/*.jar
Then launch the program with: java -jar client.jar

• Use: the use, which is mostly self-explanatory, consists of the following steps (also explained in chapter 3.4.2 of the paper, in the description of the GUI):

1. Connect to the eXam quiz server in use by providing the correct host name, as well as a user name and password that are valid for this server, in the corresponding fields of the "connection" panel, and confirm by clicking the "login" button.

2. Once the connection has been established successfully, choose either to import or to export a quiz (from the respective tabs), and provide the necessary information. As explained in the paper, these are, for the export: the quiz on eXam to export (chosen from the displayed list), the desired language, and the name and location of the resulting XML file; and for the import: the existing XML document of the quiz to import, the language, and a user group, chosen from the list of user groups that currently exist in eXam.
