Student Modelling (ITESM course slides transcript)
Source: homepage.cem.itesm.mx/juresti/ITS/Diapositivas/Tema 5 - The Student...
What is a Student Model?
Representation of the computer system’s beliefs about the learner:
Knowledge
Behaviour
Abstract representation of the learner in the system
Captures learner’s understanding and misunderstanding of the domain
2 Revisión 200811
What is a Student Model? ...
Can be viewed as capable of simulating the process by which the learner solves a problem
Should be able to:
Predict what the learner will do next
Work backwards from learner behaviour to generate an explanation
What is a Student Model? ...
Problems:
No consensus as to what to include in the SM.
Prior relevant learning.
Progress within the curriculum
Preferred learning style
Other learner-related information
Not known whether a SM is necessary for effective and efficient instruction.
What is a Student Model? ...
Types of SM
Explicit SM
Representation of the learner in the learning system that is used to drive instructional decisions.
Implicit SM
Reflected in design decisions derived from the system designer’s view of the learner.
Example: metaphors and icons used in an HCI (snapshots of observed learner behaviour)
What is a Student Model? ...
Dimensions of SM
Behavioural simulation model
Description of actions
What the learner is observed doing
Functional simulation model
Description of beliefs and goals
What the learner knows and is trying to do
Dimensions of SM
[Figure: example systems (Mycin/Guidon, Neomycin, Lisp Tutor, Proust, Sophie-III) positioned along the behavioural-functional dimension and against a general model of the domain vs. a model of reasoning]
What is a Student Model? ...
Assist in:
Selecting the content
Selecting the tutorial strategy
Confirming diagnoses
Diagnosis = process of inferring the SM
Student modelling problem:
SM = data structure representing the learner’s knowledge
Diagnosis = process to manipulate that data structure
What is a Student Model? ...
VanLehn’s uses of a SM
Advancement – select level of mastery
Offering unsolicited advice
Problem generation
Adapting explanations
What is a Student Model? ...
Barriers to student modelling:
Environment contains a large amount of uncertainty and noise
Learner’s inferences may be unsound and based on inconsistent knowledge
Constructing explanations from behaviours is computationally intractable
Intractable problem (learners engage in unanticipated, novel behaviour that requires much sophistication to interpret)
Knowledge representation in SM
Quantitative scores from domain tests or binary answers
Domain knowledge as:
Overlay
Mal-rules (buggy rules)
Learner as a subset of a cognitive model for the domain
Recent work:
Learning style
Affective state
Individual attributes
Overlay Models
The learner’s knowledge is treated as a subset of an expert’s knowledge
Instruction = establish the closest possible correspondence between the two
Comparison between learner’s and expert’s behaviour
Differences = learner’s lack of skill
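A minimal sketch of the overlay idea in Python (the proposition names mirror the figure; all code here is illustrative, not part of any described system):

```python
# Overlay student model: the learner's knowledge is stored as a subset of
# the expert's domain propositions (names are illustrative).
expert_knowledge = {"P0", "P1", "P2", "P3", "P4", "P11", "P12", "P33"}

class OverlayModel:
    def __init__(self):
        self.known = set()  # propositions the learner has demonstrated

    def observe_correct_use(self, proposition):
        if proposition in expert_knowledge:
            self.known.add(proposition)

    def gaps(self):
        # Differences between expert and learner = candidate teaching targets
        return expert_knowledge - self.known

model = OverlayModel()
model.observe_correct_use("P0")
model.observe_correct_use("P3")
print(sorted(model.gaps()))  # remaining propositions to teach
```

Instruction then aims to shrink `gaps()` toward the empty set.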
[Figure: domain knowledge as a set of propositions (P0, P1, P2, ...); the overlay student model is the subset the learner knows]
Overlay Models ...
Learner = simple mechanism that supports inferencing about his cognitive state relative to the ideal domain expert
Works well when only teaching the domain to the learner
Problem: learner’s knowledge may not always be a subset of the expert’s knowledge
Example: misconceptions an expert does not have, or a different way of approaching a task
Overlay Models ...
Differential model
Modification of the overlay model
Acknowledges differences between the expert’s and the learner’s knowledge
Two types of knowledge:
Knowledge the learner should know
Knowledge the learner could not be expected to know
Does not assume all gaps in the learner’s knowledge are undesirable
[Figure: differential model - within the domain knowledge (P0, P1, ...), a subset of expected student knowledge; the overlay student model is compared against that subset rather than the whole domain]
Perturbation models and Bug models
Combines the overlay model with a representation of faulty knowledge
Learner is not a subset of the expert: possesses knowledge potentially different in quantity and quality from the expert’s
Usually represented as an overlay model augmented with misconceptions
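A sketch of an overlay augmented with a bug library, as described above (the rule and bug names are hypothetical examples, not from the slides):

```python
# Perturbation model: an overlay of correct rules plus a library of
# mal-rules (misconceptions) the learner may hold.
expert_rules = {"add_fractions", "find_common_denominator"}
bug_library = {"add_numerators_and_denominators"}  # catalogued misconception

class PerturbationModel:
    def __init__(self):
        self.mastered = set()  # correct rules the learner uses
        self.bugs = set()      # mal-rules the learner appears to hold

    def diagnose(self, observed_rule):
        if observed_rule in expert_rules:
            self.mastered.add(observed_rule)
        elif observed_rule in bug_library:
            self.bugs.add(observed_rule)  # matched a catalogued bug
        # else: behaviour outside the library -> risk of misdiagnosis

m = PerturbationModel()
m.diagnose("add_numerators_and_denominators")
print(m.bugs)
```

The final `else` branch is where the coverage disadvantage noted below shows up: behaviour outside the library cannot be diagnosed.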
Perturbation and Bug models ...
Misconception
When a learner demonstrates a more or less consistent but incorrect general model
Bug
A structural flaw in a procedure that often manifests itself in faulty behaviour
These terms are used interchangeably and organized in a “bug library” or “bug catalogue”
[Figure: perturbation student model - the overlay over the domain propositions (P0, P1, ...) is augmented with buggy variants (P0’, P1’’, P2’, ...) lying outside the domain knowledge]
Perturbation and Bug models ...
Perturbation model updated with regard to the presence or absence of bugs
Advantage: allows a more sophisticated understanding of the learner
Disadvantages:
Misdiagnosis if a bug is not present in the library
Coverage of the complete library
May uncover a bug but not explain why it has occurred
Reteaching may be as beneficial
Bugs may vary over representations of the same domain
Perturbation and Bug models ...
Approaches to the development and representation of bug libraries
Enumerative
Enumerate bugs based on empirical analysis of learner’s errors
Pros:
Easy to create
Cons:
Effort to assemble and maintain it
Analysis of a large database
Perturbation and Bug models ...
Approaches to ...
Reconstructive
Reconstruct bugs on the basis of observed errors
Pros:
Only plausible bugs are created
Cons:
Reconstruction may be misleading
Perturbation and Bug models ...
Approaches to ...
Generative
Try to generate bugs based on a set of underlying misconceptions
Pros:
Offer plausible explanation of bugs from their generation
Provide context for interpreting observed errors
Cons:
Implausible bugs may be generated
Perturbation and Bug models ...
Bug library is not a SM
Perturbation model must:
Add an interpretation to evolving patterns of bug use and avoidance
Make use of a bug library to help define the space of possible misconceptions
How to build a SM
Who is being modelled?
Degree of specialization - individual or classes of learners
Temporal extent - how long will the learner history be maintained
What is being modelled?
Goals and plans
Capabilities
Attitudes
Knowledge or beliefs
How to build a SM ...
How is the model to be acquired and maintained?
Acquisition techniques to learn facts about the learner
Ability to incorporate new information into the existing model, as well as dealing with discrepancies
Why is the model there?
To elicit information from the learner
Provide the learner with help and advice
Provide feedback to the learner
Interpret the behaviour of the learner
Methods for initialising SM
Users outlining their own learning goals
Users providing a self-description (personality, knowledge, etc.)
Users being given a pre-test on the subject area
Self’s recommendations on SM
Design the student-computer interactions such that information needed to build a SM is provided by the learner rather than being inferred by the system
Link the proposed content of the SM with specific instructional actions
Make the content of the SM accessible to the learner, in order to encourage reflection on the part of the learner
Self’s recommendations ...
Assume a collaborative role for the ITS (the fidelity of the SM is then of less importance)
View the contents of SMs as representing the learner’s beliefs about the world; the role of the ITS is then to assist the learner in elaborating those beliefs
Diagnostic techniques
Model tracing
Assumes all the student’s significant mental states are available to the diagnostic program
An interpreter suggests a set of rules that could be applied next
The diagnostic algorithm fires all rules and obtains a set of possible next states
One should be the state generated by the learner; if so, the learner knows that rule
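One model-tracing step can be sketched as follows (the rules here are toy arithmetic rewrites standing in for a real production system):

```python
# Model tracing: fire every candidate rule on the current state and check
# whether any resulting state matches the learner's observed next state.
def double(state): return state * 2
def increment(state): return state + 1

rules = {"double": double, "increment": increment}

def trace_step(current_state, learner_next_state):
    for name, rule in rules.items():
        if rule(current_state) == learner_next_state:
            return name  # learner appears to know this rule
    return None          # no rule in the model explains the move

print(trace_step(3, 4))  # 'increment'
```

A `None` result is exactly the case later techniques (path finding, condition induction) try to handle.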
Diagnostic techniques ...
Path finding
Given two consecutive states, find a path that takes the first state to the second
The path is given to the model-tracing algorithm, which treats it as a faithful representation of the learner’s mental model
Diagnostic techniques ...
Condition induction
Given two consecutive states, the system constructs a rule that converts one state into the other
Requires two libraries:
A library of operators that convert one state to another
A library of predicates
Applies operators to predicates to find the rule
Diagnostic techniques ...
Plan recognition
Knowledge must be procedural and hierarchical
All the physically observable states of the learner’s problem solving must be available
The problem is analysed as a tree:
Leaves are primitive actions (e.g. writing an equation down)
Non-leaf nodes are subgoals (e.g. factoring an equation)
The root node is the overall goal (e.g. solving an equation)
Plan recognition = inferring a plan tree when only its leaves are given
Diagnostic techniques ...
Plan recognition ...
The plan tree found by plan recognition is given to a model-tracing algorithm
Model tracing then updates the SM
Diagnostic techniques ...
Issue tracing
Coarse-grained variant of model tracing
Based on analysing a short episode of problem solving into a set of micro-skills, or issues, employed during that episode
The analysis does not explain how the issues interacted or what role they played
Diagnostic techniques ...
Issue tracing ...
Steps:
Analyse the learner’s move and the expert’s move into issues
Each issue has two counters: used and missed
Used counters are incremented when the learner uses an issue
Missed counters are incremented when the expert used an issue and the learner did not
If used > missed -> learner understands
If used < missed -> learner does not understand
If used = missed = 0 -> issue has not come up
If used = missed -> decide what to do based on the domain
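The counter bookkeeping above can be sketched directly (the issue names are illustrative):

```python
# Issue tracing: two counters per issue, updated by comparing the learner's
# move against the expert's move for the same step.
from collections import defaultdict

used = defaultdict(int)
missed = defaultdict(int)

def trace(learner_issues, expert_issues):
    for issue in learner_issues:
        used[issue] += 1
    for issue in expert_issues - learner_issues:
        missed[issue] += 1  # expert used it, learner did not

def assessment(issue):
    if used[issue] == missed[issue] == 0:
        return "not come up"
    if used[issue] > missed[issue]:
        return "understood"
    if used[issue] < missed[issue]:
        return "not understood"
    return "domain-dependent"

trace({"factoring"}, {"factoring", "collect_terms"})
print(assessment("factoring"), assessment("collect_terms"))
```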
Diagnostic techniques ...
Issue tracing ...
Problem:
A learner may have ignored only one issue in a given move, but issue tracing blames all issues present in the move -> introduces inaccuracy
Solutions:
Require the missed/used ratio to be high before tutoring on an issue
A system of expectations about which issues are learned first
Diagnostic techniques ...
Expert systems
Use of rules of inference
Provide diagnostic rules for all the situations that arise
Diagnostic techniques ...
Decision trees
The previous techniques do not take into consideration the effect of several bugs in a learner’s move or step
Use of a tree to index all possible pairs of bugs and their interactions
Problems are analysed off-line, before the interaction
The leaves of the tree are diagnoses
Diagnostic techniques ...
Generate and test
Bugs are generated dynamically
Procedure:
Begin by finding a small set of bugs that match some of the learner’s answers
Form pairs of bugs
Add pairs of bugs known to be difficult to spot
Select the best subset that matches the learner’s answers
Continue selecting until the best match is found -> diagnosis
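The selection loop can be sketched as a search over candidate bug subsets, scored by how well they match the learner's answers (all data here is toy; the mapping from bugs to errors is a hypothetical stand-in for running buggy procedures):

```python
# Generate and test: grow candidate bug sets and keep the subset that
# best matches the learner's observed errors.
from itertools import combinations

learner_errors = {"e1", "e2", "e3"}
# Hypothetical mapping from individual bugs to the errors they would produce
bug_effects = {"b1": {"e1"}, "b2": {"e2", "e3"}, "b3": {"e4"}}

def coverage(bug_set):
    predicted = set()
    for bug in bug_set:
        predicted |= bug_effects[bug]
    # Score: errors explained minus errors wrongly predicted
    return len(predicted & learner_errors) - len(predicted - learner_errors)

best = max(
    (set(c) for r in range(1, len(bug_effects) + 1)
     for c in combinations(bug_effects, r)),
    key=coverage,
)
print(sorted(best))  # best-matching bug subset -> diagnosis
```

A real system would generate candidates incrementally rather than enumerating all subsets, but the test criterion is the same.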
Diagnostic techniques ...
Interactive diagnosis
Starts with a problem known to cause difficulties for learners
Based on the learner’s answers, generates a new problem that matches the learner’s bugs
The new problem should generate a different diagnosis
LECOBA’s student model
Information is obtained mainly from the interaction with the LC
Updaters and accessors of the SM are written in Prolog
LECOBA’s student model ...
Core of the SM is a three-element list:
[InUse, NotUsed, NotKnown]
Each element is a list of the form: [Rule 1, Rule 2, ..., Rule i]
Each “Rule” is a list of the form: [Rule Name, Status, PV]
Example:
[[[r1,normal,0.50],[r5,normal,0.74]],
 [[r2,notUsed,0.00],[r3,notUsed,0.00]],
 [[r7,notKnown,0.00],[r9,notKnown,0.00]]]
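For illustration, the same structure transcribed into Python (the slides state the actual updaters and accessors are written in Prolog):

```python
# LECOBA's core student-model structure: [InUse, NotUsed, NotKnown],
# where each rule entry is [RuleName, Status, ProficiencyValue].
student_model = [
    [["r1", "normal", 0.50], ["r5", "normal", 0.74]],      # InUse
    [["r2", "notUsed", 0.00], ["r3", "notUsed", 0.00]],    # NotUsed
    [["r7", "notKnown", 0.00], ["r9", "notKnown", 0.00]],  # NotKnown
]

in_use, not_used, not_known = student_model
for name, status, pv in in_use:
    print(name, status, pv)
```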
LECOBA’s student model ...
Codifies two important aspects:
How much the learner knows of a particular rule
How the learner combines the rules seen so far to simplify
PV = Proficiency Value
Represents knowledge of a particular rule
Ranges between 0.00 and 1.00
Rule-use preferences are represented by the position of rules in the InUse list
LECOBA’s student model ...
SM updated in two ways:
Update of PVs
Update of rule order
Example events detected for update:
LC working
Student suggests a rule -> update PV and order
Student asks for justification -> update PV
Student working
Student applies a rule -> update PV and order
Student asks for a suggestion -> update PV
LECOBA’s student model ...
PVs update
Grouped into 5 proficiency grades (0.2 each):
Poor, Fair, Average, Good and Excellent
SMART (Shute, 1995) method used for the update
Obtains a series of regression equations for fast update given an event
One equation for each PV
Each equation is derived from a table of states (decisions)
PV(x) = 0.1505x^3 + 0.3722x^2 + 0.3289x + 0.3
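A sketch of the grade banding and a cubic regression update of the kind described above. The coefficients are read from the slide's garbled equation; the operators between terms were lost in extraction and are assumed positive here:

```python
# PVs fall into five 0.2-wide grades; a regression polynomial maps the
# previous PV to the new one (coefficients reconstructed, signs assumed).
GRADES = ["Poor", "Fair", "Average", "Good", "Excellent"]

def grade(pv):
    # 0.2-wide bands; PV = 1.0 still counts as Excellent
    return GRADES[min(int(pv / 0.2), 4)]

def new_pv(x):
    value = 0.1505 * x**3 + 0.3722 * x**2 + 0.3289 * x + 0.3
    return min(value, 1.0)  # clamp to the PV range

print(grade(0.74))  # 'Good'
```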
LECOBA’s student model ...
Student suggesting correctly:
Previous PV        New grade         New PV
Poor (Low)         Fair (High)       0.30
Poor (High)        Average (Low)     0.40
Fair (Low)         Average (High)    0.50
Fair (High)        Good (Low)        0.60
Average (Low)      Good (High)       0.70
Average (High)     Excellent (Low)   0.80
Good (Low)         Excellent (Low)   0.85
Good (High)        Excellent (High)  0.90
Excellent (Low)    Excellent (High)  1.00
Excellent (High)   Excellent (High)  1.00
LECOBA’s student model ...
Rule order update:
NotUsed is kept in alphabetical order
InUse is ordered using an algorithm based on the work of Kimball (1982)
Two NxN matrices, where N = number of rules
Hold information on how a rule is being used in relation to other rules
First matrix = probability a rule is used before other rules
Second matrix = probability a rule is used after other rules
Updated:
Every step while the student is solving a problem
Each time the student gives a suggestion
Each time the student teaches
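A simplified sketch of the two-matrix bookkeeping: here the matrices hold raw before/after counts per rule pair, which could be normalised into the probabilities the slide describes (the Kimball-style details are not in the slides, so this is an assumption):

```python
# Rule-order tracking: two N x N matrices counting how often rule i is
# applied before / after rule j in the student's solution steps.
N_RULES = 3  # rules r0, r1, r2
before = [[0] * N_RULES for _ in range(N_RULES)]
after = [[0] * N_RULES for _ in range(N_RULES)]

def record_step(sequence):
    # sequence = indices of rules in the order the student applied them
    for i, earlier in enumerate(sequence):
        for later in sequence[i + 1:]:
            before[earlier][later] += 1
            after[later][earlier] += 1

record_step([0, 2, 1])  # student applied r0, then r2, then r1
print(before[0][2], after[1][2])
```

Sorting InUse by these statistics would then reflect the student's rule-use preferences.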