An introduction to intelligent interactive instructional systems Kurt VanLehn ASU.


An introduction to intelligent interactive instructional systems

Kurt VanLehn

ASU

Outline

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Tutoring systems

Other interactive instructional systems

Intelligent “tutoring” system is a misnomer

Almost all are used as seatwork/homework coaches

The instructor still…

– Lectures

– Leads whole class, small group & lab activities

– Assigns grades; defends grades

– Can assign homework / seatwork problems » or delegate to the tutoring system

The instructor no longer…

– Grades homework / seatwork

– Tests?

For-profit web-based homework-grading services are growing rapidly

If students enter only the answer, call it answer-based tutoring

45°

30°

40°

What is the value of x?

x = 25

Answer

If students enter steps that derive the answer, call it step-based tutoring

45°

30°

40°

What is the value of x?

Step

Step

Step

Answer

Step

40+30+y=180

70+y=180

y=110

x+45+y=180

x+45+110=180

x=180-155

x=25

Step

Step

Step

Def: Feedback is a comment on one of the student’s steps

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

Oops! Check your arithmetic.

OK

Feedback is often given as a hint sequence

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

Oops! Check your arithmetic.

OK

Hints become more specific

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

You seem to have made a sign error.

OK

Hints segue from commenting on the student’s step to suggesting a better step

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

Try taking a smaller step.

OK

and become more specific

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

Try doing just one arithmetic operation per step.

OK

Def: A bottom-out hint is the last hint, which tells the student what to enter.

45°

30°

40°

What is the value of x?

40+30+y=180

y = 250

Enter 70+y=180, and keep going from there.

OK

Def: A next step help request is another way to start up a hint sequence.

45°

30°

40°

What is the value of x?

40+30+y=180

help

Try doing just one arithmetic operation per step.

OK

Delayed (as opposed to immediate) feedback occurs when the solution is submitted

45°

30°

40°

What is the value of x?

40+30+y=180

y=250

x+45+y=180

x+45+250=180

x=180–250

x= –70

Delayed (as opposed to immediate) feedback occurs when the solution is submitted

45°

30°

40°

What is the value of x?

40+30+y=180

y=250

x+45+y=180

x+45+250=180

x=180–250

x= –70

Oops! Check your arithmetic.

OK

Can an angle measure be negative?

OK

Both step-based tutors and answer-based tutors have a task loop

Tutor and/or student select a task

Tutor poses it to the student

Student does the task and submits an answer

– If answer-based tutor, then work offline

– If step-based tutor, then work online

» The step loop = Do step; get feedback/hints; repeat

Repeat
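The two nested loops can be sketched in a few lines of code. This is a minimal illustrative sketch, assuming a toy task whose correct step sequence is known in advance; the Task class and the green/red feedback labels are assumptions, not part of any real tutor.

```python
# Minimal sketch of the step loop inside a task. The Task class and the
# green/red feedback labels are illustrative assumptions.

class Task:
    def __init__(self, steps):
        self.steps = steps          # the correct step sequence
        self.next = 0               # index of the next expected step

    def solved(self):
        return self.next == len(self.steps)

def step_loop(task, student_steps):
    """Step loop: do step; get feedback; repeat (immediate feedback)."""
    feedback = []
    for attempt in student_steps:
        if attempt == task.steps[task.next]:
            feedback.append((attempt, "green"))   # green = correct
            task.next += 1
        else:
            feedback.append((attempt, "red"))     # red = incorrect: hint here
        if task.solved():
            break
    return feedback
```

A task loop would simply wrap this: select a task, pose it, run the step loop, repeat. An answer-based tutor would skip the step loop and grade only the submitted answer, offline.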

Technical terms/concepts (so far)

Answer-based tutoring system (= CAI, CBI, …)

Step-based tutoring system (= ITS, ICAI, …)

Step

Next-step help

Feedback

– Immediate

– Delayed

Hint sequence

Bottom-out hint

Task loop

Step loop
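A hint sequence and its bottom-out hint can be modeled as an ordered list that escalates toward the final, most specific hint. In this sketch the hint texts come from the example slides, while the HintSequence class itself is an illustrative assumption.

```python
# An escalating hint sequence; repeated requests stick at the bottom-out hint.

class HintSequence:
    def __init__(self, hints):
        self.hints = hints      # ordered: general -> specific -> bottom-out
        self.i = 0

    def next_hint(self):
        hint = self.hints[min(self.i, len(self.hints) - 1)]
        self.i += 1
        return hint

hints = HintSequence([
    "Oops! Check your arithmetic.",
    "You seem to have made a sign error.",
    "Try doing just one arithmetic operation per step.",
    "Enter 70+y=180, and keep going from there.",   # bottom-out hint
])
```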

Andes user interface

Read a physics problem

Type in equations

Draw vectors

Type in answer

Andes feedback and hints

“What should I do next?”

Green means correct

Red means incorrect

“What’s wrong with that?”

Dialogue & hints

SQL-Tutor (Addison Wesley)

Problem

Step

Step

Step

The database that the problem refers to

Submit!

Feedback

Cognitive Algebra I Tutor (Carnegie Learning)

Problem

Step: Fill in a cell

Step: Label a column

Step: Define an axis

Step: Plot a point

Step: Enter an equation

Step: Divide both sides

The task

Each tutor turn + student turn in the dialogue is a step

Student input is the 2nd half of the step

AutoTutor

Introduction: Summary

Main ideas

– Task loop over tasks

– Step loop over steps of a task

» Feedback can be immediate or delayed

» But it focuses on steps

» Hint sequence

Types of tutoring systems

– Step-based tutors (ITS) – both loops

– Answer-based tutors (CBT, CAI, etc.) – task loop only

Initial framework

Step loop

– User interface

– Interpreting student actions

– Suggesting good actions

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

» Forms, with boxes to be filled

» Dialogue

» Simulation

» Etc.

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

– Interpreting student steps

» Equations

» Typed natural language

» Actions in a simulation

» Etc.

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

» Any correct path vs. shortest path to answer

» Which steps can be skipped?

» Recognize the student’s plan and suggest its next step?

» Etc.

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

» Give a hint before the student attempts a step?

» Immediate vs. delayed feedback? feedback on request?

» How long a hint sequence? When to bottom out immediately?

» Etc.

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

» Keeping the student in the “zone of proximal development” (ZPD)

» Mastery learning: Keep giving similar tasks until students master them

» Choosing a task that suits the learner’s style/attributes

» Etc.

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Next

Assessment vs. Evaluation

“Assessment” of students

– What does the student know?

– How motivated/interested is the student?

“Evaluation” of instructional treatments

– Was the treatment implemented as intended?

– Did it produce learning gains in most students?

– Did it produce motivation gains in most students?

– What is the time cost? Other costs?

Assessment consists of fitting a model to data about the student

Single factor model: A single number representing competence/knowledge

– Probability of a correct answer on a test item = f(competence(student), difficulty(item))

Knowledge component model: One number per knowledge component representing its mastery

– Probability of a correct answer on a test item = f(mastery(KC1), mastery(KC2), mastery(KC3), …), where the KCn are the ones applied in a correct solution
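The function f is left open above. One common hedged choice is a logistic (Rasch-style) form for the single-factor model, and a product of per-KC probabilities for the knowledge component model; both concrete forms below are assumptions for illustration, not prescribed by the slides.

```python
import math

def p_correct_single_factor(competence, difficulty):
    """Single-factor model: logistic in (competence - difficulty)."""
    return 1.0 / (1.0 + math.exp(-(competence - difficulty)))

def p_correct_kc(masteries):
    """KC model: the item is correct only if every applied KC is applied
    correctly, so multiply the per-KC probabilities."""
    p = 1.0
    for m in masteries:             # m = P(KC applied correctly)
        p *= m
    return p
```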

Example: Answer-based assessment of algebraic equation solving skill

Test item: Solve 3+2x=10 for x

– KC5: Subtract from both sides & simplify: 3+2x=10 → 2x=7

– KC8: Divide both sides & simplify: 2x=7 → x=3.5

Single factor model

– If the answer is correct, increment competence; else decrement it

Knowledge component model

– If the answer is correct, increment mastery of KC5 & KC8

– If the answer is incorrect, decrement mastery of KC5 & KC8

» The weakest one is most likely to be the failure, so decrement it more
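These update rules can be sketched directly. The step size of 0.1 and the doubled decrement for the weakest KC are illustrative assumptions; real tutors typically use probabilistic updates instead of fixed increments.

```python
# Sketch of the answer-based updates. The step size and the extra
# decrement for the weakest KC are illustrative choices.

def update_single_factor(competence, correct, step=0.1):
    """Increment competence on a correct answer, else decrement it."""
    return competence + step if correct else competence - step

def update_kc(mastery, kcs, correct, step=0.1):
    """mastery: dict KC -> estimate; kcs: KCs used in a correct solution."""
    if correct:
        for kc in kcs:
            mastery[kc] += step
    else:
        # The weakest KC is the most likely failure, so decrement it more.
        weakest = min(kcs, key=lambda kc: mastery[kc])
        for kc in kcs:
            mastery[kc] -= 2 * step if kc == weakest else step
    return mastery
```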

Step-based assessment of algebraic equation solving skill

Solve 3+2x=10 for x

– Step 1: 2x = 7

– Step 2: x = 3.5

Single factor model

– Whenever a step is answered correctly without hints, increment competence; else decrement

Knowledge component model

– Whenever a step is answered correctly without hints, increment its KC’s mastery; else decrement

Task selection uses assessments

Single factor model

– Choose a task that is the right level of difficulty, i.e., in the ZPD (zone of proximal development) of the student

Knowledge component model

– Choose a task whose solution uses mostly mastered KCs, and only a few KCs that still need to be mastered
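The KC-model selection rule can be sketched as follows; the mastery threshold of 0.95 and the target of one unmastered KC per task are illustrative assumptions.

```python
# Pick the task whose count of unmastered KCs is closest to target_new,
# i.e. mostly mastered KCs plus a few still to be mastered (the ZPD idea).

def select_task(tasks, mastery, threshold=0.95, target_new=1):
    """tasks: dict task name -> list of KCs its solution uses;
    mastery: dict KC -> mastery estimate (missing KCs count as 0)."""
    def unmastered(kcs):
        return sum(1 for kc in kcs if mastery.get(kc, 0.0) < threshold)
    return min(tasks, key=lambda t: abs(unmastered(tasks[t]) - target_new))
```

With tasks using {1 mastered KC}, {1 mastered + 1 new KC}, and {3 new KCs}, this rule picks the middle one: mostly familiar, with one thing left to learn.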

Other assessment issues

Other decisions, besides task selection, that can use assessment?

Assessment of motivation or interest?

Assessment of learning styles? Disabilities?

Diagnosis of misconceptions? Bugs?

Example subtraction items for diagnosing bugs:

807 − 189 → 628 (buggy; the correct answer is 618)

867 − 189 → 678

Should a “Skillometer” display knowledge component mastery to the student?

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Next

Authoring

Author creates new tasks

– Author generates all solutions?

– System generates all solutions?

» Same taste as author?

» Can author add new problem-solving knowledge?

Who can be an author?

– Instructors?

– Professional authors?

– Knowledge engineers?

Software architecture & engineering

Client-server issues

– Platform independence

– Integration with learning management systems

» E.g., Blackboard, WebAssign, many others

– Cheating, privacy

Quality assurance

– Software bugs

– Content & pedagogy bugs

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Next

Types of evaluations

Analyses of expert human tutors

– What do they do that the system should emulate?

Formative evaluation

– What behaviors of the system need to be fixed?

– Have students talk aloud; interview students, teachers, …

Summative evaluation

– Is the system more effective than what it replaces?

– Two-condition experiment: system vs. control/baseline

– Pre-test and post-test (+ other assessments)

Hypothesis testing

– Why is the system effective?

– Multi-condition experiments: system ± feature(s)

Example: Summative evaluation of the Andes physics tutor

University physics (mechanics), 1 semester

2 conditions: Homework done with…

– Andes physics tutor

– Pencil & paper

Same teachers (sometimes), text, exams, labs

Results (2000-2003) in terms of effect sizes

– Experimenter’s post-test: d = 1.2

– Final exam: d = 0.3

– d = (mean_Andes_score – mean_control_score) ÷ pooled_standard_deviation
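The effect size d above is the standard Cohen's d, which can be computed directly from the two groups' scores; the sample scores in the usage below are made up for illustration.

```python
import math

# Cohen's d: (mean of condition 1 - mean of condition 2) divided by the
# pooled standard deviation, matching the slide's formula.

def cohens_d(xs, ys):
    nx, ny = len(xs), len(ys)
    mx, my = sum(xs) / nx, sum(ys) / ny
    vx = sum((x - mx) ** 2 for x in xs) / (nx - 1)   # sample variances
    vy = sum((y - my) ** 2 for y in ys) / (ny - 1)
    pooled_sd = math.sqrt(((nx - 1) * vx + (ny - 1) * vy) / (nx + ny - 2))
    return (mx - my) / pooled_sd
```

With d = 1.2, the average Andes student scored 1.2 pooled standard deviations above the average control student on the experimenter’s post-test.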

Aptitude-treatment interaction (ATI)

                Control instruction   Experimental instruction
Good learner    Large gains           Large gains
Poor learner    Small gains           Large gains

(Good learner = e.g., highly motivated, well prepared…)

Open-response, problem-solving exam scores

[Figure: Scatter plot of z-scores on exam vs. grade-point average (GPA), with linear fits for each condition]

Andes: y = 0.9473x − 2.4138, R² = 0.2882

Controls: y = 0.7956x − 2.5202, R² = 0.2048

Ideal tutoring system adapts to the student’s needs

[Figure: Assistance provided vs. assistance needed (low to high). Large learning gains occur where assistance provided matches assistance needed; too little assistance leaves students struggling, too much leaves them bored & irritated]

Assistance provided = task selection, feedback, hints, user interface…

Not-so-good tutoring system helps only some students

[Figure: Assistance provided vs. assistance needed. The not-so-good tutor matches only part of the range of needs, leaving some students struggling and others bored & irritated]

Assistance provided = task selection, feedback, hints, user interface…

Initial framework

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Next

Dissemination = getting the system into widespread use

Routes

– Post and hope

– Open source

– Commercialization

Issues

– Instructor acceptance

– Instructor training

– Student acceptance

– Marketing

Outline

Step loop

– User interface

– Interpreting student steps

– Suggesting good steps

– Feedback and hints

Task selection

Assessment

Authoring and the software architecture

Evaluations

Dissemination

Next

Tutoring systems

Other interactive instructional systems

Other intelligent interactive instructional systems

Teachable agent

– Student deliberately teaches the system, which is then assessed (in public)

Learning companion

– Student works while system encourages

Peer learner

– Student and system work & learn together

To be discovered…