From proof of concept to evidence of impact: evaluating learning design tools

Presentation to accompany a paper for the Art & Science of Learning Design workshop, 2011

Transcript of From proof of concept to evidence of impact: evaluating learning design tools

Page 1: From proof of concept to evidence of impact: towards a principled approach to evaluating learning design tools

Liz Masterman, University of Oxford

ASLD Workshop, 13th–14th October 2011

Page 2: Evaluating the impact of the LDSE (Learning Design Support Environment): conceptual foundations

Aim: to achieve an impact on lecturers’ practice in designing teaching and learning, particularly in relation to their use of TEL (technology-enhanced learning) where it is relevant and appropriate

Impact: a strong effect on character, development or behaviour

Act and think differently about teaching (Biggs, 2003)

Qualitative transformation, not incremental or quantitative change (Wertsch, 2002)

Page 3: What constitutes evidence of impact?

1. Innovating in relation to one’s current practice
2. Giving due weight to students’ needs
3. Espousing appropriate theories of learning and teaching
4. Engaging in critical reflection on one’s practice
5. Building personal professional knowledge
6. Participating in a community of teachers and contributing to the development of collective professional knowledge

Page 4: Delimiting the field of vision

[Diagram: the pathway from beliefs to changed behaviours regarding TEL, unfolding over time]

Beliefs: a) general pedagogic beliefs (active construction of knowledge); b) beliefs about digital technologies in T&L (potential to benefit learning)

Values and a positive (+ve) attitude

Intentions re TEL: e.g. intends to use a PRS (personal response system) to increase student participation in lectures

Behaviours re TEL: 1st-order change, reversible (uses the PRS once or twice only); then 2nd-order change, irreversible (PRS embedded in T&L practice), which confronts and strengthens both belief a) and belief b), and brings organisational penetration

After Koballa (1988) (from Fishbein & Ajzen, 1975); also Ertmer (2007); Kaufman et al. (1996)

Page 5: Qualitative measurement criteria

1. Awareness: recognising the potential for enhancing practice through engaging in this activity

2. Reactions: feeling positive about enhancing practice in this way

3. Engagement: engaging with enhancing practice in this way…

HEA Evaluation and Impact Assessment Approach (2009)

Page 6: Qualitative measurement criteria (continued)

1. Awareness
2. Reactions
3. Engagement

4. Learning from: learning or developing ideas relevant to their practice

5. Applying the learning: applying what they have learned or developed to their practice

6. Effects on student learning: identifying instances of students learning better as a result of enhanced practice

HEA Evaluation and Impact Assessment Approach (2009)

Page 7: An example

“I got taken along to see what reusable learning objects were and got this lovely example of fulcrum, load and effort and a car crashing into a wall. And […] I thought, ‘Well that’s not what I do because I don’t teach a concept that can be grasped like that.’ And there was this moment, […] I had an epiphany because I suddenly went, ‘Oh, so when I’m teaching that means I could do this!’”

Page 8: Methodological issues

Reliance on self-reports: e.g. from interviews and surveys

Constraints in time and resources mean that only first-order changes in behaviour, or emergent shifts in attitude, can be captured

Describe impact in the process of happening, rather than measure impact as an outcome