Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?


Description

This desktop research commissioned by the Higher Education Academy set out to consult with the academic community about which references on assessment and feedback with technology enhancement were most useful to practitioners. While all the recommended publications may be characterised as reputable and the majority were peer-reviewed (67.7%), only a minority provided quantitative data (28.2%), of which relatively few provided appropriate experimental designs or statistical analysis (18.5%). The majority of publications were practitioner-led case studies. The references that were recommended to us are clearly having an impact on current practice and are found valuable by practitioners. The key messages from these sources are consistent and often give detailed and practical guidance for other academics. We found that most of the recommended literature focused on the goals that technology enhancement can enable assessment and feedback to meet and how assessment and feedback can be designed to make best use of the technology.

Transcript of Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?

Page 1: Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?

Technology-Enhanced Assessment and Feedback: How is evidence-based literature informing practice?

Denise Whitelock 1, Lester Gilbert 2, and Veronica Gale 2

1 Institute of Educational Technology, The Open University, UK

2 Learning Societies Lab in the School of Electronics and Computing Sciences, University

of Southampton

[email protected]

DMW - CAA 2011 - July 2011

Page 2

The Challenge

• Assessment drives learning (Rowntree, 1987)

• How do e-assessment and feedback support student learning?

Page 3

I hate marking but want the tasks and feedback to assist student learning

Page 4

HEA-funded project to:

• Consult the academic community on useful references

• Seminar series

• Survey

• Advisors

• Invited contributors

• Prioritise evidence-based references

• Synthesise main points

• For readers:

• Academics using technology enhancement for assessment and feedback

• Learning technologists

• Managers of academic departments

Page 5

Evidence-based literature

• 142 references

• Technology-enhanced methods

• Use for assessment and feedback

• Type of evidence

• Ease of access (18 could not be retrieved)

Page 6

Categories of evidence used

Category  Description

1a  Peer-reviewed generalizable study providing effect size estimates and which includes (i) some form of control group or treatment (may involve participants acting as their own control, such as before and after), and/or (ii) a blind or preferably double-blind protocol.

1b  Peer-reviewed generalizable study providing effect size estimates, or sufficient information to allow estimates of effect size.

2  Peer-reviewed ‘generalizable’ study providing quantified evidence (counts, percentages, etc.) short of allowing estimates of effect sizes.

3  Peer-reviewed study.

4  Other reputable study providing guidance.

Page 7

Number of references recommended in each evidence category

Evidence category   Number of references recommended   Cumulative %

1a       15    12.1%

1b        8    18.5%

2        12    28.2%

3        49    67.7%

4        40   100.0%

Total   124
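The cumulative percentages follow directly from the counts above; a minimal sketch of the arithmetic, with the category labels and counts taken from the table:

```python
# Number of recommended references per evidence category (from the table above).
counts = {"1a": 15, "1b": 8, "2": 12, "3": 49, "4": 40}

total = sum(counts.values())  # 124 references in total
running = 0
for category, n in counts.items():
    running += n
    print(f"{category}: {n} references, cumulative {100 * running / total:.1f}%")
```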

Page 8

MCQ & EVS use with large learning gains: Draper (2009)

• Assertion-reason questions, e.g. Pauli, Heisenberg, Planck, de Broglie

• Learners generate reasons for and against each answer

• Confidence marking (Gardner-Medwin)

• Mazur's method of brain teasing: role of peers, cognitive conflict

• Students create MCQs for EVS: reflection on counter-arguments before you hear them

• Students create MCQs for the final exam; this will increase exam performance

Page 9

Mobile Technologies and Assessment

• MCQs, PDAs: Valdivia & Nussbaum (2009)

• Polls, instant surveys

• Simpson & Oliver (2007)

• Draper (2009) EVS

Page 10

CAP peer assessment system, BSc. Network Management & Security (Intl.), Glamorgan, Phil Davies

Page 11

Peer Assessment and the WebPA Tool

• Loughborough (Loddington et al., 2009)

• Self assess and peer assess with given criteria

• Group mark awarded by tutor

• Students rated:

• More timely feedback

• Reflection

• Fair rewards for hard work

• Staff rated:

• Time savings

• Administrative gains

• Automatic calculation

• Students have faith in the administrative system
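The "automatic calculation" WebPA performs can be sketched as follows. This is an illustrative implementation of the commonly described scheme (normalise each rater's scores to sum to one, average the share each member receives across raters, and scale the tutor's group mark by that factor); the exact algorithm is documented with the WebPA tool itself, and the function and data names here are invented for illustration.

```python
def webpa_marks(ratings, group_mark):
    """ratings[rater][ratee] = score awarded (self-assessment included).

    Returns an individual mark per member: the tutor's group mark scaled
    by each member's peer-assessment weighting factor (a factor of 1.0
    corresponds to an exactly equal contribution)."""
    members = list(ratings)
    # Normalise each rater's scores so they sum to 1.
    shares = {
        rater: {ratee: s / sum(scores.values()) for ratee, s in scores.items()}
        for rater, scores in ratings.items()
    }
    marks = {}
    for member in members:
        # Average share of the credit this member received across all raters.
        avg_share = sum(shares[rater][member] for rater in members) / len(members)
        marks[member] = round(group_mark * avg_share * len(members), 1)
    return marks
```

With equal ratings all round, every member simply receives the group mark; unequal ratings shift marks towards the members rated as contributing more (real deployments typically cap the result at 100%).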

Page 12

Authentic assessments: e-portfolios

Electronic NVQ portfolio cover contents page, OCR IT Practitioner, EAIHFE, Robert Wilsdon

Page 13

Candidate Assessment Records section, OCR IT Practitioner, EAIHFE, Robert Wilsdon

Page 14

Building e-portfolios on a chef’s course

Food preparation for e-portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill

Evidence of food preparation skill for e-portfolio, Modern Apprenticeship in Hospitality and Catering, West Suffolk College, Mike Mulvihill

Page 15

Sharing e-portfolios: The Netfolio concept

• Social constructivism

• Connecting e-portfolios (Barbera, 2009)

• Share and build upon a joint body of evidence

• Trialled with 31 PhD students at a virtual university

• Control group used; the Netfolio group obtained higher grades

• Greater visibility of revision process and peer assessment in the Netfolio system

Page 16

MCQs: Variation on a theme (1)

The question is an example of a COLA assessment used at Reid Kerr College, Paisley. It is a multiple-response question used in one of their modules.

The question was developed using Questionmark Perception at the University of Dundee. It is part of a set of formative assessments for medical students.

Page 17

MCQs: Variation on a theme (2)

Example of LAPT Certainty-Based Marking, UK cabinet ministers demo exercise showing feedback, University College London, Tony Gardner-Medwin

Drug Chart Errors and Omissions, Medicines Administration Assessment, Chesterfield Royal Hospital
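The certainty-based marking idea behind LAPT can be illustrated with a short sketch. The mark values (1/2/3 for a correct answer at certainty levels 1-3, and 0/-2/-6 for a wrong one) follow Gardner-Medwin's published LAPT scheme, but treat the snippet as an illustration rather than the production implementation:

```python
# Certainty-based marking: students choose a certainty level (1-3) with each
# answer. Higher certainty earns more when right and costs more when wrong,
# so the best strategy is to report confidence honestly.
CBM_MARKS = {
    1: (1, 0),    # low certainty: +1 if correct, 0 if wrong
    2: (2, -2),   # medium certainty: +2 if correct, -2 if wrong
    3: (3, -6),   # high certainty: +3 if correct, -6 if wrong
}

def cbm_score(correct, certainty):
    """Mark a single answer under certainty-based marking."""
    if_right, if_wrong = CBM_MARKS[certainty]
    return if_right if correct else if_wrong
```

Under this scheme a student who is only 60% sure of an answer scores better, on average, by declaring a low certainty than by bluffing certainty 3.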

Page 18

Self-diagnosis

• Basic IT skills, first year med students (Sieber, 2009)

• Competency based testing

• Repeating tests for revision

• Enables remedial intervention

Page 19

Students want more support with assessment

• More Feedback

• Quicker Feedback

• Full Feedback

• User friendly Feedback

• And more ... (source: National Student Survey)

Page 20

Gains from Interactivity with Feedback: Formative Assessment

• Mean effect size on standardised tests of 0.4 to 0.7 (Black & Wiliam, 1998)

• Particularly effective for students who have not done well at school http://kn.open.ac.uk/document.cfm?docid=10817

• Can keep students to timescale and motivate them

• How can we support our students to become more reflective learners and enter a digital discourse?
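For readers unfamiliar with effect sizes: the 0.4-0.7 range refers to a standardised mean difference (Cohen's d), i.e. the gain measured in pooled standard deviation units. A minimal sketch of that calculation, on invented score lists:

```python
from statistics import mean, stdev

def cohens_d(treatment, control):
    """Standardised mean difference between two groups of test scores,
    using the pooled standard deviation."""
    n1, n2 = len(treatment), len(control)
    pooled_var = ((n1 - 1) * stdev(treatment) ** 2 +
                  (n2 - 1) * stdev(control) ** 2) / (n1 + n2 - 2)
    return (mean(treatment) - mean(control)) / pooled_var ** 0.5
```

A d of 0.5 means the formatively assessed group scored, on average, half a standard deviation higher than the comparison group.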

Page 21

LISC: Ali Fowler

• Kent University ab-initio Spanish module

• Large student numbers

• Skills-based course

• Provision of sufficient formative assessment meant unmanageable marking loads

• Impossible to provide immediate feedback

• leading to fossilisation of errors

Page 22

The LISC solution, developed by Ali Fowler: a CALL system designed to enable students to:

• Independently practise sentence translation

• Receive immediate (and robust) feedback on all errors

• Attend immediately to the feedback (before fossilisation can occur)

Page 23

How is the final mark arrived at in the LISC System?

• The two submissions are unequally weighted

• Best to give more weight to the first attempt

• since this ensures that students give careful consideration to the construction of their first answer

• but can improve their mark by refining the answer

• The marks ratio can vary (depending on assessment/feedback type)

• the more information given in the feedback, the lower the weight the second mark should carry
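The two-attempt scheme amounts to a weighted average of the two marks. A minimal sketch; the 70/30 ratio below is purely illustrative, since as the slide notes the appropriate ratio depends on how much the feedback gives away:

```python
def lisc_final_mark(first_attempt, second_attempt, first_weight=0.7):
    """Combine marks from two submissions of the same exercise.

    The first attempt carries more weight so that students construct their
    initial answer carefully; the second attempt still rewards acting on
    the feedback. The richer the feedback, the lower the weight the second
    attempt should carry."""
    return first_weight * first_attempt + (1 - first_weight) * second_attempt
```

For example, 60% on the first attempt improved to 90% on the second yields a final mark of 69% under a 70/30 split.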

Page 24

Heuristics for the final mark

• If the ratio is skewed too far in favour of the first attempt…

• students are less inclined to try hard to correct non-perfect answers

• If the ratio is skewed too far in favour of the second attempt…

• students exhibit less care over the construction of their initial answer

Page 25

Free text entry

• IAT (Jordan & Mitchell, 2009)

• Open Comment (Whitelock & Watt, 2008) http://kn.open.ac.uk/public/document.cfm?docid=11638

• McFeSPA system (Kochakornjarupong & Brna, 2010)

• Supports teaching assistants to mark and give feedback on undergraduate computer programming assignments

• Support tool for semi-automated marking and scaffolding of feedback

Page 26

McFeSPA system

Supports teaching assistants to mark and give feedback on undergraduate computer programming assignments

Support tool for semi-automated marking and scaffolding of feedback

Findings suggest that the feedback model would be helpful in training tutors, similar to the Open Comment findings

Page 27

Audio Feedback (Middleton & Nortcliffe, 2010)

1. Timely and meaningful

2. Manageable for tutors to produce and the learner to use

3. Clear in purpose, adequately introduced and pedagogically embedded

4. Technically reliable and not adversely determined by technical constraints or difficulties

5. Targeted at specific students, groups or cohorts, addressing their needs with relevant points in a structured way

6. Produced within the context of local assessment strategies and in combination, if appropriate, with other feedback methods using each medium to good effect

7. Brief, engaging and clearly presented, with emphasis on key points that demand a specified response from the learner

8. Of adequate technical quality to avoid technical interference in the listener’s experience

9. Encouraging, promoting self esteem

10. Formative, challenging and motivational

Page 28

Characteristic  Descriptor

Authentic Involving real-world knowledge and skills

Personalised Tailored to the knowledge, skills and interests of each student

Negotiated Agreed between the learner and the teacher

Engaging Involving the personal interests of the students

Recognise existing skills Willing to accredit the student’s existing work

Deep Assessing deep knowledge – not memorization

Problem oriented Original tasks requiring genuine problem solving skills

Collaboratively produced Produced in partnership with fellow students

Peer and self assessed Involving self reflection and peer review

Tool supported Encouraging the use of ICT

Elliott’s characteristics of Assessment 2.0 activities

Advice for Action

Page 29

Creating teaching and learning dialogues: towards guided learning supported by technology

• Learning to judge

• Providing reassurance

• Providing a variety of signposted routes to achieve learning goals

Page 30

Key Messages

Effective regular, online testing can encourage student learning and improve their performance in tests (JISC, 2008)

Automated marking can be more reliable than human markers and there is no medium effect between paper and computerized exams (Lee and Weerakoon, 2001)

The success of assessment and feedback with technology-enhancement lies with the pedagogy rather than the technology itself; technology is an enabler (Draper, 2009)

Page 31

Key Messages 2

Technology-enhanced assessment is not restricted to simple questions and clear-cut right and wrong answers; much more sophisticated questions are being used as well (Whitelock & Watt, 2008)

The design of appropriate and constructive feedback plays a vital role in the success of assessment, especially assessment for learning. (Beaumont, O’Doherty & Shannon, 2008)

Page 32

Key Messages 3

Staff development is essential to the process (Warburton, 2009)

Prepare students to take assessments that use technology enhancement by practising with similar levels of assessment using the same equipment and methods (Shephard et al., 2006)

The reports generated by many commercial technology-enhanced assessment systems are very helpful in checking the reliability and validity of each test item and of the test as a whole (McKenna and Bull, 2000)

Page 33

References

Beaumont, C., O’Doherty, M., and Shannon, L. (2008). Staff and student perceptions of feedback quality in the context of widening participation, Higher Education Academy. Retrieved May 2010 from: http://www.heacademy.ac.uk/assets/York/documents/ourwork/research/Beaumont_Final_Report.pdf.

Draper, S. (2009). Catalytic assessment: understanding how MCQs and EVS can foster deep learning. British Journal of Educational Technology, 40(2), 285-293.

JISC, HE Academy, and ALT (2008). Exploring Tangible Benefits of e-Learning. Retrieved in May 2010 from http://www.jiscinfonet.ac.uk/publications/info/tangible-benefits-publication.

Lee, G. and Weerakoon, P. (2001). The role of computer-aided assessment in health professional education: a comparison of student performance in computer-based and paper-and-pen multiple-choice tests. Medical Teacher, Vol. 23, No. 2, 152-157.

McKenna, C. and Bull, J. (2000). Quality assurance of computer-assisted assessment: practical and strategic issues. Quality Assurance in Education. 8(1), 24-31.

Page 34

References 2

Middleton, A. and Nortcliffe, A. (2010) ‘Audio feedback design: principles and emerging practice’, in D. Whitelock and P. Brna (eds) Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’, Int. J. Continuing Engineering Education and Life-Long Learning, Vol. 20, No. 2, pp. 208-223.

Shephard, K., Warburton, B., Maier, P. and Warren, A. (2006). Development and evaluation of computer-assisted assessment in higher education in relation to BS7988. Assessment & Evaluation in Higher Education, 31(5), 583-595.

Strang, K.D. (2010) ‘Measuring self regulated e-feedback, study approach and academic outcome of multicultural university students’, in D. Whitelock and P. Brna (eds) Special Issue ‘Focusing on electronic feedback: feasible progress or just unfulfilled promises?’, Int. J. Continuing Engineering Education and Life-Long Learning, Vol. 20, No. 2, pp. 239-255.

Warburton, B. (2009). Quick win or slow burn: modelling UK HE CAA uptake, Assessment & Evaluation in Higher Education, 34(3), 257-272.

Whitelock, D. and Watt, S. (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3, pp. 153-156.

Page 35

Four Assessment Special Issues

Brna, P. & Whitelock, D. (Eds.) (2010) Special Issue of International Journal of Continuing Engineering Education and Life-long Learning, Focusing on electronic feedback: Feasible progress or just unfulfilled promises? Volume 20, No. 2

Whitelock, D. (Ed.) (2009) Special Issue on e-Assessment: Developing new dialogues for the digital age. British Journal of Educational Technology, Volume 40, No. 2

Whitelock, D. and Watt, S. (Eds.) (2008). Reframing e-assessment: adopting new media and adapting old frameworks. Learning, Media and Technology, Vol. 33, No. 3

Whitelock, D. and Warburton, B. (In Press). Special Issue of International Journal of e-Assessment (IJEA) entitled ‘Computer Assisted Assessment: Supporting Student Learning’
