Transcript of: Assessment Tomorrow, Robert Coe (@ProfCoe), Centre for Evaluation and Monitoring (CEM), Durham University

  • Slide 1
  • Assessment Tomorrow. Robert Coe (@ProfCoe), Centre for Evaluation and Monitoring (CEM), Durham University. Assessment Tomorrow Conference, Edinburgh, 22nd November 2012.
  • Slide 2
  • Why are we here? CEM aims to: create the best assessments in the world; empower teachers with information for self-evaluation; promote evidence-based practices and policies, based on scientific evaluation; and help educators measurably improve educational outcomes.
  • Slide 3
  • CEM activity: the largest educational research unit in a UK university; 1.1 million assessments taken each year; more than 50% of UK secondary schools use one or more CEM system; CEM systems used in over 50 countries; the largest provider of computerised adaptive tests outside the US.
  • Slide 4
  • Outline: assessment is the most powerful lever we have. Quality matters: valid, standardised, secure, informative. Technology can make assessment efficient, diagnostic, embedded, and fun.
  • Slide 5
  • Good assessment: makes learning visible; makes us focus on learning; allows us to evaluate what students do and don't know, against appropriate norms, and the effectiveness of teaching; allows us to diagnose specific learning needs.
  • Slide 6
  • EEF Toolkit: a chart plotting interventions by cost per pupil (£0 to £1,000) against effect size (0 to 10 months' gain). Interventions shown include feedback, meta-cognitive strategies, peer tutoring, pre-school, 1-1 tutoring, homework, ICT, AfL, parental involvement, sports, summer schools, after school, individualised learning, learning styles, arts, performance pay, teaching assistants, smaller classes and ability grouping, each rated as promising, may be worth it, or not worth it. http://www.educationendowmentfoundation.org.uk/toolkit
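
To make the vertical axis concrete: an effect size is a standardised mean difference between an intervention group and a comparison group, which the EEF then translates into months of additional progress. The sketch below computes Cohen's d on made-up score data; it is only an illustration of the underlying statistic, not the EEF's own conversion to months.

```python
import numpy as np

def cohens_d(treatment, control):
    """Standardised mean difference (Cohen's d) between two groups."""
    t = np.asarray(treatment, dtype=float)
    c = np.asarray(control, dtype=float)
    # Pooled variance, weighting each group by its degrees of freedom
    pooled_var = ((len(t) - 1) * t.var(ddof=1) + (len(c) - 1) * c.var(ddof=1)) / (len(t) + len(c) - 2)
    return (t.mean() - c.mean()) / np.sqrt(pooled_var)

# Made-up scores for a hypothetical intervention group and comparison group
rng = np.random.default_rng(0)
tutored = rng.normal(55, 10, 100)
untutored = rng.normal(50, 10, 100)
print(f"Effect size (Cohen's d) = {cohens_d(tutored, untutored):.2f}")
```
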
  • Slide 7
  • Definition of a grade: "an inadequate report of an inaccurate judgment by a biased and variable judge of the extent to which a student has attained an undefined level of mastery of an unknown proportion of an indefinite amount of material." Dressel (1983)
  • Slide 8
  • Would you let this test into your classroom? How clearly defined are the acceptable interpretations and uses of test scores? How well do the test scores predict later performance? Do repeated administrations of the test give consistent results? What does the test claim to measure? Do the test items look appropriate? How well do the test scores correlate with other measures of the same thing? How long does the test (or each element of it) take each student? Do the responses have to be marked, and how much time is needed for this? How well does the measure correspond with measures of the same and related constructs, using the same and other methods of assessment? Do test scores reflect factors other than the intended construct (such as gender, social class, race/ethnicity)? Does the test discriminate adequately between different levels of performance?
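
Two of the questions on this slide, whether repeated administrations give consistent results and whether scores correlate with other measures of the same thing, come down to correlations. The sketch below estimates both on simulated data; the scores and noise levels are made up purely for illustration.

```python
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(1)

# Hypothetical data: 200 students sit the test twice and also take
# an established measure of the same construct.
true_ability = rng.normal(0, 1, 200)
test_time1 = true_ability + rng.normal(0, 0.4, 200)
test_time2 = true_ability + rng.normal(0, 0.4, 200)
other_measure = true_ability + rng.normal(0, 0.6, 200)

# "Do repeated administrations give consistent results?" (test-retest reliability)
r_retest, _ = pearsonr(test_time1, test_time2)
# "How well do scores correlate with other measures of the same thing?" (convergent validity)
r_convergent, _ = pearsonr(test_time1, other_measure)

print(f"Test-retest reliability: r = {r_retest:.2f}")
print(f"Convergent validity:     r = {r_convergent:.2f}")
```
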
  • Slide 9
  • Computer Adaptive Testing: right answers lead to harder questions, wrong answers to easier questions. It can give the same information in half the time, is more accurate at the extremes, and makes for a more pleasant testing experience. But it needs access to computers, and development costs are higher.
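
A minimal sketch of the adaptive idea on this slide: correct answers move the student up a difficulty level, incorrect answers move them down. The item bank and simulated respondent are made up, and this is only an illustration of the principle, not CEM's actual item-selection algorithm (real systems typically use item response theory).

```python
import math
import random

def adaptive_test(item_bank, answer_fn, n_items=10):
    """Minimal adaptive loop: right answers -> harder items, wrong answers -> easier items.

    item_bank: dict mapping an integer difficulty level to a list of questions.
    answer_fn: callable(question, difficulty) -> True if answered correctly.
    """
    levels = sorted(item_bank)
    level = levels[len(levels) // 2]     # start in the middle of the difficulty range
    history = []
    for _ in range(n_items):
        question = random.choice(item_bank[level])
        correct = answer_fn(question, level)
        history.append((level, correct))
        idx = levels.index(level)
        # Step up one difficulty level after a right answer, down after a wrong one
        level = levels[min(idx + 1, len(levels) - 1)] if correct else levels[max(idx - 1, 0)]
    return history

# Made-up item bank and a simulated student of ability 6 on a 1-10 difficulty scale
bank = {d: [f"question at difficulty {d}"] for d in range(1, 11)}
student = lambda question, difficulty: random.random() < 1 / (1 + math.exp(difficulty - 6))
print(adaptive_test(bank, student))
```
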
  • Slide 10
  • PIPS Baseline: start of school
  • Slide 11
  • InCAS: diagnostic assessment through primary school
  • Slide 12
  • Computer Adaptive Baseline Test
  • Slide 13
  • In the future, technology allows: teachers to author, share and evaluate test items; home-made tests with standardised norms; adaptive presentation; automatic marking of complex responses; platforms for efficient and quality-controlled human judgement (marking); cheat detection; sophisticated feedback to students and teachers.
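
One of these possibilities, home-made tests with standardised norms, can be illustrated briefly: given a reference (norm) sample of raw scores on a teacher-authored test, any new raw score can be re-expressed as a standardised score and a percentile rank. The sketch below uses made-up data and assumes roughly normally distributed scores; it is not CEM's norming procedure.

```python
import numpy as np
from scipy.stats import norm

def standardise(raw_score, norm_sample, mean=100, sd=15):
    """Express a raw score as a standardised score and percentile rank
    relative to a reference (norm) sample of raw scores."""
    sample = np.asarray(norm_sample, dtype=float)
    z = (raw_score - sample.mean()) / sample.std(ddof=1)
    standardised = mean + sd * z      # e.g. a conventional 100/15 scale
    percentile = 100 * norm.cdf(z)    # assumes roughly normal raw scores
    return standardised, percentile

# Made-up norm sample for a hypothetical teacher-authored test (raw marks out of 50)
rng = np.random.default_rng(2)
reference_scores = rng.normal(32, 6, 500)
score, pct = standardise(41, reference_scores)
print(f"Standardised score: {score:.0f}, percentile rank: {pct:.0f}")
```
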