Building Testing Committees that have the Authority to Create Effective Change


Transcript of Building Testing Committees that have the Authority to Create Effective Change


An ExamSoft Client Webinar


Ainslie T. Nibert, PhD, RN, FAAN

January 20, 2015


Why form a Testing Committee? Higher Education Trends

• Increased demand for measurable achievement of program outcomes
• Surveillance for gaps in the curriculum is becoming an ongoing need
• Avoidance of an inadequate or late response to academically at-risk students
  ◦ Consequences affecting their admission, progression, and graduation
• Increasing faculty retirements leave inexperienced faculty stretched thin
  ◦ Curriculum evaluation is a low priority until re-accreditation looms
  ◦ Less experienced faculty lack skills in test item writing and exam analysis
  ◦ Administration: testing policies lack consistency in their adoption, maintenance, and revision; policy-making should be supported by evidence


Testing Policies & Procedures: Evidence-based & Consistent

Well-written test policies and procedures, carried out consistently, ensure:

• Defensibility of tests used in each course across the curriculum
  ◦ Test blueprint aligns with course objectives
  ◦ Standard scoring procedures are used for each test administration
• Policies are constructed using the APA's Code and are applied consistently
  ◦ Statistical item analysis is systematic and applied routinely to each test
  ◦ Procedures for item nullification, rescoring, and second-grader review are set up
  ◦ Mechanisms for documenting test outcomes and analysis are established
• Consistent requirements established across all courses
  ◦ Computerized reports used whenever possible to streamline effort for the faculty
  ◦ Code of Fair Testing Practices in Education by the American Psychological Association applied
  ◦ See http://www.apa.org/science/programs/testing/fair-code.aspx


A need for synergy: Testing Policies relating to Admission, Progression, and Graduation

Testing Committee and Student Affairs Committee: policy alignment

• The testing committee's recommendations about testing policy can overlap with policies adopted by the Student Affairs Committee:
  ◦ Grading policies
  ◦ Student grievance policies
  ◦ Admission requirements
  ◦ Conditions students must meet to be considered candidates for re-admission to the program
    – Following academic failure of a course or courses
    – Following withdrawal from a course, or from the program, for any reason
  ◦ Progression requirements
  ◦ Graduation requirements

The testing committee does not formulate policy independently, but reports recommendations to the curriculum committee for full faculty approval.


Curriculum evaluation: Use aggregate student test data to find/close the gaps

• Total program evaluation and curriculum revisions should be accomplished using a systematic, evidence-based approach.
  ◦ Use aggregate student response data to establish benchmarks; track each cohort's performance against the benchmarks and compare to previous cohorts (see the sketch below)
  ◦ Use aggregate student response data gathered longitudinally from multiple cohorts to identify weaknesses consistently seen by course, or by concept across multiple courses; seek sources/causes of program weaknesses
• When curriculum evaluations are substantiated by both faculty expert opinion (subjective data) and aggregate student test results (scoring data and item analysis statistics that substantiate test reliability and validity: objective data), the resulting revisions are specifically targeted at areas of program weakness.
  ◦ Saves faculty time by pinpointing areas of concern and streamlining revisions
  ◦ Completes the "evaluation loop": just collecting the data is insufficient
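To make the benchmark tracking concrete, here is a minimal Python sketch; the cohort scores and the 75% benchmark are illustrative assumptions, not figures from the webinar.

```python
# Minimal sketch: track each cohort's mean exam score against a program
# benchmark and against the prior cohort. The cohort scores and the
# 75% benchmark are hypothetical.
from statistics import mean

BENCHMARK = 75.0  # hypothetical program benchmark (percent correct)

cohorts = {  # hypothetical aggregate exam scores by cohort
    "Fall 2013":   [78, 81, 69, 74, 88, 72],
    "Spring 2014": [71, 83, 66, 79, 75, 70],
    "Fall 2014":   [80, 77, 73, 85, 79, 76],
}

previous = None
for name, scores in cohorts.items():
    avg = mean(scores)
    status = "meets" if avg >= BENCHMARK else "below"
    trend = f" ({avg - previous:+.1f} vs. prior cohort)" if previous is not None else ""
    print(f"{name}: mean {avg:.1f} -> {status} benchmark{trend}")
    previous = avg
```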


Avoid the "too little, too late" response for the at-risk student

Testing committees can make recommendations about individual student interventions, but should not usurp the role of Student Affairs.

• Recommend updates to scoring benchmarks following data analysis
  ◦ Create or adopt tools for longitudinal student data collected in each course
  ◦ Require course coordinators to submit the datasets each semester
  ◦ Establish remediation plans; create student learning contract templates
  ◦ Formulate tools that summarize/document results following implementation of individual student action plans to improve performance
• Coordinate with the Student Affairs and Curriculum (Faculty) Committees
  ◦ Recommend changes to admission, progression, and graduation policies


Commitment to adherence to Best Practices in Testing

• Ensure that the testing committee has proper representation and authority
  ◦ Representation across all levels of the program
  ◦ Sufficient resources to stay abreast of best practices and elevate faculty skills
• Make the committee a sub-committee of the curriculum committee or another faculty committee that represents the entire faculty
  ◦ Policies and procedures are recommended by the testing committee but approved by the overall faculty committee
• Enforcement: item bank, exam bank, style guide, etc.
  ◦ The committee must have an avenue through the administrative hierarchy to enforce the policies it establishes and to carry out consequences when policies are ignored or thwarted


Best Practice #1: Establish a Testing Committee

• Design and revise the program's testing policies
  ◦ Publish Guidelines for Exam Development
  ◦ Writing style protocol: apply to all tests
  ◦ Critical thinking items: increase the proportion at the Application level
• Exam Administration Policies
  ◦ Number of items per exam; length of time allotted per exam
  ◦ Proctoring guidelines
  ◦ Dissemination of grades (When? How?)
  ◦ Test review: to do or not to do; how to do it

…more Testing Committee Activities

• Systematic Item Analysis (see also #3 below)
  ◦ Use statistical parameters for analyzing the overall test and individual test items
  ◦ Adhere to published psychometric standards
  ◦ Summarize the analysis and include an action plan
  ◦ Review the test blueprint as needed before the next test
  ◦ Review individual items earmarked for attention
• Compile/analyze standardized test results
  ◦ Make recommendations to the curriculum committee for changes based on findings


Best Practice #2: Writing Style Protocol

Establishing clear guidelines for faculty leaves little room for ambiguity and helps ensure uniformity in the presentation of exams throughout the curriculum:

• Will present or past tense be used for test items?
• Will options end in periods, whether or not they are complete sentences?
• Will all options begin with a capital letter?
• When stressing a word in the stem, will it be highlighted, boldfaced, italicized, or underlined?
• Will the term patient or client be used?

Best Practice #3: Increase the proportion of Application-and-above Items

Sample stems written at the Application level and above:

• Which intervention is most important?
• Which intervention, plan, or assessment data is/are most critical to developing a plan of care?
• Which intervention should be done first?
• What action should the nurse take first?
• Which intervention, plan, or nursing action has the highest priority?
• What response is best?


Best Practice #4: Use Uniform Statistical Parameters for Item Analysis

• What is the minimum acceptable difficulty level for a test item?
• What is an acceptable discrimination level (PBCC) for a test item?
• What is the acceptable number of mastery items to include?
• What is an acceptable reliability coefficient for the exam?


Systematic Item Analysis used for all tests: Use a 3-Step Method

1. Review Difficulty Level
2. Review Discrimination Data
   ◦ Item Discrimination Ratio (IDR)
   ◦ Point Biserial Correlation Coefficient (PBCC)
3. Review Effectiveness of Alternatives
   ◦ Response Frequencies
   ◦ Non-distracters

(See the sketch below.)

Morrison, Nibert, & Flick (2006)
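As a rough illustration of steps 1 and 2, here is a minimal Python sketch over a hypothetical 0/1 scoring matrix (rows are examinees, columns are items). It is not ExamSoft's or the presenter's implementation; step 3 is sketched separately after the response-frequency slide below.

```python
# Minimal sketch of steps 1 and 2 on a hypothetical 0/1 scoring matrix
# (rows = examinees, columns = items). Illustrative only.
from statistics import mean, pstdev

scores = [  # 1 = correct, 0 = incorrect (hypothetical data)
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
]
totals = [sum(row) for row in scores]  # each examinee's total score

def difficulty(item):
    """Step 1: difficulty level = proportion answering the item correctly."""
    return mean(row[item] for row in scores)

def pbcc(item):
    """Step 2: point biserial correlation between item score and total score."""
    col = [row[item] for row in scores]
    cov = mean(c * t for c, t in zip(col, totals)) - mean(col) * mean(totals)
    s_item, s_total = pstdev(col), pstdev(totals)
    # An item everyone answers correctly (a mastery item) has zero variance,
    # so no discrimination statistic can be computed for it.
    return cov / (s_item * s_total) if s_item and s_total else float("nan")

for i in range(len(scores[0])):
    print(f"Item {i + 1}: difficulty {difficulty(i):.2f}, PBCC {pbcc(i):.2f}")
```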

Use these Recommended Item Analysis Standards

Parameter                                        Level of Acceptance
Item Difficulty                                  30%-90%
Item Discrimination Ratio                        25% and above
PBCC (point biserial correlation coefficient)    0.20 and above
KR-20 (Kuder-Richardson 20)                      0.70 and above

Morrison, Nibert, & Flick (2006)
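A minimal sketch of applying these acceptance levels to flag items for review; the item statistics themselves are hypothetical.

```python
# Minimal sketch: flag items against the acceptance levels in the table
# above. The (id, difficulty %, IDR %, PBCC) tuples are hypothetical.
items = [
    ("Q1", 85, 40, 0.31),
    ("Q2", 22, 18, 0.05),
    ("Q3", 95, 30, 0.24),
]

for item_id, diff, idr, pbcc in items:
    flags = []
    if not 30 <= diff <= 90:
        flags.append("difficulty outside 30%-90%")
    if idr < 25:
        flags.append("IDR below 25%")
    if pbcc < 0.20:
        flags.append("PBCC below 0.20")
    print(item_id, "-> OK" if not flags else "-> " + "; ".join(flags))
```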

PBCC & KR-20: Standards of Acceptance for Nursing versus General Education exams

• PBCC: 0.15 and above
• KR-20: 0.60-0.65 and above

Morrison, Nibert, & Flick (2006)
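Since KR-20 is the reliability coefficient these floors refer to, here is a minimal sketch of its standard formula, KR-20 = (k / (k - 1)) * (1 - sum(p_i * q_i) / variance of total scores), computed on a hypothetical 0/1 scoring matrix.

```python
# Minimal sketch of KR-20 (Kuder-Richardson 20) on a hypothetical 0/1
# scoring matrix, for comparison against the 0.70 or 0.60-0.65 floors above.
from statistics import mean, pvariance

def kr20(scores):
    k = len(scores[0])                      # number of items
    totals = [sum(row) for row in scores]   # each examinee's total score
    var_total = pvariance(totals)
    # sum of p*q (proportion correct * proportion incorrect) per item
    pq = 0.0
    for i in range(k):
        p = mean(row[i] for row in scores)
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var_total)

scores = [  # hypothetical data: rows = examinees, columns = items
    [1, 1, 0, 1, 1],
    [1, 0, 0, 1, 0],
    [0, 1, 1, 1, 1],
    [1, 1, 1, 1, 1],
    [0, 0, 0, 1, 0],
]
print(f"KR-20 = {kr20(scores):.2f}")  # prints 0.69 for this sample
```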

Analyze Student Response Frequencies

• Target revisions of options with '0' responses. These options are not plausible; even poor performers know enough to avoid these choices on the exam, which increases the prospect of success by guessing.
• When an option is chosen by fewer than 30 test takers, this generally indicates poor item discrimination, and the item should be edited. If fewer than 30 test takers took the test, frequencies are not reliable; cumulative data should be used. (See the sketch below.)
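A minimal sketch of this frequency check; the option counts are hypothetical, and per the slide, data from fewer than 30 test takers are pooled cumulatively before acting.

```python
# Minimal sketch: flag implausible options from response frequencies.
# Option counts are hypothetical; '*' marks the keyed (correct) answer.
MIN_TEST_TAKERS = 30  # minimum sample for reliable frequencies (per slide)

responses = {
    "Q1": {"A*": 42, "B": 11, "C": 0, "D": 7},
    "Q2": {"A": 8, "B*": 12, "C": 5, "D": 1},
}

for item, options in responses.items():
    n = sum(options.values())
    if n < MIN_TEST_TAKERS:
        print(f"{item}: only {n} test takers -> use cumulative data")
        continue
    for opt, count in options.items():
        if not opt.endswith("*") and count == 0:
            print(f"{item} option {opt}: 0 responses -> implausible, revise")
```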


Make Curricular Evaluation consistent by adopting a Master Test Blueprint Format

Generate an electronic blueprint (using the master format) for each exam in each course.
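One way to hold an electronic blueprint is as a simple data structure that each exam is checked against; a minimal sketch follows. The content categories and target shares are hypothetical, not the webinar's master format.

```python
# Minimal sketch: represent a master test blueprint electronically and
# compare one exam's item tags against it. Categories/targets are hypothetical.
master_blueprint = {  # category -> required share of items
    "Safe & Effective Care":   0.30,
    "Health Promotion":        0.20,
    "Physiological Integrity": 0.40,
    "Psychosocial Integrity":  0.10,
}

exam_items = (  # category tag assigned to each item on one exam (hypothetical)
    ["Safe & Effective Care"] * 14 + ["Health Promotion"] * 9
    + ["Physiological Integrity"] * 22 + ["Psychosocial Integrity"] * 5
)

n = len(exam_items)
for category, target in master_blueprint.items():
    actual = exam_items.count(category) / n
    print(f"{category}: {actual:.0%} of items (target {target:.0%})")
```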


Test Review Summary

Summary: Testing 'Best Practices'

• Improve Test Blueprints
• Adopt a uniform item style/format; publish the guidelines
• Invert the proportion of Application-and-above test items as compared to Knowledge/Comprehension items
• Use a systematic item analysis methodology
• Include 10% alternative item formats, across all tests
• Remove items that no longer align with national standards or current clinical practice, which should be reflected in your program outcomes
  ◦ In Nursing: remove obsolete nursing diagnoses
  ◦ In Nursing: eliminate trade names for most medications (use generic names only)
• For competencies that must be tested in each course, be consistent:
  ◦ In Nursing: 10% of all test items are medication calculations, across all tests in all courses
• Revise test length/time allotments; be consistent across all tests


…more Testing 'Best Practices'

• Establish scoring benchmarks that are evidence-based/outcome-based
• Determine a consistent weight for the standardized exam in the overall grade
• Identify consequences for failure to achieve benchmarks
  ◦ Offer additional (parallel) versions of the exam; mix these up!
  ◦ Allow enough time for remediation to occur between attempts.
  ◦ Assign specific remediation activities (multiple strategies)
    – Selections should be based on weak areas identified on the scoring report.
    – Date/time for submission of these should be specified in the learning contract. Monitor remediation usage for student adherence to the contract.
    – Focus on building confidence; this is not a punitive activity. Model effective study habits and test-taking strategies.
• Use proctoring guidelines; take test security measures
  ◦ Investigate suspected breaches and enforce published policies.


A few words about vigilance with Test Security

1. Encourage moral behavior (an academic honesty program at your school, with clear language placed in handbooks)
2. Discourage cheating
   a. Before testing
      1. Minimize access to exams and viewing of exam content
      2. Use the highest levels of security available in the LMS for teacher-made tests, and all security features available in a standardized testing platform: protect logins and access codes; use active dashboarding
      3. Train proctors for live proctoring activities; re-train/re-certify regularly
   b. During testing
      1. Establish a secure environment
      2. Allow no deviations from test procedures or breakdown of environmental security. Example: leaving the room equates to the test being over for that student, regardless of reason
      3. Vigilant proctoring (the proctor physically walks around the room, or routinely accesses active dashboards for remote testing)
3. Detect cheating with Data Forensics and take action as needed


References

American Psychological Association. (2004). Code of Fair Testing Practices in Education. Washington, DC: Joint Committee on Testing Practices. http://www.apa.org/science/programs/testing/fair-code.aspx

Morrison, S., Nibert, A., & Flick, J. (2006). Critical thinking and test item writing (2nd ed.). Houston, TX: Health Education Systems, Inc.

National Council of State Boards of Nursing. (2013). 2013 NCLEX-RN test plan. Chicago, IL: National Council of State Boards of Nursing. https://www.ncsbn.org/3795.htm

Nibert, A. (2010). Benchmarking for student progression throughout a nursing program: Implications for students, faculty, and administrators. In L. Caputi (Ed.), Teaching nursing: The art and science (2nd ed., Vol. 3, pp. 45-64). Chicago: College of DuPage Press.

Schroeder, J. (2013). Improving NCLEX-RN pass rates by implementing a testing policy. Journal of Professional Nursing, 29(2), S43-S47.

Sewell, J., Culpa-Bondal, F., & Colvin, M. (2008). Nursing program assessment and evaluation: Evidence-based decision making improves outcomes. Nurse Educator, 33(3), 109-112.


Have Questions? Need More Info?

Thanks for your time & attention today!


866-429-8889