Overall Course Table of Contents
Below is a list of the contents presented in all twelve reading assignments. Following this table of contents is the Unit 1 Reading Assignment.
Unit 1: Training Evaluation Overview
Introduction..............................................................................................1.1
Why It’s Important To Evaluate...............................................................1.1
Consequences of Failing To Evaluate.....................................................1.2
When and Where To Evaluate................................................................1.3
Evaluation’s Role in ISD..........................................................................1.3
Evaluating the Analysis Phase................................................................1.4
Evaluating the Design, Development, and Delivery Phases...................1.4
Evaluating the Evaluation Phase.............................................................1.4
Formative Evaluation...............................................................................1.5
Summative Evaluation.............................................................................1.6
Formative vs. Summative Evaluation......................................................1.7
Summary and Transition.........................................................................1.7
Unit 2: Evaluation Models
Introduction..............................................................................................2.1
Context, Input, Process, Product (CIPP) Model......................................2.1
The Six-Stage Model of Evaluation.........................................................2.2
Systems Approach to Evaluation............................................................2.4
Kirkpatrick’s Levels of Evaluation Model.................................................2.5
Summary and Transition.........................................................................2.8
EVALUATION OF TRAINING COURSE (K606) PAGE I
Unit 3: Measurement Fundamentals
Introduction..............................................................................................3.1
Knowledge-Based vs. Performance-Based Evaluations........................3.1
Quantitative vs. Qualitative Evaluations..................................................3.2
Selecting a Quantitative or Qualitative Approach....................................3.4
Developing a Sampling Plan...................................................................3.4
Norm-Referenced and Criterion-Referenced Measures..........................3.6
When To Use Norm-Referenced and Criterion-Referenced Measures......3.7
Types of Items.........................................................................................3.8
Rating Scales..........................................................................................3.9
Tips for Using Rating Scales.................................................................3.10
Other Factors That Can Affect Evaluation Results................................3.10
Summary and Transition.......................................................................3.12
Unit 4: Measuring Reactions
Introduction..............................................................................................4.1
Benefits of Conducting Reaction Evaluations.........................................4.1
Potential Costs........................................................................................4.2
When To Conduct Reaction Evaluations.................................................4.2
Methods To Assess Reactions................................................................4.3
Developing Written Evaluation Instruments............................................4.4
Writing Effective Reaction Questions......................................................4.5
Avoiding Inappropriate Questions...........................................................4.5
Using Observations...............................................................................4.11
Questions To Ask Yourself While Observing........................................4.11
Combining Observing and Question Asking..........................................4.11
Using Interviews....................................................................................4.12
Structured vs. Unstructured Interviews.................................................4.12
Guidelines for Preparing for and Conducting Interviews.......................4.12
Interview Protocols and Data Collection Instruments............................4.17
Summary and Transition.......................................................................4.17
Unit 5: Measuring Knowledge
Introduction..............................................................................................5.1
Benefits of Conducting Knowledge Evaluations......................................5.1
Potential Costs........................................................................................5.2
When To Measure Knowledge Acquisition..............................................5.2
Steps for Developing Knowledge Tests..................................................5.3
Relationship Between Test Items and Learning Objectives....................5.4
Testing for Knowledge and the Domains of Learning.............................5.6
Selecting the Type of Knowledge Test....................................................5.6
General Guidelines for Writing Effective Test Items................................5.8
Multiple-Choice Test Items......................................................................5.9
Multiple-Choice Test Items: Advantages and Disadvantages..............5.10
Multiple-Choice Test Items: Guidelines................................................5.10
True-False Test Items...........................................................................5.13
True-False Test Items: Advantages and Disadvantages.....................5.14
True-False Test Items: Guidelines.......................................................5.14
Matching Test Items..............................................................................5.16
Matching Test Items: Advantages and Disadvantages........................5.17
Matching Test Items: Guidelines..........................................................5.17
Short-Answer Test Items.......................................................................5.19
Short-Answer Test Items: Advantages and Disadvantages.................5.20
Short-Answer Test Items: Guidelines...................................................5.20
Essay Test Items...................................................................................5.22
Essay Test Items: Advantages and Disadvantages.............................5.22
Essay Test Items: Guidelines...............................................................5.23
Oral Examinations.................................................................................5.24
Oral Examinations: Advantages and Disadvantages...........................5.24
Oral Examinations: Guidelines.............................................................5.25
Summary and Transition.......................................................................5.26
Unit 6: Measuring Performance
Introduction..............................................................................................6.1
Benefits of Measuring Performance........................................................6.1
Potential Costs........................................................................................6.2
When To Measure Performance.............................................................6.3
Steps for Developing Performance Assessments...................................6.4
Relationship Between Performance Assessments and Learning Objectives.................................................................................6.5
Establishing the Performance Standards................................................6.6
Selecting Behaviors To Assess...............................................................6.7
Assessing Simulated vs. Actual Performance.........................................6.8
How Much Reality To Simulate?.............................................................6.9
Types of Simulations.............................................................................6.10
Select the Type of Performance Assessment.......................................6.11
Direct Observation.................................................................................6.12
Direct Observation: Advantages and Disadvantages...........................6.12
Direct Observation: Preparation Guidelines.........................................6.13
Direct Observation: Data Collection Instruments.................................6.14
Direct Observation: Behaviorally Anchored Rating Scales..................6.17
Direct Observation: Other Ordinal Rating Scales.................................6.19
Direct Observation: Guidelines for Conducting Observations..............6.20
Assessing Performance Samples.........................................................6.21
Assessing Performance Samples: Advantages and Disadvantages......6.21
Assessing Performance Samples: Guidelines.....................................6.22
Assessing Performance Samples: Rating Scales................................6.23
Using Self-Assessments.......................................................................6.24
Using Self-Assessments: Advantages and Disadvantages..................6.24
Using Self-Assessments: Guidelines...................................................6.25
Using Self-Assessments: Rating Scales..............................................6.26
Conducting Performance Surveys.........................................................6.27
Conducting Performance Surveys: Advantages and Disadvantages...6.27
Conducting Performance Surveys: Design Guidelines........................6.28
Conducting Performance Surveys: Administration Guidelines.............6.29
Reviewing Archival Performance Data..................................................6.33
Reviewing Archival Performance Data: Advantages and Disadvantages.......................................................................................6.33
Reviewing Archival Performance Data: Overcoming Data Collection Problems..............................................................................6.34
Using Participant Action Plans..............................................................6.35
Summary and Transition.......................................................................6.37
Unit 7: Organizational Results
Introduction..............................................................................................7.1
Benefits of Measuring Organizational Results........................................7.1
Potential Costs........................................................................................7.2
When To Measure Organizational Results..............................................7.2
Process for Measuring Organizational Results.......................................7.3
Determining Training Costs.....................................................................7.3
Step 1: Identify Training Expenses.........................................................7.4
Step 2: Calculate Training Costs............................................................7.5
Job Aids for Calculating Training Costs..................................................7.5
Establishing the Value of Training...........................................................7.6
Factors To Consider in Quantifying Organizational Gains......................7.7
Steps for Establishing the Value.............................................................7.9
Step 1: Identify Organizational Benefits.................................................7.9
Step 2: Establish the Worth..................................................................7.11
Job Aids for Establishing the Worth......................................................7.12
Step 3: Compare the Benefits to the Costs..........................................7.13
Step 4: Determine the Value................................................................7.14
Summary and Transition.......................................................................7.15
Unit 8: Understanding Validity & Reliability
Introduction..............................................................................................8.1
Validity.....................................................................................................8.1
Content Validity.......................................................................................8.2
Criterion-Related Validity.........................................................................8.3
Construct Validity....................................................................................8.4
Methods To Improve Validity...................................................................8.4
Reliability.................................................................................................8.6
Measurement Error.................................................................................8.7
Assessing Reliability................................................................................8.7
Methods To Improve Reliability...............................................................8.7
The Relationship Between Validity and Reliability................................8.11
Summary and Transition.......................................................................8.11
Unit 9: Considering Legal Issues
Introduction..............................................................................................9.1
Federal Statutes......................................................................................9.1
Code of Federal Regulations (CFRs)......................................................9.2
Representative Court Cases Regarding Employee Testing....................9.3
Griggs vs. Duke Power............................................................................9.3
Brito vs. Zia Company.............................................................................9.3
Wade vs. Mississippi Cooperative Extension Service.............................9.4
Moody vs. Albemarle Paper Company....................................................9.4
Fire Fighters Institute for Racial Equality vs. City of St. Louis.................9.4
Liability....................................................................................................9.6
Strategies To Prevent Liability.................................................................9.7
Summary and Transition.........................................................................9.8
Unit 10: Conducting the Evaluation
Introduction............................................................................................10.1
Procedures for Administering Tests......................................................10.2
Testing Facilities....................................................................................10.2
Preparing To Administer the Test..........................................................10.2
Providing Feedback to Participants.......................................................10.3
Group and Individual Feedback............................................................10.4
A Model for Providing Feedback...........................................................10.5
Feedback Step 1: Get Feedback From Participants............................10.5
Feedback Step 2: Give Feedback to Participants................................10.6
Feedback Step 3: Merge Differences in Perspectives.........................10.6
Summary and Transition.......................................................................10.7
Unit 11: Analyzing & Interpreting Results
Introduction............................................................................................11.1
The Normal Curve.................................................................................11.1
Characteristics of the Normal Curve.....................................................11.2
Descriptive Statistics.............................................................................11.2
Measures of Central Tendency.............................................................11.2
Measures of Central Tendency: Mean.................................................11.3
Measures of Central Tendency: Median..............................................11.4
Measures of Central Tendency: Mode.................................................11.5
Measures of Dispersion: Frequency Distribution...................................11.5
Measures of Dispersion: Histogram.....................................................11.6
Measures of Dispersion: Frequency Polygon.......................................11.6
Measures of Dispersion: Range...........................................................11.7
Measures of Dispersion: Standard Deviation.......................................11.7
Item Analysis.........................................................................................11.8
Measure of Item Difficulty......................................................................11.8
Sample Item Difficulty Calculation.......................................................11.10
Item Difficulty Index.............................................................................11.11
Interpreting the Data............................................................................11.12
Summarizing the Data.........................................................................11.12
Summarizing Data Graphically............................................................11.13
Identifying Trends................................................................................11.15
Making Meaningful Interpretations......................................................11.15
Summary and Transition.....................................................................11.16
Unit 12: Developing the Evaluation Plan and Report
Introduction............................................................................................12.1
Planning Your Training Evaluation........................................................12.1
Consider Your Audience.......................................................................12.2
Select Between an Internal vs. External Evaluation..............................12.2
Steps for Developing the Evaluation Plan.............................................12.3
Step 1: Establish the Goals and Objectives.........................................12.4
Step 2: Determine the Scope...............................................................12.5
Step 3: Select What To Measure.........................................................12.6
Step 4: Determine How To Collect Data..............................................12.6
Step 5: Ensure Validity and Reliability..................................................12.7
Step 6: Consider Legal Issues.............................................................12.7
Step 7: Determine Types of Data Analysis ..........................................12.8
Step 8: Identify Budget and Resource Requirements..........................12.9
Evaluation Plan Components..............................................................12.11
Reporting Evaluation Results..............................................................12.17
Guidelines for Reporting Evaluation Results.......................................12.17
Developing the Report.........................................................................12.19
Evaluation Report Components..........................................................12.20
Writing Recommendations and Conclusions.......................................12.25
Reviewing the Report..........................................................................12.26
Summary and Transition.....................................................................12.27
Unit 1: Training Evaluation Overview Reading Assignment
May 2000 Evaluation of Training Course
Unit 1
Training Evaluation Overview
Contents
Introduction..............................................................................................1.1
Why It’s Important To Evaluate...............................................................1.1
Consequences of Failing To Evaluate.....................................................1.2
When and Where To Evaluate................................................................1.3
Evaluation’s Role in ISD..........................................................................1.3
Evaluating the Analysis Phase................................................................1.4
Evaluating the Design, Development, and Delivery Phases...................1.4
Evaluating the Evaluation Phase.............................................................1.4
Formative Evaluation...............................................................................1.5
Summative Evaluation.............................................................................1.6
Formative vs. Summative Evaluation......................................................1.7
Summary and Transition.........................................................................1.7
Introduction
An evaluation can serve many purposes. It can be used to:
Determine if a training program is accomplishing its objectives;
Identify strengths and weaknesses of a training program including the methods of presentation, learning environment, content, training aids, and scheduling;
Identify which course participants benefited the most, or the least, from training;
Determine if a program was appropriate for its intended purpose and target population;
Determine how well the training met needs;
Determine to what extent the content was mastered;
Assess if training transferred to on-the-job behaviors;
Determine whether the organization’s goals were achieved; and
Determine if the resources invested in the program are justified – was it worth it?
Why It’s Important To Evaluate
Evaluation is a process used to determine the worth, value, or effectiveness of a program or course. Training evaluation determines a training program's effectiveness in meeting its intended purpose – producing competent employees.
Training affects both internal and external customers. Individuals impacted by disasters expect disaster workers to be knowledgeable and well trained. The managers and stakeholders expect the hours and dollars invested in training to pay off. Disaster workers want to feel that they are prepared to handle assigned responsibilities. A systematic training evaluation process can help you determine if your training is meeting your customers’ expectations.
[Figure: stakeholder questions about training, such as "Since training the staff, are disaster victims getting more consistent, accurate information?" and "Why should I release employees for training?"]
Training evaluations also are important because they:
Provide evidence that the program is or is not working;
Demonstrate the importance of training in improving job performance;
Establish credibility for your department or agency when you are able to demonstrate positive results; and
Substantiate your training department’s contribution to the organization.
Consequences of Failing To Evaluate
If you fail to evaluate training, the following may happen:
Participants may continue to take courses in which they fail to learn;
Participants may attend courses that are not relevant to job performance;
Course modifications may not be based on participant outcome data;
Staff and managers may question if training is worthwhile;
No one will know if staff members possess the knowledge and skills to perform adequately;
Customers may become dissatisfied with the staff;
Staff members may feel overwhelmed and lack confidence to perform job tasks; and
Training attendance may decrease.
When and Where To Evaluate
Training evaluations should be conducted in all available settings such as in the classroom, at simulation settings, and at the job site. Evaluations also should be conducted at various times, including prior to training, during training, immediately after training, and 3 to 6 months after training.
Tests are tools used to measure whether learning took place or whether the objectives of the course were met. They are just one method of evaluating training.
Evaluation’s Role in ISD
Evaluation is the quality assurance component of the Instructional Systems Design (ISD) training model. Evaluation occurs at every step of the ISD process to ensure that the training meets the intended objectives and achieves the desired outcomes. As the figure below illustrates, evaluation is cyclical and ongoing.

[Figure: the ISD cycle (analysis, design, development, delivery, and evaluation), with evaluation feeding back into each step]
After the training evaluations are analyzed, decisions may be made about the effectiveness of the program and the extent to which the training program met the organization’s desired outcomes.
Evaluating the Analysis Phase
Evaluation can provide you with an assessment of how well you identified the training needs in the analysis step of ISD. If your evaluation finds a lack of skill transfer between the training setting and the work environment, then you may have missed something in your original needs assessment and performance analysis.
Evaluating the Design, Development, and Delivery Phases
Evaluation also provides valuable feedback on how well the course materials and design are addressing the stated objectives. You are concerned with such issues as determining appropriate methodologies, allocating sufficient time for the lessons, and covering the course content.
Evaluating the Evaluation Phase
Evaluation can measure the effectiveness of your overall evaluation plan. Did you capture an accurate "before" picture of on-the-job performance? Does your training evaluation provide you with an accurate "after" picture of on-the-job performance? Are you able to compare these snapshots and measure the extent to which job performance changed after training? "Before-and-after snapshots" are one way of evaluating training effectiveness.
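The before-and-after comparison described above can be sketched in a few lines. This is a hypothetical illustration, not part of the course materials; the scores and the simple mean-difference measure are assumptions made for the example.

```python
# Hypothetical sketch: comparing "before" and "after" snapshots of performance.
# The scores below are illustrative, not drawn from the course.

def mean(scores):
    """Average of a list of assessment scores."""
    return sum(scores) / len(scores)

before = [55, 60, 48, 72, 65]   # assessment scores prior to training
after = [78, 85, 70, 90, 82]    # same participants, 3 to 6 months after training

change = mean(after) - mean(before)
print(f"Average change in performance: {change:.1f} points")
```

A real evaluation plan would pair each participant's before and after scores and account for measurement error (covered in Unit 8), but the basic comparison is the same.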
As a result of a training evaluation, you may discover additional training needs such as:
New knowledge and skill requirements not included in the training that are needed to perform job tasks;
Other personnel who need to be trained in order to interface with newly trained personnel;
Additional training to enhance gains made or to maintain new proficiency levels; and
Future changes in job requirements.
At this point, the cycle begins again. Based on the evaluation results, you may opt to revise the existing curriculum or develop new courses.
Formative Evaluation
Evaluations are classified into two categories, formative and summative, depending upon when the evaluation is conducted.
As it relates to the ISD process, formative evaluation refers to the evaluation of training programs during the analysis, design, and development steps for the purpose of improvement. It also may include evaluation conducted as part of a pilot offering of the course.
The purpose of the formative evaluation is to validate or ensure that the goals of instruction are being met. Formative evaluation is an iterative process that involves ongoing revisions, consultations with subject-matter experts, pilot testing, and assessing feedback. It is the quality assurance component of instructional design and curriculum development.
The following table provides a synopsis of the types of questions answered by formative evaluations.
Formative Evaluation

Analysis: What do you want to accomplish—will training solve the problem? Who is the target audience? What do the learners need? What is the expected outcome of the training?

Design and Development: Is the appropriate material being covered? Does the material cover the objectives? Is the material well organized? Is the timing for the units appropriate? Are the training materials adequate?
Summative Evaluation
As it relates to the ISD process, summative evaluation involves collecting data about a program of instruction during the delivery and evaluation steps and using the data to make decisions about the program. You're interested in finding out the participants' reactions to the course, whether the participants learned the material, whether training transferred to the job, and whether the training made a difference in the organization's goals.
The following table provides a synopsis of the types of questions answered by summative evaluations.
Summative Evaluation

Delivery: Did the course participants respond positively to the course? Did course participants demonstrate that learning had occurred? (Note: These questions are also asked during the formative evaluation phase.)

Evaluation: Did the evaluation tools gather the right information to measure the program's effectiveness? Was it a cost-effective program? Did the training transfer to on-the-job performance? Has the performance problem been solved? Did the program achieve the desired outcomes? Was it worth it? What do you need to do differently next time?
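The cost-effectiveness question above is developed fully in Unit 7 (determining training costs, establishing the value of training, and comparing benefits to costs). As a rough sketch, with purely illustrative figures, the comparison amounts to a benefit-to-cost ratio:

```python
# Hypothetical figures for illustration only; Unit 7 describes how to
# identify training expenses and establish the worth of the benefits.
training_costs = 12_000.0           # e.g., instructor time, materials, travel
organizational_benefits = 30_000.0  # estimated value of improved performance

benefit_cost_ratio = organizational_benefits / training_costs
roi_percent = (organizational_benefits - training_costs) / training_costs * 100

print(f"Benefit-to-cost ratio: {benefit_cost_ratio:.1f}")  # 2.5
print(f"Return on investment: {roi_percent:.0f}%")         # 150%
```

A ratio above 1.0 (equivalently, a positive return on investment) suggests the program was worth its cost; the hard part, as Unit 7 explains, is establishing credible dollar values for the organizational benefits.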
Formative vs. Summative Evaluation
This reading states that formative evaluation is conducted during the analysis, design, and development steps, including during a pilot offering of the course. Some Instructional Systems Design models also consider as formative any evaluation performed during course delivery that precedes the summative evaluation at the end of the course.
The key distinction between formative and summative evaluation lies in why you are collecting the data. Evaluation data collected to identify and correct problems with the training is considered to be formative. Data collected to determine the worth of the training is summative.
What you are attempting to measure should dictate whether you want a formative or summative evaluation, and the type of evaluation affects the breadth of data you will want to collect. Generally speaking, the scope of a formative evaluation is broader, including everything from the training environment to the training methodology. After a course is implemented (after pilot testing), you most likely will narrow your focus to the impact the training had on the participants and the organization.
Summary and Transition
This reading assignment introduced you to the reasons for conducting training evaluations. Training evaluation becomes increasingly critical in the new work environment where learning is the key to success.
At this time you should return to the course web site and continue with the remainder of the Unit 1 learning activities.