Practices and Pitfalls: A Practitioner’s Journey Into Level 3 Evaluation

by Tom Riley, Holly Davani, Pat Chason, and Ken Findley


The US Army’s Chemical Demilitarization Training Facility (CDTF) provides workforce training in support of the Chemical Stockpile Disposal Program. General Physics Corporation (GP) operates and maintains the CDTF, which oversees initial workforce training in support of eight US facilities and a current workforce of approximately 4,500 operators, technicians, and support personnel.

Operating in a highly regulated industrial setting, the CDTF implemented Level 3 behavior evaluations as a sound, legally defensible process to measure and quantify training program effectiveness (Kirkpatrick, 1994). Level 3 focuses on transfer of learned behavior to the job setting, as opposed to the other evaluation levels, which focus on reactions to training, learning, and return on investment (ROI).

In this effort to promote continuous improvement of our training program, the implementation of Level 3 was targeted to provide the following:
• A measure of the training program’s effectiveness
• The collection, analysis, reporting, and retention of training program effectiveness data
• Identification of areas for training program improvements

Implementing Level 3 evaluation required balancing organizational needs, regulatory requirements, and the needs of customers at all levels into a program that could provide value to each organizational entity. Since there is no cookie-cutter answer to the complexities involved in Level 3 evaluation, we developed processes that aligned customer requirements and built a program that applies appropriate measurement and data-collection methodologies for a given evaluation situation.

Realizing that Level 3 evaluation is an often-discussed but seldom-practiced endeavor, this article presents a practitioner’s overview of the models and lessons learned from a successful application of training evaluation theory, tools, and principles. As such, this discussion will focus on the following key issues:
• Project alignment
• Identification and dismissal of assumptions
• Selling of the plan
• Process design
• Results
• Lessons learned


Project Alignment

Practitioners of Level 3 evaluation must recognize the importance of identifying organizational knowledge and culture that may help or hinder the evaluation process. The key to accomplishing this is found in the alignment of the evaluation process with the organization’s business goals. This practice will assist with the implementation of the evaluation process within the organizational context and support the achievement of business goals and objectives (Harless, 1979).

The project alignment process provides for four things:
• Identification of evaluation resource requirements
• Preparation of the organization for the performance of the evaluation
• Identification of aligned evaluation outcomes
• Linkage of evaluation outcomes to organizational business goals

Research, preparation, and planning in each level of the alignment process will provide a solid foundation for successful implementation of Level 3 evaluation. Practitioners should solicit information from the stakeholders on the who, what, when, where, and how of the evaluation process. This information should include the following:
• Business goals
• Project goals
• Organizational constraints
• Resources
• Evaluation targets
• Deadlines
• Individual roles and responsibilities
• Data collection methodologies
• Scheduling
• Reporting

The application of the project alignment process will help the evaluation team avoid organizational roadblocks and soothe individual perceptions concerning this level of evaluation (Pulley, 1994).

Identification and Dismissal of Assumptions

“The first step in this process of managing performance… is to acknowledge the viewpoints that often characterize the situation” (Rummler & Brache, 1990, p. 31).

Common assumptions about Level 3 evaluation pose serious barriers to its design and implementation. Assumptions concerning Level 3 evaluation are generally based on limited experience. Regardless of their basis, these assumptions will, at a minimum, impose serious organizational constraints on the evaluation process and, at their worst, prove deadly to an evaluation program.

Early in the process of designing and implementing the CDTF program, the assumptions concerning Level 3 evaluation were clear. Previously unsuccessful attempts in this area had supported negative assumptions and built strong organizational opposition to resource allocation for Level 3 evaluation. These assumptions ran across all organizational levels.

At the management level, comments included the following:
• “This costs too much.”
• “This is going to negatively impact operations.”
• “There is very little value; we are already doing fine.”
• “We tried this before and it didn’t work.”

At the process level, the comments focused on maintaining the status quo:
• “Our training program is adequate; there is no need to evaluate it.”
• “Our other evaluation processes are enough.”
• “If it ain’t broke, don’t fix it.”

At the job-performer level, comments were more negative:
• “These evaluations never change anything.”
• “This is a waste of my time.”
• “Everything is fine; there is no need to do this.”

Do these assumptions sound familiar? It seems that the issues of cost, complexity, and negative impact on operations (time lost to interviews, filling out surveys, and evaluators disturbing workers) run through the literature on this subject.

A case for the negative impact of these assumptions concerning Level 3 evaluation is evidenced in the literature. Sixteen percent of the organizations identified as training investment leaders and a mere 13% of the American Society for Training and Development (ASTD) Benchmarking Forum practiced Level 3 evaluation (McMurrer, VanBuren, & Woodwell, 2000, p. 20). At least 50% of the organizations polled in a 1996 Training Magazine survey saw no reason to implement Level 3 evaluation (Industry Report, 1996).

Although these recent data support a trend of increasing usage among the nation’s leading training organizations, most still do not practice Level 3 evaluation. For many organizations, the adage that perception is reality rings true. Perceptions are reality to the management team trying to meet business goals. These perceived realities drive management decisionmaking and therefore must be taken into account and dealt with directly.

In implementing a Level 3 program, the successful evaluator must define relevant assumptions, eliminate inconsistent or incorrect assumptions, and highlight real problems in an organizational context. The alignment of the evaluation project with business goals and needs, while identifying and addressing each assumption, is the starting point for success.


It is within the evaluation program designer’s ability to control and use the organization’s assumptions to the program’s benefit. From planning through implementation, the evaluator needs to take advantage of opportunities to address and deal with organizational assumptions at all levels. Well-planned and aligned evaluation is key to the monitoring and improvement of any training program.

Selling of the Plan

With more than 80% (McMurrer, VanBuren, & Woodwell, 2000) of the nation’s leading training organizations ignorant about or unwilling to implement Level 3 evaluation, the difficulty of the task of selling this tool is evident. Based on this almost universal lack of knowledge, the job of selling the evaluation program also becomes a key element to success.

In the development of the CDTF evaluation program, the organizational structure required orchestration and buyin from a diverse group of stakeholders, including the primary client, several external clients, and our own organization. These stakeholders were geographically dispersed across the country, which compounded communication problems. Generally considered a worst-case scenario for any evaluation process, this situation presented the following issues:
• The stakeholders did not have an adequate understanding of Level 3 evaluation.
• Previous unsuccessful attempts at Level 3 evaluation had laid the basis for an extremely negative view of the process.
• Attempts to get all the stakeholders involved and working together on other issues had been difficult.

Selling the stakeholders on the concept of Level 3 evaluation was the single pivotal issue that, if unsuccessful, would have caused the program to suffer a quick death. Selling the program required the development of a plan that could do the following:
• Educate the stakeholders
• Communicate intentions
• Identify and address relevant issues
• Develop cross-functional participation
• Develop organizationwide ownership of the evaluation process

Educate the Stakeholders

Many organizations do not have the necessary information to make well-informed decisions concerning workforce training issues. Training practitioners must be prepared to develop and create opportunities to educate the organization on these processes.

There are many methods of educating the organization; a few that worked in the CDTF implementation of Level 3 evaluation included meetings, panel discussions, newsletters, technical bulletins, and conferences.

In selling the CDTF program, we provided training to management, process stakeholders, and supervisors through regularly scheduled meetings, monthly management reviews, and regional conferences for plant operations. This training provided the organization with knowledge of the process and potential benefits of Level 3 evaluation.

Communicate Intentions

As the plan moves from dispelling assumptions to selling, the practitioner must consistently and continuously communicate the program’s intentions to its stakeholders. This is accomplished by informing the organization of plans, current project status, and intentions for the future.

In addition to providing information, practitioners need to ask for assistance and feedback at every opportunity. At the CDTF, we were surprised by the volume and value of the feedback this provided. An evaluation plan that provides for organizational review and feedback throughout each phase is in an excellent position to provide what its customers need.

Establishing and exercising a policy of open and continuous organizational communication about what a program is doing and where it is going will ensure that project alignment is maintained while needs and concerns are identified and addressed at all levels of the organization.

Identify and Address Relevant Issues

A practitioner of Level 3 must understand that to sell the evaluation product, he or she must describe what the customer needs. Identifying and building possible solutions to relevant organizational issues provides perceived value early in the planning process. If the practitioner can target and provide solutions to relevant issues, he or she is well on the way to program success.

In implementing the CDTF program, we identified relevant issues, including an organizational-level need to ensure several things:
• Collection, analysis, and reporting of data on student ability to perform job tasks after completion of training
• Development of a feedback loop to provide training program performance information
• Development of ways to identify continuous process improvement initiatives

The CDTF evaluation process was designed to include tools and methodologies that would be helpful in addressing these organizational needs, further strengthening the project’s alignment with organizational goals.

Develop Cross-Functional Participation

Each organizational entity must not only understand the value of the program to its area, but also understand the overall organizational value of the program. Creating opportunities for stakeholders to be involved in the process spurs participation, increases cross-functional interaction, creates multiple feedback loops, and increases communication and motivation (Fisher & Ellis, 1990).

During previous organizational attempts at Level 3 evaluation, a contributing factor to failure was the lack of a central point of contact at the process and employee levels. Evaluators just showed up at the job site and tried to collect data. This approach was far from satisfactory and was corrected in the design of an improved process. Corrections to the process allowed stakeholders at all levels to do the following:
• Participate in the review of plans, measurement instruments, and interview questions
• Participate in the collection of data at the job site
• Serve as points of contact to provide subject matter expertise for technical issues

Stakeholders took part in planning the details of the data-collection process and served as liaisons between the plant personnel and the evaluation team. These up-front, cross-functional team-building practices had a direct positive impact on the success of the instrument design and data-gathering processes during implementation.

Practitioners must understand needs and take positive action to get key individuals involved, keep them involved, and constantly reinforce team behaviors in this phase of the process. In doing so, the team can work with a vision of organizational improvements.

Develop Organizationwide Ownership of the Process

If one successfully educates the organization, identifies and addresses relevant organizational issues, continually communicates intent, and builds cross-functional participation, the development of ownership happens naturally. By providing a plan that includes organizational interests, development, and participation, Level 3 evaluation is no longer training’s program, but becomes the organization’s program.

Outcome

During the CDTF implementation, the organizational attitude on the concept of Level 3 evaluation changed from hostile to hopeful, then to one of total acceptance and support. From management and supervisory levels on through to the workers, all have witnessed the indicators of Level 3 evaluation becoming the organization’s program. The CDTF has since completed Level 3 evaluation for 30 technical job positions with critical training paths. Area supervisors assist evaluation team members in conducting surveys, interviews, and observations. Employees who initially hesitated to provide candid information regarding their jobs and training now react positively to data-collection visits because these evaluations have been identified as a mechanism for positive change.

In presenting the CDTF process for the implementation of Level 3 evaluation, practitioners must recognize there is no single best methodology. Program designs for the collection, analysis, reporting, and followup of this level of evaluation must take into account organizational context. If the evaluation practitioner, through the planning and alignment process, has successfully dispelled assumptions, educated the organization, communicated intent, targeted relevant issues, and developed cross-functional participation and organizationwide ownership, the remaining implementation is academic.

Process Design

The Level 3 evaluation design selected for use in the CDTF implementation was the result of extensive goal-oriented efforts. The result is a comprehensive, high-impact, and low-cost model. This model provides for the planning, collection, analysis, and reporting of Level 3 information and includes strategies and tools to collect employee and supervisor data. Figure 1 depicts the process in a flow diagram.

One commonly accepted reason that Level 3 evaluation is performed so seldom is that, depending on organizational structure and the data desired, “it can be costly to gather this information” (Krein & Weldon, 1994, p. 66). Understanding that cost is relative and that what may be low cost for one organization may in fact be high cost for another, we built in strategies to increase the efficiency of the process and lower program costs.

The process developed for the CDTF allows for the collection of Level 3 data directly from the former students and their supervisors at the work sites. Surveys, interviews, and observations of workers take place as work schedules allow. Planning, organization, and cooperation between organizations and stakeholders minimize the time required for data collection and the negative impact of this process on production. The resulting process improvements can increase training program and workforce performance, adding value to the bottom line.

The process implemented by the CDTF allows the evaluation team to get onto the site, collect data, and get out in a short time, thus reducing the negative impacts and excessive costs generally associated with this type of evaluation.

Planning

The program’s primary goal was to obtain the quantitative and qualitative data necessary to provide training effectiveness information and program defensibility data (Pulley, 1994). Without both quantitative and qualitative data, our data would lack the validity and strength to cause change.

The secondary goal was to use these data to assess individual course performance and to identify opportunities for training program improvements. In other words, we wanted to know if the training participants could correctly perform their tasks in their work environment and if we could improve our training program.

Research

Once we decided that a Level 3 evaluation could provide the data we needed, we organized the training audience into specific job positions. After the customer had selected the job position to be evaluated, we conducted research to develop a valid task list representing the job position, using procedures, training materials, and the customer’s training plan. The customer’s subject matter experts validated this task listing and returned it for incorporation into the evaluation survey.

Design

Methods triangulation (Patton, 1990) allowed us to evaluate the consistency and validity of findings using three data-collection techniques: quantitative data on job performance, qualitative data on training issues, and corroborating data gained through observation. Although we selected survey, interview, and observation, this design provides flexibility.

Data-collection instruments were designed using the standard guidelines in questionnaire construction; they complied with the recommended way of providing instructions to respondents, selecting task items, and ordering items (Babbie, 1992). We used a Likert-type scale to obtain numerical values for task proficiency and to provide the quantitative data that we identified as crucial to the strength of the evaluation program.

A two-part evaluation survey collected quantitative and qualitative data. The quantitative data were collected from student self-ratings and validated by supervisor ratings. As self-ratings could be challenged, we included the supervisor ratings to ensure validity of the collected data. Qualitative data were collected using interview-type questions. To ensure the validity of qualitative data, we followed recommendations for writing standardized open-ended interview questions (Patton, 1990).

Questions included the following content:
• Experience/behavior
• Opinion/values
• Feelings
• Knowledge
• Background/demographics
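To make this instrument structure concrete, the following is a minimal Python sketch of such a two-part survey: Likert-scale task items paired with supervisor ratings, plus standardized open-ended questions in the content categories listed above. The 1–5 scale, the field names, and the example tasks are illustrative assumptions, not the actual CDTF instrument.

from dataclasses import dataclass, field
from typing import Optional

# Part 1: quantitative task-proficiency item rated on an assumed 1-5 Likert scale.
@dataclass
class LikertItem:
    task: str
    self_rating: Optional[int] = None        # former student's self-rating
    supervisor_rating: Optional[int] = None  # supervisor rating used as a validity check

# Part 2: standardized open-ended interview question (Patton-style).
@dataclass
class OpenEndedQuestion:
    category: str   # experience/behavior, opinion/values, feelings, knowledge, demographics
    prompt: str
    response: str = ""

@dataclass
class EvaluationSurvey:
    job_position: str
    likert_items: list = field(default_factory=list)
    interview_items: list = field(default_factory=list)

# Hypothetical instrument for one job position, for illustration only.
survey = EvaluationSurvey(
    job_position="Plant operator",
    likert_items=[
        LikertItem("Don personal protective equipment"),
        LikertItem("Perform control-room shift turnover"),
    ],
    interview_items=[
        OpenEndedQuestion("experience/behavior",
                          "Describe a task you performed this week that your initial training covered."),
    ],
)

Keeping each task’s self-rating and supervisor rating side by side in this way simplifies the validity check performed later during analysis.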


Figure 1. Level 3 Evaluation Flow Diagram.


Triangulating these data provided some remarkable results. In one instance, while analyzing the self-ratings of employees donning personal protective equipment, we noted consistently low scores for several associated tasks. Interviews conducted with employees corroborated a need for refresher training in the same tasks, while observations of job performance supported the need for refresher instruction in those same tasks.
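As an illustration of that triangulation logic, the short sketch below flags tasks whose mean self-rating falls below a cutoff and intersects them with the task sets raised during interviews and observations; only tasks corroborated by all three sources are recommended for refresher training. The 3.0 cutoff, the sample ratings, and the task names are assumptions for demonstration, not CDTF data.

from statistics import mean

# task -> self-ratings collected across respondents (assumed 1-5 scale)
self_ratings = {
    "Don personal protective equipment": [2, 3, 2, 2],
    "Perform control-room shift turnover": [4, 5, 4, 4],
}
interview_flags = {"Don personal protective equipment"}    # tasks raised in interviews
observation_flags = {"Don personal protective equipment"}  # tasks noted during observation

CUTOFF = 3.0
survey_flags = {task for task, scores in self_ratings.items() if mean(scores) < CUTOFF}

# A task corroborated by all three data sources is a strong refresher-training candidate.
corroborated = survey_flags & interview_flags & observation_flags
print(sorted(corroborated))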

Data-Collection Process

Practitioners should plan carefully and agree on the plan for conducting the evaluation. The CDTF’s successful process required the evaluation team to travel to an onsite location for the evaluation and conduct the evaluation surveys as time allowed. This process included the following elements:
• Surveys to evaluate the job performer’s performance rating
• Surveys to evaluate supervisors’ responses to group performance
• Structured interviews of the job performers and their supervisors to obtain answers to the qualitative section of the survey
• Observations of job performance

Data Analysis

Using a database expedites the analysis process. The quantitative data were placed into a database that averages ratings for each task listed on the evaluation survey and provides other comparative data. These data provide a quick way to find tasks in which students do not feel proficient. This information raises a red flag that must be considered when the evaluation data are examined for action.

Each survey was also averaged to get a numerical rating for each individual. These numbers were compared to the supervisors’ numbers as a validity check. The qualitative responses were documented in the report, and comments that occurred frequently were compiled in the results section.
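A minimal sketch of that averaging and validity check, assuming the ratings have been exported from the database into a simple table, might look like the following; the column names and the 1.0-point disagreement tolerance are illustrative rather than the actual CDTF database design.

import pandas as pd

ratings = pd.DataFrame({
    "respondent": ["A", "A", "B", "B"],
    "task": ["Don PPE", "Shift turnover", "Don PPE", "Shift turnover"],
    "self_rating": [2, 4, 3, 5],
    "supervisor_rating": [2, 4, 2, 4],
})

# Average rating for each task across all respondents (used to spot weak tasks).
task_means = ratings.groupby("task")["self_rating"].mean()

# Average per individual, compared against the supervisor's average as a validity check.
person_means = ratings.groupby("respondent")[["self_rating", "supervisor_rating"]].mean()
person_means["gap"] = (person_means["self_rating"] - person_means["supervisor_rating"]).abs()
suspect = person_means[person_means["gap"] > 1.0]  # large gaps warrant a closer look

print(task_means)
print(suspect)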

All qualitative and quantitative data must be compiled and translated into meaningful information that the organization can easily digest. This information includes training program performance data and initial recommendations for organizational improvement initiatives.

Results Reporting

Communicating the results is the culmination of all the evaluation efforts. As such, it deserves the same level of attention that other aspects of the Level 3 evaluation process receive, and in certain cases possibly more.

In implementing the CDTF process, all stakeholders receive a copy of the report. This report includes both quantitative and qualitative findings and recommendations for further action. The report addresses student level of competence in accomplishing required job tasks and responses to the interview questions. The report also addresses supervisors’ reactions to student job performance and the effectiveness of the training in meeting the needs of the workforce.

Followup

Putting the information collected to use within the organization is the final key to Level 3 evaluation success or failure. These data can languish in a file somewhere or be put to immediate use for achieving desired outcomes. This final use of the evaluation data, the implementation of selected interventions, completes the improvement cycle.

At the CDTF, after all stakeholders have reviewed the report, they meet with the evaluation team. This group then decides what actions to pursue as a result of the evaluation and who will carry out those actions. The decisions made at this meeting become a formal part of the final report.

During the pilot implementation of the process, the message from those involved was consistent and clear: “We are going to be looking for changes in our work because of this process.” The practice is not finished when the report is done. The findings and recommendations are important to those who took time to provide the data, and the data should be made available and shared with all process stakeholders. The workforce is interested in seeing changes and improvements; thus, findings, recommendations, and the status of actions must be communicated to all levels of the organization.

The phrase “success breeds success” is quite true. By sharing data, communicating results, and showing the value of the practice, we have created an environment where a majority of the workforce looks forward to these evaluations. The quality of the data collected has improved with each evaluation.

Results

The inability of past programs to yield any significant changes at the performer level initially caused employees to view the program with skepticism. As the program progressed and employees began to see action taken to improve training and meet their needs, respondents gradually became more relaxed; an environment of open information sharing has been created. This environment continues to be nurtured by the success of the process, thereby easing the burden of data collection and improving the quality of the data collected. This positive relationship can continue as long as the workforce identifies the value of the practice. Organizational changes and improvements have included the following:


• Refresher Training for Operation and Maintenance of Plant Equipment: Several tasks were identified for refresher training. Due to the infrequent occurrence of some job tasks and employee turnover, the capability to operate and maintain certain equipment had become marginal. Refresher training fostered reacquisition of these skills and enabled job performers to conduct local maintenance on specified equipment. The refresher training caused a significant decrease in equipment downtime.

• New Training on the Maintenance of Plant Equipment: Equipment maintenance training previously provided by vendors is now available internally. Working with the vendor, several employees were certified to provide this training on site. The acquisition of this capability by the workforce has eliminated the cost of vendor-provided training and has increased workforce capability to maintain the subject equipment, thereby further reducing costs and decreasing equipment downtime.

• Development of New Courses: Development has been initiated in areas where the workforce has reported that initial training would have been beneficial and would have made the on-the-job training certification process easier.

• Incorporation of Simulators in Refresher Training: Several plant systems have been targeted for feasibility studies on options for the use of PC-based simulation training. These studies led to the development and implementation of process control system simulators at three national training sites.

Lessons Learned

Implementation of the Level 3 post-training evaluation has provided our organization with several lessons and insights into the design and implementation of this type of evaluation. The following specific lessons learned may be of value to other organizations wishing to implement this level of evaluation.

Align the Project

Too often, practitioners of Level 3 will enter an organization and fail to align evaluation processes with organizational goals or to design the process within an organizational context. Any evaluation that fails to accomplish this is doomed to failure or to providing the organization with a service of limited value.

Early CDTF experiences in the implementation of Level 3 were unsuccessful primarily because of an inability to align the process with business goals or to design the details of the evaluation in an organizational context. Once these tasks were identified and measures were taken to maintain a focus on alignment, the process was easier to implement and achieved its final successful results.

Identify Potential Organizational Constraints to the Practice

Constraints and assumptions concerning the implementation of this level of evaluation include a multitude of concerns, among them cost, time, negative impact on productivity, and the nature of the practice design. By maintaining alignment and keeping stakeholders informed and involved in each phase of the practice, the practitioner can address these concerns and develop viable solutions. In doing so, the assumptions concerning the validity and value of the practice can be dispelled.

Sell the Practice

Aggressive marketing of the practice to all levels of the organization is the starting point for success. The practitioner must involve internal and external process stakeholders in the orchestration and implementation and must communicate and sell the plan and its perceived value to the organization. This was a key process for success during the CDTF implementation of Level 3 evaluation.

Evaluation personnel attended meetings, presented plans, published articles in technical newsletters and publications, and used every opportunity to inform, communicate, and market the practice. By actively and aggressively pursuing advocacy of the plan, the practitioner can nurture top-down buyin, which is crucial to any program’s success.

Build Flexibility Into the Practice

The nature of Level 3 evaluation is such that no single model is the most expedient or efficient for every situation. Therefore, the model should be built with flexibility that allows it to be fine-tuned to meet the needs of a specific evaluation situation. The practitioner may select from different data-gathering techniques or may choose the statistical tool or analytical methodology that best supports his or her purpose.

For example, during the CDTF implementation, mail surveys were our first choice for data collection. We soon discovered that the best data were gleaned during face-to-face surveys and interviews at the worksite. Our practice was adjusted to meet this change. The organizational tool bag still contains the ability to mail out surveys and enact other data-gathering strategies, should we identify suitable environments for their deployment. In this fashion we can apply the appropriate tool or tools as needed.

Minimize the Negative Effects of the Data-Gathering Process

The effect of removing employees from the job to gather data was one of management’s primary concerns. In response, evaluation teams scheduled surveys and interviews for all shifts and on weekends to minimize the impact on normal work schedules. Planning and teamwork ensured that workers were surveyed, interviewed, and returned to work in an expedient manner. Exit surveys from the pilot implementation of this practice overwhelmingly noted that the amount of time spent in this process was sufficient and not excessive. Front-line supervisors reported that there was no noticeable negative impact to plant operations from the implementation of the process.

Summary

Implementation of an effective Level 3 evaluation took planning, flexibility, communication, and cooperation. Careful planning provided the model for an effective evaluation process. The program’s ability to change and revise methodologies helped the process mature and succeed in providing valid data. Communicating the process to all the stakeholders dispelled their fears of failure. Finally, maintaining alignment with organizational goals, as well as cooperation with and inclusion of all the stakeholders from the initial planning through the implementation, ensured the program’s success.

References

Babbie, E.R. (1992). The practice of social research (7th ed.). Belmont, CA: Wadsworth Publishing Company.

Fisher, B.A., & Ellis, D.G. (1990). Small group decision making: Communication and the group process (3rd ed.). New York: McGraw-Hill.

Harless, J.H. (1979). Front-end analysis of soft skills training. Newnan, GA: Harless Performance Guild.

Industry Report. (1996). Training, 33(10), 29.

Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.

Krein, T.J., & Weldon, K.C. (1994). Making a play for training evaluation. Training & Development, 62–67.

McMurrer, D.P., VanBuren, M.E., & Woodwell, W.H. (2000). State of the industry report 2000. Alexandria, VA: ASTD.

Patton, M.Q. (1990). Qualitative evaluation and research methods (2nd ed.). Thousand Oaks, CA: Sage Publications.

Pulley, M.L. (1994). Training 101. Training & Development,19–24.

Rummler, G.A., & Brache, A.P. (1990). Improving performance: How to manage the white space on the organization chart. San Francisco: Jossey-Bass.

Thomas Riley is an Instructional Systems Manager with General Physics Corporation, where he oversees instructional design for the Chemical Demilitarization Training Facility. He is a graduate of Southern Illinois University, Carbondale and the George Washington University. An adult educator and human performance improvement practitioner for two decades, his key interests have led to research and several articles in the areas of small group communication, adult learning, and training program assessment and evaluation. Recent accomplishments include the development of a comprehensive training evaluation program that has been successfully implemented at several Department of Defense facilities. He is a past board member for the ASTD/SEVA Training Institute and currently serves as Cochair of the ISPI Evaluation and Continuous Improvement Track. Thomas may be reached at [email protected].

Holly Davani is a Senior Instructional Psychologist for General Physics Corporation. She manages the Instructor Performance Evaluation Program, conducts interviews for the Post-Training Evaluation Program for the Chemical Demilitarization Training Facility, and develops and delivers inservice training to the CDTF instructional staff. She is also an adjunct professor in the Department of Psychology at Towson University. Holly is a graduate of Virginia Polytechnic Institute & State University and Drake University. She has made presentations at ISPI conferences on the topic of Level 3 evaluation design. She may be reached at [email protected].

Pat Chason is a Senior Instructional Designer with General Physics Corporation. She collects and analyzes data gathered for post-training evaluations. She teaches instructional design, tools, and techniques and performs instructor evaluations for inhouse personnel. Pat formerly directed a math and English tutoring program for Baltimore elementary and middle schools. Pat received her undergraduate degree in English/Education from Towson University. Her MA degree from the University of Maryland Baltimore County is in Instructional Systems Development. Pat presented a poster session on the format and development of evaluation instruments at the ISPI 2000 International Conference and took part in the 2001 ASTD International Conference and Exposition on the topic of Level 3 evaluation implementation. She may be reached at [email protected].

Ken Findley is the Contracting Officer’s Representative and Site Project Manager for the CDTF, where he oversees surveillance, quality control, cost accounting, and contractor engineering and conducts monitoring, evaluation, and audits of contractor activities for the Program Manager Chemical Demilitarization. Ken is a graduate of the US Naval Academy and West Virginia University. He may be reached at [email protected].