
Developing a Process-Evaluation Plan for Assessing Health Promotion Program Implementation: A How-To Guide

Ruth P. Saunders, PhD
Martin H. Evans, MS

Praphul Joshi, PhD, MPH

Process evaluation is used to monitor and document program implementation and can aid in understanding the relationship between specific program elements and program outcomes. The scope and implementation of process evaluation have grown in complexity as its importance and utility have become more widely recognized. Several practical frameworks and models are available to practitioners to guide the development of a comprehensive evaluation plan, including process evaluation for collaborative community initiatives. However, frameworks for developing a comprehensive process-evaluation plan for targeted programs are less common. Building from previous frameworks, the authors present a comprehensive and systematic approach for developing a process-evaluation plan to assess the implementation of a targeted health promotion intervention. Suggested elements for process-evaluation plans include fidelity, dose (delivered and received), reach, recruitment, and context. The purpose of this article is to describe and illustrate the steps involved in developing a process-evaluation plan for any health-promotion program.

Keywords: process evaluation; measuring program implementation

Much emphasis is placed on outcome evaluation to determine whether a health-promotion program was successful. Process evaluation, which helps us understand why a program was or was not successful, is equally important (Bartholomew, Parcel, Kok, & Gottlieb, 2001; Steckler & Linnan, 2002a). In fact, process evaluation may be used to confirm that the intervention was indeed implemented before using resources to assess its effectiveness (Scheirer, Shediac, & Cassady, 1995). A program's lack of success could be attributed to any number of program-related reasons, including poor program design, poor or incomplete program implementation, and/or failure to reach sufficient numbers of the target audience. Process evaluation looks inside the so-called black box to see what happened in the program and how that could affect program impacts or outcomes (Bouffard, Taxman, & Silverman, 2003; Harachi, Abbott, Catalano, Haggerty, & Fleming, 1999).

In recent years, an increasing emphasis has been placed on measuring program implementation, in part because of great variability in program implementation and policy adoption in school and community settings (Dusenbury, Brannigan, Falco, & Hansen, 2003; Harachi et al., 1999; Helitzer, Yoon, Wallerstein, & Garcia-Velarde, 2000; McGraw et al., 2000; Scheirer et al., 1995; Zapka, Goins, Pbert, & Ockene, 2004).

Authors' Note: Correspondence concerning this article should be addressed to Ruth P. Saunders; Department of Health Promotion, Education, and Behavior; Arnold School of Public Health; University of South Carolina; 800 Sumter Street; Columbia, South Carolina 29208. Phone: (803) 777-2871; e-mail: [email protected].

Health Promotion Practice, April 2005, Vol. 6, No. 2, 134-147. DOI: 10.1177/1524839904273387. © 2005 Society for Public Health Education.

The Authors

Ruth P. Saunders, PhD, is an associate professor in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Martin H. Evans, MS, is a doctoral candidate in the Department of Health Promotion, Education, and Behavior in the Arnold School of Public Health at the University of South Carolina in Columbia, South Carolina.

Praphul Joshi, PhD, MPH, is a research and planning administrator in the Division of Cardiovascular Health at the South Carolina Department of Environmental Control in Columbia, South Carolina.


Several practical frameworks and models to guide the development of comprehensive evaluation plans, including process evaluation for collaborative community initiatives, have been developed. Included among these are Prevention Plus III (Linney & Wandersman, 1991), Community Coalition Action Theory (Butterfoss & Kegler, 2002), Getting to Outcomes (Chinman et al., 2001), and the CDC Framework (Millstein, Wetterhall, & CDC Evaluation Working Group, 2000). Although comprehensive evaluation plans such as these are available to practitioners, frameworks for developing a comprehensive process-evaluation plan for targeted programs are less common (Steckler & Linnan, 2002a). Recent advances have occurred in identifying and clarifying the components of process evaluation (Baranowski & Stables, 2000; Steckler & Linnan, 2002b). Steckler and Linnan (2002a) also provided an overview of the key steps in planning process evaluation as well as examples of process evaluation from studies in a variety of settings (Steckler & Linnan, 2002b).

In this article, we work from these previously identified process-evaluation components, definitions, and steps to present a comprehensive, systematic, and user-friendly approach to planning process evaluation in public health interventions, with an emphasis on assessing targeted program implementation. Specifically, we provide an in-depth guide for developing a plan to assess program implementation, which is one component of a comprehensive program-evaluation plan (e.g., Step 6, program plan, and Step 7, quality of implementation, of the Getting to Outcomes framework; Chinman et al., 2001). The process presented in this article is especially suited for planning process evaluation for targeted programs that are implemented as part of a coalition activity (e.g., Level 2, coalition evaluation; Butterfoss & Francisco, 2004) or as a stand-alone program in a specific setting (e.g., schools, work sites). The development of the process-evaluation plan is illustrated with a proposed, school-based case study, Media Matters.

INTRODUCTION TO PROCESS-EVALUATION PLANNING

Ideally, process-evaluation planning is conducted with a collaborative planning team that includes key stakeholders with a multidisciplinary professional perspective and an understanding of the iterative nature of process-evaluation planning (Bartholomew et al., 2001; Butterfoss & Francisco, 2004; Steckler & Linnan, 2002a). Important issues that must be addressed when developing a process-evaluation plan include (a) understanding the health promotion program and how it is supposed to work, (b) defining the purposes for the process evaluation, and (c) considering program characteristics and context and how these may affect implementation.


FOREWORD FROM THE EDITORS

We are pleased to highlight Saunders, Evans, and Joshi's article, which offers a comprehensive, systematic approach for developing a process-evaluation plan. In this article, the authors describe the main elements of process-evaluation planning (fidelity, dose, reach, recruitment, and context) and its six steps (as adapted from Steckler & Linnan, 2002b). The real contribution of the work is that they then provide the reader with a step-by-step evaluation plan for a hypothetical school-based media training program, Media Matters.

Many health-promotion and disease-prevention programs are being implemented across the country with limited resources, often provided primarily by volunteers. We know that many of you who work in school and community settings are expected to document your programs and their effects, not to perform expensive, complex program evaluations. Effective process evaluation can be used to monitor program efforts so that programs are implemented as planned, use resources where needed, and change when appropriate. Process assessment data help provide accountability for administrators and funders and let planners know why the program worked or did not work. Process evaluation will help staff assess if the program is reaching its intended audience and provide a clear description of the program that was implemented so that it can be disseminated and shared with others. As always, one of the main goals of program evaluation is to make sure it is useful—to you, your administration, your funders, and your program participants. The following article is offered in the spirit of providing another practical, useful tool for our readers.

Frances Butterfoss and Vincent T. Francisco


Having a well-planned and theory-based health-promotion program is the beginning point for planning process evaluation. The approach to process-evaluation planning described in this article assumes that the intervention has been planned in detail with guidance from appropriate theories and/or a conceptual model.

Broadly speaking, process-evaluation data can be used for both formative and summative purposes. Formative uses of process evaluation involve using process-evaluation data to fine-tune the program (e.g., to keep the program on track; Devaney & Rossi, 1997; Helitzer et al., 2000; Viadro, Earp, & Altpeter, 1997). Summative uses of process evaluation involve making a judgment about the extent to which the intervention was implemented as planned and reached intended participants (Devaney & Rossi, 1997; Helitzer et al., 2000). This information, in turn, can be used to interpret and explain program outcomes, analyze how a program works, and provide input for future planning (Baranowski & Stables, 2000; Dehar, Casswell, & Duignan, 1993; McGraw et al., 1994, 1996). Beyond these broad purposes, a process-evaluation plan will also have specific purposes that are unique to the program for which it is being designed, as described in the next section.

Health-behavior change is made in ongoing social systems that typically involve participating agencies (e.g., schools), program implementers (e.g., teachers), a proximal target person (e.g., a student), and a distal target person (e.g., a parent; Baranowski & Stables, 2000). Assessment of program implementation requires taking into account the surrounding social systems, including characteristics of the organization in which the program is being implemented, characteristics of persons delivering the program (Viadro et al., 1997; Zapka et al., 2004), existing structures of the organizations and groups, organizational social system characteristics (e.g., interorganizational linkages, community agency partnerships; Scheirer et al., 1995; Zapka et al., 2004), and factors in the external environment (e.g., competing events, controversy about the program, external political factors, and history and events that happen concurrently with the program; Scheirer et al., 1995; Viadro et al., 1997; Zapka et al., 2004).


1. Describe the program
2. Describe complete & acceptable program delivery
3. Develop potential list of questions
4. Determine methods
5. Consider program resources, context & characteristics
6. Finalize the process-evaluation plan

(Steps 3 to 5 are considered iteratively.)

FIGURE 1 Steps in the Process-Evaluation Process

The Editors

Frances D. Butterfoss, PhD, is an associate professor and directs the Health Promotion and Disease Prevention Section at the Center for Pediatric Research, Norfolk, VA.

Vince Francisco, PhD, is associate professor of Public Health Education at the University of North Carolina, Greensboro.


The characteristics of the program are also an important influence on program implementation, including the program's age (new or old), size (large or small), coverage (single- or multisite; local, state, or national), and complexity (standardized or tailored intervention, single or multiple treatments; Viadro et al., 1997). Program characteristics and context affect process evaluation in at least two ways. First, important contextual factors should be identified and measured in process evaluation. Second, as program size and complexity increase, the resources required to monitor and measure implementation will increase.

STEPS FOR DEVELOPING A PROCESS-EVALUATION PLAN

Six steps to developing a process-evaluation plan for an intervention, adapted from Steckler and Linnan (2002a), are illustrated in Figure 1. As reflected in Figure 1, process-evaluation planning is an iterative rather than linear process, particularly Steps 3 to 5. Each step of the process is described in more detail and illustrated with a case study in the following section. The steps presented in Figure 1 incorporate a detailed understanding of the program and its characteristics, contextual factors, and the purposes of the process evaluation.

Step 1: Describe the Program

In Step 1, the previously planned program is described fully, including its purpose, underlying theory, objectives, strategies, and the expected impacts and outcomes of the intervention.

PROGRAM DESCRIPTION FOR MEDIA MATTERS

Media Matters is a school-based program designed to decrease adolescent risk behaviors by increasing individual and group empowerment among the participants. The overarching concept of the Media Matters program is that by developing an understanding of how media work—how they construct reality, create meaning, and influence behavior—students can increase their self-efficacy regarding their ability to deconstruct media messages and, thereby, become more critically aware of and resistant to the unhealthy behaviors that media sometimes reinforce. Because this curriculum includes a component in which students produce their own media, they also have an opportunity to put into practice the concepts and skills that they learn, such as cooperation, team building, and responsibility. A venue is also provided for the students to create their own messages to influence the social norms regarding behaviors and issues with which they are directly involved. The highly participatory classroom strategies include small group work, demonstration and feedback, problem solving, and role playing.

The theoretical framework of the program is based largely on Bandura's (1986) Social Cognitive Theory (SCT). The primary constructs of SCT on which the Media Matters curriculum focuses are self-efficacy, behavioral capability, situation, observational learning, knowledge, and attitudes. The primary objectives of the Media Matters intervention are to increase adolescents' self-efficacy for abstaining from alcohol and tobacco consumption and to influence the social norms within the school environment so that alcohol and tobacco consumption are viewed as unhealthy choices. The Media Matters logic model is provided below:

Inputs → Immediate Impacts → Short-Term Impacts → Behavioral Impacts → Health Outcomes

Providing training, materials, and consultation to teachers will result in the development of a Media Matters team and cross-disciplinary implementation of the Media Matters curriculum, which will result in changes in the school media environment, social norms, and development of students' self-efficacy and skills for abstaining from alcohol and tobacco use, which will result in reduced use of alcohol and tobacco and, ultimately, improved health in students.

FIGURE 2 Media Matters Case Study—Step 1: Describe the Program


COMPLETE AND ACCEPTABLE DELIVERY OF MEDIA MATTERS

The ideally implemented Media Matters program will consist of four essential components: an environmental component focusing on creating a Media Matters intervention team; two curriculum modules, Deconstructing Media Messages and Youth-Produced Media; and a teacher training component.

The Environmental Component

A Media Matters team will be formed at each school to develop a strategic plan for delivering the program across several disciplines within the eighth-grade curriculum, as documented by a written plan. Team membership will include at least one administrator, two teachers, and appropriate staff (media specialist/librarian). Additionally, the school environment will be supportive of media literacy/education by (a) strongly limiting the children's exposure to commercial media such as advertisements within the school (i.e., Channel 1 closed-circuit TV, etc.) and (b) providing exhibition space and/or presentation time for counteradvertising media messages produced by eighth-grade students.

The Curriculum Component

The Media Matters curriculum comprises two primary modules—Deconstructing Media Messages and Youth-Produced Media—and is designed to be delivered during an 8-week period, with an average time of 1.5 to 2 hours per week spent directly on the curriculum. The curriculum is designed to be highly participatory. Implementers are encouraged to engage the student participants in a manner that promotes critical thinking and discussion. These qualities will become more vital in the latter portion of the intervention, when students are required to perform some role-playing activities and work in teams.

For implementation to be considered ideal, the curriculum should be delivered across multiple disciplines within the eighth-grade curriculum, one of which must include health. A likely combination of disciplines might include social studies, language arts, and health promotion/prevention. The curriculum components are designed to be delivered in sequential order because the knowledge and skills taught in the second require mastering those taught in the first.

The Deconstructing Media Messages module will contain, at a minimum, the following:

• Instructors will deliver the Media Messages curriculum, in which students view media examples with in-class discussion.
• Instructors will assign students a self-reflection assignment regarding media influence on personal behavior.
• Instructors will engage students in role-playing activities in which they view a scene from a movie depicting a particular risk behavior and then act out how the character could have better handled the situation. They will then enact the consequences of the behaviors.

The Youth-Produced Media module will be characterized by:

• Students working in teams to produce counteradvertising messages designed to promote healthy decisions related to alcohol and tobacco consumption.
• Students participating in the entire production process—brainstorming, scripting, storyboarding, and production.
• Students analyzing and critiquing each group's final product based on its use of the various techniques to increase its effectiveness.
• Counteradvertising messages will be publicly showcased.

The Training Component

Media Matters teachers will obtain needed skills (i.e., effective use of media in the classroom and teaching media production skills) through training sessions delivered by Media Matters program developers. Training will include 1 week during summer and booster training (a half day) in January. Additional consultation and support will be provided as needed. All necessary program materials will be provided during training.

The training will be characterized by:

• Having the appropriate teachers present (reflecting multiple disciplines).
• Coverage of primary content.
• Demonstration of interactive classroom strategies.
• Involvement of teachers through small group discussion, demonstration and feedback, role play, and developing counter messages.

The training will also address the Media Matters team, including:

• Appropriate team membership.
• Effective team functioning.
• The Media Matters team's role and activities in the school.
• Involving teachers, administrators, and staff outside of the Media Matters team.

FIGURE 3 Media Matters Case Study—Step 2: Describe Complete and Acceptable Delivery of the Program


Ideally, this should be conveyed in a logic model that specifies the theoretical constructs of interest, those expected to change, and mediators of the change process (Scheirer et al., 1995; Steckler & Linnan, 2002a). A description of Media Matters, the fictional case example we will use to illustrate process-evaluation planning, is provided in Figure 2. Media Matters is a school-based program with environmental and curricular components designed to reduce alcohol and tobacco use among youth.

Step 2: Describe Complete and Acceptable Delivery of the Program

The elements that comprise the program are described in more detail in the second step of process-evaluation planning; this includes specific strategies, activities, media products, and staff behaviors (Scheirer et al., 1995). The goal of this step is to state what would be entailed in complete and acceptable delivery of the program (Bartholomew et al., 2001). A description of complete and acceptable delivery of the program should be based on the details of the program (e.g., program components, theory, and elements in the logic model) and guided by an external framework such as the recommended elements of a process-evaluation plan.


TABLE 1
Elements of a Process-Evaluation Plan, With Formative and Summative Applications

Fidelity (quality)
  Purpose: Extent to which the intervention was implemented as planned.
  Formative uses: Monitor and adjust program implementation as needed to ensure theoretical integrity and program quality.
  Summative uses: Describe and/or quantify fidelity of intervention implementation.

Dose delivered (completeness)
  Purpose: Amount or number of intended units of each intervention or component delivered or provided by interventionists.
  Formative uses: Monitor and adjust program implementation to ensure all components of the intervention are delivered.
  Summative uses: Describe and/or quantify the dose of the intervention delivered.

Dose received (exposure)
  Purpose: Extent to which participants actively engage with, interact with, are receptive to, and/or use materials or recommended resources; can include "initial use" and "continued use."
  Formative uses: Monitor and take corrective action to ensure participants are receiving and/or using materials/resources.
  Summative uses: Describe and/or quantify how much of the intervention was received.

Dose received (satisfaction)
  Purpose: Participant (primary and secondary audiences) satisfaction with the program and interactions with staff and/or investigators.
  Formative uses: Obtain regular feedback from primary and secondary targets and use feedback as needed for corrective action.
  Summative uses: Describe and/or rate participant satisfaction and how feedback was used.

Reach (participation rate)
  Purpose: Proportion of the intended priority audience that participates in the intervention; often measured by attendance; includes documentation of barriers to participation.
  Formative uses: Monitor numbers and characteristics of participants; ensure sufficient numbers of the target population are being reached.
  Summative uses: Quantify how much of the intended target audience participated in the intervention; describe those who participated and those who did not.

Recruitment
  Purpose: Procedures used to approach and attract participants at individual or organizational levels; includes maintenance of participant involvement in the intervention and measurement components of the study.
  Formative uses: Monitor and document recruitment procedures to ensure the protocol is followed; adjust as needed to ensure reach.
  Summative uses: Describe recruitment procedures.

Context
  Purpose: Aspects of the environment that may influence intervention implementation or study outcomes; includes contamination or the extent to which the control group was exposed to the program.
  Formative uses: Monitor aspects of the physical, social, and political environment and how they impact implementation and needed corrective action.
  Summative uses: Describe and/or quantify aspects of the environment that affected program implementation and/or program impacts or outcomes.

NOTE: Adapted from Steckler and Linnan (2002a) and Baranowski and Stables (2000).


TABLE 2
Sample Process-Evaluation Questions for Fidelity, Dose Delivered, Dose Received, Reach, Recruitment, and Context

Fidelity
  1. Possible question: To what extent was the intervention implemented consistently with the underlying theory and philosophy?
     Information needed: What constitutes high-quality delivery for each component of the intervention? What specific behaviors of staff reflect the theory and philosophy?
  2. Possible question: To what extent was training provided as planned (consistent with the underlying theory and/or philosophy)?
     Information needed: What behaviors of trainers convey the underlying theory and philosophy?

Dose delivered
  3. Possible question: To what extent were all of the intended units or components of the intervention or program provided to program participants?
     Information needed: How many units/components (and subcomponents as applicable) are in the intervention?
  4. Possible question: To what extent were all materials (written and audiovisual) designed for use in the intervention used?
     Information needed: What specific materials are supposed to be used and when should they be used?
  5. Possible question: To what extent was all of the intended content covered?
     Information needed: What specific content should be included and when should it be covered? What is the minimum and maximum time to spend on the content?
  6. Possible question: To what extent were all of the intended methods, strategies, and/or activities used?
     Information needed: What specific methods, strategies, and/or activities should be used in what sessions?

Dose received
  7. Possible question: To what extent were participants present at intervention activities engaged in the activities?
     Information needed: What participant behaviors indicate being engaged?
  8. Possible question: How did participants react to specific aspects of the intervention?
     Information needed: With what specific aspects of the intervention (e.g., activities, materials, training, etc.) do we want to assess participant reaction or satisfaction?
  9. Possible question: To what extent did participants engage in recommended follow-up behavior?
     Information needed: What are the expected follow-up behaviors: reading materials, engaging in recommended activities, or using resources?

Reach
  10. Possible question: What proportion of the priority target audience participated in (attended) each program session? How many participated in at least one half of possible sessions?
      Information needed: What is the total number of people in the priority population?

Recruitment
  11. Possible question: What planned and actual recruitment procedures were used to attract individuals, groups, and/or organizations?
      Information needed: What mechanisms should be in place to document recruitment procedures?
  12. Possible question: What were the barriers to recruiting individuals, groups, and organizations?
      Information needed: How will we systematically identify and document barriers to participation?
  13. Possible question: What planned and actual procedures were used to encourage continued involvement of individuals, groups, and organizations?
      Information needed: How will we document efforts for encouraging continued involvement in the intervention?
  14. Possible question: What were the barriers to maintaining involvement of individuals, groups, and organizations?
      Information needed: What mechanisms should be in place to identify and document barriers encountered in maintaining involvement of participants?

Context
  15. Possible question: What factors in the organization, community, social/political context, or other situational issues could potentially affect either intervention implementation or the intervention outcome?
      Information needed: What approaches will be used to identify and systematically assess organizational, community, social/political, and other contextual factors that could affect the intervention? Once identified, how will these be monitored?


These elements include fidelity (quality of implementation), dose (dose delivered, the amount of the program delivered by implementers, and dose received, the extent to which participants receive and use materials or other resources), and reach (the degree to which the intended priority audience participates in the intervention; Steckler & Linnan, 2002a). These four elements, plus two additional elements, are described in more detail in Step 3 below.

Describing fidelity, or what constitutes high-quality implementation, is often a challenging task (Steckler & Linnan, 2002a). Theory can provide a possible guide for defining fidelity. To illustrate, based on theory, two possible sources of efficacy information are direct experience (experiencing success) and vicarious experience (observing models similar to oneself experiencing success; Bandura, 1986). This suggests two intervention strategies: modeling (participants observing persons similar to themselves practicing a skill) and guided practice (opportunity for participants to practice the skill with constructive feedback and in such a way that they experience success). Fidelity pertains to how well the implementation of these strategies reflects the spirit of the theory. For example, the models demonstrating the skills should be people that the participants can identify with, and the verbal and nonverbal behavior of staff leading the guided practice should be emotionally positive, providing gentle redirection only as needed and carefully reinforcing the aspects of the skills practice that are done correctly. In Figure 3, the expected characteristics of the four components of Media Matters are described to illustrate Step 2.


POTENTIAL PROCESS-EVALUATION QUESTIONS FOR MEDIA MATTERS

Through a series of meetings, program planners developed a list of potential process-evaluation questions that included the following:

Fidelity

• To what extent was each of the program elements implemented as planned (as described in "complete and acceptable delivery" in Figure 3)?

Dose Delivered

• Were all intervention components delivered (e.g., did teachers deliver all units of the curriculum)?

Dose Received

• To what extent did instructors participate on the Media Matters team (i.e., coordinate with other team members on a strategic plan for delivering the Media Matters program across several disciplines)?
• To what extent did instructors make changes in their own curriculum to incorporate Media Matters modules?
• Did the school remove commercial advertising/media from the school?
• Did students enjoy the Media Matters curriculum and activities?
• Were the Media Matters instructors in the intervention classes satisfied with the curriculum?

Reach

• Was the Media Matters curriculum delivered to at least 80% of the eighth-grade students?
• To what extent did Media Matters instructors participate in training?

Recruitment

• What procedures were followed to recruit teachers to training and to develop the Media Matters team?

Context

Organizational Factors

• Did the school allow common meeting time for Media Matters team planning?
• Did the school provide release time for implementers to attend trainings?

Other

• What other barriers and facilitators influenced delivery of the Media Matters program in the schools?

FIGURE 4 Media Matters Case Study—Step 3: Develop List of Potential Process-Evaluation Questions


Step 3: Develop a List of Potential Process-Evaluation Questions

In Step 3, an initial wish list of possible process-evaluation questions is drafted based on the program, without full consideration of the resources needed. (Alternatively, these can be stated as process objectives.) This initial draft of process-evaluation questions can be organized by intervention component and guided by the elements of a process-evaluation plan. The elements (or components) of a process-evaluation plan are provided in Table 1 and include fidelity, dose delivered, dose received, reach, recruitment, and context (developed from Baranowski & Stables, 2000; Steckler & Linnan, 2002a). Each component can be used for both formative and summative purposes, as shown in the table.

To illustrate, a sampling of potential process-evaluation questions for a so-called generic school-based program is presented in Table 2; also noted is the information needed to answer each process-evaluation question, which should also be identified in this step. As noted previously, defining fidelity can be challenging and should be based on the underlying theory and philosophy of the program. Specifying dose delivered is fairly straightforward after the components of the intervention are defined because this pertains to the parts of the intervention that staff deliver. Dose received, the expected participant reaction or involvement, is somewhat more challenging in that it requires thoughtful consideration of the expected reactions and/or behaviors in participants. Note that dose received is different from reach, discussed below! Although the quality of training is presented as a fidelity question in Table 2, it can also be considered a separate intervention component in which dose, reach, and fidelity are assessed, as in the case example (see Figure 3).

Process evaluation for reach, recruitment, and context often involves documentation and record keeping. Part of the planning for these elements involves specifying the specific elements to document or monitor. Attendance or records of participation often assess the reach of the intervention into the priority population. However, attendance records must be compared with the number of potential program participants (the number of people in the priority population) to be meaningful.


TABLE 3
Issues to Consider When Planning Process-Evaluation Methods

Design
  General definition: Timing of data collection: when and how often data are to be collected.
  Examples of quantitative and qualitative approaches: Observe classroom activities at least twice per semester, with at least 2 weeks between observations. Conduct focus groups with participants in the last month of the program.

Data sources
  General definition: Source of information (e.g., who will be surveyed, observed, interviewed, etc.).
  Examples of quantitative and qualitative approaches: For both quantitative and qualitative approaches, data sources include participants, teachers/staff delivering the program, records, the environment, etc.

Data collection tools or measures
  General definition: Instruments, tools, and guides used for gathering process-evaluation data.
  Examples of quantitative and qualitative approaches: For both quantitative and qualitative approaches, tools include surveys, checklists, observation forms, interview guides, etc.

Data collection procedures
  General definition: Protocols for how the data-collection tool will be administered.
  Examples of quantitative and qualitative approaches: Detailed description of how to do quantitative and/or qualitative classroom observation, face-to-face or phone interviews, mailed surveys, etc.

Data management
  General definition: Procedures for getting data from the field and entered; quality checks on raw data forms and data entry.
  Examples of quantitative and qualitative approaches: Staff turn in participant sheets weekly; the process-evaluation coordinator collects and checks surveys and gives them to data entry staff. Interviewers transcribe information and turn in tapes and complete transcripts at the end of the month.

Data analysis/synthesis
  General definition: Statistical and/or qualitative methods used to analyze, synthesize, and/or summarize data.
  Examples of quantitative and qualitative approaches: Statistical analysis and software that will be used to analyze quantitative data (e.g., frequencies and chi-squares in SAS). Type of qualitative analysis and/or software that will be used (e.g., NUD*IST to summarize themes).


Note that reach can be reported for each session or activity within each component of the program as well as overall for the program. Effective process evaluation for recruitment involves a clear articulation of recruitment procedures as well as developing mechanisms to document recruitment activities and barriers to recruitment. Similarly, an effective assessment of contextual issues requires that the team identify potential factors in the organizational, community, and political/social environment that may affect program implementation or program outcome and that they develop effective mechanisms to monitor or assess these factors. See Figure 4 for the Media Matters process-evaluation questions from Step 3.
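Because reach is ultimately a proportion of the priority population, it can help to sketch the calculation while planning data collection. The fragment below is a minimal illustration, not part of the original article, of how attendance records might be summarized in Python; the roster, session count, and priority-population figure are hypothetical, and the "at least one half of possible sessions" rule echoes question 10 in Table 2.

```python
# Minimal sketch (hypothetical data): summarizing reach from attendance records.
# Reach = number of participants / number of people in the priority population.

priority_population = 300   # assumed size of the priority audience
total_sessions = 8          # assumed number of program sessions

# Sessions attended by each student who attended at least once (hypothetical ids)
attendance = {
    "s001": {1, 2, 3, 4, 5, 6, 7, 8},
    "s002": {1, 2, 4, 5},
    "s003": {2, 3},
}

# Overall reach: attended at least one session
overall_reach = len(attendance) / priority_population

# Per-session reach
per_session_reach = {
    s: sum(1 for sessions in attendance.values() if s in sessions) / priority_population
    for s in range(1, total_sessions + 1)
}

# Stricter reach: attended at least half of the possible sessions (Table 2, question 10)
at_least_half = sum(1 for sessions in attendance.values()
                    if len(sessions) >= total_sessions / 2)
reach_half = at_least_half / priority_population

print(f"Overall reach: {overall_reach:.1%}")
print(f"Reach (>= half of sessions): {reach_half:.1%}")
print(f"Session 1 reach: {per_session_reach[1]:.1%}")
```

However the tallying is done, the denominator (the full priority population) must come from rosters or enrollment records rather than from attendance sheets alone.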

Step 4: Determine Methods for Process Evaluation

In Step 4, the team begins to consider the methods that will be used to answer each question in the wish list of process-evaluation questions. Considering methods without also considering the resources needed to carry out the process evaluation is difficult.


Process-Evaluation Methods

Because the process evaluation of the Media Matters program will be for both formative and summative purposes, one of the first issues to consider will be the timing for data collection and reporting. This required the planners to develop a data-collection, analysis, and reporting scheme that would assure that the program planners would receive formative feedback in a timely manner so that they could make adjustments to the program if necessary. Media Matters would also require a highly trained and coordinated data-collection and management staff. However, to maximize internal validity, evaluation staff will not perform both formative and summative assessments; an independent evaluation team that had no part in the intervention design or planning would be utilized for summative measures.

Potential process-evaluation methods (data sources, data-collection tools, and timing) for the Media Matters curriculum are summarized below:

Implementation fidelity for the Media Matters curriculum. Possible data sources and methods include reports from teachers implementing the curriculum and Media Matters staff observation; both require developing a checklist of the expected characteristics of implementation.

Dose delivered for the Media Matters curriculum. Possible data sources and methods include reports from teachers implementing the curriculum and Media Matters staff observation; both require developing a checklist of content to be covered and methods to be used in the curriculum.

Dose received. Possible data sources include teachers, staff, administrators, and students in the school; methods and tools include administering brief satisfaction scales and conducting interviews or focus groups with open-ended questions.

Reach. Data sources are the classes in which the Media Matters curriculum is taught; for each Media Matters session, the teacher could write down and report the head count, have students sign in on a sign-in sheet, or check students' names off on a class roll.

Recruitment. Media Matters staff document all activities involved in identifying and recruiting teachers for the Media Matters curriculum training.

Context. Possible data sources include school teachers, staff, and administrators. The primary method and tool are interviews with open-ended questions to assess barriers to implementation.

FIGURE 5 Media Matters Case Study—Step 4: Determine Methods for Process Evaluation

Program Resources, Context, and Characteristics

The Media Matters program is somewhat complex in terms of size and researcher control of the intervention. The initial study will involve two treatment and two comparison middle schools, each having between 250 and 350 eighth graders. Some aspects of the program are standardized (e.g., the curriculum), whereas other aspects (e.g., the environmental component) are tailored to the specific school. One of the primary issues that program planners had to consider when prioritizing and developing the process-evaluation questions was project staff and respondent burden. Because the program is to be delivered within schools by instructors, training the instructors in the use of the curriculum is essential. Additionally, planners wanted to minimize the burden placed on these instructors in terms of collecting process-evaluation data or intruding on them a great deal during class times for observations. However, the planners wanted to ensure that the process data had a high level of reliability and validity. Finally, because of budget constraints, the project could devote only 1.5 FTE staff for process evaluation (coordination, data collection, management, entry, and analysis).

FIGURE 6 Media Matters Case Study—Step 5: Consider Program Resources, Context, and Characteristics


TABLE 4
Final Process-Evaluation Plan for Media Matters Curriculum Implementation

Fidelity
  Process-evaluation question: 1. To what extent was the curriculum implemented as planned?
  Data sources: Teachers and Media Matters staff.
  Tools/procedures: Self-reported checklist and observation with checklist.
  Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled.
  Data analysis or synthesis: Calculate score based on percentage of intended characteristics included.
  Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall.

Dose delivered
  Process-evaluation question: 2. To what extent were all modules and units within the curriculum implemented?
  Data sources: Teachers and Media Matters staff.
  Tools/procedures: Self-reported checklist and observation with checklist.
  Timing of data collection: Teachers report weekly; at least two observations per teacher scheduled.
  Data analysis or synthesis: Calculate score based on percentage of intended modules and units included.
  Reporting: Formative—informal feedback to staff weekly; summative—summarized by module, school, and overall.

Dose received
  Process-evaluation questions: 3. Did students enjoy the Media Matters curriculum and activities? 4. Were Media Matters instructors satisfied with the curriculum?
  Data sources: Teachers and students.
  Tools/procedures: Focus groups with open-ended questions for teachers; brief satisfaction scales administered to students.
  Timing of data collection: Focus groups and scales administered at the end of the semester in which the curriculum was implemented.
  Data analysis or synthesis: Teachers—themes identified through qualitative analysis; students—response frequencies summarized.
  Reporting: Summative—reported after curriculum implementation is completed.

Reach
  Process-evaluation question: 5. Was the intervention delivered to at least 80% of the eighth-grade students?
  Data sources: Teachers.
  Tools/procedures: Classroom rosters.
  Timing of data collection: Taken for each class in which Media Matters is taught.
  Data analysis or synthesis: Number of students participating in at least 80% of class sessions / total number of students.
  Reporting: Formative—report weekly by class; summative—report by module, school, and overall.

Recruitment
  Process-evaluation question: 6. What procedures were followed to recruit teachers to Media Matters training?
  Data sources: Media Matters staff.
  Tools/procedures: Media Matters staff document all recruitment activities.
  Timing of data collection: Daily.
  Data analysis or synthesis: Narrative description of procedures.
  Reporting: Formative—examined weekly to prevent/solve problems; summative—described for project overall.

Context
  Process-evaluation question: 7. What were barriers and facilitators to implementing the Media Matters curriculum?
  Data sources: Teachers.
  Tools/procedures: Focus groups with open-ended questions for teachers.
  Timing of data collection: Administered at the end of the semester in which the curriculum was implemented.
  Data analysis or synthesis: Themes identified through qualitative analysis.
  Reporting: Summative—reported after curriculum implementation is completed.
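The analysis column for the fidelity and dose-delivered rows of Table 4 reduces to a simple percentage: the share of intended characteristics (or modules and units) that were actually observed or reported. The sketch below is not part of the original article; it shows one way such a score could be computed, and the checklist items and session records are hypothetical.

```python
# Hypothetical sketch: fidelity score as the percentage of intended
# characteristics included, following the Table 4 analysis rule.

# Assumed checklist of intended characteristics for one curriculum session
CHECKLIST = [
    "covered_intended_content",
    "used_in_class_discussion",
    "assigned_self_reflection",
    "included_role_play",
]

# One record per observed or self-reported session: item -> present?
session_records = [
    {"covered_intended_content": True, "used_in_class_discussion": True,
     "assigned_self_reflection": True, "included_role_play": False},
    {"covered_intended_content": True, "used_in_class_discussion": True,
     "assigned_self_reflection": False, "included_role_play": True},
]

def fidelity_score(record):
    """Fraction of checklist items present in a single session record."""
    return sum(bool(record.get(item)) for item in CHECKLIST) / len(CHECKLIST)

scores = [fidelity_score(r) for r in session_records]
print("Per-session fidelity:", [f"{s:.0%}" for s in scores])
print(f"Mean fidelity: {sum(scores) / len(scores):.0%}")
```

The same tallying logic could be summarized by module, by school, and overall, as the reporting column of Table 4 suggests.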


Accordingly, Steps 4 and 5 are often considered simultaneously. However, for clarity of presentation, they are presented separately here.

Primary issues to consider in planning process-evaluation methods include design (when data are to be collected), data sources (from where the information will come), tools or measures needed to collect data, data-collection procedures, data-management strategies, and data-analysis or data-synthesis plans (see Table 3). Both qualitative and quantitative data-collection approaches are used in process evaluation. Common qualitative data-collection methods include open-ended questions in interviews and focus groups, logs, case studies, document review, open-ended surveys, and content analysis of videotapes; common quantitative methods include surveys, direct observation, checklists, attendance logs, self-administered forms and questionnaires, and project archives (Bartholomew et al., 2001; Devaney & Rossi, 1997; McGraw et al., 2000; Steckler & Linnan, 2002a). See Baranowski and Stables (2000) for a more detailed presentation of qualitative and quantitative aspects of data collection for each component of process evaluation.

The selection of specific methods is based on the process-evaluation questions (or objectives), the resources available, and the characteristics and context of the program. Using multiple methods to collect data is recommended because different data sources may yield different conclusions (e.g., teacher report of classroom activity vs. observation of classroom activity; Bouffard et al., 2003; Helitzer et al., 2000; Resnicow et al., 1998). How the information will be used (for formative, immediate purposes or summative, longer term purposes) and the turnaround time between data collection and reporting are critical issues to consider. Data that cannot be collected in a timely manner for their intended purpose may not need to be collected! Another issue concerns the level of objectivity needed for data collection and whether to use internal and/or external data collectors (Helitzer & Yoon, 2002). For example, although the same instruments or tools may be used for both purposes, different staff and reporting systems may be needed for summative and formative process evaluations.

Because process evaluation can generate voluminous amounts of information, planning data management and analysis is essential. Where do data go after they are collected? Who enters data? What is the protocol for data entry? Having an analysis and reporting plan is equally important. What type of data analysis will be conducted (e.g., frequency counts, qualitative analysis, means)? Who analyzes the data? How long will data analysis take? When will summary reports be generated? Who receives summary reports? When are the reports needed? See Figure 5 for the preliminary process-evaluation methods for Media Matters from Step 4.
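To make the analysis and reporting questions concrete, the brief sketch below (not from the original article) tallies frequency counts for a hypothetical student satisfaction item, the kind of simple quantitative summary named above; the item wording, response scale, and values are assumptions.

```python
# Hypothetical sketch: frequency counts for one student satisfaction item
# ("I enjoyed the Media Matters activities"), as a simple summative summary.
from collections import Counter

RESPONSE_SCALE = ["strongly disagree", "disagree", "agree", "strongly agree"]

# Assumed responses collected at the end of the semester
responses = ["agree", "strongly agree", "agree", "disagree",
             "strongly agree", "agree", "agree", "strongly agree"]

counts = Counter(responses)
total = len(responses)

for category in RESPONSE_SCALE:
    n = counts.get(category, 0)
    print(f"{category:>17}: {n:2d} ({n / total:.0%})")
```

A plan of this kind, however simple, also answers the logistical questions above by fixing who produces the summary and when it is reported.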

Step 5: Consider Program Resources and Program Characteristics and Context

In Step 5, the team considers the resources needed to answer the potential process-evaluation questions listed in Step 3 using the methods proposed in Step 4. More than likely, the team has already begun to consider resource limitations even as the wish list of questions is growing! In addition to resources such as skilled staff, the team needs to consider the characteristics of the program itself.

The Final Process-Evaluation Plan

After carefully weighing all the potential process questions that the program planners would ideally ask against the complexity of a program with limited resources, the team decided to focus on fidelity of implementation of the curriculum, reach of the curriculum into the student population, dose delivered (extent of curriculum implementation), dose received (satisfaction among students and teachers), and barriers to implementation (context).

The process-evaluation methods (data sources, tools/procedures, timing of data collection, data analysis/synthesis, and reporting) are summarized in Table 4. For each process-evaluation question, the most appropriate data sources have been identified, and the instruments and methods used to collect those data are outlined. For example, Media Matters program planners determined that the most appropriate method of assessing program fidelity (answering the question, "Was the program implemented as planned?") was to use a combination of periodic observations by evaluation staff during program implementation and reports collected from the instructors implementing the program. The decision to conduct periodic observations necessitated the development of an observation instrument (checklist) and protocol for adequately evaluating the implementation sessions. To ensure reliability and validity of the evaluation instruments, an evaluation protocol was developed and outlined during training sessions delivered to the evaluation staff. Additionally, the evaluation instruments were pilot-tested prior to actual implementation of the program.

FIGURE 7 Media Matters Case Study—Step 6: Finalize the Process-Evaluation Plan


Programs that are longer, more complex, meet frequently, and/or have large numbers of participants will require more resources, including time, to implement and to monitor. Certain settings (e.g., schools) may have scheduling and other constraints. If the intervention involves working collaboratively with a number of organizations or groups, complexity increases.

Resource considerations include the availability of qualified staff to develop and implement all aspects of the process evaluation as well as the time needed for planning, pilot-testing instruments and protocols, data collection, entry, analysis, and reporting. Program planners must also consider the feasibility of process data collection within the context of the intervention (e.g., disruptiveness to the intervention or the organization's regular operations) as well as staff and respondent burden. If new data-collection tools are needed, consider the time, staff, and level of expertise required. Most projects find it necessary to prioritize and reduce the number of process-evaluation questions to accommodate project resources. See Figure 6 for the Media Matters analysis of program resources, characteristics, and context in Step 5.

Step 6: Finalize the Process-Evaluation Plan

The final process-evaluation plan emerges from the iterative team-planning process described in Steps 3 to 5. Although priorities for process evaluation will vary in different projects, Steckler and Linnan (2002a) recommended a minimum of four elements of process evaluation: dose delivered, dose received, reach, and fidelity. They also recommended documenting recruitment procedures and describing the context of the intervention. In the final plan, the process-evaluation questions should be described for each component of the intervention and may vary somewhat by component. See Figure 7 and Table 4, which illustrate Step 6 for the Media Matters curriculum component. Table 4 provides a template for a final process-evaluation plan.

CONCLUSIONS

The components of and steps involved in process-evaluation planning have been identified in previous studies. After considering issues concerning process-evaluation purposes, process-evaluation methods, and the context in which health-promotion programs are implemented, we present a six-step approach to planning process evaluation, illustrated with a fictional case study. This description of how to plan process evaluation can provide further guidance for practitioners who wish to develop comprehensive process-evaluation plans for health-promotion programs.

REFERENCES

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Englewood Cliffs, NJ: Prentice Hall.

Baranowski, T., & Stables, G. (2000). Process evaluations of the 5-a-day projects. Health Education & Behavior, 27, 157-166.

Bartholomew, L. K., Parcel, G. S., Kok, G., & Gottlieb, N. H. (2001). Intervention mapping: Designing theory and evidence-based health promotion programs. New York: McGraw-Hill.

Bouffard, J. A., Taxman, F. S., & Silverman, R. (2003). Improving process evaluations of correctional programs by using a comprehensive evaluation methodology. Evaluation and Program Planning, 26(2), 149-161.

Butterfoss, F., & Kegler, M. (2002). Toward a comprehensive understanding of community coalitions: Moving from practice to theory. In R. DiClemente, L. Crosby, & M. C. Kegler (Eds.), Emerging theories in health promotion practice and research (pp. 157-193). San Francisco: Jossey-Bass.

Butterfoss, F. D., & Francisco, V. T. (2004). Evaluating community partnerships and coalitions with practitioners in mind. Health Promotion Practice, 5(2), 108-114.

Chinman, M., Imm, P., Wandersman, A., Kaftarian, S., Neal, J., Pendleton, K. T., et al. (2001). Using the Getting To Outcomes (GTO) model in a statewide prevention initiative. Health Promotion Practice, 2(4), 302-309.

Dehar, M. A., Casswell, S., & Duignan, P. (1993). Formative and process evaluation of health promotion and disease prevention programs. Evaluation Review, 17(2), 204-220.

Devaney, B., & Rossi, P. (1997). Thinking through evaluation design options. Children and Youth Services Review, 19(7), 587-606.

Dusenbury, L., Brannigan, R., Falco, M., & Hansen, W. B. (2003). A review of research on fidelity of implementation: Implications for drug abuse prevention in school settings. Health Education Research, 18(2), 237-256.

Harachi, T. W., Abbott, R. D., Catalano, R. F., Haggerty, K. P., & Fleming, C. B. (1999). Opening the black box: Using process evaluation measures to assess implementation and theory building. American Journal of Community Psychology, 27(5), 711-731.

Helitzer, D., & Yoon, S. (2002). Process evaluation of the Adolescent Social Action Program in New Mexico. In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 83-113). San Francisco: Jossey-Bass.

Helitzer, D., Yoon, S., Wallerstein, N., & Garcia-Velarde, L. D. (2000). The role of process evaluation in the training of facilitators for an adolescent health education program. Journal of School Health, 70(4), 141-147.

Linney, J. A., & Wandersman, A. (1991). Prevention Plus III: Assessing alcohol and other drug prevention programs at the school and community level: A four-step guide to useful program assessment. Washington, DC: U.S. Department of Health and Human Services.

McGraw, S. A., Sellers, D. E., Stone, E. J., Bebchuk, J., Edmundson, E. W., Johnson, C. C., et al. (1996). Using process evaluation to explain outcomes: An illustration from the Child and Adolescent Trial for Cardiovascular Health (CATCH). Evaluation Review, 20(3), 291-312.

McGraw, S. A., Sellers, D. E., Stone, E. J., Resnicow, K., Kuester, S., Fridinger, F., et al. (2000). Measuring implementation of school programs and policies to promote healthy eating and physical activity among youth. Preventive Medicine, 31, S86-S97.


McGraw, S. A., Stone, E. J., Osganian, S. K., Elder, J. P., Perry, C. L., Johnson, C. C., et al. (1994). Design of process evaluation within the Child and Adolescent Trial for Cardiovascular Health (CATCH). Health Education Quarterly, Suppl. 2, S5-S26.

Millstein, B., Wetterhall, S., & CDC Evaluation Working Group. (2000). A framework featuring steps and standards for program evaluation. Health Promotion Practice, 1(3), 221-228.

Resnicow, K., Davis, M., Smith, M., Lazarus-Yaroch, A., Baranowski, T., Baranowski, J., et al. (1998). How best to measure implementation of school health curricula: A comparison of three measures. Health Education Research, 13(2), 239-250.

Scheirer, M. A., Shediac, M. C., & Cassady, C. E. (1995). Measuring the implementation of health promotion programs: The case of the breast and cervical cancer program in Maryland. Health Education Research, 10(1), 11-25.

Steckler, A., & Linnan, L. (2002a). In A. Steckler & L. Linnan (Eds.), Process evaluation for public health interventions and research (pp. 1-24). San Francisco: Jossey-Bass.

Steckler, A., & Linnan, L. (Eds.). (2002b). Process evaluation for public health interventions and research. San Francisco: Jossey-Bass.

Viadro, C., Earp, J. A. L., & Altpeter, M. (1997). Designing a process evaluation for a comprehensive breast cancer screening intervention: Challenges and opportunities. Evaluation and Program Planning, 20(3), 237-249.

Zapka, J., Goins, K. V., Pbert, L., & Ockene, J. K. (2004). Translating efficacy research to effectiveness studies in practice: Lessons from research to promote smoking cessation in community health care centers. Health Promotion Practice, 5(3), 245-255.
