
Does It Make a Difference? Evaluating Professional Development

Using five critical levels of evaluation, you can improve your school's professional development program. But be sure to start with the desired result: improved student outcomes.

Thomas R. Guskey

Educators have long considered professional development to be their right, something they deserve as dedicated and hardworking individuals. But legislators and policymakers have recently begun to question that right. As education budgets grow tight, they look at what schools spend on professional development and want to know, Does the investment yield tangible payoffs, or could that money be spent in better ways? Such questions make effective evaluation of professional development programs more important than ever.

Traditionally, educators haven't paid much attention to evaluating their professional development efforts. Many consider evaluation a costly, time-consuming process that diverts attention from more important activities such as planning, implementation, and follow-up. Others feel they lack the skill and expertise to become involved in rigorous evaluations; as a result, they either neglect evaluation issues completely or leave them to "evaluation experts."

Good evaluations don't have to be complicated. They simply require thoughtful planning, the ability to ask good questions, and a basic understanding of how to find valid answers. What's more, they can provide meaningful information that you can use to make thoughtful, responsible decisions about professional development processes and effects.

What Is Evaluation?

In simplest terms, evaluation is "the systematic investigation of merit or worth" (Joint Committee on Standards for Educational Evaluation, 1994, p. 3). Systematic implies a focused, thoughtful, and intentional process. We conduct evaluations for clear reasons and with explicit intent. Investigation refers to the collection and analysis of pertinent information through appropriate methods and techniques. Merit or worth denotes appraisal and judgment. We use evaluations to determine the value of something: to help answer such questions as, Is this program or activity achieving its intended results? Is it better than what was done in the past? Is it better than another, competing activity? Is it worth the costs?

Some educators understand the importance of evaluation for event-driven professional development activities, such as workshops and seminars, but forget the wide range of less formal, ongoing, job-embedded professional development activities: study groups, action research, collaborative planning, curriculum development, structured observations, peer coaching, mentoring, and so on. But regardless of its form, professional development should be a purposeful endeavor. Through evaluation, you can determine whether these activities are achieving their purposes.

Critical Levels of Professional Development Evaluation

Effective professional development evaluations require the collection and analysis of the five critical levels of information shown in Figure 1 (Guskey, 2000a). With each succeeding level, the process of gathering evaluation information gets a bit more complex. And because each level builds on those that come before, success at one level is usually necessary for success at higher levels.

Level 1: Participants' Reactions

The first level of evaluation looks at participants' reactions to the professional development experience. This is the most common form of professional development evaluation and the easiest type of information to gather and analyze.

At Level 1, you address questions focusing on whether or not participants liked the experience. Did they feel their time was well spent? Did the material make sense to them? Were the activities well planned and meaningful? Was the leader knowledgeable and helpful? Did the participants find the information useful? Important questions for professional development workshops and seminars also include, Was the coffee hot and ready on time? Was the room at the right temperature? Were the chairs comfortable? To some, questions such as these may seem silly and inconsequential. But experienced professional developers know the importance of attending to these basic human needs.

Information on participants' reactions is generally gathered through questionnaires handed out at the end of a session or activity. These questionnaires typically include a combination of rating-scale items and open-ended response questions that allow participants to make personal comments. Because of the general nature of this information, many organizations use the same questionnaire for all their professional development activities.
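To make this concrete, here is a minimal sketch, in Python, of how an evaluator might tabulate rating-scale responses from such a questionnaire. The items and scores are hypothetical illustrations, not data from this article:

    from statistics import mean

    # Hypothetical Level 1 questionnaire results: one dict per participant,
    # each rating-scale item scored 1 (strongly disagree) to 5 (strongly agree).
    responses = [
        {"time_well_spent": 5, "material_made_sense": 4, "leader_helpful": 5},
        {"time_well_spent": 4, "material_made_sense": 3, "leader_helpful": 4},
        {"time_well_spent": 3, "material_made_sense": 4, "leader_helpful": 5},
    ]

    # Average each item across participants to spot weak points in
    # design and delivery (the stated use of Level 1 information).
    for item in responses[0]:
        scores = [r[item] for r in responses]
        print(f"{item}: mean {mean(scores):.2f} (n={len(scores)})")

Even a summary this simple shows why Level 1 data are easy to gather and analyze, and why they are best used to improve design and delivery rather than to judge a program's worth.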

Some educators refer to these measures of participants' reactions as "happiness quotients," insisting that they reveal only the entertainment value of an activity, not its quality or worth. But measuring participants' initial satisfaction with the experience can help you improve the design and delivery of programs or activities in valid ways.

Level 2: Participants' Learning

In addition to liking their professional development experience, we also hope that participants learn something from it. Level 2 focuses on measuring the knowledge and skills that participants gained. Depending on the goals of the program or activity, this can involve anything from a pencil-and-paper assessment (Can participants describe the crucial attributes of mastery learning and give examples of how these might be applied in typical classroom situations?) to a simulation or full-scale skill demonstration (Presented with a variety of classroom conflicts, can participants diagnose each situation and then prescribe and carry out a fair and workable solution?). You can also use oral personal reflections or portfolios that participants assemble to document their learning.

Although you can usually gather Level 2 evaluation information at the completion of a professional development activity, it requires more than a standardized form. Measures must show attainment of specific learning goals. This means that indicators of successful learning need to be outlined before activities begin. You can use this information as a basis for improving the content, format, and organization of the program or activities.

Level 3: Organization Support and Change

At Level 3, the focus shifts to the organization. Lack of organization support and change can sabotage any professional development effort, even when all the individual aspects of professional development are done right. Suppose, for example, that several secondary school educators participate in a professional development program on cooperative learning. They gain a thorough understanding of the theory and develop a variety of classroom activities based on cooperative learning principles. Following their training, they try to implement these activities in schools where students are graded "on the curve" (according to their relative standing among classmates) and where great importance is attached to selecting the class valedictorian. Organization policies and practices such as these make learning highly competitive and will thwart the most valiant efforts to have students cooperate and help one another learn (Guskey, 2000b).

The lack of positive results in this case doesn't reflect poor training or inadequate learning, but rather organization policies that undermine implementation efforts. Problems at Level 3 have essentially canceled the gains made at Levels 1 and 2 (Sparks & Hirsh, 1997). That's why professional development evaluations must include information on organization support and change.

At Level 3, you need to focus on questions about the organization characteristics and attributes necessary for success. Did the professional development activities promote changes that were aligned with the mission of the school and district? Were changes at the individual level encouraged and supported at all levels? Were sufficient resources made available, including time for sharing and reflection? Were successes recognized and shared? Issues such as these can play a large part in determining the success of any professional development effort.

Gathering information at Level 3 is generally more complicated than at previous levels. Procedures differ depending on the goals of the program or activity. They may involve analyzing district or school records, examining the minutes from follow-up meetings, administering questionnaires, and interviewing participants and school administrators. You can use this information not only to document and improve organization support but also to inform future change initiatives.

Level 4: Participants' Use of New Knowledge and Skills

At Level 4 we ask, Did the new knowledge and skills that participants learned make a difference in their professional practice? The key to gathering relevant information at this level rests in specifying clear indicators of both the degree and the quality of implementation. Unlike Levels 1 and 2, this information cannot be gathered at the end of a professional development session. Enough time must pass to allow participants to adapt the new ideas and practices to their settings.

FIGURE 1. Five Levels of Professional Development Evaluation

Level 1: Participants' Reactions
  What questions are addressed? Did they like it? Was their time well spent? Did the material make sense? Will it be useful? Was the leader knowledgeable and helpful? Were the refreshments fresh and tasty? Was the room the right temperature? Were the chairs comfortable?
  How will information be gathered? Questionnaires administered at the end of the session.
  What is measured or assessed? Initial satisfaction with the experience.
  How will information be used? To improve program design and delivery.

Level 2: Participants' Learning
  What questions are addressed? Did participants acquire the intended knowledge and skills?
  How will information be gathered? Paper-and-pencil instruments; simulations; demonstrations; participant reflections (oral and/or written); participant portfolios.
  What is measured or assessed? New knowledge and skills of participants.
  How will information be used? To improve program content, format, and organization.

Level 3: Organization Support & Change
  What questions are addressed? Was implementation advocated, facilitated, and supported? Was the support public and overt? Were problems addressed quickly and efficiently? Were sufficient resources made available? Were successes recognized and shared? What was the impact on the organization? Did it affect the organization's climate and procedures?
  How will information be gathered? District and school records; minutes from follow-up meetings; questionnaires; structured interviews with participants and district or school administrators; participant portfolios.
  What is measured or assessed? The organization's advocacy, support, accommodation, facilitation, and recognition.
  How will information be used? To document and improve organization support; to inform future change efforts.

Level 4: Participants' Use of New Knowledge and Skills
  What questions are addressed? Did participants effectively apply the new knowledge and skills?
  How will information be gathered? Questionnaires; structured interviews with participants and their supervisors; participant reflections (oral and/or written); participant portfolios; direct observations; video or audio tapes.
  What is measured or assessed? Degree and quality of implementation.
  How will information be used? To document and improve the implementation of program content.

Level 5: Student Learning Outcomes
  What questions are addressed? What was the impact on students? Did it affect student performance or achievement? Did it influence students' physical or emotional well-being? Are students more confident as learners? Is student attendance improving? Are dropouts decreasing?
  How will information be gathered? Student records; school records; questionnaires; structured interviews with students, parents, teachers, and/or administrators; participant portfolios.
  What is measured or assessed? Student learning outcomes: cognitive (performance and achievement); affective (attitudes and dispositions); psychomotor (skills and behaviors).
  How will information be used? To focus and improve all aspects of program design, implementation, and follow-up; to demonstrate the overall impact of professional development.

Because implementation is often a gradual and uneven process, you may also need to measure progress at several time intervals. You may gather this information through questionnaires or structured interviews with participants and their supervisors, oral or written personal reflections, or examination of participants' journals or portfolios. The most accurate information typically comes from direct observations, either with trained observers or by reviewing video- or audiotapes. These observations, however, should be kept as unobtrusive as possible (for examples, see Hall & Hord, 1987).

You can analyze this information to help restructure future programs and activities to facilitate better and more consistent implementation.

Level 5: Student Learning Outcomes

Level 5 addresses "the bottom line": How did the professional development activity affect students? Did it benefit them in any way? The particular student learning outcomes of interest depend, of course, on the goals of that specific professional development effort. In addition to the stated goals, the activity may result in important unintended outcomes. For this reason, evaluations should always include multiple measures of student learning (Joyce, 1993). Consider, for example, elementary school educators who participate in study groups dedicated to finding ways to improve the quality of students' writing and devise a series of strategies that they believe will work for their students. In gathering Level 5 information, they find that their students' scores on measures of writing ability over the school year increased significantly compared with those of comparable students whose teachers did not use these strategies.

On further analysis, however, they discover that their students' scores on mathematics achievement declined compared with those of the other students. This unintended outcome apparently occurred because the teachers inadvertently sacrificed instructional time in mathematics to provide more time for writing. Had information at Level 5 been restricted to the single measure of students' writing, this important unintended result might have gone unnoticed.

Measures of student learning typically include cognitive indicators of student performance and achievement, such as portfolio evaluations, grades, and scores from standardized tests. In addition, you may want to measure affective outcomes (attitudes and dispositions) and psychomotor outcomes (skills and behaviors). Examples include students' self-concepts, study habits, school attendance, homework completion rates, and classroom behaviors. You can also consider such schoolwide indicators as enrollment in advanced classes, memberships in honor societies, participation in school-related activities, disciplinary actions, and retention or drop-out rates. Student and school records provide the majority of such information. You can also include results from questionnaires and structured interviews with students, parents, teachers, and administrators.

Level 5 information about a program's overall impact can guide improvements in all aspects of professional development, including program design, implementation, and follow-up. In some cases, information on student learning outcomes is used to estimate the cost-effectiveness of professional development, sometimes referred to as "return on investment" or "ROI evaluation" (Parry, 1996; Todnem & Warner, 1993).

Look for Evidence, Not Proof

Using these five levels of information in professional development evaluations, are you ready to "prove" that professional development programs make a difference? Can you now demonstrate that a particular professional development program, and nothing else, is solely responsible for the school's 10 percent increase in student achievement scores or its 50 percent reduction in discipline referrals?

Of course not. Nearly all professional development takes place in real-world settings. The relationship between professional development and improvements in student learning in these real-world settings is far too complex and includes too many intervening variables to permit simple causal inferences (Guskey, 1997; Guskey & Sparks, 1996). What's more, most schools are engaged in systemic reform initiatives that involve the simultaneous implementation of multiple innovations (Fullan, 1992). Isolating the effects of a single program or activity under such conditions is usually impossible.

But in the absence of proof, you can collect good evidence about whether a professional development program has contributed to specific gains in student learning. Superintendents, board members, and parents rarely ask, "Can you prove it?" Instead, they ask for evidence. Above all, be sure to gather evidence on measures that are meaningful to stakeholders in the evaluation process.

Consider, for example, the use of anecdotes and testimonials. From a methodological perspective, they are a poor source of data. They are typically highly subjective, and they may be inconsistent and unreliable. Nevertheless, as any trial attorney will tell you, they offer the kind of personalized evidence that most people believe, and they should not be ignored as a source of information. Of course, anecdotes and testimonials should never form the basis of an entire evaluation. Setting up meaningful comparison groups and using appropriate pre- and post-measures provide valuable information. Time-series designs that include multiple measures collected before and after implementation are another useful alternative.
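As an illustration of the kind of comparison-group evidence described here, the following is a minimal Python sketch. All scores and group names are hypothetical, invented for the example; none come from the article:

    from statistics import mean

    # Hypothetical pre- and post-measures of writing ability for students
    # whose teachers participated in the program vs. a comparison group.
    participants = {"pre": [62, 58, 70, 65], "post": [74, 69, 80, 73]}
    comparison   = {"pre": [61, 60, 68, 66], "post": [64, 63, 70, 68]}

    def mean_gain(group):
        """Average post-score minus average pre-score for one group."""
        return mean(group["post"]) - mean(group["pre"])

    # The difference in mean gains is evidence (not proof) that the program
    # contributed to improvement beyond what the comparison group shows.
    print(f"Participant gain: {mean_gain(participants):.1f}")
    print(f"Comparison gain:  {mean_gain(comparison):.1f}")
    print(f"Difference in gains: {mean_gain(participants) - mean_gain(comparison):.1f}")

A time-series design extends the same idea: instead of a single pre- and post-measure, you collect the measure at several points before and after implementation and look for a shift in the trend.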

Keep in mind, too, that good evidence isn't hard to come by if you know what you're looking for before you begin. Many educators find evaluation at Levels 4 and 5 difficult, expensive, and time-consuming because they are coming in after the fact to search for results (Gordon, 1991). If you don't know where you are going, it's very difficult to tell whether you've arrived. But if you clarify your goals up front, most evaluation issues fall into place.

Working Backward Through the Five Levels

Three important implications stem from this model for evaluating professional development. First, each of these five levels is important. The information gathered at each level provides vital data for improving the quality of professional development programs.

Second, tracking effectiveness at one level tells you nothing about the impact at the next. Although success at an early level may be necessary for positive results at the next higher one, it's clearly not sufficient. Breakdowns can occur at any point along the way. It's important to be aware of the difficulties involved in moving from professional development experiences (Level 1) to improvements in student learning (Level 5) and to plan for the time and effort required to build this connection.

The third implication, and perhaps the most important, is this: In planning professional development to improve student learning, the order of these levels must be reversed. You must plan "backward" (Guskey, 2001), starting where you want to end and then working back.

In backward planning, you first consider the student learning outcomes that you want to achieve (Level 5). For example, do you want to improve students' reading comprehension, enhance their skills in problem solving, develop their sense of confidence in learning situations, or improve their collaboration with classmates? Critical analyses of relevant data from assessments of student learning, examples of student work, and school records are especially useful in identifying these student learning goals.

Then you determine, on the basis of pertinent research evidence, what instructional practices and policies will most effectively and efficiently produce those outcomes (Level 4). You need to ask, What evidence verifies that these particular practices and policies will lead to the desired results? How good or reliable is that evidence? Was it gathered in a context similar to ours? Watch out for popular innovations that are more opinion-based than research-based, promoted by people more concerned with "what sells" than with "what works." You need to be cautious before jumping on any education bandwagon, always making sure that trustworthy evidence validates whatever approach you choose.

Next, consider what aspects of organization support need to be in place for those practices and policies to be implemented (Level 3). Sometimes, as I mentioned earlier, aspects of the organization actually pose barriers to implementation. "No tolerance" policies regarding student discipline and grading, for example, may limit teachers' options in dealing with students' behavioral or learning problems. A big part of planning involves ensuring that organization elements are in place to support the desired practices and policies.

Then, decide what knowledge and skills the participating professionals must have to implement the prescribed practices and policies (Level 2). What must they know and be able to do to successfully adapt the innovation to their specific situation and bring about the sought-after change?

Finally, consider what set of experiences will enable participants to acquire the needed knowledge and skills (Level 1). Workshops and seminars (especially when paired with collaborative planning and structured opportunities for practice with feedback), action research projects, organized study groups, and a wide range of other activities can all be effective, depending on the specified purpose of the professional development.

This backward planning process is so important because the decisions made at each level profoundly affect those at the next. For example, the particular student learning outcomes you want to achieve influence the kinds of practices and policies you implement. Likewise, the practices and policies you want to implement influence the kinds of organization support or change required, and so on.
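One way to keep this reversed order explicit in a planning document is a simple template. The sketch below is illustrative only; the goals and decisions are hypothetical examples, and the only structure taken from the article is the five levels and their reversed planning order:

    # Backward planning: start from Level 5 (student outcomes) and work
    # down to Level 1 (the professional development experiences themselves).
    # Every goal/decision string here is a hypothetical example.
    backward_plan = [
        ("Level 5: Student learning outcomes",
         "Improve students' reading comprehension"),
        ("Level 4: New knowledge and skills in practice",
         "Research-backed comprehension strategies used in daily lessons"),
        ("Level 3: Organization support and change",
         "Shared planning time scheduled; grading policy aligned"),
        ("Level 2: Participant learning",
         "Teachers can model and assess each comprehension strategy"),
        ("Level 1: Experiences",
         "Workshops plus structured practice with feedback"),
    ]

    for level, decision in backward_plan:
        print(f"{level} -> {decision}")

Filling in such a template from the top down forces each decision to follow from the one above it, which is exactly the dependency the backward planning process describes.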

The context-specific nature of this work complicates matters further. Even if we agree on the student learning outcomes that we want to achieve, what works best in one context with a particular community of educators and a particular group of students might not work as well in another context with different educators and different students. This is what makes developing examples of truly universal "best practices" in professional development so difficult. What works always depends on where, when, and with whom.

Unfortunately, professional developers can fall into the same trap in planning that teachers sometimes do: making plans in terms of what they are going to do, instead of what they want their students to know and be able to do. Professional developers often plan in terms of what they will do (workshops, seminars, institutes) or how they will do it (study groups, action research, peer coaching). This diminishes the effectiveness of their efforts and makes evaluation much more difficult. Instead, begin planning professional development with what you want to achieve in terms of learning and learners and then work backward from there. Planning will be much more efficient, and the results will be much easier to evaluate.

Making Evaluation Central

A lot of good things are done in the name of professional development. But so are a lot of rotten things. What educators haven't done is provide evidence to document the difference between the two.

Evaluation provides the key to making that distinction. By including systematic information gathering and analysis as a central component of all professional development activities, we can enhance the success of professional development efforts everywhere.

References

Fullan, M. G. (1992). Visions that blind. Educational Leadership, 49(5), 19-20.
Gordon, J. (1991, August). Measuring the "goodness" of training. Training, 19-25.
Guskey, T. R. (1997). Research needs to link professional development and student learning. Journal of Staff Development, 18(2), 36-40.
Guskey, T. R. (2000a). Evaluating professional development. Thousand Oaks, CA: Corwin.
Guskey, T. R. (2000b). Grading policies that work against standards and how to fix them. NASSP Bulletin, 84(620), 20-29.
Guskey, T. R. (2001). The backward approach. Journal of Staff Development, 22(3), 60.
Guskey, T. R., & Sparks, D. (1996). Exploring the relationship between staff development and improvements in student learning. Journal of Staff Development, 17(4), 34-38.
Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating the process. Albany, NY: SUNY Press.
Joint Committee on Standards for Educational Evaluation. (1994). The program evaluation standards (2nd ed.). Thousand Oaks, CA: Sage.
Joyce, B. (1993). The link is there, but where do we go from here? Journal of Staff Development, 14(3), 10-12.
Parry, S. B. (1996). Measuring training's ROI. Training & Development, 50(5), 72-75.
Sparks, D. (1996, February). Viewing reform from a systems perspective. The Developer, 2, 6.
Sparks, D., & Hirsh, S. (1997). A new vision for staff development. Alexandria, VA: ASCD.
Todnem, G., & Warner, M. P. (1993). Using ROI to assess staff development efforts. Journal of Staff Development, 14(3), 32-34.

    Copyright 2002 Thomas R. Guskey.

Thomas R. Guskey (Guskey@uky.edu) is a professor of education policy studies and evaluation, College of Education, University of Kentucky, Taylor Education Building, Lexington, KY 40506, and author of Evaluating Professional Development (Corwin Press, 2000).


COPYRIGHT INFORMATION

TITLE: Does it make a difference? Evaluating professional development
SOURCE: Educational Leadership 59 no6 Mr 2002
WN: 0206003461010

The magazine publisher is the copyright holder of this article, and it is reproduced with permission. Further reproduction of this article in violation of the copyright is prohibited.

Copyright 1982-2002 The H.W. Wilson Company. All rights reserved.