March 2016
A Methods Lab publication
odi.org/methodslab
When and how to develop an impact-oriented monitoring and evaluation system
Greet Peersman, Patricia Rogers, Irene Guijt, Simon Hearn, Tiina Pasanen and Anne L. Buffardi
The Methods Lab is an action-learning collaboration between the Overseas Development Institute (ODI), BetterEvaluation (BE) and the Australian Department of Foreign Affairs and Trade (DFAT). The Methods Lab seeks to develop, test, and institutionalise flexible approaches to impact evaluations. It focuses on interventions which are harder to evaluate because of their diversity and complexity or where traditional impact evaluation approaches may not be feasible or appropriate, with the broader aim of identifying lessons with wider application potential.
Readers are encouraged to reproduce Methods Lab material for their own publications, as long as they are not being sold commercially. As copyright holder, ODI requests due acknowledgement and a copy of the publication. For online use, we ask readers to link to the original resource on the ODI website. The views presented in this paper are those of the author(s) and do not necessarily represent the views of ODI, the Australian Department of Foreign Affairs and Trade (DFAT) and BetterEvaluation.
© Overseas Development Institute 2016. This work is licensed under a Creative Commons Attribution-NonCommercial Licence (CC BY-NC 4.0).
How to cite this guidance note: Peersman, G., Rogers, P., Guijt, I., Hearn, S., Pasanen, T., and Buffardi, A. (2016) ‘When and how to develop an impact-oriented monitoring and evaluation system’. A Methods Lab publication. London: Overseas Development Institute.
Overseas Development Institute
203 Blackfriars Road
London SE1 8NJ
Tel +44 (0) 20 7922 0300
Fax +44 (0) 20 7922 0399
[email protected]
BetterEvaluation
E-mail: [email protected]
Focus
This guidance note focuses on:
• what an impact-oriented monitoring and evaluation system entails
• why an organisation may want to establish such a system
• when integrating an impact-orientation into a monitoring and evaluation system is most useful
• what should be considered in developing the monitoring and evaluation system, or in tweaking an existing system, to become more impact-focused.
Intended users
The primary audience for this guidance note is internal and external monitoring and evaluation advisors involved in designing and implementing, and/or assessing, monitoring and evaluation systems to include a focus on impact.
It will also be useful for senior management of organisations who need to know how best to plan for a sustainable monitoring and evaluation system that supports impact assessment, or to adapt an existing system to incorporate an impact perspective.
Authors and acknowledgements

This guidance note has been developed by the Methods Lab, an action-learning collaboration between the Overseas Development Institute (ODI), BetterEvaluation and the Australian Department of Foreign Affairs and Trade (DFAT).

The authors and their contributions are:

Greet Peersman (BetterEvaluation), who led the overall writing, reviewing, revising and finalisation of the guidance note; Greet conceptualised the content and wrote substantive sections of the guidance note, in particular chapters 1, 2, 4, 5 and 6.

Patricia Rogers (BetterEvaluation) also conceptualised the content for the guidance note and wrote sections on applying systems and complexity thinking, as well as using theories of change – specifically in chapters 2 and 5.

Irene Guijt (ODI) conceptualised the focus of the paper and identified key themes drawn from the different programmes upon which the guidance note is based. In particular, she contributed to chapters 1, 2 and 5.

Simon Hearn (ODI) wrote sections on defining impact-oriented M&E systems in Chapter 2.

Tiina Pasanen (ODI) helped to conceptualise the overall guidance note and identified key themes drawn from the case study programmes; she wrote chapters 3 and 6, and contributed to Chapter 4.

Anne Buffardi (ODI) also helped identify key themes from the case studies and contributed to chapters 1, 3, 4 and 5.

The authors would like to thank reviewer Professor William Bertrand (Tulane University, USA) and staff in the Methods Lab programmes from which examples and lessons learned were drawn.
Acronyms
BE BetterEvaluation, RMIT University, Australia
DAC Development Assistance Committee
DFAT Department of Foreign Affairs and Trade, Australia
DFID Department for International Development, UK
M&E Monitoring and evaluation
ODI Overseas Development Institute
OECD Organisation for Economic Co-operation and Development
ToC Theory of change
Contents

1. Rationale and purpose of this guidance note
2. What is an impact-oriented M&E system?
2.1 What do we mean by impact?
2.2 What do we mean by monitoring and evaluation?
2.3 What does a monitoring and evaluation system involve?
2.4 What is an impact-oriented M&E system?
3. Why should an organisation consider developing an impact-oriented M&E system?
3.1 Shifting the focus from outputs to impact, from indicators to impact-related questions
3.2 Improving the availability and quality of data for impact assessment
3.3 Providing timely, relevant data to guide adaptive management
3.4 Offering space for collective sense-making of data collected
4. When is an impact-oriented M&E system appropriate?
4.1 Matching M&E efforts with implementation efforts and decision-making needs
4.2 Determining plausibility, utility and feasibility of impact assessment
4.3 Ensuring adequate capacity and resources for impact assessment
5. What are key issues to address when establishing an impact-oriented M&E system?
5.1 Using the theory of change as the foundation for impact-oriented M&E
5.2 Determining impact focus based on complexity thinking
5.3 Balancing emphasis on accountability and learning
5.4 Prioritising impact-related information needs
5.5 Clarifying M&E roles and sequencing M&E activities
6. Conclusion
References
Resources
1. Rationale and purpose of this guidance note

Many development programme staff have had the experience of commissioning an impact evaluation towards the end of a project or programme only to find that the monitoring system did not provide adequate data about implementation, context, baselines or interim results. This guidance note has been developed in response to this common problem.

The opportunity of learning-by-doing through engagement with ongoing interventions helped to ground this guidance note in the practical experiences of programme managers and staff, and those commissioning or conducting monitoring and evaluation (M&E) of development interventions. The interventions involved shared some common challenges related to:

• the type and delivery of the intervention:
  • multiple components implemented by different organisations across several sites and aiming to effect change at multiple levels. As such, negotiations around what impacts to assess were needed between many different stakeholders, or
  • interventions with defined impacts but uncertain pathways as to how to get there.
• the type of impact, such as:
  • diffuse effects which are difficult to discern, or
  • effects at the end of a long causal chain requiring good intermediate or proxy measures.

Some challenges, in terms of the impact focus, were specific to the stage in the intervention cycle (see Table 1).

Many of these challenges could have been avoided or, at least, reduced by planning for impact assessment early on in the intervention cycle. While there are benefits of integrating impact-orientation early on, it can easily overwhelm programme staff. Moreover, a focus on impact is not always appropriate. This guidance note aims to facilitate a better understanding of what is involved in designing, implementing and/or assessing impact-oriented M&E systems, including:

• what an impact-oriented M&E system entails
• why an organisation may want to establish such a system
• when integrating an impact-orientation is most useful, and
• what should be considered in developing the system or in tweaking an existing M&E system to become more impact-focused.
Table 1: Challenges in impact focus of interventions involved in the Methods Lab

Interventions 'en route' or 'ending':
• Lacked a clear impact logic or needed retro-fitting
• Portfolios lacked an overarching impact logic that brings results of different components together
• Existing data not fully aligned with impact logic, hence not relevant or under-utilised

Interventions 'starting up':
• Impact not yet addressed
• Different interpretations between different stakeholders about what constitutes impact
• Difficulty prioritising among many relevant impact-related questions
• Difficulty balancing shorter-term demands to demonstrate performance and longer-term learning about impact
Figure 1: Representation of a simplified results chain that includes impact
[Inputs → Activities → Outputs → Outcomes → Impact]
2. What is an impact-oriented M&E system?

In this section, we define the key terms used in this guidance note: impact, monitoring, evaluation, impact monitoring, impact evaluation and impact-oriented M&E system. Defining these terms is an important part of developing and implementing a multi-stakeholder M&E system: it ensures that all those involved have the same understanding from the outset and thus helps avoid confusion or disagreement later on.

2.1 What do we mean by impact?
While there are many different definitions of 'impact' (see the discussion in Hearn and Buffardi 2016), in this guidance note, we define impact as per the Development Assistance Committee (DAC) of the Organisation for Economic Co-operation and Development (OECD):

'Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended' (OECD-DAC 2010)

We use the term impact to refer to long-term results such as health status, well-being or social change as the ultimate results at the end of a causal chain (see Figure 1).

Impact is distinct from 'outputs' – which are the direct products resulting from the implementation of intervention activities – and from 'outcomes' – which are the intermediate-term changes in the target group(s) who have been engaged in the intervention and which precede, and are usually a pre-condition for, impact to occur.

There may be particular challenges to assessing the long-term results: it is usually harder to gather evidence that they actually occurred; they are often not visible during the life of a short-term intervention; and they are more likely to be affected by other interventions and other factors. In practice, a particular intervention is rarely sufficient to produce the intended impacts alone and there are often alternative ways to achieve them. It is far more likely for there to be a situation of joint causal attribution (Figure 2) or alternative (or multiple) causal paths (Figure 3). And, in some cases, it may not be possible to define impacts and/or the pathway in advance (see Section 5.2).

Joint causal attribution (Figure 2) is when the intervention produces the impacts in conjunction with other interventions (i.e., complementary or other ongoing interventions) or certain contextual factors (i.e., impacts will only be achieved if favourable conditions are present and/or unfavourable conditions are removed).

Figure 2: Representation of joint causal attribution
[Diagram: the programme or policy, combined with other programmes or policies, or contextual factors, jointly produces the impacts]
Source: Rogers, P. (2014) Overview: Strategies for causal attribution. Methodological Briefs on Impact Evaluation, No. 6. Florence: UNICEF Office of Research.

The alternative (or multiple) causal paths, shown in Figure 3, are when a particular intervention can produce the impacts but they may also come about through other interventions (e.g., participants are able to access services through an alternative provider) and/or external factors. These situations are common and have important implications for how impact assessment is conducted and how the findings are used – especially in terms of scale-up of the intervention or potential replication elsewhere (Rogers 2014).

Impact has many dimensions (see Table 2), including:

• positive or negative – that is, beneficial or detrimental as judged by those affected by the intervention or other stakeholders
• primary or secondary – they may relate to the objectives of the intervention or may be side-effects or spill-over effects
• direct or indirect – there may be a direct causal link with the intervention activities or they may come about through cascaded activities
• long-term – they are dependent on other results being achieved first, and thus take longer to materialise or be observed
• intended or unintended – they may be specifically targeted through the chosen activities or they may be additional
• foreseen or unforeseen – they may be predictable or not.

We refer to the Methods Lab paper What is Impact? (Hearn and Buffardi 2016) for a broader discussion about impact dimensions and their implications for impact assessment.
2.2 What do we mean by monitoring and evaluation?
Monitoring is the routine tracking and reporting of priority information about an intervention.1 This information can relate to the intervention's inputs, activities, outputs, outcomes and impacts, but also to emerging issues or results, and the internal and external context in which the intervention operates.

Monitoring is used primarily for internal management and accountability. Intervention managers and implementers can use monitoring information to assess whether the implementation of the intervention is on track and to identify and correct any challenges in a timely manner. A subset of this information is often reported to senior management or funders (upward accountability), intervention beneficiaries (downward accountability) and/or peers or implementing partners (horizontal accountability).

Evaluation refers to discrete studies that aim to produce an overall evaluative judgement about the merit, worth or significance of an intervention, in addition to descriptions of the way things are and analysis of causal relationships. Evaluation findings are intended primarily to inform decisions about a specific intervention, but also about future investments and planning.
Table 2: Dimensions of defining impact

            Intended                  Positive unintended            Negative unintended
Foreseen    Planned programme goals   Predicted spill-over effects   Predicted risks or side-effects
Unforeseen  Emergent programme goals  Nice surprise                  Calamity, mishap or backlash

Source: Hearn and Buffardi 2016 – adapted from Ling 2014

1 Such as a project, programme, policy, portfolio of projects, initiative.
Figure 3: Representation of alternative (or multiple) causal paths
[Diagram: the impacts may be produced by the programme or policy, or may come about through other programmes or policies, or contextual factors]
Source: Rogers, P. (2014) Overview: Strategies for causal attribution. Methodological Briefs on Impact Evaluation, No. 6. Florence: UNICEF Office of Research.
Monitoring and evaluation are distinct but closely inter-related activities. For example, evaluation is often triggered by monitoring data – such as when unexpected things happen that need more in-depth investigation as to why they occurred. Evaluations are also often dependent on information that has been collected through ongoing monitoring – for instance, documented progress with implementation of planned activities; such information would be much harder and less reliable to obtain retrospectively.

Generally, monitoring and evaluation findings are used at different times, with different regularity, different resource needs and for different purposes. Table 3 summarises the ways in which monitoring and evaluation are often defined and practised. It highlights why monitoring and evaluation are both needed for effective programme management and decision making: it is not sufficient to conduct monitoring without any kind of evaluative reflection and, given the episodic nature of most evaluation studies,2 they are, by themselves, inadequate to support adaptive management of an ongoing intervention. Hence, it makes sense to implement M&E activities in a manner that draws on their respective strengths and to plan for them as part of a monitoring and evaluation system.
2.3 What does a monitoring and evaluation system involve?
A monitoring and evaluation system is more than simply a system for collecting data, or a list of measures or methods for data collection. The BetterEvaluation Rainbow Framework (www.betterevaluation.org) provides an organising framework of seven 'clusters' of monitoring and evaluation tasks (see Table 4), from defining what is to be monitored and evaluated, clarifying primary intended users and uses, and then setting out how data will be collected or retrieved, analysed, reported and used for particular purposes.

2 There are notable exceptions, such as developmental evaluation, which is particularly suited to guide adaptation to emergent and dynamic realities in complex environments.

3 Some characteristics are not necessarily exclusive to either one of the functions. For example, routine monitoring may include assessing unintended results, such as in early warning systems; monitoring can assess the logic of particular links in the theory of change through analysing the patterns in increases or decreases in indicator values, though they may be more difficult to interpret by themselves.
Table 3: How monitoring and evaluation are often defined or practised3

Purpose and approach
Monitoring: Routinely collects priority information, often through standardised performance indicators linked to the objectives of the intervention.
Evaluation: Is episodic and investigates particular dimensions of an intervention and observed results, usually in depth and by using multiple data sources.

Understanding causality
Monitoring: Links inputs and activities to results, often limited to outputs, but outcomes and/or impacts may also be tracked. Does not conduct causal inference.
Evaluation: Tests (elements of) the underlying theory of change. Assesses specific causal contributions of the intervention to the results, going beyond outputs to include outcomes and/or impacts.

Use
Monitoring: Provides actual results, which can be compared with intended results, often expressed as specific, pre-established targets. Tracks unintended results that are foreseen. Identifies areas of under-achievement, which may alert managers to problems that need to be corrected or further investigated. Reports achievements to funders or policymakers (upward accountability) and/or beneficiaries (downward accountability).
Evaluation: Analyses why intended results were or were not achieved. Assesses unintended results, both foreseen and unforeseen. Provides a judgement about the merit, worth or significance of an intervention. Provides lessons learned and offers recommendations for intervention improvement and/or resource allocation.

See, for example, CDC (2003), Kusek and Rist (2004), Peersman and Rugg (2010).
A monitoring and evaluation system requires integrated planning around the purposes, information priorities, underlying values and principles, roles and responsibilities, but also the capacities of different actors contributing to the system, implementation procedures and activities and tools.

If M&E is to facilitate and foster not only individual but also organisational learning, it needs to be built into the regular organisational and financial allocation processes in order to become integral to the thinking and acting of the organisation (Dlamini 2006). This requires:

• establishing organisational structures, strengthening human capacity and building strategic partnerships to plan, coordinate and manage the M&E system, including:
  • understanding the capacity requirements for monitoring and evaluation at different levels of the system (individual, organisational, across organisations), and
  • clearly defining roles and responsibilities drawing on the strengths and comparative advantage of different actors
• support for, and regular communications about, the usefulness of M&E, and identifying M&E champions to create a supportive culture for M&E within the organisation
• identifying and prioritising information needs, and selecting and supporting appropriate data collection, verification and analysis strategies
• storing and managing the information in ways that protect sensitive data but also facilitate sharing where appropriate (within the organisation and with others) and knowledge accumulation
• supporting dissemination tailored to different primary users of the information, and providing dedicated time and appropriate spaces for its use in decision making.
Table 4: Holistic approach to M&E using the BetterEvaluation Framework

Task – What it entails

Manage – The planning and management of the implementation of the M&E system, including who will make decisions about it, who will lead development and implementation, and the roles and responsibilities of different actors.
Define – Developing or obtaining a description of the intervention and how it is understood to work.
Frame – Setting the parameters for M&E: the purposes, what to monitor and what to evaluate, including key evaluation questions and information needs for decision making about the intervention, and the criteria and standards to be used.
Describe – Collecting or collating data to answer descriptive questions about the intervention, the various results observed, and the context in which the intervention is implemented.
Understand causes – Analysing data to answer causal questions about the extent to which the intervention produced observed outcomes and/or impacts.
Synthesise – Using multiple sources of data to support evaluative judgements about the merit, worth and/or significance of an intervention.
Report and support use – Developing and presenting findings in ways that are useful for the primary intended users, and supporting them to make evidence-informed decisions.
2.4 What is an impact-oriented M&E system?
In the same way that an M&E system brings together elements of monitoring and evaluation in a mutually beneficial way, an impact-oriented M&E system goes beyond investing in a one-off impact evaluation process that might run parallel to other M&E activities. The aim is to align different data sources, and how and when they are collected and analysed, so they can contribute to understanding impact, not only performance compliance and short-term learning. Before we look at what this integration looks like, we define impact evaluation and impact monitoring.

Impact evaluation is a specific type of evaluation that systematically and empirically investigates the impacts produced, or contributed to, by an intervention and seeks to determine what difference the intervention has made. Impact evaluations can be undertaken for formative purposes – to improve an intervention – or for summative purposes – to inform decisions about whether to continue, discontinue, replicate or scale up an intervention (Rogers 2014).

An impact evaluation addresses three types of questions: descriptive questions (how things are or what has happened); causal questions (whether or not, and to what extent, the intervention brought about the observed changes); and evaluative questions (the overall value of the intervention, taking into account intended and unintended impacts, the criteria and standards established upfront, and how these should be weighted and synthesised).

Impact monitoring tracks and reports information related to the longer-term benefits an intervention intends to achieve. But it does not establish whether any observed changes are due to the intervention or not. The most obvious form of impact monitoring would involve direct tracking of impact-level results – for example, by periodic measuring of the health of participants or measuring air quality around a construction site. However, impact monitoring may also involve developing feedback mechanisms to understand early signs of possible unintended impact (both positive and negative).

We use the term 'impact assessment' more broadly where making the distinction between impact monitoring and impact evaluation is not pertinent.

An impact-oriented M&E system, then, is concerned with tracking and judging impact-level results in addition to short-term outputs and intermediate-term outcomes. Perrin (2012) suggests that ongoing monitoring can contribute four types of information that are crucial to evaluating impact:

1. information about the nature of the intervention, such as services provided, who has been served, baseline data, and changes over time
2. information about the context of the intervention, such as other interventions that are co-occurring, external factors and the political, economic, social and physical environment
3. information about observed or potential impacts: existing evidence or strong suggestions that changes may be taking place
4. other pertinent information, such as the continued relevance of the intervention, potential impact evaluation questions, and existing data sources.

As Perrin (2012) surmises, only in rare circumstances can an impact evaluation be conducted independently from ongoing monitoring. Indeed, the premise of this guidance note is that impact assessment relies on co-developing impact evaluation and impact monitoring, along with other forms of M&E, into an impact-oriented M&E system that supports decision making more efficiently and effectively than if these elements were treated separately. This means that each of the clusters of M&E tasks in Table 4 will need to address appropriate dimensions of impact. For example:

• 'define' tasks will need to include descriptions of the intended impact and how the intervention is expected to lead to these, commonly referred to as the theory of change (ToC) and visualised using a logic model
• 'frame' tasks will need to include the development of impact-focused questions such as 'What are the long-term (negative and positive) effects experienced by different targeted groups?'
• 'describe' tasks will need to include collection of information about impacts and factors (such as context) that might affect impacts
• 'understand causes' tasks will need to include strategies for assessing attribution or contribution of the intervention to the observed impacts.

In summary, an impact-oriented M&E system is about intentionally focusing on impacts throughout the intervention cycle, and bringing information about impacts and their causes into the decision making about the intervention.

Key messages

An impact-oriented M&E system:

• integrates tracking, describing and judging impact-level results in M&E efforts throughout the intervention period
• requires long-term M&E planning with attention to what needs to be done to maintain a good quality system over time
• is dependent on a shared understanding of M&E concepts and clarity around the roles and responsibilities of the different stakeholders involved, and
• relies on the continued and active engagement of programme staff in designing, implementing and/or managing M&E functions.
3. Why should an organisation consider developing an impact-oriented M&E system?

This section focuses on the rationale for designing impact-oriented M&E systems and why it can be beneficial to integrate impact-orientation early on.

The practical experiences from the Methods Lab action-learning and previous literature on impact assessment (IEG 2009, CONCORD 2010, Guinea et al. 2015, Vaessen et al. 2014) highlight four key benefits of integrating an impact-orientation early on in the intervention cycle, which are:

1. shifting the focus from outputs to impact, from indicators to impact-related questions
2. improving the availability and quality of data for impact assessment to draw on
3. providing timely, relevant data to guide adaptive management of the intervention, and
4. offering space for collective sense-making of data collected.

3.1 Shifting the focus from outputs to impact, from indicators to impact-related questions
During the proposal stage, projects are often required to complete a logframe and identify indicators that will demonstrate progress towards fulfilling the stated objectives. Thus, M&E frameworks often focus on inputs, activities and outputs. For example:

• 'How many trainings were conducted?'
• 'What is the quality of the reports produced?'
• 'How many books were given to the schoolchildren?'

Output monitoring is within a project's sphere of control and provides visible signs of project implementation. However, it can also orient the project from the outset towards indicators rather than a set of key questions that decision makers and other stakeholders are most interested in answering about the project.

Integrating an orientation towards outcomes and impact can help surface what are often implicit questions and assumptions on which the project design is based. For example:

• 'What longer-term results is the project aiming to achieve?'
• 'What information do stakeholders need to be able to understand the extent to which these are achieved and how and why they are occurring?'

Using the project ToC and impact-focused questions as the foundation for the M&E system can help identify which assessments or measurements (including indicators) are most relevant and which methods for data collection and analysis are most appropriate to use.

3.2 Improving the availability and quality of data for impact assessment
Gathering data on a periodic or ongoing basis, rather than just at the end of the intervention (as is often done with impact evaluation), can help with the interpretation of data about longer-term results. For example, information on monthly micro-entrepreneur revenue, seasonal agricultural yields and annual household income gives a more comprehensive picture of the extent and direction of change, including any fluctuations over time. Collecting information when activities are being implemented also reduces recall bias and can help to identify and correct for missing or misinterpreted information. Hence, including an impact-orientation in M&E activities can increase both the availability and the quality of the data.
3.3 Providing timely, relevant data to guide adaptive management
In addition to gathering data for judging impact at the end of the intervention period, inclusion of impact indicators in ongoing M&E efforts can provide useful information throughout the project's lifetime. This information can demonstrate trends over time and provide early signals about project impacts. As such, it can help to guide project implementation and inform the design of subsequent projects, the planning for which often starts years in advance. Including procedures and processes for periodic analysis and reflection throughout the implementation can uncover unexpected (positive or negative) impacts that may otherwise be overlooked.

3.4 Offering space for collective sense-making of data collected
CONCORD (2010) suggests that impact-focused M&E can also 'offer spaces for political discussion on the objectives of development, leading to a reflection on the relevance, sustainability and effectiveness of actions' (p. 2). Such discussions took place at a more operational (rather than political) level in the various Methods Lab projects, but also offered the opportunity for joint discussion and collective sense-making at a project-wide level. This included: discussing the project's ToC, underlying assumptions and implementers' understanding of impact, and prioritising evaluation questions and key indicators. The interaction between implementing staff was particularly important in cases where external consultants or grant writers at the organisation's headquarters office had developed the M&E framework with limited engagement from implementing staff (many of whom had not yet been hired at that stage). Engagement from all stakeholders in the design and implementation of the M&E system can help buy-in and encourage the use of data for decision making and learning. A review of evaluations conducted in the European Union suggests that the use of evidence from evaluations is influenced by the way in which they are planned and the degree of stakeholder involvement, among other factors (Bossuyt et al. 2014).

CONCORD (2010) also points to the potential of an impact-oriented M&E system to reinforce accountability and credibility towards intended beneficiaries, donors, partners and the wider public, and to strengthen ownership and empowerment of partner organisations and rights-holders. For this to occur, engagement processes need to include these actors. In the Methods Lab, all interventions were large, multi-site, multi-organisational initiatives. Stakeholder engagement in M&E was primarily restricted to representatives from the donor agency and managerial staff of the implementing organisation (in some cases, government officials were also involved). Those actually implementing the intervention were not often involved in setting the direction for, and implementing or supervising, M&E activities. The intended beneficiaries were occasionally consulted at the project proposal stage, and a few implementing organisations planned for beneficiary involvement in ongoing monitoring processes. However, beneficiaries were, overall, less involved than other stakeholder groups. To be feasible, large projects with many stakeholders need to take the time to clarify who will be involved in which elements of monitoring and evaluation.
4. When is an impact-oriented M&E system appropriate?

Not all interventions need to have impact-oriented M&E. This section focuses on what considerations and tools can help in deciding whether it makes sense to invest time and resources into developing an impact-oriented M&E system.

The M&E system of all types of interventions will likely include needs assessment and monitoring inputs and outputs once implementation begins. Expectations to conduct additional levels of M&E vary by the nature, size and maturity of the intervention, and also by its 'complexity' (see Figure 4). There are a few rules of thumb. First, the extent and costs of M&E activities should be commensurate with the size, reach and cost of the intervention; M&E should never compromise or overtake implementation. Second, not all M&E activities are appropriate for all types of interventions, or for the intervention's stage of development (maturity).

4.1 Matching M&E efforts with implementation efforts and decision-making needs

Not all interventions need to collect information about impact. Focusing on inputs, activities, outputs and intermediate outcomes is often enough.

M&E data should, first and foremost, address the specific decision-making needs of the intervention. These depend on what is or is not already known about the intervention. For example, the M&E standards of the Australian Department of Foreign Affairs and Trade (DFAT) state that the degree of M&E rigour should be proportional to the importance of the decisions to be made.4 An intervention that aims to address a new or poorly understood need, or that trials a new approach to a persistent problem, may want to invest more in M&E than an intervention that replicates a standardised service in a different but similar setting.

All interventions would need to carry out input, activities and output monitoring. The UK Department for International Development (DFID), for example, requires all new projects to do so using a standard DFID Business Case including a logframe. Interventions may also be expected to conduct a process evaluation assessing the extent to which, and how, the intervention is being implemented, and/or conduct an outcome/impact evaluation at the end of the implementation period. Although the number of impact evaluations has significantly increased in the past ten years (Savedoff 2013, Cameron et al. 2015), they represent a small proportion of the total number of development evaluations.

Figure 4: Setting realistic expectations for monitoring and evaluation

[Figure plots the number of projects against levels of monitoring and evaluation effort: input/output monitoring (all projects), process evaluation (most), outcome monitoring/evaluation (some), impact monitoring/evaluation (few*).]

*Impact monitoring should be part of all national-level development efforts to monitor population-level changes in health status, wellbeing or social change; but it cannot be easily linked to specific interventions.

Source: Adapted from Rugg, D., Peersman, G., Carael, M. (eds) (2004) Global advances in HIV/AIDS monitoring and evaluation. New Directions for Evaluation 103.

4. http://dfat.gov.au/about-us/publications/Documents/monitoring-evaluation-standards.pdf
Impact M&E requires that process monitoring and/or process evaluation has been done first; information about how a project is implemented in practice, rather than solely relying on documentation of how it was designed, is critical to interpreting findings about impact. For example, there may be variation in intervention delivery across sites based on staff motivation or the extent to which they adhere to implementation protocols. Process monitoring and evaluation that document what activities took place, with whom, where and how can help to rule out 'implementation failure' as a possible explanation for the lack of intended longer-term changes occurring (Stame 2010).
4.2 Determining plausibility, utility and feasibility of impact assessment

Impact should be a focus only if there are plausible links between the intervention activities and the chain of results. Other key issues to address first are: 'Is there sufficient interest in the use of impact findings?'; and, 'Is it feasible to assess impact?'. It is a waste of time and resources if impact findings are likely not to be used or come too late to inform important decision making. Similarly, if it is not feasible to collect the types of impact-related information that are pertinent to decision-making needs, it makes little sense to invest in it.

An evaluability assessment or similar scoping can help programme staff address these three conditions in a systematic way before embarking on impact assessment (see, for example, Dunn 2008, Davies 2013, Peersman et al. 2015). Such an assessment focuses on:
• adequacy of the intervention design in terms of the impact it aims to achieve. Is it plausible to expect impact and, if so, is it likely to be observable within the time period studied?
• conduciveness of the organisational context to support and use impact assessment. Are the results likely to be used and useful?
• feasibility of impact assessment. Is it possible, with the available resources, to collect useful impact data?
The intervention activities should reasonably be expected to lead to the intended outcomes5 and impacts (i.e., there is a plausible relationship). Verifying the logic of this on the basis of the intervention design may reveal the need to modify the intervention and/or revisit the expectations regarding anticipated outcomes and/or impacts.
Impact assessment should only be undertaken when its intended use and users can be clearly identified and when it is likely to produce useful findings. To manage expectations, clarity about who needs what information, when, and for what purpose(s) is crucial. Assessing stakeholder expectations about what 'evidence' is seen as credible is equally important. Stakeholders' needs and expectations will affect the timing of the evaluation, the type of data to be collected, the way in which they are obtained and analysed, and the strategies and channels by which to present and share the findings with intended users.
Not all data are easy to collect, and data that can be more easily obtained may not be particularly relevant or appropriate for understanding causal pathways. Similarly, the timing of the impact assessment is crucial in determining what is worth assessing: it may be undertaken too late to inform important decisions; or, if undertaken too early, it may lead to inaccuracies such as understated impacts (when there has not been sufficient time for impacts to emerge) or overstated impacts (when one needs to determine whether impacts last over time).
Other practical considerations include:
• characteristics of the intervention, such as roll-out over time and location, and levels of client intake and reach: these may affect the sampling approach or sample size, baseline data needs, options for a control or comparison group (where appropriate) or the use of other strategies to investigate causal attribution
• the size of the available M&E budget: this plays an important role in determining which designs are possible. For example, gathering data at the beginning and end of an intervention, or for both intervention participants and non-participants, can increase the required resources substantially.
5. Outcomes are defined as the intermediate-term results that are intended to lead to the desired impacts.
4.3 Ensuring adequate capacity and resources for impact assessment

An impact-orientation should only be added if there are adequate capacities and resources to do it well (IEG 2009). For example, if existing M&E efforts are gathering information in an inconsistent or incomplete manner – affecting data quality – or if the information is not being analysed regularly or used appropriately, then it is not worth adding additional requirements for impact assessment (see Box 1). Instead, resources would be better targeted at addressing existing gaps and weaknesses first.
How M&E is valued within an organisation may also influence what is considered worth investing in. For example, M&E staff time is often focused on tracking what is contractually agreed rather than on what is needed for good programme management; capacity and time for critical reflection may not be judged as important as strengthening capacity for statistical analysis or database management.
Box 1: Importance of assessing existing M&E capacity before deciding on impact assessment
Based on 11 projects in which the authors have been involved over the past year, projects proposed gathering an average of 58 indicators (range 18-132, median 58). One large project had a comprehensive data collection system in place, with a wide range of information intended to be gathered at local and regional levels, and procedures to aggregate the data into a national monitoring information system on a monthly basis. In practice, however, the project's ambition significantly overwhelmed its capacity. The very large number of indicators, high staff turnover and insufficient M&E training, and limited capacity to implement data quality assurance and to conduct analyses, resulted in substantial variation in the quality and completeness of the information across different sites. Not enough space and time was allocated to analyse and interpret the data, so the information that was gathered was not fully used.
DFAT’s2014M&EstandardsincludethatthoseresponsibleforimplementingtheM&Eplanhavethetime,resourcesandskillstodoso.TheyalsoencouragedocumentinghowM&Eeffortshaveinformedlearning,decisionmakingandaction.
Key messages
Integrating an impact-orientation into an M&E system should only happen when:

• information about impact will be useful and timely to support specified decision-making needs
• impact is deemed plausible and is feasible to assess with rigour
• resources and capacity for collecting, analysing and interpreting impact data are adequate.
5. What are key issues to address when establishing an impact-oriented M&E system?

Once an organisation has decided it is appropriate, and that it is able, to develop an impact-oriented M&E system, it should address:

1. using the ToC as the foundation for impact-oriented M&E
2. determining impact focus based on complexity thinking
3. balancing emphasis on accountability and learning
4. prioritising impact-related information needs
5. clarifying M&E roles and sequencing M&E activities

5.1 Using the theory of change as the foundation for impact-oriented M&E

A theory of change explains how the activities of an intervention are understood to contribute to a chain of results (short-term outputs, medium-term outcomes) that produce ultimate intended or actual impacts. It can include positive impacts (which are beneficial) and negative impacts (which are detrimental). It can also include other factors that contribute to producing impacts, such as the particular context in which the intervention is implemented and other projects and programmes.

A ToC can be a useful tool during the intervention planning phase – particularly for identifying assumptions about the plausibility of the overall theory and any of the specific links in the causal chain, and for encouraging checks of these and revisions to the intervention design to address gaps. It can be used to orient new stakeholders and to develop a shared understanding of an intervention, especially among diverse stakeholders and new stakeholders coming on board over time (Funnell and Rogers 2011).

When developing an impact-oriented M&E system, a ToC can help to identify:

• what needs to be assessed (i.e. described or measured), including the quality and quantity of inputs and activities, outputs, short-term and longer-term outcomes and impacts
• which longer-term results will be able to be observed during the lifespan of the intervention, and which will need to be projected on the basis of other evidence
• how data will need to be analysed to understand linkages between different variables
• where existing data can be used and what the priorities are for additional data collection.
Box 2 provides an example of using a ToC for impact orientation.
Box 2: Using theory of change to examine contribution to impact
In two agricultural projects, a ToC was used to examine contribution claims related to impact (Guijt 2014; Van Hemelrijck and Kyei-Mensah 2015). The development of the ToC was based on how different components of the projects were actually implemented. Each component (e.g. agricultural production or agricultural processing) claimed to make a specific contribution to the overall project impact.
An evaluation aimed to look at how the different contribution claims were being realised and how they interacted to achieve impact. At the outset of the evaluation, stakeholders agreed on the projects' key mechanisms for change, the critical assumptions for each, and the key questions about different causal pathways that should be addressed in the evaluation. Through using appropriate evidence, it was possible to make the links between various pathways ('claims') visible, but also to make explicit any discrepancies between expectations of performance and what was actually achieved. For example, in both cases the project's impact on access to food and income was undeniable, but evidence also showed that the lower-than-planned level of implementation had led to fewer and less sustainable gains in livelihoods.
While the evaluation was undertaken as the projects started a new phase, this kind of questioning and analysis can be undertaken more regularly during the project's lifespan by using existing data supplemented by new data collection. The advantage of building this type of data collection/collation and analysis into ongoing M&E activities is:
‘It enables actors to critically and collaboratively engage with the evidence collected on these links, probe their assumptions and hold each other accountable for their contribution to realising impact over time.’ (Van Hemelrijck and Kyei-Mensah 2015)
5.2 Determining impact focus based on complexity thinking

While many theories of change are represented as a simple, linear process, most development interventions have complicated and/or complex aspects, which are important to acknowledge and address. It is useful to distinguish between what is complicated (involving multiple components and requiring expertise in each component to bring the components together effectively – but ultimately predictable) and what is complex (emergent, adaptive and responsive; inherently unpredictable) (Glouberman and Zimmerman 2002). Some aspects of a development intervention might be best treated as simple, some as complicated and/or some as complex (see Table 5). It is, therefore, not a matter of deciding how to categorise an entire intervention as simple, complicated or complex; it is a matter of categorising aspects of it.
‘Simple’aspectsofinterventioninvolvedealingwiththeknown,wherecauseandeffectareunderstoodwellandgoodpracticescanbeconfidentlyrecommended.Theseaspectsofaninterventionrequirelessinvestmentinlearning-orientedreflectionandmoreonverifyingongoingimpact–i.e.isitstillworking?(Table5).
Whatis‘complicated’hasmanycomponentsandinterconnections,butifenoughexpertiseandplanningcanbebroughttobear,adetailedplancanbedeveloped,implemented,trackedandretrospectivelyevaluated.Interventionelementsthatarecomplicated
canbenefitfromarealistapproachtoimpactassessment–i.e.,whatworksforwhominwhatcontexts(Table5or,foradetaileddescriptionseealsoWesthorp2014,forexample).
Whatis‘complex’isnotjustvery,verycomplicatedbutisfundamentallydifferent,andthestrategiesusedtodealwithcomplicationarenotlikelytobeeffectivehere.Whatisconsidered‘complex’isemergenteitherbecausethesituationisrapidlychangingand/orthelevelofknowledgeavailableisinsufficient.Alinearapproachof‘situationanalysis,thenplanning,andthen‘doing’isboundtofail.Rather,complexinterventionaspectsrequireaniterativeapproach,withdevelopmentofearlyprototypesandrapidtriallingandadaptation,aswellasongoingscanningofthesituationasitchanges.
Intervention elements that are complex will require considerable reflection, as one needs to analyse the emerging evidence of what seems to be working and what seems problematic – i.e. what is working in the current conditions? What is the best way forward at this point in time? (Table 5) – and make evidence-based decisions on how to move forward at that point in time. This distinction between 'complicated' and 'complex' is not universally used by those claiming to address complexity in evaluation; many discussions of complexity are actually referring to layers of complication. It should be emphasised that this typology does not represent a hierarchy in which 'simple' is necessarily 'easy', or 'complex' is necessarily better than 'simple'.
Table 5: Distinguishing simple, complicated and complex aspects of interventions and their associated impact focus

Simple, 'known': Standardised – a single way to do it. Works pretty much the same everywhere/for everyone. Best practices can be recommended confidently. Knowledge transfer.
Impact focus: did it work or is it still working?

Complicated, 'knowable': Adapted – need to do it differently in different settings. Works only in specific contexts that can be identified. Good practices in particular contexts. Knowledge translation.
Impact focus: what worked for whom in what ways and in what contexts?

Complex, 'unknowable': Adaptive – need to work it out as you go along. Dynamic and emergent. Patterns are only evident in retrospect. Ongoing knowledge generation.
Impact focus: what is working in the current conditions? What is the best way forward at this point in time?
5.3 Balancing emphasis on accountability and learning

To work well, any M&E system – including an impact-oriented one – needs to ensure people are motivated, and have both the means and the opportunity, to generate and use information for learning as well as accountability purposes. However, M&E is all too often a tug-of-war between the need for 'accountability' (showing you are doing what the contract says) and the desire to ensure 'learning' (understanding what is and is not working, and why). And often, the need for accountability is prioritised over the need for learning.
The results-orientation of many international development efforts elevates success that is illustrated with numeric data over more nuanced stories of social transformation. And many interventions are implemented on the mistaken assumption that there is more predictability and order than actually exists. Hence, performance tracking is prioritised over learning from the complex dynamics in which the intervention operates (Guijt 2010). When developing an impact-oriented M&E system, it is important to have a candid discussion at the outset about the relative emphasis on accountability and learning.
5.4 Prioritising impact-related information needs

As already noted, different stakeholders may have different understandings of what constitutes 'impact', and the broad OECD-DAC definition of impact6 certainly leaves room for many interpretations. It is, therefore, important to clarify how impact is defined. In any case, integrating an impact-orientation into the M&E system increases the number of evaluation questions and the range of data to be collected to determine what changes have taken place and what factors may have contributed to them. Operationalising 'what was the impact of the intervention?' into a realistic number of key questions requires prioritisation. This involves:
• identifying who needs what information, when and for what decisions
• identifying what elements in the ToC are least understood and most crucial to assess
• exploring stakeholder preferences for particular questions and types of evidence
• determining what questions may feasibly be answered within the timeframe linked to decision-making needs and with the available resources.
The prioritisation process requires reconciling many, sometimes conflicting, priorities – particularly when the intervention involves a large number of stakeholders and/or has many components. For example, stakeholders in one Methods Lab case represented five organisations leading implementation and nine supporting organisations in partnership with three government ministries; together they identified more than 40 potential evaluation questions covering 13 topical domains.
Often, grant-funded projects identify a broad ToC but select quite specific indicators to collect as part of the project proposal stage. Starting with indicators orients M&E towards what can be measured rather than towards asking key questions about the project (for which there may be hypotheses rather than definitive measures). Identifying impact-related questions may take place once projects are approved and M&E systems are being developed, or when existing organisational M&E systems are applied to the newly approved project. Making the M&E system fit for purpose can be further complicated by multi-component programmes or initiatives where projects are grouped together under a common set of high-level objectives and a programme-wide ToC, including a requirement to use a set of common indicators (Buffardi and Hearn 2015).
5.5 Clarifying M&E roles and sequencing M&E activities

Determining who will be involved in what M&E activities is as important as deciding how the M&E system will be structured. Identifying and agreeing on who will be involved in which impact monitoring and evaluation tasks – particularly the role of implementing staff – should take place early on. Implementing staff are typically specialised in a particular sector of development work but are not necessarily well-versed in M&E. Differences in the M&E terminology used (e.g., what constitutes an output, outcome or impact) can create confusion, and different experiences with M&E can underscore the perception that it is the exclusive domain of experts. Taking on additional M&E tasks also has important implications for staff time management. Dividing M&E responsibilities, conducting joint impact assessments and/or sharing data within and across organisations should be explored wherever possible.
After project design and approval, it may take several months for all key operational staff to be in post, with a relatively short window of time between staff start dates and initiation of project activities. The stakeholders involved in designing the M&E system may not,
6. 'Positive and negative, primary and secondary long-term effects produced by a development intervention, directly or indirectly, intended or unintended.' (OECD-DAC 2010)
therefore, include key implementation staff. This division of labour requires a delicate balance; M&E decisions should be made early enough to allow time for development and potentially preparation for baseline data collection, yet include sufficient key staff to provide input and buy-in. Moreover, the group of stakeholders will likely also change over time, including through staff turnover or new partner organisations becoming involved. It may also be difficult to anticipate at the start what priorities in terms of impact will be most relevant to a donor organisation in office four or five years later. Prioritisation of impact-related questions needs, therefore, to be revisited over time and adjusted as needed.
It is also critical to set realistic expectations for what can be achieved in M&E implementation – and for what can be sustained longer term. Some general good practice rules are to:
• build on what is already in place; do not duplicate or set up parallel systems
• start small and strengthen the system over time
• conduct regular M&E system assessments and prioritise capacity strengthening over time
• stay the course: prioritise actions without short-changing immediate needs or compromising long(er)-term needs.
6. Conclusion

This guidance note was developed in response to a common challenge experienced by organisations whereby they commission an impact evaluation at the end of an intervention only to find that there is insufficient data about implementation, context, baselines or interim results. We provide a rationale for dealing with impact – if deemed relevant to the type of intervention – early on in the intervention cycle, and set out the main benefits of doing so. Primarily, these are that: it helps to shift the focus of the assessment from indicators to impact-related questions, thereby broadening what can be learned about the value and worth of the intervention; it can improve the availability, timeliness and quality of data which are pertinent for decision making about the intervention; and it allows for early attention to collective sense-making and appropriate interpretation of the data collected, as well as building in support for effective use.

Recognising that not all interventions need to have impact-oriented M&E, we present considerations and tools to help organisations decide whether it makes sense to invest time and resources into developing such a system. Specifically, integrating an impact-orientation should only happen when: (i) information about impact will be useful and timely to support specified decision-making needs; (ii) impact is deemed plausible and is feasible to assess with rigour; and (iii) resources and capacity for collecting, analysing and interpreting impact data are adequate.

This guidance note discusses the importance of using the ToC as the foundation for impact-oriented M&E; determining impact focus based on complexity thinking; balancing emphasis on accountability and learning; prioritising impact-related information needs; and clarifying M&E roles and sequencing M&E activities. This guidance supports M&E advisors, programme managers and implementers in planning for an M&E system that supports impact assessment, or in adapting an existing M&E system to incorporate an impact perspective.
References

Bossuyt, J., Shaxson, L., Datta, A. (2014) Assessing the uptake of strategic evaluations in EU development cooperation. Available at http://ecdpm.org/wp-content/uploads/assessing_uptake_strategic_evaluation.pdf

Buffardi, A.L., Hearn, S. (2015) Multi-project programmes: functions, forms and implications for evaluation and learning. A Methods Lab publication. London: Overseas Development Institute (www.odi.org/publications/10181-multi-project-programmes-functions-forms-implications-evaluation-learning)

Cameron, D., Mishra, A., Brown, A. (2015) The growth of impact evaluation for international development: how much have we learned? Journal of Development Effectiveness. http://dx.doi.org/10.1080/19439342.2015.1034156

Centers for Disease Control and Prevention (CDC), Global AIDS Program, National Center for HIV, STD, and TB Prevention (US), ORC Macro (2003) Monitoring & evaluation capacity building for program improvement. Field guide. http://stacks.cdc.gov/view/cdc/5609

CONCORD (2010) An Approach to Impact-Oriented Programming, Implementation, Monitoring and Evaluation. A CONCORD Discussion Paper. www.dochas.ie/Shared/Files/4/An_approach_to_impact-oriented_programming.pdf

Davies, R. (2013) Planning evaluability assessments. A synthesis of the literature with recommendations. DFID Working Paper 40, October 2013. London: Department for International Development. http://r4d.dfid.gov.uk/pdf/outputs/misc_InfoComm/61141-DFIDWorkingPaper40-finalOct13.pdf

Dlamini, N. (2006) Transparency of Process – Monitoring and Evaluation in Learning Organisations. South Africa: CDRA. www.cdra.org.za/articles-by-cdra-practitioners.html

Dunn, E. (2008) Planning for cost effective evaluation with evaluability assessment. Impact assessment primer series. Publication #6. Washington DC: USAID. http://pdf.usaid.gov/pdf_docs/PNADN200.pdf

Funnell, S., Rogers, P. (2011) Purposeful program theory. Effective use of theories of change and logic models. San Francisco: Wiley/Jossey-Bass.

Glouberman, S., Zimmerman, B. (2002) Complicated and complex systems: What would successful reform of Medicare look like? Commission on the Future of Health Care in Canada. Discussion Paper 8. www.hc-sc.gc.ca/english/pdf/romanow/pdfs/8_Glouberman_E.pdf

Guinea, J., Sela, E., Gómez-Núñez, A., Mangwende, T., Ambali, A., Ngum, N., Jaramillo, H., Gallego, J., Patiño, A., Latorre, C., Srivanichakorn, S., Thepthien, B. (2015) Impact oriented monitoring: A new methodology for monitoring and evaluation of international public health research projects. Research Evaluation 24(2): 131-145. http://rev.oxfordjournals.org/content/24/2/131.full

Guijt, I. (2010) Exploding the Myth of Incompatibility between Accountability and Learning. In: Ubels, J., Acquaye-Baddoo, N., Fowler, A. Capacity Development in Practice, p. 277-91. The Hague: SNV. www.snvworld.org/sites/www.snvworld.org/files/publications/21_accountability_and_learning_-_exploding_the_myth_of_incompatibility_between_accountability_and_learning_-_irene_guijt_-.pdf

Guijt, I. with Nguyen, A., Ngoc, L., Dang, H., and Manh, L. (2014) Report on the Participatory Impact Evaluation of the 'Doing Business with the Rural Poor' Project in Bên Tre, Viet Nam (2008-2013). A Pilot Application of a Participatory Impact Assessment & Learning Approach. Rome: IFAD and BMGF.

Hearn, S., Buffardi, A.L. (2016) What is impact? A Methods Lab publication. London: Overseas Development Institute (www.odi.org/publications/10326-impact)

IEG (2009) Institutionalizing Impact Evaluation Within the Framework of a Monitoring and Evaluation System. Washington DC: Independent Evaluation Group (IEG) and the Poverty Analysis, Monitoring and Impact Evaluation Thematic Group, World Bank. http://ieg.worldbank.org/Data/reports/inst_ie_framework_me.pdf

Kusek, J., Rist, R. (2004) Ten steps to a results based monitoring and evaluation system. A handbook for development practitioners. Washington DC: The World Bank.

Ling, A. (2014) Revisiting impact in the context of unintended consequences. Presentation at the 2014 African Evaluation Association Conference, Yaoundé, Cameroon.

OECD-DAC (2010) Glossary of Key Terms in Evaluation and Results Based Management. Paris: OECD-DAC. www.oecd.org/development/peer-reviews/2754804.pdf

Peersman, G., Rugg, D. (2010) Basic terminology and frameworks for monitoring and evaluation. Geneva: UNAIDS Monitoring and Evaluation Fundamentals.

Peersman, G., Guijt, I., Pasanen, T. (2015) Evaluability assessment for impact evaluation. Guidance, checklist and decision support for those conducting the assessment. A Methods Lab publication. London: Overseas Development Institute & BetterEvaluation. http://betterevaluation.org/resource/guide/evaluability_assessment_for_impact_evaluation

Perrin, B. (2012) Linking monitoring and evaluation to impact evaluation. Impact Evaluation Notes No. 2. InterAction. www.interaction.org/document/guidance-note-2-linking-monitoring-and-evaluation-impact-evaluation

Rogers, P. (2012) Introduction to Impact Evaluation. Impact Evaluation Notes No. 1. InterAction. www.interaction.org/document/introduction-impact-evaluation

Rogers, P. (2014) Overview of Impact Evaluation. Methodological Briefs on Impact Evaluation, No. 1. Florence: UNICEF Office of Research.

Rogers, P. (2014) Overview: Strategies for causal attribution. Methodological Briefs on Impact Evaluation, No. 6. Florence: UNICEF Office of Research.

Rogers, P., Peersman, G. (2014) Addressing complexity in evaluation. Canberra: DFAT workshop.

Rugg, D., Peersman, G., Carael, M. (eds) (2004) Global advances in HIV/AIDS monitoring and evaluation. New Directions for Evaluation 103.

Savedoff, W. (2013) 'Impact Evaluation: Where have we been? Where are we going?' Presentation at 'Impact evaluation: How can we learn more? Better?'. Washington DC: Center for Global Development, 17 July 2013. http://international.cgdev.org/sites/default/files/Savedoff.pdf

Stame, N. (2010) What doesn't work? Three failures, many answers. Evaluation 16(4): 371-87.

Vaessen, J., Garcia, O., Uitto, J. (2014) Making M&E More 'Impact-oriented': Illustrations from the UN. IDS Bulletin 45(6): 65-76. http://onlinelibrary.wiley.com/doi/10.1111/1759-5436.12113/abstract

Van Hemelrijck, A., Kyei-Mensah, G. (2015) Final Report on the Participatory Impact Evaluation of the Root and Tuber Improvement & Marketing Program (RTIMP). Pilot Application of a Participatory Impact Assessment and Learning Approach (PIALA). IFAD, Government of Ghana, BMGF.

Westhorp, G. (2014) Realist impact evaluation: An introduction. A Methods Lab publication. London: Overseas Development Institute and BetterEvaluation. http://betterevaluation.org/resources/realist-impact-evaluation-introduction
Programme theory / theory of change / logic model
• BetterEvaluation.Develop programme theory.http://betterevaluation.org/plan/define/develop_logic_model • Forti,M.(2012)SixTheoryofChangePitfallstoAvoid.Stanford Social Innovation Review,23May2012. www.ssireview.org/blog/entry/six_theory_of_change_pitfalls_to_avoid
• Funnell,S.,Rogers,P.(2011)PurposefulProgramTheory:EffectiveUseofLogicModelsandTheoriesofChange.SanFrancisco:Wiley/Jossey-Bass.
• Hivos(2015)TheoryofChangeThinkinginPractice:Astepwiseapproach.Wageningen:Hivos.www.theoryofchange.nl/resource/theory-change-thinking-practice-stepwise-approach
• Rogers,P.(2014)TheoryofChange.MethodologicalBriefsonImpactEvaluation,No.2.Florence:UNICEFOfficeofResearch.
• Valters,C.(2014)Theoriesofchangeininternationaldevelopment:Communication,learningoraccountability.London:JusticeandSecurityResearchProgramme,InternationalDevelopmentDepartment,LSEandAsiaFoundation.www.lse.ac.uk/internationalDevelopment/research/JSRP/downloads/JSRP17.Valters.pdf
Complexity thinking and M&E • Britt,H.(2013)Complexity-aware monitoring.USAIDDiscussionNoteVersion2.0.WashingtonDC:USAID.https://usaidlearninglab.org/sites/default/files/resource/files/Complexity%20Aware%20Monitoring%202013-12-11%20FINAL.pdf
• FSG(2014)Evaluating complexity. Propositions for improving practice.www.fsg.org/publications/evaluating-complexity • Hummelbrunner,R.,Jones,H.(2013)A guide for planning and strategy development in the face of complexity.ODIBackgroundNote.London:OverseasDevelopmentInstitute.www.odi.org/sites/odi.org.uk/files/odi-assets/publications-opinion-files/8287.pdf
• Mowles,C.,Stacey,G.,Griffin,D.(2008)Whatcontributioncaninsightsfromthecomplexitysciencesmaketothetheoryandpracticeofdevelopmentmanagement?Journal of International Development20(6):804–20.
• Rogers,P.(2008)Usingprogrammetheorytoevaluatecomplicatedandcomplexaspectsofinterventions.Evaluation14(1):29-48.
Monitoring and evaluation • BetterEvaluation.Aninternationalcollaborationtoimproveevaluationpracticeandtheorybysharingandgeneratinginformationaboutoptions(methodsorprocesses)andapproaches.www.betterevaluation.org
• BetterEvaluation.Impactevaluation.http://betterevaluation.org/themes/impact_evaluationIncluding13methodologicalbriefs,severalofthemwithananimatedvideoandwebinar.
• Bonbright,D.(2012)Use of impact evaluation results.Note4.WashingtonDC:InterAction.www.interaction.org/document/guidance-note-4-use-impact-evaluation-results
• CentersforDiseaseControlandPrevention(CDC),GlobalAIDSProgram,NationalCenterforHIV,STD,andTBPrevention(US),ORCMacro(2003).Monitoring & evaluation capacity building for program improvement.Fieldguide.http://stacks.cdc.gov/view/cdc/5609
• Guijt,I.(2010)ExplodingtheMythofIncompatibilitybetweenAccountabilityandLearning.In:Ubels,J.,Acquaye-Baddoo,N.,Fowler,A. Capacity Development in Practice,p.277–91.TheHague:SNV.www.snvworld.org/sites/www.snvworld.org/files/publications/21_accountability_and_learning_-_exploding_the_myth_of_incompatibility_between_accountability_and_learning_-_irene_guijt_-.pdf
• IFAD. Evaluation Guide. www.ifad.org/evaluation/guide/
• Independent Evaluation Group (2009) Institutionalizing impact evaluation within the framework of a monitoring and evaluation system. Washington DC: Independent Evaluation Group (IEG) and the Poverty Analysis, Monitoring and Impact Evaluation Thematic Group, World Bank. http://ieg.worldbank.org/Data/reports/inst_ie_framework_me.pdf
• Peersman, G., Guijt, I., Pasanen, T. (2015) Evaluability assessment for impact evaluation. Guidance, checklist and decision support for those conducting the assessment. A Methods Lab publication. London: Overseas Development Institute. http://betterevaluation.org/resource/guide/evaluability_assessment_for_impact_evaluation
• Peersman, G., Rugg, D. (2010) Basic terminology and frameworks for monitoring and evaluation. Geneva: UNAIDS Monitoring and Evaluation Fundamentals. www.unaids.org/sites/default/files/sub_landing/files/7_1-Basic-Terminology-and-Frameworks-MEF.pdf
• Perrin, B. (2012) Linking monitoring and evaluation to impact evaluation. Impact Evaluation Notes 2. Washington DC: InterAction. www.interaction.org/sites/default/files/Linking%20Monitoring%20and%20Evaluation%20to%20Impact%20Evaluation.pdf
• Rogers, P. (2012) Introduction to Impact Evaluation. Impact Evaluation Notes 1. Washington DC: InterAction. www.interaction.org/document/introduction-impact-evaluation
• Stern, E., Stame, N., Mayne, J., Forss, K., Davies, R., Befani, B. (2012) Broadening the Range of Designs and Methods for Impact Evaluations. DFID Working Paper 38. London: Department for International Development. www.gov.uk/government/uploads/system/uploads/attachment_data/file/67427/design-method-impact-eval.pdf
• Stern, E. (2015) Impact Evaluation: A guide for commissioners and managers. London: Bond.
• UNAIDS (2008) Organising Framework for a Functional National HIV Monitoring and Evaluation System. Geneva: UNAIDS. www.unaids.org/sites/default/files/sub_landing/files/20080430_JC1769_Organizing_Framework_Functional_v2_en.pdf
• UNAIDS (2009a) 12 Components Monitoring and Evaluation System Strengthening Tool. Geneva: UNAIDS. www.unaids.org/sites/default/files/sub_landing/files/2_MERG_Strengthening_Tool_12_Components_ME_System.pdf
• UNAIDS (2009b) 12 Components Monitoring & Evaluation System Assessment. Guidelines to support preparation, implementation and follow-up activities. Geneva: UNAIDS. www.unaids.org/sites/default/files/sub_landing/files/1_MERG_Assessment_12_Components_ME_System.pdf
ODI is the UK’s leading independent think tank on international development and humanitarian issues.
Readers are encouraged to reproduce Methods Lab material for their own publications, as long as they are not being sold commercially. As copyright holder, ODI requests due acknowledgement and a copy of the publication. For online use, we ask readers to link to the original resource on the ODI website. The views presented in this paper are those of the author(s) and do not necessarily represent the views of ODI, the Australian Department of Foreign Affairs and Trade (DFAT) and BetterEvaluation.
© Overseas Development Institute 2016. This work is licensed under a Creative Commons Attribution-NonCommercial Licence (CC BY-NC 4.0).
ISSN: 2052-7209
ISBN: 978-0-9941522-2-0
All ODI Reports are available from www.odi.org
Overseas Development Institute
203 Blackfriars Road
London SE1 8NJ
Tel +44 (0)20 7922 0300
Fax +44 (0)20 7922 0399
www.odi.org