Logic Models: How to Develop, Link to M&E and Adapt
Evaluating International Development Projects: A One-Day Skills-Building Workshop on M&E
Cornell International Institute for Food and Agriculture Development
November 5, 2011
Lesli Hoey, PhD Candidate
Cornell Department of City and Regional Planning
Outline
1. How to develop a logic model
2. Using logic models to design M&E
3. M&E across program phases
4. Linear vs. complex interventions
Step 1: Purpose and use
Why are you developing a logic model? Who will use it? How?
Step 2: Involve others
Who should participate in creating the logic model?
Step 3: Set the boundaries for the logic model
What will the logic model depict: a single, focused endeavor; a comprehensive initiative; a collaborative process? What level of detail is needed?
Step 4: Understand the situation
What is the situation giving rise to the intervention? What do we know about the problem/audience/context?
Adapted from: Taylor-Powell and Henert, 2008
Developing a Logic Model
Adapted from: Taylor-Powell and Henert, 2008
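The chain a logic model depicts can also be written down as a data structure: if we have these inputs, then we can run these activities; if activities reach participants, then outputs and outcomes follow. A minimal sketch in Python, not part of the workshop materials; all names and fields are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A program logic model as an 'if-then' chain: inputs enable
    activities, activities yield outputs, outputs lead to outcomes."""
    inputs: list[str] = field(default_factory=list)        # resources: staff, funds, partners
    activities: list[str] = field(default_factory=list)    # what the program does
    outputs: list[str] = field(default_factory=list)       # direct products, participation
    short_term_outcomes: list[str] = field(default_factory=list)  # learning, awareness
    mid_term_outcomes: list[str] = field(default_factory=list)    # action, behavior change
    long_term_outcomes: list[str] = field(default_factory=list)   # changed conditions

    def gaps(self) -> list[str]:
        """List empty links in the chain, a simple version of the
        'identify gaps' check in the process options below."""
        order = ["inputs", "activities", "outputs", "short_term_outcomes",
                 "mid_term_outcomes", "long_term_outcomes"]
        return [name.replace("_", " ") for name in order if not getattr(self, name)]
```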
Process Options
1) Everyone identifies resources, activities, participants and outcomes on post-it notes arranged on a wall. Check for "if-then" relationships, edit duplicates, identify gaps, etc.
2) Small subgroups develop their own logic model of the program. The whole group merges these into one.
3) Participants bring a list of program outcomes. Sort these into short- and long-term outcomes by target group. Edit duplicates, identify gaps, etc. Discuss assumptions about the chain of outcomes and about external factors. Link resources and activities.
4) Use web-based systems, e-mail or other distance methods.
5) A subcommittee creates the model and reviews it with others.
Logic Models & Evaluation
Helps us match evaluation to the program
Helps us know what and when to measure
- Are you interested in process and/or outcomes?
Helps us focus on key, important information
- Where will you spend limited evaluation resources?
- What do we really need to know?
Source: Taylor-Powell and Henert, 2008
Types of Evaluation Mapped Across the Logic Model
Needs/asset assessment: What are the characteristics, needs and priorities of the target population? What are potential barriers/facilitators? What is most appropriate to do?
Process evaluation: How is the program implemented? Are activities delivered as intended, with fidelity to the design? Are participants being reached as intended? What are participant reactions?
Outcome evaluation: To what extent are desired changes occurring? Are goals being met? Who is benefiting or not benefiting, and how? What seems to work, and what does not? What are unintended outcomes?
Impact evaluation: To what extent can changes be attributed to the program? What are the net effects? What are the final consequences? Is the program worth the resources it costs?
Source: Taylor-Powell and Henert, 2008
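Since each evaluation type interrogates a different stretch of the logic model, the mapping can be summarized as a lookup table. A sketch in the same spirit; the dictionary layout is ours, and the questions are condensed from the slide above:

```python
# Evaluation types mapped across the logic model (after Taylor-Powell
# and Henert, 2008). Keys and layout are illustrative only.
EVALUATION_MAP = {
    "needs/asset assessment": {
        "logic_model_stage": "situation / priorities",
        "questions": ["What are the needs and priorities of the target population?",
                      "What are potential barriers/facilitators?",
                      "What is most appropriate to do?"],
    },
    "process evaluation": {
        "logic_model_stage": "activities / outputs",
        "questions": ["How is the program implemented?",
                      "Are activities delivered as intended?",
                      "Are participants being reached as intended?"],
    },
    "outcome evaluation": {
        "logic_model_stage": "short- and mid-term outcomes",
        "questions": ["To what extent are desired changes occurring?",
                      "Who is benefiting, and who is not?",
                      "What are unintended outcomes?"],
    },
    "impact evaluation": {
        "logic_model_stage": "long-term outcomes / impact",
        "questions": ["To what extent can changes be attributed to the program?",
                      "What are the net effects?",
                      "Is the program worth the resources it costs?"],
    },
}

def questions_for(evaluation_type: str) -> list[str]:
    """Return the guiding questions for a given evaluation type."""
    return EVALUATION_MAP[evaluation_type]["questions"]
```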
Water Quality Project Example
[Table mapping formative evaluation questions, summative evaluation questions, and indicators for the project]
Source: Taylor-Powell, 2002
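Only the table's column headers survived transcription, but the row structure they imply is easy to illustrate: each row pairs a formative question with a summative question and the indicators that would answer both. The example row below is hypothetical, not reconstructed from Taylor-Powell's original:

```python
from dataclasses import dataclass

@dataclass
class EvaluationRow:
    """One row of such a table: what to ask while the program runs
    (formative), what to ask at the end (summative), and what to measure."""
    formative_question: str
    summative_question: str
    indicators: list[str]

# Hypothetical row, for illustration only (not from the original slide):
example = EvaluationRow(
    formative_question="Are farmers adopting the recommended practices?",
    summative_question="Did water quality improve in the target watershed?",
    indicators=["number of farms adopting practices",
                "nitrate concentration at monitoring sites"],
)
```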
Program Phases and Evaluation (formative → summative continuum)
Initiation – Need dynamic, flexible, rapid feedback about implementation and process. Includes monitoring, post-only feedback, unstructured observation, and sharing of implementation experiences. Mostly qualitative.
Development – Focus on observation, assessment of change in key outcomes, and emerging consistency. Includes pre-post differences. Qualitative or quantitative.
Mature – When a program is routinized and stable, compare outcomes with expectations, with performance in alternative programs, or with sites that have no program. Includes experimental and quasi-experimental designs and more structured, comparative qualitative approaches.
Dissemination – Focused on transferability, generalizability or external validity. Measure consistency of outcomes across different settings, populations or program variations.
Source: Trochim, 2006
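Read as a rule of thumb, Trochim's pairing says: the younger the program, the more flexible and qualitative the evaluation; the more mature, the more structured and comparative. A sketch of that pairing as a lookup; the function and key names are ours, the pairings follow the slide:

```python
# Evaluation approaches by program phase (after Trochim, 2006).
PHASE_DESIGNS = {
    "initiation": ["monitoring", "post-only feedback",
                   "unstructured observation",
                   "sharing of implementation experiences"],
    "development": ["pre-post differences",
                    "assessment of change in key outcomes"],
    "mature": ["experimental designs", "quasi-experimental designs",
               "structured, comparative qualitative approaches"],
    "dissemination": ["consistency of outcomes across settings",
                      "consistency across populations and program variations"],
}

def suggested_designs(phase: str) -> list[str]:
    """Return evaluation designs suited to a program's maturity phase."""
    try:
        return PHASE_DESIGNS[phase.lower()]
    except KeyError:
        raise ValueError(f"unknown phase {phase!r}; "
                         f"expected one of {sorted(PHASE_DESIGNS)}")
```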
Three ways of conceptualizing and mapping theories of change
1. Linear Newtonian causality
2. Interdependent systems relationships
3. Complex nonlinear dynamics
Source: Patton, 2008
Interdependent Systems Relationships
[Diagram: four departments (Dept 1, Dept 2, Dept 3, Dept 4) contributing to shared outputs, which lead to short-term, mid-term, and long-term outcomes]
Adapted from Chapel, 2006 in Taylor-Powell and Henert, 2008
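What distinguishes this view from the linear chain is topology: several units feed shared outputs, so the model is a directed graph rather than a single pipeline. A sketch using a plain adjacency dict, with the diagram's node names:

```python
# The interdependent-systems logic model as a directed graph
# (after Chapel, 2006). Node names follow the diagram.
EDGES = {
    "Dept 1": ["outputs"],
    "Dept 2": ["outputs"],
    "Dept 3": ["outputs"],
    "Dept 4": ["outputs"],
    "outputs": ["short-term outcomes"],
    "short-term outcomes": ["mid-term outcomes"],
    "mid-term outcomes": ["long-term outcomes"],
}

def downstream(node: str) -> set[str]:
    """Everything a node eventually contributes to, via depth-first search."""
    seen: set[str] = set()
    stack = list(EDGES.get(node, []))
    while stack:
        nxt = stack.pop()
        if nxt not in seen:
            seen.add(nxt)
            stack.extend(EDGES.get(nxt, []))
    return seen

# e.g. downstream("Dept 2") yields {"outputs", "short-term outcomes",
# "mid-term outcomes", "long-term outcomes"}
```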
Complex, Non-Linear Intervention
[Diagram: interdependent elements reinforcing one another to produce EFFECTIVE ADVOCACY: strong national/grassroots coordination; timely, opportunistic lobbying & judicial engagement; strong, high-capacity coalitions; disciplined, focused message / effective communications; collaborating funders / strategic funding; solid knowledge & research base]
Source: Patton, 2008
Conditions that challenge traditional model-testing evaluation
• High innovation
• Ongoing development
• High uncertainty
• Dynamic, rapid change
• Emergent (difficult to plan and predict)
• Systems change
• Interdependence
→ Adaptive Management
Adapted from: Patton, 2008
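Under these conditions, evaluation works less like a one-shot test and more like a feedback loop: observe, interpret, adapt, repeat. A schematic sketch only; the program object and its methods are hypothetical placeholders, not an API from any cited source:

```python
class _DemoProgram:
    """Hypothetical stand-in with canned feedback, for illustration only."""
    def __init__(self):
        self._pending = ["participants attend irregularly"]
    def collect_rapid_feedback(self):
        return self._pending                      # monitoring data, field notes
    def interpret(self, observations):
        return [f"respond to: {o}" for o in observations]  # sense-making with the team
    def adapt(self, lessons):
        self._pending = []                        # pretend the issue was addressed

def adaptive_management_cycle(program, max_iterations: int = 10):
    """Schematic developmental-evaluation loop: rapid feedback feeds
    ongoing adaptation instead of a one-time verdict."""
    for _ in range(max_iterations):
        observations = program.collect_rapid_feedback()
        if not observations:                      # nothing new emerged this cycle
            break
        lessons = program.interpret(observations)
        program.adapt(lessons)                    # revise activities, even goals

adaptive_management_cycle(_DemoProgram())
```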
Ideal Type Evaluation Models
Adapted from: Patton, 2008

Traditional | Developmental
Tests models | Supports innovation and adaptation
Renders definitive judgment of success or failure | Provides feedback, generates learning, affirms changes in a certain direction
Measures success against predetermined goals | Develops new measures and monitoring mechanisms as goals emerge and evolve
Evaluator external, objective | Evaluator part of the team, a 'learning coach'
Evaluator determines design | Evaluator collaborates on design
Design based on linear cause-effect model | Design captures system dynamics, interdependencies, emergent interconnections
Aims to produce generalizable findings across time & space | Aims to produce context-specific understanding to inform ongoing innovation
Accountability directed externally, to control | Accountability focused on commitment to learning, for responding to lack of control
Engenders fear of failure | Engenders desire to learn
Useful Resources
- See the CIIFAD website for evaluation institutes and WMU (Western Michigan University).
- Visit the University of Wisconsin-Extension website.
- Look at these books:
  - Bamberger, M., Rugh, J. and Mabry, L. 2011 (2nd ed.). RealWorld Evaluation: Working Under Budget, Time, Data, and Political Constraints. Los Angeles: Sage.
  - Patton, M.Q. 2008 (4th ed.). Utilization-Focused Evaluation. Los Angeles: Sage.
  - Patton, M.Q. 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: Guilford Press.
  - Williams, B. and Imam, I. 2006. Systems Concepts in Evaluation: An Expert Anthology. Point Reyes, CA: EdgePress/AEA.
  - World Bank. 2006. Conducting Quality Impact Evaluations Under Budget, Time and Data Constraints. Washington, DC: World Bank.
References Cited
Patton, M.Q. 2008. "Evaluating the complex: Getting to maybe." PowerPoint presented in Oslo, Norway. Available online: aidontheedge.files.wordpress.com/2009/09/patton_oslo.ppt
Taylor-Powell, E. and Henert, E. 2008. "Developing a logic model: Teaching and training guide." Madison: University of Wisconsin-Extension.
Trochim, W. 2007. "Evolutionary perspectives on evaluation: Theoretical and practical implications." Paper presented at the Colorado Evaluation Network.