Evaluation pal program monitoring and evaluation technology


Description

In this session, Dr. Cugelman will discuss his work to develop an automated program monitoring and evaluation technology called Evaluation Pal. He launched Evaluation Pal in 2011 and, in 2012, pilot-tested it in an evaluation of the Green Infrastructure Ontario Coalition that was submitted to the Ontario Trillium Foundation. Soon after, MaRS' Social Innovation Generation accepted it into its incubator program. Dr. Cugelman will provide a tour of the tool and use the Green Infrastructure Ontario case study to demonstrate how automated data collection can be used in the program evaluation process. The presentation will also provide an opportunity to discuss the challenges and opportunities of using technology to aid program evaluation.

Transcript of Evaluation pal program monitoring and evaluation technology

Page 1: Evaluation pal program monitoring and evaluation technology

www.evaluationpal.com

Evaluation Pal Program monitoring and evaluation technology

Canadian Evaluation Society Conference (2013)

Brian Cugelman, PhD

BETA 2

www.evaluationpal.com

Page 2: Evaluation pal program monitoring and evaluation technology

AGENDA

1. Monitoring and evaluation, plus elephants

2. The vision of real-time feedback

3. Introducing Evaluation Pal

4. Project history

2

Page 3: Evaluation pal program monitoring and evaluation technology

© Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp.

MONITORING AND EVALUATION, PLUS ELEPHANTS

3

Page 4: Evaluation pal program monitoring and evaluation technology

4

M & E: THE ELEPHANT IN THE ROOM

We say it's about:

• Decision making: Making better decisions based on evidence
• Performance improvement: Learning what works and improving performance
• Risk mitigation: Identifying risks early, to avoid potential crises

But it's often about:

• Accountability: Satisfying donor requirements

Page 5: Evaluation pal program monitoring and evaluation technology

M & E FOR MANY ORGANIZATIONS

• Requires expensive consultants

• The process takes up too much staff time

• Valuable information often comes too late

• Few people read big reports

• Evaluators sometimes scare people

5

Page 6: Evaluation pal program monitoring and evaluation technology

© Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp.

THE VISION OF REAL-TIME FEEDBACK

6

Page 7: Evaluation pal program monitoring and evaluation technology

DESIGNING & IMPLEMENTING

7

Design vs. execution (2×2 matrix):

• Evidence informed design, bad execution: Promising intervention poorly executed
• Evidence informed design, good execution: Promising intervention well executed
• Not evidence informed design, bad execution: Unlikely intervention poorly executed
• Not evidence informed design, good execution: Unlikely intervention well executed

• Research is only part of the equation
• Execution is just as important

Page 8: Evaluation pal program monitoring and evaluation technology

FEEDBACK AND PERFORMANCE

8

[Feedback loop diagram: Goals → Feedback on performance → Improving performance]

Page 9: Evaluation pal program monitoring and evaluation technology

WHAT SUCCESS NORMALLY LOOKS LIKE

9

• Both Internet marketing and public mobilization seem to follow power laws

• Growth can be logarithmic with peaks and valleys between campaigns

[Chart: Impact over time (years 1-10)]

Page 10: Evaluation pal program monitoring and evaluation technology

WITHOUT FEEDBACK, ORGANIZATIONS CAN’T...

10

• Know when they’re succeeding or failing
• Tell if they’re pursuing the right or wrong goals
• Judge which activities are most or least efficient
• Adapt to difficult situations
• Make improvements
• Tailor their work to stakeholder needs
• Tell if their tactics are backfiring

Feedback is essential to success, for people or organizations.

Page 11: Evaluation pal program monitoring and evaluation technology

TREND TOWARDS ITERATIVE LEARNING AND IMPROVING

11

1. Deploy: Implementing the latest iteration
2. Assess: Measuring and learning
3. Revise: Rethinking and adapting

Unknown cousins:
• Developmental evaluation
• Lean start-up

Page 12: Evaluation pal program monitoring and evaluation technology

© Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp.

INTRODUCING EVALUATION PAL

12

Page 13: Evaluation pal program monitoring and evaluation technology

SOLUTION - EVALUATIONPAL.COM

13

A tool that helps organizations monitor their progress and improve their performance.

Mission: To build a better world by helping social organizations become more effective.

Page 14: Evaluation pal program monitoring and evaluation technology

14

Page 15: Evaluation pal program monitoring and evaluation technology

A TOOL FOR LEARNING CULTURES

15

1. Describe yourself or your organization
2. Ask for the feedback that you need
3. Collect feedback from informants and add hard evidence
4. Learn what's working or not
5. Improve your performance

Page 16: Evaluation pal program monitoring and evaluation technology

CASE STUDY OF OUR EVALUATION OF GIO (GREEN INFRASTRUCTURE ONTARIO COALITION)

16


Page 17: Evaluation pal program monitoring and evaluation technology

1. DESCRIBE

17

Page 18: Evaluation pal program monitoring and evaluation technology

DESCRIBE YOUR ORGANIZATION – LOGIC MODEL

18

Inputs → Activities → Outcomes (short-term, mid-term, long-term) → Ultimate Goal

Inputs:
• Steering Committee members
• Coalition staff
• Expert peer review committee (volunteers)
• Consultants
• Workshop partners
• Volunteers
• Intern
• Funding from Trillium
• Funding from Steering Committee members
• In-kind donations

Activities:
• Conducting outreach & education
• Implementing 5 workshops
• Building the Coalition
• Filing an Environmental Bill of Rights application to change the definition of infrastructure
• Sharing best practices
• Producing the Green Infrastructure Ontario Report
• Carrying out the launch event
• Posting & distributing content through the website
• Producing & sending the e-update
• Operating the Coalition Steering Committee
• Meeting ministers & government staff

Outcomes:
• Increase awareness & support for green infrastructure among non-profit organizations
• Increase awareness & support for green infrastructure among government staff
• Increase coverage of green infrastructure issues in the media
• Increase awareness & support for green infrastructure among decision makers
• Increase political support & priorities for green infrastructure
• Increase support & priorities for green infrastructure among the public
• Increase green infrastructure funding mechanisms
• Increase green infrastructure policy & legislation

Ultimate Goal:
• Increase the implementation of green infrastructure in Ontario
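The logic model above is the structure Evaluation Pal asks organizations to describe. As a rough illustration only, here is a minimal sketch of how such a model could be represented in code; the class and field names are assumptions for this example, not Evaluation Pal's actual data model.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """Illustrative container for a program logic model (hypothetical names)."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)  # short-, mid- and long-term
    ultimate_goal: str = ""

# A few GIO entries from the slide, used as sample data
gio = LogicModel(
    inputs=["Coalition staff", "Volunteers", "Funding from Trillium"],
    activities=["Implementing 5 workshops", "Producing & sending the e-update"],
    outcomes=["Increase awareness & support for green infrastructure among government staff"],
    ultimate_goal="Increase the implementation of green infrastructure in Ontario",
)
print(gio.ultimate_goal)
```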

Page 19: Evaluation pal program monitoring and evaluation technology

DESCRIBE YOUR ORGANIZATION – THREE LOGIC MODELS

19

1. Non-profit organization
2. Social enterprise
3. For-profit organization

There are also logic models for people, focused on personal and professional development.

Page 20: Evaluation pal program monitoring and evaluation technology

20

Page 21: Evaluation pal program monitoring and evaluation technology

21

Page 22: Evaluation pal program monitoring and evaluation technology

22

Page 23: Evaluation pal program monitoring and evaluation technology

2. ASK

23

Page 24: Evaluation pal program monitoring and evaluation technology

24

Over 40 base, extrapolated, and customer insight metrics and measures, covering:

• Investments
• Implementation quality and efficiency
• Impact
• Likelihood of reaching goals
• Stakeholder satisfaction
• Stakeholder demographics & psychographics
• Constituent volunteering and donating
• Brand health & reputation
• Market, strategy, foresight
• Personal development

Page 25: Evaluation pal program monitoring and evaluation technology

METRIC CATEGORIES

25

Demographics and psychographics

Base metrics:
• Investments
• Implementation quality
• Progress towards goals
• Stakeholder & customer engagement
• Reputation and brand health
• Advice for success
• Market attractiveness
• Equitable office

Extrapolated metrics:
• Value for money
• Effective prioritizing
• Effectiveness engagement (Power Analysis)
• Contribution of activities to goals
• SWOT
• PEST
• Source credibility
• Program implementation fidelity
• Most significant change
• Product and service attractiveness
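To show how an extrapolated metric can be built from base metrics, here is a minimal sketch of a simple value-for-money ratio. The formula, field names, and numbers are illustrative assumptions, not Evaluation Pal's actual calculation.

```python
def value_for_money(outcome_change: float, total_investment: float) -> float:
    """Hypothetical extrapolated metric: outcome change per dollar invested.
    Illustrative formula only, not the tool's actual method."""
    if total_investment <= 0:
        raise ValueError("total_investment must be positive")
    return outcome_change / total_investment

# Example: a 12-point gain on an awareness score for a $30,000 investment,
# reported as the gain per $10,000 invested
print(round(value_for_money(12.0, 30_000) * 10_000, 2))  # 4.0
```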

Page 26: Evaluation pal program monitoring and evaluation technology

3. COLLECT

26

Page 27: Evaluation pal program monitoring and evaluation technology

ADD INFORMANTS

Informants span internal, partner, and external groups:

• Staff / Peers
• Managers
• Board members
• Highly involved volunteers
• Volunteers
• Donors / Funders
• Partner organizations
• Consultants and experts
• Customers
• Constituents
• Beneficiaries
• Peer organizations

Page 28: Evaluation pal program monitoring and evaluation technology

28

Page 29: Evaluation pal program monitoring and evaluation technology

TRADITIONAL SAMPLING VERSUS PANEL SURVEYS

Traditional surveys (all in one go)

Evaluation Pal panels (randomly divided across a year)

Page 30: Evaluation pal program monitoring and evaluation technology

TRADITIONAL END OF PROGRAM ASSESSMENTS (all in one go)

• Too much engagement
• Too much information
• Too late to act on insight

[Timeline: Jan-Dec, with a single assessment at year end]

Page 31: Evaluation pal program monitoring and evaluation technology

31

EVALUATION PAL PANELS (randomly divided across a year)

• Only engage a small sample at a time
• Random sampling offers confident findings
• Insight available throughout the year
• Randomization within key informant groups
• Near real time feedback

[Timeline: Jan-Dec, with panels 1-6 spread across the year]
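To make the panel approach concrete, here is a minimal sketch of randomly dividing informants into panels spread across the year, shuffling within each informant group so every panel stays roughly representative. The function name, group names, and number of panels are illustrative assumptions, not Evaluation Pal's implementation.

```python
import random
from collections import defaultdict

def assign_panels(informants: dict[str, list[str]], n_panels: int = 6, seed: int = 42):
    """Randomly spread informants across n_panels, randomizing within each
    informant group (staff, volunteers, partners, ...). Illustrative sketch only."""
    rng = random.Random(seed)
    panels = defaultdict(list)
    for group, people in informants.items():
        shuffled = people[:]
        rng.shuffle(shuffled)
        for i, person in enumerate(shuffled):
            panels[(i % n_panels) + 1].append((group, person))
    return dict(panels)

# Hypothetical informant lists
informants = {
    "staff": ["Ana", "Ben", "Chloe"],
    "volunteers": ["Dev", "Emma", "Farid", "Gita"],
    "partners": ["Org A", "Org B"],
}
for panel, members in sorted(assign_panels(informants).items()):
    print(panel, members)
```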

Page 32: Evaluation pal program monitoring and evaluation technology

32

Page 33: Evaluation pal program monitoring and evaluation technology

33

Sequence of respectful timed messages:
• 28 days
• 14 days
• 7 days
• 1 day before
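As an illustration, a minimal sketch of scheduling such a reminder sequence relative to a survey deadline; the dates and function name are assumptions, not the tool's actual scheduler.

```python
from datetime import date, timedelta

def reminder_dates(deadline: date, offsets_days=(28, 14, 7, 1)) -> list[date]:
    """Return send dates 28, 14, 7 and 1 day before the deadline.
    Illustrative sketch only."""
    return [deadline - timedelta(days=d) for d in offsets_days]

# Hypothetical survey deadline
for send_on in reminder_dates(date(2013, 6, 30)):
    print(send_on.isoformat())
```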

Page 34: Evaluation pal program monitoring and evaluation technology

34

Page 35: Evaluation pal program monitoring and evaluation technology

35

Page 36: Evaluation pal program monitoring and evaluation technology

4. LEARN

36

Page 37: Evaluation pal program monitoring and evaluation technology

Living logic model

[Screenshot: the GIO logic model from Page 18, with each element colour-coded by status: On track, Mixed, At risk, Not assessed]
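For illustration, a minimal sketch of how informant ratings might be rolled up into the status labels shown on a living logic model; the 1-5 scale, thresholds, and function name are assumptions, not Evaluation Pal's actual rules.

```python
from statistics import mean

def element_status(ratings: list[float], on_track: float = 4.0, at_risk: float = 2.5) -> str:
    """Roll informant ratings (assumed 1-5 scale) for one logic model element
    into a status label. Thresholds are illustrative assumptions."""
    if not ratings:
        return "Not assessed"
    avg = mean(ratings)
    if avg >= on_track:
        return "On track"
    if avg <= at_risk:
        return "At risk"
    return "Mixed"

print(element_status([4, 5, 4]))   # On track
print(element_status([3, 3.5]))    # Mixed
print(element_status([]))          # Not assessed
```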

Page 38: Evaluation pal program monitoring and evaluation technology


Contribution of activities towards goals

Page 39: Evaluation pal program monitoring and evaluation technology


[Chart: Effective focus, with quadrants labelled "Concentrate here", "Keep up the good work", "Low priority", and "Possible overkill"]
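The quadrant labels above follow the familiar importance-performance pattern. As a hedged illustration, here is a minimal sketch that sorts an activity into one of the four quadrants from two scores; the score names, scale, and cutoff are assumptions, not how the tool actually builds the chart.

```python
def focus_quadrant(importance: float, performance: float, cutoff: float = 3.0) -> str:
    """Map an activity's importance and performance scores (assumed 1-5 scale)
    to one of the four 'Effective focus' quadrants. Illustrative only."""
    if importance >= cutoff and performance < cutoff:
        return "Concentrate here"
    if importance >= cutoff and performance >= cutoff:
        return "Keep up the good work"
    if importance < cutoff and performance < cutoff:
        return "Low priority"
    return "Possible overkill"

print(focus_quadrant(importance=4.5, performance=2.1))  # Concentrate here
print(focus_quadrant(importance=2.0, performance=4.8))  # Possible overkill
```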

Page 40: Evaluation pal program monitoring and evaluation technology


Efficiency

[Chart: activities plotted from least efficient to most efficient]

Page 41: Evaluation pal program monitoring and evaluation technology


Contribution of activities to goals

Page 42: Evaluation pal program monitoring and evaluation technology

42


Contribution of activities to goals

Page 43: Evaluation pal program monitoring and evaluation technology


Crowd-sourced SWOT

Strengths:
• Active, influential and diverse coalition (15)
• Commitment, motivation and vision (10)
• Communications, outreach and online activities (10)
• Green infrastructure is an important topic (6)
• Credibility (5)
• Networking (5)
• Branding and design (3)
• Evidence based (3)
• Expertise and experience (3)
• Timing (3)
• Ethics and values (2)
• Focus on realistic goals (2)
• Inclusive process (2)
• Sharing best practices (2)
• Workshops and their output (2)

Weaknesses:
• Public engagement and awareness (12)
• Political engagement and support (11)
• Setting coalition goals and focusing on green infrastructure topics (7)
• Funding (6)
• Making a persuasive case for green infrastructure (5)
• Media interest (3)
• Member commitment, engagement and collaboration (3)
• Steering Committee coherence, contributions and leadership (3)
• Not enough engagement with stakeholders (3)
• Achieving concrete outcomes (2)
• Capacity (2)
• Reach outside current network (2)
• Too much on green roofs (2)

Page 44: Evaluation pal program monitoring and evaluation technology


Crowd-sourced SWOT (continued)

Opportunities:
• Expand the Coalition and network (13)
• Highlight economic opportunities and savings (9)
• Raise public awareness and support (5)
• Make links to climate change and green energy (4)
• Align with government and municipal priorities (4)
• Improve government relations and shape policy (3)
• Highlight benefits (3)
• Raise awareness through education and events (2)
• Better use the Coalition (2)
• Access funding (1)
• Build local capacity (1)
• Coalition’s capacity (1)
• Election commitments (1)
• Audit green infrastructure and report progress (1)
• Design school curriculum (1)

Threats:
• Budget limits or perceptions that green infrastructure is not economical (17)
• Public awareness, apathy and competing issues (13)
• GI is not understood or valued, or is seen as a fringe idea (10)
• Persuading implementers that green infrastructure is comparable to grey infrastructure (can’t make a strong case) (7)
• Lack of political relations, awareness and support (5)
• Coalition governance and vision (4)
• Lack of a clear message (1)
• Lack of Canadian case studies (1)
• Lack of media interest (1)
• Not enough coordination among key actors (1)
• Poor existing policy (1)
• Scope of network too small (1)
• Slow reaction time (1)
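The number beside each SWOT item appears to be a count of how often informants raised that theme. As a rough illustration, a minimal sketch of tallying coded responses into a ranked list; the variable names and sample data are hypothetical.

```python
from collections import Counter

# Hypothetical coded responses from informants, one theme per response
strength_codes = [
    "Credibility", "Networking", "Credibility", "Branding and design",
    "Active, influential and diverse coalition", "Credibility",
]

for theme, count in Counter(strength_codes).most_common():
    print(f"{theme} ({count})")
```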

Page 45: Evaluation pal program monitoring and evaluation technology

REGULAR AND SPECIAL REPORTS

Regular reports (in every report):
• All core performance and impact measures
• Most significant change
• Demographics and psychographics

Special reports (once per year):
1. Gender and equity audit
2. Stakeholder satisfaction
3. Performance barriers and solutions
4. SWOT
5. Staff peer appraisals
6. PEST

45

Page 46: Evaluation pal program monitoring and evaluation technology

5. IMPROVE

46

Page 47: Evaluation pal program monitoring and evaluation technology

A FEEDBACK TOOL FOR AN ENTIRE ORGANIZATION

47

Program evaluators:
• Save time collecting data
• Focus on learning, rather than harassing staff to collect data
• Support developmental evaluation and lean start-up
• Obtain evidence over time, for the end of program evaluation

Management:
• Gain a top level overview of a program’s performance
• Obtain a tool to build a learning organization
• Identify potential threats to the organization or its programs

Staff:
• Marketing and communications: Better understanding of the key people who help their organization thrive
• Volunteer coordinator: Understand volunteer needs, barriers and satisfaction
• Fundraising: Gain insight into constituents and their donating habits over time

Page 48: Evaluation pal program monitoring and evaluation technology

© Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp.

PROJECT HISTORY

48

Page 49: Evaluation pal program monitoring and evaluation technology

PROJECT TIMELINE

49

• Analysis models (2009)
• BETA 1: Invented & launched (2011)
• 1st pilot study (2011-2012)
• BETA 2: Redesigned & expanded (2012)
• 2nd pilot study (2012)
• MaRS SIG (2012)
• Market testing with numerous NGOs & evaluators (2012-2013)
• YLC project (2013)

BER citations:
• Emm, A., Ozlem, E., Maja, K., Ilan, R., & Florian, S. (2011). Value for Money: Current Approaches and Evolving Debates. London, UK: London School of Economics.
• Cugelman, B., & Otero, E. (2010). Basic Efficiency Resource: A framework for measuring the relative performance of multi-unit programs. Leitmotiv and AlterSpark.
• Cugelman, B., & Otero, E. (2010). Evaluation of Oxfam GB's Climate Change Campaign. Leitmotiv, AlterSpark, Oxfam GB.
• Eurodiaconia (2012). Measuring Social Value. Brussels, Belgium.

Page 50: Evaluation pal program monitoring and evaluation technology

Want to learn more?

Brian Cugelman, PhD

(416) [email protected]

www.evaluationpal.com