Evaluation Pal: Program monitoring and evaluation technology

Post on 22-Apr-2015


Description

In this session, Dr. Cugelman will discuss his work developing Evaluation Pal, an automated program monitoring and evaluation technology. He launched Evaluation Pal in 2011 and pilot-tested it in 2012 for an evaluation of the Green Infrastructure Ontario Coalition, which was submitted to the Ontario Trillium Foundation. Soon after, MaRS' Social Innovation Generation accepted it into their incubator program. Dr. Cugelman will provide a tour of the tool and use the Green Infrastructure Ontario case study to demonstrate how automated data collection can support the program evaluation process. The presentation will also provide an opportunity to discuss the challenges and opportunities of using technology to aid program evaluation.

Transcript of Evaluation Pal: Program monitoring and evaluation technology

www.evaluationpal.com

Evaluation Pal Program monitoring and evaluation technology

Canadian Evaluation Society Conference (2013)

Brian Cugelman, PhD

BETA 2


AGENDA

1. Monitoring and evaluation, plus elephants

2. The vision of real-time feedback

3. Introducing Evaluation Pal

4. Project history


© Copyright 2013 | Brian Cugelman, PhD | AlterSpark Corp.

MONITORING AND EVALUATION, PLUS ELEPHANTS


M & E: THE ELEPHANT IN THE ROOM

We say it's about:

Decision making: making better decisions based on evidence

Performance improvement: learning what works and improving performance

Risk mitigation: identifying risks early, to avoid potential crises

But it's often about:

Accountability: satisfying donor requirements

M & E FOR MANY ORGANIZATIONS

• Requires expensive consultants

• The process takes up too much staff time

• Valuable information often comes too late

• Few people read big reports

• Evaluators sometimes scare people



THE VISION OF REAL-TIME FEEDBACK


DESIGNING & IMPLEMENTING


Design versus execution (2×2):

• Evidence-informed design + good execution: promising intervention, well executed
• Evidence-informed design + bad execution: promising intervention, poorly executed
• Not evidence-informed design + good execution: unlikely intervention, well executed
• Not evidence-informed design + bad execution: unlikely intervention, poorly executed

• Research is only part of the equation
• Execution is just as important

FEEDBACK AND PERFORMANCE


[Diagram: a feedback loop linking goals, feedback on performance, and improving performance]

WHAT SUCCESS NORMALLY LOOKS LIKE


• Both Internet marketing and public mobilization seem to follow power laws

• Growth can be logarithmic, with peaks and valleys between campaigns

[Chart: impact over time, years 1-10]

WITHOUT FEEDBACK, ORGANIZATIONS CAN’T...

• Know when they’re succeeding or failing
• Tell if they’re pursuing the right or wrong goals
• Judge which activities are most or least efficient
• Adapt to difficult situations
• Make improvements
• Tailor their work to stakeholder needs
• Tell if their tactics are backfiring

Feedback is essential to success, for people and organizations alike.

TREND TOWARDS ITERATIVE LEARNING AND IMPROVING


1. Deploy: implementing the latest iteration

2. Assess: measuring and learning

3. Revise: rethinking and adapting

Unknown cousins:
• Developmental evaluation
• Lean start-up


INTRODUCING EVALUATION PAL


SOLUTION - EVALUATIONPAL.COM


A tool that helps organizations monitor their progress and improve their performance.

Mission: To build a better world by helping social organizations become more effective.


A TOOL FOR LEARNING CULTURES


1. Describe yourself or your organization

2. Ask for the feedback that you need

3. Collect feedback from informants and add hard evidence

4. Learn what's working or not

5. Improve your performance

CASE STUDY OF OUR EVALUATION OF GIO


1. DESCRIBE


DESCRIBE YOUR ORGANIZATION – LOGIC MODEL


Logic model: Inputs → Activities → Outcomes (short-term, mid-term, long-term) → Ultimate goal

Inputs:
• Steering Committee members
• Coalition staff
• Expert peer review committee (volunteers)
• Consultants
• Workshop partners
• Volunteers
• Intern
• Funding from Trillium
• Funding from Steering Committee members
• In-kind donations

Activities:
• Conducting outreach & education
• Implementing 5 workshops
• Building the Coalition
• Filing an Environmental Bill of Rights application to change the definition of infrastructure
• Sharing best practices
• Producing the Green Infrastructure Ontario Report
• Carrying out the launch event
• Posting & distributing content through the website
• Producing & sending the e-update
• Operating the Coalition Steering Committee
• Meeting ministers & government staff

Outcomes:
• Increase awareness & support for green infrastructure among non-profit organizations
• Increase awareness & support for green infrastructure among government staff
• Increase coverage of green infrastructure issues in the media
• Increase awareness & support for green infrastructure among decision makers
• Increase political support & priorities for green infrastructure
• Increase support & priorities for green infrastructure among the public
• Increase green infrastructure funding mechanisms
• Increase green infrastructure policy & legislation

Ultimate goal:
• Increase the implementation of green infrastructure in Ontario
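A logic model like this is essentially structured data: inputs, activities, outcomes, and an ultimate goal. As a minimal sketch, it could be encoded like so (the class and field names are illustrative assumptions, not Evaluation Pal's actual schema):

```python
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    """An illustrative container for a program logic model."""
    inputs: list[str] = field(default_factory=list)
    activities: list[str] = field(default_factory=list)
    outcomes: list[str] = field(default_factory=list)
    ultimate_goal: str = ""


# A few elements from the GIO model, for illustration
gio = LogicModel(
    inputs=["Coalition staff", "Volunteers", "Funding from Trillium"],
    activities=["Implementing 5 workshops", "Producing & sending the e-update"],
    outcomes=["Increase green infrastructure policy & legislation"],
    ultimate_goal="Increase the implementation of green infrastructure in Ontario",
)
print(gio.ultimate_goal)
```

Representing the model as data is what makes a "living logic model" possible later: each element can carry an assessment status that updates as feedback arrives.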

DESCRIBE YOUR ORGANIZATION – THREE LOGIC MODELS


There are also logic models for people, focused on personal and professional development.

1. Non-profit organization

2. Social enterprise

3. For-profit organization


2. ASK


Over 40 base, extrapolated, and customer insight metrics and measures, covering:

• Investments
• Implementation quality and efficiency
• Impact
• Likelihood of reaching goals
• Constituent volunteering and donating
• Brand health & reputation
• Stakeholder satisfaction
• Stakeholder demographics & psychographics
• Market, strategy, foresight
• Personal development

METRIC CATEGORIES


Demographics and psychographics

Base metrics:
• Investments
• Implementation quality
• Progress towards goals
• Stakeholder & customer engagement
• Reputation and brand health
• Advice for success
• Market attractiveness
• Equitable office

Extrapolated metrics:
• Value for money
• Effective prioritizing
• Engagement effectiveness (power analysis)
• Contribution of activities to goals
• SWOT
• PEST
• Source credibility
• Program implementation fidelity
• Most significant change
• Product and service attractiveness

3. COLLECT


ADD INFORMANTS

Informant types span internal, partner, and external groups: staff / peers, managers, board members, highly involved volunteers, volunteers, donors / funders, partner organizations, consultants and experts, customers, constituents, beneficiaries, and peer organizations.


TRADITIONAL SAMPLING VERSUS PANEL SURVEYS

Traditional end-of-program assessments (all in one go):

• Too much engagement
• Too much information
• Too late to act on insight

Evaluation Pal panels (randomly divided across a year):

• Only engage a small sample at a time
• Random sampling offers confident findings
• Insight available throughout the year
• Randomization within key informant groups
• Near real-time feedback
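The panel approach amounts to randomly spreading informants across the survey months, with the randomization kept within each key informant group. A minimal sketch of that assignment logic (the function and group names are hypothetical, not Evaluation Pal's implementation):

```python
import random
from collections import defaultdict


def build_panel_schedule(informants_by_group, months=12, seed=42):
    """Randomly spread each informant group across the survey months,
    so only a small sample is engaged at a time and each month
    contains a mix drawn from every key informant group."""
    rng = random.Random(seed)
    schedule = defaultdict(list)  # month index (0-11) -> [(group, informant)]
    for group, members in informants_by_group.items():
        members = members[:]          # copy so the caller's list is untouched
        rng.shuffle(members)          # randomize within the group
        for i, person in enumerate(members):
            schedule[i % months].append((group, person))  # round-robin spread
    return schedule


# Hypothetical informant lists, for illustration
groups = {
    "staff": [f"staff_{i}" for i in range(6)],
    "volunteers": [f"vol_{i}" for i in range(24)],
}
schedule = build_panel_schedule(groups)
for month in sorted(schedule):
    print(month, len(schedule[month]))
```

Shuffling before the round-robin split keeps month-to-month samples random, while the per-group loop guarantees every group is represented across the year.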


Sequence of respectful timed reminder messages, sent 28 days, 14 days, 7 days, and 1 day before the deadline.
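The reminder sequence is simple date arithmetic from the survey deadline. A sketch, where the function name and example date are illustrative:

```python
from datetime import date, timedelta

# Days before the survey deadline at which reminders go out
REMINDER_OFFSETS = (28, 14, 7, 1)


def reminder_dates(deadline, offsets=REMINDER_OFFSETS):
    """Return the send dates for the timed reminder sequence."""
    return [deadline - timedelta(days=d) for d in offsets]


# e.g. a survey closing on 2013-06-30 (hypothetical date)
for send_on in reminder_dates(date(2013, 6, 30)):
    print(send_on.isoformat())
```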


4. LEARN



Living logic model


Each element of the logic model is assessed as: On track | Mixed | At risk | Not assessed


Contribution of activities towards goals


Effective focus

[Quadrant chart: keep up the good work | concentrate here | low priority | possible overkill]
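Quadrant charts like this place each activity by informant-rated importance and performance. A sketch of that classification, assuming a 0-1 rating scale and a midpoint threshold (not Evaluation Pal's actual scoring):

```python
def focus_quadrant(importance, performance, threshold=0.5):
    """Classify an activity into one of the four focus quadrants,
    given importance and performance ratings scaled 0-1."""
    if importance >= threshold and performance >= threshold:
        return "Keep up the good work"   # important and done well
    if importance >= threshold:
        return "Concentrate here"        # important but underperforming
    if performance >= threshold:
        return "Possible overkill"       # done well but not important
    return "Low priority"                # neither important nor performing


print(focus_quadrant(0.9, 0.8))  # high importance, high performance
print(focus_quadrant(0.9, 0.2))  # high importance, low performance
```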


Efficiency

[Chart: activities plotted from least efficient to most efficient]


Contribution of activities to goals



Crowd-sourced SWOT

Strengths (mentions):
• Active, influential and diverse coalition (15)
• Commitment, motivation and vision (10)
• Communications, outreach and online activities (10)
• Green infrastructure is an important topic (6)
• Credibility (5)
• Networking (5)
• Branding and design (3)
• Evidence based (3)
• Expertise and experience (3)
• Timing (3)
• Ethics and values (2)
• Focus on realistic goals (2)
• Inclusive process (2)
• Sharing best practices (2)
• Workshops and their output (2)

Weaknesses (mentions):
• Public engagement and awareness (12)
• Political engagement and support (11)
• Setting coalition goals and focusing on green infrastructure topics (7)
• Funding (6)
• Making a persuasive case for green infrastructure (5)
• Media interest (3)
• Member commitment, engagement and collaboration (3)
• Steering Committee coherence, contributions and leadership (3)
• Not enough engagement with stakeholders (3)
• Achieving concrete outcomes (2)
• Capacity (2)
• Reach outside current network (2)
• Too much on green roofs (2)


Opportunities (mentions):
• Expand the Coalition and network (13)
• Highlight economic opportunities and savings (9)
• Raise public awareness and support (5)
• Make links to climate change and green energy (4)
• Align with government and municipal priorities (4)
• Improve government relations and shape policy (3)
• Highlight benefits (3)
• Raise awareness through education and events (2)
• Better use the Coalition (2)
• Access funding (1)
• Build local capacity (1)
• Coalition’s capacity (1)
• Election commitments (1)
• Audit green infrastructure and report progress (1)
• Design school curriculum (1)

Threats (mentions):
• Budget limits or perceptions that green infrastructure is not economical (17)
• Public awareness, apathy and competing issues (13)
• GI is not understood or valued, or is seen as a fringe idea (10)
• Persuading implementers that green infrastructure is comparable to grey infrastructure (can’t make a strong case) (7)
• Lack of political relations, awareness and support (5)
• Coalition governance and vision (4)
• Lack of a clear message (1)
• Lack of Canadian case studies (1)
• Lack of media interest (1)
• Not enough coordination among key actors (1)
• Poor existing policy (1)
• Scope of network too small (1)
• Slow reaction time (1)

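Crowd-sourced SWOT lists like these are tallies of coded informant responses. A minimal sketch of the counting step (the coding itself would be done by an analyst; the theme labels here are illustrative):

```python
from collections import Counter


def tally_swot(coded_responses):
    """Count how many informants mentioned each coded theme,
    sorted most-mentioned first, as in the SWOT lists above."""
    return Counter(coded_responses).most_common()


# Hypothetical coded strengths, for illustration
strengths = ["credibility", "networking", "credibility", "timing", "credibility"]
print(tally_swot(strengths))  # [('credibility', 3), ('networking', 1), ('timing', 1)]
```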

REGULAR AND SPECIAL REPORTS

Regular reports (in every report):
• All core performance and impact measures
• Most significant change
• Demographics and psychographics

Special reports (once per year):
1. Gender and equity audit
2. Stakeholder satisfaction
3. Performance barriers and solutions
4. SWOT
5. Staff peer appraisals
6. PEST


5. IMPROVE


A FEEDBACK TOOL FOR AN ENTIRE ORGANIZATION


Program evaluators:
• Save time collecting data
• Focus on learning, rather than harassing staff to collect data
• Support developmental evaluation and lean start-up
• Obtain evidence over time, for the end-of-program evaluation

Management:
• Gain a top-level overview of a program’s performance
• Obtain a tool to build a learning organization
• Identify potential threats to the organization or its programs

Staff:
• Marketing and communications: better understanding of the key people who help their organization thrive
• Volunteer coordinator: understand volunteer needs, barriers and satisfaction
• Fundraising: gain insight into constituents and their donating habits over time


PROJECT HISTORY


PROJECT TIMELINE


• Analysis models (2009)
• BETA 1: invented & launched (2011)
• 1st pilot study (2011-2012)
• BETA 2: redesigned & expanded (2012)
• 2nd pilot study (2012)
• MaRS SIG (2012)
• Market testing with numerous NGOs & evaluators (2012-2013)
• YLC project (2013)

BER citations:
• Emm, A., Ozlem, E., Maja, K., Ilan, R., & Florian, S. (2011). Value for Money: Current Approaches and Evolving Debates. London, UK: London School of Economics.
• Cugelman, B., & Otero, E. (2010). Basic Efficiency Resource: A framework for measuring the relative performance of multi-unit programs. Leitmotiv and AlterSpark.
• Cugelman, B., & Otero, E. (2010). Evaluation of Oxfam GB's Climate Change Campaign. Leitmotiv, AlterSpark, Oxfam GB.
• Eurodiaconia (2012). Measuring Social Value. Brussels, Belgium.

Want to learn more?

Brian Cugelman, PhD
(416) 921-2055
brian@alterspark.com
www.evaluationpal.com