Transcript of Understanding Program Evaluation Module 1

Page 1: Understanding Program Evaluation Module 1

Understanding Program Evaluation

Module 1

Essential Skills Series: An Introduction to

Evaluation Concepts and Practice

Canadian Evaluation Society

Date:_________________

Location:______________

Page 2: Understanding Program Evaluation Module 1

Workshop Agenda

Registration 8:30 - 9:00 a.m.
Introduction to Program Evaluation 9:00 - 10:15 a.m.
Break 10:15 - 10:30 a.m.
Program Planning & Evaluation 10:30 - 12:00 p.m.
Lunch 12:00 - 1:00 p.m.
Types of Evaluations 1:00 - 2:15 p.m.
Break 2:15 - 2:30 p.m.
Major Evaluation Roles and Approaches 2:30 - 3:45 p.m.
Evaluation Standards & Ethical Guidelines 3:45 - 4:00 p.m.
Discussion of Upcoming Modules 4:00 - 4:20 p.m.
Workshop Evaluation 4:20 - 4:30 p.m.

Page 3: Understanding Program Evaluation Module 1

Workshop Objectives

• Introduction to terms and concepts used by evaluators
• Overview of historical and current trends
• Identify major benefits and uses of evaluation
• Examine the relationship between planning, management and evaluation
• Review major models of evaluation
• Review evaluation standards, ethics and ‘fairness’
• Summarize the Canadian evaluation ‘basics’

Page 4: Understanding Program Evaluation Module 1

What is Evaluation?

• What do you think of when you hear the term evaluation?

Page 5: Understanding Program Evaluation Module 1

Section 1.

Introduction to Program Evaluation

Page 6: Understanding Program Evaluation Module 1

Working Definition of Program Evaluation

Program evaluation is the systematic collection and analysis of information about program activities, characteristics, and outcomes to make judgements about the program, improve program effectiveness and/or inform decisions about future programming.

Source: Patton, M.Q. (1997). Utilization-focused Evaluation. Sage Publications.

Page 7: Understanding Program Evaluation Module 1

Working Definition of Evaluation Research

Evaluation research is the:

• systematic application of social science research procedures in assessing social intervention programs.

Program evaluation is viewed as a form of applied social research.

Source: Rossi, P., Lipsey, M., & Freeman, H. (2004). Evaluation: A Systematic Approach (7th edition), Sage Publications

Page 8: Understanding Program Evaluation Module 1

Canadian Context

Treasury Board Secretariat Definition:

The application of systematic methods to periodically and objectively assess effectiveness of programs in achieving expected results, their impacts, both intended and unintended, continued relevance and alternative or more cost-effective ways of achieving expected results. 

Source: Results-Based Management Lexicon, http://www.tbs-sct.gc.ca/rma/lex-lex_e.asp accessed November 2008

Page 9: Understanding Program Evaluation Module 1

Historical Context – Evaluation in Canada

1960s – Emergence of evaluation – an amalgam of social sciences

1970s – Evaluation becomes more routinized
– 1977 Treasury Board Secretariat Policy – 3-5 year cycle

1980s – Distinct, mandated function – some common standards
– 1981 Office of the Comptroller General Guide – standardized issues
– Evaluation units in most federal agencies (and many provincial agencies)

1990s – Evaluation seen as serving organizational operations, strategic and tactical – evaluation a part of ‘review’
– 1994 Evaluation Policy

2000s – Evaluation as a tool for accountability and management

Sources: Müller-Clemm and Barnes (1997) A Historical Perspective on Federal Program Evaluation in Canada, Canadian Journal of Program Evaluation, 12 (1), 47-70
Segsworth (2005) Program Evaluation in Canada: Plus Ça Change… Canadian Journal of Program Evaluation, 20 (3), 195-197

Page 10: Understanding Program Evaluation Module 1

The ‘Original’ Canadian Evaluation Issues

1. Program Rationale (does the program make sense?)
a) To what extent are the objectives and mandate of the program still relevant?
b) Are the activities and outputs of the program consistent with its mandate and plausibly linked to the attainment of the objectives and the intended impacts and effects?

2. Impacts and Effects (what has happened as a result of the program?)
a) What impacts and effects, both intended and unintended, resulted from carrying out the program?
b) In what manner and to what extent does the program complement, duplicate, overlap or work at cross purposes with other programs?

3. Objectives Achievement (has the program achieved what was expected?)
a) In what manner and to what extent were appropriate program objectives achieved as a result of the program?

4. Alternatives (are there better ways of achieving the results?)
a) Are there more cost-effective alternative programs which might achieve the objectives and intended impacts and effects?
b) Are there more cost-effective ways of delivering the existing program? (OCG, 1981a, p. 7)

Source: Guide on the Program Evaluation Function, Office of the Comptroller General, 1981. http://www.tbs-sct.gc.ca/eval/pubs/pubs-to-1995/orig-gd_e.asp Accessed November 2008.

Page 11: Understanding Program Evaluation Module 1

Evaluation vs. Performance Measurement: Contrasting Paradigms

Evaluation
• Behavioural Sciences
• Logic Model
• Academic / international development / social development
• Periodic
• Strategic
• Heretical

Performance Measurement
• Accounting, Process Engineering, Marketing
• Ledger / Scorecard
• Business
• Ongoing
• Operational
• Conformist

Source: Montague, S. (2005) Performance Planning, Measurement and Reporting For Continuous Improvement, CES-AEA Workshop

Page 12: Understanding Program Evaluation Module 1

Audit and Evaluation in Public Management

DEFINITION
Audit: checking, comparing, compliance, assurance
Evaluation: assessment of merit, worth, value of administration, output and outcome of interventions

TYPES
Audit: traditional – financial and compliance; performance audit – substantive – systems and procedures
Evaluation: wide variability – many ‘types’ noted in the literature

WHO DOES IT?
Audit: internal auditors – part of organization; external auditors – independent agency
Evaluation: internal evaluators – part of organization; ‘external’ contracted consultants – not really independent?

ROLES
Audit: provide assurance; public accountability; improve management
Evaluation: not as well articulated; increase knowledge; improve delivery and management; (re)consider the rationale; varies by a long list of potential clients

METHODS
Audit: file review, interviews, focus groups, surveys, observations
Evaluation: wide variety of methods, from scientific and quasi-scientific designs to purely qualitative and interpretative methods and methods linked to testing program theory

REPORTING
Audit: attest to legislatures; direct to management
Evaluation: management; various stakeholders

STRENGTHS
Audit: strong reputation; supported by professional associations; well-established and followed standards; addresses issues of public concern (e.g. waste, mismanagement, etc.)
Evaluation: addresses attribution; explains why; acknowledges complexity and uncertainty; flexible in design and practice

CHALLENGES
Audit: dealing with complexity; operating in a collaborating state
Evaluation: credibility; perceived relevance

Source: Mayne, John (2006) Audit and Evaluation in Public Management, The Canadian Journal of Program Evaluation Vol. 21, No. 1

Page 13: Understanding Program Evaluation Module 1

Benefits of Program Evaluation

Evaluation provides information about:

• Relevance to need
• Program operations
• Program strengths and weaknesses
• Attainment of goals and performance
• Program issues
• Attributable impact
• Efficiency and cost-effectiveness

Source: Love, A. (2007)

Page 14: Understanding Program Evaluation Module 1

Uses of Program Evaluation

• Symbolic use (evaluation as part of a ‘token’ effort)

• Instrumental use (evaluation for direct design and delivery improvement)

• Conceptual use (evaluation to change the way people think or see a program)

Source: Weiss, C., Murphy-Graham, E. and Birkeland, S. (2005) An Alternate Route to Policy Influence, American Journal of Evaluation, Vol. 26, No. 1

Page 15: Understanding Program Evaluation Module 1

Two ‘Fundamental’ Uses

• Development (learning)
• Accountability

Source: Chelimsky, E., & Shadish, W.R. (eds.) (1997). Evaluation for the 21st Century: A Handbook. Thousand Oaks, CA: Sage

Page 16: Understanding Program Evaluation Module 1

Some Potential Misuses of Program Evaluation

Evaluation information can be misused to:

• Postpone action
• Whitewash a program
• Eliminate a program
• Justify a weak program

Page 17: Understanding Program Evaluation Module 1

The Results of Effective Evaluation

• Assessing the relevance of the program to support continued operation

• Providing objective assessment of the extent to which program results are being achieved

• Supporting submissions and Initiative Proposals (e.g. Memoranda to Cabinet)

• Identifying areas of program improvement and / or alternative delivery means

• Providing overall assessment of the cost-effectiveness of the program

Source: Treasury Board Secretariat of Canada Case Studies in Effective Evaluation. http://www.tbs-sct.gc.ca/eval/tools_outils/impact/impact_e.asp#5.0 Accessed November 2008.

Page 18: Understanding Program Evaluation Module 1

Conditions Where Evaluation is Considered Useful

• High quality / credibility / integrity
• Provides clear support for decision making and action
• Responsive to user information needs
• Adequate organizational infrastructure and resources are available to support the function
• Owned and embraced by users

Source: Cousins, Goh, Aubry, Lahey, Montague and Elliott (2006) What Makes Evaluation Useful in Government? A Concept Mapping Study American Evaluation Association Presentation, November 2006

Page 19: Understanding Program Evaluation Module 1

Program Evaluation: Truth Test and Utility Test

Truth Test

• Is it trustworthy?

• Can I rely on it?

• Will it hold up under scrutiny or attack?

Utility Test

• Does it provide direction?

• Does it yield guidance?

• Does it have an action orientation?

• Does it challenge the status quo?

Source: Love, A. (2007)

Page 20: Understanding Program Evaluation Module 1

Section 2.

Program Planning and Evaluation

Page 21: Understanding Program Evaluation Module 1

What is a Program?

• A program is:

– an organized set of activities whose objective is the production of changes in the recipients and / or their environment.

• What about policies, initiatives, projects etc.?

Page 22: Understanding Program Evaluation Module 1

Program → Desired Change

1. Meals on Wheels for Senior Citizens
• Increased social interaction
• Nutritious & varied diet

2. Emergency Shelter Beds in Winter
• Relief from exposure to cold nights
• Homeless person uses shelter

3. Job Retraining
• Increase in employment levels, salary levels, job satisfaction

Page 23: Understanding Program Evaluation Module 1

Unintended Outcomes

• Often not certain what changes are expected
• Programs can produce unanticipated changes

Group Exercise #1: Specify 1 desired and 1 unintended change for a program that promotes responsible gambling.

Specify the Desired Change
________________________________________________________________________

Specify the Unintended Change
________________________________________________________________________

Page 24: Understanding Program Evaluation Module 1

Program Logic Elements

• Inputs: The financial and non-financial resources used to produce outputs and accomplish outcomes.

• Activities: An operation or work process internal to an organisation, intended to produce specific outputs (e.g. products or services). Activities are the primary link in the chain through which outcomes are achieved.

• Outputs: Direct products or services stemming from the activities of a policy, program, or initiative, and delivered to a target group or population. Usually things you can count.

• Outcomes: An external consequence attributed to an organisation, policy, program or initiative that is considered significant in relation to its commitments. Outcomes may be described as: immediate, intermediate or final (end), direct or indirect, intended or unintended. A good outcome statement represents the type of change wanted, includes reference to the target population or intended beneficiary and does not include reference to the how.

Source: TBS Results-based Management Lexicon http://www.tbs-sct.gc.ca/rma/lex-lex_e.asp accessed Dec 15, 2008
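
To make the chain from inputs to outcomes concrete, here is a minimal illustrative sketch in Python (not part of the original workshop materials). The `LogicModel` class is hypothetical; the outcome wording is borrowed from the Meals on Wheels example on Page 22, while the sample inputs and activities are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A minimal program logic chain: inputs -> activities -> outputs -> outcomes."""
    program: str
    inputs: list = field(default_factory=list)      # financial and non-financial resources
    activities: list = field(default_factory=list)  # work processes internal to the organisation
    outputs: list = field(default_factory=list)     # direct, countable products or services
    outcomes: list = field(default_factory=list)    # external changes attributed to the program

    def describe(self) -> str:
        """Render the chain in the order an evaluator would trace it."""
        parts = [
            ("Inputs", self.inputs),
            ("Activities", self.activities),
            ("Outputs", self.outputs),
            ("Outcomes", self.outcomes),
        ]
        return "\n".join(f"{label}: {'; '.join(items)}" for label, items in parts)

# Hypothetical example based on the Meals on Wheels program from Page 22.
meals = LogicModel(
    program="Meals on Wheels for Senior Citizens",
    inputs=["funding", "volunteer drivers", "kitchen facilities"],
    activities=["prepare meals", "deliver meals to seniors' homes"],
    outputs=["number of meals delivered"],
    outcomes=["increased social interaction", "nutritious & varied diet"],
)
print(meals.describe())

Note how this mirrors the definition above: a good outcome statement names the change and the target population but not the ‘how’; in the sketch, the ‘how’ lives entirely in the activities and outputs.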

Page 25: Understanding Program Evaluation Module 1

A ‘Program’ as Seen From an Evaluation Perspective

[Figure: a results chain linking Needs → Objectives → Inputs → Activities → Outputs → Outcomes, with the ‘Program’ spanning the middle of the chain, set within the wider Environment and noting the social value of inputs.]

Source: McDavid, J. & Hawthorn, L. (2006) Program Evaluation and Performance Measurement, Sage Publications
Adapted from Nagarajan, N. & Vanheukelen, M. (1997) Evaluating EU expenditure programmes: A guide (p. 25)

Page 26: Understanding Program Evaluation Module 1

Section 3.

Types of Evaluations

Page 27: Understanding Program Evaluation Module 1

Two ‘Fundamental’ Types

• Formative – improvement and development oriented

• Summative – accountability oriented (Scriven, 1967)

Page 28: Understanding Program Evaluation Module 1

Formative vs. Summative Evaluation

Formative evaluation is used for the improvement and development of an ongoing program. Based on the outcome(s) of the formative evaluation, the program can be modified to address problems or difficulties.

Summative evaluation usually serves an accountability function. At the end of the program, a summative evaluation is completed to describe the overall successes of the program and to determine whether the program should be continued.

Page 29: Understanding Program Evaluation Module 1

Types of Evaluations

Needs Assessment
• Utilization
– Program Planning
• Focus
– Program Need
– Gap between Actual and Desired State

Page 30: Understanding Program Evaluation Module 1

Types of Evaluations

Evaluability Assessment
• Utilization
– Program Design
• Focus
– Program Rationale
– Program Interventions and Strategies

Page 31: Understanding Program Evaluation Module 1

Types of Evaluations

Process Evaluation
• Utilization
– Program Operations and Implementation
• Focus
– Program Monitoring
– Efficiency

Page 32: Understanding Program Evaluation Module 1

Types of Evaluations

Outcome Evaluation
• Utilization
– Program Results and Impacts
• Focus
– Program Results
– Effectiveness

Page 33: Understanding Program Evaluation Module 1

Evaluation and the Management Life-cycle

[Figure: a management life-cycle loop – from an Initial Situation, Identify a need → Design an intervention → Program/Policy start-up → Reconsider, redesign, expand, reduce or end – with Ongoing Performance/Outcome Monitoring/Measurement tracking Immediate, Intermediate and Final Outcomes along the way.]

Source: Adapted from Birch-Jones, J., Integrating PM and Evaluation: Bridging the Chasm, CES-NCR, 2002.

Page 34: Understanding Program Evaluation Module 1

Key Evaluation Questions: Needs Assessment

• Who needs the program?
• What kinds of services do they need?
• Are the needed services available?
• Are the needed services accessible?
• Are the people in need aware that the services exist?
• Are there enough program resources to address the need?

Page 35: Understanding Program Evaluation Module 1

Key Evaluation Questions: Evaluability Assessment

• What are the program components?
• What are the goals of the program?
• What are the indicators or criteria of goal achievement?
• Are the program’s goals and objectives compatible with the organization’s vision and mission?

Page 36: Understanding Program Evaluation Module 1

Key Evaluation Questions: Process Evaluation

• To what extent is the program being implemented as designed?
• Who uses the program? What activities are participants involved in?
• How are time, money and personnel allocated?
• Are program participants satisfied with the program?
• Are they receiving quality services?
• How does the program vary from one site to another?

Page 37: Understanding Program Evaluation Module 1

Key Evaluation Questions: Outcome Evaluation

• To what extent is the program meeting its goals?
• How does the program compare against accepted standards?
• Is the program effective?
• How does the program compare with competitive programs?
• Should the program be continued?
• Should the program be expanded?

Page 38: Understanding Program Evaluation Module 1

Who Should Conduct the Evaluation?

• Internal evaluation uses a staff member to evaluate an organization’s programs
• External evaluation uses someone who is not directly supervised by an organization to evaluate its programs

Page 39: Understanding Program Evaluation Module 1

Comparative Advantages of Internal and External Evaluation

Internal
• Issues matched to managers’ needs
• Costs lower
• Results readily accepted by staff
• Data better understood by staff

External
• Outside expertise and experience
• Credibility to funders and stakeholders
• Staff not diverted from normal tasks
• Brings fresh perspective

Page 40: Understanding Program Evaluation Module 1

Comparative Disadvantages of Internal and External Evaluation

Internal
• Evaluation skills may not be available
• Staff busy and evaluations not completed on time
• Staff rarely independent
• Difficult to separate program from personnel evaluation

External
• Need time to become familiar with program
• Illusion of independence
• Financial costs can be high
• Consultants require careful management

Page 41: Understanding Program Evaluation Module 1

Understanding Program Evaluation

Small Group Exercise #2

Rockwood Community Services

Founded in the 1970s, Rockwood Community Services is a multiservice agency that helps over 10,000 diverse individuals of all ages annually to achieve greater independence and recover from illness by providing community-based health and mental health services.

After reviewing the annual program statistics, Rockwood’s Board Planning Committee has identified a disturbing new trend – each month more and more clients are seeking help for serious gambling problems. It seems as if Rockwood has been caught in a perfect storm caused by recent changes to gambling legislation, increased access to gambling venues, and the recent surge in online gambling. Gambling is now being strongly promoted not only as an enjoyable form of entertainment, but as part of the modern lifestyle and perhaps even as a civic duty, because of huge revenues generated by government-sponsored gambling. These revenues fund a wide variety of important social and educational programs that would not exist otherwise. In the space of a few years, it is said that gambling has become the fastest growing industry in Canada and the United States.

The Board wants to know whether Rockwood has an effective response to deal with the apparent problem-gambling epidemic. The Board is placing Rockwood’s Executive Director under tremendous pressure to evaluate the effectiveness of the agency’s programs to prevent and treat problem gambling. Because Rockwood prides itself on delivering high quality, evidence-based services, the Board wants a rigorous summative evaluation of these programs.

Rockwood’s clinical directors and staff have not responded well to these demands. They point out that their programs for problem gambling are in the developmental stages and that it is too early to evaluate their effectiveness. Because the surge of clients with gambling problems has caught Rockwood and other service providers by surprise, they are not sure about the nature and extent of the problem and the characteristics of the clients affected. They are in the process of assessing their needs and designing programs to prevent and treat problem gambling. They argue that now is not the time for an evaluation, and that summative evaluation of their current programs would be a waste of time and money.

Page 42: Understanding Program Evaluation Module 1

Small Group Exercise #2 Worksheet

Rockwood Community Services

1. What are the reasons for conducting an evaluation of this program now?

2. What are the reasons against conducting an evaluation of this program now?

3. In your opinion, where are Rockwood’s programs for problem gambling on the program / management development life cycle?

4. What type of evaluation would be appropriate for programs at this stage of the program development cycle?

Page 43: Understanding Program Evaluation Module 1

Section 4.

Major Evaluation Roles and Approaches

Page 44: Understanding Program Evaluation Module 1

Role of the Evaluator

• Researcher
• Management consultant
• Facilitator

Page 45: Understanding Program Evaluation Module 1

Evaluation Approaches

• Degree of evaluator:
– Independence
– Control over the design
– Lead of the process

• Some important types:
– Goal based
– Participatory
– Empowerment
– Developmental

Page 46: Understanding Program Evaluation Module 1

Traditional vs. Developmental Evaluation

Traditional evaluations…
• Render definitive judgments of success or failure
• Measure success against predetermined goals
• Position the evaluator outside to assure independence and objectivity
• Design the evaluation based on linear cause-effect logic models
• Aim to produce generalizable findings across time & space
• Accountability focused on and directed to external authorities and funders
• Accountability to control and locate blame for failures
• Evaluator controls the evaluation and determines the design based on the evaluator’s perspective about what is important
• Evaluation engenders fear of failure

Complexity-based, developmental evaluations…
• Provide feedback, generate learnings, support direction or affirm changes in direction
• Develop new measures and monitoring mechanisms as goals emerge & evolve
• Position evaluation as an internal, team function integrated into action and ongoing interpretive processes
• Design the evaluation to capture system dynamics, interdependencies, and emergent interconnections
• Aim to produce context-specific understandings that inform ongoing innovation
• Accountability centered on the innovators’ deep sense of fundamental values and commitments
• Learning to respond to lack of control and stay in touch with what’s unfolding and thereby respond strategically
• Evaluator collaborates in the change effort to design a process that matches philosophically and organizationally
• Evaluation supports hunger for learning

Source: Patton, Michael Q. “Evaluation for the Way We Work.” The Nonprofit Quarterly, Spring 2006, pp. 28-33

Page 47: Understanding Program Evaluation Module 1

Evaluator Role for Each Evaluation Model

Adapted from: Love (1998)

Goal-Based Model
• Evaluator directs the evaluation process.

Participatory Model
• Evaluator guides the evaluation process.
• Evaluator is facilitator and resource.

Empowerment Model
• Team has total authority and resources to evaluate and improve performance.
• Evaluator is empowerment facilitator.

Developmental Model
• Evaluator supports teamwork.
• Ownership is shared by all.
• Evaluator is advisor to program team.

Page 48: Understanding Program Evaluation Module 1

Understanding Program Evaluation
Small Group Exercise #3

Selecting an Evaluation Approach

The Board Planning Committee of Rockwood Community Services has decided to strike a Problem-Gambling Task Force to examine the extent of the gambling problem and develop a range of feasible options in response.

The Problem-Gambling Task Force invited Rockwood’s senior managers and clinical leads to discuss the problem gambling situation. It was obvious from this meeting that Rockwood lacked fundamental knowledge about problem gambling and that both senior managers and clinical staff would benefit greatly by receiving training from experts in the field.

Rockwood’s Problem-Gambling Task Force contracted with an organization whose experienced trainers were also therapists specializing in problem gambling. The trainers provide evidence-based training and support materials designed to develop core competencies related to gambling and problem gambling.

This organization developed a six-session training program delivered one half-day per week for six weeks. The sessions were scheduled to cause minimal disruption to service delivery and to develop competencies that could be immediately used at Rockwood. Training topics included an overview of gambling terms and concepts, signs of problem gambling, assessment and screening tools, different evidence-based prevention and treatment models, strategies for supporting families, working with specific populations (youth, seniors, women, families, specific ethno-cultural groups), and brief referral and support services.

There was a great deal of debate among the Problem-Gambling Task Force members and Rockwood’s evaluation staff about the appropriate approach to the evaluation. Some members felt that a goal-based model was right because the purpose of the evaluation was to assess whether the training program developed the needed competencies or not. The key evaluation task was to assess competencies before and after training.

Others disagreed. They felt that it was important for clinical staff and other stakeholders to participate in the evaluation process or for evaluation to be integrated into project development to maximize learning from the evaluation process. Still others felt that the end result of the evaluation should be to empower the clinical staff with the knowledge and skills they needed to develop programs unique to Rockwood and the specific needs of its clients.

Page 49: Understanding Program Evaluation Module 1

Small Group Exercise #3 Worksheet

Selecting an Evaluation Approach

Review the proposed Rockwood case.

1. In your view, what are the primary purposes of this evaluation?

2. Select one evaluation approach (either goal-based, participatory, developmental or empowerment) and discuss how selecting that approach would affect the design of the evaluation (e.g., focus of the evaluation, types of questions asked, methods used to collect data, ownership of the evaluation).

3. Now select another evaluation approach (either goal-based, participatory, developmental or empowerment) and discuss how selecting this second approach would affect the design of the evaluation (e.g., focus of the evaluation, types of questions asked, methods used to collect data, ownership of the evaluation).

Page 50: Understanding Program Evaluation Module 1

Small Group Exercise #4

• Which evaluation model would you personally use within your organization?

• List your reasons for using this model.

• How well does this model fit your organizational structure and culture?

Page 51: Understanding Program Evaluation Module 1

Section 5.

Evaluation Standards and Ethical Guidelines

Page 52: Understanding Program Evaluation Module 1

The Canadian Program Evaluation Standards

Utility Standards
• Ensure evaluation will serve the practical information needs of users – informative, timely, and influential

Feasibility Standards
• Ensure evaluation will be realistic, prudent, diplomatic, and economical

Propriety Standards
• Ensure evaluation will be conducted legally and ethically

Accuracy Standards
• Ensure evaluation will be technically adequate

Source: The Joint Committee on Standards for Educational Evaluation, 1994

Page 53: Understanding Program Evaluation Module 1

American Evaluation Association Guiding Principles for Evaluators

Systematic Inquiry
• Evaluators conduct systematic, data-based inquiries

Competence
• Evaluators provide competent performance to stakeholders

Integrity / Honesty
• Evaluators display honesty and integrity in their own behavior and attempt to ensure the honesty and integrity of the entire evaluation process

Respect for People
• Evaluators respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders

Responsibilities for General and Public Welfare
• Evaluators articulate and take into account the diversity of general and public interests and values

Source: American Journal of Evaluation, March 2008

Page 54: Understanding Program Evaluation Module 1

Canadian Evaluation Society Ethical Guidelines

Competence
• Evaluators are to be competent

Integrity
• Evaluators are to act with integrity

Accountability
• Evaluators are to be accountable

Page 55: Understanding Program Evaluation Module 1

Small Group Exercise #5: Evaluation Standards and Ethical Guidelines

1. A program manager used a Web-based survey package to measure client satisfaction. There was only a 5% response rate and little information about who responded and who did not. The evaluator recommended that decisions should not be based on the findings from this survey. Which standard or ethical guideline was the evaluator following?

2. An evaluator who conducted an evaluation using focus groups with program clients described the limitations of this methodology clearly in presentations and reports of the evaluation findings. Which standard or ethical guideline was the evaluator following?

3. Before designing an evaluation, the program evaluator met with key stakeholders to assess their evaluation information needs. The evaluator also was careful to clarify the purpose for the evaluation, how the evaluation findings might be used and factors in the program context that might affect the evaluation. Which standard or ethical guideline was the evaluator following?

4. An external evaluator presented a draft of the evaluation findings to the program manager and staff for review. The program manager met with the evaluator and demanded that the program’s strengths be emphasized and the weaknesses “buried” in the report – otherwise funding would be jeopardized. What standard or ethical guideline should the evaluator follow in this situation?

5. A small nonprofit program with an annual budget under $10,000 must evaluate outcomes to meet funding requirements. The evaluator recommended a simple evaluation design using a combination of internal and external resources to keep costs low and burden to staff and clients at a minimum. Which standard or ethical guideline was the evaluator following?

6. Before conducting an evaluation, the evaluator provided each client with a description of the evaluation, its benefits and risks, and asked for their written permission to participate. Which standard or ethical guideline was the evaluator following?

7. An evaluator who had little experience in evaluating problem gambling programs notified the Evaluation Steering Committee of this limitation. The Evaluation Steering Committee then hired an expert in problem gambling to work with the evaluator on the study. Which standard or ethical guideline were the evaluator and Evaluation Steering Committee following?

Match each situation above with the correct standard or ethical guideline below:

_____ Utility

_____ Feasibility

_____ Propriety

_____ Technical Adequacy

_____ Competence

_____ Integrity

_____ Accountability

Page 56: Understanding Program Evaluation Module 1

Quick Summary: Three Important Elements

There are three keys to Canadian program evaluation:

1. Evaluations are mostly issue driven
2. Evaluations are results logic (program theory) focused
3. Evaluations rely on multiple lines of enquiry

Page 57: Understanding Program Evaluation Module 1

Preview of the Upcoming Modules

Module # 2 Planning for Evaluations

Module # 3 Monitoring, Formative Evaluation and Data Collection

Module # 4 Outcome Evaluation

Page 58: Understanding Program Evaluation Module 1

A ‘Program’ as Seen From an Evaluation Perspective

[Figure: the same results chain as Page 25 – Needs → Objectives → Inputs → Activities → Outputs → Outcomes, within the Environment, noting the social value of inputs – here overlaid with the evaluation types along the chain: Needs Assessment at the front end, Monitoring and Measurement across activities and outputs, and Outcome / Impact Evaluation of results.]

Source: McDavid, J. & Hawthorn, L. (2006) Program Evaluation and Performance Measurement, Sage Publications
Adapted from Nagarajan, N. & Vanheukelen, M. (1997) Evaluating EU expenditure programmes: A guide (p. 25)

Page 59: Understanding Program Evaluation Module 1

For Next Week

• Look at some programs (select a case)
• Consider the results logic
– Inputs
– Activities
– Outputs
– Outcomes
• Come prepared to discuss