Ensuring Credibility and Usefulness: Overseeing Independent and Quality GRPP Evaluations

Christopher D. Gerrard, Lead Evaluation Officer, Independent Evaluation Group, World Bank

Main Messages

► Both independence and quality are essential for credibility and usefulness

► For organizational and behavioral independence:
• Oversight committee should be qualified and attentive
• Management should provide logistical support and needed documentation as requested by evaluators
• Oversight committee should ensure up-front provision for dealing with delicate issues that might arise during implementation: conflicts of interest, political interference, evidence of wrongdoing

► For high-quality evaluations:
• Evaluation budget needs to be compatible with the evaluation design
• Evaluation team needs to have the required qualifications and experience
• Oversight committee and evaluation team need a good working relationship


Special Features of GRPPs

► Governing bodies usually lack evaluation expertise to oversee evaluations independently of management

► Evaluations are complex, multi-level and multi-dimensional

► The pool of independent evaluators with the necessary sector and global expertise, and with no previous association with the program, is small

► There are many potential threats to behavioral independence

► Programs have a global clientele, making transparent disclosure and dissemination important


Key Steps

► Drawing on necessary expertise to draft the TOR

► Signaling complexity and cost of the evaluation in the TOR, while remaining focused on its purpose

► Selecting a qualified and experienced external evaluation team

► Negotiating contracts, work program, and schedule

► Reviewing draft findings and disseminating the final evaluation report


Using an Oversight Committee

► Appointed by and reports to the governing body (GB)

► Oversees and ensures the overall independence and quality of the evaluation

► Drafts the TOR and approves it, or recommends it for GB approval

► Reviews submissions, and selects or recommends the evaluation team

► Reviews issues on contracts, conflicts of interest, and access to information that arise between the evaluation team and the program manager and staff

► Reviews the inception report, where applicable

► Reviews and comments on the draft final report before submission to the GB


Good Practice Examples: Tapping Expertise


Program – Source of Expertise

• Early Transition Countries Fund – Host agency: EBRD for TOR

• Africa Management Services Company – Host agency: IFC for TOR

• UN Trust Fund for Violence Against Women – Partner: UNIFEM for TOR, team selection, oversight and review

• Medicines for Malaria Venture – Donors: DFID and World Bank for TOR

• Global Donor Platform for Rural Development – Hired consultants to draft TOR

• Global Alliance for Vaccines & Immunization – Hired consultants to draft concept paper; proposers were invited to submit different evaluation approaches

Suggested Content of an Evaluation Terms of Reference

► Basic information about the program
► Purpose, scope and type of evaluation
► Evaluation criteria and questions
► Evaluation design and methodology
► Required qualifications of the evaluation team
► Work plan and schedule
► Obligations of key players in the evaluation
► Annexes


Provide Enough Detail to Signal Complexity and Cost of the Evaluation

► Include basic information about the program:
• Governance arrangements, and roles of stakeholders
• Scope of activities, portfolio and financial information

► Outline evaluation criteria and questions:
• Not only relevance, efficacy, efficiency & sustainability
• But also governance, management & resource mobilization

► Cover special donor or host agency fiduciary needs:
• Processes: competitive selection, standard contracts
• Content: compliance, efficiency, diversity of consultants

► Be clear on audience and evaluation products


Selecting the Evaluation Team: Some Key Issues

► Special expertise may be needed for GRPP evaluations:
• Global vs. country-level interventions, and interactions
• Governance and financial specialists
• Ability to analyze future strategic options with scenarios

► Competitive or deliberative selection process?

► Who assembles the evaluation team?
• Evaluator, before submitting proposals
• Program, after selection of individuals
• Some combination

► Disclosure of actual or potential conflicts of interest

► Encouragement of local consultants, diversity, consortia


Good Practice Examples: Addressing Conflicts of Interest (CoI) in the TOR


Program – Situation

• Global Environment Facility – Referred bidders to clearly defined GEF policy and required disclosure

• CGIAR – No program policy; defined CoI in the TOR

• infoDev – TOR included a requirement, under “obligations of evaluators,” to disclose CoI at any time it should arise

• UN Trust Fund on Violence Against Women – TOR called for bidders to attach their own policies on CoI

Good Practice Examples: Meeting Diversity Objectives in Selection


Program – Criteria

• Global Environment Facility – M&E Policy states principles which encourage diversity and use of local expertise; TOR included such criteria for selecting evaluators

• Human Reproduction Program – TOR specified diversity among criteria for selecting evaluators

• Integrated Framework for Trade-Related TA – Selection committee had developing country representation; at least one member of the evaluation team had to have experience working in developing countries

Contracting and Early Joint Planning

► If not in the TOR and RFP, the contract should cover:
• Requirements on consultation or participation (for interviews, sampling)
• Required reporting to GB or management (inception, progress reports)
• Who owns data and products, and any requirements on confidentiality
• Principles and processes for ensuring independence of findings and products
• How to discuss and decide on needed changes in approaches
• Approval required to add new staff, or deviate from the work program
• Who provides logistical, administrative support to evaluators
• What to do on discovery of fraud, misconduct, or human rights violations

► May need to reconcile, for GB approval, standard contract templates of the host agency with the program’s own principles or procedures


Conduct of Evaluation

► Management to assemble background documents before evaluators begin:
• Program’s authorizing environment, original objectives and activities, and their evolution
• Documentation and effective date of any change of membership, policies, rules, processes, principles
• Portfolio and financial documentation

► Oversight Committee to have early discussions with evaluators:
• Cover anything in the previous slide not in the contract
• If no inception report, still aim to agree early on approach, sample, who to interview, and work program
• Agree early on outline of evaluation products


Good Practice Examples: Early Discussion of Work Program


Format & Program Timing

Informal Discussion/Workshop

Medicines for Malaria Venture, Africa Program on Onchocerciasis Control, Africa Regional TA Centers

Timing varied among programs

Formal Inception/Scoping Report

Stop TB 1 week after contract signing

Global Water Partnership 3 weeks after contract signing

Global Development Network

Within 1 month of notifying selected evaluators

Cities Alliance 2 months after contract signing

Global Environment Facility (OPS3)

3 months after contract signing

Review of Drafts and Consultation

► Ideally, an efficient review process was agreed to in advance

► And the contract & work program provided for evaluator availability during and after the review process

► Reviews can be multi-stage and multi-product:
• Review of early findings (workshops or drafts)
• Review of draft evaluation products (needs comprehensive review)
• May be several evaluation products for different audiences

► Need to specify who reviews and who approves each product:
• Based on responsibility, expertise, and ability to contribute to improved quality
• Based on partner/stakeholder roles and interests


Good Practice Examples: Planning Ahead for Review of Draft Report


Program – Review Process

• Sub-Saharan Africa Transport Policy Program – Day-long meeting with Board, Management and Donors

• Development Gateway Foundation – Reviewed by Executive Committee & World Bank before finalization

• Integrated Framework for Trade-Related TA – Solicited feedback from beneficiaries (least developed countries) before finalization

• Global Development Network – Dates fixed in TOR for GB and Management review

Dissemination and Disclosure

► Evaluation planning should have covered all aspects:
• Dissemination of findings
• Clearance/confidentiality needs (e.g. country or institutional data)
• Translation needs

► Passive dissemination, meeting minimum requirements:
• Covered in program charter, principles or M&E policy?
• Covered in host agency policy?

► Proactive dissemination of final products:
• GB, Secretariat staff, donors
• Country partners and other stakeholders – fit products to audience
• Global community, research and evaluation constituency


Conclusions

► It takes consistent effort by the oversight committee to ensure both independent and high quality evaluations

► For organizational and behavioral independence, the respective obligations of the governing body, oversight committee, program management and evaluation team need to be clear and respected

► For high quality evaluations, the oversight committee and the evaluation team need to have a good working relationship that draws upon the strengths of both
