Development agency support for impact evaluation Harry Jones, ODI Impact Evaluation Conference 2009.

Page 1

Development agency support for impact evaluation

Harry Jones, ODI

Impact Evaluation Conference 2009

Page 2

Recent ODI work

1. Comparative study on evaluation policies and practices in development agencies

2. Improving impact evaluation production & use

Page 3

Comparative study on evaluation policies and practices in development agencies

Marta Foresti

with C. Archer, T. O’Neil, R. Longhurst

December 2007

Page 4

Overview of study

Scope and Objectives: A descriptive comparative study of evaluation policies and practices in key agencies, to inform the AFD reform process.

• key features of the evaluation function (e.g. mandate, position, management, roles, etc.)

• main aspects of evaluation systems, processes and tools

• practices involved in commissioning, managing and supporting evaluation

Activities
• Desk case studies: DANIDA, EU, OXFAM, IMF (evaluation units)
• Full case studies: DFID, SIDA, WB, AfDB, KfW (evaluation units)
• Key informant interviews

Outputs
• Case study reports (AFD)
• Final comparative report
• Workshops: mid-term (AFD internal); dissemination: AFD internal, ODI lunchtime meeting (Feb 08) and DAC network (March 08)

Page 5

Profiles of Evaluation Units - Overview

• Variability in budget and staffing

• Many evaluation policies being reviewed, updated or created

• Mandate not always clear in policies: lack of clarity across the organisation

• No single/unified methodology

Page 6

Independence vs Integration

• Most evaluation units (EUs) sit outside the management structure or operational departments, reporting to ministers, boards, etc.

• Position of unit important, but also rules for budget allocation, appointment of staff, disclosure (WB, IMF)

• All recognise tension between independence and integration. ‘Being involved’ as important as ‘being detached’.

• Reliance on ‘usual consultants’: are they ‘really independent’ and ‘free’ to be critical?

Page 7

Staff capacity, roles and responsibilities

• Main responsibilities: tendering, contracts and managing evaluation processes, not doing evaluation.

• Different levels and intensity of consultation with other departments: more on implementation and dissemination, less at the planning/decision phase

• Capacity and evaluation skills of EU staff a major constraint (DFID and others). Focus often on specific sectoral skills (e.g. economists at KfW)

• ‘New’ roles and responsibilities: KM and learning, communication, dissemination and capacity building

Page 8

Communication and dissemination

• Of increasing importance, beyond ‘dissemination of findings’ towards effective communication, reach and active engagement of client/stakeholders (big push at WB).

• Disclosure policies and transparency: all reports on the website

• Products go beyond reports: syntheses, briefs, seminars, the internet, etc.

• Limited feedback and weak evidence on utilisation (AfDB)

Page 9

Improving Impact Evaluation Production and Use

Nicola Jones, Harry Jones, Liesbet Steer, and Ajoy Datta

March 2009

Page 10

Overview of study

Scope and Objectives: Commissioned by DFID to inform discussions on IE production and use, particularly within NONIE
• To determine how amenable various methods for IE are to different types of projects, programmes and policies;
• To assess the dynamics around commissioning, production and delivery of IEs;
• To analyse how IEs are disseminated and communicated;
• To assess the use and influence of IEs; and
• To make recommendations to improve the production and use of IEs.

Activities
• Scoping study
• Literature review
• Annotated database of IEs
• Sector case studies
• Synthesis

Outputs
• ODI working paper
• Opinion piece

Page 11

Methodologies: Suitability and opportunities for IE

Similarities between sectors
• Projects with simple impact pathways
• Methodological innovation
• Call for pluralism

Differences across sectors
• Sector history of IE
• Gestation of impact
• Coverage and relevance

Page 12

Demand and supply: commissioning, production and delivery

• Largely supply-driven

• Upward accountability

• Less for downward accountability, learning

• Some exceptions in social development: range of implementing agencies, Southern government demand

Page 13

Communication, Use and Influence

• Communication varied but difficult at a national level; greater interest and initiatives at international level

Use:
• Some direct use
• ‘Legitimation’ the most common function
• Indirect and ‘enlightenment’ use

Page 14

Emerging messages

1. Common aims but diversity of practices, diversity of sector experiences, and of evaluation questions

2. Need for a plural approach to (I)E quality, and a balance between rigour and coverage for accountability

3. Improving agency learning from IEs is difficult, but crucial to improving programmes

4. Disconnect between rhetoric on the strategic importance of development evaluation and practice in development agencies. An ‘institutional gap’: need to invest in the institutional role of evaluation, at different levels.

5. How to strengthen demand for development evaluation?

Page 15

[email protected]

Other relevant ODI work:

• Development effectiveness: the role of qualitative research in impact evaluation – Martin Prowse (RPGG)

• Re-thinking the impact of humanitarian aid – Karen Proudlock and Ben Ramalingam (ALNAP)

Thank you!