Contentsidev.afdb.org/sites/default/files/documents/files... · APPR Annual Portfolio Performance...


Contents

Acronyms ................................................................................................................................................... ii

I. Introduction ..................................................................................................................................... 1

II. Background ...................................................................................................................................... 1

III. Objectives, Purpose, Scope and Limitations ........................................................................ 1

IV. Approach ........................................................................................................................................... 4

V. Methodology ................................................................................................................................. 14

VI. Phases, Deliverables and Workplan/timeline ................................................................. 19

Annex 1: Terms of Reference .......................................................................................................... 22

Annex 2: Interview Template – Executive Directors ............................................................. 30

Annex 3: Interview Template – Managers and Staff (HQ and Regional hubs) ............. 31

Annex 4: Survey Template ............................................................................................................... 32

Annex 5: Report Outline .................................................................................................................... 33


Acronyms

AfDB African Development Bank

ADF African Development Fund

ADOA Additionality and Development Outcome Assessment

APPR Annual Portfolio Performance Review

BTOR Back-to-Office Report

CPPR Country Portfolio Performance Review

CSP Country Strategy Paper

DAM Delegation of Authority Matrix

DBDM Development and Business Delivery Model

EVRD Evaluation Results Data Base

IDEV Independent Development Evaluation Department

IPRR Implementation Progress and Results Reporting

KPI Key Performance Indicator

M&E Monitoring & Evaluation

MTR Mid-Term Review

OECD Organization for Economic Co-operation and Development

OM Operations Manual

PCR Project Completion Report

PCREN Project Completion Report Evaluation Note

PMG Performance Monitoring Group

PPER Project Performance Evaluation Report

QoS Quality of Supervision

RISP Regional Integration Strategy Paper

RLF Results-based Logical Framework

RMF Results Measurement Framework

RMR Results monitoring and reporting

RPPR Regional Portfolio Performance Review

SES Self-Evaluation System

TOC Theory of Change

TYS Ten-Year Strategy

WBG World Bank Group

XSR Extended Supervision Report


I. Introduction

The Independent Development Evaluation Department (IDEV) of the African Development Bank Group (AfDB or the Bank) has retained the services of Centennial Group International to carry out the Evaluation of the Bank's self-evaluation system and processes (SES). Annex 1 provides the Terms of Reference for the assignment. In addition to this introduction, this Inception Report contains five sections covering: (i) background; (ii) objectives, purpose, scope, and limitations; (iii) approach; (iv) methodology; and (v) phases, deliverables, and workplan/timeline.

II. Background

Aside from the independent evaluation function performed by IDEV, AfDB’s management also employs self-evaluation systems and processes (SES) that help in: assessing the Bank’s investment efforts, learning from operational experiences, and monitoring and improving the Bank’s performance. The SES are defined in different Bank documents including:

• Operations Manual (OM), which was initially adopted in 1993 and revised in 1999 and, most recently, in 2014 (the next revision of the current OM is expected to commence in 2019);

• Delegation of Authority Matrix (DAM) and relevant Presidential Directives;

• Additionality and Development Outcome Assessment (ADOA) framework for the Bank's non-sovereign operations; and

• 4-level Results Measurement Framework (RMF), which is central to the Bank's SES.

Various strategy and policy papers are also important. On the strategy front, the Bank's Medium-Term Strategy (MTS) was replaced in 2013 with the Ten-Year Strategy (TYS)—At the Center of Africa's Transformation 2013-2022. Following the adoption of the TYS, the Bank has undergone major changes, particularly the adoption of the High 5s priorities and related strategies. This has been accompanied by the Board's endorsement and subsequent implementation of the new Development and Business Delivery Model (DBDM).

In response to changing contexts within and outside the Bank, the Bank’s SES continues to evolve over time.

III. Objectives, Purpose, Scope and Limitations

The overall objective of this assessment is to provide support for the evaluation of the SES as per the attached Terms of Reference (Annex 1) and as modified in this Inception Report. The evaluation team will be responsible for identifying and systematically synthesizing evidence from various sources and for conducting the evaluation following the Organization for Economic Co-operation and Development/Development Assistance Committee (OECD/DAC) evaluation criteria and principles as well as the good practices from the Evaluation Cooperation Group (ECG).

More specifically, the objectives are to:

Assess how well the Bank’s SES perform, focusing on their relevance, coherence, efficiency, effectiveness, and short-term impact in serving three primary objectives—improving performance, enhancing accountability for performance, and promoting learning;

Identify and assess the enablers and barriers that affect the design, implementation, and results of the Bank’s self-evaluation systems and processes;


Distil lessons and good practices, and formulate recommendations to enable the Bank to enhance the quality and performance (design, scope, implementation and results) of its self-evaluation systems and processes.

The focus of this evaluation will be the Bank’s SES as they functioned between 2013 and 2018. This period covers a considerable part of the implementation of the TYS, the adoption of the High 5s strategies, as well as recent reforms including implementation of the DBDM and process reengineering. This period also encompasses the issuing of the updated Operations Manual in 2014; findings from the evaluation would thus support and inform the upcoming 2019 revision.

The purpose of the evaluation will be to support the Bank’s management and operational staff through its findings, conclusions and recommendations in:

• Improving self-evaluation and performance management of operations, strategies, and policies;

• Improving the relevance and quality of the Bank's Operations Manual, whose revision is to start in 2019;

• Promoting learning from experience, and enhancing operational effectiveness;

• Supporting the implementation of the new Development and Business Delivery Model (DBDM), and process reengineering; and

• Accounting to the Board of Directors and other stakeholders for the results of the investments in the Bank's SES.

The evaluation will build on and complement the:

• IDEV Evaluation of the African Development Bank's quality assurance across the project cycle (2013-2017) and Evaluation of the Integrated Safeguards System; and

• Independent audit of Bank results monitoring and reporting (RMR).

Scope. The evaluation's intended audience is primarily Bank staff and managers, as well as members of the Bank's Board. The evaluation is also expected to be of interest to comparator organizations.

With the Bank's self-evaluation systems covering many different operational product lines, the scope of the evaluation includes the range, processes, and methodologies employed by these products. Some of the Bank's operations and reports are validated by IDEV and feed into corporate scorecards and results measurement systems. An important distinction can be made between the mandatory self-evaluation products (e.g. IPR, PCR, XSR, CSP-CR) and voluntary evaluation studies such as impact evaluations and occasional programmatic evaluations or retrospective studies commissioned by individual business units. Table 1 below shows the current coverage of AfDB's self-evaluation tools and systems.

The evaluation will assess SES for sovereign and non-sovereign development operations. It will focus on:

• Country Strategy Papers (CSPs) and Regional Integration Strategy Papers (RISPs);
• Sector and, more recently, High 5s strategies;
• Investment operations (both sovereign and non-sovereign);
• Program-based Operations (PBOs); and
• Self-impact evaluations undertaken by the Bank.


Table 1: Self-evaluation tools and processes

| Activity level | Area of focus | Self-evaluation tool (acronym) | Responsible for preparation | Frequency and coverage | IDEV role and validation coverage | Main purpose |
|---|---|---|---|---|---|---|
| Project | Lending | IPR, XSR | Management | Bi-annual | | Performance management |
| Project | Lending | MTR | Management | Mid-term | | Performance management; course correction |
| Project | Lending | PCR | Management | Completion | Validation | Learning |
| Project | Lending | PCREN, XSREN | IDEV | Completion | Validation | Learning |
| Project | Lending | Impact Evaluation | Management | Completion (occasional and voluntary) | | Learning |
| Project | Advisory services and analytics | Completion Report | Management | Completion | | Learning |
| Project | Trust Fund activities (Trust Fund grants) | Completion Report, Annual Progress Reports | Management | Completion | | Accountability |
| Program | Country Program | CSP-CR | Management | Completion | Validation (pilot) | Learning |
| Program | Regional and Sector Program | RISP-CR | Management | Completion | Validation | Learning |
| Program | Sector Strategy | Progress Reports | Management | Occasionally | | Accountability |
| Program | Sector Strategy | Completion Reports | Management | Completion | | Learning |
| Program | Global and Regional Partnership Programs | Completion Reports | Management | Completion | | Learning |
| Corporate | Corporate Results Framework | Progress Reports | Management | Completion | | Learning |
| Corporate | TYS, High 5s | Progress Reports | Management | Completion | | Learning |


Aggregated self-evaluation results feed into apex reports that provide corporate-level accountability. These include:

• The Bank's RMF and the associated Annual Development Effectiveness Review;
• Portfolio monitoring reports; and
• Reporting to the Board on progress in implementing strategies.

Limitations. As indicated in the Terms of Reference, personnel self-evaluation systems and processes fall beyond the scope of this assignment.

Evaluations that typically are not covered by self-evaluation and hence outside the scope of this evaluation include:

• Board operations
• Control functions
• Treasury operations

Possible limitations to the evaluation may come from the availability of SES products and documents as filed by the task managers in the Bank’s system.

IV. Approach

Theory of Change (ToC): IDEV's evaluation policy, the OECD-DAC criteria, and the Evaluation Cooperation Group's Big Book on Evaluation Good Practice Standards will guide the evaluation, which will be based on the theory of change presented schematically in Figure 1.

The theory of change underpinning the self-evaluation architecture is based on the fundamental logic that well-functioning SES can play a useful and effective role in helping to improve:

(i) performance management and how the availability of reliable information and evidence can help management take timely decisions to improve portfolio performance;

(ii) accountability and how the provision of key information at different levels (project, program, corporate) signals that the AfDB holds itself accountable for achieving results; and

(iii) learning and how the SES can be a tool for continuous learning and adaptation internally and externally.

The evaluation will look at broad aspects of the architecture of the SES (components, processes and mechanisms). It will also examine the causal pathways going from the basic inputs into the self-evaluation systems (the portfolio at entry, the M&E systems, the business processes, the leadership signals and incentive structure, the budget, and the various strategic and guidance documents put in place by the institution) to how they influence the achievement of outputs, outcomes and impact. In this context the evaluation will also consider the stakeholders of the self-evaluation systems, and their perspectives and inter-relationships.


Figure 1: Preliminary Theory of Change


Also to be examined during the evaluation are the links between inputs and outcomes, which are ensured through the periodic production of a number of reports (outputs) generated during project supervision and at closing (PSR/IPRR, XSR, MTR, CSP/RISP-CR, etc., and occasionally self-impact evaluations). These reports feed into broader reporting arrangements at the corporate level. Links between the self-evaluation systems and other major systems will also be reviewed to assess how they influence the overall response culture, the incentive structure and performance; these systems include project logframes, IDEV's own independent evaluations and 'validation' exercises (e.g., PCREN, XSREN), commitments made at the corporate level, the Bank's Operations Manual, and other requirements. The interfaces between the various systems, gaps in coverage, overlaps, relevance, periodicity, and the overall supporting environment will be analysed, as they can influence the performance of the self-evaluation systems architecture.

In examining the various causal pathways, a number of assumptions and facts will be tested to probe the robustness and credibility of the system and identify the weak links that could lead to recommendations for improvements. The key assumptions for the different levels of ToC causality cover:

• Assessment of the enabling environment and barriers to effective self-evaluation;

• Assessment of the prevailing incentive structure, including how it influences individual behaviours;

• Balance between compliance and results-orientation leading to the intended results;

• Cost-effectiveness of what is produced compared to its added value;

• Adequate production, use and relevance of the various project rating systems; and

• Likelihood that timely corrective action can be undermined by transaction costs, aversion to risk, etc.

The evaluation will be forward looking and will offer Management a number of recommendations that can enhance the performance of the tools, methods, indicators, processes and incentives that are most likely to establish trust in the SES and the credibility of their results. This, in turn, would promote more effective use of self-evaluation outputs, especially for decision-making, learning, and accountability for improving the Bank's development effectiveness.

This evaluation will primarily rely on the OECD-DAC criteria of relevance, effectiveness and efficiency, while assessing results beyond outputs. Coherence within the Bank's self-evaluation systems and processes, and between them and the Bank's independent evaluation function, will also be assessed.


Evaluation Questions: As suggested in the Theory of Change, the overarching question to be addressed by the evaluation is:

Do the self-evaluation systems and processes (SES) support Performance Management, Accountability, and Learning at the Bank?

With two underlying sub-questions:

1. How well are the SES performing?
2. To what extent are the SES affecting the quality of development results?

Questions and sub-questions in the Evaluation Matrix below are organized under each of the three pillars (Performance Management, Accountability and Learning) and subdivided according to the four evaluation criteria (Relevance & Coherence; Cost-Efficiency; Effectiveness & Impact Contribution; and Incentives and Barriers).


Table 2: Evaluation Matrix

Relevance & Coherence

Performance Management

EQ1. Are the SES and their different instruments relevant and coherent in how they are aligned with the Bank's strategy and guidance, and do they serve their purpose for performance management? [To what extent is there evidence that implementation of the SES results in enhanced performance of projects and country programs?]

Assessment criteria and indicators:

• Extent to which the set of policy and guidance documents describes a coherent process for the SES, with added value by each instrument, clear roles and responsibilities, and a coherent and objective description of the rating process. The main instruments to be analysed are: (i) projects and the project portfolio – IPRR/XSR (for non-sovereign operations), MTR, PCR, CPPR and APPR; Aide Memoires and BTORs will also be considered; (ii) country/regional strategies – CSP/RISP-MTR, CSP/RISP-CR; and (iii) sector/thematic strategies – "Reviews".
• Extent to which the set of SES instruments is aligned with the main policy and guidance documents: (i) Operations Manual, Presidential Directives and Delegation of Authority Matrix (DAM), DBDM, High 5s, TYS, etc.; and (ii) focus on priority areas: gender, fragility, safeguards, fiduciary, governance.
• Extent to which the implementation of the SES and its instruments complies with the main guidance documents.
• Extent to which the ratings system reflects a consistent, timely and accurate approach to performance management for the various processes around each instrument.
• Extent to which the SES is used as a tool for corrective action and analysis of possible impediments.

Sources: desk-based review; stakeholder survey, interviews and focus group discussions (staff and managers; HQ and field); recent evaluation reports (quality at entry, quality of supervision, CEDR, PCR and XSR evaluation notes, cluster evaluations); case studies and process reviews; benchmarking analysis.

Accountability

EQ2. Are the SES and their different instruments a reliable and relevant framework geared towards verification of results, accountability and reporting at the staff/management level and the corporate level? [Does the SES generate useful information for staff, management and the Board, signalling that the AfDB holds itself accountable for achieving results?]

Assessment criteria and indicators:

• To what extent the SES relies on strong M&E systems as a critical input for the credibility of the system.
• To what extent the different policy and guidance documents describing the SES are adequate to verify achievement of results and lines of accountability.
• To what extent the SES is internally coherent and consistent with the Bank's independent evaluation function.
• To what extent the SES provides a broad and relevant perspective on the results achieved and communicates overall performance in an easily understood way.
• To what extent staff and management are being held accountable for the proper implementation of the SES towards its intended results-based objectives.
• To what extent the SES and its different levels of aggregation provide relevant and accurate reporting of results through the RMF and other reporting tools (dashboards, etc.).

Sources: desk-based review; stakeholder survey, interviews and focus group discussions (staff, managers and Executive Directors; HQ and field); recent evaluation reports (quality at entry, quality of supervision, CEDR, PCR and XSR evaluation notes, cluster evaluations); benchmarking analysis.

Learning

EQ3. Are the SES being used as a reliable and relevant framework for learning and innovation? [Is the information produced being used to shape how learning takes place?]

Assessment criteria and indicators:

• To what extent processes, roles and tools are well-defined, comprehensive and integrated for learning and knowledge management.
• To what extent comprehensive scoring and IT systems are in place to ensure learning from relevant project cycle and thematic activities.
• Whether the SES is providing the most relevant information to the different audiences.
• To what extent the SES is used as a tool for learning, and as a repository of evidence of good practices, failures and lessons learned.

Cost-Efficiency

Performance Management

EQ4. To what extent do the SES and their instruments provide a reliable and cost-effective framework for portfolio management? [Are the SES implemented in a cost-effective and timely manner, in terms of the time, financial and human resources needed to deliver their intended benefit?]

Assessment criteria and indicators:

• Whether the costs of the Bank's SES across the project cycle are appropriate relative to the results achieved.
• To what extent the resource requirements are comparable to the frameworks of other organizations.
• To what extent the elimination of possible overlaps, redundancies or requirements could improve cost-efficiency.
• To what extent the different instruments of the SES add value compared to their costs.
• To what extent budget or staff overload is a factor in the proper implementation of the SES as described.
• To what extent proactivity by staff for corrective action is promoted or impeded by transaction costs, project restructuring procedures, etc.

Sources: desk-based review; stakeholder survey, interviews and focus group discussions (staff, managers; HQ and field); recent evaluation reports (quality at entry, quality of supervision, CEDR, PCR and XSR evaluation notes, cluster evaluations); case studies; benchmarking analysis.

Accountability

EQ5. Do the SES provide a reliable and cost-effective framework for reporting and accountability, internally and externally? [Is the money spent on the SES commensurate with the quality and usefulness of the information provided?]

Assessment criteria and indicators:

• To what extent quality at entry is a factor in the implementation of a cost-effective SES for accountability and reporting.
• To what extent M&E systems are a factor in the establishment of a credible accountability framework.
• To what extent reporting requirements are adequate and cost-effective in providing the right information to the different audiences.
• To what extent users find data input costly in terms of time compared to the benefits.
• To what extent the ratings and their aggregation are being used as a cost-efficient tool for corporate performance reporting.
• To what extent the way aggregation takes place for corporate reporting is adequate and cost-efficient.

Learning

EQ6. Are the SES being implemented as a cost-effective tool for learning? [Is the money spent on the SES commensurate with the amount of learning and innovation it generates?]

Assessment criteria and indicators:

• To what extent templates and instruments support efficient recording of lessons and add the most value.
• To what extent SES and rating validation could be a factor in increasing the opportunities for learning.
• To what extent the SES is being used to record and gather data for learning purposes, events and knowledge management.
• To what extent the SES is trusted enough to be a source of learning, and what the role of independent evaluations could be.

Effectiveness & Impact Contribution

Performance Management

EQ7. Is the SES architecture being implemented as a tool to enhance portfolio performance and the achievement of results? [What is the evidence that the SES contributes to improved quality at exit?]

Assessment criteria and indicators:

• To what extent the system is designed to provide the right balance between ensuring compliance and pursuing results.
• To what extent the SES guidance is geared towards promoting proactivity in addressing issues and corrective actions.
• To what extent the inputs to the SES (quality at entry, M&E, processes, instruments, budget) are adequate to ensure proper delivery of the SES.
• To what extent the implementation of the rating system is sufficiently trusted to be a reliable tool for performance management.
• To what extent the SES has facilitated the implementation of mitigation and safeguard measures.
• To what extent the SES has contributed to effectively addressing cross-cutting issues such as gender, fiduciary and fragility.

Sources: desk-based review; stakeholder survey, interviews and focus group discussions (staff, managers; HQ and field); recent evaluation reports (quality at entry, quality of supervision, CEDR, PCR and XSR evaluation notes, cluster evaluations); case studies; benchmarking analysis.

Accountability

EQ8. Is the SES architecture being implemented as a tool to enhance accountability, consistently with the provided guidance and the Bank's priorities? [Is the degree of accountability generated by the SES conducive to improved results?]

Assessment criteria and indicators:

• Whether roles and responsibilities are sufficiently clear in the definition of the different processes and requirements surrounding the preparation, conduct, review, sign-off and follow-up of the various steps of the different SES instruments.
• Whether the design and guidance around corporate reporting processes are effective and well integrated in the SES.
• Whether the Bank's SES has been delivered as expected for accountability purposes.
• To what extent enforcement of procedures has been enacted when needed.
• To what extent the implementation of the rating system is sufficiently trusted to be a reliable tool for accountability and reporting requirements.
• To what extent attribution is a factor in determining the degree of the Bank's accountability.

Learning

EQ9. Have the SES contributed to the identification and use of lessons learned? [Is the SES, as designed and implemented, the right tool for learning?]

Assessment criteria and indicators:

• To what extent the focus on accountability can undermine the relevance and usefulness of the SES for learning.
• To what extent independent or arm's-length evaluation can play a role in enhancing learning opportunities.
• To what extent concerns over ratings and disconnects distract from learning.
• To what extent the SES can be used more strategically to fill knowledge gaps and for lesson learning.


Incentives and Barriers

Performance Management

EQ10. Are the incentives in place conducive to candid assessments and proactivity for portfolio performance and corrective action? [Do the incentives in place ensure that the SES is implemented as designed for quality results?]

Assessment criteria and indicators:

• To what extent a compliance mindset and a focus on ratings can distort the systems and create biases or a candour gap.
• To what extent the guidance provided can enhance incentives for results and corrective action rather than compliance and transaction costs.
• To what extent team or third-party validation can help the system remain honest.
• To what extent evidence is available to justify the ratings, and what the risk of "gaming the system" is.
• To what extent poor project performance is seen as linked to staff performance, potentially leading to risk aversion and fear of tainting staff reputations.
• To what extent the incentives in place reflect the right balance between the focus on lending vs. supervision.

Sources: desk-based review; stakeholder survey, interviews and focus group discussions (staff, managers; HQ and field); recent evaluation reports (quality at entry, quality of supervision, CEDR, PCR and XSR evaluation notes, cluster evaluations); case studies; benchmarking analysis.

Accountability

EQ11. Are the incentives in place conducive to candid assessments for accountability? [Can incentives influence behaviours so that accountability is towards the achievement of results rather than compliance with rules?]

Assessment criteria and indicators:

• To what extent the adequate implementation of the SES should be linked to staff performance evaluation.
• To what extent a greater focus on the inputs (quality at entry, budget, business processes, M&E systems) is likely to improve the incentive structure in place for effective implementation of the SES.
• To what extent Management is exerting leadership over the correct implementation of the system and the lines of accountability.
• To what extent staff and managers see the SES as a relevant accountability tool that requires trust and close follow-up for it to be useful.

Learning

EQ12. Is the incentive structure geared towards the use of the SES for continuous learning and innovation? [Do the incentives in place ensure that the SES serves as a learning tool?]

Assessment criteria and indicators:

• To what extent new mechanisms for lesson sharing, a "safe space" for debate, and incentives for transparency would enhance the credibility and relevance of the SES for learning purposes.
• To what extent learning and feedback loops can lead to system improvement: enhanced flexibility, better procedures for project restructuring, recognition for excellence, and differentiation according to specific situations (e.g., fragile contexts).
• To what extent more direct involvement of independent third parties and specific thematic/country events is likely to increase the credibility of the system and learning opportunities.
• To what extent incentives can change behaviours in terms of documenting learning, evidence of proactivity for corrective action, best practices of ratings follow-up, awards for innovation, etc.
• To what extent better incentives can increase the perceived value of the knowledge created, mainstream risks and failures as part of the business, and create opportunities for mining lessons and knowledge and for organizational learning.
• To what extent the focus on corporate results reporting for accountability and ratings weakens staff attention to other purposes such as learning.


Evaluation Framework. The framework adopted for the evaluation is illustrated in Figure 2 below.

Figure 2: Evaluation Framework

V. Methodology

In conducting this evaluation, the team will situate its analysis within the continuum of evaluative work carried out by the Bank and IDEV over the last few years to accompany the institution's ongoing transformation.

IDEV has recently completed several evaluation studies that are relevant to the SES evaluation, as they allow it to be carried out as part of a logical sequence of validated analysis aimed at improving the Bank's capacity to deliver a renewed strategic and operational framework. In particular, these studies have examined: the relevance, efficiency, effectiveness and institutionalization of the Bank's quality assurance processes through the project cycle at entry, supervision and exit (2013-2017); country strategy and program evaluations; the Integrated Safeguards System; the independent audit of Bank results monitoring and reporting (RMR); the preparation of PCR Evaluation Notes (for all projects closed in 2017); and the Comprehensive Evaluation of the Development Results of the African Development Bank Group (2017).

This evaluation will not duplicate efforts but will build on this already-validated base of evidence, focusing more specifically on the performance of the SES itself and tailoring the methodology accordingly. To the extent possible it will draw on the relevant data and evidence already collected, while filling gaps as needed to meet new data requirements specific to the SES. More specifically, this evaluation will pair with the Quality at Supervision and Exit evaluation (QoS) and with the independent preparation of the 2017 PCRENs. While the QoS has looked at the various components of project supervision, including from the Borrower's perspective, the SES evaluation will focus on the internal processes, instruments and mechanisms put in place to monitor, report and learn on project and sectoral performance and results. While the quality of supervision relies in good part on the performance of country governments, partners, local stakeholders, etc., the SES is entirely within the Bank's control, as is the capacity to adopt and implement recommendations. The PCRENs prepared in 2018 will represent the end result of the application of the SES across all closed projects and a valuable pool to draw from for validation purposes.

The evaluation will follow a mixed methods approach and rely on diverse methodological approaches targeted to answer particular evaluation questions. Data collection methods will match particular questions and multiple sources will be used to triangulate information. A range of information sources will be used.

Meta-analysis: This will comprise literature and desk reviews of evaluations of self-evaluation systems conducted by other MDBs, during which the team will assess the experiences of organizations with similar self-evaluation systems and examine how these institutions assess their self-evaluation systems with respect to the three pillars of performance management, accountability and learning (individual and corporate). It will also examine the common issues across the MDBs, including factors affecting the three pillars.

Benchmarking: This will be conducted in parallel with the meta-analysis and will compare various components of the self-evaluation systems of comparator/sister organizations, including other MDBs. It will complement the meta-analysis and cover the key elements of each self-evaluation system, its coverage and processes, and discussions with key stakeholders. It will also cull lessons of experience and good practice. For the purpose of this analysis we will examine the experience of the World Bank Group, IFAD and the Asian Development Bank.

Case Studies: The evaluation will analyse:

1. CSPs (inception to mid-term review and CSP-CR) and CPPRs for 2 countries. It will examine the coherence of the SES with respect to: i) the clarity of strategic objectives and the underlying theory of change/results framework, policy dialogue, the risk assessment process, and consultation with RMC/partners on strategic issues including cross-cutting issues; ii) clarity of results monitoring indicators and the M&E system (data collection, data analysis and use), frequency and timeliness; iii) adequate production, relevance and use (learning) of the CSP-MTR and CSP-CR; and iv) use of self-evaluation results in the independent evaluation of the CSP (IDEV). The countries for the case studies are proposed to be selected from Cote d'Ivoire, Kenya, Morocco and Tunisia.

2. A purposive sample of closed projects or PBOs (roughly 15) out of the pool of projects reviewed by IDEV in 2018, for which PCRENs were independently prepared and PCRs validated. This will include a small sub-set of private sector operations. The sample will also include projects in the countries selected for the CSP/CPPR case studies.

The projects would be followed through their life cycle to evaluate the consistency and sequencing of the various reports and actions. The purpose would be to review the coherence of the SES with respect to: i) efforts at corrective actions taken between project effectiveness and closing; ii) conformity in the application of existing reporting requirements, and the quality of the results framework, theory of change and underlying assumptions; iii) clarity of results monitoring indicators and the M&E system (data collection, data analysis and use); iv) quality of production, relevance and use (for learning) of the project readiness review, IPR, MTR and PCR; v) review of results aggregation at sector/country level and use in the RMF; vi) consultation with stakeholders for the production of the country portfolio performance review (mutual accountability and learning); and vii) use of IPR and PCR data and analysis in synthesis reviews and validation of PCRs (IDEV).

3. The East Africa RISP will be used to analyse the application and effectiveness of the SES for a strategy paper associated with a multi-country environment. It will mostly look at its evaluation at exit and at learning opportunities for the next RISP or for RISPs in other sub-regions.

Case studies approach. The approach taken is consistent with the evaluation's intention to look at the SES as a whole and across the board, examining how it performs vis-à-vis the three main outcomes identified in the Theory of Change: i) performance management, ii) accountability, and iii) learning.

Under performance management the evaluation will examine efforts made (or lack thereof) to: i) provide relevant and accurate information on project performance, and ii) use the SES to take corrective measures when needed to improve quality at exit. The availability of the PCREN for each of these projects will provide a solid benchmark against which to compare SES performance during the life of the project/program. The evaluation will seek to better understand the correlation between the availability of quality SES products and quality at exit.

Under accountability the evaluation will assess the extent to which the Bank complies with its own policy, operational guidance and processes, and with the accountability mechanisms enacted in line with roles and responsibilities at the various levels. This implies a review at two levels, to assess: i) whether the degree of accountability generated by the SES is conducive to improving results; and ii) the extent to which accountability at the corporate level, generated by the SES through the RMF and other aggregated instruments, provides relevant and credible information that is well integrated and connected with portfolio and strategic performance. To that end, the capacity of the RMF to distil the main outcomes and lessons of the SES for external reporting will be analysed to understand how it links with the portfolio and how it connects with the various project reports.

Under learning the evaluation will examine available documents for follow-up projects under preparation out of the pool of the 2017 PCRENs. This analysis will make it possible to verify the use being made of the SES, in view of its learning potential, in the preparation of project documents and knowledge products.

Case studies interviews. The case studies will be supplemented with interviews with the task managers and managers responsible for the selected projects, the 2 CSPs and the RISP. The questions will take the form of a conversation about the specific product each interviewee was responsible for, along the following lines.

Table 3: Illustrative areas of enquiry for the case studies interviews with Task Managers

Performance
- Based on your experience, what were some of the main lessons derived from the application of the SES guidelines, particularly with respect to the production and use of the mandated documents and ratings?
- Existing incentives or disincentives around candour, objectivity and relevance that contributed to behavioural changes within the specific projects you were involved in.
- Main reasons for the disconnect between IDEV validation (and the QoS report) and the findings brought forth by the SES application in your projects.

Accountability
- The nature of leadership signals during the period of implementation of the projects you managed, and the degree of accountability exerted by Management and by the Board (through the RMF).
- Within the context of your specific projects, to what extent was the application of the SES and portfolio performance discussed with Management or at the level of the staff performance evaluation?
- Issues around compliance-checking vs. proactive follow-up on problems identified during supervision, and how this affects staff capacity to call on the right expertise and to support the borrower's implementation capacity and accountability.

Learning
- The extent to which the main outcomes and lessons learned (PCRENs) were used for the preparation of a follow-up project, portfolio reviews or thematic documents.
- The nature of the incentives that could be put in place for a more effective use of the SES mechanisms for future projects.
- Would the use of more independent third-party interventions help improve the credibility of the system and opportunities for learning?


Interviews: The key informant interviews (beyond those related to the case studies) will cover staff and managers as practitioners and resource persons knowledgeable about the SES. Executive Directors will also be included as key informants as part of the analysis of external accountability.

Table 4: Key Informant Interviews (case studies)

                                   HQ    Regional hubs    Total
Task Managers (sovereign)           5                5       10
Managers                            3                2        5
Task Managers (private sector)      3                2        5
Task Managers (CSPs and RISP)       1                2        3
TOTAL                              12               11       23

Interview questions: Given the proposed methodology, and the coverage already provided to the same or similar questions in the QoS, the interviews can be relatively limited and focused on the handling and performance of the SES from the perspective of the staff and managers. Evidence from the QoS will be used to complement SES data. The questions presented in Annexes 2 and 3 will be used as the underlying thread for a conversation on that subject. The questions for staff and managers will be the same, allowing the team to detect possible discrepancies and misalignments. Questions for the EDs will be different.

Survey: A survey, to be conducted early in the evaluation, will focus on the professional, mostly operational, staff who are directly or indirectly involved in the production or utilization of the information from the self-evaluation system (see Annex 4). The survey will seek to gather information on depth of knowledge of the various components of the system, familiarity with issues related to compliance and quality, understanding and use of learning tools (e.g. EVRD), incentives and other drivers, and various accountability mechanisms. It will also gather information on how specific priorities are being addressed such as gender, fragility, and safeguards. Together with the informant interviews, the survey will provide a rich source of information on the broad question of system architecture, performance and drivers. A client survey is not proposed since the self-evaluation system is internal to an organization and directly impacts the three pillars, but only indirectly affects the development outcomes; the benefits of a client survey might not justify the cost including the imposition on RMC officials. Annex 4 contains a proposed survey template.

Data Analysis: The documents and data sources highlighted above will form the basis for generating findings and drawing conclusions, lessons and recommendations, which will in turn underpin the background papers/reports and the evaluation synthesis report. Emerging findings will be shared with stakeholders for feedback.


VI. Phases, Deliverables and Workplan/timeline

Phases of the Evaluation: The Evaluation process will comprise the following phases:

Inception: The inception phase is of vital importance. It will include (i) obtaining Bank documents and data; (ii) a broad document review, including Bank policy and guidance, the Operations Manual, relevant evaluations1, and guidance notes for the preparation of self-evaluation reports; (iii) Bank processes for the preparation of the various self-evaluation reports, including validation and internal/external reporting; (iv) examples of sovereign and non-sovereign project documentation; (v) scoping interviews/focus groups with key informants amongst Bank staff and management; (vi) reconstruction of the theory of change, including elaboration of the evaluation matrix and mapping of stakeholders; (vii) a rapid assessment of available data; (viii) development of the survey and interview instruments; and (ix) an evaluability assessment.

Meta-analysis and benchmarking: This work will begin in parallel with the inception phase.

Data collection: This phase will comprise the staff survey, interviews with key informants and case studies supplemented with interviews of task managers and managers responsible for selected projects and other outputs covered in the case studies.

Data analysis: This phase will cover all the data sources highlighted above, allowing for the generation of findings and the drawing of conclusions, lessons and recommendations, which will be the basis for the preparation of the background papers/reports and the evaluation synthesis report. Emerging findings will be shared with stakeholders for feedback.

Synthesis and Report preparation: This phase will entail: synthesizing the analysis and the messages from the background papers/reports to prepare the draft evaluation synthesis report; and presenting the evaluation findings to the Evaluation Reference Group (ERG) and other stakeholders for feedback.

Production and delivery of the final evaluation report: In this phase, the Evaluation report will be finalized to take account of the feedback as the basis for dissemination and follow up.

Communication and dissemination of evaluation results: This would be the final phase.

This assignment will begin in February 2019 and last for a period of 5 months.

1 Especially Quality Assurance across the Project cycle, PCREN synthesis reports, CEDR and associated background documents, Mid-term evaluation of CSP, various cluster evaluations.


Deliverables: The Evaluation team will deliver three principal outputs, each first in draft and then in final form after taking account of feedback from IDEV and, where relevant, the Reference Group:

1. An Inception Report: The inception report will set out in greater depth the evaluation team's understanding of the assignment as well as its approach to undertaking it; the plan will identify the sources of primary and secondary data that will be used and the detailed schedule of work. It will also include a framework for semi-structured interviews, focus-group discussions, sources for content analysis of project performance data, and institutional benchmarking. This draft is being provided to IDEV; the final inception report, reflecting IDEV feedback, will provide the basis to proceed with the diagnostic study; and

2. Background Reports: The team will prepare the following brief background papers/reports:

i. Inception Report
ii. Benchmarking and Meta-analysis
iii. Survey/Interview findings
iv. Case studies

3. The Evaluation report: The summary report will include a brief (2-page) Executive Summary. See Annex 5 for a detailed outline of the report.

Excluding annexes, it is anticipated that the main report will be around 25 pages in length, with additional information provided as annexes or, where appropriate, provided separately for IDEV records; any data files will be provided in soft copy. The team will provide a draft for IDEV feedback/comments and for consultation within the Bank, and will revise the report in light of the feedback on the draft.

The deliverables are shown in Table 3 and the associated timeline in Table 4 on the following page. The timeline assumes contracting and availability of the required documents by end-January.

Table 3: Deliverables

Deliverable                       Date (2019)
Draft Inception Report            February (week 1)
Final Inception Report            February (week 4)
Draft Background Reports          May (week 2)
Draft Evaluation Report           May (week 3)
Revised Background Reports        May (week 4)
Presentation of report to RG      June (week 2)
Final Report                      June (week 4)


Table 4: Timeline

[Gantt chart: tentative dates by week, from the week beginning 14 January through the week beginning 3 June, covering the following tasks: data collection and analysis; drafting the Inception Report (first draft); feedback on the Inception Report; revising the Inception Report (final draft); drafting the Background Reports; feedback on the Background Reports; revising the Background Reports (final draft); drafting the Evaluation Report; feedback from the Reference Group; and drafting the Final Report.]

Note: The timeline assumes contracting and availability of documents by the end of January.


Annex 1: Terms of Reference

AFRICAN DEVELOPMENT BANK

TERMS OF REFERENCE

Consultancy Services to Conduct an Evaluation of the Self-evaluation Systems & Processes of the African Development Bank

I. Introduction

The Independent Development Evaluation Department (IDEV) of the African Development Bank Group (hereafter "the Bank") requires the services of a consultancy firm (hereafter "the consultant") familiar with International Financial Institutions' operations and monitoring and evaluation systems and processes to evaluate the Bank's self-evaluation systems and processes. The evaluation aims to assess the relevance, coherence, effectiveness, efficiency and contribution of the Bank's self-evaluation systems and processes. The assignment will be conducted under the general supervision of an IDEV Task Manager.

II. Context

The Bank has both independent evaluation and self-evaluation systems and processes, which are mutually dependent. These systems and processes help the Bank to account for its investment effort, to learn from its experiences, and to monitor and improve its performance. The mandate of independent evaluation resides with IDEV, while that of self-evaluation rests with Bank management including operational complexes.

The Bank’s self-evaluation systems are defined in various Bank documents including the Operations Manual (OM), and policies. They are multi-purpose including monitoring, and judgement of development results, accountability for and learning from development implementation and results. They concern various aspects of the Bank including policies, strategies, programs, projects, processes and systems. They depend on multiple actors, guidelines, processes and tools including the results measurement framework (RMF) at corporate, program and project levels, and the additionality and development outcomes assessment (ADOA) framework. The RMF and ADOA framework are core to the Bank’s self-evaluation systems and processes.


In responding to changing contexts within and outside the Bank, the Bank's self-evaluation function has evolved over time. The Bank adopted its operations manual (OM) in 1993 and revised it in 1999 and again in 2014. The revision of the 2014 OM will start in 2019. In 2013 the Bank replaced its Medium Term Strategy (MTS) with a Ten Year Strategy (TYS), 2013-2022. Since adopting the TYS in 2013, the Bank has gone through major organizational restructuring and adjustments in 2014 and 2016, and changes in policies and in operational and institutional processes, including:

The adoption in 2015 of the High5 priorities within the context of the TYS, leading to the development of appropriate strategies for each of the High5s.

The development and adoption of the New Development and Business Delivery Model (DBDM), comprising five major pillars, in support of the High5s.

The creation of structures such as the Delivery Accountability and Process Efficiency Committee (DAPEC) and the Technical Quality Assurance Committee (TQAC) to improve operational and institutional processes.

Certain aspects of the Bank's self-evaluation function have been the subject of evaluations/reviews. Some of these have been completed, while others are ongoing. They include:

IDEV’s evaluations:

Evaluation of the African Development Bank’s quality assurance across the project cycle (2013-2017), September 2018. This evaluation covers project quality at entry, quality of supervision, quality at exit, and environmental and social safeguards.

Evaluation of the Integrated Safeguards System (ongoing)

Independent evaluation of the Bank's additionality and development outcomes assessment (ADOA) framework (2014)

Synthesis Report – Second Independent Assessment of Quality at Entry in Public Sector Operations (2013)

Independent Evaluation of Quality at Entry for ADF-11 Operations and Strategies, African Development Bank Group (2010)

Project Supervision at the African Development Bank (2001-2008) – An Independent Evaluation (2009)

Country strategy and program evaluations, which include aspects of self-evaluation systems, and processes.

Other evaluations/reviews of the Bank include:

The ongoing independent audit of Bank results monitoring and reporting (RMR), which focuses on adequacy and compliance of Bank policies, procedures, organizational arrangements, monitoring and reporting frameworks, and data management.

An assessment of the Bank's quality assurance tools (2018).

The diagnostic study of AfDB's practices to assure the quality at entry of public sector operations (2018).

Multiple mid-term reviews and completion reports at strategy, program and project levels.

III. Evaluation purpose, objectives, scope and questions

a) Purpose and objectives


The purpose of the evaluation is to support the Bank’s management and operational staff in:

Improving self-evaluation, and performance management of operations, strategies, and policies;

Improving the relevance and quality of the Bank’s operations manual whose revision is to start in 2019;

Promoting learning from experience, and operational effectiveness;

Enhancing the implementation of the New Development and Business Delivery Model (DBDM), and process engineering;

Accounting to the Board of Directors and other stakeholders for the results of the investments in the Bank's self-evaluation systems and processes.

The evaluation will also build on and complement:

IDEV's Evaluation of the African Development Bank's quality assurance across the project cycle (2013-2017), and the Evaluation of the Integrated Safeguards System;

The independent audit of Bank results monitoring and reporting (RMR).

The evaluation’s intended users are primarily staff and Board members of the Bank. The inception phase of the evaluation will clearly define the intended users and uses of the evaluation.

The objectives of the evaluation are to:

Assess how well the Bank’s self-evaluation systems and processes performed, focusing on their relevance, coherence, efficiency, effectiveness, and short-term impact;

Assess the enablers and barriers that affected the design, implementation and results of the Bank’s self-evaluation systems and processes;

Distil lessons, good practices, and recommendations to enable the Bank to enhance the quality and performance (design, scope, implementation and results) of its self-evaluation systems and processes;

b) Scope and questions

The evaluation will focus on the Bank's self-evaluation systems and processes during the period 2013-2018, which covers a substantial part of the implementation of the TYS and recent institutional reforms, including the DBDM and process engineering. This period also corresponds to the 2014 OM.

The evaluation will only cover self-evaluation systems and processes for sovereign and non-sovereign development operations. It will not deal with personnel self-evaluation systems and processes. The evaluation will mainly use the OECD-DAC criteria of relevance, effectiveness and efficiency, focusing on results beyond outputs. It will also assess coherence within the Bank's self-evaluation systems and processes, and with the Bank's independent evaluation function. The evaluation will be forward looking, and its key issues include:

(i) the self-evaluation systems’ stakeholders, and their perspectives and inter-relationships;

(ii) the components, processes and mechanisms of the self-evaluation;


(iii) the enabling environment for, and barriers to, self-evaluation;

(iv) appropriate data collection, analysis and storage tools and systems for self-evaluation;

(v) the use of self-evaluation outputs, especially for decision-making, learning, accountability, and for improving development quality and effectiveness.

The key evaluation questions, presented in the table below, are indicative. They will be refined and finalized during the inception phase of the evaluation.

Criteria Questions

How well have the Bank's self-evaluation systems and processes performed?

Relevance & coherence

To what extent are the self-evaluation systems and processes relevant and coherent?

To what extent are the parts (e.g. structures; instruments; processes; methods; mechanisms) of the self-evaluation systems complete, relevant and credible?

To what extent is the theory of change for the self-evaluation systems explicit, complete, embedded, relevant, and credible?

To what extent are the designs of the self-evaluation systems and processes adequate?

To what extent are the self-evaluation systems and processes aligned with the Bank's key policies, strategies (High5s; TYS), business model (DBDM) and guidance documents (OM)?

To what extent are the self-evaluation systems internally coherent, and coherent with the Bank’s independent evaluation function?

How well have the self-evaluation systems and processes mainstreamed cross-cutting issues (e.g. gender/inclusivity, safeguards, fragility)?

Efficiency

To what extent are the self-evaluation systems and processes efficient and effective, and to what extent do they have an impact on development quality and organizational learning?

How efficient are the self-evaluation systems and processes in producing the desired outputs and credible evidence/information? (Are the self-evaluation systems' results cost-effective?)

How timely are the self-evaluation systems and processes in producing the desired outputs and evidence/information?

To what extent has the Bank deployed adequate resources, including human resources, in its self-evaluation systems and processes? How efficiently were these resources used to achieve the planned results?

To what extent has the Bank invested in good and/or innovative practices in self-evaluation processes, mechanisms, tools and methods?


Effectiveness

To what extent are the self-evaluation systems and processes producing relevant, reliable, timely and useful outputs?

To what extent are the self-evaluation systems' and processes' outputs used for (i) informing decision-making (e.g. programming/improvement; policy, strategy, and program/project design/implementation); (ii) accountability; (iii) learning?

Impact/contribution

What is the evidence of the contribution of the self-evaluation systems and processes to development quality (project, strategy, policy), outcomes, and learning?

What contribution have the self-evaluation systems and processes made to corporate development effectiveness and organizational learning?

Enablers and barriers

What are the factors that have enabled or constrained the design, implementation and results of the self-evaluation systems and processes?

Lessons and recommendations

What relevant lessons, good practices, and recommendations can be drawn for improving the quality and performance of the self-evaluation systems and processes?

IV. Methodology and processes

The IDEV evaluation policy and the Evaluation Cooperation Group’s Big Book on Evaluation Good Practice Standards 2 will guide this evaluation. The evaluation approach will require a reconstruction of the supposed theory of change underlying the Bank’s self-evaluation systems and processes. The supposed theory of change will guide the refinement of the indicative evaluation questions and the development of the evaluation methodological framework. The inception phase of the evaluation will clearly define and detail the most credible methodological framework for responding to the evaluation questions. The methodological approach should employ mixed designs and methods. The data sources, the basis for the evaluation streams of evidence, should include but not be limited to the following:

Desk review of relevant documents/reports and databases, including those of the Bank and other MDBs, and the literature;

Substantive interviews and discussions with key stakeholders within and outside the Bank;

Staff survey (staff; RMC officials);

In-depth case studies, based on appropriate sampling;

Benchmarking with other MDBs and other appropriate agencies.

2 ECG Big Book on evaluation good practice standards, http://www.ecgnet.org/document/ecg-big-book-good-practice-standards. Both documents reflect the standard OECD-DAC development evaluation criteria and quality standards.

The evaluation process will include the following phases:

Inception phase to produce the inception report, which will include the full evaluation methodology (including sampling, evaluation matrix, limitations, risks and mitigations, data collection and analysis tools/instruments, rating scale and standards), evaluation team composition, and responsibilities for each of the individual evaluation team members. This will involve, inter alia, desk reviews and discussions with key stakeholders, rapid assessment of available data, reconstruction of the supposed theory of change, stakeholder mapping, and preparation of the inception report.

Meta-analysis phase covering: (i) evaluations of self-evaluation systems; and (ii) Bank policies, strategies, programs/projects, and the management action record system (MARS). This phase will overlap with the inception phase.

Data collection and analyses for the generation of findings, and the drawing of conclusions, recommendations and lessons learned: This phase will draw on all the data sources highlighted above. It will be the basis for the preparation of the background reports and the evaluation synthesis report. Emerging findings will be shared with stakeholders for feedback.

Synthesis, report writing and feedback leading to the draft evaluation synthesis report and its presentation to the evaluation Reference Group (defined under the quality assurance section below), and other stakeholders for feedback on the draft evaluation findings.

Production and delivery of the final evaluation report in the appropriate format (in English) for dissemination and follow up.

Communication and dissemination of evaluation results

Risks and mitigation actions: The evaluation risks and mitigation actions will be identified at the inception phase by the evaluation team.

Available documents and databases including the following: Bank strategies, policies, project databases, MARS, results measurement frameworks, annual reports, work programmes, progress reports, and self-evaluation guidelines and reports, budget reports, relevant IDEV evaluations/reviews.

V. Deliverables and timeline

The consultant will deliver the following outputs (in English):

Inception report (draft and final).

Background reports (benchmarking; meta-analysis; case studies; survey; interview notes…).

Draft evaluation report and its presentation to the evaluation reference group, and for peer review; the evaluation report will include an executive summary, background and context, evaluation purpose, objectives and questions, key aspects of the methodological approach and limitations, findings, conclusions, lessons and recommendations, and annexes (see Annex 5 for further details).

Final evaluation report including an executive summary of up to two pages and essential annexes.

Technical annexes, including the methodology, its instruments, and evidence.

Electronic version of data collected and evidence set (analyzed data).

The evaluation will have an indicative input duration of 280 person-days over a period of five months; its timeline is presented in the table below. The final evaluation report is scheduled to be completed and delivered in June 2019.

Evaluation phase, delivery and timeline

Phase/Output | Deadline | Responsibility

Contracting phase | December 2018 | IDEV

Inception phase: | Week 1 February 2019 | Consultant & IDEV
- Draft inception report | | Consultant
- Comments on draft report | | IDEV
- Final inception report incorporating comments | | Consultant
- Approval of inception report | | IDEV

Data collection & analysis phase (including literature/document review, interviews, survey, meta-analysis, benchmarking, and case studies) | Week 3 April 2019 | Consultant

Reporting phase: | Week 1 June 2019 | Consultant
- Draft & revised background reports | Week 2 May 2019 |
- Draft evaluation report | Week 2 May 2019 |
- Presentation of draft findings to RG for feedback | Week 3 May |
- Final report incorporating comments/suggestions | Week 1 June |
- Feedback on evaluation process | Week 1 June |

Communication & dissemination phase | From February 2019 | IDEV

VI. Profile of the Evaluation Team (qualifications, experiences and competencies)

Centennial Group International, a firm (the consultant), will undertake the evaluation using a balanced team with demonstrated professional knowledge, skills and experience in:

Evaluation/review/synthesis/benchmarking theories and practices;

Evaluation/monitoring and evaluation systems and processes in international development;

International development work and issues, especially within the contexts of Africa and MDBs;

How MDBs work, and MDB activities including evaluation functions;

Evaluation report writing and presentation;

Fluency in English and working knowledge of French; at least two of the evaluation team members, plus the research assistant, must be fluent in both English and French;

Standard applications and analytical packages.

VII. Management and Quality Assurance Arrangements

An IDEV Task Manager will be responsible for: (i) providing overall guidance to the consultant, and approving the evaluation process and outputs (inception report; background reports; draft and final evaluation reports); (ii) quality assurance processes, including the external peer review of the key evaluation products and receiving comments from the Evaluation Reference Group (ERG); (iii) recruiting the consultant; (iv) briefing the consultant; (v) establishing the ERG; (vi) receiving from the consultant all data and files (including raw data, coded data, interview notes, and databases) that will be produced; (vii) communicating the final evaluation results to the Bank’s Management and Board of Directors, and disseminating them to the key stakeholders; and (viii) ensuring the payment of the consultant. IDEV will also recruit at least two competent and experienced international experts (content-area; evaluation) for the external peer review of the evaluation process and outputs.

The Evaluation Reference Group (ERG) will comprise selected Bank staff from the relevant complexes/departments/units. The ERG will review and comment on the evaluation process and outputs (inception report; evaluation reports), and provide a sounding board for rapid feedback, especially on the evaluation plan (including design and methods) and emerging evaluation findings.

VIII. Evaluation Budget

The evaluation budget will comprise all expenses including fees, travel and taxes. The firm/consultant will provide a detailed budget with breakdown against activities and key milestones.

Annex 2: Interview Template – Executive Directors

The interviews/conversations with the Executive Directors will revolve around the following main questions:

Performance

To what extent do the indicators provided to the Board as an aggregation of the SES, through the RMF and the dashboard, provide sufficient evidence of the performance of the portfolio and the achievement of results?

What is the degree of satisfaction with the RMF and overall corporate reporting? Is the RMF the result of a negotiated reporting template reflecting the requests and wishes of the shareholders, or does it derive from the aggregation of the Bank’s operational evidence and results as normally supported and collected by the SES?

Accountability

Are the SES and its aggregated reporting tools (RMF, etc.) providing the Board with the required comfort that the Bank is holding itself accountable for the achievement of results?

Could a focus on accountability work against the capacity of the SES to be an accurate and trusted tool for verifying and reporting the achievement of results and learning opportunities?

Learning

To what extent is a more direct and structured involvement of independent third parties (including Board members) and specific thematic/country events likely to increase the credibility of the system and learning opportunities?

As far as the Board is concerned, what kind of incentives could be put in place by Management for the SES to be used more strategically to meet knowledge gaps and for lesson learning?

Annex 3: Interview Template – Managers and Staff (HQ and Regional hubs)

Interviews (beyond those for the case studies) would cover the following main points:

Performance

Is the implementation of the SES, through its various reporting instruments, generally in compliance with the main guidance documents? What are the main impediments to a more cost-efficient design and implementation of the SES?

Do the provided ratings reflect a candid, timely and accurate approach to performance management for the various instruments? What is the general perception about accurate reporting on special priorities (gender, fragility, safeguards)? What are the limitations, and how can its use be improved?

Accountability

To what extent is the SES being used as a tool for analysis of possible impediments and corrective action, versus a compliance mechanism against a checklist of requirements?

Is Management exerting leadership and enforcement over the correct implementation of the SES and over the lines of accountability?

To what extent are project performance and staff performance perceived to be related, and is the current incentive structure conducive to candour?

To what extent do you see the SES as a relevant and trusted accountability tool? What are its strengths and weaknesses relative to the inputs (M&E, quality at entry, budget, processes, transaction costs) and to the outputs (IPRs, MTRs, PCRs, etc.)?

Learning

To what extent do the lending pressure, the focus on corporate results reporting for accountability, and the ratings system contribute to weakening staff attention to learning?

What kind of incentives could change behaviours in terms of better documenting and using learning evidence, best practices, lessons learned, ratings follow-up, and innovation?

To what extent is a more direct involvement of independent third parties and specific thematic/country events likely to increase the credibility of the system and learning opportunities?

Annex 4: Survey Template

The survey will be designed using SurveyMonkey (or equivalent) software, with a rating scale from 1 to 6. The number of questions will be kept to a minimum, since response rates rise with simplicity and fall with the time taken to complete (estimated at 8-10 minutes). The survey will be organized along the three main outcomes of the ToC: Performance, Accountability, and Learning, in line with the thrust and the main questions of the evaluation matrix. The survey will also elicit open-ended comments on more generic issues, allowing examination of underlying criteria such as Relevance & Coherence, Cost-efficiency, Effectiveness & Impact Contribution, and Incentives.

Performance

Is the application of the SES and the production of the different reports and documents aligned with the Bank’s strategy and the guidance provided?

To what extent do the SES and its instruments provide a reliable, timely and cost-effective framework for portfolio management?

Do the SES contribute to improved quality at exit (project closing)?

Are the incentives in place conducive to candid assessments and proactivity for corrective action and portfolio performance?

Accountability

Do the SES provide a reliable, timely and cost-effective framework for verification of results, reporting and accountability at: (i) staff/management level; (ii) corporate level?

Is the degree of accountability generated by the SES conducive to improved results?

In which direction are the incentives around the SES mostly influencing behaviour: (i) corrective and timely action for improving quality of results; (ii) fear for staff reputation and risk aversion; (iii) little need for accountability if no enforcement; (iv) compliance and box-ticking; (v) avoiding a disconnect with IDEV validation; (vi) corporate reporting (RMF); (vii) learning?

Learning

Are the SES being used as a reliable and relevant framework for learning and innovation?

Is the money spent on the SES commensurate with the amount of learning and innovation that it generates?

Are the SES contributing to the identification and use of lessons learned for follow-up projects, CSPs, RISPs, and thematic reviews?

Open-ended questions

What would be the main incentives in the use of the SES that would influence staff behaviour towards better compliance with reporting requirements, candid assessment, and the achievement of results? Please elaborate.

What are the main impediments to increasing the relevance of the SES, in its design and implementation? Please elaborate.

Do you think there are redundancies or gaps in the use of the SES? What would be your recommendation with respect to improving cost-efficiency? Please elaborate.

How useful are the SES in addressing specific priorities such as gender, fragility, and safeguards?

How can issues around compliance-checking versus proactive follow-up on problems identified during supervision be addressed, and how does this impact staff capacity to call on the right expertise to support the borrower’s implementation capacity?
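As an illustration of how the 1-to-6 ratings might be aggregated once responses are in, the sketch below computes a mean rating and a favourable-response share per ToC theme. The data, the >= 4 "favourable" threshold, and the function names are hypothetical assumptions for illustration, not part of the survey design itself.

```python
from statistics import mean

# Hypothetical responses: each theme maps to a list of 1-6 ratings.
# (Illustrative data only; real responses would come from the survey export.)
responses = {
    "Performance":    [5, 4, 6, 3, 5],
    "Accountability": [4, 4, 5, 2, 3],
    "Learning":       [3, 2, 4, 3, 2],
}

def summarize(ratings):
    """Return (mean rating, share of 'favourable' answers), treating >= 4
    on the 1-6 scale as favourable (an assumed cut-off)."""
    favourable = sum(1 for r in ratings if r >= 4) / len(ratings)
    return round(mean(ratings), 2), round(favourable, 2)

# One (mean, favourable-share) pair per ToC outcome.
summary = {theme: summarize(r) for theme, r in responses.items()}
```

A summary of this shape can then be read side by side with the open-ended comments when rating the underlying criteria.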

Annex 5: Report Outline

Executive Summary

1. Background and Context
2. Purpose, Objectives and Scope
3. Methodology
   i. Evaluation Methods
   ii. Theory of Change
   iii. Limitations
   iv. Evaluation Questions
4. Architecture and cost of the Bank’s Self-Evaluation Systems vs those of comparators
5. Main SES characteristics and findings related to:
   i. Performance Management
   ii. Promoting Accountability
   iii. Learning
6. Conclusions, Lessons and Recommendations
   i. Findings
   ii. Recommendations

Annexes

1. Terms of Reference
2. Evaluation Matrix
3. Comparator Review
4. Executive Directors Interview Results
5. Staff and Managers Interview Results
6. Case Studies
7. Survey Results