
Good Practice Standards on Country Strategy and Program Evaluation

Background


Rationale for MDB Country-Level Evaluation

Country strategy and program evaluations (CSPEs)[1] seek to describe and explain the performance of a multilateral development bank (MDB) at the country level. They question whether the country program did the right things, in terms of whether its design and implementation were right for the circumstances of the country. They ask not just "Did the country program work?" but "What made it work or fail?" and "How can we make it better?" Because they usually evaluate both completed and ongoing operations, are forward-looking in nature, and may generate controversy, CSPEs tend to engage evaluation clients more than other forms of independent evaluation. Typically, they have been among the more influential types of evaluation. Consequently, they play an increasingly important role in the work programs of the independent evaluation offices that conduct them.

CSPEs undertaken by MDBs are major and often costly evaluation exercises. They are classified as higher-level evaluations because of their focus on strategic issues and because they build on the findings of evaluations of projects, programs, and sector or thematic issues of concern. A 1999 review of MDB evaluation experience describes the benefits of CSPEs:

(i) CSPEs can identify and assess broad and long-term issues and concerns better than other forms of evaluation;

(ii) they provide valuable information about the country strategy process, including whether project selection was based on merit and the impact of non-project forms of assistance, while aggregating the results of activities across all sectors and providing input into, and strengthening, subsequent country strategies;

(iii) CSPEs are better able to identify overall program and project delivery weaknesses, institutional difficulties, capacity utilization constraints, the borrower's acceptance of, commitment to, and compliance with conditions, and the impact of other aid agencies and external factors;

(iv) they provide a framework for rating overall performance in meeting development goals and objectives, and better assess impact and sustainability issues for long-term aid effectiveness; and

(v) they provide a valuable instrument for improving aid coordination among institutions and bilateral agencies and for the broader participation goal of increasing the role of national and local governments, civil society, and the private sector in the development process.[2]

As the locus of MDB assistance shifts from individual projects toward country-based strategies, programs, and interventions having economy-wide effects, the country becomes the most logical unit of aid management and accountability. Adoption of similar goals for development assistance (e.g., the Millennium Development Goals [MDGs]) and agreements to harmonize and align assistance with national poverty reduction strategies also make country-level evaluation of external assistance imperative.

CSPE Process

CSPEs differ by purpose, by depth, and by the entity undertaking the evaluation. Within the MDBs, country assistance is typically evaluated as part of the preparation of new country strategies, both by the operational teams involved in preparing the country strategies and by the independent evaluation offices.[3] Most independent CSPEs undertaken by MDBs would be categorized as in-depth evaluation exercises or full-fledged CSPEs. These are most suitable and rewarding when there is something of value to learn through an in-depth assessment and an opportunity to make use of the findings. This would include cases in which

  • a diverse portfolio of MDB assistance has been provided over an extended period,
  • activities are sufficiently mature to be able to identify and/or anticipate results,
  • government external assistance policies or aid agency assistance strategies are being formulated, and/or
  • the lessons gleaned from the particular country case are expected to be of interest to other MDB member countries.

GPS Formulation Process

In October 2005, as part of the Evaluation Cooperation Group's (ECG) ongoing effort to harmonize MDB evaluations, and consistent with the commitment to the OECD–DAC, ECG members declared their intention to prepare a set of GPS for the evaluation of country strategies and programs. The formulation benefited from consultations with MDB country evaluators and users and was also informed by experience from specific country evaluations that provided an understanding of good practices from the perspective of country users in government, resident missions, development partners, and other stakeholders. The GPS were completed and presented to the ECG in 2008.

Purpose of GPS

The standards aim to

  • contribute to the ECG objectives of harmonization of evaluation criteria and processes,
  • help MDBs link evaluation and operational standards in pursuit of corporate missions and objectives, and
  • assist in learning from experience among MDBs for improved results.

Guiding Framework

The GPS have been developed within the general framework of the OECD–DAC evaluation principles, and they draw on the findings of an ECG review of CSPEs undertaken in 2007.[4] The GPS also build on the foundation of good evaluation practices that have already been identified and endorsed by the ECG in its GPS for public sector operations and GPS for private sector operations. More specifically, those GPS established for the organization and governance of the MDB independent evaluation process, as set forth in the 2002 Good Practice Standards for Evaluation of MDB-Supported Public Sector Operations, will likewise apply to CSPEs. Consequently, those GPS are not repeated in this chapter.

The identified GPS on CSPEs are classified as "core" GPS (C-GPS) and "optional" GPS (O-GPS). A core GPS is defined as one that establishes the key principles for CSPEs and is necessary to permit comparability of evaluation results, to the extent possible, among MDBs. While the core GPS listed in this paper are already in practice to some extent in all member institutions, institutional differences may affect the pace at which harmonization can be achieved. An optional GPS is defined as one that is not strictly needed for comparability but is nonetheless designed to help improve accountability and learning within each institution. These GPS comprise a combined total of 86 C-GPS and O-GPS, grouped into 16 standards consisting of 50 elements. A summary of the standards, elements, and number of C-GPS and O-GPS is presented in the table below.

Summary of Standards and Elements and Number of C-GPS and O-GPS on CSPEs

(The number in parentheses after each element is the number of C-GPS and O-GPS* it contains.)

A. Process-Related GPS

  1. CSPE Goals, Objectives, Client Responsiveness, and Unit of Analysis
       A. CSPE Goal (1)
       B. Objectives (1)
       C. Client Responsiveness (1)
       D. Unit of Analysis (1)
  2. Country Selection and Mutual Accountability
       A. Country Selection (2)
       B. Joint CSPEs (1)
       C. Mutual Accountability (2)
  3. Timing
       A. Timely CSPEs (2)
  4. Advance Preparations
       A. Preparation Steps (1)
       B. Sector/Thematic Studies (2)
  5. Coverage
       A. Time Period (2)
       B. Product and Service Coverage (2)
       C. Second- or Third-Generation CSPEs (2)
       D. Limited-Scope CSPEs (2)
       E. Validation Reports (1)
  6. Approach Paper for CSPEs
       A. Specific Evaluation Approach (1)
  7. Preparation Period
       A. CSPE Implementation Period (1)
  8. Staffing
       A. Evaluation Team (2)
  9. Guidelines
       A. Uniform Guidelines, Quality Control, and Appropriateness (3)

B. Methodology-Related GPS

  10. Methods and Approaches for CSPEs
       A. Overview (6)
       B. Evaluation Questions (2)
       C. Counterfactuals (2)
       D. Attribution and Contribution (2)
       E. Evaluability (2)
       F. Multiple Evidence Sources (2)
       G. Client Participation (1)
       H. Disclaimers (1)
  11. Evaluation Criteria for CSPEs
       A. Relevance (6)
       B. Positioning
       C. Coherence
       D. Efficiency (1)
       E. Effectiveness (3)
       F. Sustainability (1)
       G. Impact (1)
       H. Institutional Development (1)
       I. Borrower Performance (1)
       J. MDB Performance (1)
       K. Partnership and Harmonization (1)
  12. Performance Rating
       A. Rating Principles and Comparability (5)
       B. Rating Criteria (3)
       C. Rating Subcriteria (1)
       D. Weighting Criteria (1)

C. Reporting-Related GPS

  13. Findings, Lessons, and Recommendations
       A. Findings and Lessons (2)
       B. CSPE Recommendations (1)
  14. Reporting and Review
       A. Reporting (3)
       B. CSPE Review (1)
  15. Making Findings Available
       A. Disclosure (2)
       B. Dissemination (1)
  16. Generalizing Findings and Tracking Recommendations
       A. Generalizing CSPE Findings (1)
       B. Tracking Recommendations (1)

Total no. of standards: 16
Total no. of elements: 50
Total no. of C-GPS and O-GPS: 86

* A core GPS (C-GPS) is defined as one that establishes the key principles for CSPEs and is necessary to permit comparability of evaluation results, to the extent possible, among MDBs. An optional GPS (O-GPS) is defined as one that is not strictly needed for comparability but is nonetheless designed to help improve accountability and learning within each institution.

Source: GPS on CSPEs in next Section.

Application

The GPS pertain to the evaluation of country strategies and programs of both public and private sector-oriented MDBs, since both strike a balance in their evaluations between "bottom-up" project-level evaluations and "top-down" assessments of business climate quality and the macroenvironment. It is also acknowledged that there are some differences between the CSPEs undertaken by public sector-oriented and private sector-oriented MDBs. The private sector-oriented MDBs have financial return objectives that must pass the market test; they have far fewer assistance instruments aimed at country-wide effects; their operations depend largely on market demand; and their corporate and country strategies tend to be illustrative of the range of activities in which their banks wish to engage. Consequently, their evaluations include more analysis of the performance determinants, outcomes, and impacts of projects and technical cooperation activities. Moreover, private sector-oriented MDBs are highly exposed to market fluctuations, and they monitor their overall project portfolios frequently for accounting and financial reporting purposes. The focus of private sector-oriented MDB CSPEs should therefore be more on lessons identified from strategy impact assessments, such as environmental impacts, broader private sector development impacts, transition impacts, and economic/social impacts in the immediate area of the various projects.

Updating GPS

CSPE methods, approaches, rating criteria, and their application will continue to evolve over time. Adoption of results-based monitoring and evaluation systems in partner countries, and improvements in both the self-evaluation and independent evaluation of MDB operations, sector and thematic studies, special studies, and impact evaluations, will influence the nature of the evaluation data base upon which CSPEs are built. It is envisaged, therefore, that the GPS will require periodic stocktaking and updating. As members reach further agreements on CSPE methods, approaches, criteria, rating standards, and applications to special CSPE cases, they will document them in subsequent refinements of these GPS.

 


[1] Use of the term "country” does not imply any judgment by the authors or the Evaluation Cooperation Group as to the legal or other status of any territorial entity (Source: p. ii of GPS on CSPE, 2008).

[2] OECD-DAC. 1999. Evaluating Country Programmes. Report of the Vienna Workshop. Paris. p. 115.

[3] Self-evaluation of country assistance is briefly discussed in Chapter VI.

[4]  Tabor, Steven and Suganya Hutaserani. 2007. Phase I Background Report for the Preparation of GPS for CSPEs.

Good Practice Standards on Independent Country Strategy and Program Evaluations

 Process-Related GPS

Each entry below gives the GPS category (standards and elements), followed by the core GPS, the optional GPS, and notes, where applicable.

A.1. Country Strategy and Program Evaluation (CSPE) Goals, Objectives, Client Responsiveness, and Unit of Analysis

CSPE Goal

(a)   The main goal of a multilateral development bank (MDB) CSPE is to provide information on MDB performance at the country level that is credible and useful and enables the incorporation of lessons and recommendations that can be used to improve the development effectiveness of the MDB's overall strategy and program of country assistance.

 

 

Objectives

(b)   CSPEs are used for both accountability and lesson-learning purposes in the MDBs.

 

        They provide an accounting to the MDB's board of directors regarding the results achieved from MDB assistance in a country over an extended period of time. CSPEs also serve as an important learning experience by drawing on evaluation results to engage in a constructive dialogue on what could be done to improve the effectiveness of an MDB's assistance program in the future.

 

 

Client Responsiveness

(c)   CSPEs are designed to meet the information requirements of the main target clients, which would generally be the board, senior management, and relevant operations personnel within the country department.

 

        Identifying the government as the main target client is also a good practice, because the government will need information on past assistance performance if it is to demand better service from the MDB.

 

 

Unit of Analysis

(d)   CSPEs focus on evaluating the results of MDB assistance. They take the country as the unit of analysis and attempt to evaluate MDB assistance to the country using already prepared country strategy(ies) as a point of reference.

 

 CSPEs do not evaluate the performance of a government or the progress of a country, although a CSPE may draw on country progress indicators to assess the performance of the assistance program.

A.2. Country Selection and Mutual Accountability

Country Selection

(a)   In practice, certain strategies and programs in some countries warrant more attention than others. Faced with limited evaluation resources, it is best to select those countries and programs for CSPEs where the findings and lessons will be most beneficial to the MDB and to the country.

(a)   Factors such as portfolio size, country development characteristics, and the likely relevance of the evaluation findings to similar issues in other member countries should be considered in making the selection of countries for which a CSPE is to be undertaken.

 

        It is desirable to treat each borrowing member equally, and hence to make an effort to undertake CSPEs for all countries to which an institution provides assistance.

 

Joint CSPEs

(b)   Increasingly, evaluation on a broader scale than the traditional project, sector, or thematic levels will be required, not only to assess results at the country level but also to look more closely at the role of the different institutions in the process. Joint or multi-aid agency CSPEs can provide broader perspective while fostering cross-agency learning and reducing evaluation transaction costs for in-country stakeholders.

 

        While the situation varies in each case, MDBs should endeavor to reduce potential bottlenecks to undertaking joint CSPEs within each institution.

 

As of 2008, the vast majority of CSPEs had been undertaken by individual MDBs. Only a handful had been undertaken jointly by two MDBs, or by MDBs and other development partners. In many cases, joint CSPEs between MDBs have been conducted as parallel exercises, with separate reports. The main benefit of such a joint activity is the reduction in the burden and cost for the recipients.

Mutual Accountability

(c)   While some bottlenecks are outside of the control of the evaluators (e.g., different reporting requirements or different country strategy timing), the broader efforts to foster MDB harmonization (e.g., joint MDB country strategies or pooled funding arrangements) are likely to make it more feasible to undertake multipartner CSPEs in the medium term.

 

        While multipartner CSPEs are recommended, the decision on whether or not to join forces with partners in a CSPE is best made on a case-by-case basis.

(b)   There is also a need for multipartner evaluations of country assistance extending beyond MDBs to include all sources of external assistance, for which the evaluation challenges are significantly greater. Multipartner evaluations of the totality of country assistance should be encouraged. To the extent possible, the GPS set forth in this report will be applied in such joint evaluation exercises.

 

A.3. Timing

Timely CSPEs

(a)   A CSPE should be timed to permit the results, recommendations, and lessons identified to feed into the preparation of the next MDB country strategy and to be available to management and the executive board in time for reviewing or approving the new strategy.

(a)   Optionally, the results of a CSPE could be provided at a time when the government is willing to make strategic decisions about the use of external assistance.

 

A.4. Advance Preparations

 Preparatory Steps

(a)   CSPEs build on the existing stock of MDB self- and independent evaluations.

 

Operational personnel should also be encouraged to prepare self-evaluations in a timely manner.

Sector/Thematic Studies

 

(a)   At the discretion of each evaluation unit, sector or thematic studies, special evaluations, or impact assessments may be undertaken to prepare for a CSPE.

 

        If sector or thematic evaluations are undertaken in advance of a CSPE, then it is advisable to issue these as separate reports and to discuss them with the government agencies responsible for the particular sectors or thematic areas.

 

 

 

(b)   Application of the same evaluation criteria and rating systems at the sector/thematic level as those to be used for the CSPE facilitates the aggregation of sector/thematic assessments at the country level.

 

A.5. Coverage

Time Period

(a)   CSPEs should cover a period of assistance that is long enough to witness development results, while providing more emphasis on evaluating recent performance during the current strategy period to ensure that the findings are operationally relevant.

 

 

 

 

(b)   Newly initiated, completed, and ongoing assistance activities will be covered in an MDB CSPE.

 

 

Product and Service Coverage

(c)   A CSPE will cover the full content of the MDB's program of engagement with the country over the relevant time period.

 

        It will cover a series of MDB strategies and assistance in projects, programs, technical assistance, economic and sector work, and knowledge products, as well as nonfinancial services, including the role that MDB assistance plays in policy dialogue; processes used in addressing issues in the execution of the program; and those used in coordinating, harmonizing, and catalyzing assistance from other development partners, the private sector, and civil society.

(a)   In large country cases in which there are too many interventions to cover all of them, a CSPE will draw its inferences from a purposeful sample of an MDB's assistance activities that is representative of the main thrusts of the MDB's strategy and program of assistance.

 

 

(d)   By necessity, some areas will be covered in more depth than others. Those areas of focus should be determined based on client needs and on the areas of past programs that can evoke the most important lessons for future strategy.

 

 

Second- or Third Generation CSPEs

(e)   Second- and third-generation CSPEs summarize the findings from previous CSPEs and take stock of the extent to which the lessons and recommendations of the earlier CSPEs were utilized.

 

 

 

(f)    Coverage of the second- (or third-) generation CSPE will overlap with the previous CSPE by a period of a few years to validate end-of-period assessments and to provide continuity with the previous evaluation.

 

 

Limited Scope CSPEs

(g)   A limited-scope CSPE may be warranted when an MDB's role in the country is quite minor, when there are likely to have been few results achieved during the CSPE period, or when there is little likelihood of findings and lessons from the CSPE going beyond what is already known from existing project and program evaluations.

(b)   A limited-scope CSPE may also be needed to deliver evaluation findings to meet tight time-sensitive demands.

While recognizing that a full performance assessment of a complex assistance program should not be undertaken in a superficial manner, in special cases a limited-scope CSPE may be appropriate.

Validation Reports

(h)   A validation report of a self-evaluation report can be treated as a special category of a limited-scope CSPE.

 

        In addition, validation of a country-level self-evaluation can serve to assess whether or not a full CSPE is required to investigate more deeply issues raised in the completion report.

 

        Independent validation of the completion reports should be undertaken to encourage internal consistency in the evaluations (often between indicators and evaluative judgments) and can be used to assess the adequacy of the documentation and performance ratings.

 

 

A.6. Approach Paper for CSPE

Specific Evaluation Approach

(a)   A CSPE approach (or position) paper will be prepared to define the country-specific evaluation approach, to set out the main evaluation parameters, and to brief the evaluation team and stakeholders within the MDB and the government.

 

 

A.7. Preparation Period

CSPE Implementation Period

(a)   After the CSPE approach/position paper is approved, an in-depth CSPE will generally be implemented over a period of 6–12 months for data collection, analysis, reporting, and review.

 

        This should provide sufficient time for an in-depth review of secondary materials and for field visits, while ensuring that findings are delivered in a timely manner.

 

 

A.8. Staffing

Evaluation Team

(a)   An MDB CSPE will generally be led by an experienced evaluator with sufficient experience in MDB operations to understand well the processes involved in formulating country strategies and assistance programs.

(a)   To the extent that resources permit, a multidisciplinary team will be employed to undertake the CSPE.

 

A.9. Guidelines

Uniform Guidelines, Quality Control, and Appropriateness

(a)   CSPE guidelines will be prepared by each MDB. While the guidelines should provide some latitude to tailor CSPE methods, coverage, and approach to diverse country circumstances, a uniform set of guidelines will be used to explain the CSPE, as an evaluation instrument, to stakeholders in the MDB, the country, and elsewhere. The guidelines will serve to establish a core set of CSPE goals and objectives, methods, evaluative criteria, evaluation questions, procedures, reporting formats, quality control processes, and outreach and dissemination arrangements.

 

 

 

(b)   If a formal rating is included, then the guidelines should clearly specify the rating criteria and performance assessment methodology. Quality control processes should ensure that the principles set out in the guidelines are strictly adhered to so that performance assessments and other findings will be comparable across CSPEs.

 

 

 

(c)   While the principles set out in the CSPE guidelines should be strictly adhered to, the detailed scope, methods, and approach may need to be tailored to diverse country circumstances and to equally diverse assistance roles that the MDBs play.

 

 

 

Good Practice Standards on Independent Country Strategy and Program Evaluations

Methodology-Related GPS

Each entry below gives the GPS category (standards and elements), followed by the core GPS, the optional GPS, and notes, where applicable.

B.1. Methods and Approaches for Country Strategy and Program Evaluations (CSPE)

Overview

(a)   A CSPE is premised on the assumption that a series of multilateral development bank (MDB) country strategies and programs can be disaggregated into a contextual diagnosis, strategic and programmatic objectives, and an intervention logic that is amenable to formal evaluation. A typical MDB CSPE exercise begins with an effort to make explicit the causal model implicit in the design of the assistance program. It includes a contextual analysis to identify program objectives; assess the validity of the MDB's diagnosis (in terms of the relevance of the objectives); and examine the relevance of the MDB's strategy toward meeting the objectives, including the definition and delivery of the lending and nonlending assistance program.

 

 

 

(b)   Top-down, bottom-up, and attribution-cum-MDB contribution assessments will be used to garner evidence on the extent to which strategic objectives were achieved and to test the consistency of evaluation findings.

 

 

 

(c)   The evidence base will then be analyzed, using various techniques, to identify performance determinants and to examine the contribution made by the MDB to the achievement of development results.

 

 

 

(d)   A set of evaluative criteria is applied to the evidence base to rate or otherwise reach an evaluative judgment about the performance of the country assistance in meeting its goals and objectives (see "B.2. Evaluation Criteria for CSPEs" below).

 

 

 

(e)   Key findings and lessons are drawn from the performance assessment and provide the foundation for future-oriented recommendations.

 

 

 

(f)    In MDB CSPE reports, the methodology used is clearly explained to ensure common understanding and to avoid disputes.

 

 

Evaluation Questions

(a)   A number of fundamental evaluation questions are defined to guide the assessment of country strategy and program performance. These will include both questions that are standard to all CSPEs, as well as those defined for the specific country case.

 

 

 

(b)   The CSPE is expected to provide evidence-based answers to these questions. At the discretion of each evaluation unit, standard questions may be similar to the following:

  • Were the MDB's strategy and program relevant to the development challenges facing the country?

  • Were suitable instruments of assistance selected to achieve strategic priorities?

  • Did the MDB assistance achieve its desired objectives? If so, were they achieved efficiently?

  • Are these achievements sustainable over time?

  • Was the MDB's assistance effective in producing results, both at the level of individual interventions and at the level of the program as a whole?

  • What is the overall impact of the MDB's assistance, for example on the economy, on poverty reduction, and on the MDGs?

  • Did the MDB's assistance contribute to outcomes that will improve the country's capacity to manage the economy, combat poverty, and foster sustainable socioeconomic development?

  • Was there a suitable division of labor, and were there effective coordination arrangements with other development partners?

Both the general and the evaluation-specific questions will be documented in the CSPE report so that readers can judge whether the evaluation team has sufficiently assessed them.

 

 

Counterfactuals

 

(a)   The most accurate measure of an MDB's contribution is a comparison of the situation prevailing with and without its assistance. In practice, such counterfactuals are difficult to derive and defend for a country program as a whole. These should be used only when they are possible and defensible.

Separate impact evaluations are generally not conducted as part of a CSPE because of the cost, time required, and the limited extent to which the findings can be generalized.

 

 

(b)    In some instances, comparison with similar countries can be used as a counterfactual, although these tend to compare performance across countries and not across assistance program outcomes. It may, however, be possible to derive reasonable counterfactuals for specific components of an assistance program, such as cases in which one region was assisted and others were not, or when formal impact evaluations have been undertaken in advance of the CSPE.

 

Attribution and Contribution

(a)  Formal attribution (i.e., separating the MDB's role from that of other internal or external players) is extremely difficult in a CSPE because of the multiplicity of factors that affect development outcomes and impacts at the country level. Therefore, the assessment of program results will focus on determining whether the MDB has made a contribution to key results or outcomes that is both plausible and meaningful, and identifying the main drivers of the outcomes.

A plausible association of MDB assistance with development results can be assessed by:

  • characterizing the role played by the MDB in the sector or thematic domain (i.e., lead MDB, main policy interlocutor);

  • examining the policies and actions of other major development partners for consistency with those of the MDB; and

  • examining evidence that the main outcomes were not achieved primarily due to exogenous events.

(a)         In addition, CSPEs will attempt to characterize the nature of the MDB's contribution to results by assessing the extent to which MDB assistance delivered additional value beyond the financing provided.

 

Evaluability

(a)  Evaluability, at the country level, is a measure of how well a proposed strategy or program sets out criteria and metrics to be used in its subsequent evaluation. A CSPE will include an assessment of the evaluability of the country strategy(ies) and program(s) of assistance.

 

Various factors influence the evaluability of country assistance, including the quality of the country diagnostic; the link between that diagnostic and the intervention logic; and the degree to which targets and indicators were specified ex-ante, baseline information was collected, outcomes were monitored, and results were reported.

 

(b)     Evaluability of country strategies and assistance programs can be a serious problem, especially if country strategies are very broad and have goals and indicators far removed from an MDB's contribution; if the intervention logic is not well defined; or if there are large backlogs of projects that should, but do not, have project completion reports.

 

Evaluability constraints can be overcome by:

  • reviewing strategy, program, and project documents to reconstruct program objectives, indicators, and/or baselines;

  • retrofitting results frameworks from the reconstructed program logic;

  • undertaking sector reviews to assess performance of completed and ongoing operations;

  • collecting before-and-after performance evidence from executing agencies; project files; and, in selected cases, beneficiary surveys; and

  • concentrating the analysis on key trends in assistance performance for which data exist.

 

 

Multiple Evidence Sources

(a)   CSPEs examine quantitative and qualitative evidence from a wide range of both primary and secondary data sources. Differences in the evidence base need to be carefully reconciled and explained. The aim should be to obtain the widest possible breadth of information, to analyze the evidence carefully, and to base findings on information that has been successfully validated from multiple sources.

 

(a)   Formal sample surveys, while less common, can also be used to assess project performance, to solicit feedback on the responsiveness of the MDB to key government agencies, and to assess the quality of the MDB's performance as a development partner. Client perception surveys can also be used to provide valuable evidence about MDB performance.

Secondary data include documentation from the MDB and other development partners, government, research institutions, and other outside sources.

Primary data are drawn from various sources, including:

·         interviews with key stakeholders, which are used to validate the key findings and reveal the reasons for particular patterns of performance;

·         focus group discussions, which are used to address specific issues or obtain beneficiary views; and

·         field visits to project sites, which are sometimes included to crosscheck information obtained from project files and government reports.

Client Participation

(a)  Client participation in the CSPE process encourages respect for the fairness and objectivity of the CSPE, and contributes to early buy-in of the key results and recommendations. MDB CSPEs will endeavor to involve key stakeholders in the CSPE process from the design of the evaluation through its execution to the discussion of its key findings.

 

However, MDB CSPEs are independent evaluations, so they are not conducted jointly with the country.

Disclaimers

(a)  Given the breadth and complexity of the task, and the possible weaknesses in the evidence base, there is only so much that any CSPE can conclusively evaluate. Therefore, the limitations of the CSPE methodology, and its application, should be frankly acknowledged in the evaluation report.

This would include factors impinging on the accuracy of the performance assessment and the breadth and depth of the evidence base upon which performance assessments are drawn. This also makes it possible for evaluation clients to establish the degree of precision with which CSPE findings can be interpreted.

 

 

 

B.2. Evaluation Criteria for CSPEs

Overview

(a)  The performance of a country assistance strategy and program of assistance should be formally assessed using a set of well-defined evaluation criteria. The standard evaluation criteria that are applied to projects and programs can be interpreted and applied to the evaluation of country assistance. For harmonization purposes, relevance, efficiency, effectiveness, sustainability, and impact are considered mandatory criteria.

(a)  Positioning, coherence, institutional development, borrower performance, an MDB's performance, and partner coordination are optional criteria.

 

Relevance, Positioning, and Coherence

(a)  A diagnosis of the evolving country context is used to assess the extent to which an MDB's strategic objectives and assistance program were relevant to the critical constraints affecting the country's long-term socioeconomic development and to the government's policies and strategic priorities, in light of other development partners' strategies. It is also used to assess the consistency of the MDB's program with its strategy.

(a)  The processes used to maintain relevance, such as an MDB's research and policy dialogue, may also be assessed.

Relevance refers to the degree to which the design and objectives of an MDB's strategy and program of assistance were consistent with the needs of the country and with the government's development plans and priorities.

 

 

(b.1)   Positioning may be used to evaluate the design of the country assistance strategy and program.

Several subcriteria have been used to assess the extent to which an MDB's assistance was positioned appropriately, including the extent to which assistance:

·         was concentrated in areas of an MDB's evolved comparative advantage;

·         built on lessons of past experience; and

·         was selective/focused on a few sectors to reduce transaction costs and provided a sufficient quantum of assistance in any one area.

Positioning is a measure of how well an MDB responded to (or even anticipated) the evolving development challenges and priorities of the government, built on its comparative advantage, and designed the country strategies and programs in a manner that took into consideration the support available from other development partners.

 

 

(b.2)   Coherence may be used to evaluate the design of the country assistance strategy and program.

 

Coherence may be examined along three dimensions:

·         definition of programmatic focus in terms of anticipated results,

·         integration across an MDB's instruments in support of program objectives, and

·         specification of the division of labor with other development partners.

 

Coherence refers to the extent to which there were measures aimed at fostering internal and external synergies within an MDB's program. This can include complementarity between different program elements, the extent to which policies of an MDB are self-reinforcing, and the extent to which external partnerships promote an efficient and effective division of labor in providing assistance that allows for complementarities and synergies with other development partners' programs.

Efficiency

(a)   Measuring efficiency is difficult at the overall country program level because of the difficulty of estimating the combined benefit flows of various categories of an MDB's assistance (i.e., policy support, capacity building, or aid coordination). Instead, CSPEs typically draw on proxy indicators of the efficiency of an MDB's support in comparison to cost. This may include indicators related to project/program implementation, for example, of planned versus actual commitments, disbursement patterns, project supervision, projects at risk, design and supervision coefficients, monitoring and evaluation arrangements, implementation problems and their resolution, and other factors affecting program implementation.

Ratings accorded to projects, programs, and technical assistance are also used as a proxy for returns-on-investment and timely delivery of services, while economic internal rates of return for major investments may also be reviewed. Various proxies for transaction costs to the government may be assembled and analyzed, including the number of missions per year; the proportion of time that senior government officials devoted to servicing an MDB's missions; and the average amount of time that executing agencies have allocated to the design, implementation, monitoring, and evaluation of MDB-supported assistance activities. Factors affecting the efficiency with which resources are used are identified in an MDB's CSPEs.

 

 

Efficiency refers to the extent to which the design and delivery of assistance were cost-effective.

Effectiveness

(a)   Outcomes are assessed in a CSPE with respect to program objectives at different levels; across similar lending and nonlending projects; within key sectors and/or thematic thrusts; and at broader institutional, macroeconomic, and socioeconomic levels.

Drawing primarily on a (bottom-up) analysis of cumulative program performance, CSPEs assess achievement of results both in terms of the extent to which strategic outcomes were achieved, and the extent to which sufficient development progress was made.

Results are generally compared in three ways:

·         before and after the country assistance period being reviewed;

·         between the country and similar countries (within the same region or at a similar level of development), as appropriate; or

·         benchmarking against any absolute standards (e.g., the MDGs, costs of capital, rates of return).
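The three comparison modes above can be sketched with hypothetical indicator data (the indicator, peer countries, and all values are invented for illustration, not drawn from any actual CSPE):

```python
# Illustrative sketch of the three CSPE results-comparison modes for a
# single outcome indicator. All figures are hypothetical.

baseline = 62.0           # indicator value before the assistance period
endline = 71.0            # indicator value at the end of the period
peers = {"Peer A": 68.0, "Peer B": 74.0}   # similar countries
benchmark = 75.0          # absolute standard (e.g., an MDG-style target)

# 1. Before-and-after comparison over the assistance period
change = endline - baseline

# 2. Cross-country comparison against the peer average
peer_avg = sum(peers.values()) / len(peers)
vs_peers = endline - peer_avg

# 3. Benchmarking against an absolute standard
gap_to_benchmark = benchmark - endline

print(f"Change over period: {change:+.1f}")
print(f"Versus peer average: {vs_peers:+.1f}")
print(f"Remaining gap to benchmark: {gap_to_benchmark:.1f}")
```

In practice each mode answers a different evaluative question: the first measures progress, the second contextualizes it, and the third measures distance from a target.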

(a)   CSPEs are also uniquely suited to assess the suitability of an MDB's policies in different country contexts, such as compliance and results of safeguard policies, financial management policies, decentralization, human resource policies, relations with civil society, cofinancing policies, adequacy of an MDB's instruments, and responsiveness of an MDB's services to country-specific assistance requirements. Not all MDB's policies can be assessed in all country cases. In an MDB's CSPEs, a distinction will be drawn between those policies whose coverage is mandatory and those whose coverage is optional.

Effectiveness refers to the extent to which the assistance instruments achieved their stated intentions and objectives.

 

(b)  The determinants of an MDB's performance in attaining strategic objectives are identified in the CSPE report.

 

 

Sustainability

(a)  The degree to which the results of an MDB's assistance are likely to be sustained after the conclusion of the program will be covered by examining the degree to which past interventions have been sustained, identifying risks that could affect benefit flows, and assessing the extent to which policies are in place to mitigate such risks.

In assessing the sustainability of benefit flows, a key issue is the extent to which adequate institutional arrangements have been established to further the implementation of program-supported measures. Similarly, factors that negatively affect sustainability, such as fiscal distress or insufficient attention to recurrent financing, may also be assessed.

 

Sustainability refers to the likelihood that actual and anticipated results will be resilient to risks beyond the program period.

Impact

(a)  Impact is generally assessed with reference to an MDB's contribution to the attainment of specified development goals (i.e., macroeconomic balance, socioeconomic conditions, transition impact, MDGs, and other specified national poverty reduction goals and objectives) and to the individual contribution of an MDB's assistance to the national and/or sector-specific impact objectives established during the programming process.

Program impacts will most often be assessed using before-and-after comparisons, and to a lesser extent by comparing performance with similar countries or with internationally accepted standards (e.g., MDGs). Factors exogenous to the program will be examined to distinguish those impacts that can reasonably be associated with the assistance program from those whose proximate determinants lie elsewhere.

 

Impact refers to an MDB's contribution to long-term changes in development conditions.

Institutional Development

 

(a)   The extent to which an MDB's support has helped to develop institutional capacity may be separately assessed (if not part of impact assessment) by examining changes in the performance and governance of public institutions, nongovernment organizations, the private sector, and civil society.

Institutional development is more frequently assessed as part of an overall assessment of effectiveness and impact, since capacity building has come to be treated as an integral crosscutting objective of most MDB programs.

Institutional development refers to the extent to which an MDB's assistance improved or weakened the ability of the country to make more efficient, equitable, and sustainable use of its human, financial, and natural resources, for example through better definition, stability, transparency, enforceability, and predictability of institutional arrangements; and/or better alignment of missions and capacities of organizations with their respective mandates.

Borrower Performance

 

(a)    Borrower performance may be assessed by examining the degree of client ownership of international development priorities, such as MDGs and an MDB's corporate advocacy priorities; the quality of policy dialogue; and the extent to which the government provided consistent support for MDB-assisted programs. However, borrower performance should not be formally rated.

Borrower performance focuses on the processes that underlie the borrower's effectiveness in discharging its responsibilities, with specific focus on the extent to which the government exhibited ownership of the assistance strategy and program.

MDB Performance

 

(a)     An assessment of an MDB's performance typically considers

·         the relevance and implementation of the strategy and the design and supervision of its lending interventions;

·         the scope, quality, and follow-up of diagnostic work and other analytical activities;

·         the consistency of its lending with its nonlending work and with its safeguard policies; and

·         its partnership activities.

It may also include the extent to which the MDB was sensitive and responsive to client needs and fostered client ownership. The views of operational personnel, the borrower, executing agencies, and other development partners are also typically considered in assessing the MDB's performance. 

An MDB's performance focuses on the processes that underlie its effectiveness in discharging its responsibilities as a development partner, including compliance with basic corporate operating principles; consistency with furtherance of its corporate, country, and sector strategies; and its client service satisfaction.

Partnership and Harmonization

 

(a)     Robust partnerships are required to address complex development challenges. In recognition of this, CSPEs examine the extent to which an MDB has been an effective partner in a multistakeholder development assistance effort.

This may include an assessment, but not a formal rating, of the MDB's participation in aid agency/partner groups, the extent to which its activities were well coordinated with those of other aid agencies, the degree to which it helped improve the government's capacity for mobilizing and utilizing external assistance, and the manner in which it fostered involvement of all stakeholders (e.g., government, private sector, civil society, nongovernment organizations, and other development partners) in the development process. The degree to which the Paris Declaration on Aid Effectiveness principles (i.e., government ownership, alignment with government strategies, results orientation, program approaches, use of country systems, tracking results, and mutual accountability) have been promoted should be covered in the assessment of the MDB's contribution to building robust development partnerships.

Partner coordination refers to the contribution made by an MDB to coordinating external assistance and to building government and country ownership of external assistance processes.

B.3. Performance Rating

Ratings Principles and Comparability

(a)  If a quantitative rating is undertaken, then the rating system should use well-defined criteria and be kept as simple as possible, because ratings that are too numerous or too detailed may confuse the user.

 

Moreover, discussion of the ratings should not distract from the main messages.

(a)   A quantitative rating system is generally viewed as a useful component to a CSPE, because it can help to organize and discipline the evaluation and can make the assessment process transparent and uniform across countries.

While there will always be some element of evaluator judgment, strict adherence to CSPE rating guidelines and careful quality control can help to promote ratings that are comparable across CSPEs in those evaluations that include a quantitative rating.

 

(b)   For MDBs that wish to include ratings, the manner in which the ratings are derived should be clearly stated in MDB CSPE reports, and the summary evidence upon which they were made should be presented along with the rating itself.

 

 

 

(c)    The limitations of the CSPE rating system should also be frankly acknowledged.

 

 

 

(d)   Ensuring that CSPE ratings are comparable across CSPEs implies the need for a rating system that is uniform, both in its definitions and in its application in different country cases.

 

 

Rating Criteria

(a)  If a quantitative rating is undertaken, the ratings of the mandatory criteria (relevance, efficiency, effectiveness, sustainability, and impact) are considered to be C-GPS.

(a)  The ratings of the additional criteria (positioning, coherence, institutional development, borrower performance, an MDB's performance, and partner coordination) are considered to be O-GPS. 

 

 

(b)   The ratings for each criterion that is employed should be presented separately so that the results of the performance assessment are fully transparent to the evaluation users.

 

 

Rating Subcriteria

 

(a)   For MDBs that quantitatively rate performance, defining subcriteria, if any, in a way that is applicable to specific country cases can help to provide an evaluative framework for more uniform, systematic, and comparable assessment.

MDB evaluators have drawn on a decade of experience in undertaking CSPEs to evolve a set of evaluative subcriteria suitable for assessing country assistance performance in different country settings. A list of CSPE-specific subcriteria for each of the criteria indicated above is provided in Appendix 2. This list is not meant to be either exhaustive or minimal; it reflects many of the factors found to be important determinants of country assistance performance, a subset of which is likely to be suitable in varied settings. An evaluative judgment is required to assess the degree to which chosen subcriteria have been achieved in a particular evaluation.

Weighting Criteria

 

(a)   If overall performance ratings (or headline ratings) are generated, as an optional good practice, then more emphasis should be accorded in the weighting to the results (i.e., effectiveness and impact) of the assistance program and to the sustainability of the net benefits.
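As an illustrative sketch only (the GPS prescribes neither a rating scale nor specific weights; the scale, weights, and criterion scores below are hypothetical), a headline rating weighted toward results and sustainability might be computed as follows:

```python
# Hypothetical weighted headline rating on a 1-4 scale (4 = highly
# satisfactory). The weights emphasize effectiveness, impact, and
# sustainability, in line with the weighting principle above; the
# specific numbers are illustrative only.

weights = {
    "relevance": 0.10,
    "efficiency": 0.10,
    "effectiveness": 0.30,
    "sustainability": 0.20,
    "impact": 0.30,
}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1

scores = {  # criterion ratings from a notional CSPE (illustrative)
    "relevance": 3,
    "efficiency": 2,
    "effectiveness": 3,
    "sustainability": 2,
    "impact": 3,
}

headline = sum(weights[c] * scores[c] for c in weights)
print(f"Headline rating: {headline:.2f} / 4")
```

Presenting the per-criterion scores alongside any such aggregate keeps the derivation transparent, consistent with the ratings principles above.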

 

 

Good Practice Standards on Independent Country Strategy and Program Evaluations

Reporting-Related GPS

GPS Category

(Standards and Elements)

Core GPS

Optional GPS

Notes

C.1. Findings, Lessons, and Recommendations

Findings and Lessons

(a)  Country strategy and program evaluation (CSPE) reports will include evaluation findings that are country-specific, follow logically from the main evaluation questions and analysis of data, and show a clear line of evidence to support the conclusions drawn.

 

 

 

(a)   CSPEs will identify a small number of lessons that are unambiguously rooted in the evaluation evidence and have clear operational implications.

 

 

Recommendations

(b)   CSPE recommendations will be conveyed constructively in the form of proposals that are actionable within the responsibilities of the users, few in number, country-specific, strategic, operational, and (ideally) not obvious.

 

 

C.2. Reporting and Review

Reporting

 (a) Standard CSPE reporting formats will be used to foster uniformity in coverage and presentation while providing sufficient latitude to tailor the reports to the needs of a particular country case.

 

 

 

(a)   The report should include coverage of the country context, country strategy and program, program implementation, program outcomes and impacts, partnerships, thematic issues, lessons, and recommendations.

 

 

 

(b)   The CSPE report will be presented in plain language. It will be evidence- and analysis-based, and will focus on those key issues that could be evaluated conclusively, rather than on all issues that have been examined.

 

 

CSPE Review

 

(c)    For quality control purposes, the draft CSPE will be rigorously reviewed internally by the staff and management of the independent evaluation office, and externally by MDB operations personnel; government stakeholders; and, optionally, by external reviewers.

 

The CSPE review process should also extend to parallel or supporting studies to ensure that they are contextually correct and consistent with the CSPE process.

(a)      The revised CSPE report will reflect these comments and acknowledge any substantive disagreements. In cases in which there are such disagreements, the formal views of management, government, external reviewers, and/or the board will be reflected in the final CSPE report.

 

C.3. Making Findings Accessible

Disclosure

(a)   Publication of CSPE findings is recommended: it fosters learning beyond the immediate client groups and promotes transparency in the evaluation process.

 

 

 

(b)   To reflect the diversity of perspectives on CSPE findings, CSPE publications will generally include the formal views of management, government, and the board.

 

 

Dissemination

 

(a)   It often requires considerable effort to ensure that the CSPE findings are disseminated beyond a small group of senior MDB and government officials. Presentations to parliament, public seminars, consultation workshops, and press briefings are some of the ways in which CSPE findings can be more widely disseminated.

 

 

 

(b)   Summarizing the CSPE in a readily accessible form (such as an evaluation précis) and translating CSPE findings into the local language can contribute to wider dissemination of findings and results.

 

C.4. Generalizing Findings and Tracking Recommendations

Generalizing CSPE Findings

(a)   The findings from CSPEs will be summarized and used for comparative purposes in the annual and/or biannual reviews of evaluation findings prepared by the independent evaluation offices.

Using CSPEs for comparative purposes helps foster a more general understanding of the factors that influence country assistance performance.

 

 

Tracking Recommendations

 

(a)   Tracking and reporting on the extent to which CSPE findings, lessons, and recommendations are actually utilized by the MDB facilitates institutional learning. This can be accomplished through recommendation tracking systems or periodic reviews of the utilization of CSPE findings and recommendations.