Global Environment Facility

GEF/ME/C.28/3
May 9, 2006

GEF Council Meeting
June 6-9, 2006













MANAGEMENT RESPONSE TO THE
2005 GEF ANNUAL PERFORMANCE REPORT

(Prepared by the Secretariat in consultation with the Implementing and Executing
Agencies)


Recommended Council Decision
The Council, having reviewed document GEF/ME/C.28/3, Management Response to the
2005 Annual Performance Report, notes the actions being taken by the Secretariat and
the Implementing and Executing Agencies to improve the quality of design and
proposed implementation of monitoring and evaluation components in project
proposals. The Council requests the Secretariat to ensure in its review of project
proposals that required monitoring and evaluation standards are met. The Council
further requests the Implementing and Executing Agencies to continue to improve the
quality of their terminal evaluations of projects.





Table of Contents

Executive Summary

Introduction

Results

Processes

Monitoring and Evaluation

Conclusions

Executive Summary
1. We welcome the GEF Annual Performance Report (APR) 2005, prepared by the GEF
Evaluation Office. Although the report acknowledges that its findings have limitations, it
provides a series of useful insights that can contribute to portfolio management at the GEF.

2. We welcome the report's conclusion that most of the completed GEF projects that were
assessed this year have acceptable performance in terms of outcomes and sustainability.
According to the report, 88% of the 41 GEF projects reviewed in FY 2005 were rated
moderately satisfactory (MS) or above in their outcomes. In terms of the effectiveness of the use
of GEF funds, 95% of the $260 million allocated to the projects reviewed in FY 2005 went to
projects that achieved MS or better outcomes. In terms of sustainability, 80% of the allocated
GEF funds were for projects with a sustainability rating of moderately likely or better.
3. We also welcome the report's assessment of project outcomes and sustainability, delays
in project completion, materialization of co-financing, and quality of the M&E arrangements at
CEO endorsement. The GEF Secretariat, together with the Agencies and Focal Area Task
Forces, aims to take into consideration the conclusions and recommendations of the APR 2005.
4. According to the APR 2005, a large proportion of projects did not meet the 2003
minimum M&E requirements "at entry" and would not have met the new minimum M&E
requirements of the new M&E policy. As this is a technical project design issue for which the
GEF Agencies are accountable, the Agencies have been making efforts to address the situation,
including providing monitoring and results management training to staff, issuing checklists and
improved guidance on indicators, monitoring and evaluation, and providing effective budgeting
for monitoring. In addition, the Secretariat is redrafting project review guidelines and standards
to ensure compliance with the new M&E minimum requirements, including more guidance to
Secretariat program managers for reviewing M&E design in project documentation. STAP is
also considering ways to enhance the contribution of STAP roster reviews during the process.
5. We are encouraged by the report's conclusion that the overall quality of monitoring of
GEF projects is showing signs of improvement and that Focal Area Task Forces have made
significant progress in developing indicators and tracking tools at the portfolio level. However,
this is an on-going process and we agree that there is still room for further improvement,
especially to adequately address the need to measure and aggregate results at the portfolio level.
6. With the establishment of the independent GEF Evaluation Office, the GEF Secretariat
has taken leadership in portfolio level monitoring. The Secretariat, together with the Agencies,
the Focal Area Task Forces, and the STAP, is developing a Results Management Framework for
the GEF, for Council review in December 2006, with the aim of identifying appropriate units of
accountability for results, and associated tools and practices.
7. Reflecting the APR 2005 recommendation that the Secretariat should support Focal Area
Task Forces with corporate resources to develop indicators and tracking tools, a request is being
made for a Special Initiative for Results-Management in the FY07 Corporate Budget. This
activity would be in line with the efforts to develop a GEF Results Management Framework.

Introduction
1. We welcome the GEF Annual Performance Report (APR) 2005, prepared by the GEF
Evaluation Office. The 2005 APR presents its findings under three main sub-headings: results,
processes, and monitoring and evaluation. Under these sub-headings, the report provides an
assessment of (i) project outcomes and sustainability; (ii) delays in project completion; (iii)
materialization of co-financing; and (iv) quality of the M&E arrangements at the point of CEO
endorsement.
2. According to the report, the assessment of project outcomes, project sustainability and
delays in project completion relies on an analysis of 41 projects, for which the terminal
evaluations were submitted by the Implementing Agencies to the Evaluation Office in FY 2005.
For the assessment of the materialization of co-financing, all 116 terminal evaluations submitted
after January 2001 were considered. Of these, 70 (60%) terminal evaluations provided
information on actual materialization of co-financing. The assessment of quality of the M&E
arrangements at the point of CEO endorsement is based on the 74 full size projects that were
CEO endorsed in FY 2005.
3. The report acknowledges that the findings presented have several limitations due to the small
number of projects for some agencies, inadequate data in some cases, reliance on self-reporting
by the agencies, and uncertainties in the process of verification of terminal evaluation reports
submitted by agencies.
4. Nevertheless, the discussion of the issues assessed in the APR 2005 provides a series of
useful insights that can contribute to portfolio management at the GEF.
Results
5. We welcome the report's conclusion that most of the completed GEF projects that were
assessed this year have acceptable performance in terms of outcomes and sustainability.
However, the figures for performance ratings for many of these projects indicate a higher
achievement level than simply having "acceptable performance". According to the report, 88%
of the 41 GEF projects reviewed in FY 2005 were rated moderately satisfactory (MS) or above in
their outcomes. In terms of the effectiveness of the use of GEF funds, 95% of the $260 million
allocated to the projects reviewed in FY 2005 went to projects that achieved MS or better
outcomes. In terms of sustainability, 80% of the allocated GEF funds were for projects with a
sustainability rating of moderately likely (ML) or better.

Processes
6. We are pleased that the projects that were examined have realized almost all co-financing
promised at project inception. However, we are concerned that the exceptions to this positive
picture are global projects and those in Africa.
7. The report states that excessive delay in project completion is associated with lower
performance in terms of outcomes and sustainability. However, it also states that this association
does not imply causality because excessive delay in project completion is more likely to be a
symptom than an underlying cause affecting outcomes and sustainability. The report states that
the Evaluation Office will further analyze the underlying causes in other evaluations, such as the
Joint Evaluation of the GEF Activity Cycle and Modalities, as well as in future Annual
Performance Reports, to ascertain the extent and the specific forms in which project delay affects
project outcomes and sustainability. We look forward to the outcome of this further analysis.
Monitoring and Evaluation
8. We are encouraged by the report's conclusion that the overall quality of monitoring of
GEF projects is showing signs of improvement. This is an indication that efforts made by the
GEF Secretariat and the GEF partner agencies have begun to pay off. The actions taken by the
Agencies to address weaknesses in project monitoring systems have led to improvements.
However, we acknowledge that this is an on-going process and agree that there is still room for
further improvement. As the monitoring responsibility at the portfolio level has been shifted to
the Secretariat, to be undertaken in coordination with the Agencies, we are working on
developing a Results Management Framework for the GEF, for Council review at its December
2006 meeting, with the aim of identifying appropriate units of accountability for results, and
associated tools and practices.1
9. According to the APR 2005, a substantial proportion of projects did not meet the 2003
minimum M&E requirements "at entry" and would not have met the new minimum M&E
requirements of the new M&E policy. As this is a technical project design issue for which the
GEF agencies are accountable, there have been efforts made to address the situation at this level.
For example, the World Bank has been providing monitoring and results management training to
its staff who are involved in reviewing as well as designing and implementing GEF projects.
UNDP has also substantially improved its M&E guidance and practices. For example, in the
Biodiversity Focal Area, UNDP has issued improved guidance on indicators, monitoring and
evaluation, including effective budgeting for it. UNDP also scrutinizes these M&E elements
carefully in its internal review processes. UNEP has produced a number of tools, including a
revised internal project review process, and checklists and guidance for staff to ensure that the
M&E standards are met at entry.
10. The report also states that there are gaps in the present project review process and that
M&E concerns are, consequently, not being adequately addressed. We agree that there exists
room for improvement in reviewing M&E elements in project design. We agree with the APR
recommendation that the GEF Secretariat should redraft project review guidelines and standards
to ensure compliance with the new M&E minimum requirements. In fact, this work is already
underway to incorporate minimum requirements for M&E more clearly into the GEF Project
Review Criteria, including provision of more guidance to Secretariat program managers for
reviewing M&E design in project documentation. STAP is also considering ways and means to
enhance the contribution of STAP roster reviews during the process. In addition, the GEF
Secretariat will consider modifying the Proposal Agreement Review template used for project
reviews by adding a section that addresses the candor and realism of the risk assessments, as
suggested by the APR 2005. However, the Secretariat will try to do this without using language
that presupposes lack of honesty or transparency.

1 The policy recommendations under discussion for the fourth replenishment of the GEF Trust Fund direct the GEF
Secretariat, GEF Agencies and the GEF Evaluation Office to develop a common set of quantitative and qualitative
indicators and tracking tools for each focal area, to be used consistently in all projects, with a view to facilitating
aggregation of results at the country and program levels and assessment of GEF transformational impact. A
complete Results Management Framework is to be developed by the GEF Secretariat and brought forward for
Council consideration by the end of 2006.
11. The report states that although focal area task forces are developing portfolio level
indicators and tracking tools, these tools are not yet developed enough to adequately address the
need to measure project level results. However, the report does not clearly differentiate between
the level of progress made by different focal areas and the different levels of ability to measure
portfolio level results.
12. In this respect, we would like to highlight that the biodiversity focal area has made
important strides with the portfolio monitoring system which it has developed and is currently
implementing. In fact, this year, in addition to submitting project implementation reports for
individual biodiversity projects, the GEF agencies were also requested to submit tracking tools
for GEF-3 projects under Strategic Priority One (Catalyzing Sustainability of Protected Area
Systems) and Strategic Priority Two (Mainstreaming Biodiversity in Production Landscapes and
Sectors) that were part of the PIR 2005 cohort.
13. The tracking tools are central to the portfolio monitoring system that has been established
by the GEF Secretariat and the Agencies in the biodiversity focal area. The system, which was
developed for application at the start of GEF-3, allows for key project-level indicators to be
rolled up to the level of the biodiversity portfolio in order to present a consolidated picture of
portfolio-level coverage and outcomes. The portfolio monitoring system will continue to be
implemented in the GEF-4 period.
14. Meanwhile, the international waters, climate change, persistent organic pollutants, and land
degradation focal areas are currently undertaking activities to identify program level indicators
and strategies to roll up project level indicators to the program level. These initiatives are
expected to be completed by December 2006, in congruence with the completion of the
development of the GEF Results Management Framework.
15. We agree with the conclusion that although Focal Area Task Forces have made
significant progress in developing indicators and tracking tools at the portfolio level, there
remain some technical difficulties to be overcome to adequately address the need to measure and
aggregate results at the portfolio level. Reflecting the APR 2005 recommendation that the
Secretariat should support Focal Area Task Forces with corporate resources to develop indicators
and tracking tools to measure the results of the GEF operations in the various focal areas, a
request is being made for a Special Initiative for Results-Management in the FY07 Corporate
Budget. This activity would be in line with the on-going efforts to develop a GEF Results
Management Framework.
16. The report asserts that the present project-at-risk systems at the partner agencies of the
GEF vary greatly and may have to address issues such as insufficient frequency of observations,
robustness and candor of assessments, overlap and redundancy, and independent validation of
risk. While we agree that there is always room for improvement, GEF Agencies have been
making progress in addressing most of these issues. For example, the project-at-risk system at the
World Bank is already well developed, having been in place since 1996. It is revised from time to
time based on reviews undertaken by the World Bank's Quality Assurance Group. Similarly,
UNDP has improved its Risk Management System, both in terms of reporting and central
monitoring. Risk management is now conducted using the Risk Module in ATLAS, UNDP's
corporate enterprise resource platform for project financial management. It contributes to
achieving results and impacts by allowing systematic and early project risk identification and
analysis, and by facilitating risk monitoring, and improving adaptive management.
17. We are pleased with the assessment that the overall quality of terminal evaluations is
improving. We agree that there are still some areas where improvements are necessary and we
expect that the FY06 APR will reflect a further improvement as a result of the additional
measures set in place by the Agencies during FY05.2 The GEF Secretariat will work with the
GEF agencies to make sure that these improvements are realized.
Conclusions
18. The APR 2005 is a welcome assessment of the current status of project outcomes and
sustainability, delays in project completion, materialization of co-financing, and quality of the
M&E arrangements at CEO endorsement, based on an analysis of terminal evaluation reports
submitted by GEF agencies.
19. In developing a Results Management Framework, the GEF Secretariat, together with
Focal Area Task Forces, aims to take into consideration the conclusions and recommendations of
the APR 2005 and complement the APR exercise, in the future, with a serious effort at portfolio
level monitoring of outcomes and, if possible, impacts. The establishment of the independent
GEF Evaluation Office has provided an opportunity for the GEF Secretariat to take leadership in
the area of portfolio level monitoring. It is envisioned that under the GEF Results Management
Framework, which will build on the results management systems already in place in the
Agencies, there will be a "division of labor" among GEF entities, where IAs can be responsible
for project level quality and monitoring of their respective portfolios, while the GEF Secretariat
can concentrate on GEF-wide program and portfolio level performance, strategic issues and
portfolio monitoring across agencies. Such a division of labor (repeatedly called for by various
M&E studies) will also help improve quality at entry and streamline the project cycle by
avoiding overlap of the review functions at project entry. The annual PIR exercise will be revised
and improved in line with these goals.
20. The GEF Secretariat is working with the GEF agencies, the Focal Area Task Forces, and
the STAP to advance this thinking and to develop a GEF Results Management Framework that
will reflect these principles.



2 Refer to the Management Action Record for the 2004 Annual Performance Report for Agency actions to improve
the quality of terminal evaluations.
