2.2. Evaluating efficiency and effectiveness in public sector delivery

2.2.1. Why invest in evaluations

Evaluation is an essential component of any expenditure management system.
It is accepted management theory and practice that evaluation findings can assist the government in budgetary decisions and provide the opportunity to improve the management of projects and programs.
Without over-promising what evaluative activity can deliver, evaluations provide better evidence about what is working and therefore how best to design and deliver policies and programmes to achieve desired outcomes. Many analytical processes and kinds of research can assist good decision making, but evaluations rarely give definitive answers.
Evaluation makes an important contribution, but it is not always “the answer”: political considerations are critical. Evaluation provides government officials, development managers and civil society with better means for learning from past experience, improving service delivery, planning and allocating resources, and demonstrating results as part of accountability. It ensures that decision makers at all levels have access to the best information available.
Evaluation encompasses efficiency (the ability to undertake an activity at the minimum possible cost) and also effectiveness (whether the activity is achieving the objectives which were set for it). The concept of performance should be central. Performance measurement is an essential activity because it provides an opportunity and a framework for asking fundamental questions such as: ‘What are you trying to achieve?’; ‘What does “success” look like?’; ‘How will you know if or when you’ve achieved it?’

Development of evaluation capacity can support broader governance, institutional development and public sector reform. Mandated evaluation can help focus on the performance of oblast and municipal governments receiving intergovernmental fiscal transfers. Successful commercialization and the private sector delivery of budgetary services require a clear understanding of program objectives. Assessments of the performance of alternative service delivery are required before privatization and on an ongoing basis afterwards. Evaluation is also a key support to civil service reform: devolving increased levels of responsibility and autonomy to managers goes hand in hand with a focus on evaluating and appraising the performance of personnel, recognizing that individual performance is reflected to some extent in project or program performance.
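The distinction between efficiency and effectiveness can be made concrete with two simple indicators. The following sketch is purely illustrative: the service, the cost figures and the target below are hypothetical assumptions, not data from this report.

```python
# Illustrative sketch of efficiency and effectiveness indicators for a
# budgetary service. All names and figures here are hypothetical.

def efficiency(cost, outputs):
    """Unit cost: resources consumed per unit of output (lower is better)."""
    return cost / outputs

def effectiveness(achieved, target):
    """Share of the stated objective actually achieved (1.0 means target met)."""
    return achieved / target

# Hypothetical clinic: 4,000 patients treated at a cost of 2,000,000 roubles,
# against an objective of treating 5,000 patients.
unit_cost = efficiency(cost=2_000_000, outputs=4_000)             # roubles per patient
target_attainment = effectiveness(achieved=4_000, target=5_000)   # fraction of target
print(unit_cost, target_attainment)
```

The two measures can move independently: a service can be cheap per unit (efficient) while still falling well short of its objective (ineffective), which is why evaluation must look at both.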
Evaluation is an important tool to support anti-corruption efforts – improving financial management systems and timely performance reporting. The consequence of a high government priority on evaluation is a strengthening of auditing functions and watchdog agencies, providing for greater transparency in policymaking and implementation.
A decision to give priority to evaluation activity will require serious consideration of incentives for decision makers and managers, and of training for managers and evaluators.
Evaluation techniques

There are many different types of evaluation tools, and they can be used in a variety of ways. Although these tools are related, the different terminologies employed by evaluation practitioners can lead to confusion. The tools all address performance measurement: ongoing monitoring and performance indicators; project and program evaluation (ex ante, ongoing/formative and ex post/summative); performance (or value-for-money) audits; and financial auditing. Each of these tools provides information on the performance of an activity, and each can be used in a variety of contexts. This broad spectrum of performance measurement activities is also known by other generic labels, such as monitoring & evaluation72.
The UNDP Handbook on Participatory Evaluation describes “approaches, techniques and tools for monitoring and evaluation” and defines important concepts73.
“Feedback is defined as a process within the framework of monitoring and evaluation by which information and knowledge are disseminated and used to assess overall progress towards results or confirm the achievement of results. Feedback may consist of findings, conclusions, recommendations and lessons from experience. It can be used to improve performance and as a basis for decision making and the promotion of learning in an organization.”

A “lesson learned” is an instructive example based on experience that is applicable to a general situation rather than to a specific circumstance. It is learning from experience. Stakeholders are more likely to internalize “lessons learned” if they have been involved in the evaluation process. Lessons learned can reveal “sound practices” that suggest how and why different strategies work in different situations.
The terms ‘performance measurement’ and ‘evaluation’ are often used interchangeably in a generic, shorthand sense to encompass these various terms and concepts.
There is a wide range of evaluation techniques, including: cost-benefit analysis, economic impact analysis (EIA), modeling, and opinion and client satisfaction surveys. We will want to explore the potential value of specific techniques with respect to their application to the health, education, science and culture sectors in Russia.
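The core arithmetic of cost-benefit analysis is discounting future costs and benefits to present values and comparing them. The sketch below is illustrative only: the cash flows and the 5% discount rate are invented assumptions, not figures from this report or any particular programme.

```python
# Illustrative sketch of a discounted cost-benefit calculation.
# The programme figures and discount rate below are hypothetical.

def present_value(cash_flows, rate):
    """Discount a list of annual cash flows (year 0 first) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def benefit_cost_ratio(benefits, costs, rate):
    """Discounted benefits divided by discounted costs; > 1 suggests net value."""
    return present_value(benefits, rate) / present_value(costs, rate)

# Hypothetical three-year programme (figures in millions of roubles).
costs = [100, 20, 20]      # up-front investment plus running costs
benefits = [0, 70, 90]     # benefits materialise only after the first year
ratio = benefit_cost_ratio(benefits, costs, rate=0.05)
print(f"Benefit-cost ratio: {ratio:.2f}")
```

Note how discounting matters: because the benefits arrive later than the costs, the ratio is lower than a simple undiscounted comparison would suggest, and the chosen discount rate can change the conclusion.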
The World Bank Operations Evaluation Department’s manual “Monitoring & Evaluation: Some Tools, Methods & Approaches” provides clear and brief explanations of various evaluation approaches74.
As the Department explains, some of the monitoring and evaluation (M&E) tools and approaches are complementary; some are substitutes.
Some have broad applicability, while others are quite narrow in their uses. The choice of which is appropriate for any given context will depend on a range of considerations. These considerations include the uses for which M&E is intended, the main stakeholders who have an interest in the M&E findings, the speed with which the information is needed, and the cost.

72 Definitions of Monitoring, Evaluation and Reporting can be seen in the UNDP Report http://stone.undp.org/undpweb/eo/evalnet/docstore3/yellowbook/documents/part_1.pdf
73 http://www.undp.org/eo/documents/who.htm
74 http://lnweb18.worldbank.org/oed/oeddoclib.nsf/24cc3bb1f94ae11c85256808006a46/a5efbb5d776b67d285256b1e0079c9a3/$FILE/MandE_tools_methods_approaches.pdf
Annex 7 provides concise descriptions of the major evaluation techniques.
Building evaluation capacity

Incentives

Action starts at the top. Senior management must demand evaluations. If Ministers, chief executives and other senior managers do not assume a leadership role, if they do not demand high-quality evaluative evidence to inform their decisions, then there is unlikely to be much improvement in its supply. One way to increase the demand for evaluations is to present the threat of external evaluation. The best way, however, to create demand for high-quality evaluative evidence is for chief executives and senior managers to demonstrate their own commitment to using that evidence to support their own decisions and advice. Levers in the public management system (the budget, reporting, and chief executive performance management processes) can be designed to support a “culture of inquiry and review”. To encourage evaluation activity, the government must change reporting requirements for departments and state-owned enterprises so that they encourage a greater focus on outcomes and on using evidence to support intervention decisions. Two areas can be addressed:
- Change the chief executive performance management process so that the emphasis is on debate about the evidence and underlying decisions, the quality of thinking underlying management decisions, and learning from the past to improve future decisions; and
- Change the budget process so that it is more focused on overall government expenditure (e.g. value-for-money reviews), not just new initiatives, so that new initiative proposals are more likely to be informed by evaluative findings. Proposals for new initiatives should outline future evaluation and monitoring intentions.
The most ambitious approach is to set as an objective the development of a national evaluation system. If this is deemed infeasible or too expensive, a more realistic approach is to focus on sectors or on major programs and projects. Experience warns against attempts to quantify the optimal level of evaluation capacity, the extent of any shortfall in the capacity to conduct and use evaluative activity, or the ideal number of trained evaluators needed. It would be unwise to prepare a central plan (a deterministic or general intervention from government to directly create a supply of evaluation specialists). Not every department should have an evaluation unit.
In the short term, the appraisal process for senior officers could use formal and informal measures to support and encourage chief executives to demonstrate how they are using evaluations to improve the effectiveness and efficiency of their agency's interventions. One example is to reward them if they have implemented an overall evaluation strategy that prioritizes the entire department's evaluative activity in the context of policy priorities. One appraisal criterion should be whether their agency collects and makes available good administrative data.
The role of the Finance Ministry

A strategy to build, strengthen and promote evaluation activity works best if it is a centrally driven initiative of a powerful Finance Ministry, linked closely to its main area of influence, the annual budget process.
There are advantages and disadvantages to a centrally driven approach. The main disadvantage is weak commitment by line ministries.
The “not invented here” syndrome will lead to dishonest evaluations designed to conform to bureaucratic requirements. Finance ministries try to soften such disadvantages by relying on persuasion wherever possible, and by providing a range of positive support and assistance, rather than by using more forceful methods. “In contrast, the limitations of a reliance on advocacy of good practice principles can be seen from the results of the second generation reforms (and from the initial effects of the ‘let the managers manage’ reforms in the mid-1980s). Thus the appropriate interpretation appears to be that a balance of ‘carrots, sticks and sermons’ works best.” Annex 2 summarizes some Australian and New Zealand experience with the introduction of a government-wide priority on evaluation activity75.
75 http://lnweb18.worldbank.org/oed/oeddoclib.nsf/DocUNIDViewForJavaSearch/14163969A1A709BD85256E5100013AA8/$file/ecd_wp_11.pdf

To introduce the merits of evaluation, the Finance Ministry should be the patron and champion, the client, of an “evaluation activity report” from each ministry and major state-owned enterprise. The budget process should be adapted so that:
- When seeking new funding, ministries describe their evaluation intentions; and
- “Sunset clauses” are incorporated where there is substantial uncertainty about the effectiveness of new initiatives. Where possible, pilots are established and their impacts reviewed after an appropriate time to determine whether the initiative should continue unchanged or be adapted.
In the medium term, after the concepts and mechanisms are introduced, the client for evaluations should be the ministries’ own management, rather than the Ministry of Finance.
Training

We have identified the lack of training of employees as a serious problem in Russia – that “formal prohibitions don’t control anything”.
Ideally, program heads, senior officers and chiefs, and principal budgetary managers would willingly accept, endorse and work with evaluations. The Finance Ministry should adopt the objective of creating an evaluation and monitoring community of practice, whose members would include evaluation specialists (both within the State sector and in the private sector) as well as policy and programme managers. Such a community could eventually:
- Promote and share good evaluative practice throughout the State sector;
- Share information about training opportunities and/or deliver seminars or training workshops, and promote access to international expertise for knowledge sharing and peer review; and
- Engage in discussions with a range of universities on State sector evaluative activity.
The government will need to provide guidance and training to officials involved in evaluations. The Finance Ministry should support agencies in developing evaluative capability by offering training on:
- The value of different types of evaluative activity;
- Good information management practices (including administrative data);
- How to understand and use evaluative findings.
Promotional and reference materials should be prepared to:
- Emphasize the importance of undertaking prioritized evaluative activity and explain the value of different types of evaluative activity at different stages of the policy/programme cycle (for example, when monitoring and/or research is likely to be sufficient and when an evaluation involving randomized controlled trials might be better);
- Explain that departments are expected to develop an evaluation strategy that prioritizes their evaluative activities;
- Explain the requirements for departments to report planned evaluative activities in their Annual Evaluation Plans and on their websites, including reporting on how any major findings of previous evaluative activity have shaped their intervention decisions for the coming period; and
- Promote the importance of using existing data effectively, gathering new data where necessary, and incorporating data analysis as a routine part of business planning and performance monitoring.
Training courses for evaluation

Any existing Executive Leadership Programs should be used as a mechanism for ensuring that future leaders are aware of the need to effectively target and use evaluative activity. To build evaluative capabilities, there must be increased evaluation training through tertiary establishments. There are extensive materials to draw on. Annex 3 describes several existing courses. These can be exploited by enrolling officials who could in the future design and deliver custom-tailored Russian courses.