The World Bank PREMnotes
September 2012, Number 21
Special Series on Establishing a National M&E System in South Africa
Ian Goldman, Ronette Engela, Ismail Akhalwaya, Nolwazi Gasa, Bernadette Leon, Hassen Mohamed, and Sean Phillips
FROM THE POVERTY REDUCTION AND ECONOMIC MANAGEMENT NETWORK

South Africa has a number of actors with legal or constitutional mandates for monitoring and evaluation (M&E). There has been a major shift in emphasis concerning M&E since 2009, partially stimulated by a political need to improve service delivery, but also by the extensive exposure of both technocrats and political leadership to international experience. As a result, the Ministry of Performance M&E was created in the Presidency in 2009, and the Department of Performance M&E (DPME) in January 2010. The DPME has introduced a number of initiatives since its establishment, including a focus on 12 government priority outcomes; the assessment of the quality of management performance of national and provincial departments; a new system of monitoring front-line services; a national evaluation system; and a municipal performance assessment tool, which is still in development. These tools have contributed to a major increase in the availability of evidence for policy and decision making. Rapid recent progress is due to strong support at the outset from South Africa's President, learning from international experience, and strong teams in the DPME and the National Treasury. Despite these positive developments, significant challenges remain in ensuring the coherence of reform initiatives conducted by central government departments, improving administrative data quality, and establishing M&E as a core role of management.

South African Government Context

After the democratic elections of 1994, South Africa developed a semifederal system with three spheres of government—national, provincial, and local. Some areas of competence are unique to one sphere of government; for example, land and justice are national functions. Others are shared between different spheres; for example, education and health are both national and provincial, and all spheres of government have responsibilities for housing and roads. The provinces are responsible for implementation of most developmental functions (education, health, agriculture, social development, and others), with local government responsible for water, electricity distribution, integrated planning, local roads, and amenities. Provinces have provincial legislatures and a strong degree of autonomy, but not as much as in a federal system such as Canada. Planning and M&E systems have to operate across these different levels, and developing a common approach across the multiple actors is a complex process, even though South Africa is a unitary state.

Who Is Responsible for M&E in South Africa?

There are a number of departments and institutions responsible for planning and M&E in South Africa (figure 1). Responsibilities and mandates are dispersed across national, provincial, and local governments due to the semifederal nature of the state. The national government has limited powers to drive M&E in other government spheres, and is also limited by its own organizational design.
Figure 1. Main Stakeholders in M&E in South Africa and Their Source of Authority
[Figure: Auditor General (constitutional power): independent monitoring of compliance; auditing of performance information; reporting to Parliament. Public Service Commission (constitutional power): independent monitoring and evaluation of the public service; focus on adherence to the public service principles in the Constitution; reporting to Parliament. National Treasury (legal power): regulates departmental five-year and annual plans and reporting; receives quarterly performance information; expenditure reviews. Department of Public Service and Administration, DPSA (legal power): monitors the national and provincial public service; regulates service delivery improvement. Department of Cooperative Governance, DCOG (legal power): regulates local government planning; monitors performance of local government; intervention powers over local government. Presidency (executive power): the National Planning Commission (NPC) produces the long-term (20-year) plan; the Department of Performance Monitoring and Evaluation (DPME) produces governmentwide M&E frameworks, facilitates production of whole-of-government five-year plans for priorities, and monitors and evaluates plans for priorities as well as the performance of individual departments and municipalities.]
Source: Presentation by Dr. Sean Phillips, Director General, to Donor Forum, Presidency, June 2012.

The Constitution mandates that the Auditor General and the Public Service Commission carry out independent monitoring of certain aspects of government and report on this to Parliament. Three national departments have strong legal powers to regulate certain types of planning and, by implication, also M&E: the National Treasury (departmental strategic plans, annual performance plans, and quarterly reporting against these); the Department of Public Service and Administration, in relation to the performance of the public service; and the Department of Cooperative Governance, regarding monitoring of local government. The Presidency has also taken on certain planning and M&E roles, using the authority of its position and Cabinet decisions rather than legal powers. The President also has powers from the Constitution to ensure efficient government.

It has proved difficult to achieve optimal coordination and avoid duplication of activities among the core M&E stakeholders. Superficially, this can be attributed to "turf battles" between departments. However, not all ministries and departments share the same view on the nature and role of the state, nor on the nature or form of services that should be delivered. Moreover, there are different paradigms driving the approach to public sector reform, and disputes about the nature of many initiatives.1

While a strong point in the original conceptualization of the governmentwide M&E (GWM&E) system was agreement that the system should be built over time (Engela and Ajam 2010), this approach has proved difficult in practice. With different paradigms of reform and views of the state, this approach has led to central departments creating separate M&E reporting systems. Similar information may be requested three or four times from departments, leading to additional reporting burdens on departments already battling considerable constraints in terms of skills and capacity, and serious reporting fatigue.

Evolution of the M&E System in South Africa

The development of a GWM&E system

Historically there have been various poles of M&E in South Africa, without a centrally driven system. During the 2000s, there was a growing interest in M&E, and the M&E role in the Presidency began to strengthen. In 2005, the Cabinet approved a plan for the development of a GWM&E system.
It was envisaged as a "system of systems" in which each department would have a functional monitoring system, out of which the necessary information can be extracted. In 2007, a policy framework was published to guide the GWM&E system (Presidency 2007), which included the need for frameworks for program performance information, statistical data quality, and evaluation, and in the process sought to strengthen the links between the Presidency, the Treasury, and the national statistics agency. Policy frameworks were developed for these elements between 2007 and 2011.

New administration in 2009

The government that came to power following the 2009 elections faced a number of pressures, including (i) persistent poverty and inequality; (ii) widespread service delivery protests at the municipal level; and (iii) loss of some political support in the 2009 elections. These pressures resulted in greater willingness of the ruling party and the government to be frank about the poor quality of public services, corruption, and other governance problems, as well as a political consensus to improve government performance, including through a greater focus on M&E. The Ministry of Performance M&E was created in the Presidency in 2009, and the Department of Performance M&E (DPME) in January 2010. In addition, an advisory body, the National Planning Commission (NPC), was established in the Presidency to focus on a long-term 2030 plan.

South African officials visited the Republic of Korea, Malaysia, India, and Brazil; reviewed the experience of the United Kingdom's Delivery Unit, which focused on delivery of a few strategic priorities; and conducted a desktop study of the M&E systems of 14 countries.
Based on this experience, soon after the new administration took office, the new Ministry for Performance M&E published a position paper, "Improving Government Performance: Our Approach" (Presidency 2009), which outlined the basis for the outcomes approach, including:
• a focus on a limited number of cross-government outcomes (which eventually became 12) so that efforts to promote change could be focused;
• moving government to an impact focus, rather than a focus on just conducting activities, which in many cases did not translate into impacts on citizens;
• a performance agreement to be signed by ministers, including high-level targets against these outcomes, which the President would monitor;
• the development of cross-government plans (delivery agreements) to deliver these outcome targets, with a results-based management structure and indicators and targets at the different levels;
• the use of existing coordination structures as "implementation forums" to focus on achieving these outcomes; and
• regular monitoring and reporting to Cabinet of progress on delivery agreements.

After the creation of the DPME, officials visited Canada, the United Kingdom, Mexico, Colombia, the United States, Malaysia, Indonesia, Singapore, and Australia. These visits directly led to the development of a series of different M&E roles: the experience of Canada led to assessing the management performance of departments, and the experiences of Mexico and Colombia influenced the development of an evaluation policy. A push from the president for hands-on monitoring by an inspectorate led to the approach of front-line service delivery monitoring. Most recently, the poor performance of local government has led to an initiative to strengthen oversight and identify appropriate support strategies. So what has occurred is a process of responding to political priorities and drawing on international experience to avoid reinventing wheels. These developments are outlined in this note, in order of their establishment, followed by a discussion of the emerging challenges and lessons.

Principal Components of the M&E System

Outcomes approach

In January 2009, the Presidency attempted to reform the Cabinet reporting system, asking departments to develop appropriate indicators to allow accurate monitoring of their services. Departments struggled to fulfill this request due to their lack of technical M&E knowledge and, in some cases, bureaucratic unwillingness to be transparent. Informed by this experience, the initial work on the outcomes approach was very much driven by the Presidency. While it managed to move government in a new direction, it created resistance from departments.

The work on outcomes became the initial focus of the new DPME. The 12 outcomes, including education, health, crime, and others, were agreed on in January 2010; performance agreements were signed with the ministers in April 2010; outcome facilitators at the deputy director general (DDG) level2 were appointed to support implementation of the outcomes between July and November 2010; delivery agreements were concluded and signed by the different departments and provinces by November 2010; and the first quarterly monitoring report on the outcomes was produced in November 2010.

Since then, these reports have been produced on a quarterly basis, highlighting progress compared to the plans at output and suboutput levels, problems, and actions taken to resolve the problems. These reports are considered at quarterly implementation forums that bring together different departments for some outcomes (for example, on rural development), or national and provincial departments where these are concurrent functions (for example, in education and health). The reports are then taken to Cabinet Subcommittees and the Cabinet. Outcome facilitators also produce independent progress briefings for the Cabinet.
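The delivery agreements described above are essentially nested results structures: an outcome broken into outputs and suboutputs, each carrying indicators and targets that the quarterly reports track. The sketch below is a minimal Python illustration of that structure under stated assumptions; the class names, fields, 90 percent on-track threshold, and example values are invented for illustration and are not the DPME's actual data model or figures.

```python
from dataclasses import dataclass, field

# Minimal, illustrative sketch of a delivery agreement's results structure:
# outcome -> outputs -> sub-outputs -> indicators with targets, plus a simple
# quarterly on-track summary. All names, fields, and thresholds are assumed.

@dataclass
class Indicator:
    name: str
    target: float   # target for the current year (assumed)
    actual: float   # latest reported value

    def on_track(self, threshold: float = 0.9) -> bool:
        # Assumption: an indicator is "on track" at 90% or more of its target.
        return self.target > 0 and self.actual / self.target >= threshold

@dataclass
class SubOutput:
    name: str
    indicators: list = field(default_factory=list)

@dataclass
class Output:
    name: str
    sub_outputs: list = field(default_factory=list)

@dataclass
class Outcome:
    number: int
    title: str
    outputs: list = field(default_factory=list)

    def quarterly_summary(self) -> None:
        # Print on-track counts per output, as a quarterly report might summarize.
        for output in self.outputs:
            indicators = [i for so in output.sub_outputs for i in so.indicators]
            on_track = sum(i.on_track() for i in indicators)
            print(f"{output.name}: {on_track}/{len(indicators)} indicators on track")

# Hypothetical example, loosely modeled on the basic education outcome.
outcome = Outcome(
    number=1,
    title="Improved quality of basic education",
    outputs=[
        Output(
            name="Improve teacher capacity and practices",
            sub_outputs=[
                SubOutput(
                    name="Teacher development",
                    indicators=[
                        Indicator("Teachers trained on the new curriculum", target=10_000, actual=8_200),
                        Indicator("Schools receiving workbooks on time (%)", target=95, actual=78),
                    ],
                ),
            ],
        ),
    ],
)

outcome.quarterly_summary()
```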
Some key remaining challenges are:
• the difficulty of keeping delivery agreements short and strategic—there is a tendency for detail with too many indicators, which is difficult to manage;
• ministers and heads of departments of the same rank have struggled to coordinate with each other;
• a result of the focus on activities has been a tendency to produce process indicators, rather than indicators that measure actual improvements at output or outcome level;
• poor translation of the delivery agreements into departmental plans, and from strategic to operational plans;
• information management systems to produce the data required are not fully in place in many departments, and required data are often unavailable;
• differences between (overpositive) departmental reports and public experience of services; and
• departments not necessarily using monitoring data on delivery agreements' implementation to inform improvements in their programs.

Linking performance monitoring to planning

M&E systems should be closely linked to planning and budgeting. In South Africa, until 2009, there was no clear planning mandate in government, and no national plan. Prior to 2009, the Treasury, in effect, performed the main planning role, linked to budget decisions. The National Planning Commission has just produced a long-term 2030 National Development Plan, which was approved in August 2012. However, this does not yet link through to planning at different levels of government.

The Treasury established the basic planning and M&E system for South Africa's government, in which national and provincial departments produce five-year strategic plans and annual performance plans (APPs) that are monitored quarterly. This system was applied to provincial departments in 2000 and national departments in 2010.

This system works, although there are challenges with compliance in some departments. The system has evolved and is now more linked to outcomes. The DPME is now involved in the budget process, and because budget guidelines ask departments to include links with the delivery agreements in their plans, the DPME checks the strategic plans and APPs to ensure that the link with outcomes is in place.

The APP system tends to be at quite a low level and could be strengthened by improving the outcome focus in the strategic plans. A challenge is that the terminology used is not the same as in the outcomes, but there is room for streamlining and making the outcomes and APP systems integrate better.

It has also become apparent that there is a conceptual misalignment between budget reform based on expenditure programs and M&E reform based on implementation programs. While the same terminology has been used in both reforms, the program categories required for appropriate management of implementation, monitoring, and evaluation are different from the program categories used in the budget and expenditure allocation process. The performance information reforms will not proceed appropriately until this problem is resolved. The DPME is working with the Treasury on a solution.

Monitoring the management performance of departments

The DPME embarked on a study tour of Canada and was impressed by its system for assessing management performance. Research was conducted on similar systems in different countries, and South Africa developed its own system (now called the Management Performance Assessment Tool, MPAT), which was rolled out in October 2011. The system has a number of underlying principles:
• a focus on management performance, not service delivery performance;
• building on existing tools used by government departments to promote buy-in and avoid reinventing wheels;
• complementing the Treasury's Financial Management Capability and Maturity Model;
• a focus on facilitated self-assessment, followed by peer moderation, to promote ownership of the process;
• a collaboration with offices of the premier in provinces3 to perform a similar role for provincial departments; and
• repeating the assessment annually to track improvement.

Table 1 shows the levels for one of the 31 standards, the standard on M&E, and figure 2 shows the results. These figures reveal that 44 percent of national and provincial departments are not compliant with the legal requirements on M&E, and that only 13 percent are being "smart," in this case implementing evaluation.

Table 1. MPAT Levels on M&E
Performance area 1.3: M&E. Indicator 1.3.1: Use of M&E outputs.
Indicator definition: extent to which the department uses M&E information.
Secondary data: AGSA findings on predetermined objectives—reported information not reliable.
Question: Which set of statements best reflects the department's use of M&E outputs?
Level 1. Statement: The department does not have an M&E policy/framework or does not have the capacity to generate information. Evidence: not required.
Level 2. Statement: Monitoring reports are available, but are not used regularly by top management and program managers to track progress and inform improvement. Evidence: quarterly monitoring reports; minutes of top management meetings or program meetings to assess use of reports.
Level 3. Statement: Monitoring reports are regularly used by top management and program managers to track progress and inform improvement. Evidence: quarterly monitoring reports; minutes of top management meetings or program meetings to assess use of reports.
Level 4. Statement: All of level 3 plus evaluations of major programs are conducted periodically and the results are used to inform changes to program plans, business processes, the APP, and the strategic plan. Evidence: all of level 3 plus evaluation reports and changes to programs and plans.
Source: Presidency 2012.
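As a rough illustration of how a rubric like table 1 turns self-assessment evidence into a level from 1 to 4, and how levels roll up into a distribution like figure 2, consider the sketch below. The yes/no evidence questions, department names, and data are invented for the example; this is not the actual MPAT instrument or its results.

```python
from collections import Counter

# Illustrative sketch only: assign an MPAT-style level (1-4) for the "use of
# M&E outputs" standard from simple yes/no evidence questions, then tally the
# distribution across departments (as in figure 2). Questions and data invented.

def score_use_of_me_outputs(dept: dict) -> int:
    if not dept["has_me_framework"]:
        return 1  # no M&E policy/framework or no capacity to generate information
    if not dept["reports_used_by_management"]:
        return 2  # monitoring reports exist but are not used regularly
    if not dept["evaluations_conducted_and_used"]:
        return 3  # reports regularly used by top management and program managers
    return 4      # level 3 plus evaluations conducted and used to change plans

# Hypothetical self-assessment returns from four departments.
departments = [
    {"name": "Dept A", "has_me_framework": False, "reports_used_by_management": False, "evaluations_conducted_and_used": False},
    {"name": "Dept B", "has_me_framework": True,  "reports_used_by_management": False, "evaluations_conducted_and_used": False},
    {"name": "Dept C", "has_me_framework": True,  "reports_used_by_management": True,  "evaluations_conducted_and_used": False},
    {"name": "Dept D", "has_me_framework": True,  "reports_used_by_management": True,  "evaluations_conducted_and_used": True},
]

levels = Counter(score_use_of_me_outputs(d) for d in departments)
for level in range(1, 5):
    share = 100 * levels[level] / len(departments)
    print(f"Level {level}: {levels[level]} department(s) ({share:.0f}%)")
```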
Figure 2. Scores on M&E (% of departments scoring at the levels shown in table 1)
[Figure: pie chart of the shares of departments at each level: 44 percent, 30 percent, 14 percent, and 13 percent.]
Source: Presentation by Ismail Akhalwaya to Donor Forum, Presidency, June 2012.

Overall, around 120 out of 140 national and provincial departments have completed the process, and the Cabinet has decided that all departments must participate in the next assessment cycle. The response to the process has been very positive, despite worrying results.

In general, managers are very interested in how their departments perform compared to others, and many departments have already implemented improvements in preparation for the next assessment cycle. The MPAT process has also identified areas of management where national policy departments need to implement support initiatives. The DPME is also now going to assess the performance of department heads, which includes using MPAT information.

There are also some good examples of national and provincial departments performing well in the management assessment. These range from departments scoring well on one of the key performance areas (for example, human resource management), to scoring well across all four performance areas. Case studies have been compiled for these examples and are being shared widely in government (box 1). These case studies should serve as positive motivation and guidance to other departments.

Box 1. Good Practice Example—National Department of Environmental Affairs
The National Department of Environmental Affairs (DEA) scored highly on all four of the key management performance areas. The DEA has a sound organizational culture driven by strict policies and procedures, such as concrete planning, measurable outcomes, effective performance monitoring, and clear expectations of managers. There was evidence that DEA managers are living their values, and there is a culture of performance. Manager turnover is low. The specific best practices were: (i) the clear division of responsibilities; (ii) strategic planning on a continuous basis, rather than just for periodic deadlines; (iii) clear deadlines in supply chain management and financial administration; and (iv) the proactiveness of the Human Resources Unit.
Source: DPME Good Practice Case Study, September 2012, Pretoria, DPME.

A key challenge is ensuring that improved management results in improved levels of services to citizens. Failure to prove this in future MPAT evaluations will discredit the process.
Monitoring front-line service delivery

The President had a vision of M&E that included monitors in the field collecting evidence at service delivery sites. In response, the DPME started a program of unannounced visits to service sites such as health facilities, social grant facilities, police stations, and municipal customer walk-in centers. The objectives are to collect evidence on the quality of services and to work with the relevant departments to show them how to use such monitoring information for improvement. The visits are conducted by monitoring teams comprising officials from the DPME and officials from the M&E units in the provincial offices of the premier. During the monitoring visits, the teams interview users and staff for their views on system performance, and a scorecard is produced for each facility, as well as an improvement plan. Between June 2011 and July 2012, about 200 sites were visited. A picture of what is emerging countrywide can be seen in figure 3, and figure 4 shows an example of scorecards for police stations visited in one area.

Figure 3. Summary of Results for First Quarter 2012/13
[Figure: bar chart showing the percentage of monitor, user, and staff ratings (poor (1), average (2), satisfied (3), above expectation (4)) for each of eight assessment areas: location and accessibility; visibility and signage; queue management and waiting times; dignified treatment; cleanliness and comfort; safety; opening and closing times; complaint management system.]
Source: DPME quarterly report.
Note: The results are for 78 sites monitored April 2012 to June 2012.

Figure 4. Scorecard for Police Stations Visited April 2012 to June 2012
[Figure: SAPS baseline-visit scorecards for six police stations (Hanover Park/Philippi, Tsineng, Rosedale, Seshego, Hillbrow, and Natalspruit), each rated separately by staff, the monitor, and a citizen on the eight assessment areas above, using the scale 1 = poor, 2 = average, 3 = satisfied, 4 = above expectation.]
Source: DPME Report for 2012/13 Quarter 1 (internal DPME document).
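As a rough sketch of how individual facility scorecards of this kind might be rolled up into a summary like figure 3 (the share of each rating per assessment area and respondent type), consider the snippet below. The assessment areas and the 1 to 4 rating scale come from the figures; the site names, sample ratings, and data layout are invented for illustration and do not reflect the DPME's actual dataset or systems.

```python
from collections import defaultdict

# Illustrative sketch: roll individual facility scorecards (ratings 1 = poor
# ... 4 = above expectation) up into a figure-3-style summary, i.e. the share
# of each rating per assessment area and respondent type. Sample data invented.

AREAS = [
    "location & accessibility", "visibility & signage",
    "queue management & waiting times", "dignified treatment",
    "cleanliness & comfort", "safety", "opening & closing times",
    "complaint management system",
]
LABELS = {1: "poor", 2: "average", 3: "satisfied", 4: "above expectation"}

# Each record: (site, respondent type, {assessment area: rating}); areas a
# respondent did not assess are simply omitted.
scorecards = [
    ("Station X", "citizen", {"safety": 2, "dignified treatment": 3, "cleanliness & comfort": 1}),
    ("Station X", "monitor", {"safety": 2, "dignified treatment": 3, "cleanliness & comfort": 2}),
    ("Station Y", "citizen", {"safety": 4, "dignified treatment": 4, "cleanliness & comfort": 3}),
    ("Station Y", "staff",   {"safety": 4, "dignified treatment": 3, "cleanliness & comfort": 3}),
]

# counts[(area, respondent)][rating] -> number of scorecards giving that rating
counts = defaultdict(lambda: defaultdict(int))
for _site, respondent, ratings in scorecards:
    for area, rating in ratings.items():
        counts[(area, respondent)][rating] += 1

for area in AREAS:
    for respondent in ("citizen", "staff", "monitor"):
        dist = counts.get((area, respondent))
        if not dist:
            continue  # no ratings for this area/respondent in the sample
        total = sum(dist.values())
        shares = ", ".join(f"{LABELS[r]} {100 * n / total:.0f}%" for r, n in sorted(dist.items()))
        print(f"{area} ({respondent}): {shares}")
```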
Although there has only been one year of GWM&E implementation, monitoring data have already identified policy and system weaknesses, such as poor facility maintenance and the lack of effective operational management systems. When policy and system weaknesses are identified, they are escalated to senior department management or to the ministerial level as needed.

Another initiative is the Presidential Hotline, set up in 2009 to allow citizens to log their complaints and queries regarding service delivery. The hotline was transferred to the DPME in 2011 to ensure that government accountability and responsiveness to these queries improve, and to analyze trends in citizen concerns. To date, more than 135,000 cases have been logged and assigned to the relevant departments and agencies for resolution, and 82 percent of these cases have been resolved (that is, feedback/assistance has been given to the caller). On a monthly basis, departments are informed of their responsiveness to complaints via a scorecard on progress in case resolution.
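The monthly responsiveness scorecard described above amounts to a resolution rate per department: resolved cases as a share of assigned cases. A toy calculation is sketched below; the department names, case counts, and the 80 percent follow-up benchmark are invented for illustration, not actual hotline figures.

```python
# Illustrative sketch: a monthly Presidential Hotline responsiveness scorecard,
# i.e. the share of assigned cases resolved (feedback/assistance given to the
# caller). Department names, case counts, and the 80% benchmark are invented.

assigned_and_resolved = {
    "Department A": (1250, 1040),
    "Department B": (980, 760),
    "Department C": (430, 390),
}

for department, (assigned, resolved) in assigned_and_resolved.items():
    rate = 100 * resolved / assigned
    flag = "" if rate >= 80 else "  <- follow up needed"
    print(f"{department}: {resolved}/{assigned} cases resolved ({rate:.0f}%){flag}")
```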
The national evaluation system

South Africa's M&E work in the 2000s focused on monitoring, although some departments conducted evaluations. In 2011, a study tour to Mexico, Colombia, and the United States led to the development of a National Evaluation Policy Framework, adopted by the Cabinet in November 2011. The framework uses a strategic approach that focuses on important policies/programs/plans, and those selected are embedded in the National Evaluation Plan. The focus has been on utilization, public dissemination of all evaluations unless confidential, and inclusion of an improvement plan, which is then monitored. The approach emphasizes learning rather than a punitive approach, so as to build a culture of evaluation into departments and not promote resistance and malicious compliance.

Six types of evaluations are envisaged, though specific evaluations may be a combination of types:
(i) diagnostic—identifying the root cause of problems, and potential options to address them;
(ii) design—a short program design evaluation by M&E units within departments to ensure designs are robust, ideally before implementation starts;
(iii) implementation—measuring an intervention's progress and determining how it can be strengthened;
(iv) impact—identifying the impact of interventions and their attribution, and how they can be strengthened;
(v) economic—the cost-effectiveness or cost-benefit of interventions; and
(vi) evaluation synthesis—drawing lessons across a number of evaluations.

Evaluations are conducted jointly by the department(s) concerned and the DPME, and the DPME partially funds the evaluations (an average of US$60,000 per evaluation). An Evaluation and Research Unit has been established in the DPME to drive the system and provide technical support, supported by a cross-government Evaluation Technical Working Group. Guidelines are being developed, as well as standards for evaluation and competencies for program managers, M&E staff, and evaluators. Training courses start in September 2012.

The first evaluation is currently being completed. The first National Evaluation Plan was approved by the Cabinet in June 2012, and work is now progressing on the eight evaluations included in the plan. Fifteen evaluations are planned for 2013/14, and 20 per year thereafter. A similar process is occurring at the provincial level, and the DPME is working with three provinces to pilot provincial evaluation plans. In 2013/14, all provinces will develop provincial evaluation plans.

Monitoring of local government

Local government is performing poorly, and at present there is no integrated set of minimum norms or standards of performance (administrative, political, or service delivery). Drawing on the monitoring of management performance of national and provincial departments, the DPME is working with key national and provincial partners to establish a similar process for municipalities to provide an integrated and holistic picture of performance for each municipality. The objectives are to:
• enable strategic leadership of the local government sector and inform policy reform initiatives;
• provide evidence for tailored and coordinated support and/or intervention measures to specific municipalities; and
• guide national and provincial departments to better support municipalities in identified areas of underperformance.

The municipal assessment tool is at a draft stage and covers planning, human resources, finances, service delivery, community engagement, and governance. It will be finalized and piloted in selected municipalities in the current fiscal year, before full rollout begins in the 2013/14 fiscal year.

Emerging Successes, Challenges, and Sustainability Issues

Emerging successes

Initial impacts of the South African M&E system include:
• Increased strategic focus of government on achieving a limited number of outcomes. Quarterly reports enable the Cabinet to regularly monitor progress in meeting the government's key strategic agenda.
• Introduction of whole-of-government planning linked to key cross-cutting outcomes, which clearly links inputs and activities to outputs and outcomes.
• A higher level of understanding of how the work of the different departments affects other departments, and greater coordination between departments and spheres of government.
• More systematic M&E is beginning to facilitate more efficient use of limited resources.
• The emphasis on measuring results is catalyzing change; some departments are embracing the approach, putting in place improvement plans, focusing on measurable results, and improving their data.

Factors supporting the system

Commitment from the president—There has been high-level political commitment to a strong M&E system. The exact form is not necessarily clear, has varied in focus, and has evolved in practice. However, this commitment has facilitated the rapid buildup of capacity, and the DPME now has around 200 staff. This also creates a potential challenge in that, should the president change, there could be a problem with maintaining commitment to the systems, as happened in Australia in 1996.

Institutional elements—Having flexible support has been crucial in supporting the emergence of the DPME.4 Support includes the strong National Treasury, which provides support to departments to help them improve their financial management capacity; a well-established departmental planning and reporting system (even if there are some challenges with it); capacity to undertake evaluations in some departments and the Public Service Commission; and the presence of an EU-funded program, the Program to Support Pro-Poor Policy Development (PSPPD), which promotes evidence-based policy making.

Development of a high-quality team in the DPME—A very strong team developed in the DPME, which has enabled the rapid development and implementation of the system, while increasing the DPME's credibility.

Learning, including from others—South Africa has sought to avoid reinventing wheels, using study tours, conducting research and knowledge exchanges, and building ongoing networks with peer countries to learn from others. In addition, reflective processes are being used to ensure learning at all stages.
Challenges

Challenges to be addressed in the next stages are detailed below.

Improved coherence and coordination areas include:
• Strengthening the coherence of central government departments and their understanding of their roles in the M&E system, ensuring a common conceptual base and that systems integrate better. The DPME is currently considering introducing legislation to address some of the gaps and overlaps.
• Strengthening the role of implementation forums in coordinating implementation of the outcomes.
• Improving the status of M&E by acknowledging it as a key part of the strategic function, essential for planning and budgeting, tracking progress, learning, and improving implementation.

Monitoring and reporting challenges include:
• Avoiding duplication of reporting, and the view of some actors that the DPME is duplicating the work of the Auditor General, National Treasury, and other agencies.
• Strengthening the incorporation of monitoring as part of the management function, and facilitating continuous improvement.
• Improving citizen feedback as part of the monitoring system, which can speed up improvement cycles.
• Strengthening monitoring of local government.
• Now that an Evaluation Policy Framework has been developed, developing a policy framework for monitoring.

Planning areas needing improvement include:
• Better integration of planning systems and a strengthened link with M&E. A specific challenge is improving the planning of implementation programs at much lower levels than budget programs (including developing log frames with a matrix of indicators), which will facilitate program implementation as well as M&E.
• Improving the ability to predict (and hopefully avoid) problems—a key desire of politicians.

Challenges for support roles include:
• Strengthening the capacity to use evidence to support policy and decision making—this includes policy and data analysis skills and bringing in additional skill sets such as operational analysts to improve problem solving.
• Improving the quality of administrative data and, where possible, getting single entry of data at field level.

Sustainability

There are two main issues in terms of sustainability: the sustainability of the DPME as a department (in particular beyond the term of the current President, who initiated this phase of M&E), and the sustainability of the systems and capacities that have been developed, such as MPAT or evaluation. The measure of the DPME's success will be when many of its functions have been internalized in departments. Key ways of supporting sustainability include:
• Moving from a directive style to a coordinating style, where the DPME is a champion and shows leadership in M&E, but builds the involvement and commitment of partners.
• Enhancing the use of M&E information, for example, by the Cabinet, so key decision makers appreciate the value of M&E.
• Building alliances, for example, through the national Evaluation Technical Working Group, proactive work with Treasury, joint study tours, M&E forums, and so forth.
• Building forums that strengthen the M&E voice (for example, national and provincial M&E forums).
• Strengthening capacity for M&E, using learning events, exchanges, and training.
• Strengthening the perceived value the DPME is providing in helping departments achieve their objectives,5 while also helping them internalize M&E to improve their performance, and in the process ensuring that they receive credit for their success.
• Simplifying systems, for example, to reduce duplication in reporting, thereby increasing M&E's perceived value.
• Increasing responsiveness to politicians, so they see M&E as a valuable tool to achieve their goals.

Areas where further work is needed include:
• Continuing to improve the effectiveness of outcomes planning and M&E and ensuring M&E is adding value.
• Developing political consensus on the importance of internal monitoring as part of a broader public service reform and management development process, and supporting continuous improvement and M&E as a key task of a central body such as the DPME.
• Strengthening the legislative base around M&E to institutionalize the role of the Presidency, reducing the risk of relying on a single, strong political champion for M&E, who is vulnerable to political changes.
Conclusions

Overall, the system has evolved tremendously in the two-plus years that the DPME has been in existence. The systems are developing credibility, and with over one year left before the end of this term of government, they should be fairly well established by the end of the current term. There needs to be stability in the DPME team for the systems established to flourish and have impact.

The experience in South Africa shows how, when the situation is sufficiently favorable, an M&E system can be rapidly developed and implemented, and also how using international experience can speed up the process. Unlike its peers, South Africa has tried to establish the M&E system across both national and provincial levels, and is now developing the local government element of the system. This shows that M&E can be implemented at the local level, although it does increase the complexity of the process, adding many different stakeholders who have to buy into the system and change their behavior.

There are a number of challenges because the system is not yet consolidated across the multiple actors, and there is a long way to go in developing a culture of M&E in the government system. A critical issue is the relationship between the key center-of-government stakeholders, notably the DPME and the National Treasury, and considerable work is underway to strengthen that relationship through practical collaboration at the technical level as well as higher-level relationship building.

The M&E system is promising, and much has been achieved in two years. However, the system is still emergent and not yet fully institutionalized, and it is too early to see extensive use of M&E information in decision making. It will take three to five years to confirm that the M&E system is making real contributions to improving performance and accountability in South Africa's government.
Acknowledgment

The authors acknowledge helpful comments from Keith Mackay, Anna Reva, and Gladys Lopez-Acevedo of the World Bank.

About the Authors

Ian Goldman is the Deputy Director General (DDG) in the DPME responsible for developing the national evaluation system. Sean Phillips has been the Director General of the DPME since its inception in April 2010. Ismail Akhalwaya is the DDG responsible for driving the system of institutional performance assessments, and acting DDG of the Public Sector Oversight Branch in the DPME. Ronette Engela is the DDG for the M&E Systems Coordination Branch, taking forward the GWM&E system and capacity development. Nolwazi Gasa is the DDG responsible for the Outcomes M&E Branch. Bernadette Leon is the DDG responsible for developing and implementing the Front-Line Service Delivery (FSD) Monitoring Programs. Hassen Mohammed is the DDG responsible for local government in the Outcomes Branch.

Notes

1. There are two main reform approaches in South African public services: (i) a strong public expenditure reform agenda with an emphasis on efficiency, economy, effectiveness issues, and budget reforms, such as instituting a medium-term expenditure framework and moving toward performance budgeting; and (ii) the approach focused on communities of practice, networks, and so forth, with a strong emphasis on training and learning. While the first approach focuses on compliance and prescriptive management systems, the latter presupposes that these systems already exist and are institutionalized, and focuses on more sophisticated dimensions of knowledge management and continuous learning in departments. It is a moot point which of these approaches is most appropriate to a given developing country.
2. The highest-level technical official in a government department is a director general, equivalent to a permanent secretary, who works for a minister and deputy minister. Deputy directors general (DDGs) are therefore high-level officials, appointed at this level to be able to work effectively with the DGs of different sector departments and provinces.
3. Each province has a premier, like a provincial prime minister, appointed by the provincial legislature. The office of the premier is responsible for overall provincial coordination, although provincial departments also have a strong link to their national equivalents. Monitoring information, say on education, goes directly from provincial departments to the national department, not via the office of the premier.
4. The PSPPD is a partnership between the Presidency and the European Union that funded research; supported a wide range of capacity development activities including study tours, exchanges, seminars, and conferences; supported the development of the M&E system (notably evaluation); and undertook some knowledge management.
5. This was influenced by the United Kingdom's experience.

References

Engela, R., and T. Ajam. 2010. "Implementing a Government-Wide Monitoring and Evaluation System in South Africa." ECD Working Paper Series No. 21, World Bank, Washington, DC.
National Treasury. 2007. "Framework for Programme Performance Information." Pretoria, South Africa.
Presidency, Office of the. 2007. "Policy Framework for the Government-wide Monitoring and Evaluation System." Pretoria, South Africa.
———. 2009. "Improving Government Performance: Our Approach." Pretoria, South Africa.
———. 2012. "Management Performance Assessment Tool: Report on Results of the Assessment Process for 2011/12." Pretoria, South Africa.
StatsSA (Statistics South Africa). 2008. "Statistical Quality Assurance Framework." Pretoria, South Africa.
This note series is intended to summarize good practices and key policy findings on PREM-related topics. The views expressed in the notes are those of the authors and do not necessarily reflect those of the World Bank. PREMnotes are widely distributed to Bank staff and are also available on the PREM Web site (http://www.worldbank.org/prem). If you are interested in writing a PREMnote, email your idea to Madjiguene Seck at mseck@worldbank.org. For additional copies of this PREMnote please contact the PREM Advisory Service at x87736. This series is for both external and internal dissemination.