Address by Collins Chabane, Minister in The Presidency for Performance Monitoring, Evaluation and Administration, at the Conference and annual general meeting of the Association of Southern African Schools and Departments of Public Administration and Management

Programme Director and Conference Convenor: Professor Yogi Penceliah
Professor Nelson Ijumba, Deputy Vice-Chancellor: Research
Professor Henry Wissink, Dean and Head: School of Management, Information Technology and Governance
Professor Tryna van Niekerk, Chairperson of ASSADPAM Executive Committee
Members of ASSADPAM, Practitioners and Postgraduate students
Esteemed guests and conference participants
Members of the media
Ladies and gentlemen.

Higher education institutions, like the University of KwaZulu-Natal, occupy a special place in society. These institutions are credited as the generators of new knowledge and skills that help sustain the knowledge capital of our society. They produce the knowledge and skills required to drive our economy, and generate ideas to address our triple challenges of poverty, underdevelopment and inequality.

I have always maintained that if you educate one child in a family, you have guaranteed that family better economic prospects and the possibility of eradicating poverty. I wish to appeal to our nation that, to build a better society, education remains our key to prosperity. To effectively address our triple challenges, education is key. As government, we have elevated education as one of our apex priorities. Educating this nation is, however, not a challenge government can address without the active participation of society. Let us all work together to build a prosperous country.

Programme Director, the theme of the conference addresses the key question that is currently top of our agenda, that is: how do we ensure effective planning and monitoring and evaluation to improve governance and the performance of the state? 

In 2009, we adopted the outcomes-based approach to medium-term planning and monitoring and evaluation, whereby we translated the electoral mandate of government into 12 priority outcomes. 

These outcomes were then translated into cross-sectoral and intergovernmental plans called delivery agreements, which are countersigned by various clusters of Ministers as a commitment to focus the delivery of services to our people, to monitor and evaluate progress against them, and to solve problems as they arise. 

Having been through two annual cycles of trying to improve government performance using the outcomes approach, we are now able to share some lessons learnt. On the bright side, for the first time government was able to move out of silos and agree on inter-departmental and intergovernmental plans for key cross-cutting outcomes. The process of producing delivery agreements has resulted in a higher level of understanding of the challenges which other departments face, and how the work of the different departments affects each other. 

The quarterly reports provide Cabinet with a strategic agenda, enabling it to focus on assessing progress against key priorities. Lastly, the emphasis on measuring results is acting as a catalyst for change, moving government away from spending too much time merely focusing on inputs and activities, towards striving to achieve the agreed service delivery targets linked to the priority outcomes. 

One of the key principles that inform the outcomes approach is being frank about problems and, most importantly, putting mechanisms in place to solve problems rather than sweeping them under the carpet. This is at the heart of good governance. In that spirit of transparency and accountability, I would also like to share some of the challenges we have experienced. 

Firstly, some of the delivery agreements tended to be too long, with too many indicators. This made them insufficiently strategic and led to difficulty in ensuring stakeholder coordination, and in assessing progress and challenges in a more focused manner. 

Secondly, whilst the main thrust of the outcomes-based approach is about focusing on results, there is a considerable focus on activities in the public service without due regard to the results of those activities. 

Thirdly, there is inadequate translation of the delivery agreements into strategic and operational plans at a departmental level. 

Fourthly, there are challenges with the underlying departmental information systems that are supposed to enable the production of quality analytical reports which enhance evidence-based decision-making by the Executive.

Fifthly, departments sometimes produce overly positive reports that are at odds with the reality on the ground and with the public's experience of government services. 

Last but not least, we find that departments are not necessarily using the results of monitoring the implementation of the delivery agreements to inform improvements to their programmes.

Programme Director, earlier this year, we published the mid-term review of government performance against the 12 priority outcomes. That report is very frank about what has been achieved in this short period and what still needs to be done in specific strategic areas. For example, in health we identified combating HIV and AIDS and TB, improving maternal and child health, and improving health system effectiveness as key areas of our focus. 

Our review shows that the number of people living with HIV has stabilised; there is a reduction in mother-to-child transmission from 8% in 2008 to 3.5% in 2011 – protecting more than 30 000 babies per annum from infection. Further, more than 19.9 million people have been tested for HIV from April 2010 to date; 1.7 million people are receiving antiretroviral therapy, up from 1.1 million in 2009; and the cost of ARV drugs has been halved so that we can treat more people within the same resource envelope. 

On the aspect of improving health system effectiveness, substantial steps have been taken towards the establishment of the National Health Insurance scheme; there is good progress towards establishing the Office of Health Standards Compliance; the health department is conducting an audit of service quality in over 75% of health facilities and developing improvement plans; a new human resource strategy is in place, linking intake to projected demand; training plans are in place to ensure hospital managers meet minimum competency requirements; and so forth. 

Despite these significant improvements, our health indicators are still poor by international standards relative to expenditure levels – which emphasises the need to sustain the current trajectory and fast-track the implementation of our plans and strategies. The mid-term review document, like many of our reports, is available on our website, and I urge you to engage with it and help us with your knowledge and experience to improve. 

So far our main focus has been on monitoring progress against the priority outcomes, and less on evaluations. We have now developed the National Evaluation Policy Framework, which was adopted by Cabinet last year. We have taken a strategic approach focusing on important government initiatives, and those selected are now embedded in a National Evaluation Plan. In the spirit of accountability, we have committed ourselves to using the findings of the evaluations: each evaluation must result in an improvement plan whose implementation is monitored very closely. 

The approach emphasises learning rather than a punitive approach, so as to build evaluation into the culture of departments and not promote resistance and malicious compliance. The results of the evaluations will be made public.

We have also been continuously assessing the management practices that are critical in ensuring the efficiency of government institutions in implementing their mandates. In this regard, in May this year, we published the first results of the Management Performance Assessment Tool (MPAT), which was based on a rigorous self-assessment by government departments in the national and provincial spheres of government. MPAT focuses on the key performance areas of strategic management, governance and accountability, human resources and systems management, and financial management. 

The results show a need for improvement in many areas of compliance with government prescripts, but also a number of good practices. Such good practices are now being documented and shared widely with the other departments in order for them to improve their performance. We have now assessed 83 national and provincial departments. 

I am sure all of you would agree that the Department of Home Affairs (DHA) had been known as one of the failing institutions of government for many years, as confirmed by various adverse audit outcomes and attendant public concern about its services. The department realised that its challenges were so entrenched that it would require not just quick training of front-line staff, but a new approach to doing its business. Drawing on management approaches commonly used in production industries, it introduced an Operations Management approach, backed by tools and on-the-job training to ensure the approach becomes part of the culture of the organisation. 

Some of the positive results are:

  • reduced turnaround times for IDs from an average of 127 days to less than 45 days;
  • 93% of customers polled said waiting times for IDs were faster than expected and 92% said they were impressed with the new SMS notification;
  • an efficient Customer Contact Centre answering 95% of calls in 20 seconds and resolving 90% of calls on first contact. 

This Home Affairs story shows that it is possible to turn around the current problems that we are experiencing in relation to government service delivery and institutional incapacity.

Earlier on I mentioned the disjuncture between the sometimes overly positive reports we receive from government officials and how the public experiences service delivery on the frontline. In this regard, working closely with the Offices of the Premier and sector departments, we are conducting unannounced site visits to key frontline services of government such as Home Affairs offices, police stations, courts, schools, drivers' licence centres, grant issuing sites, clinics and hospitals, across the country. 

During the site visits, we assess location and accessibility, opening and closing times, visibility and signage, queue management and waiting times, dignified treatment of the public, cleanliness and safety, as well as complaints and compliments management. We have thus far conducted visits at about 315 service sites since we began in 2011.

Our findings indicate that, in general, there are acceptable levels of access, safety, adherence to opening and closing times, and visibility of service delivery sites. Areas that require significant improvement include long waiting times, poor queue management, inappropriately trained security guards being used for queue management, a general lack of visible presence of managers at the front line of service facilities, and under-utilised complaints and compliments systems. 

Of grave concern are standards of cleanliness and comfort that fall significantly below acceptable levels, with widespread severe neglect of facilities management and basic maintenance in these places. It is pleasing, though, that we are beginning to see significant improvements at the sites that have been visited for a second round.

Community members using these services are extremely appreciative of our visits. They express the need for these types of unannounced verification visits to be conducted more regularly. 

We are now designing a citizen-based monitoring system that would help ensure partnership with civil society in dealing with the issues that affect our people on a daily basis. As most of you might be aware, we host the Presidential Hotline, which is a citizen complaints monitoring and management system. Since its establishment in 2009, the Presidential Hotline has served as an important source of information for government-wide performance monitoring and evaluation, and for monitoring the impact of government on citizens, as it enables government to track which issues matter most to citizens and to respond accordingly. We are very excited about its performance and the extent to which, as government, we are able to respond and follow up on the issues raised by the people.

All these efforts are directed at ensuring the effective use of planning and monitoring and evaluation in improving service delivery by government to the people. It is still too early to conclusively assess the impact of all of these initiatives, but we believe that focused implementation, resilience and continuous learning will bear fruit in the long run. 

My department has been engaging the Schools of Public Administration and Management about the need to develop capacity by mainstreaming monitoring and evaluation in the university curriculum, so that we can produce more skilled people who will capably interrogate and improve on these initiatives. Our proposal is that the monitoring and evaluation component of Public Administration and Management courses needs to be increased. We should also work together, perhaps starting by ensuring that evaluators of public sector initiatives meet certain minimum professional standards and competencies. 

I hope that your deliberations in this conference will be able to help generate new knowledge and ideas that would assist us in constantly reflecting on our various experiences on how to ensure good governance and improve public sector performance.

I wish you a fruitful and successful conference.

I thank you.
