Examining the evaluation methodology applied to skills development programs: a case study of the Gauteng Department of Social Development

Date

2019

Authors

Zulu, Michael Sphiwe

Abstract

Monitoring and evaluation of programmes and projects should be standard practice for development organizations, including NGOs and other stakeholders in the development field. This applies particularly to capacity-building and social intervention programmes, as they are intended to affect those they hope to empower as programme participants. The fundamental challenge lies in measuring qualitative change using an evaluation methodology that is efficient and effective in assessing programme outcomes and the impact envisaged at the end of the programme. Whereas quantitatively oriented evaluations are suited to measuring programme inputs and outputs, evaluating social development interventions demands a nuanced method capable of clearly indicating the qualitative change in the lives of those the programme was intended to benefit. Nevertheless, some development organizations have experienced great difficulty in executing this task. It is against this background that this research study examined the Gauteng Department of Social Development (GDSD)'s method of evaluating the skills development programmes offered to unemployed youth by its funded Non-Profit Organizations (NPOs). Purposive sampling was used to select the research participants, and qualitative research methods were employed to collect data through semi-structured interviews with programme participants from the Get Informed Youth Development Centre, one of the department's funded NPOs. The research findings reveal that although the department's monitoring method is applicable and efficient, its evaluation method is ineffective, inefficient and flawed in measuring programme outcomes and impact. On the basis of these findings, the following recommendations were made to strengthen and improve GDSD's evaluation methodology.
First, GDSD can build its employees' capacity to conduct effective monitoring and evaluation by contracting a service provider to train departmental staff, particularly the community development practitioners entrusted with monitoring NPOs. Alternatively, GDSD can consider outsourcing the evaluation of these training programmes to a private entity with expert knowledge of M&E. Most significantly, the beneficiaries' views about the training should form an integral part of evaluation, as the outcomes and impact of these skills training programmes are experienced by them.

Description

A research report submitted in partial fulfilment of the requirements for the Degree of Master of Arts in Development Studies to the Faculty of Humanities, University of the Witwatersrand, Johannesburg, July 2019
