
Improving Learning in South African Schools:
The Quality Learning Project (QLP)
Summative Evaluation (2000 to 2004)
A Kanjee & CH Prinsloo
August 2005
HUMAN SCIENCES RESEARCH COUNCIL
Compiled by the Assessment Technology and Education Evaluation Research Programme,
Human Sciences Research Council
Funded by the Business Trust
Intervention Programme Managed by JET Education Services
Evaluation by the Human Sciences Research Council
Published by HSRC Press
Private Bag X9182, Cape Town, 8000, South Africa
www.hsrcpress.ac.za
© 2005 Human Sciences Research Council
First published 2005
All rights reserved. No part of this book may be reprinted or reproduced or utilised in
any form or by any electronic, mechanical, or other means, including photocopying
and recording, or in any information storage or retrieval system, without permission
in writing from the publishers.
ISBN 0-7969-2145-8
Cover Design and Layout: Vinesh Naidoo
Production: Shereno Printers
Distributed in Africa by Blue Weaver
PO Box 30370, Tokai, Cape Town, 7966, South Africa
Tel: +27 (0) 21 701 4477
Fax: +27 (0) 21 701 7302
email:
www.oneworldbooks.com
Distributed in Europe and the United Kingdom by Eurospan Distribution Services (EDS)


3 Henrietta Street, Covent Garden, London, WC2E 8LU, United Kingdom
Tel: +44 (0) 20 7240 0856
Fax: +44 (0) 20 7379 0609
email:
www.eurospanonline.com
Distributed in North America by Independent Publishers Group (IPG)
Order Department, 814 North Franklin Street, Chicago, IL 60610, USA
Call toll-free: (800) 888 4741
All other enquiries: +1 (312) 337 0747
Fax: +1 (312) 337 5985
email:
www.ipgbook.com
CONTENTS
LIST OF FIGURES AND TABLES vi
LIST OF ANNEXURES vii
ACKNOWLEDGEMENTS viii
PREFACE x
ABBREVIATIONS USED FOR THE QLP DISTRICTS xi
Map showing location of Quality Learning Project (QLP) districts xi
EXECUTIVE SUMMARY xii
Background 1
Framework for the Evaluation Study 1
Evaluation Methodology and Research Design 3
Evaluation Criteria 6
Success of the Quality Learning Project 7
Results of the QLP Evaluation 10
District-level Functioning 10
School-level Functioning 11
Classroom-level Functioning 13

Learner Context and Performance 18
Intervention Coverage and Quality 20
Effects of Interventions on Functioning 21
Effects of Improved Functioning on Learner Performance 22
Effects of Interventions on Learner Performance 22
Additional Observations Emanating from the Path Analysis 23
Concluding Statements and Recommendations 24
Recommendations 24
Recommendations on Methodology and Design 24
District-level Recommendations 26
School-level Recommendations 27
Classroom-level Recommendations 28
Causal Modelling Recommendations 28
Conclusion 29
NOTES 37
LIST OF FIGURES AND TABLES
Figures
Map showing location of Quality Learning Project (QLP) districts xi
Figure A: The QLP model at district, school and teacher level 2
Figure B: Number of schools in the evaluation survey and case-study samples 3
Figure C: District-functionality levels in 2004 and change from 2002 to 2004 10
Figure D: School-functionality levels in 2004 and change from 2002 to 2004 12
Figure E: Grade 9 mathematics classroom-functionality levels in 2004 and change
from 2002 to 2004 14
Figure F: Grade 11 mathematics classroom-functionality levels in 2004 and change
from 2002 to 2004 14
Figure G: Grade 9 language classroom-functionality levels in 2004 and change
from 2002 to 2004 15
Figure H: Grade 11 language classroom-functionality levels in 2004 and change
from 2002 to 2004 15
Figure I: National mean mathematics scores for QLP and control schools
by year and grade 18
Figure J: National mean language scores for QLP and control schools
by year and grade 19
Figure K: Index scores for district-intervention coverage and quality by year 20
Figure L: Index scores for school-intervention coverage and quality by year 21
Figure M: Index scores for teacher-intervention coverage and quality by year,
subject and grade 21
Tables
Table I: Indicators at Grade 12 level of the success of the QLP (from 2000 to 2004) xiii
Table A: Total sample obtained for mid-term and summative evaluations 4
Table B: Number of schools sampled per district 4
Table C: Change in Grade 12 learner performance between 2000 and 2004
across QLP and control schools by province 8
Table D: Indicators at Grade 12 level of the success of the QLP (from 2000 to 2004) 9
LIST OF ANNEXURES
(Graphs comprising findings from the path analyses)
Annexure 1:
Standardised path weights for modelling Grade 9 mathematics causal patterns 30
Annexure 2:

Standardised path weights for modelling Grade 9 language causal patterns 31
Annexure 3:
Standardised path weights for modelling Grade 11 mathematics causal patterns 32
Annexure 4:
Standardised path weights for modelling Grade 11 language causal patterns 33
Annexure 5:
Standardised path weights for modelling matriculation pass rate causal patterns 34
Annexure 6:
Standardised path weights for modelling matriculation English pass rate causal patterns 35
Annexure 7:
Standardised path weights for modelling Grade 12 (SG) mathematics pass rates 36
ACKNOWLEDGEMENTS
The hard work and commitment of many dedicated individuals and organisations have over five years made it
possible to bring the formal QLP evaluation to a close by publishing the final, summative report. All contributions
are appreciated and are hereby acknowledged.
The Business Trust is acknowledged for providing the funding for this project.
Acknowledgement is made of JET Education Services and the Business Trust for managing the QLP and for
providing valuable support and assistance through the project Steering Committee and other mechanisms. Nick
Taylor, Anele Davids, Hemant Waghmarae, Jackie Moyana, and Leigh-May Moses, all from the QLP programme
management team of JET Education Services and Charles Barnard, Brian Whittaker, Mdu Ndhlovu and Nomfundo
Mqadi, with Theuns Eloff at the outset, from the Business Trust, all deserve special mention. They provided
continued guidance throughout the project, devoting considerable effort and time to many rounds of comment
on draft versions of instruments and documents of many kinds, especially the baseline, mid-term and current
summative reports.
The Department of Education is acknowledged for its continually increasing role in direct communications and
meetings between the HSRC and participants at national and provincial level (at the Director-General’s office and
through strategic planning sessions, respectively).
Consortium members and service providers are thanked for their valuable inputs at various stages, provided
through feedback sessions, Partners Forum meetings, and other channels.
The fieldworkers and observers are acknowledged, especially those contracted through AC Nielsen (assisted by
Mictert in 2002) and the many qualified teachers involved, as well as the data-capturing team of Datanet under
the guidance of Pio Combrink.
Professor Johann Mouton is thanked for his comments and ongoing advice, mainly on methodology, during the
first half of the study.
Special mention has to be made of the contributions by district managers, regional or circuit managers
(institutional development officials), learning area specialists, school principals, teachers, learners and their
caregivers, for allowing researchers into their institutions, offices, classrooms and lives, for making themselves
available for observations, interviews and the completion of questionnaires, and for making learners available
to sit performance tests.
The role of and contributions by the official QLP co-ordinators in the office of JET Education Services are also
acknowledged. The QLP co-ordinators became an institutionalised channel through which certain business was
conducted far more efficiently; access to the districts and schools, and the collection of district-level and
intervention information, are two cases in point. [The co-ordinators were: Alfred Mabina (Gauteng); Kedibone Boka
(Mpumalanga); Darwin Solomon (Northern Cape); Noel Daniels (Western Cape); Samuel Nkosi (KwaZulu-Natal,
Inanda and Ixopo districts); Thulani Dlamini (KwaZulu-Natal, Ubombo district); Rose Machobane (North West
Province); Nosipho Nxiweni (Free State); Vuyani Mrwetyana (Eastern Cape); and Maxwell Malatji (Limpopo).]
Marcel Croon, professor at Tilburg University in the Netherlands, is thanked for providing invaluable assistance
and training in 2002/3, and again in 2004/5, with the data modelling and analysis, and with related software.
Shereno Printers must be mentioned for printing and packing research instruments within very tight schedules
during the three evaluation years, and also for producing the mid-term technical report, this summative report,
and its technical companion report.
Wordsmiths English Consultancy is acknowledged for language editing, formatting and laying out the manuscript
of the mid-term technical report in 2003, and for language editing this summative report and its technical
companion report in 2005.
Professor H.S. Bhola, Professor Brahm Fleisch, and Hersheela Narsee are thanked for reviewing the final
manuscript and for their helpful comments and assistance in improving it.
The following HSRC team members (listed alphabetically) are also noted with gratitude for their respective roles
as part of the evaluation team at different points in time during the final evaluation phase following 2003:¹ Brutus
Malada, Carla Pheiffer, Elsie Venter, Gerda Diedericks, Godwin Khosa, Heidi Paterson, Hendrik de Kock, Lerato
Mashego, Lolita Winnaar, Makola Phurutse, Matthews Makgamatha, Natalie le Roux, Nicolaas Claassen, Sannie
Reyneke, Sophie Strydom, Vijay Reddy, Xola Mati and Zinhle Kgobe.
1 Participation and contributions during the mid-term and baseline evaluations are acknowledged in the appropriate reports following those periods.
PREFACE
This report marks the end of a unique, long-term and extensive teaching and learning intervention programme,
the Quality Learning Project (QLP). In concluding the evaluation activities of the QLP, a reflection on the
evaluation processes and findings is desirable. For this reason, some of the most important functions and roles of
the QLP and its evaluation are placed in perspective. In doing so, the report emphasises the crucial nature and
function of evaluation for teaching and learning, especially in view of the transformation context of the South
African education system.
At the most apparent and immediate level, this summative report provides a conclusive account to the sponsors
of the QLP of how successfully the funds of the project have been spent.
Additional value also lies in reflecting more deeply on the complexities inherent in large-scale and lengthy
endeavours such as the QLP. These reflections carry a positive verdict about the methods and models selected for
the QLP intervention programme and its evaluation. Finally, the reflections allow affirmation of the policy
decision implicit in undertaking the QLP work at the outset. In this regard, professional and policy experts can
find justification in the soundness, replicability and sustainability of the road travelled by the QLP.
This report also provides background information on the interventions as well as on the evaluation design and
methodology of the QLP, its findings, and the conclusions and recommendations derived from the findings.
ABBREVIATIONS USED FOR THE QLP DISTRICTS
Lu = Lusikisiki
Fl = Flagstaff
Li = Libode
ThM = Thabo Mofutsanyana
JSM = Johannesburg South Mega
SeW = Sedibeng West
In = Inanda
Ix = Ixopo
Ub = Ubombo
Mo = Moretele
Ma = Mafikeng
Ze = Zeerust
Ka = Karoo
Bo = Bolobedu
Ko = Konekwena
Zb = Zebediela
WCME = Western Cape Metropole East
Map showing location of Quality Learning Project (QLP) districts*
* District labels appear next to markers indicating the centroid of the particular district. It has to be noted that
for certain districts, such as Zeerust and Karoo, actual district areas may be quite large.
EXECUTIVE SUMMARY
The summative report of the Quality Learning Project (QLP) encapsulates the evaluation work of the project
spanning the past five years. In addition, it focuses on the successes and findings of the QLP, and the implications
and recommendations flowing from the evaluation study. The details underpinning the summative report can be
found in the technical companion report.
The success of the QLP
The QLP adopted a specific theoretical model for interventions designed to improve learning and teaching in
schools, and for evaluating the success of these interventions. As such, the hierarchical levels of the system
(districts, schools and classrooms) were taken into account. Observations were made at three points in time to
study trends and causal patterns. Comparisons were also made between project and control schools.
Performance targets for the QLP were set at the outset. QLP schools were to show an improvement, measured by
overall learner performance, against a comparable sample, by the end of 2004. What was required was:
• A 10% improvement in mean overall matriculation pass rate;
• A 10% improvement in mean mathematics pass rate; and
• A 10% improvement in mean English Second Language pass rate.
However, pass rates, when used as sole indicators, have certain weaknesses. For example, small increases from
low baselines (previous poor matriculation results) appear as large improvements. Moreover, schools are able to
artificially inflate Grade 12 pass rates by holding back potentially unsuccessful Grade 11 learners or by requiring
learners to take subjects at the Standard Grade (SG). These targets were therefore refined after the 2002 mid-term
evaluation, using categories that more reliably reflected school-performance outcomes. These categories were:
(a) The increase in the absolute number of learners passing, as an indication of the quantitative improvement
of learner results;
(b) The increase in the number of learners passing with university exemption, and with mathematics at
Higher Grade (HG), rather than Standard Grade (SG), as an indication of an improvement in quality of
the learner results; and
(c) The increase in matriculation pass rate, as an indication of improved efficiency in learner results.
Table I compares the performance of QLP evaluation schools with that of control schools in terms of the
final evaluation criteria adopted. It shows that the matriculation results of QLP schools consistently
improved more than those of control schools with respect to quantity, quality and efficiency.
Table I: Indicators at Grade 12 level of the success of the QLP (from 2000 to 2004)
Selected indicators at school level (Grade 12) | Percentage points by which improvement in QLP schools is higher than in control schools
QUANTITY OF OUTPUT
Number of learners passing matriculation examinations 16.84
Number of learners passing English Second Language (HG) 36.03
QUALITY OF OUTPUT
Number of learners passing with endorsement (exemption) 61.79
Number of learners passing mathematics (HG) 924.19#
Number of learners passing mathematics (SG) 0.70*
EFFICIENCY OF OUTPUT
Overall school matriculation pass rate 8.20
# The very low baseline of 6 in QLP schools increased by 55, resulting in this high percentage point increase.
For control schools, 133 was reduced by 10 to show a decline of 7.52 percentage points.
* QLP schools were discouraged from having an increase in the number of learners in this category; hence,
the change in quality of output was not significantly higher than that obtained in the baseline study.
Trends and causal patterns
The QLP evaluation focused on identifying the impact of the intervention programmes on district, school and
classroom functioning as well as on learner performance.
Functionality within QLP districts improved at each level for the period from 2002 to 2004, as is evident from
the findings outlined below:
• Overall district functioning improved over time, but did not surpass moderate functioning levels. The
strongest improvements (by more than 10% over the duration of the project) were in the design and use
of job descriptions, financial management, within-district planning, school-support planning, and school-
support implementation.
QLP schools fared much better than control schools in:
• Overall school functioning, with the greatest contributions coming from the supply and use of resources
and facilities, learning support materials (LSMs), curriculum leadership, school management, and school
administration;
• Aspects of mathematics teacher and classroom functioning, including teacher competency (mainly
experience and qualifications) (Grade 9), curriculum coverage (Grade 11), lesson pedagogy (Grade 9),
access to and use of LSMs (Grade 9), classroom practices (Grades 9 and 11), and homework practices
(Grade 11);
• Aspects of language teacher and classroom functioning (with a focus on reading and writing), including
teacher competency (mainly experience and qualifications) (Grade 9), curriculum coverage (Grades 9 and
11), pitching lessons at the appropriate level for learners (Grade 9), lesson pedagogy (Grade 9), and
classroom practices (Grades 9 and 11); and
• Overall teacher and classroom functioning, at the level of Grade 9 mathematics only. (Language
classroom and teacher functioning, at the levels of Grades 9 and 11, also improved over time, while
mathematics classroom and teacher functioning remained stable, but at a high level of functionality.)
With regard to learners, two observations are warranted:
• Learner perceptions and attitudes fluctuated and even sometimes declined over the period of the project.
• In terms of learner performance at the national level, a marked improvement was observed only in the
case of writing skills (language) for Grade 11 learners. Learners fared better in selecting correct information
for mathematics and for reading and writing than they did in constructing responses, especially where
demands were more abstract and challenging. Learners who took the test in their home language (English
or Afrikaans) obtained higher scores. Similar trends were noted in the control schools, although their
scores were always higher given that the control schools were not part of the QLP sample (and thus were
deemed not to require any interventions).
The outcomes of the causal modelling and analyses of the programme’s effects, between 2002 and 2004, are
indicated below.
• There has been consistency over time with regard to interventions, functioning and learner performance
across all levels, subjects and grades, indicating that critical mass and impetus, once achieved, can be
sustained.
• There are many indications that service providers targeted interventions dynamically and interactively in
areas that needed them most.
• Interventions improved functioning in areas targeted by the QLP. This is evidenced in improved school
functioning driven by good classroom and teacher interventions. District interventions also played a role
in improving school functioning.
• Improved functioning within the QLP led to better learner performance in many instances. This applied
especially in cases of school and teacher/classroom functioning.
• QLP interventions led to improved learner performance in some cases. The positive effects of district
interventions on Grade 11 mathematics performance during 2003/4, and of language-teacher
interventions on overall matriculation pass rates in 2003/4 are especially significant (bearing in mind that
language interventions focused on language across the curriculum).
• The dosage and quality of QLP interventions were subject to the risk of fatigue effects over time, making
improvements harder to sustain. District coverage and Grade 9 language-teacher interventions were the
exceptions.

Recommendations
The complete technical report and the summative report contain detailed recommendations emerging from the
study. A selection of the most significant recommendations is provided below.
• Policy makers need to sustain and enhance the programme’s benefits. This can best be done by adopting
logical, structured, integrated and comprehensive approaches to school development, based on sound
theoretical principles. For example, greater attention needs to be paid to the role of language in learning,
given that reading and writing skills, abstract thinking, and producing meaning are central learning
objectives. In addition, resource allocation should be prioritised according to need. It is especially
important to monitor and evaluate all policy effects.
• Education planners need to provide integrated and coherent intervention plans to improve teaching and
learning. Such plans should earn the support and commitment of participants through appropriate
engagement, and should adhere to sound frameworks and models. Interventions should target the earlier
stages of school life rather than focus on matriculation learners.
• Education managers in districts and provincial offices should sustain their efforts to manage their school-
support and monitoring roles. Visionary supervision, mentorship and leadership are required, with due
regard for capacity, infrastructure and process.
• Curriculum developers should support teaching and learning by producing relevant, practical, and high-
standard learning content underpinned by solid foundational knowledge.
• School management teams (SMTs) should nurture the professional development of their teachers through
good mentoring and motivation. Sound management, discipline, and curriculum leadership (the heartbeat
of the school) are crucial to this process.
• The provision of adequate numbers of excellent teacher trainers and mentors (including learning area
specialists (LASs) or subject advisors) can no longer be neglected. These are needed to support teachers
through mentoring, motivation, and technical (learning area or subject) assistance.
• Classroom teachers should not compromise on teaching time and curriculum coverage. The dignity of
teacher-learner relationships and interactions, discipline, and the provision of sufficient facilities and
learning materials all have to be maintained. Lack of subject expertise should not be allowed to kill the
inherent curiosity of learners and the joy and fun of learning. Commitment and passion characterise good
role models.
• Parents and learners should pursue every opportunity for reading, and should value all learning.

• Funding agencies should support large-scale, long-term and complex interventions and evaluations similar
to those undertaken by the QLP, given the sheer scale of the challenges of the education system.
• Evaluators should pursue sophisticated methodologies, designs, methods, models, data management and
statistical analyses, in keeping with complex programme characteristics. Secondary analysis, capacity
development, and evaluation should be supported.
• Book publishers should seek to disseminate findings from and information on interventions and studies
such as this one.
Background
The Quality Learning Project (QLP) was a multi-level, multi-site educational intervention that aimed to improve
learner performance in 524 South African high schools.
The QLP was underpinned by the understanding that mathematics and language are the foundations for all
further learning, and that teachers should foster the development of better mathematics, reading and writing
skills. For this reason, the improvement of mathematics, reading and writing skills was the main focus of the QLP.
In order to ensure that schools obtained effective support and monitoring from districts and that the good
practices gained from the project would be institutionalised, the programme also focused on the development of
district systems and officials.
In improving the quality of learning outcomes, the QLP adopted a systemic approach, which entailed improving:
• Learning outcomes in the languages of instruction and mathematics in Grades 8 to 12 in 524 schools;
• The teaching of mathematics, and reading and writing skills in 524 schools;
• The effectiveness of governance and management in 524 schools; and

• The effective management of 17 district offices in the nine provinces.
The QLP aimed to achieve the above by developing management capacity at district and school levels, and by
improving the classroom skills of teachers to enhance learner performance.
During the first two years of implementation, the key outcomes of the QLP were streamlined so that:
each provincial cohort of the QLP schools would, by the end of 2004, show an improvement in school
performance measured by overall learner performance with special emphasis on:
• A 10% improvement in mean overall matriculation pass rate;
• A 10% improvement in mean mathematics pass rate; and
• A 10% improvement in mean English Second Language pass rate,
against a comparable sample of control schools drawn for the province (JET QLP proposal).
Framework for the Evaluation Study
The final evaluation framework for the formative and summative evaluation studies was derived from the baseline
evaluation model applied to the QLP. Before the end of 2002, amendments were made to the original evaluation
model, and integrated into the theoretical position that underpinned the intervention programme and evaluation
project. The original hierarchical process model for evaluation was improved through these amendments. This
led to the addition of control schools, the extension of observational data to all sites, the concomitant reduction
in evaluation sample size, as well as various improvements and amendments to the instruments used for data
collection. (In addition, some factors associated with the availability and coverage of intervention data for the
period 2000 to 2002 determined that the effect of interventions would only be modelled for the purposes of the
summative evaluation.) The QLP therefore was a theory-driven intervention, and was underpinned and informed
by the following model:
IF the demands [to perform better] on the school and teacher are increased AND we enable the district
to provide high quality support to the schools AND we train the school governing bodies and school
management teams (SGBs, SMTs, etc.) to manage their schools more effectively AND we train the
teachers to teach mathematics and the languages better, THEN we should get improved TEACHING
QUALITY IN THE CLASSROOMS which WILL LEAD TO IMPROVED LEARNER PERFORMANCE.
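The annexures to this report present standardised path weights estimated for causal models built on this chain of reasoning. As a rough illustration of what such estimation involves, the following minimal Python sketch fits each link of a simple recursive path model by ordinary least squares on standardised variables. The variable names and the synthetic data are purely illustrative assumptions; they are not the actual QLP indices or the software used in the study.

    import numpy as np

    def standardise(x):
        # Convert a variable to z-scores so that regression coefficients
        # can be read as standardised path weights.
        return (x - x.mean()) / x.std(ddof=1)

    def path_weight(y, x):
        # Slope of standardised y on standardised x: one link of a simple
        # recursive path model, estimated by ordinary least squares.
        zx, zy = standardise(x), standardise(y)
        beta, *_ = np.linalg.lstsq(zx.reshape(-1, 1), zy, rcond=None)
        return float(beta[0])

    # Synthetic data following the hypothesised chain:
    # district support -> school functioning -> teaching quality -> performance.
    rng = np.random.default_rng(0)
    n = 70                                   # size of the QLP evaluation sample
    district_support = rng.normal(size=n)
    school_functioning = 0.5 * district_support + rng.normal(size=n)
    teaching_quality = 0.6 * school_functioning + rng.normal(size=n)
    learner_performance = 0.4 * teaching_quality + rng.normal(size=n)

    print("district -> school:     ", path_weight(school_functioning, district_support))
    print("school -> teaching:     ", path_weight(teaching_quality, school_functioning))
    print("teaching -> performance:", path_weight(learner_performance, teaching_quality))

In the actual study each link was estimated from the constructed functionality and intervention indices, and the resulting standardised weights are reported in Annexures 1 to 7.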
The interventions made by the programme at various levels of the education system (district, school and
teacher/classroom) were aimed at building institutional capacity to manage, support and monitor educational
activities between the district and the school, as well as between the school and the teacher.
Figure A demonstrates the articulation and operationalisation of each of the three levels of the QLP. It indicates
the primary expected outcomes at each level, and how these outcomes were defined operationally. The figure
also shows which indicators were identified as relevant and measurable, and finally points out the sources and
methods of evidence that had to be utilised at a given level.
The theoretical model provided conceptual coherence and integrity for the QLP interventions, as well as
analytical guidance for its evaluation.
Figure A: The QLP model at district, school and teacher level

EFFECTIVE FUNCTIONING OF DISTRICT OFFICE
• Effective organisational development, planning and management
• Effective HR management
• Effective financial management
Effective school support
Effective school monitoring
Effective school development planning

EFFECTIVE FUNCTIONING OF SCHOOL
• Effective school management
• Effective human resource performance monitoring
• Effective school administration (tracking of learners)

EFFECTIVE CURRICULUM MANAGEMENT
• Monitoring delivery of curriculum
• Support of teachers
• Instructional leadership

EFFECTIVE TEACHER
• More effective management of learning programmes
• Improved assessment practices
• More effective use of Learning Support Material (LSM)

OUTCOMES
• Improved learner participation in class
• Improved learner performance
Evaluation Methodology and Research Design
The QLP Steering Committee, comprising staff from JET Education Services, the Business Trust, the Department
of Education (DoE), and the HSRC (through representatives from the evaluation team), accepted the improved
logic model for the QLP, which made explicit the details at the different levels, focused the evaluation, and
integrated the evaluation with the intervention programme.
A core feature of this project was the establishment of an explicit and common framework to ensure that all
intervention and evaluation activities would be aligned over time. For the evaluation study, a disadvantage was
that some continuity was sacrificed between the initial evaluation model and the new evaluation framework. The
former had already served as the basis for the HSRC’s instrument development and data collection for baseline
purposes. The longer-term benefits of these changes, though, outweighed the short-term delays and
discontinuities related to the baseline study.
A first implication of the change from the baseline study was that many of the instruments had to be amended.
Although strong efforts were made to keep core design aspects as consistent as possible, the changes did imply
that specific questions were improved upon. The changes also meant that certain sources of data or the ways in
which data was collected were adjusted to modes that were considered more likely to render reliable and valid
findings (for example, a shift was made from questionnaire responses to observations).
A second change to the earlier evaluation studies was the evaluation sample. The reduction in the number of
experimental schools to 70 in 2002 from 102 in the original 2000 baseline sample implied a certain loss in data
comparability over time, but brought the benefit of added commonality (and with it the opportunity for
triangulation) between the data sources underpinning the self-reporting and performance measurement
instruments, and the case studies and observations.
Thirdly, the revised evaluation model included 16 control schools from across four of the QLP districts. This
feature had not added value during the mid-term evaluation, but became important for the trend analyses from
2002 to 2004. The inclusion of control schools in the evaluation assisted in the evaluation of the performance of
the experimental schools undergoing interventions against the control schools, which did not receive any
interventions. As a result, the initial disadvantages of the loss of continuity in monitoring programme impact at
the mid-point of the programme were turned into advantages for the final summative evaluation phase in 2004.
Final 2002 and 2004 evaluation sample figures are shown in Figure B and Tables A and B.
Figure B: Number of schools in the evaluation survey and case-study samples. Total QLP schools: 524; experimental schools: 70; control schools: 16.
Table A: Total sample obtained for mid-term and summative evaluations
Target group QLP 2004 QLP 2002 Control 2004 Control 2002
Learners¹ 2 033 2 067 368 430
Teacher questionnaires 271 259 48 46
Class observations 403 405 79 84
School principals 66 67 12 14
Circuit managers 39 29 - -
District managers 15 17 - -
Mathematics learning area specialists 11 15 - -
Language learning area specialists 11 13 - -
Table B: Number of schools sampled per district²
Province and district | Number of QLP schools per district | Survey sample in 2000 | Site-visit sample in 2000 | QLP sample in (2002)/2004 | Control schools in (2002)/2004³
Eastern Cape*
Lusikisiki 21 1 1 (2)2
Flagstaff 31 5 3 (3)3
Libode 37 7 3 (7)7
Free State
Thabo Mofutsanyana 29 6 2 (4)4 (3)3
Gauteng
Johannesburg South Mega 39 4 2 (4)4 (3)3
Sedibeng West 27 4 2 (4)4
KwaZulu-Natal
Inanda 21 4 2 (3)3
Ixopo 27 6 2 (4)4
Ubombo 27 6 2 (4)4
Mpumalanga *
Moretele 32 10 2 (4)4
North West
Mafikeng 31 3 2 (3)3
Zeerust 36 12 2 (5)4 (4)3
Northern Cape
Karoo 32 6 2 (4)4
Limpopo
Bolobedu 30 10 3 (4)4 (1)1
Konekwena 36 6 2 (5)5 (2)1
Zebediela 24 6 2 (2)2 (1)1
Western Cape
Western Cape Metro East 34 6 2 (5)5
Total 514 102 36 (67) 66 (14) 12
1 Figures are based on Grade 9 reading and writing instruments - i.e. the lowest number of learners per school.
2 Districts* were restructured after the sample was selected, which resulted in a skewed distribution of schools selected in the final sample.
Also, one school was later classified under the Lusikisiki district, and not under Flagstaff.
3 Reflects the realised sample.
A useful component of the summative evaluation was the availability of sound and complete intervention
data. However, the conversion of information collected for quality control and monitoring purposes into
intervention data for the mid-term evaluation and report early in 2003 proved to be an insurmountable task. As a
result, it was decided not to incorporate correlation analyses in the mid-term report, but rather to re-design the
format in which the intervention information would be collected. This was also done with a view to converting
the information into indices and indicators in the same way as the other QLP evaluation data had been converted.
Against this background, it comes as no surprise that many lessons were learnt as the programme interventions
and evaluation processes unfolded. One could not expect much different from a first venture of this kind in terms
of content, rigour and scale. The expectation that the summative report would provide many valuable findings
has thus been fulfilled.
The characterisation of the HSRC’s role as both “independent” and “formative” evaluator proved to be a useful,
but sometimes contradictory, requirement. In the formative sense, participation often occurred to share early
insights, which were used to direct subsequent interventions and their implementation. This participation took
place immediately after the baseline and mid-term reports, during many feedback forums, strategic planning
sessions, programme management and steering committee meetings, and service provider discussions. Such
contact also had the benefit of sensitising the evaluation team to many of the dynamics of the QLP programme.
This increased the understanding of the evaluation team about many of the issues at stake, and of how to collect
reliable and objective data on such issues. It also conveyed much insight into the process and benefits of a
scientific evaluation to future beneficiaries and other stakeholders. However, some contamination of objectivity
may have occurred, as respondents often formulated ideas, some realistic and others not, about desirable
outcomes anticipated by the evaluators.
The evaluation team had to surrender some control of the approach followed in the collection of data. For
example, intervention data was obtained from the QLP management team, while district-level information was
collected by QLP provincial co-ordinators. This was in some ways a useful occurrence, as liabilities associated
with strict independence, such as alienation and the non-credibility of the evaluator, could have led to losses in
terms of access to, and collaboration with, the programme participants.
All said, a reasonable compromise on the processes for the summative evaluation was achieved. The complexity
and magnitude of both the intervention programme management as well as the evaluation process should not be
underestimated. The instrument development, data management, and analytical skills required were extensive,
as were the skills required for research design, sampling, and methodology in general. Continuity and critical
mass in terms of human resource expertise were therefore central to the success of the project. In addition, the
logic model that drove both the interventions as well as the evaluation provided a unifying approach to the
project that secured coherence and integrity for the whole venture.

Evaluation Criteria
In addition to the path analysis that the HSRC undertook, one could at the outset compare the QLP
(experimental) and control schools in terms of available information, although this comparison falls outside the
original sampling and design decisions taken at the beginning of the evaluation project. Reasonably complete
matriculation statistics are available for this purpose. As a result, and subsequent to the preliminary discussions
and findings presented in December 2003 as a separate report (Grade 12 results of QLP schools: Supplement to
the QLP mid-term evaluation (2002/2003)), additional analyses along three dimensions were undertaken. These
were to determine:
(a) The increase in the absolute number of learners passing their matriculation examinations as well as
English (HG) as an indication of the quantitative improvement of learner results;
(b) The increase in the number of learners passing with university exemption, and with mathematics at
Higher Grade, rather than Standard Grade, as an indication of an improvement in quality of the learner
results; and
(c) The increase in matriculation pass rate, as an indication of improved efficiency in learner results.
One advantage of this approach is that the control and QLP schools could be compared more directly. In four of
the provinces, the initial allocation of schools, and early changes to this allocation, resulted in a small number
of schools falling outside the QLP eventually. These schools formed the control-school sample. In these cases, it
can be assumed that district contextual and impact factors had been quite similar for the QLP and control
schools, which, by assumption, differed only or mainly in whether or not they received QLP interventions. As
a result, differences in learner-performance changes over time can be ascribed to the QLP interventions,
especially at the school and teacher or classroom levels.
For further consistency with the approach followed thus far in the evaluation, it was decided to retain the
sample of 70 QLP and 16 control schools. Given that control schools existed in only four provinces, two sets
of comparisons were made. First, direct comparisons were made between QLP and control schools for the four
provinces in which control schools were available. Second, broader comparisons were made between QLP
schools in all nine provinces and control schools located in the four provinces.
Such analyses would also be in the spirit of initial intentions to keep a close watch on the improvement in the
matriculation results of the schools participating in the QLP as one of the criteria set for evaluating the project’s
success. Another basis for such a comparison is the initial criteria set for evaluating the key outcomes of the
QLP,⁴ already cited verbatim in the background to this report.
4 QLP proposal to Business Trust (1999).
Success of the Quality Learning Project
The report focuses on two questions. First, did the QLP model and intervention programme make a difference to
teaching and learning in terms of both processes and outcomes (more bluntly, was the QLP worth the effort)?
Second, how can the lessons learnt from the QLP be made applicable to other, similar programmes?
Table C portrays the findings based on the refined criteria used in the supplementary report released in December
2003. Data for the control and QLP schools is directly comparable for only four provinces. However,
comparisons were also made between the data of these control schools and the QLP schools from all nine
provinces.
The increase in the numbers of learners that had passed was calculated for all QLP schools, then aggregated by
QLP district, and totalled for all QLP schools in the nine provinces, and for the QLP schools in the four provinces,
where control schools were also present. The percentage change from 2000 to 2004 was then calculated in each
instance. The same procedure was followed with regard to the numbers of learners passing Grade 12 with or
without university exemption (or endorsement), mathematics (HG and SG), and English (HG, Second Language),
as well as the overall matriculation pass rate. Towards the bottom of Table C, comparisons are reported between:
(1) the QLP schools in all nine provinces and the control schools in the four provinces; and (2) the control schools
in the four provinces and the corresponding QLP schools in these provinces.
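As a minimal sketch of the two calculations described above, the following Python fragment first computes the percentage change in pass numbers for one hypothetical school, and then the percentage-point comparison reported in Tables C and D, using the published four-province totals for overall passes from Table C (the other cells of Table C follow the same pattern). The single-school figures are illustrative assumptions only.

    def percentage_change(count_2000, count_2004):
        # Percentage change in a count of learners between 2000 and 2004.
        return (count_2004 - count_2000) / count_2000 * 100.0

    # Hypothetical school for illustration: 80 passes in 2000, 92 in 2004.
    print(round(percentage_change(80, 92), 2))        # 15.0 (per cent)

    # Percentage points by which the QLP-school improvement exceeds the
    # control-school improvement (four-province totals for overall passes,
    # Table C, row "Q4 > C4").
    qlp_change = 12.24        # % change in overall passes, QLP schools
    control_change = -4.60    # % change in overall passes, control schools
    print(round(qlp_change - control_change, 2))      # 16.84 percentage points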
Table C provides information on the change in Grade 12 learner performance, aggregated by province, from
2000 to 2004 in terms of:
• The number of learners passing their matriculation examinations (e.g., 53 or 10.3% more learners in the
Eastern Cape passed their matriculation examinations in 2004 compared to 2000);
• The number of learners passing their matriculation examinations with exemption;
• The number of learners passing English HG;
• The number of learners passing mathematics HG and mathematics SG; and
• The overall pass rate.
A summary of these results is presented in Table D in terms of the quantity, quality and efficiency indicators
developed to determine the success of the QLP programme.
Table C: Change in Grade 12 learner performance between 2000 and 2004 across QLP and control schools by province
Province/group | Overall passes (No., %) | Overall exemptions (No., %) | English HG (No., %) | Maths HG (No., %) | Maths SG (No., %) | Overall pass rate (% pts, %)
QLP Schools
EC 53 10.33 35 145.83 9 0.82 14 1400.00 159 47.18 11.46 38.73
FS 78 147.17 19 271.43 5 2.72 1 32 266.67 39.10 222.07
GP 206 46.50 84 158.49 262 40.49 31 1033.33 40 31.25 20.79 41.72
KZN 111 28.32 39 58.21 274 63.43 0 126 92.65 23.29 56.10
MP -49 -28.65 -8 -42.11 -380 -57.23 0 8 24.24 3.20 9.50
NW -53 -16.83 -11 -33.33 39 8.41 15 500.00 43 38.39 -1.52 -2.77
NC -19 -14.39 10 90.91 -36 -22.09 1 20.00 -3 -9.38 22.11 35.17
LM -57 -9.34 25 21.74 177 23.98 8 71 77.17 12.57 24.09
WC 135 39.47 1 3.57 214 54.18 11 115 155.41 18.36 36.88
All QLP 405 13.63 194 54.34 564 11.79 81 324.00 591 61.82 15.72 37.12
4 QLP 174 12.24 117 56.25 483 23.76 55 916.67 186 54.07 15.48 31.95
Control Schools
FS -24 -12.06 6 11.76 -182 -47.52 -5 -15.15 18 26.87 25.84 63.51
GP 71 16.67 12 10.91 390 153.54 -19 -24.68 133 114.66 5.15 7.65
NW -84 -26.42 -40 -40.82 -337 -92.58 4 17.39 26 20.47 -2.13 -2.69
LM -13 -9.03 6 20.00 -16 -8.84 10 5 16.13 31.25 70.97
4 Contr -50 -4.60 -16 -5.54 -145 -12.27 -10 -7.52 182 53.37 13.94 23.75
Differences between QLP and Control-school Performances *
Q > C4 455 18.23 210 59.88 709 24.05 91 331.52 409 8.45 1.78 13.38
Q4 > C4 224 16.84 133 61.79 628 36.03 65 924.19 4 0.70 1.53 8.20
* In the first row, numbers and percentages pertaining to learners from QLP schools in all nine provinces are
compared to the control-school figures from the four provinces, while in the second row, direct comparisons
are made between the statistics of learners from the QLP and control schools in the four provinces for which
equivalent data was available.
It is clear from Tables C and D that the initial project objectives were met almost without exception. (It is useful
to point out that JET Education Services made similar comparisons, for purposes of quality control and programme
management, using information on all 524 schools and taking provincial and national figures and trends from all
non-QLP schools as the comparative basis. These analyses also overwhelmingly reflect the outcomes listed
above.)
Table D: Indicators at Grade 12 level of the success of the QLP (from 2000 to 2004)
Selected indicators at school level (Grade 12) | Percentage points by which improvement in QLP schools is higher than in control schools
QUANTITY OF OUTPUT
Number of learners passing matriculation examinations 16.84
Number of learners passing English Second Language (HG) 36.03
QUALITY OF OUTPUT
Number of learners passing with endorsement (exemption) 61.79
Number of learners passing mathematics (HG) 924.19#
Number of learners passing mathematics (SG) 0.70*
EFFICIENCY OF OUTPUT
Overall school matriculation pass rate 8.20
# The very low baseline of 6 in QLP schools increased by 55, resulting in this high percentage point increase.
For control schools, 133 was reduced by 10 to show a decline of 7.52 percentage points.
* QLP schools were discouraged from having an increase in the number of learners in this category; hence,
the change in quality of output was not significantly higher than that obtained in the baseline study.
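The footnoted mathematics (HG) figure can be reconstructed from the baselines given above; the arithmetic, in outline, is:

    \frac{55}{6} \times 100 \approx 916.67\%, \qquad
    \frac{-10}{133} \times 100 \approx -7.52\%, \qquad
    916.67 - (-7.52) \approx 924.19 \ \text{percentage points.}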
In terms of quantity improvements (that is, the overall number of learners passing matriculation examinations),
the increase in QLP schools was approximately 17 percentage points greater than that in control schools. For
learners passing English (HG, Second Language), the difference, when compared across the four provinces that
comprised both QLP and control schools, was 36 percentage points in favour of the QLP schools. However, when
QLP schools in all nine provinces were compared to the control schools from the four provinces, the difference
was lower, at 24 percentage points.
In terms of quality improvements (that is, the number of learners passing their matriculation examinations with
endorsement, and with mathematics at HG), QLP schools managed to achieve increases in matriculation
exemptions that were 60 to 62 percentage points higher than those of the control schools. The growth trend
towards producing candidates with a pass in mathematics at HG level is also markedly more dramatic in QLP
schools, at 332 percentage points (for QLP schools from all nine provinces against control schools in four
provinces) and at 924 percentage points (direct comparison across four provinces) higher than control schools.
However, very low baselines contributed to some inconsistency and to the seemingly large changes. The QLP
and control schools were rather similar in terms of the increases in the number of candidates passing
mathematics at SG level. This finding can be seen as positive, in that QLP schools were deliberately encouraged
to prepare teachers and learners to consider offering mathematics at a higher level only.
Improvement in efficiency (based on the overall increase in the matriculation pass rate) was eight to 13
percentage points higher in QLP schools than in control schools. Given that only potentially successful candidates
could have been allowed through from Grade 11 to Grade 12, this figure is not as meaningful as the figures
indicating quality and quantity improvements. Nevertheless, the comparative picture is informative.
Pass-rate changes (as shown in the technical report) were also compared for Grade 12 English (HG) and
mathematics (SG). In the case of English, improvements in QLP schools were greater than those in control schools
by 19 percentage points (34% improvement in QLP schools above the 15% in control schools) when direct
comparisons were made in the four provinces. The figure dropped to 11 percentage points when the full QLP
group in all nine provinces was considered (26% improvement in QLP schools above the 15% in control schools).
In the case of Grade 12 mathematics (SG), improvements in QLP schools exceeded those in control schools by
just over seven percentage points (94% improvement in QLP schools above the 87% in control schools) in direct
comparisons made in the four provinces. The figure changed to three percentage points in favour of the control
schools when the full QLP group in all nine provinces was considered (the 84% improvement in QLP schools is
below the 87% in the control schools).
Results of the QLP Evaluation
Analysis was conducted to determine the effect of the QLP interventions on the functioning of districts, schools
and classrooms as well as on the performance of learners.
District-level Functioning
Of all the levels in the education system, the district level seemed to be most directly affected by early
restructuring processes in the Department of Education (DoE). As late as 2004, 13 districts indicated that they
had undergone restructuring, with some indicating that such events had occurred five times. It is reasonable to
assume that provincial restructuring exercises would have undermined the QLP model and impacted negatively
on key elements for improving the quality of schooling.
Figure C shows the change in district-functioning levels from 2002 to 2004 (see scale on the left-hand side). These
changes are indicated by the bars in the figure. For district functionality, change scores ranged from -6 to +6. A
moderate improvement is observed when all the district scores are combined. While district functioning in
Gauteng, KwaZulu-Natal and Limpopo provinces deteriorated, it showed a marked improvement in the Free State
and Northern Cape provinces and in the district of Zeerust.
Markers (black dots) indicate the level of district functioning in 2004, and represent the index scores (indicated
on the right-hand side in the figure), which range on a scale from 0 to 13. Score values below 4.5 can be
considered low, while those above 9.5 are high; scores between these values are considered moderate. Most
districts still function only at moderate levels, with low functionality noted in the Limpopo province and the
Sedibeng West district. (See Tables 3.14 and 3.15 in the technical report for additional details.)
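As a small illustration of the banding just described (a sketch only: the cut-offs are those cited above, while the district names and scores are hypothetical), the following Python fragment classifies index scores on the 0 to 13 scale.

    def functionality_band(index_score):
        # Band a functionality index score on the 0-13 scale using the
        # cut-offs described in the text: below 4.5 is low, above 9.5 is
        # high, and anything in between is moderate.
        if index_score < 4.5:
            return "low"
        if index_score > 9.5:
            return "high"
        return "moderate"

    # Hypothetical scores; the actual district index scores appear in
    # Tables 3.14 and 3.15 of the technical companion report.
    for district, score in [("District A", 3.8), ("District B", 7.2), ("District C", 10.1)]:
        print(district, score, functionality_band(score))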
Figure C: District-functionality levels in 2004 and change from 2002 to 2004
The functionality of districts was measured on the basis of 13 indices, which were constructed according to the
QLP model (see Figure A). Scores for 11 indices fell in the moderate category. The two exceptions were the
existence and use of organograms, which fell in the high category, and appropriate financial management
practices, which fell in the low category. A noteworthy improvement from the 2002 functioning levels is evident
both in terms of individual indices and the overall district-functionality index.
Results of the statistical modelling reveal that QLP interventions affected district functioning (especially during
2003/4), as well as functioning at other levels of the system; namely, school and classroom levels. Evidence also
indicates that the programme targeted interventions at those points where the needs were greatest.
The QLP model posited that districts drive the improvement of learner performance, mainly through school
support and monitoring. The latter specifically covers the quality of school practices pertaining to school
management and development, curriculum management and development, and teacher-learner interactions in
the classroom.
Functioning levels are still moderate for the majority of indicators, with the financial management indicator of
greatest concern. However, the fact that most of the individual indices and the overall functionality index
improved from 2002 to 2004 is one of the most exciting outcomes of this evaluation.
Modelling suggests that these increases can partly be attributed to the interventions (district and other), especially
to the principle of efficiency of resource usage, which was applied by targeting interventions where the need was
greatest.
The “distance” between districts and classrooms/learners is perhaps still quite large, and the implementation of
a district-based school-development model is still in its infancy. This situation, paired with continued instability
and restructuring, and large numbers of dysfunctional districts in some provinces, could explain the absence of
larger or more immediate effects of interventions on the improvement of districts and learner performance.
It is recommended that the rationale, model and logic of district-based school improvement be accepted
as sound. However, lack of resources and capacity, poor infrastructure, and related issues still pose a
vast challenge. Despite these challenges, this project indicates that good order, discipline and the will to
succeed can make the district level a strong force in facilitating desired change in schools.
Many of the comments made in the mid-term report remain valid, and it is worth being reminded about these:
• Instability, inappropriate structures and lack of resources at the district level of the education system impact
dramatically on the transition between the general and further education and training bands.
• The lack of parity between the ranks and roles ascribed to various officials in districts across the provinces
creates uncertainty, and issues related to reporting lines, hierarchies, and authority need to be clarified so that
interference with school (teacher) support and monitoring can be avoided.
• More critical comparisons are needed between the school-effectiveness and school-improvement approaches
(more recently referred to as the outside-in and inside-out approaches to school improvement), especially
with a view to interrogating existing practices and policy making within the national education system.
School-level Functioning
Figure D shows the change in overall school functionality from 2002 to 2004 by district, for QLP and control
schools. Change is indicated by the bars in the figure, against the scale on the left-hand side. For school-level
functionality, change scores ranged from -3 to +8. A moderate improvement in school-level functioning scores is