District ranking based on SSLC results – a zero-sum game

This is an analysis of the practice of "district ranking" of SSLC results, and of what needs to be done with those results to improve educational outcomes.

A shorter version of this article was carried by Deccan Herald on May 22, 2015.

“Udupi tops the state” and “Gadag slides to the bottom” are two headlines for this year's SSLC results in Karnataka. Every time SSLC public examination results are announced in a state, a key media highlight is the district ranking, which lines up districts from top to bottom based on the pass percentage of students from each district.

The education head of Gadag district is quoted as saying woefully, “In 2012-13 the district stood at 18th place and ... this year we expected to be within 10th place”, whereas Udupi is delighted at having moved up from 9th rank last year to 1st this year. The business of ranking leaves one wondering how all districts can hope to improve their rank. If some districts 'go up', others have to 'go down'. If a district is 'at the bottom', does that mean that the quality of education there has actually declined, or that no improvements have taken place?

The district pass percentage figure by itself gives no analytical insight to the public, or to the officials and professionals working in the education sector, on what needs to be done. To be useful for decision making, it must be broken down by subject, by block and by school. Two out of three students who appeared for the exam from Gadag have passed; what needs to be done for the remaining one-third? More resources for students? More teacher training, and of what kind? More testing, or less? The average pass percentage reveals no answer to these questions. But what does it hide?

Take the case of Bangalore South district, which is ranked 29th of the 34 districts. A large number of schools with 100% results are also from this district; Bangalore South stands 3rd in the state on this latter statistic, with 139 schools showing 100% pass results. This suggests that the average pass percentage hides a stark variation in the results. There are some 'high performing' schools and some 'very poorly performing' schools, and the average of these has placed Bangalore South in the 'not too bad' 29th position, which is quite misleading. At a minimum, disaggregating results by management (private, aided and government schools) and by subject is necessary. The fixation with the “100% pass” has also resulted in little or no attention being paid to how well the students have done: how many just passed and how many got a distinction or first class is also an important indicator of school quality.

Besides not revealing answers and hiding truths, the district rankings tend to generate intense pressure on districts to “perform”, leading to multiple preparatory examinations, coaching classes, guides and answer keys, and even to encouraging or turning a blind eye to copying. A significant part of teacher energies is devoted to 'Mission 40', in which the focus is on drilling students to get the “right answer” to 'likely questions' so that they secure the minimum 40% pass marks; this may not be associated with any actual learning. Udupi credits its “Mission 40” with its jump in the rankings!

Even if we accept that the SSLC pass percentage is only a proxy for school quality, we cannot deny that pass percentages can give some idea, especially in the case of outliers (schools with very high or very low pass percentages). So what is to be done? What kinds of analysis should be done on the SSLC results to drive school improvement as well as policy and programmatic correction?

Performance by school, by subject, across time can help identify 'schools with challenges', but this needs to be done at the block level, the lowest tier in the high school system. If a school's pass percentage is much lower than the average for its block, there is a need to investigate its performance. The 'outliers' (both the 'high' and the 'low' performers) can provide insights for action, provided meaningful investigations are made. The key here is the keenness to understand, through 'investigation', what can help the schools, rather than to 'punish'.
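To make this concrete, here is a minimal sketch of such a block-level check in Python. The file name ("block_results.csv"), its columns (school, block, pass_percentage) and the 10-point threshold are all illustrative assumptions, not an official data format or cut-off.

```python
import csv
from statistics import mean

# A minimal sketch: flag schools far from their block's average pass percentage.
# Assumes a hypothetical "block_results.csv" with columns:
#   school, block, pass_percentage  (one row per school)
with open("block_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Group pass percentages by block
by_block = {}
for r in rows:
    by_block.setdefault(r["block"], []).append(float(r["pass_percentage"]))

# The 10-point gap is an arbitrary illustration, not an official threshold.
THRESHOLD = 10.0
for r in rows:
    block_avg = mean(by_block[r["block"]])
    gap = float(r["pass_percentage"]) - block_avg
    if abs(gap) >= THRESHOLD:
        label = "well above" if gap > 0 else "well below"
        print(f"{r['school']} ({r['block']}): {gap:+.1f} points, {label} block average")
```

The point is not the particular threshold but the habit of comparing a school against its own block, so that outliers at both ends surface for investigation rather than for punishment.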

Investigations would certainly include discussing with the teachers and head teacher the challenges the school faces and enlisting their views on the underlying causes. They would not be restricted to 'examining records' (high school teachers now need to maintain a large number of records, including for Continuous Comprehensive Evaluation, which, many complain, takes away from instruction time), or to randomly testing students with questions to 'assess' learning levels or teaching quality.

Such an investigation may throw up interesting observations. It may show that most schools are close to the block average, and hence that the challenge is not with a particular school or set of schools but with larger issues relevant across the block. It may show that some schools are doing 'poorly' in most subjects, while others are doing 'poorly' in only some subjects. Looking at the data across a few years is important; numbers from a single year will not tell the full story. What is important to glean is whether there is a pattern of poor performance in any subject across years. What are the correlations between subject-wise pass percentages? How does the pass percentage vary across different languages of instruction? What is the correlation between school strength and pass percentage? How many students have learning difficulties?
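A sketch of how a school or block could look for such multi-year subject patterns is given below. It assumes a hypothetical file ("school_subject_results.csv" with columns year, subject, appeared, passed); the actual layout of data from the examination board may differ.

```python
import csv
from collections import defaultdict

# A minimal sketch: subject-wise pass percentage for each year, so that a
# subject which stays low across years stands out as a sustained pattern
# rather than a one-off dip.
# Assumes a hypothetical "school_subject_results.csv" with columns:
#   year, subject, appeared, passed
totals = defaultdict(lambda: {"appeared": 0, "passed": 0})
with open("school_subject_results.csv", newline="") as f:
    for r in csv.DictReader(f):
        key = (r["subject"], r["year"])
        totals[key]["appeared"] += int(r["appeared"])
        totals[key]["passed"] += int(r["passed"])

for (subject, year), t in sorted(totals.items()):
    pct = 100.0 * t["passed"] / t["appeared"] if t["appeared"] else 0.0
    print(f"{year}  {subject:<15} {pct:5.1f}%")
```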

If the data does point to challenges, in terms of sustained 'poor performance' in a subject or set of subjects in a school, this suggests that the school requires the urgent attention of the support system: local administration support (Block Education Office), teacher academic support (Block Resource Centre, District Institute of Education and Training), community members and institutions. It is important to guard against the premature conclusion that 'the teacher is not doing her job', and instead to begin a mutually respectful dialogue with the teachers and the school administration to improve the quality of education.

Such a dialogue may reveal deeper substantive issues, such as the quality of the teaching-learning environment, the availability of libraries and laboratories, teacher education and preparation, the relevance of curricular materials to student contexts and their connection to application and real life, the scope for physical education, music, art and other 'co-curricular' activities, student engagement, parental involvement and awareness, and sensitivity to adolescent issues. Each of these will require custom responses depending on the school's context and needs; a one-size-fits-all approach is unlikely to work. We will not explore these here, but will merely point out that solutions will tend to comprise a complex set of measures addressing different challenges, not a simplistic silver bullet to 'change the teacher's mindset'.

One important component of improving school performance is helping schools analyse their own performance. The entire assessment data is 'computerised' and is usually available only with the state examination board. Schools do get the student mark sheets, but these are handed out to students, leaving the school without consolidated data on individual students. Since the data is held in digital form, it is eminently possible to provide student-wise marks to every school and block institution. Teachers and head teachers need to be trained and empowered to use spreadsheets (software available on any computer) to analyse the marks: the break-up of pass percentage by subject, the range of student marks in each subject, and patterns across years. Similarly, block-level analyses of schools by subject and across years can be done using spreadsheets. Data visualisation would show trends and problems far more sharply than a table of data would. Aggregated data on marks by kind of question (application, conceptual etc.) is also available with the examination board and must be shared with teachers for analysis and for improving educational outcomes.
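Spreadsheets are entirely adequate for this. For those comfortable with a little scripting, the same trend view can also be drawn with a few lines of Python and matplotlib, as in the illustrative sketch below; it uses the same hypothetical "school_subject_results.csv" described earlier and assumes one row per subject per year.

```python
import csv
from collections import defaultdict
import matplotlib.pyplot as plt

# A minimal visualisation sketch: one line per subject, so a persistently
# weak subject stands out at a glance.
# Assumes a hypothetical "school_subject_results.csv" with columns:
#   year, subject, appeared, passed  (one row per subject per year)
series = defaultdict(dict)  # subject -> {year: pass percentage}
with open("school_subject_results.csv", newline="") as f:
    for r in csv.DictReader(f):
        appeared, passed = int(r["appeared"]), int(r["passed"])
        if appeared:
            series[r["subject"]][int(r["year"])] = 100.0 * passed / appeared

for subject, by_year in series.items():
    years = sorted(by_year)
    plt.plot(years, [by_year[y] for y in years], marker="o", label=subject)

plt.xlabel("Year")
plt.ylabel("Pass percentage")
plt.title("Subject-wise pass percentage across years (illustrative)")
plt.legend()
plt.savefig("subject_trends.png", dpi=150)
```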

Finally, see the visual below, an analysis of the district rankings of 2015 and 2014. When one looks at the graph, it is obvious that the variation amongst districts is not significant; the numbers move within a rather narrow range. Then what is all this fuss about?

[Figure: District pass percentages for 2015 and 2014, arranged in descending order of 2015 results. The inter-district variations lie within a narrow band.]
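The 'narrow band' claim is easy to check once the district-wise figures are in hand: compute the spread. The sketch below assumes a hypothetical "district_results.csv" with columns district, pass_2015, pass_2014; it fabricates no numbers and simply reports the range and standard deviation for each year.

```python
import csv
from statistics import mean, pstdev

# A minimal sketch: how wide is the spread of district pass percentages?
# Assumes a hypothetical "district_results.csv" with columns:
#   district, pass_2015, pass_2014  (one row per district)
with open("district_results.csv", newline="") as f:
    rows = list(csv.DictReader(f))

for year_col in ("pass_2015", "pass_2014"):
    values = [float(r[year_col]) for r in rows]
    print(
        f"{year_col}: min {min(values):.1f}%, max {max(values):.1f}%, "
        f"mean {mean(values):.1f}%, std dev {pstdev(values):.1f}"
    )
```

If the standard deviation turns out to be only a few percentage points, moving a district a few ranks up or down says very little about real change on the ground.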

District ranking seems simply a game played to identify 'top ranking districts' and feel good about them, while in the process condemning the 'bottom districts' as no good, effectively obscuring the need to study the real challenges of school improvement and to provide the resources for tackling them. The ranking of students is itself quite problematic, yet it can be justified on the basis that seats in higher education are limited and the rank becomes an 'objective' basis for deciding admission. In the case of district-level performance there is no such compulsion. Education authorities, non-governmental actors, academic institutions and the media would be well placed to ignore this meaningless statistic and instead act in tandem, through creative collaboration at the micro-local level, to improve the quality of education.