It’s just a year since I met the development team at the school analytics service SISRA and we discussed what seemed a rather crazy idea at the time: what if we asked schools whether they were willing to collaborate by sharing their data? If we did, we could solve one of the current headaches of school leadership: knowing with reasonable confidence, as soon as possible and without waiting weeks for official figures, how well your school has done against the Progress 8 performance measure.
The most affirming thing I learned from this exercise was that schools can, and will, cooperate with each other where there is both trust and a common purpose. In the end, over 1,100 schools covering more than 180,000 pupils agreed to share their data, which was anonymised at the point of collection. Schools of all types opted to share their data, and when we think about the accountability climate in recent years and how this has driven competition between schools, this is incredibly reassuring.
However, this year we’ve witnessed something of a backlash against data. Headline measures such as Progress 8 and the EBacc have been seen as potent ingredients in an increasingly toxic accountability mix. They’ve been blamed for encouraging unethical behaviour such as off-rolling pupils, for the introduction of distorted or inappropriate curriculum models and, ultimately, for upward pressure on teachers’ and school leaders’ workloads. Fear of the floor and coasting standards, both functions of performance measures, has made posts less attractive in the very schools which most need great leaders.
Using data properly
As we wait for the 2018 results and another batch of these measures, it’s worth reminding ourselves where it’s important to use data properly: understanding whether each pupil in the school did as well as he or she could in each of their subjects. Despite the political rhetoric to the contrary, schools actually vary little from one another. But in-school variation (the difference between the best performing subjects and the worst in any particular school) can be stark, and should be the main focus of our attention.
Mike Treadaway from FFT once said to me that if pupils in every school performed universally as well as they had in their best performing subject, not only would we greatly improve the disadvantaged attainment gap, we would also be the top performing jurisdiction in the world. And all that would come without any intervention from politicians of any persuasion. It’s an attractive thought, although Ofqual would have something to say about it.
Getting it right
Understanding how well individual subjects have done is very important and it’s essential to get it right. I’ve often heard from our members that they use the EBacc section of Attainment 8 to set targets and evaluate performance, but this is comparing apples with pears. Not all EBacc subjects follow the same grade distribution against prior attainment: English and maths are different from each other, modern languages is very different from all the others, and lumping them together can lead to incorrect conclusions.
To compare apples with apples we need tools which compare the performance of pupils in an individual subject to the performance of all pupils who took that subject nationally. This is best done using ‘single subject value added’, or using transition matrices. These both link grades to prior attainment. My preference is transition matrices because they show the whole spread of performance, not just whether pupils are above or below average.
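The idea behind single subject value added can be sketched in a few lines: compare each pupil’s actual grade with the national average grade achieved by pupils with the same prior attainment, then average those differences across the cohort. The band labels and expected grades below are invented for illustration, not real DfE figures.

```python
def value_added(pupils, national_expected):
    """Single-subject value added: mean of (actual grade minus the
    national average grade for pupils with the same prior attainment).

    pupils: list of (ks2_band, actual_grade) pairs
    national_expected: {ks2_band: national mean grade for that band}
    """
    residuals = [grade - national_expected[band] for band, grade in pupils]
    return sum(residuals) / len(residuals)

# Invented expected grades by KS2 band, purely illustrative
expected = {"3a": 3.8, "4b": 4.9, "5a": 6.7}
cohort = [("3a", 4), ("4b", 5), ("5a", 6)]
print(round(value_added(cohort, expected), 2))  # -0.13: slightly below national
```

A score near zero means the subject performed roughly in line with national expectations; a single number like this is convenient, but, as noted above, it hides the spread of performance that a transition matrix makes visible.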
The transition matrix for maths in 2017 shows the picture of what grades were plausible and reasonable. For example, it shows that nationally, very few pupils with 3a at Key Stage 2 managed to get a grade 5 or higher, and that anything below a grade 6 for 5a pupils would be very disappointing.
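Mechanically, a transition matrix is just a cross-tabulation: for each prior-attainment band, what percentage of pupils went on to achieve each grade. A minimal sketch, using invented sample data rather than the actual national figures:

```python
from collections import Counter, defaultdict

def transition_matrix(pupils):
    """Cross-tabulate GCSE grade against KS2 prior-attainment band.

    pupils: iterable of (ks2_band, gcse_grade) pairs.
    Returns {ks2_band: {grade: percentage of that band}}.
    """
    counts = defaultdict(Counter)
    for band, grade in pupils:
        counts[band][grade] += 1
    matrix = {}
    for band, grades in counts.items():
        total = sum(grades.values())
        matrix[band] = {g: 100 * n / total for g, n in grades.items()}
    return matrix

# Illustrative (invented) data: four pupils who were 3a at Key Stage 2
sample = [("3a", 3), ("3a", 4), ("3a", 4), ("3a", 5)]
tm = transition_matrix(sample)
print(tm["3a"][4])  # 50.0 — half of the 3a pupils achieved grade 4
```

Each row of the matrix sums to 100%, so reading along a band shows the full national spread of outcomes for pupils with that starting point.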
New ASCL data toolkit
For several years, ASCL members have been able to analyse performance using a spreadsheet toolkit developed by David Blow, Headteacher of Ashcombe School in Surrey. This toolkit cleverly combines the DfE transition matrices and the individual school data checking file which is published to schools in late September. In 2018 we have developed and extended this toolkit so that schools will be able to upload their checking file to a portal and then use a range of analysis tools. The analysis for mathematics, which compares cumulative grades by starting point with the national picture, looks like this:
(NB this blog is also available to read and download as a PDF to assist with viewing the graphs above.)
By checking whether the red school line is above the blue national line, which would be better than expected, you can quickly see the overall picture and compare different levels of prior attainment. Filters let you compare boys with all boys, disadvantaged with other disadvantaged pupils and so on, always comparing like with like.
In addition to Transition Matrix analysis, schools will have access to:
a workload-saving utility to assist school data managers and exams officers during the process of checking exam results
Attainment 8 and Progress 8 tools also developed by David, and which were mentioned as a useful source of information by Ofsted in last year’s autumn update
a range of other tools, including the ability to download PDF copies of reports
The most useful part of using an online portal is the ability to capture data and collaborate. Currently, the DfE waits until after the production of performance tables before it publishes subject transition matrices for a wide range of subjects. Schools, on the other hand, need this type of information early in the autumn term. But by uploading and agreeing to share key data (grades and prior attainment, but no personal data), it becomes possible to build the transition matrices when we need them.
All that we need is for enough schools to collaborate
Following the success of our work with SISRA in 2017, I am confident we can do this. We are very grateful to Steve Howse, the CEO of SMID, for making this available to ASCL members completely free of charge. To sign up, please visit smidreport.com. The more schools that sign up, the more accurate our results will be. Thank you for your support.
Duncan Baldwin and David Blow are among our keynote speakers at ASCL’s Getting to Grips with Accountability Measures, Leadership of Data Autumn Conferences on 27 September in Manchester, and 2 October in London.