What are the main problems school leaders can face in dealing with the ‘data deluge’?
By Kristian Shanks @HistoryKss | Seneca CPD
Last December, Seneca hosted a webinar about how leaders can better use assessment & progress data in school.
You can watch the recording on our YouTube channel here.
Below is the article written by Kristian Shanks, our keynote speaker, summarising his ideas.
Don’t forget to check our other CPD opportunities on this link.
There is not a school leader in the country who does not find themselves confronted with a ‘data deluge’. Whether you are a senior or middle leader, academic or pastoral, you will find that dealing with data forms a significant part of your role. However, given the limited time that we all have, it is important to try to filter the good data from the bad.
As a serving middle leader with previous senior leadership experience, I have spent a lot of time navigating this data deluge. Along the way, I have encountered some common problems across the roles I have held.
Problem 1: The flaws of departmental-level data
By departmental-level data, I am referring to both data produced by a department, and data produced about a department. In the former case, this often takes the form of internal assessment scores or grades, as well as predicted grades about future performance on, for example, formal examinations. Potential problems here include the fact that departmental colleagues may have an over-generous view of student performance, perhaps informed by assessments that do not accurately reflect the demand of the curriculum being studied.
An issue I’ve personally encountered is where significant assessments were not properly sampled from the entire domain of knowledge being taught. Instead, students were told in advance the topics that would come up on their History mock exam. Unsurprisingly, they performed well on said mock exams. But of course, they were only being tested on a tiny sample of the vast domain of knowledge required for a GCSE subject. When the real exams rolled around, it was a different story, and student underperformance was significant.
In terms of data produced about departments, the issue is that it can often be very noisy. As a Head of Department, you might find yourself tasked with analysing the performance of tiny numbers of students within certain high-profile pupil ‘groups’, such as disadvantaged or SEND students, but there is often not a lot you can meaningfully learn from this. Ultimately, the larger the data set, the more likely you are to be able to say something reliable about it. Much of the data you encounter probably has confidence levels attached to it, but these might not always be supplied by the person giving you the data!
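To make the sample-size point concrete, here is a minimal sketch (in Python, with invented scores and a simple normal approximation — not any real school’s data) showing how the margin of error around a group’s mean score shrinks as the group grows:

```python
import math

def margin_of_error(scores, z=1.96):
    """Rough 95% margin of error for a mean score
    (normal approximation -- a simplification)."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((s - mean) ** 2 for s in scores) / (n - 1)  # sample variance
    return z * math.sqrt(var / n)

# Invented scores for illustration only -- not real pupil data.
small_group = [40, 50, 60, 45, 55]   # e.g. a 'group' of just 5 students
whole_cohort = small_group * 16      # 80 students with the same spread

print(round(margin_of_error(small_group), 1))   # a wide margin (~7 marks)
print(round(margin_of_error(whole_cohort), 1))  # a much narrower one (~1.6)
```

With only five students, the group’s mean is pinned down to within roughly seven marks either way, so an apparent ‘gap’ for a tiny pupil group can easily be noise rather than signal.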
Problem 2: Our data focuses on product not process
As Tom Sherrington says, ‘the journey from the spreadsheet to the classroom is often circuitous at best, and normally doesn’t happen at all…It might, at best, give a picture of where students are at – but it does nothing to take them further.’
Much of the data you will encounter deals with an end product, but doesn’t necessarily give us much insight into the process that led to it. Furthermore, some of the data that could shed light on that process is also flawed. Take attendance data, for example. The most commonly cited figures relate to session attendance percentages. But they don’t take account of attendance (or punctuality) in specific subjects, and factors like whether students have been in isolation for a run of lessons are often not accurately tracked in the central figures.
One of the most common ways of trying to capture data on the ‘process’ is through an effort grade. But this too is problematic. Often, the data is an afterthought: despite being collected, little is actually done to analyse or act on it. There is also little attempt to standardise it, or to check that a ‘3’ for effort in one subject reasonably equates to a ‘3’ for effort in another.
Problem 3: Unused data and the problem of time
Time is the most precious resource in schools, arguably even more so than money. However, analysing data is a drain on the time of middle and senior leaders, and often time is spent collecting data that isn’t actually used (see the effort grade point above) or analysing data that tells us things we already know.
We need to consider carefully whether the data we ask teachers and middle leaders to submit and analyse has more impact than the same time spent on another task, such as planning better lessons or giving improved feedback to students. Do we really need them to spend yet more time inputting yet more numbers into the spreadsheet du jour?
Indeed, as Tom Sherrington has mentioned before, if we lost all our progress and attainment data in a large fire or computer meltdown, would it really have a massive impact on students? Would it have any impact?
Problem 4: Data Induced Blindness
Data can ultimately lead us to weigh the pig rather than trying to fatten it. We can become too focused on granular, micro-level solutions with limited impact, rather than the big-picture strategies for improvement that tend to work in most schools, including our own. We can focus too much on the quantitative over the qualitative, and the formal over the informal.
Ultimately, we probably already know the big-ticket items that will make the biggest difference for our students – the development of excellent teachers, an effective system for managing behaviour, and ensuring that attendance is high and pupils are kept safe. While data can help us focus from time to time, we mustn’t lose sight of the bread and butter items that keep schools and students moving forward.
Why does this matter, and what should we do moving forward?
Ultimately, the DfE’s own Workload Challenge identified an ‘exams and data-driven ethos’ as a key factor both in explaining problems with teacher retention and in driving up the hours we are all working. Getting a handle on our data culture is one thing we can do to help stem the flow of teachers out of the system.
In terms of what we can do with data to extract the most value from it – Adam Boxer posed three really interesting questions for us to answer.
How does this data inform and affect my teaching of this class, right now?
How does this data inform and affect my teaching of this class, in the future?
How does this data inform and affect my teaching of future classes?
As for middle and senior leaders, perhaps adopting a less is more approach is needed. I am not saying data is useless – far from it. But let’s make sure we focus on the data we really need and extract the most value from it. If we can turn our data deluge into something more streamlined, it will be worth it.
Kristian is currently Curriculum Leader at an 11-18 secondary comprehensive school in North Yorkshire, situated between Leeds and York. Previously, he has worked in a number of other schools in roles ranging from classroom teacher to senior leader. His interests are in teaching and learning, curriculum and pedagogy, and issues affecting the wider education landscape.