In every school I’ve worked in (four, including PGCE placements), I have been required to enter all test results into a spreadsheet, commonly known as a QLA (Question Level Analysis).
Until recently, this seemed to me like a fairly reasonable thing to do. Sometimes more useful than others, depending on the nature of the test and the actual results the students produce, but generally OK… It takes a bit of time to enter the data but, hey, I need to add up the scores anyway and I love a nice spreadsheet…
But I wanted to know what others thought. So I posted the following poll on Twitter:
Maths tchrs: Are you marking mocks at the moment?
What do you think of QLAs? (spreadsheets where you enter marks for each Q)
Vote, then RT!
— Mark Horley Maths (@mhorley) March 19, 2017
As Dave Gale (@reflectivemaths) pointed out, I might have added a category…
@mhorley I’d like a vote option of
Has some use but not convinced it’s worth the time. I do mail merge the results for parents evening.
— Dave Gale (@reflectivemaths) March 19, 2017
I agree, that would have been good, but actually what is more interesting than the results is the discussion this generated. It seems that there is quite a wide range of opinion on the humble QLA.
So, let’s start with the positives. Some people really value them:
@mhorley Essential in my view. One of the most useful ways of identifying focus areas (whole class and individual).
— Maths Mr Cox (@MathsMrCox) March 19, 2017
@mhorley Individual feedback for student revision (I provide @hegartymaths clip numbers) and focuses my planning. Absolutely essential.
— Mr. Taylor (@taylorda01) March 19, 2017
@mhorley @solvemymaths @Just_Maths @mathsjem I’m a huge fan of QLA myself, I feel the time it takes is well worth it.
— Jonathan Hall (@StudyMaths) March 19, 2017
@taylorda01 @mhorley I’ve always found they accelerate progress. They often expose gaps I thought were closed.
— Kim McKee (@MrsKimMcKee) March 19, 2017
@mhorley I create the spreadsheets for our dept but have a student slip. Used for independent learning pic.twitter.com/XYf3FUinRM
— Terri Ridings (@TerriRidings) March 19, 2017
Others felt that the workload was unrealistic, so perhaps someone else should do the data entry, but that the end result may have some uses:
@mathsjem @mhorley have used google docs and kids to enter own data with a good group before. Can have issues with accuracy tho.
— Nicke Jones (@NEdge9) March 19, 2017
@mhorley @MathsMrCox get the students to input there scores as homework and then send you the spreadsheet. Copy paste into big file
— Yuvraj Singh (@YuviLite) March 19, 2017
@mhorley @Just_Maths @mathsjem If someone handed me the analysis I’d probably use it to inform some queries, but if I have to create it…
— solve my maths (@solvemymaths) March 19, 2017
And then there were others who questioned their value and have stopped doing them:
@mathsjem @mhorley I found making notes on the markscheme as I went along effective. It made a note of why students were making mistakes.
— Richard Deakin (@RichardDeakin) March 19, 2017
@mhorley @MrBlachford The QLA model works well for short predictable 1 question = 1 concept tests, but otherwise lots of false positives and negatives…
— Requires Improvement (@RequireImprove) March 19, 2017
This last point is crucial to me. There is a problem if we are doing something that is a waste of time. Time is a finite resource, and every additional task we take on drains our energy for the important part of the job – being in a classroom full of kids.

But actually the problem goes beyond wasted time. There is a danger that we read TOO much into our wonderful spreadsheets. The practice of labelling a question with a single topic is highly dubious. Even if we got more sophisticated and labelled each question with the multiple topics embedded within it (something more prevalent in the new GCSE than the old), those labels still struggle to capture what the question is about and how difficult it is. There is a wide range of difficulty and sub-topics in, say, a question on standard form: answering a simple conversion question accurately could give false confidence that the topic was sorted.
With key skills tests lower down the school, where one question maps directly onto one topic, there could be value in charting progress over time. But for GCSE papers?
There are some elaborate tools available, such as Pinpoint learning, which produce reports and customised question booklets based on the results. However, they still rest on the premise that we can get a reliable read on a student’s proficiency, confidence and competence in a topic from their answer to a single question on it.
Today, as I was marking the latest set of year 11 mocks (the fifth they have done since December), I thought I would do a little experiment. I generally mark the papers first, then enter the data into the spreadsheet. The alternative is to enter the data as you go along. This may seem like a trivial difference, but when we are spending several hours on this work, it’s worth working out the most efficient way to do it. I’m not the fastest marker, but I calculated that I averaged 7 minutes per paper when entering the data at the same time, and 4.7 minutes per paper when not going near the computer. I was surprised the difference was as much as this. I then timed how long it took me to enter the data afterwards, which averaged 1.7 minutes per paper. So marking first and entering the data at the end comes to 6.4 minutes per paper, roughly 10% quicker than doing both at once.
Apart from being mind-numbingly boring (which some teachers get their very understanding partners to help with!), this data-entry task takes 30-50 minutes per paper for a class of 30. So that’s at least 1.5 hours’ work for a full three-paper set of mocks.
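To spell out the rough arithmetic behind those figures (based on my own timings above; a quicker marker, or a helpful partner on data entry, would land nearer the bottom of the range):

\[
7 - (4.7 + 1.7) = 0.6 \text{ min per paper saved}, \qquad \tfrac{0.6}{7} \approx 9\%
\]
\[
1.7 \text{ min} \times 30 \text{ scripts} \approx 51 \text{ min of data entry per exam paper}, \qquad 3 \times (30 \text{ to } 50) \text{ min} \approx 1.5 \text{ to } 2.5 \text{ hours}
\]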
I could spend 1.5 hours entering all that data into a spreadsheet that probably won’t get used and, if it does get used, may lead to false conclusions. Or I could use that time for something else, like planning carefully how I want to hand back their test papers, making sure I use the lesson time effectively, maybe selecting or writing new questions that I have a pretty good hunch everyone needs to work on some more. That hunch comes from doing the marking in the first place and doesn’t need a spreadsheet to back it up.