Confessions of an examiner
When I took my A-levels and GCSEs, I imagined them being marked by wise Einstein lookalikes with great tufts of white hair, locked away in ivory towers. I pictured these sages ruminating over the points I had carefully laid out, raising an interested eyebrow from time to time, and often pausing to reflect on my argument. The idea of doubting the mark I had been awarded didn't even enter my head. The markers, after all, were the experts, and I was the lowly student.
But my childlike trust in the infallibility of exam marking has steadily eroded over more than a decade of examining. As I sat down to mark my first paper all those years ago, it dawned on me that these scripts weren't marked by reassuringly eccentric academics and experts. In fact, they were marked by people like me, sitting in my dressing gown and shovelling Weetabix into my face on a Saturday morning, while contemplating an impossibly high mountain of papers and an unfeasibly tight deadline.
As I progressed up the hierarchy, I monitored other people's marking – and the disillusionment was complete. I saw it all. At one end of the scale there were those who just didn't care. I came across one marker who had completed all the scripts they had to send off as samples perfectly, but when we explored the rest of their allocation we found that they had apparently not even opened the papers, just written marks on the front. Such cynicism is difficult to combat; they were sacked from the exam board I was working for, but could apply to any other board the following year.
At the other end, there were those who tried so diligently, and really wanted to do well, but just seemed incapable of differentiating between responses of different levels. One examiner springs to mind who had annotated the papers in remarkable detail, but just didn't get it. She had given marks equivalent to an E to a student who had written work which was, to my mind, very clearly A* work. In this case we caught the problem pre-results and were able to have the papers remarked, but there are doubtless other less severe examples which went undetected.
So why are there so many problems? Partly the exam boards just can't recruit enough suitably qualified people to do the marking in many subjects. Teachers are, after all, busy, and a lot of people would rather stab themselves in the face than take on extra marking in the holidays. Partly the whole operation is just so huge, with millions of scripts flying round the country and thousands upon thousands of examiners, that things are bound to go wrong from time to time. But I've also seen exam boards cut back on face-to-face training of examiners in recent years, and there is no doubt in my mind that the rise of online standardisation has led to less reliable marking. Nothing beats sitting down in person and talking through a mark scheme together.
It's only fair to say that the majority of examiners do a good job, and that the exam boards have processes in place which are meant to deal with the worst offenders. They probably do catch the very worst practice most of the time, but judging by some of the marking I've seen during the post-results re-marking period, a lot slips through the net. If ever you get a result back and are surprised by how low it is, for goodness sake get a re-mark.
All this is sailing dangerously close to 'moaning teacher territory', so let's talk about possible solutions. Certainly face-to-face standardising would help: I know that I am better at explaining how to mark in the flesh, taking questions, pointing out common pitfalls and reading the body language of the other participants. An online programme just doesn't cut it. Perhaps offering examiners more money would help attract more and better qualified markers, but of course that money comes via schools and the coffers are not exactly full at the moment.
Marking is increasingly moving online and this (as distinct from the training) will bring some advantages, as it's easier to monitor how an examiner is doing across their whole allocation. Rather than sending off a selection of their scripts mid-way through the process, a senior examiner can track a marker's work throughout. Plus there are none of the delays and faff associated with scripts crisscrossing the country in post vans. It sticks in my throat to say it, but Michael Gove also had one good idea, which was to have a single exam board for each subject, removing competition between them. This would improve marking reliability, as all teachers would build expertise on a common specification rather than the multitude of offerings currently available from different exam boards. Needless to say, this was one idea that Gove dropped in favour of his other barmy plans.
All this really matters. If we can't trust our exam system, so much of what we do in schools is futile. How many tears have been shed by students, how many teachers berated, and how many heads of department ushered into the head's office for "a quick chat", all over results which were not right in the first place? How many students have been denied access to sixth forms, further or higher education because of unreliable marking? It will never be perfect of course, but let's talk about how we can create a system where the trust exhibited by my teenage self doesn't seem hopelessly naive.
Having received my fifth-year end-of-module assignment result today, this strikes a chord with me. My result is much lower than my results for the previous years and than my TMA results through the year. This one result has ruined my chance of getting a good overall result, and apparently there is no recourse unless the mistake is procedural, which I have no way of knowing because my results come back unexplained. This is one of many reports and blogs from examiners I have read TODAY which say the same thing: you can't be sure the result is correct. It makes me wonder what's the point; my career prospects have been damaged, possibly unfairly, with little or no chance of getting things put right. And all because of a single assessment result which I cannot even be sure was properly marked, even though I apparently fulfilled the question criteria of including enough sources and evaluating them, and answered the question which was asked, although it was a poorly phrased question with an obvious answer I could have answered in primary school.