Exam problems

May 14, 2015

I was interested in the study into inconsistent marking, in particular that the examiners awarded “hugely inconsistent marks” and that “a large element of unreliability” would always remain in assessment (“Low marks on an essay? Try a different examiner”, News, 30 April). Both the findings and the conclusion are unsurprising, as the essay format is one of the least accurate formal assessment methods because of examiner variance. The process of moderation simply adds one unreliable practice on top of another, and Sue Bloxham is correct in saying that we should stop using it.

However, the wider truth is that many assessments in higher education are simply not fit for purpose. Among the common problems are a lack of proper training for examiners, poorly designed exams and inadequate statistical evidence about the performance of the examination or its component parts. Standards such as pass marks and degree grade boundaries are seldom set using a proper method. Instead, they are often predetermined and set out in regulations without regard to the content or difficulty of the questions, which is ludicrous.

Undergraduate and postgraduate medical exams tend to be reliable and valid because of the requirements of the General Medical Council. Even in medicine, however, examiners can encounter problems, as Kevin Fong described a few weeks ago (“Campus Hunger Games”, Opinion, 5 March). He identified questions that were too easy, and the fact that candidates can get the correct answer to a multiple-choice question through a blind guess.

These two articles illustrate an important but often ignored principle: three different kinds of expertise are needed to design and quality-assure examinations. The first is subject expertise. The second is expertise in examination design, covering assessment theory, methodology, the number of items required, testing time and standard setting. The third is expertise in psychometrics, covering the measurement and interpretation of statistics such as reliability, item difficulty and the ability to discriminate between passing and failing candidates.
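By way of illustration only (the data and script below are hypothetical, not drawn from either article), these psychometric quantities can be sketched in a few lines of Python for a small, dichotomously scored exam: item difficulty as the proportion of candidates answering each item correctly, discrimination as each item's correlation with the rest of the test, and reliability as Cronbach's alpha.

    # Minimal sketch with made-up data: rows are candidates, columns are items,
    # 1 = correct, 0 = incorrect.
    import numpy as np

    responses = np.array([
        [1, 1, 0, 1, 1],
        [1, 0, 0, 1, 1],
        [0, 1, 0, 0, 1],
        [1, 1, 1, 1, 1],
        [0, 0, 0, 1, 0],
        [1, 1, 0, 1, 0],
    ])

    n_candidates, n_items = responses.shape
    totals = responses.sum(axis=1)

    # Item difficulty: proportion of candidates answering each item correctly.
    difficulty = responses.mean(axis=0)

    # Item discrimination: correlation of each item with the rest of the test,
    # a rough index of how well the item separates stronger from weaker candidates.
    discrimination = np.array([
        np.corrcoef(responses[:, j], totals - responses[:, j])[0, 1]
        for j in range(n_items)
    ])

    # Reliability: Cronbach's alpha from item variances and total-score variance.
    item_var = responses.var(axis=0, ddof=1)
    total_var = totals.var(ddof=1)
    alpha = (n_items / (n_items - 1)) * (1 - item_var.sum() / total_var)

    print("difficulty:", np.round(difficulty, 2))
    print("discrimination:", np.round(discrimination, 2))
    print("Cronbach's alpha:", round(alpha, 2))

Interpreting such figures, and deciding what to do when an item is too easy or fails to discriminate, is precisely the psychometric expertise referred to above.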

Many examiners in higher education might not have much more than subject expertise, and until this changes, unreliable and invalid examinations will probably continue.

Gareth Holsgrove
Consultant in medical and dental education
