University managers have looked to delay the release of coursework marks to avoid upsetting undergraduates before they fill in their National Student Survey forms, an academic study has revealed.
Duna Sabri, visiting research fellow at the Centre for Public Policy at King’s College London, said discussion of various “gaming strategies” emerged during interviews she conducted with staff and students for a paper about the NSS, which polled about 304,000 final-year students this year.
According to this year’s survey, published on 13 August, 85 per cent of students are satisfied with their course, with the University of Bath achieving the highest score for a university with a 94 per cent approval rating.
However, managers at an unnamed university told Dr Sabri that a range of tactics were being considered to boost satisfaction scores, including delaying the release of module results until students had completed their forms “to avoid negatively influencing students’ state of mind”.
Managers had also considered diverting resources towards the final year because they were aware that students were mostly influenced by short-term concerns and were “answering in the now”. Such discussions were unlikely to be unique to that university, Dr Sabri said.
Call for survey to be moved
A review of the NSS, due to be published next spring by the Higher Education Funding Council for England, should call for the survey to be conducted after graduation, Dr Sabri recommended.
“The NSS asks students to take an overview of their university experience at precisely the time they are least able to do so – during final assessments in the last two terms of their final year,” she said.
“Other countries wait until after the students have graduated and survey them at the same time as asking about first destinations,” she added.
Students were also adjusting their scores because they believed the ratings would be used as “ammunition” against their tutors as NSS data are used increasingly in university league tables, according to Dr Sabri’s paper, which is due to be published on the Sociological Research Online website this autumn.
“There was a fairly common suspicion that poor scores would be used by university management against lecturers,” said Dr Sabri.
Students were also frustrated that the questions related solely to their course when their complaints were often directed at other areas of university life, the paper adds.
“Hardly any of the problems are to do with the tutors…the problems are to do with [the larger] university structure,” says one student interviewed for the study.
Dr Sabri called for a system closer to the US model, in which students are asked to assess their own level of engagement with their course and are encouraged not to be passive consumers of education.
Marks starting to plateau
Adam Child, senior policy and strategy officer at Lancaster University, who has also studied the NSS, agreed.
“Ten years ago, the [current] approach made sense, but organisations such as the Quality Assurance Agency are now driving forward ideas of ‘student engagement’ and students as partners,” he said.
He added that the recurrence of last year’s record-high satisfaction scores – teaching quality achieved an 86 per cent satisfaction rating – indicated that marks were starting to plateau and might struggle to go much higher.
As a result, the NSS had lost sight of its original aim of improving the quality of teaching and “needed a shake-up”, he said.
“The NSS once served a useful purpose, but we are now getting the same messages each year from it,” Mr Child said.
“Any actions prompted by the NSS to improve quality have probably been gone over several times,” he added.
The likely drop in student satisfaction scores in 2015 – when the first cohort of students paying tuition fees of £9,000 a year will be polled – would also pose difficulties for the NSS, Mr Child said.
“If we see a significant dip, it will show the survey is really acting as a customer survey. It will be shown to be measuring student expectations rather than absolute quality,” he added.