OfS: National Student Survey will drop ‘satisfaction’ question

But first phase of review ordered by ministers finds preference to continue with ‘existing annual census’

March 30, 2021

The UK’s National Student Survey should no longer use the term “satisfaction” and instead look at overhauling its core questions, according to the first phase of a major review.

Possible alternative types of survey, including widening the NSS to more than just final-year students or running it every two years, should also be looked at, according to the results of the review.

The Office for Students was asked to carry out a “radical, root-and-branch” review of the NSS by universities minister Michelle Donelan last year in a policy paper that claimed it had “exerted a downwards pressure on standards”.

The paper said any alternative that resulted from the review should look at reducing the bureaucratic burden on universities and be able to provide reliable data “without depending on a universal annual sample”.

This was backed up in a letter of guidance to the OfS from the education secretary, Gavin Williamson, last month that stressed reform of the survey was a “high priority” in order to “address the downwards pressure that student surveys of this sort may exert on standards”.

However, the report on the first phase of the NSS review, which involved consultation with students, staff and sector leaders among others, says it was told that overall “the benefits of the NSS outweigh the burden”.

“The clear preference of most of those to whom we spoke is to continue with the existing annual census, but with changes to the questions to improve their usefulness.”

It had therefore recommended there should be a review of “the core survey questions to ensure they remain fit for purpose and stand the test of time”, including “the removal of the term ‘satisfaction’ for the summative question or using an aggregate score to replace question 27”.

Question 27 of the NSS – which asks if overall students are satisfied with their course – forms the basis of many general comparisons of universities and courses.

The phase one report says that “despite the perceived usefulness” of the question by some respondents to the OfS review, “most recognised that the question was unhelpful for the survey as a whole”.

It adds that “critics often derogatively dub the NSS as a ‘satisfaction survey’, which they regard as a passive, consumer-driven concept not suitable for a survey of this type. There was strong support for phase two of the review to look at alternatives to question 27.”

Elsewhere, the report says that “academic staff were much more likely to report burden” when it came to the NSS, “in particular in relation to chasing improvements in NSS scores, which they felt could be a distraction from teaching”.

But it adds that the review team “did not find any evidence of a systemic issue of grade inflation or a lowering of standards from providers or students”, although “anecdotal comments from academics suggest this could be happening at a local level”.

On grade inflation, the report contains a deeper analysis of the rise in top degree honours and whether it could be linked to the NSS, but it concludes that the data do not “provide evidence that the NSS causes grade inflation”.

The review also considers specific alternatives to the form of the survey, which takes place every year and typically receives responses from at least 70 per cent of final-year undergraduates.

It concludes that the next stage of the review should look at the feasibility of a biennial survey or widening the NSS to sample other undergraduates besides those in their final year.

However, in a separate press statement, the OfS said “the strong view of most of those consulted was that an annual census remained the best option, with a radical review of the questions”.

“Over the next few months, in phase two of the NSS consultation, there will be extensive discussion with students, universities, colleges and regulators about new questions for the survey, including how best to reflect students’ overall assessment of their experience,” the English regulator said.

“New questions will be tested alongside the 2022 NSS and are likely to replace the existing questions.” 




Reader's comments (2)

An annual census is valid; however, like any anonymised process, it can often be an outlet for those with an axe to grind, and there might almost be an obligation to pander to them in order to gain high scores, which might affect grading. Thus, there appears to be a sense of fear around the survey because so much weight is given to it.

The range of questions targets different aspects of institutions, yet the burden of gaining high scores tends to rest on front-line staff, who can have little influence on some of the aspects the survey considers. When a particular area of an institution appears to have weaker satisfaction scores, the pressure exerted and the investigative examination can be counterproductive, as there is a lack of reading between the lines. It is also not helped by the idea that a neutral score is a negative score, since many who are unsure will plump for the middle ground. Can the review minimise the need to read between the lines, and remove the neutral-is-negative aspect of the scoring?
The problem for the government is that the emotional reactions of students are not a good way to measure value for money. Most students are not paying for their education themselves; the government is the funder and therefore the customer. What is needed is a clear statement of what the government wants from higher education, and then a way of measuring how well those objectives are being achieved, rather than an ideology-driven push for “students as consumers”.