The future robustness of the National Student Survey could be undermined by a sharp upward trend in the proportion of students giving the same answer to every question, a report has warned.
The report, published on 3 July by the Higher Education Funding Council for England, found that while only 1 per cent of NSS respondents in 2005 gave the same answer to every question – a phenomenon known as “yea-saying” – that figure had increased to 5.4 per cent by 2013.
According to the report, UK Review of the Provision of Information about Higher Education: National Student Survey Results and Trends Analysis 2005-2013, most yea-sayers chose the “definitely agree” category, and their numbers far exceeded what would be expected even given the 5 percentage point rise in overall satisfaction – up to 85 per cent – since 2005.
Removing the yea-sayers from the 2013 responses makes “no material difference” to the sector-wide results. But if they continue to proliferate, “there is a possibility” that they “could affect the robustness of the NSS results [in the future]”. The report finds no evidence of a link between high levels of yea-saying and the incentives and prizes awarded by some institutions to students who complete the NSS. But it is unable to say whether the phenomenon merely reflects a lack of engagement from students completing the survey. However, a related report commissioned by Hefce, also published on 3 July, suggests that at least one group of NSS questions should be phrased negatively “to counter the problem that some students are completing the NSS without sufficient thought”.
That report – Review of the National Student Survey: Report to the UK Higher Education Funding Bodies by NatCen Social Research and the Institute of Education – finds widespread support among institutions for the NSS’s role in quality enhancement, and among students for its value in helping them to choose between institutions.
However, it acknowledges a widespread feeling that the NSS now takes insufficient account of student engagement with learning. It recommends piloting 11 new questions probing this issue, which could be introduced by 2017. These cover issues in “academic challenge/reflective and integrative learning” (using questions such as “my course has challenged me to achieve my best”); “learning community/collaborative learning” (“I do not feel part of a group of students and staff committed to learning”); and “student voice” (“staff appear to value the course feedback given by students”).
The report also acknowledges that “the NSS may inadvertently encourage HE institutions to act in ways which do not enhance students’ academic experiences”. For instance, it suggests that “prompt” be replaced by “timely” in the statement “feedback on my work has been prompt”.
The report acknowledges concern that ever-higher and very similar scores recorded by many institutions “undermine the credibility of the NSS in some stakeholders’ eyes”, and constitute “potential limits to [its] usefulness for […] comparisons of quality”.
But it says that institutional differences are larger at subject level and suggests that guidance be issued, including to the press, warning against using NSS data to construct league tables that do not account for institutions’ subject mix. It acknowledges that league tables also discourage institutions from sharing best practice, but finds “little hard evidence” for concerns that some institutions were “gaming” their NSS results.
The report also finds that overall satisfaction is lowest among male, disabled, older and black Caribbean students, and those from independent schools.