
The National Student Survey gets student satisfaction wrong

The National Student Survey is essential, but a disappointment. A final year student at University College London explains why


Laura Warner

March 15 2016


I make myself a cup of tea, I clear my desk and I get comfortable in my chair.
I’m ready. It’s time. 
It’s the National Student Survey.

During my three years at university, I really feel like I’ve spent time preparing for this moment. As a prospective student I was curious about the results; as a current student I am both frustrated with and complacent about the results. I think the notion of a student survey is important, useful and necessary. I think it is important to hear what students think about their course and about their university; the survey should provide a valuable platform from which prospective students can make informed choices and universities can access constructive feedback.


But the question is, does it really do that?

The survey is made up of 23 closed questions with tick-box answers, ranging from “Definitely agree” to “Definitely disagree” with a “Not applicable” option as well. At the end of the survey, participants have the option to highlight any notably positive or negative aspects of their course. The survey took me just over five minutes to complete, start to finish, and I was actually left feeling really disappointed. My university experience has been a mixed one, to say the least, but the survey didn’t really give me the opportunity to discuss that or elaborate on it.

It’s extremely difficult to answer 23 closed questions about three whole years of your life – it’s inevitable that you generalise, emphasising either the good bits or the bad bits of your experience. For example, the first statement is “Staff are good at explaining things”: some staff are, some staff aren’t; staff I’ve come into contact with in my final year have been better at explaining things than some of the staff I engaged with in my first year. So what’s my answer? If I disagree, that’s a huge kick in the teeth to all the staff who have tirelessly explained and re-explained concepts to me; but if I agree, I’m forgetting about all those lectures I walked out of without a clue about what had just gone on. And I can’t just tick “Neither agree nor disagree” for every single question.

The survey is split up into eight sections, ranging from teaching to assessments to personal development. Arguably, it tries to capture all elements of one’s university experience. In doing so, however, I think it misses the most important factors. There are no questions about engagement with your course or your department, or your course or department’s engagement with you; there is nothing about your actual enjoyment of the course; nothing about whether the course in which you participated was the one that was advertised; and nothing about how that experience has changed you – from a pre-graduate to a graduate.

Sure, I can say that my timetable fits around my activities, but is that really deemed one of the most fundamental factors in my university experience? My favourite part of the survey was the personal development section – I thought, “Right, here we are, I’m going to be asked about something that matters to me, that should matter to my department and that will matter to other students”.

I was so wrong. I was provided with three statements: 1) The course has helped me present myself with confidence; 2) My communication skills have improved; 3) As a result of the course, I feel confident in tackling unfamiliar problems. I don’t know about you, but I feel like these are some of the most generic and meaningless statements I could have been given. Yes, my course has helped me to present my knowledge of neoliberalism and post-colonialism and climate change with confidence, but at times it’s also driven me into the ground with disappointment, dissatisfaction and disinterest. I am definitely not leaving university a more confident person, but is that what the question is asking? I don’t know if my communication skills have improved; I send a lot of emails and write a lot of essays, but I’ve gone from doing almost weekly presentations at sixth form to one presentation in three years at university. I’m probably not as good at public speaking now. In terms of tackling unfamiliar problems, I tackled university – one big unfamiliar problem, and I feel like it’s gradually becoming more familiar and less problematic.

So, National Student Survey, thank you for existing and trying to give students an opportunity to have a voice – but unfortunately I don’t think you’re doing a very good job of capturing a rounded and accurate one. We’re talking about a survey that universities stress endlessly over, and which does, and should, affect admissions and student decision-making. We need to make space to recognise changes over time: to show universities and students that, sure, this might not have been great three years ago, but look how good it’s become now – and, equally, vice versa. We need to show students what it’s like now and what it will be like when they start university, not what it was like when I started university.

We need to recognise that my second-year tutor didn’t even reply to my emails, let alone know my name, but one of my third-year lecturers bought me a drink in the summer holidays and spent three hours chatting with me about ways the department could improve. There needs to be the opportunity for students to provide accurate results on these surveys, rather than futile and unreflective generalisations.

We also need to think about what it is that students and universities need to know. I understand that it is important for departments to be able to learn how students are responding to the feedback given on assessments, but the National Student Survey isn’t really the ideal place for this. I will have left university by the time the results are published; if I was dissatisfied for three years, what good is publishing that now? Equally, I don’t know whether it’s a significant enough feature of a course to be the make-or-break factor for a prospective student.

Why don’t we start thinking about what prospective students need to know, and what’s going to be important to them – as a means to decide what should be included in the survey? As it stands, I don’t think the results of the survey are particularly useful for anyone.

Let’s place importance on student satisfaction, but let’s do it better. 

