Universities were able to silence critical voices in the recent pilot national student satisfaction survey by "weeding out" up to a third of those eligible to take part, The Times Higher has learnt.
Student leaders and experts said this week that the fact that some universities had sent questionnaires to only 66 per cent of the eligible student body rendered the exercise invalid. They had, in effect, erased students who had dropped out, switched courses, failed or had outstanding complaints against their institutions.
The National Union of Students said it would be campaigning to ensure the first full national survey, to be published in 2005, contained the views of all final-year students, not just those most likely to give positive answers.
Chris Weavers, NUS vice-president (education), said: "Weeding out students who are most likely to have a negative opinion will skew results to give a falsely positive impression."
The results of the survey were revealed with a fanfare by the Department for Education and Skills last month. Alan Johnson, the higher education minister, said the survey would give students "better information to help them make informed choices about where and what to study".
But the interim report on the survey's development, published on the survey website, revealed that universities taking part in the pilot were given discretion to remove problem students.
The interim report says students were removed for "a variety of reasons".
These included students in dispute with their institution, those repeating a year, changing or transferring courses and those who would not graduate in 2002-03 for other reasons, including withdrawal or failure.
Although most of those omitted were removed because they were categorised as short-course continuing education students, the report confirmed that almost 4,000 students were cut from the list of 47,974 across 23 institutions.
"The number of students returned as a proportion of the original target lists varied considerably between HEIs (between 100 per cent and 66 per cent)," the report says. "This was largely due to the element of discretion that HEIs had in removing students."
Mr Weavers said the inconsistency in the numbers weeded out from one institution to another was also a cause for concern.
He said: "We'd like to see a survey that will cover all final-year students who leave the institution."
Lee Harvey, director of the Centre for Research and Evaluation at Sheffield Hallam University, who is widely seen as the architect of the modern student satisfaction survey, also had reservations about the validity of the survey.
"If institutions exclude students with whom they are in dispute, the results will clearly be biased."
He said the problem stemmed from a move away from the original plan to interview all students - whether they had passed or failed - at the end of their courses. Instead, students will be interviewed during their final year, to improve response rates.
A spokesman for the Higher Education Funding Council for England said this first pilot was designed to test methods rather than to generate results.
Another pilot under way allows less flexibility to exclude students.
"We will consult on the definition (of the student population to be surveyed) as one of the issues in the broader consultation on the survey.
"Once the sector has agreed a way of defining those who should be surveyed, this will be applied systematically so that inconsistencies are removed.
'Weeding out' will be allowed only for very specific reasons such as students being deceased."