The overall results of the TQA show that higher education teaching has improved. But, asks Phil Baty, is this genuine or have departments learnt how to pull the wool over the assessors' eyes?
Fewer than 1 per cent of university departments were judged to be failing during the largest-ever review of teaching quality, while almost half of all courses were considered "excellent".
The complete results of the £300 million teaching quality assessment, exclusively analysed for The THES, show that after almost 2,000 visits to university departments since numerical assessment grades were first given out in 1995, just 15 departments were found to be failing - 0.8 per cent of all provision. Only one department - 0.05 per cent of those inspected - was found to be permanently failing.
On the flipside, 48.3 per cent of the 1,916 departments inspected were deemed to be "excellent", scoring at least 22 out of a maximum aggregate mark of 24. Excellence was found in 63.7 per cent of old university departments.
The Quality Assurance Agency said the exercise proved "beyond reasonable doubt, that the state of higher education in the UK is generally very good". But the findings have led experts to claim it was a waste of time and money.
"Was it worth the public money to be told that the overwhelming majority of higher education provision is satisfactory? I don't think so," said Geoffrey Alderman, one of the first people to chair a TQA inspection team. "The only real purpose of quality assessment should be quality enhancement. What's the evidence of quality enhancement?"
The supporters of the TQA - more recently billed as "subject review" - insist there is real evidence that the exercise led to an improvement in the quality of university teaching. Peter Williams, chief executive of the Quality Assurance Agency, said: "There can be very few honest and reflective teachers in higher education who would not admit that the prospect of a TQA forced them to think very carefully about what they were doing. The stakes were high and it mattered.
"It is also true, I believe, that the new and more structured - more professional - attention given to the quality of courses and the facilitation of learning since the TQA's introduction has made it possible to teach the ever-increasing numbers of students without sacrificing the most basic values and standards which are the bedrock of higher education."
The results of the exercise, analysed for The THES by Roger Cook, an academic development adviser at Napier University, certainly appear to show an improvement in standards over time.
From 1995, departments were assessed in each of six "aspects of provision": curriculum design and content; teaching, learning and assessment; student progression and achievement; student support and guidance; learning resources; and quality management and enhancement. Departments were given a mark out of four in each aspect, giving an aggregate score out of 24.
In the first round of inspections, between 1995 and 1996, the average score was 20.06 out of 24. This increased to 20.44 in 1996-98, and peaked at 21.70 in 1998-2000. In the final round, 2000-01, the average grade was 21.12.
The proportion of departments deemed by the inspectors to be excellent - scoring 22 or more out of 24 - has also seen a dramatic increase, from a quarter in 1995-96, to just over a third in 1996-98, and up to more than half in the 1998-2000 round and the final round. In the final round, of 11 separate subjects reviewed, excellence was almost universal in seven subjects. In philosophy, the average score per department was a staggering 23.31 out of 24, and in Celtic studies it was 22.75.
But critics have said the improvements reflected "gamesmanship", as institutions gradually learnt how to give the inspectors what they wanted.
Professor Alderman, former pro-vice-chancellor for quality at Middlesex University, used to run courses for departments hoping to improve their TQA results. "The TQA provided a great opportunity for academics to earn extra money to help departments pull the wool over the eyes of the QAA," he said. "We would teach departments how to draft their self-assessment document to make sure they never aspired to any objectives they couldn't demonstrate they could meet.
"We would brief staff on what to say, we'd tell them to never let inspectors walk about alone. We'd brief them on how to carefully choose and brief the employers and former students that the inspectors would meet. We'd tell them to send bad lecturers away for a couple of weeks and buy in good ones."
Tales abound of staff moving the best office furniture around an institution, and of quick paint jobs, to improve departments in preparation for the inspectors' visit.
Mr Williams admitted: "There may be some truth in the criticism that TQA was as much about theatre as quality."
But he insists that the results are a valuable public resource. "The project has produced a vast amount of information about good, and not so good, practice across the whole of higher education. This resource is publicly available and should be used. The QAA intends to publish a synopsis of the major trends and findings of TQAs and subject reviews later this year.
"It was expected by the sceptics and cynics to reveal a widespread dilution of academic quality and standards. It did not do so. The TQA's failure to confirm the prejudices and cynicism of the wiseacres and soothsayers harking back to a golden age has been one of its greatest successes."
But if the data can be relied on, the analysis delivers one of the TQA's clearest messages: the old, pre-1992 universities are far better at teaching than the former polytechnics, which are themselves much better than college-sector providers of higher education. This hardly confounds the cynics.
The THES data analysis, for English institutions, shows that since the TQA turned to numerical gradings in 1995, York University has performed best, with an average score of 23.13 out of 24 in all TQAs. Close behind is Cambridge, with 23.08, followed by Oxford, with 23.06, Warwick and then Loughborough. Indeed, there are no new universities in the top ten, and the best performing institution in the post-1992 sector is Kingston, in 17th place - the only new university in the top 20.
In contrast, every university in the bottom ten of the table is a new university, with the University of East London the worst performer, on 19.08 out of 24. Slightly better were Thames Valley University, with 19.56; South Bank University, with 19.63; Derby University, with 19.79; and Lincolnshire and Humberside University, with 19.91.
A few new universities have bucked the trend, with higher than average performances within their sector. West of England and Northumbria universities were in joint 21st place, with Oxford Brookes in 25th. The much-maligned Luton University came a respectable 52nd.
Old universities not only performed better on average; excellence was overwhelmingly concentrated among them. Across the whole exercise since 1995, 63.7 per cent of old university departments were awarded 22 or more out of 24, compared with 41.9 per cent of new university departments and 26.8 per cent of those in colleges. In the 2000-01 round, 85.9 per cent of old university departments inspected were rated "excellent", compared with 64.7 per cent in new universities and 29.3 per cent in colleges.
In contrast, failures are concentrated in colleges that provide higher education courses. Since 1995, only two old universities have been deemed to be failing - Liverpool University, for its nursing courses, and Leeds University, for media studies. Both passed subsequent reinspections. This compares with 11 further education colleges, including Stockport College, which failed its reinspection and closed the offending course in building and civil engineering.
Dr Cook said: "I don't think there are any huge surprises in this, it reflects the binary divide, which has never gone away in terms of funding."
But the TQA was not supposed to be a simple judgement about an institution's resources. The QAA's initial guidebook on how to conduct subject reviews included a specific reminder that "assessors should be aware that very good teaching and learning can take place in unsuitable conditions".
Dr Cook believes that despite the TQA's attempts to largely factor out the effect of university funding - too political at the time of the removal of the binary divide - an institution's resources have made a clear contribution to its success. This is borne out by the performance of an institution in the one aspect of provision where it is impossible to ignore a department's relative wealth - "learning resources", which examines library and computing facilities.
In the 2000-01 round, 93 per cent of old universities were awarded four out of four for the quality of their learning resources. This compares with 89.5 per cent in new universities, and just 62.9 per cent in colleges.
Whatever the TQA's merits, Mr Williams admitted that the QAA now has to dismantle its legacy, which has left institutions believing that quality assurance is about meeting arbitrary external demands rather than about ensuring the best education for students.
He said: "It will be a few years before any objective verdict can be offered on the TQA story. My own view is that, whatever its negative aspects may have been, we probably couldn't have done without it. It has served a valuable purpose. The future which is now beckoning - a future of strong, effective, collegially-based self-regulation for higher education, which is predicated on clear, shared, values and responsibilities, and which respects and reflects the legitimate interests of society - will only be realised if we heed all the lessons that our experience of TQA has taught us."