A poor policy poorly managed leaves little to show for £315m

Hefce admits that the multimillion-pound Cetls scheme had minimal impact on student learning. Paul Ramsden is not surprised

March 15, 2012

The higher education funding council chose to publish an evaluation of its largest-ever single initiative to improve teaching and learning quietly. Perhaps this is not surprising: as an example of the failure of public policy in higher education, the Centres for Excellence in Teaching and Learning would be hard to beat.

It cost taxpayers £315 million over five years to fund 69 centres. The evaluation's uncomplimentary conclusion is that "we do not believe the Cetl programme itself has led to material changes in non-participating higher education institutions and across the sector as a whole".

The evaluation is based mainly on self-reports from Cetls themselves. The Higher Education Funding Council for England did not ask the evaluators to assess the performance of individual centres, so its report does not reveal the best and worst ones - even though it is widely believed that their performance was mixed. However, the report leaves no room for doubt that the programme as a whole failed in its primary purpose of enhancing teaching and learning in higher education.

I have nothing but admiration for those who worked in Cetls, most of whom produced good work that contributed to the reputation of their institutions despite the ill-conceived scheme. Nevertheless, the Cetls legacy is meagre.

Hefce found itself in a difficult position in 2003. It was ordered by the government to distribute millions of pounds to excellent teaching departments, and to do it quickly. The original concept in the 2003 White Paper was hazy: "We should also celebrate excellent practice in teaching departments. The very best will be designated as Centres of Excellence, and given funding of £500,000 a year for five years to reward academics and to fund extra staff to help promote and spread their good pedagogical practice."

The White Paper made it clear that the Cetls initiative was intended as a reward for good teaching. It would be a form of performance funding, allocating more resources to better departments. Being part of a Cetl would also be a feather in the cap for academics who delivered the performance. In an eerie pre-echo of "students at the heart of the system", the White Paper foresaw students self-selecting the better teaching departments and driving up quality through competition.

Negotiations and consultations with a powerful, self-regarding sector led to a different outcome altogether. The universities lobby succeeded in transforming the idea of extra payments to excellent teaching departments into money for quasi-research units that would "recognise" teaching. They would really have liked the cash without any strings at all, but they settled for the next best thing.

So universities got funds for "research and development" in teaching rather than a reward for employing good practice and attracting the best students. "Pedagogic research" is, in my experience, work that would only rarely be admissible for the research assessment exercise or research excellence framework.

Hindsight is a great thing, but it is still surprising that no one seems to have predicted that the Cetls scheme would be at odds with other initiatives proposed in the same White Paper. This was particularly true in the case of Higher Education Academy subject centres. Because most Cetls were subject-based, there seems to have been a good deal of duplication of the subject centres' work. Some subject centres never got over being sidelined by competing Cetls.

It is hard to find in this evaluation much evidence of impact on student learning. But then, there probably was not any to find. Among the skimpy returns on the investment appears to be some stroking of academic vanity. The "measures of impact" cited by Cetls in their own evaluations, such as numbers of publications or events, are not evidence of improvements in the student experience or student attainment. A revealing statistic is that fewer than a third of the pro vice-chancellors polled by the evaluators thought that Cetls had had an impact on student retention, achievement or employability.

It is impossible to determine from the evaluation whether the limited positive effects would have happened anyway, or might have happened more generally had the initial White Paper concept of rewarding and spreading excellence been maintained.

Of course we cannot expect a report commissioned by Hefce to be self-critical. Hefce made little attempt to coordinate this expensive initiative or to support and instruct Cetls and the HEA to enable them to work in harmony. Instead, it pushed the responsibility for its own unwillingness to provide guidance over to the Cetls and the HEA - a classic case of weak management that magnified the flaws in policy.

That's £315 million frittered away, then. A vague policy, undermined by self-interest and badly managed in its execution. Must do better next time.
