Two-thirds of students think TEF based on Ofsted-style inspection

Only one in 50 respondents to DfE survey correctly states that ratings are not based on class observations

January 30, 2019
Source: Getty

Standard: the TEF is based on data

Two-thirds of applicants who have heard of the UK’s teaching excellence framework mistakenly believe that the ratings are based on Ofsted-style inspections of universities, new research has revealed.

The survey of 2,838 students who submitted an application to enter higher education in 2018 or 2019, commissioned by the Department for Education, found that only 43 per cent of respondents were aware of the TEF at the time they applied and only 15 per cent used it to help their decision-making, despite “better informing student choice” being one of the assessment’s stated objectives.

Significantly, 66 per cent of respondents who had heard of the TEF wrongly believed that awards of gold, silver or bronze were allocated following official inspections of providers and their teaching, and only one in 50 (2 per cent) correctly stated that this was not the case. Thirty-one per cent of respondents said that they did not know.

TEF awards are actually based on data relating to student satisfaction, retention and graduate employment, as well as institutional submissions that are considered by expert panels.

Misconceptions about the assessment’s methodology continued in interviews conducted for the DfE, in which applicants referenced lecture observations and reviews of course content and student progress. “I imagine it’s an Ofsted for universities,” one student said.

Andrew Gunn, a researcher in higher education policy at the University of Leeds, said that a lack of public understanding was not surprising “given that many [people] working within universities don’t actually understand the complex scheme or how it’s being rolled out”.

“The TEF isn’t informing student choice on the scale the government wished,” Dr Gunn said. “If the TEF isn’t providing useful product information, as part of the ‘food labelling’ of degree courses, it’s not delivering one of its own objectives.”

Paul Ashwin, professor of higher education at Lancaster University, said the fact that most applicants who had heard about the TEF had learned about it from institutions suggested that it was “mainly used by institutions as a way of marketing their provision”.

“As a whole, the evaluation paints a picture of the TEF as having very little to do with teaching quality or excellence,” he said. “Instead, it is about institutions managing the TEF process to maximise their TEF outcome and then, providing they do not get bronze, using this as a way of marketing their provision to prospective students who generally are not aware of what the award means.”

The research was released by the DfE as a review of the TEF, led by Dame Shirley Pearce, former vice-chancellor of Loughborough University, gets under way.

It also includes a survey of senior teaching staff and university colleagues who coordinated their institutions’ participation in the TEF, which reveals that 40 per cent of respondents at bronze-rated institutions said that it was responsible for a drop in staff morale, while 29 per cent of respondents at gold providers reported an increase.

Seven to eight per cent of respondents said that their university had closed courses or departments because of TEF-related metrics.

A spokesman for England’s Office for Students, which operates the TEF, said that the research “followed the first year of TEF results in summer 2017, so the level of awareness and understanding at that point is not surprising”.

“We would expect these numbers to grow as the TEF becomes more embedded and is used increasingly on student information websites,” he said.

anna.mckie@timeshighereducation.com


Reader's comments (2)

Your previous article said it best, "The Hepi/Advance HE Student Academic Survey finds undergraduates at gold-rated institutions do not rank teaching staff highly" The TEF is an invalid measure. It does not measure quality teaching. https://www.timeshighereducation.com/news/students-gold-providers-see-good-value-not-good-teaching
Having taught in FE (subject to Ofsted) before slithering into academia, I don't think either are good measures of quality learning and teaching! Using 'student satisfaction' as an indicator is daft - some of the best teachers are unpopular with many students (especially the less able or lazier ones) because they challenge and stretch them well beyond their comfort zones, yet those who put in the effort learn extremely well.