
Are ‘quick wins’ possible in assessment and feedback? Yes, and here’s how

It takes coordination, communication and credibility to implement quick improvements in assessment and feedback, as a team from the University of Exeter explain

21 Jul 2023

Created in partnership with the University of Exeter


National Student Survey (NSS) and Postgraduate Taught Experience Survey (PTES) data indicate that student satisfaction is consistently lower in assessment and feedback than in many other aspects of the student experience. These satisfaction scores appear sticky across the sector, suggesting that the efforts of many institutions are not translating into tangible benefits for students.

One reason for slow progress in this area might be the difficulty of making changes in-year. Assessments are often confirmed in the previous academic year, when module specifications for the next year’s teaching are approved. This timeline is also out of sync with module feedback and survey results, reducing educators’ ability to respond meaningfully to current students’ experiences. The result can be a sense of delay and inertia in assessment innovation, which is frustrating for educators and students alike, especially in the context of changing sector perspectives on assessment.

Although substantial in-year change to assessment design might prove difficult, many updates to assessment practice can be implemented within the academic year. Marking criteria can be clarified, marking teams can be better trained and organised, and communication with students can be improved to address concerns and knowledge gaps about the assessment objectives. The impact of these quick fixes is accentuated when they are delivered consistently across the institution. To make this happen, we suggest an approach built on three factors: coordination, communication and credibility.

Coordination: setting up an assessment expert group

Improving assessment practice requires insight from educators at the chalkface. One way to establish this is to form an “assessment and feedback expert group”. Bringing together educators, academic development specialists and student participants from across the institution establishes a community of practice beyond those in formal leadership roles, whose members can share their experience and carry opportunities for improvement back into their local networks.

Focusing the group on “quick wins” can encourage discussion to address specific tips and tricks that educators can use without changing their assessment briefs and without significant preparation. Each expert group meeting follows a different theme relating to assessment and feedback – such as managing marking teams, communicating with students about the “journey of an assessment” through the marking process, or embedding effective formative feedback practices in the classroom.

When ideas surface that can easily be translated into “quick wins” guidance, relevant group members can quickly pull together a short article or set of tips to be shared more widely in the institution. This takes us to the next factor: communication.

Communication: sharing good practice with staff and students

The success of the expert group depends on the timely dissemination of its advice and findings to colleagues across the university. One way to achieve this is to reach out to colleagues via an “assessment matters” newsletter. This helps to engage the wider academic community in continuous debate and discussion about assessment and feedback, despite limited resources and capacity. A clickbait approach works to draw readers in, with short, easy-to-read articles that feel more like practice-sharing than prescription – titles might include “six ways to help students understand the marking criteria” or “improve moderation on your module in five minutes”. Articles can then link out to case studies of local good practice, discussion forums, workshops and other opportunities to learn in more depth.

The newsletter can disseminate resources that educators can use in their teaching – such as an infographic explaining to students the steps that happen between submitting their essays and receiving their feedback. It also enables the team to capture educators’ engagement with support and guidance through analysis of click-throughs and other metrics, helping to identify areas of particular challenge or interest to the community.

Credibility: setting expectations in assessment and feedback from the ground up

A key benefit of this approach is the legitimacy conferred on the recommendations and ideas produced by groups of experts in teaching practice rather than groups of university leaders. This ensures that expectations for assessment and feedback are developed from the ground up, by educators with disciplinary expertise and credibility. Involving student representatives and academic development specialists enables further buy-in from across the institution, ensuring ideas are joined up and helping to dismantle disciplinary silos. This enables the group to develop and communicate agreed principles on, for example, the qualities of feedback to which an institution aspires. As understanding about the role of the expert group spreads, the participants can be supported to share their discussions with peers in their home departments, supporting less-experienced colleagues, embedding meaningful peer dialogue and growing the network of educators willing to participate in the future.

Changing students’ perceptions about assessment is not easy. However, the “expert group” model offers potential for tangible, co-created change in assessment and feedback practice within the current academic year. Conversations that start with simple clickbait prompts can result in a growing community of practice in which the chalkface experience of colleagues can set expectations for delivering wider cultural and institutional change.

Beverley Hawkins is associate dean for education at the University of Exeter Business School, Eleanor Hodgson is senior academic developer and director of ASPIRE PRP at the University of Exeter and Oli Young is associate dean for taught students and chief diversity officer, also at the University of Exeter Business School.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
