
Four directions for assessment redesign in the age of generative AI

The rise of generative AI has led universities to rethink how learning is assessed. Julia Chen offers four options for assessment redesign that can be applied across disciplines

21 Jul 2023


The availability of generative AI (GenAI) tools, such as ChatGPT, has made some conventional assessment tasks, such as literature-review-based term papers, practically redundant. At the Hong Kong Polytechnic University, our institutional position is to embrace GenAI and rethink learning purposes and assessment design.

Instead of banning GenAI or using detection software to police student use, starting this September the university will allow, and indeed expect, the use of GenAI in take-home assessments. This means teachers have to rethink what they want students to learn and consider how their assignments can be redesigned so that GenAI can help with, but not complete, an assignment for a student.

Working with numerous departments, the university’s Educational Development Centre (EDC) has pursued four directions for effective assessment redesign, intended to be applicable across disciplines and institutions.

Two major guiding principles steer the redesign of assessment: (a) the assessment cannot be completed mostly or wholly by GenAI; and (b) students need to demonstrate their understanding of the concepts and skills they have learned. The four directions below align with these principles.

Direction 1: From written description to multimodal explanation and application

The first step is working with teachers to rework assessment tasks so that students are asked to explain concepts or skills in a specific context; this works especially well in a local context with which GenAI is unlikely to be familiar. Last month, I worked with a professor whose previous assignment included a literature review on a certain topic. In one of his redesigned assessments, instead of describing the primary components of an engineering system, students will have to apply what they learn in class to analyse that system in a location they are familiar with (but which GenAI may not be), such as their home or the university campus. Not only do students submit their explanations in writing, but they also have to include snapshots of the locations to support their analyses.

In another example, instead of describing the key elements of a framework, a redesigned assessment might require students to explain how these elements are exhibited, or not exhibited, in a specific context, such as a particular district or an organisation in the university’s town or city. Assessments in a higher-level course can ask students to make comparative analyses of two distinctly different contexts or locations. Students could be asked to produce multimodal submissions that include not just written text but also video of themselves on location explaining their application of knowledge.

Direction 2: From literature review alone to referencing lectures

For courses without on-site, practical aspects, Direction 1 is not applicable. An alternative that follows the two guiding principles noted above is to require students to refer in their term papers not only to the literature but also to discussions in lectures and tutorials. A humanities professor at my university is asking students to “draw on theories from the course literature and the discussions we had in class to support your argument”. Teachers can consider specifying the exact lessons and discussions to which the assessment should refer.

Direction 3: From presentation of ideas to defence of views

A common assessment at university is the oral presentation. However, given that GenAI tools can generate attractive slides with a full script, the time allocated to the presentation itself should be shortened and more time spent on a question-and-answer session in which the student-presenters have to defend their views, methodology and conclusions. Both the teacher and fellow students should ask challenging questions, and student-presenters are scored on how clearly and logically they respond.

One area worth working on with teachers is the use of “negative” questions in the Q&A, such as “wouldn’t method B have been more effective?”, “why didn’t you consider another perspective?” or “couldn’t the results have been interpreted in a different way?” The purpose is to encourage students to think critically about how and why they approached their assessment one way and not another.

Direction 4: From working alone to student-staff partnership

Particularly since many students are more familiar with GenAI tools than their teachers are, redesigning assessment is a valuable opportunity to involve students as pedagogic partners. Teachers at my university are encouraged to pair up with senior-year students who have taken their courses to discuss ways in which assessments can be revised so that GenAI cannot complete them for students. Successful student-staff partners recently shared their collaborations at two faculty development events, and more open-to-all sharing sessions will be organised to further showcase good practices.

Conclusion

The emergence of GenAI has afforded higher education many opportunities to reconsider the purpose of assessment and how it gathers evidence of students’ learning. Redesigned assessments can allow students to exploit all available resources, including GenAI, more fully and avoid the regurgitation of information by requiring students to demonstrate understanding through the application of knowledge in situated contexts and to employ critical thinking and communication skills when completing and defending their work.

Julia Chen is director of the Educational Development Centre at the Hong Kong Polytechnic University.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
