
Students told us what GenAI guidance works. Here’s their advice

“I know ChatGPT can help with my literature review but I’ve no idea if I’m allowed to use it, how to cite it, or whether I’m somehow cheating,” a postgraduate student told us.
When we asked students at our university about GenAI use in an educational context, they didn’t express anxiety about academic integrity. They expressed frustration that nobody had taught them how to use it properly.
Working with colleagues, we ran focus groups with students across disciplines and found that teaching responsible GenAI use doesn’t require technical expertise. It requires clarity, structure and reframing these tools as learning support rather than academic threats.
Most UK universities remain stuck in the starting blocks. Some mainstream institutions are beginning to roll out GenAI platforms – Northumbria University and the London School of Economics and Political Science, for example – but these are exceptions. Waiting for infrastructure that may never arrive leaves students navigating GenAI alone. But individual academics, working in partnership with students, can provide the guidance their institutions haven’t.
Understand how students use AI
Our research found that students use GenAI strategically. They treat it like a study partner: testing understanding through questions, breaking down complex concepts, and overcoming the blank page problem. International students we interviewed particularly valued GenAI for navigating unfamiliar academic conventions and language barriers.
Students aren’t bypassing learning. They’re trying to learn more effectively. So talk to them, put GenAI on the agenda of your next student-staff liaison meeting, and speak to your students’ union about what students are telling them.
You don’t need to understand neural networks. You need to give students three things:
1. Clear, assignment-specific boundaries. Provide concrete guidance for each assessment. “You may use GenAI to generate essay outlines and check grammar,” “You may not use GenAI to write paragraphs or arguments,” “You must cite GenAI use in your bibliography.” Simple. Specific. Actionable.
2. Practical prompting skills. Level the playing field by running a 30-minute workshop on asking specific questions, providing context, and critically evaluating outputs. Show students the difference between “explain photosynthesis” and “explain photosynthesis to someone who understands basic chemistry but not biology, focusing on the light-dependent reactions”.
3. Critical evaluation habits. Students know GenAI makes mistakes, but they need structured verification methods. Encourage students to check that sources exist, cross-reference claims against course materials and recognise plausible-sounding nonsense. Teach them to ask: “How would I verify this?”, “What would contradict this?” and “Where are the gaps?”
Address the social and structural barriers
The motivation to use GenAI is socially constructed. Students’ usage shifts based on peer conversations, module leader comments and social media discourse. Framing and instructions matter. When lecturers say, “AI is here, let’s use it responsibly,” and discuss it transparently, students will engage openly. Silence creates anxiety.
Dedicate 15 minutes of your first lecture to GenAI. Show good and bad prompts. Demonstrate how you would use GenAI for the module and show examples of where its use will fail. Make it normal, not taboo.
Mind the access gap. Free and premium ChatGPT aren’t the same tool. Students who can afford subscriptions get better support. Point students towards capable free tools. Push your department for institutional access.
Make it discipline-specific
Generic policies fail because GenAI’s usefulness varies wildly. Chemistry students use it differently from literature students, so provide discipline-specific examples. “Here’s how you might use GenAI to understand reaction mechanisms.” “Here’s why GenAI struggles with analysing modernist poetry.” Students need contextualised guidance, not abstract principles.
Start with one slide
If you do nothing else, add one slide to your next opening lecture. Title it “Using GenAI for assessment on this module.” Add three bullet points: what’s permitted, what’s not, and how to cite. Students won’t abuse this clarity. They’ll appreciate it. Across institutions and contexts, students consistently ask for clear policies and practical advice, not just warnings.
We don’t need to wait for fully fledged institutional infrastructure to act. Instead:
- Start small: one clear slide per assessment
- Build up: a 15-minute lecture introduction
- Go further: discipline-specific workshops on prompting and verification
The technology will change, but clarity, boundaries and student-teacher collaboration don’t require a computer science degree. They require what good teaching has always required: clear communication about expectations and genuine engagement with how students actually learn.
UK higher education can’t afford to wait. Students are using GenAI now. The question isn’t whether to engage with this, but whether we’ll provide the guidance students want.
Yanyan Li is an early-career fellow at the Institute of Advanced Study and a fellow of the Higher Education Academy at the University of Warwick.
Tom Ritchie is a reader in Chemistry Education and the director of student experience at the University of Warwick and is currently serving as a US-UK Fulbright Scholar at Elon University.


