
Show students what thoughtful engagement with GenAI looks like
Generative artificial intelligence has reshaped how students think, work, write and even define their own sense of authorship. But do they value it as a learning tool? While most students now use GenAI regularly for coursework, far fewer believe it truly deepens their learning or increases their engagement. Many describe it as a shortcut, an efficient way to finish tasks rather than a tool that expands understanding.
Yet the instinct to combat GenAI with rules and suspicion does not move learning forward. Students will use GenAI because it is instantaneous and handily built into platforms they already navigate. If students experience GenAI primarily as a threat to be managed, they are less likely to disclose if and how they are using it.
The real question is not how to stop them, but how to support them. How do we preserve curiosity when answers come instantly? How do we cultivate ownership when a tool can generate polished writing with a single prompt? And how do we keep motivation alive in a digital environment that often prioritises speed over depth?
The key lies in creating learning experiences that focus less on the final product and more on the intellectual journey. Students need opportunities to reveal what they attempted, questioned, misunderstood or revised, not just the text they turned in. When we design moments for students to articulate their reasoning, we make the invisible learning process visible again. A short voice reflection, a few sentences explaining how a tool was used or an annotated draft tracing the evolution of an idea can help students recognise their own agency in a GenAI-rich environment.
Test understanding, not memory
Supporting students also requires shifting the focus of assessment. GenAI excels at perfect formatting, grammar and organisation. If these remain the heart of our evaluation criteria, students will naturally rely more heavily on automated tools. But when we reward curiosity, critical thinking, evidence of decision-making and the originality of connections students make, the centre of gravity moves from the output to the mind behind it. Students quickly learn that while GenAI can generate text, it cannot generate understanding – and understanding is what matters.
Define what’s appropriate
Another essential shift is helping students develop ethical reasoning instead of simply obeying rules about integrity. Many students are still unsure what “appropriate use” actually means. Is asking GenAI to help brainstorm unethical? What about reorganising ideas? What about checking tone? These are not just policy questions; they are developmental learning opportunities.
By inviting students to discuss ambiguity openly, instructors help them develop the judgement they will need in academic work, professional environments and civic life. One way to approach this is to normalise conversations about the uncertainties that accompany new technologies. Instead of presenting GenAI use as a simple boundary between permitted and prohibited actions, invite students to consider the intent behind their choices and the role the tool played in shaping their work.
Asking students to briefly explain how GenAI influenced their thinking or where they decided to rely on their own judgement can make these decisions visible. Over time, such practices help students recognise that responsible AI use involves ongoing interpretation rather than simple rule-following.
The classroom today is multigenerational, and students come with vastly different assumptions about what technology is “normal”. Some remember life before smartphones; others have never known it. Some see GenAI as a threat to learning; others see it as an everyday tool for solving problems. Acknowledging these differences rather than flattening or dismissing them creates a shared foundation for understanding.
When we’re transparent about expectations, consistent in our explanations and willing to engage in dialogue, students feel safer asking questions and more confident navigating new tools. Explaining the reasoning behind these expectations, not only the rules themselves, helps students understand the educational values those expectations are intended to protect.
Demonstrate transparent use
Modelling is equally essential. Students notice how instructors use GenAI, whether we acknowledge it or not. When educators demonstrate responsible, transparent AI use, such as testing prompts during a lesson, exploring alternative explanations, comparing human-written and GenAI examples or showing how to check for accuracy and bias, we teach students more than any policy statement can. We show them what thoughtful engagement looks like.
But supporting students also means reflecting on our own habits and assumptions. How often do we use GenAI behind the scenes to speed grading, rephrase feedback or generate teaching ideas? Are we honest with students about this? If we expect them to understand the limitations of AI, do we model that understanding ourselves? Do our assignments reward depth or do they inadvertently reward efficiency? These questions help educators build integrity, not as a rule, but as a shared practice.
The importance of empathy
Finally, empathy matters more than ever. Some students worry about falling behind peers who seem more adept with AI. Others fear being accused of misconduct if their writing sounds too polished. Some simply feel overwhelmed by the pace of technological change. When instructors acknowledge these emotions and make space for uncertainty, they create classrooms grounded in trust. Empathy helps students take risks, ask honest questions and learn without fear of being misunderstood.
Supporting students in the age of AI does not mean lowering standards or abandoning rigour. It means rethinking rigour itself. Rigour based on complexity, reflection, analysis and creativity still matters. GenAI can automate tasks, but it cannot replicate the intellectual habits that define education: questioning assumptions, identifying bias, making meaning, building understanding and transforming knowledge into action.
The future of higher education will not be defined by GenAI tools, but by how we teach students to engage with them. When we design learning that values thinking over typing, transparency over perfection, and curiosity over speed, we empower students to become not just consumers of information but active, ethical and reflective learners. We prepare them for a world where AI is ever-present but not all-powerful, influential but not determinative.
AI may reshape the methods of learning, but educators still shape the meaning of learning. The challenge is not to compete with the technology, but to ensure that human thinking remains at the centre of education.
Walaa Awad is an educator at Colorado State University Global.