
‘Our agency over our words is central to our agency over ourselves’

Using LLMs is easy but developing human intellect is hard – and that’s what the university classroom should be for. Here’s how to help students work on their mental muscles
Carla Arnell
Lake Forest College
9 Apr 2026
[Image: a student writes in a notebook. Credit: iStock/Pressmaster]


“What’s the name for a technologically backward person? You know, not a technology hater, like a Luddite. Someone who just can’t get the new-fangled innovations.” That’s a question I asked my computer science major son recently. He replied, “I dunno. I think it’s you, Mom.” 

He’s right. I’m someone who struggles just to figure out how to shift cells in an Excel file. I get muddled by Moodle’s many options. I get lost among the bells and whistles of Adobe Acrobat Pro. And it took me a year to discover how to edit text messages – and I still can’t, really. 

But GenAI machines such as ChatGPT or Claude I get – and I got how to use them in about a day of play. There’s something infinitely intuitive about using a large language model (LLM). One learns to use it pretty much the way children learn to play video games – through trial and error. And LLMs have an additional advantage: one need only ask ChatGPT how to use ChatGPT! 

Why, then, are higher education institutions so anxious about teaching college students to use GenAI to prepare them for an AI-enabled workplace? Sure, universities have an important responsibility to engage students in serious study of the ethical, political, psychological and social implications of AI technology, as well as what LLMs are – and aren’t. Above all, universities must adapt their traditional instruction in information literacy to address what’s now retrievable via LLMs.

But beyond those forms of education, the race to fold GenAI tools into classrooms is misplaced, and the anxiety about being too slow to do so is overwrought. Incoming college students already know much about how to use GenAI – and what they already know is easily adaptable to the level of skill needed for most first-time jobs. 

What students don’t come to college knowing is the hard intellectual work of using their own minds to read critically, master perplexing mathematics problems, create cogent arguments, express ideas without wilting in the face of critical challenges, and write with fluency and flair. In the face of that dearth, the classroom must remain a space for educating human minds. Learning to use LLMs is very easy but learning to develop the human intellect is hard. That basic fact seems rarely acknowledged in all the mystification around the rise of LLMs.

Yet when students have such easy access to LLMs for the very intellectual skills they came to college to learn, and when they turn to those tools to do their assigned work, they subvert the traditional processes meant to train their own minds. To address that predicament in the case of writing, I recently argued that we might now need to create additional credit-bearing options – writing labs or formal workshops – for students to practise writing. Otherwise, the seductive temptation to use LLMs to brainstorm, outline, draft, revise or edit assigned writing outside class could make the traditional writing assignment a fool’s errand.

These labs or workshops would require time – because the whole writing process takes time. Contrary to what some educators assume, the early stages of invention, such as brainstorming and idea mapping, are not extraneous activities that are fine to farm out to machines before getting to the “real work” of drafting. 

As University of California, Irvine philosophy professor Anastasia Berg observes: “No aspect of cognitive understanding is perfunctory.” Those early, stumbling stages of imaginative and intellectual work are critical exercises of mental muscles that need development. Even supplanting the editing phase of writing with AI-assisted technology can hinder students from deep thinking about their own choices of diction, sentence structure and style. Grammarly’s aggressive AI function removes both the work and play involved in having agency over our own words. And our agency over our words is central to our agency over ourselves.

Through a credit-bearing writing lab or workshop, free of GenAI, students would be able to slow down and experience the invention process from beginning to end, with expert human guidance and a community of human readers. 

Hannah Pittard recently described the utility of slow writing in a no-tech course on creative writing. In addition to forbidding electronic technology, except for those students with accommodations, she goes one step further and requires students to handwrite their creative work because “the tools shape the work”. She continues, “If you hand a student a machine designed for speed and infinite substitution, you will get speed and substitution. If you hand them a page and say, ‘Linger here for a while’, something else happens. They slow down. They hesitate. They cross things out. They try again. They think. They think.” Notice the italicised “they”.

In two recent literature courses, I experimented with sustained use of in-class writing, asking students to create commonplace books, a once-popular Renaissance custom. In a course on George MacDonald’s fantasy fiction, every day began with students opening a physical journal, handwriting responses to questions about the readings or developing the seed of an argument they could later cultivate in oral discussion and then return to and refine. 

Like Pittard, I found that most students loved the idea of handwriting their reflections and marking them with their personalities – from doodles and filigreed illustrations to quotable “MacDonaldisms” and snatches of classroom conversation. Some books took on the look of a lovely illuminated manuscript. Indeed, I have never had so many students request to collect their academic writing at the end of the semester. They seemed to treasure what they had created. Ultimately, though, this isn’t about handwriting; it’s about challenging the human mind’s wellspring of intellect and imagination.

To be sure, some would argue that there shouldn’t be a hard and fast dichotomy between educating human minds with or without GenAI tools. As someone who is always suspicious of simplistic binaries, I’m sympathetic to that objection. Couldn’t GenAI make reading more fun? Couldn’t it provide excellent training in argumentation and dialectic? Couldn’t such tools model good writing – or make writing unnecessary so that students can focus on “higher-level” forms of thinking? Why teach young people what a machine can do for them?

Of course, anything’s possible, but decades of integrating new technologies to improve student learning outcomes have left a very poor track record of success. In the US, the constant creep of so-called edtech into primary and secondary school classrooms has not produced students with better reading skills, stronger maths literacy, better psychological health or greater ability to interact with peers in non-polarising ways.

And GenAI is poised to only worsen that trend. As Berg argues, “To leave our students to their own devices – which is to say, to the devices of AI companies – is to deprive them of indispensable opportunities to develop their linguistic mastery, and with it their most elementary powers of thought.” 

The emergence of GenAI should be a creative opportunity for higher education. Not because such technology will improve student learning but rather because it will improve teaching. It will force us – some of us, at least – to redouble our efforts on the hard problem of honing human intellects. And that hard work will pay dividends for ourselves and our students. 

The most agile future employees will not be those who have spent the most class time playing with ChatGPT. LLMs need to be recognised as human beings’ competitors, as several of my colleagues tell their students. Accordingly, the students who are most prepared for that competitive future will be those young people who have trained their own minds to understand tough texts, write cogent and compelling prose, and critically discern what’s true. Teaching students these mind-quickening skills is not a dead end. It is the best path forward for human education. 

Carla Arnell is professor of English and associate dean of the faculty at Lake Forest College.


