
‘AI turns the classroom from structured event into improv class’

The ostrich approach to artificial intelligence, hoping that it will all blow over, is misguided. AI is here, and we cannot continue teaching as before. What we did even a handful of years ago has little bearing on our current situation. We need to move from the educator being the sole arbiter of knowledge to being a sparring partner and guide, working alongside AI. We need to say goodbye to scripted lectures and linear slides.
Yet students tell us they still want professors in the classroom because self-learning is hard and rarely successful, as the low completion rates of otherwise excellent massive open online courses show.
AI means rethinking a fundamental: our role as teachers.
In practice, that means less reliance on scripted material and more workshop-style environments in the classroom. We can give students AI prompts for a task and ask them to improve upon them. We can live-critique and compare their solutions, and guide students through audits of AI outputs to test the large language models’ sources, assumptions, biases and logic. This rethink builds on proven alternative pedagogical concepts: studio-style sessions, flipped problem-based work, team-based clinics, Socratic debates, in-class case simulations and live data analysis.
However, to be most effective, that approach requires us to start treating the classroom less like a structured end-to-end event and more like a kind of improv class. What does this look like? The teacher uses an outline of intended learning outcomes, gives students agency, and watches to see where it takes the class, together. This shifts our role from transmitting prerecorded answers to provoking better questions, giving us a more active, participatory role. It requires us to embrace open-endedness and, yes, challenges to our own knowledge.
Granted, it is initially stressful to give up so much control over the lesson plan, but doing so has led to some of the most rewarding classes I have taught. If we embrace this shift, it can benefit educators, too. It can put us back on the learner side, helping us to see old material in a fresh light and rediscover the joy of learning in our field.
To do so, educators will need to develop critical AI literacy. The more they understand AI, the more comfortable they will become with the capabilities and limits of machine learning.
Then, on top of this, educators need to train their soft, human pedagogical skills, the ways of learning and understanding that AI cannot model or perform. This includes intuition, empathy, openness, improvisation and trust. These facets of being human will only become more important as AI becomes more embedded in our lives.
In class, educators can train these skills by letting students’ questions shape the path of enquiry, and by learning to let a curiosity-driven discussion flow. Prioritising productive in-person study will be key, and will strengthen sociability and group empathy.
Relinquishing top‑down control and adapting ourselves to AI now being embedded in education will require a rewiring of norms and bravery. By taking the bold step to co‑construct knowledge and meaning in the classroom with their students, educators will, though, find themselves breathing new life into enquiry and critical engagement.
The knock-on effect of rethinking teaching is ensuring assessments are also fit for purpose. We need to design assignments that require students to critically interrogate AI output: to justify their choices, verify evidence and explain how they arrived at a conclusion. Faculty can decide, assignment by assignment, whether AI use is permitted. Oral exams do not scale well, but they are a better form of assessment than multiple choice in an AI-suffused world.
Of course, the challenge of equipping our students with the skills and knowledge they need for the world, and for work, does not end at the classroom door or course login.
We must make AI literacy a core starting skill. At Bocconi, horizontal AI courses for all incoming students teach AI literacy, critical thinking and bias detection. We have also updated our code of conduct so that students know, in detail, what we expect from them when using AI.
None of this will be easy, and we are, right now, in an uncomfortable transition period. Eventually, as we adapt, higher education will look wildly different from how it did even a few years ago.
There are, though, plenty of reasons to be hopeful: the calculator, the internet, Wikipedia and Moocs did not destroy higher education. Learning has a key social component, as the pandemic made painfully clear. Most students will not use AI to shortcut thinking, but will harness it to learn and study in novel ways and expand both their own and our collective knowledge.
It is up to us to guide their curiosity towards deeper understanding, not simpler answers. When Bocconi introduced OpenAI access to all faculty, staff and students last June, a colleague approached me and asked what we should do in response. The concern in their question was understandable, but it also betrayed an illusion: that AI was around the corner. Even then, it was already here.
Dirk Hovy is an associate professor in the department of computing sciences at Bocconi University, Italy.