
The questions publishing students need to be asking about GenAI

How do we teach the future workforce of an industry that’s currently being reshaped by generative artificial intelligence? While many creative industries are concerned about the potential effects of new technology, publishing has been under the cosh lately. Hachette pulled horror novel Shy Girl from the shelves after GenAI tics were identified in the writing, and Bloomsbury admitted to unknowingly using an AI-generated image on a book cover.
But publishing has been using GenAI, too – several publishers work with ElevenLabs for AI audiobooks, academic publishers use bespoke AI in their workflows, and platforms such as Shimmer are popular AI marketing tools. Both trade and academic publishers have made deals with OpenAI and Anthropic.
Working with GenAI undoubtedly offers advantages: quickly understanding large manuscripts or datasets, automating key workflow tasks, making content more available across different accessibility needs. But there are trade-offs, too: the legal and ethical considerations of adding content to a model’s training database, the impact on the labour market and, of course, the environmental cost.
None of this is easy terrain to traverse, much less to teach the students who are expecting to go into a fast-changing publishing industry.
Students’ concern about AI
Publishing students are generally open to learning a huge variety of skills they probably never thought they needed. They often come from a humanities background and bring a wide range of life experiences and interests to the classroom. But often, they’re inherently sceptical of the value of GenAI.
Students are concerned that its use can take away from the “humanness” of writing a book. That feeling of connection, immersion and even a sense that a good book can change a life sits at the heart of this scepticism. How can a machine truly create something real and unique?
This concern is echoed in the wider industry, with the launch of the Society of Authors’ Human Authored initiative – providing a logo that allows authors to declare that their work has been written by a real, live person, without using GenAI. Perhaps we feel this is necessary as it becomes harder to distinguish GenAI content from human-generated writing.
How we can manage the fear and judge opportunities
When teaching students to think and work critically with GenAI, we ask them to apply the theory we teach and to engage a tool of their choice in debate: ask it questions, challenge it, check its work to see what, or if, it is hallucinating, notice how it may try to persuade them that it is correct, and critically analyse their conversations with the tool.
Students who first fully understand a theoretical concept and take a stance on it – say, “What is an author?” – before discussing that concept with a chatbot are on a better footing to evaluate the quality of any output. Asking students to engage in this way has produced some of the most interesting student work I’ve seen in years, with students considering the impact that using the tool has on their thinking.
This can take the form of students using more than one tool, learning which one merely stroked their egos – always telling them how smart and wonderful they are – and critically considering what that means. It’s especially interesting when a student purposely inputs incorrect information or tries to pin the tool down on particular quotes or page numbers, which it was often unable to provide accurately. Some students took a particular stance on authorship and set out to see if the AI tool could change their mind, asking whether AI has the capacity to understand the meaning behind something and what it means if a tool can “hold” a particular perspective.
Following on from this, we also ask them to consider the ethical and environmental implications of the task. Responses here depended on the students themselves and their own positions on these topics. I found that bringing these conversations into the classroom well before the assignment was due enabled peer-to-peer learning about environmental impact – the natural resources a GenAI query consumes compared with a standard web search – which often matters to Gen Z learners.
Likewise, conversations about whether it is ethically OK to feed someone else’s work into AI tools – linked to ideas of authorship, writing and making a living from publishing – can draw out critical thinking about who owns content and who has the right to give it to these large language models.
Preparing students for the world as it is, not the world they may want
Our role as academics is not to tell students going into the publishing industry that they should or should not embrace the use of GenAI. Instead, we need to prepare them to have the mindset of an innovation manager, whatever area of the industry they go into. This does not mean they should always be looking for ways to adopt AI; it means teaching them the skills to judge new technologies, to make a case for or against adoption at a specific point in time, and to be unafraid to revisit that decision.
We should ask students what their starting point is on GenAI use and then teach them to systematically work to see the other side. We need to pose questions that challenge them to think about the benefits or drawbacks of GenAI use in different areas of publishing – where it might work better, such as AI translations of academic work that wouldn’t otherwise be translated, or where it’s less valuable, such as creating new fiction. This can lead them to develop a nuanced stance on GenAI use and how they want to approach it in their careers.
Miriam Johnson is senior lecturer in publishing media at Oxford Brookes University.