
Inoculating students against AI-generated scientific misinformation

GenAI raises an urgent pedagogical question for universities: how can we train students to evaluate scientific claims critically when the language of scholarship can be so convincingly simulated?
Elissar Gerges
Zayed University
7 Apr 2026
Image: detail of a medic drawing a vaccination shot. Credit: MargJohnsonVA/iStock.


A new form of academic misinformation has emerged that differs from earlier waves of misleading online content. Where older forms relied on poor sources or obvious bias, misinformation from generative AI tools combines convincing explanations, plausible citations and professional-looking visualisations. These “AI slop” outputs appear credible and authoritative, and they rarely indicate whether their sources are real, fabricated or incomplete.

So, for students still developing their academic judgement, distinguishing between legitimate research and hallucinated outputs is increasingly difficult. GenAI raises an urgent pedagogical question for universities: how can we train students to evaluate scientific claims critically when the language of science itself can now be convincingly simulated?

One response is for universities to foster “competent outsiders”: students (and graduates) who might not possess deep, specialist knowledge in every field but are equipped with the evaluative and social competencies to engage critically with scientific information. To train students in the necessary critical thinking and analysis skills, instructors can draw on an interdisciplinary framework that combines science education, psychological theory and media literacy.

Inoculate students against AI-generated misinformation

An effective approach is to expose students directly to AI-generated scientific claims and guide them through the process of verification. Inoculation theory, a psychological method for building resistance, draws a powerful analogy from medicine. It suggests that pre-emptively giving individuals a weak dose of misinformation, followed by a clear refutation, can build mental antibodies. This process helps individuals recognise and resist persuasion techniques. In class, this can take several forms. Technique-based inoculation exposes students to the logical fallacies and rhetorical tactics commonly used in misinformation, such as ad hominem attacks or false dichotomies. Fact-based inoculation directly corrects specific falsehoods with credible data. 

A more experiential method might involve asking students to evaluate an AI-generated infographic or research summary, before debriefing them on its misleading tactics.

Inoculation should not be a one-off activity. If students analyse an AI-generated claim about genetic modification in one module, for example, they should later encounter a different example, perhaps related to climate policy or public health, and apply the same analytical tools. As with immunisation, repeated exposure helps students recognise recurring persuasion techniques across contexts. 

And because AI systems can generate new variations of misleading claims almost instantly, the effects of a single inoculation can fade. Brief, recurring exercises embedded across courses or programmes throughout a semester are more effective than a single workshop on misinformation. 

Teaching scientific media literacy

Inoculation theory finds its essential partner in scientific media literacy. In simple terms, this involves understanding scientific content while applying knowledge of both science and media to evaluate how scientific claims appear in news outlets, social media, AI outputs and other forms of communication. This is where the interdisciplinary mission of higher education institutions becomes paramount. Scientific media literacy should not be siloed within science departments. It is a cross-cutting competence that connects formal coursework to the realities of public discourse. Different formats can teach students to read critically and increase their sensitivity to misinformation:

  • A politics class might analyse how media outlets frame scientific uncertainty during policy debates.
  • A business course could examine sustainability reports or marketing campaigns to assess how scientific evidence is presented.
  • A literature seminar might explore how contemporary fiction constructs narratives about science and technology.
  • A computer science or digital literacy course could examine how generative AI produces scientific explanations and where hallucinated citations or misleading claims emerge.

How AI summaries can support critical thinking

Using AI-generated summaries alongside news articles and original research papers for analysis and assessment allows educators to evaluate a student’s ability to judge the quality of evidence, identify potential biases and formulate an informed opinion. For example, instructors might ask students to verify the references in a summary from a large language model. Fabricated citations or links to unrelated articles make the limitations of AI-generated scientific explanations immediately visible. 

Scientific media literacy requires instructors to possess an interdisciplinary understanding of media genres, the nature of science and the processes of scientific consensus-building. Faculty development programmes, time for preparation and communities of practice, where educators from different disciplines can collaborate on curricula and share resources, can help educators acquire the necessary skill set.

Students should understand that scientific knowledge is often contested. During the Covid-19 pandemic, evolving guidance was frequently interpreted as incompetence or manipulation rather than as part of the normal process of scientific revision. Classrooms are ideal spaces to clarify this. Instructors can explicitly discuss how consensus forms, why recommendations change with new evidence, and how uncertainty differs from unreliability. Making these processes visible reduces the likelihood that students will misinterpret scientific disagreement as failure.

Universities cannot eliminate misinformation but they can equip students to navigate it. This requires more than adding a single module on media literacy. It demands sustained, interdisciplinary attention to how evidence is produced, communicated and contested. By combining inoculation strategies with scientific media literacy across disciplines, institutions can produce students who are knowledgeable in their fields and capable of evaluating claims responsibly in public life. In an era where AI can generate persuasive scientific misinformation at scale in seconds, that capacity is no longer optional. It is a core outcome of higher education.

Elissar Gerges is assistant professor in the College of Interdisciplinary Studies at Zayed University, UAE.

If you would like advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the Campus newsletter.
