AI is a wake-up call for those doing scholarship by numbers

ChatGPT must compel humanities scholars to rethink their acceptance of intellectual mediocrity and lax standards, says James Walker

April 2, 2023
Painting by numbers

Since the late 1990s, cultural production seems to have been running in a sort of recursive loop. Film studios reboot Spider-Man and his comic book ilk multiple times. Tom Cruise recently put his flight suit back on and took off from where he landed in 1986. The journey from orphan nobody to Jedi hero is undertaken by a young Englishwoman instead of an all-American farm boy, but the moves remain familiar. As the cultural critic Mark Fisher put it in 2014, the future has been cancelled.

Accordingly, cultural criticism has become even more algorithmic. As a graduate student in the humanities, I witnessed my colleagues write academic papers that were, in their own way, painting by numbers. They applied postcolonial theory to tourism advertisements, gender theory to music videos, or interpreted “the discourse of the NBA” through a racial lens. It was not that, as many on the right claim, their conclusions were absurd. The advertising campaigns of countries like Morocco and Turkey are indeed guilty of self-exoticism. It’s true that the depictions of women in music videos are often highly sexualised. And it’s undeniable that some thoughtless sports commentators fall into stupid racial tropes about “humble, earnest” white athletes and their supposedly “arrogant, trash-talking” Black counterparts. However, uninventive statements of the banally obvious are rarely impressive. It was as if these would-be cultural critics were merely passing parameters into an existing computer algorithm.

The authors of this work may have defended it by claiming that, despite its prosaic quality, it still possessed value as a sort of political exercise. Insight, originality or the application of intellectual imagination were never the values to which this “scholarship” aspired. Rather, writing (and presumably reading) such material is supposed to enhance one’s understanding of the machinations of power and culture. To participate in this consciousness-raising is to “do the work”. However, as it turns out, this work can be automated. Now the algorithm doesn’t need the human to feed it the data.

The rise of ChatGPT has brought about a lot of hand-wringing among humanities academics. How will professors identify potential plagiarists? Will the student essay even survive? The bigger question is whether academia itself will survive. When thinking itself becomes a rote exercise in “applying a lens” and prose a vulgar recitation of fashionable buzzwords, it should come as no surprise when an AI comes along to replace the scholar who is, in some sense, already an automaton.

But I welcome the age of AI. What all the takes on ChatGPT seem to agree upon is that the essays it produces are extremely… okay. At their very best, they are informative but perfunctory texts lacking voice: a B+. At their worst, they are error-laden, facile, repetitive compositions that don’t do their subject matter any justice – but that’s still only a C- in today’s age of massive grade inflation. Surely this must compel scholars in the humanities to look at their own all-too-human acceptance of intellectual mediocrity and lax standards of academic rigour. Surely it must compel them to re-examine what purpose they serve and what they should value.

If a humanities education exists at the undergraduate level so that students can become masters at telling their professors what they want to hear, then it might as well be automated. If humanities scholarship exists merely to produce articles with completely uninspired and unimaginative conclusions in journals that will go largely unread, then it’s labour more fitting for a robot than a human being.

Large language models such as ChatGPT draw from a vast corpus of pre-existing text. Their domain is the word already written, the phrase already spoken. But criticism at its best is something much stranger and more idiosyncratic. It is the manner in which readers bring themselves uniquely to a text, with all their abnormalities and eccentricities.

It remains to be seen whether an AI will ever be able to radically reread a poem, novel or parable in the same manner that Kierkegaard brought his own distinctive interpretation to the sacrifice of Isaac, a story that had already existed for thousands of years. Machine learning has made undeniable advancements in pattern recognition, yet I am more sceptical of its ability to see the likeness in things that are not only considered dissimilar but are understood to belong to separate categories altogether. The French philosopher Gilles Deleuze drew metaphors such as the “rhizome” from the distant intellectual shores of biology and botany to rethink our fundamental assumptions concerning ontology, applying them in ways that were provocative and controversial, but never predictable.

Scholars and critics may be satisfied with thinking and writing algorithmically, but ChatGPT should sound a dire warning that robots themselves will soon do that better. If academics are to evade the unemployment line, they will need to embrace the original, the unique, the nearly schizophrenic that defines genius in scholarship and criticism as much as it does in art itself.

James Walker is an American writer and critic.
