Finding a permanent job in the humanities has never been easy

The lost golden age of hiring and wider social appreciation of the disciplines never existed, says Harvey Graff

March 22, 2023

A recent article in Times Higher Education on the dire state of the UK job market in English studies is just the latest in a sequence of similar pieces lamenting the plight of junior humanities scholars on both sides of the Atlantic.

The article takes to task the senior scholars who tell precariously employed juniors to just hang on and wait for the supposedly inevitable permanent position to open, accusing them of being out of touch with what are assumed to be historically low odds of landing a permanent post in an era assumed to hold the humanities in historically low regard.

It is true that those who land permanent jobs are often blissfully ignorant of the long odds. But I have heard similar complaints about the decline of the humanities since I was a student in the 1960s. The supposed lost golden age never existed.

I graduated in 1970, but the intellectual emphases of the 1960s were already clashing with pressures to major in business, engineering, pre-law or pre-med. I chose to defy my parents and do a PhD in history rather than attend law school despite knowing that an academic “jobs crisis” had persisted through much of the 1950s and that the 1960s boom, driven by expanding enrolments, had weakened dramatically. Faculty advisers were open about the gap between numbers of graduating scholars and posted jobs, particularly in the humanities.

Sure enough, when I received my doctorate in 1975, I faced a dire jobs market. I am still awaiting rejection letters for positions I applied for then, as well as in 1980 and even later.

But I was lucky. A new public institution, the University of Texas at Dallas, was hiring more than 120 new faculty in the arts, humanities and social sciences, a condition of the legislation that had converted the former research centre for Texas Instruments into an initially science-only university less than a decade earlier.

Most of those hired were fellow new PhDs, a handful with one or two years of postgraduate teaching experience. Few of us even visited the campus before relocating; I was hired after an interview in a hotel room at Toronto airport with the founding dean. Given the lack of positions elsewhere, we didn’t have much choice – and the university found itself the employer of an extraordinarily talented band of scholars. With tongue only partly in cheek, one Princeton economist observed: “Aren’t we all someone famous’ best student?”

By contrast, all but one or two of the handful of tenured professors among the founding faculty had been denied tenure at previous institutions. Conflicts of rank, generation, talent and attitude were acute. Not surprisingly, the new PhDs were both more suited to and more enthusiastic about the newest university on the block’s purported interdisciplinary orientation – another concept that has a much longer history than its modern rediscoverers suppose.

But this commitment to interdisciplinarity was only slogan-deep. A handful of us worked across the university, but it did not go well. As a quantitative social scientist and “new” historian, I was hired by arts and humanities but initially housed with social sciences, as I wished. But at the end of the first year, the provost ordered everyone to “return to where you are budgeted” for the accountants’ convenience.

In reality, the administrators – most of whom had limited qualifications and experience for the task at hand – embraced interdisciplinarity only as a budget-saving measure, obviating the need for departments with chairs, offices, staff and funding for separate programmes. The rhetorically misnamed “neoliberal university” actually came into being at the end of the Second World War, not in the 1970s, 1980s, 1990s or 2000s, as a succession of books allege.

No one in charge had any conception of the founding student populations, either. These largely consisted of military veterans and college dropouts, especially women returning to college after their children grew up or first marriages ended. Not surprisingly, then, the gap between course offerings and student interests and understanding was massive. It is no exaggeration to say that the three ethnomusicologists we hired outnumbered the students who knew what the word meant.

Most of us were younger than our students, too, which didn’t always make it easy to convey scholarly authority. I realise that younger scholars today cannot dream of landing a tenure-track job at the age of 26, but our employment was not at all secure.

Mandatory third-year “probational reviews” were a massacre; few of us were informed beforehand that the Texas state system allowed an assistant professor to be terminated without a full review before the end of three years. Some colleagues were fired because they intimidated their “senior” colleagues, others (including the musicologists) because their courses did not attract enough students.

Some of those dismissed found satisfying positions at universities elsewhere. Others dropped out of academia. But almost every one of those with whom I kept in contact found successful ways to use their knowledge and skills more or less directly, in fields such as philanthropy, congressional research and non-profit advocacy.

Understanding their paths should inform any efforts to rethink graduate recruitment, education and preparation for a range of careers.

Harvey J. Graff is professor emeritus of English and history at The Ohio State University and inaugural Ohio Eminent Scholar in Literacy Studies. This essay is part of a book-length project, Reconstructing the ‘Uni-versity’ for the 21st Century from the Ashes of the Multi- or Mega-versity.


Reader's comments (8)

I'm not entirely sure that the article does counter the view expressed in the Townsend piece published on 15 March. What it does show, I think, is that there has often been a cycle of expansion and contraction in academic jobs. For instance, in the UK context, which I'm more familiar with, the early 1980s was a time when there were not many jobs going compared with other periods of expansion such as the 1960s or perhaps the post-2012 boom that followed the tripling of fees. Two other points. There is probably a longer lead time now between the PhD and getting a permanent post (if that ever happens), meaning that some candidates have incredible CVs (multiple books, publications, grants, impact, teaching qualifications, extensive experience), in some cases better than members of recruitment panels or those with permanent posts. And there has undoubtedly been an increase in precarious jobs in the last decade or so to carry out often core teaching and research functions, certainly in the UK. This again is probably a newer development, and, as the author perhaps suggests, those in these positions should, like his contemporaries, consider and be encouraged to consider a wider range of jobs outside academia.
As the first commenter suggests, I don't think my piece speaks about a "golden age" of hiring in our profession. In fact, I acknowledge that everyone has a story about how hard it was to land a job. What I was suggesting, and what is demonstrably true, is that it is currently uniquely bad, at least in the U.K., and that despite the ongoing crises in academia, we're still producing fresh generations of Humanities Ph.D. candidates who feel themselves trained for little other than unavailable academic jobs. I agree with the previous commenter that the 'wilderness years' between gaining a Ph.D. and getting a job (if you do pull off that feat) must be at an all-time high, and that there are scholars of exceptional quality left fighting over short-term stop-gap teaching posts. I know of several people who have more than one book, published at good presses, who are still stuck in precarious teaching roles. As I mentioned in my piece, I personally haven't been able to get an interview in the past few years, let alone fight it out for a job (for what it's worth, my C.V. has two books, a dozen articles, and 500+ hours of teaching and lecturing on it). I don't doubt it was never easy to get a job in academia, Prof. Graff, but at this particular moment it's almost not possible. Thanks for your thoughts.
This report from the Royal Society (https://royalsociety.org/~/media/royal_society_content/policy/publications/2010/4294970126.pdf) in 2010 states that only 3.5% of PhD finishers in STEM subjects will land permanent academic positions. It was based on research from 2005, so 18 years ago. Thus it's almost certainly been the case that the chances of landing a permanent position have been vanishingly small for at least 20 years. It's also been accepted in STEM for at least 40 years that you don't even bother to start looking for a faculty position until you have at least 6 years of post-PhD experience under your belt. I am thus surprised that Dr. Townsend's peers/mentors have been telling her to hang on and that if they persevere, they'll eventually get a job. One of the first things that I tell any PhD student starting with me is that the chance of a long-term future in academia is next to non-existent. For new postdoctoral researchers, I talk to them about their future ambitions and make it very clear that while trying for a long-term academic career is a noble and worthwhile thing, the chances are low, and they ought to have other plans as well. I'm pretty sure I am not alone in having these discussions.
While I value comments, I ask people to reply to what I actually wrote. Please avoid myth-making rhetoric, from golden age(s) to wilderness. Yes, every individual experience is valuable. But, no, we cannot generalize from one or a few. I am planning to edit a collection of personal statements that illuminate and explore "alternative" academic careers from the 1960s to the present.
We are responding to what you wrote, Harvey. Given that the only use of the phrase "wilderness years" here was from me, I'd want to query why you see that as myth-making -- it's now an entirely accepted part of the process of landing a job in the Humanities that young scholars will spend a period of several years piecing together what is often exploitative teaching work, usually off contract. You're right that we can't generalize from one or a few experiences, Harvey, but I wonder why you think yours is relevant at all here -- I note from information available online that you landed your first post in the same year that you finished your Ph.D. (Texas, 1975), and that you stayed in the same institution for the next 28 years. It took me three years to land my current precarious teaching role, with only patchy non-contract teaching in the interim, which is much more representative of the way things are today. I'll repeat my basic point, which is that we already know it was always tough for some people to get a job (though apparently not you) -- the point of my article is that it is demonstrably worse now, however you slice it. At no point do I invoke any mythologies of a golden age, just a less rust-covered one. P.S. Ian Sudbery, yes, the advice is surprising, though I'm not talking about formalized career advice per se in my piece -- this is the behind-the-scenes, between-colleagues language of trying to make it stick, and it's much of what has followed my Ph.D., rather than the advice I received during it (which was actually fairly minimal either way, but that might just be reflective of my institution). I go by "he", by the way, for what it's worth!
I'm really sorry for misgendering. I've no idea where I got the idea you went by she/ her and I should have checked.
Read what I wrote and what Townsend wrote, and see for yourselves. There were almost no tenure-track positions in 1975. I remained at UT-Dallas for 23 years, not 28, partly by creating my own set of studies and relationships with UT-Dallas and across the Metroplex. Four of those years were spent away on fellowships. I was offered a superb alternative in 1981, but the state of Massachusetts declared bankruptcy. I had to combat British anti-semitism to gain tenure. BUT my undergraduate and graduate advisors and best professors prepared me. I did not enter graduate school with blinders. I was advised and prepared broadly, without "great expectations." I was aware that my then world-famous professors had themselves faced limited prospects (especially the women) through much of the 1960s. But my experience, and relationships, were unusual, less so then than later. We were more aware of basic realities. I detail much of this in my forthcoming My Life with Literacy: The Continuing Education of a Historian. Intersections of the Personal, the Political, the Academic, and Place. At the top and at the bottom is the failure of the arts and sciences, especially the humanities and most especially literary studies, classics and history, to adapt to changing times: from joining together within and across universities; to changing our appeal to students; to advising and preparing much more broadly at the graduate level. To a tragic degree, the humanities have been our own worst enemy. This is the subject of two books in progress: "Reconstructing the new 'uni-versity' from the ashes of the 'mega- and multi-versity'" and an edited collection of personal, critical essays on the continuing variety of "academic career paths" from the 1960s to the present. History matters and must replace our myths.
There is no basis for directly comparing humanities and STEM PhDs' experiences or career tracks. They have never been similar in any respect. I don't understand the point of that comment. As to "wilderness," consider the history and usage of the rhetoric. I also underscore that myth has never meant false. Myths can't be accepted at all if they do not accord at least in part with some aspect of some people's sense of "reality." Both issues are central to literary studies, I note.