Every week, Times Higher Education provides rich and varied coverage of the world of universities. This sentence has not been lifted from advertising department copy, but is an entirely genuine comment about the diverse contents of the periodical you are now reading.
But the sting in the tail of the sentence is in the word "diverse" - this diversity reflects not only the intelligence and sophistication of academics but also the quicksand of clichés within and about our profession - a mire through which we have to make our way each day. Navigating the bog of "robust policies", "dynamic research" and "developing the enhancement agenda already under way" (all recent examples of current university-speak) adds a new demand to academic life: the ability to continue to wage what Martin Amis has described as the "war against cliché".
Many of us are quite prepared to take up this battle: the management-speak of universities has been widely attacked, and Laurie Taylor exposes its vacuity in this publication every week. So it is not difficult to attack this sitting duck of a target, which, despite its facile assumption of energetic activity (all that robust dynamism), never advances an argument or clarifies an intention. Rather more difficult to identify, and inherently far more dangerous, are some of the underlying clichés about the position (and importance) of universities in the contemporary world, clichés that are often less apparent but far more insidious.
Of these clichés, the "knowledge economy", the value of "private" money in higher education and the centrality of "skills" to the university curriculum are among the most pervasive. The idea of the "knowledge economy" has now become a defining sentence in university creeds: it is an assumption that has acquired an almost religious authority. But if we consider it for a moment, we have to ask about the understanding of history and social change that informs this idea: do those who believe in the "knowledge economy" seriously think that pre-industrial societies had no knowledge, or that the incremental accumulation of knowledge (and particularly technological knowledge) is synonymous with human progress and emancipation? We (hopefully) no longer describe societies as "underdeveloped", but in the everyday reiteration of assumptions about knowledge, we do much to perpetuate the idea that Western access to higher education is in some meaningful sense an indication of a higher form of human life and a demonstration of social progress. Once we make this assumption, it is an easy step to using it to endorse Western engagements in societies outside the West; we take for granted that other societies need "knowledge" in much the same way as we assumed that they needed Western democracy.
There are, obviously, certain kinds of knowledge that all of us are in favour of for everyone: analgesics, germ theory and literacy are just a few probable examples. But suppose we look at knowledge in terms other than those of homogeneity or a simple social ingredient ("just add knowledge and stir"); then we may see some of the ways in which "knowledge" emerges: through contest, debate and difference, and not through accreditation and credentialism.
Yet every time a vice-chancellor trots out the graduation ceremony litany of "going out equipped with the necessary skills for the knowledge society", a picture of the past as lacking in skills and knowledge is imposed, a picture that both obscures much between Plato and Nato and reinforces the view that only contemporary (and Western) knowledge is of any use.
The shotgun marriage between skills and knowledge often produces a tense relationship. Universities are repeatedly told (and to their lasting shame sometimes tell their staff) that their function is to produce graduates with the "skills" for the "knowledge economy", but premarital counselling often fails to explain to the partners their mutual relevance.
Outside vocational subjects, there is no necessary or absolute "knowledge"; a degree in sociology does not have to cover, let us say, urban sociology any more than a degree in history has to include the history of the Norman Conquest. Like many partners in many marriages, Ms or Mr Knowledge and Ms or Mr Skills have little knowledge about the other partner and no real sense of their role in the marriage. As an academic, the attributes I hoped for in my students were enthusiasm for the subject in question and the wish to engage with it. I did not care whether they could do a PowerPoint presentation, work in a team or demonstrate some of the other "skills" that seemed to me to be more characteristic of the office junior than academic practice.
Yet it is through the toleration of the subtle invasion of the curriculum by "skills" that we collude with the cliché that "we must equip students for the contemporary world". What follows from this widely made assertion is seldom a precise (let alone honest) account of either those skills or the "modern world". If universities, under pressure from the CBI and successive governments, want students to have the "correct" attitude and attributes for working in the capitalist labour market, then it would be helpful to say so.
As it is, the mystification about "skills" serves only to deepen many of the existing class divides in higher education: students (and staff) at elite universities know that actually getting into that institution is a "skill", while those at less august institutions (in which the skill of achieving entry is less evident) feel obliged to demonstrate the "usefulness" of their qualifications. Indeed, equipping students "to take their place in the modern economy" is a phrase that has started to appear with some frequency on university web pages.
The Darwinian assumptions behind this last cliché are part of a mindset about universities and higher education that assumes that a perfect fit can, and should, be achieved between the progression of the economy and the academy. The justification for this view is that universities receive a great deal of public money and should therefore return that money with the added interest of the "socially useful graduate". Again, this view has become enshrined in that cliché of pseudo-democracy: accountability. Never mind that the state frequently takes actions for which it has no public or legal mandate, or that privilege still allows differential (and unaccountable) access to education; many universities have accepted without question that they are the accountable ones in a one-sided relationship.
These various clichés about skills, knowledge and the needs of the contemporary world are all converted into fact (and policy), both through the absence of opposition and through a particular use of language that demonises debate. Take this sentence in the 17 September 2009 issue of Times Higher Education, a comment made by David Lammy, who was at the time higher education minister: "Any sensible analysis can only conclude that you need to find new ways to leverage private money into the system."
To oppose this idea, then, would seem "not sensible", even though the sentence is packed with questionable ideas: what exactly, in an economy hugely supported by the state, is "private" money; why should this money want anything except a position of command related to its own interests; and what do universities need the money for? Is the money to be spent providing better conditions for that army of part-time teachers who have made possible the expansion of undergraduate teaching? The answer to this entirely rhetorical question is likely to be "probably not".
In the various political and ideological confusions and contradictions of the market economy, the third way and neoliberalism, the universities, and those within them, have deep seams of cliché and evasion to explore. The managers and the quality assurance staff, whose salaries have so inflated university budgets, provide rich resources for fiction and satire, not least because the arrival of this group has coincided with more rather than less concern about the current worth of degrees. Increased surveillance, it would seem, does not produce improved academic performance.
Here, perhaps, lies part of the greatest cliché of all: not just that the quality of higher education can be easily measured, but that it can be measured in immediate, quantifiable terms. Many students now graduate with what could be described as an "Upper Second in Glazed Indifference": they have followed the rules, acquired the course handbook and become expert at precis and summary. In various ways they have worked hard, but many of them know that what they are doing is acquiring a degree that is in itself a cliché, a marker of a certain kind of rite de passage, from which the possible sincerity of learning has been drained, along with the accompanying possibilities of being wrong, passionately engaged or even completely uninterested. At the same time they are often much more aware than their teachers that critical, subversive and transformative ideas are as likely to emerge outside the university as within it.
As universities increasingly surround higher education with clichés about skills and the knowledge economy, so they distance themselves from an ideal of passionate involvement with ideas that should be at the heart of the academy. Students and staff often long for precisely this kind of message about higher education, yet instead we are fed empty slogans that diminish and curtail the very possibilities of universities. It would be a brave vice-chancellor who stood up at graduation (or even better at the welcome address to first-year students) and said that the only important function of a university is to allow individuals a chance to think about a particular curriculum. Although this may shock some students, the long-term social effect may be at least as productive as a "skills agenda": generations could be introduced to the truly democratic idea that they have a right to think for themselves and not a given, authoritarian, social agenda.