For Lennard J. Davis, universities are akin to licensed madhouses and obsession is the name of the game.
"If you're an academic, you had better be obsessive - that sort of behaviour is rewarded. Any academic setting will have people who are obsessive, research junkies and graphomaniacs. And most academics collect books and often get funding for it," he says.
"It's not just publish or perish, it's publish continuously. The reality of having so much 'free time' is that you are always working. I get up at 6am and go to my computer."
Davis should know. He teaches across a range of disciplines at the University of Illinois at Chicago and is also visiting professor in the department of English and linguistics at the University of Westminster. Recently he published a book called Obsession: A History.
It is an apt subject, as obsessiveness is built into the warp and weft of the academy.
"The development of the modern university system in the early 19th century was devoted to specialisation. The scientific method is about removing all the variables and looking at one particular thing. Examining one allele of a single gene is a source of a particular kind of knowledge, but it's also a strange way to spend one's life," he says.
Other core academic activities such as teaching and writing tend to attract and encourage obsessives.
"When I begin a new book," Davis explains, "I research it obsessively. I am literally possessed by the topic. Everything seems connected to the subject. It's actually a form of delusion, although that is what makes a good book. You need that single-minded drive."
But writing books also performs an essential service: preventing obsessives from becoming a public nuisance.
"Imagine if you met an author at a bar and they talked at you about their pet subject for the time it takes to read a book. Writing books stops people assaulting strangers and being the biggest bores that ever existed."
Something similar applies to teaching. "Having people listening and taking notes encourages obsession," Davis claims. "Universities allow people to pursue their obsessions in their classes or on their own - classes are a sort of padded cell for obsessives."
Obsession's argument is wide-ranging, but it starts with a disconcerting account of Davis' childhood fixations.
He writes: "I had a compulsion to swallow coins, mostly pennies and dimes, but there were the nickels as well, which I did on a regular basis, with the subsequent visual delight of seeing these gleaming circles emerge from me shiny and cleaned by the acid of my digestive system.
"When I ate elbow macaroni, I would slide each elbow on the tine of a fork, so that the utensil contained four straightened tubes of pasta, and then I would swallow each one whole. Continuing on the culinary front, I divided my food into absolute and irrevocable sections that must never mix or touch each other."
It is part of Davis' point that adult parallels to such behaviour are common - and indeed encouraged - in universities. That is what makes academic life so agreeable.
"The academics I know are a fairly happy lot," he says. "I find university a very congenial place to be. It's as if you are a kid and you can ride your bike all day or build as many model aeroplanes as you want."
Michael Fitzgerald, Henry Marsh professor of child and adolescent psychiatry at Trinity College Dublin, takes this argument a step further.
He has clinically diagnosed more than 1,800 individuals with autism and Asperger's syndrome, and he has written about the links between creativity and autistic spectrum disorders. Universities, he says, are "places where people with Asperger's get asylum".
Although this is a general phenomenon in higher education - "academics are not known for their social skills; many are loners, happy in their own company, and find other people an interruption" - it applies particularly to mathematics, philosophy, geology and engineering.
Fitzgerald believes that Asperger's is 93 per cent heritable and that "the genes for academic talent are the same as for Asperger's. It often makes (leading academics) egocentric as well as eccentric, but also gives them amazing focus and persistence.
"People have always known about the thin line between genius and madness. It's hard to be a good academic without Asperger's, especially in the laboratory, because you need sharp eyes for noticing things other people don't."
This view is backed up by Terence Kealey, vice-chancellor of the University of Buckingham, who used to work as a clinical biochemist.
"Scientists are significantly more autistic than people in the fashion industry, for example," he says. "I once criticised a PhD student's presentation and he said: 'I don't want to be judged by how well I give a talk.' Science is often a refuge from being judged on charm, coolness or haircuts. It would be rare to find a charming biochemist."
The nature of different disciplines determines the kind of academics they require. "The life sciences are about collecting lots of data," explains Kealey, "so people in charge of labs need to run very tight ships and use technicians as machines. It's crucial to collect the maximum amount of data. Biochemists need maximum observation - and then the facts speak for themselves."
So in Kealey's view, the best people to put in charge of biochemistry labs have "slave-driving and obsessive personalities". Warm and cuddly won't cut it.
"I think you have to be reasonably obsessed to get ahead in research," agrees Rivka Isaacson, postdoctoral fellow in the Centre for Structural Biology at Imperial College London. "A head of department told me that I should consider a career in academic science only if I think about science all the time.
"My sister often teases me about this. Once we were in the pub and bought pint bottles of Magners. They give you a glass with quite a lot of ice, so you can't fit all the cider in. Not wanting to carry the bottle and the glass, we were debating whether it was better to drink some of the Magners from the glass to make room for what was left in the bottle to avoid ice-dilution effects, or whether we should just down what was left in the bottle while it was still cold.
"We talked about this for too long, and then my sister said: 'See, you do think about science all the time' - and she has found numerous everyday occasions to say it since."
Harry Collins, a professor at the School of Social Sciences at Cardiff University, is also a firm believer in the value of obsession.
"The great insights or solutions happen when you are asleep and you wake up and have the answer. This happened to me in respect of an analysis problem just a couple of weeks ago. For two nights in a row, I woke up in the early hours with new ways to think about the problem and had to get up to try them out on my computer before I could go back to sleep.
"If you are not obsessively thinking about the stuff, this does not happen. For example, during my three-year stint as head of school at the University of Bath, the quantity of my output was not affected, but it was all pretty shallow because my head was obsessed with administrative problems."
There are, of course, dozens of stories, both funny and sad, about the sheer oddity of the great academic obsessives. Fitzgerald's books explore the lives and careers of many great mathematicians as well as celebrated figures from Sir Isaac Newton and Albert Einstein to Lewis Carroll and Ludwig Wittgenstein. Another striking example of obsessiveness in the academy is one of the UK's greatest scientists, theoretical physicist Paul Dirac.
When not yet 30, Dirac was awarded the Lucasian chair of mathematics at the University of Cambridge (other holders of which have included Newton and Stephen Hawking). A year later, he became the youngest theoretician ever to win the Nobel prize in physics. Yet he was occasionally mistaken for a tramp and was once said by a journalist to be "as shy as a gazelle and modest as a Victorian maid".
His lack of empathy and ability to miss the point were legendary. Told by a fellow guest at a meeting in a castle that a ghost always appeared at midnight, he replied: "Is that midnight Greenwich time or daylight saving time?" When his future wife sent him a mildly flirtatious letter asking what he was doing, he tabulated his thoughts and, in the words of his biographer, Graham Farmelo, "answered her queries as tersely as a speak-your-weight machine".
Dirac, says Farmelo, senior research fellow at the Science Museum in London, "was metronomic in his routine and always went for a walk on Sundays. By any normal standards he was incredibly focused. He had no other interests and a completely monochromatic personality. He would go for two weeks without speaking, even to his family.
"My biography (The Strangest Man: The Hidden Life of Paul Dirac, Quantum Genius) includes a group photograph of leading physicists where he is reading a book. He couldn't even be bothered to pose for a picture. In the normal course of things, I wouldn't have relished meeting him."
Although Dirac died in 1984, his oddities have continued to provide entertainment.
"He was considered strange, even by the standards of workaholic theoretical physicists. Dirac stories are still current - about his literal-mindedness, his extreme taciturnity, his linear way of thinking."
Yet his weirdness was inextricably entwined with his greatness. He had the depth of vision that enabled him to deduce the existence of antimatter from sheer obsessive thinking.
And his insight may have considerable practical significance. With the rise of nanotechnology, concludes Farmelo, Dirac's equation, "once seen as mathematical hieroglyphs with no relevance to everyday life", could become "the theoretical basis of a multibillion-dollar industry".
So, if universities have long been notable - or notorious - as homes for obsessives, this surely ought to be celebrated. Let's hear it for the absent-minded professors, the shambling unworldly eccentrics, the mavericks, the oddballs, the researchers with collections of lizard excrement or samples of sand from all over the world.
Let us ensure that we always have places for people who want to devote years of their life to beetles, Beowulf or Buffy the Vampire Slayer. If we are keen for more students from a wider variety of backgrounds to go into science, we may need to produce brochures showing that scientists can be young, worldly and attractive - but don't let's pretend that all or even most scientists would do well as chat-show hosts.
Academia is - and ought to be - about weird passions that yield strange fruit. Bob Horvitz, Nobel laureate in physiology or medicine in 2002, "spent 30 years of his life studying the 22 cells of a worm's vulva", according to his co-winner John Sulston.
One might not necessarily want someone like that as a son-in-law, but they add to the gaiety of nations and, far more important, are often the people who challenge received wisdom, shift the paradigms and make the breakthroughs - including life-saving medical ones - we all need.
Where can such characters flourish? It is generally agreed that Cambridge in the early 20th century was highly tolerant of eccentricity and provided a perfect home for odd fish such as Dirac and Wittgenstein. But has something gone wrong? Are the great academic obsessives under threat simply because today's universities don't know how to handle them?
"The real breakthroughs are made by eccentric individuals," claims Fitzgerald, "not team players. More corporate universities are a disaster for such people. They wreck the place if they become head of department."
A stress on values such as "roundedness" or "social skills" creates precisely the wrong environment for the obsessives.
As a clinician, Fitzgerald has occasionally diagnosed school-leavers with exceptional but highly specific talents and suggested that universities take a look at them. But he has seldom found admissions departments with the plasticity needed to see beyond paper qualifications. In the past, there was more flexibility. For example, Cambridge gave Wittgenstein a lectureship and fellowship in 1929 although he didn't have a degree.
Even if students with Asperger's do get in, "sometimes they drop out after the first year because they can't manage social situations - they're not good at handling small groups," Fitzgerald adds. But those lost to the academy may be some of the people it needs most.
Farmelo agrees. "The great universities have to give a very wide berth to (maverick geniuses). Einstein and Dirac weren't interested in teamwork; they wanted to play on their own."
Today, however, teamwork is so highly valued that it is taught in primary schools and constantly features on appraisal forms. The effect can be counterproductive.
"Managerialism sometimes cannot cope with the really great thinkers," says Farmelo.
"In the past," adds Buckingham's Kealey, "gentlemen scientists were allowed to do their own thing. Today, there's huge pressure on team-playing, because it creates lots of research, but great thinkers don't work like that."
As an example, he cites the case of Peter Mitchell, "a truly great genius" who won a Nobel prize for his "completely novel chemiosmotic hypothesis. Yet he wouldn't survive ten minutes in a modern university.
"It took him seven years to do his PhD at Cambridge - today he would have been thrown out and his supervisor admonished."
Although Mitchell was invited to set up and run the Chemical Biology Unit at the University of Edinburgh, professional frustrations combined with illness led him to branch out on his own and build a private lab in Cornwall.
The career of James Lovelock, who established another "experimental station" in the same area, tells a similar story. He once worked for Nasa and the Medical Research Council, but long ago established himself as an independent scientist and inventor, acting as a consultant to businesses and the security services, and even selling his blood to keep his family afloat at one point.
Although Lovelock will be 90 in July, his Gaia hypothesis and writings on climate change keep him at the centre of scientific and political debate. Yet he has achieved this largely outside the university system.
John Gribbin, one of the UK's best-known science writers and visiting fellow in astronomy at the University of Sussex, recently co-wrote, with his wife Mary, a biography of Lovelock, He Knew He Was Right. Lovelock, he says, "is the archetypal example of someone who doesn't fit into any system.
"He always had problems with hierarchy and bureaucracy. The breadth of his ideas would have made it difficult for him in the mainstream university system, where it is hard to shift sideways. The standard career path to head of department or head of lab means people do less and less research. Lovelock is iconoclastic and never accepts received wisdom."
He is anything but a conventional team player.
"That word 'impossible' was like a red rag to a bull," Lovelock has said. "Right from my earliest days in science, I never took it at face value when some senior bloke said something was impossible."
So how worrying is it that Britain's universities today often fail to provide congenial homes for brilliant but sometimes abrasive mavericks?
"Universities in the past were better at providing homes, full stop," says Gribbin. "Once you were in, you were in. The current system weeds out both ends, the mavericks as well as the deadbeats. So it's a mixed blessing, but I suspect it would be harder for someone such as Dirac to make his way now in this country."
Perhaps the most powerful argument on these lines comes from Bruce Charlton, reader in evolutionary psychiatry at Newcastle University. His recent paper in the journal Medical Hypotheses, provocatively entitled "Why are modern scientists so dull?", suggests that a stress on "perseverance and sociability at the expense of intelligence and creativity" has had the effect of excluding the "brilliant, impulsive, inspired, antisocial oddballs".
"In a nutshell," Charlton explains, "I think that creativity of genius level usually needs high IQ and moderately high 'psychoticism', ie, somewhat antisocial and impulsive behaviour with the ability to fluently generate rather loosely associated ideas."
Recent developments in the academy - long apprenticeships, avoidance of speculative and risky projects, selection procedures that look for hard-working, compliant and agreeable people - all work against this.
"What we need are stratospherically intelligent semi-crazies. But what is left at the end of the modern process are hard-working, moderately intelligent dullards ... If present trends continue, all our best people will emigrate to the US and we will be doomed to be a nation of third-rate research and development technicians posing as scientists."
Similar factors, says Charlton, "apply throughout the educational system" to exclude those who are "too abrasive, impatient, impulsive". This approach would have left people such as Wittgenstein, F.R. Leavis in the humanities and many of the best scientists out in the cold.
But such problems are particularly acute in the sciences, Charlton says. On the Nobel league table, "Cambridge is now below public universities such as the University of Colorado at Boulder and the University of Washington at Seattle - yes, these 'unknowns' really do outperform Cambridge in revolutionary science - and way behind the University of California, Berkeley.
"How would someone such as Dirac manage the research assessment exercise, the Quality Assurance Agency, grant applications and the multitudes of meetings? Would he change his research to get more grants? Well, he wouldn't, would he? He could never take a major chair. At best he would be a long-term research fellow living off short-term grants."
Charlton's polemic can be summed up as a demand for less "plodding perseverance and social inoffensiveness" and more "strange and luminous fools". At least for some positions, we need procedures that select for "superhuman intelligence and high creativity" but "only enough agreeableness to exclude psychotics and psychopaths".
Letting such people play with their toys - and occasionally throw them out of the pram - may sound indulgent and extravagant, but perhaps it is one of the missions universities need to embrace once again.