Stack 'em high, pitch 'em low

March 19, 1999

Darrel Ince argues that the brash computer publishing infecting our bookshops reflects a dangerous dumbing-down in computing

A visit to Oxford has, until recently, been a pleasant experience: a walk around Christ Church meadow, lunch at the Randolph and then a pleasant hour or two in Blackwell's bookshop. I used to enjoy browsing among the computing shelves, seeing how many copies of my books were stocked and looking at the output of friends and adversaries.

I still require no excuse to go to Oxford: Christ Church meadow is still the same, the Randolph has survived the takeover of Forte Hotels by Granada, and Blackwell's is still the most humane and interesting bookshop in Britain. But what has taken the edge off my visit is that a sort of bibliographic algae has infected the computing section. Where rows of sober-jacketed academic books once discreetly greeted the browser, there is now nothing but a veritable Bartholomew Fair of gaudy-jacketed books of huge dimensions with over-hyped titles such as Windows NT Unleashed, Secrets of the Windows 98 Gurus and The C++ Bible; and, for the intimidated, The Idiot's Guide to Perl and The Dummies' Guide to the World Wide Web.

This change has affected almost every academic bookshop. In the past, when I was visiting a university and wanted to gain an insight into what its undergraduate computing degrees were like, I could get a quick idea from the bookshelves. No longer. Shelves are full of books that are regarded as not large enough if they run to fewer than 600 pages; not important enough if they do not use every primary colour and at least three or four composite colours on their covers; and not weighty enough if they do not deal with the latest version of a piece of software that usually differs by less than 1 per cent from the previous version.

There are a number of characteristics that distinguish these books from others. The first is the liberal use of white space. The second is that large chunks of such books consist of material that can easily be lifted from websites or from software documentation. The third is that the material usually deals with easy topics: the programs are rarely longer than 50 lines, useful but difficult facilities are glossed over and large amounts of text are devoted to the simplest topics. The fourth characteristic, and the most important, is that most of the new wave of books deal with back-end activities that usually consume no more than about 10 per cent of the resources devoted to a software project.

This publishing phenomenon is reflected in the worrying trend of employers preferring training qualifications to computing degrees; a recent survey in the United States indicated that most respondents preferred to hire staff with a Microsoft-certified software engineer qualification rather than a computer science degree.

What does this flood of low-level arriviste books say about the computing industry, academic publishing and academia?

First, it gives the reader the impression that low-level activities such as programming are the most important part of a software project: when 90 per cent of the shelves of an academic bookseller are filled with low-level books, it is hardly surprising that readers come to that conclusion. While programming is obviously important, there are many other activities that have a much greater effect on project success, such as performance prediction.

The second potential fallout from the new wave is that we are developing programmers who cannot program. The readers of such books are never faced with alternative program code: all they see is small snippets of code that are used to illustrate some facility within a programming language rather than a broadly useful programming technique.

The food critic and academic Nahum Waxman has written about the gradual dumbing-down of recipe books to the point where recipes specify every small step and every implement to be used. This leads to a lack of knowledge of cooking materials, an ignorance of the diversity of those materials and a denial of the vast potentialities embodied in even a few ingredients. In the same way, the new computing books deny the concept of design as the exploration of alternatives within a solution space, the properties of different software architectures and the importance of qualities such as performance, maintainability and usability.

So what can academics do? The first step is for us to realise that we are partly responsible for what has happened. The past decade, with its focus on research assessment exercises, has reduced the flow of good books written by academics to a trickle. We have devised curricula that are overloaded and in which important topics such as design and performance analysis are taught minimally or not at all. We usually confine software engineering to a single ghetto lecture course.

We have concentrated on formality at the expense of creativity and forgotten we are teaching an engineering subject where the end point is the development of a system that meets functional and performance criteria and can be changed easily during operation.

There are some promising signs: over the past five years there has been an increase in interest in design patterns. These are snippets of software architecture that represent good design practice and can be implemented in different ways in a variety of software systems. This is the equivalent of providing a broad description of a recipe, say a bouillabaisse, and allowing cooks to create different dishes with different fish, herbs, quantities of garlic and varieties of vegetable. But even this topic is being spoiled by academics, with an over-emphasis on the formalisation of patterns using mathematics and an over-concentration on taxonomy, rather than viewing the whole area as one of exemplifying good design.
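To make the analogy concrete, consider a minimal sketch of perhaps the best-known pattern, Strategy, written here in C++ (the class and function names are invented purely for illustration; they come from no particular book). An abstract interface plays the part of the broad recipe, fixing what any sorting strategy must provide; the two classes beneath it are interchangeable realisations, just as two cooks realise the same bouillabaisse differently.

    #include <algorithm>
    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    // The "recipe outline": an abstract interface that fixes what any
    // sorting strategy must provide, without fixing how it is done.
    class SortStrategy {
    public:
        virtual ~SortStrategy() = default;
        virtual void sort(std::vector<int>& data) const = 0;
        virtual std::string name() const = 0;
    };

    // One cook's realisation: a hand-written insertion sort.
    class InsertionSort : public SortStrategy {
    public:
        void sort(std::vector<int>& data) const override {
            for (std::size_t i = 1; i < data.size(); ++i) {
                int key = data[i];
                std::size_t j = i;
                while (j > 0 && data[j - 1] > key) {
                    data[j] = data[j - 1];
                    --j;
                }
                data[j] = key;
            }
        }
        std::string name() const override { return "insertion sort"; }
    };

    // Another realisation of the same outline: delegate to the standard library.
    class LibrarySort : public SortStrategy {
    public:
        void sort(std::vector<int>& data) const override {
            std::sort(data.begin(), data.end());
        }
        std::string name() const override { return "std::sort"; }
    };

    // Client code depends only on the outline, never on a particular cook.
    void report(const SortStrategy& strategy, std::vector<int> data) {
        strategy.sort(data);  // sorts a local copy with whichever strategy was passed
        std::cout << strategy.name() << ":";
        for (int v : data) std::cout << ' ' << v;
        std::cout << '\n';
    }

    int main() {
        const std::vector<int> numbers{5, 2, 9, 1, 7};
        report(InsertionSort{}, numbers);
        report(LibrarySort{}, numbers);
    }

The point is that the client code at the bottom depends only on the outline, never on a particular realisation; the design decision of which sort to use remains open, which is precisely the exploration of alternatives that the new wave of books denies.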

Eventually, sense will prevail. Unfortunately I fear it will prevail after disaster. The past three years have seen a rapid increase in software disasters, mainly in client-server software, where developers have tried to build large heterogeneous systems using distributed software technology.

How can we as academics take advantage of what will happen? First, take out some professional indemnity insurance and register yourself as an expert witness: there will be plenty of well-paid work out there in the next five years. Second, try to fit in a bit of book writing on topics such as distributed computing, client-server technology, component-based software development, managing heterogeneous projects and performance modelling; pretty soon software developers will realise that their major problems cannot be solved by upgrading to the next version of some slick technology or by hiring a horde of young programmers brought up on The Super Guru's Black-belt Windows NT 4.1324 Bible.

Darrel Ince is professor of computing science at the Open University.
