Links forged to break old chains

Rethinking Hypermedia

September 12, 1997

In hypertext, as in many other fields, the pattern of advance has been that the visionaries have radical ideas, but their attempts to realise the ideas fail; several years later another set of researchers refines the ideas, and, taking advantage of all the advances in software and hardware that have occurred in the meantime, successfully exploits the ideas.

Remarkably, the original ideas behind hypertext were published in 1945, when the visionary, Vannevar Bush, imagined realising the ideas using mechanical contrivances. Only now have we reached the stage where the majority of Bush's ideas have been exploited, and a few more will doubtless be realised in the next decade.

This book describes research at the Multimedia Research Group at Southampton University. The group has been working for nearly ten years on a hypertext project called Microcosm, and the book centres on this. The project began with the challenge of trying to produce an electronic version of a huge existing archive, that of Lord Mountbatten, and has recently also become associated with the Churchill archives at Churchill College, Cambridge. A working version of Microcosm has existed since the early 1990s: it has been released to other universities, and there is now a commercial version.

The overriding design goal of Microcosm has been that it be an "open" system, in the sense that hypertext facilities should be made available in a general way to all software. For example, if there is a link between information in one document prepared using supplier X's word processor and another document prepared using supplier Y's spreadsheet, then it should be possible to make the link, and to use the link, within both X's and Y's software.

The ideas of openness were originally promulgated by some visionaries at Brown University, but, following the general pattern, the visionaries failed to exploit their ideas; the Southampton team have added to the ideas and brought them closer to general realisation. In one respect, however, the Southampton team are visionaries in their own right. They have generalised the idea of a hypertext link to be more of a link between concepts than a link between particular points in documents.

The book describes the ideas behind Microcosm and their design and implementation. There is also a chapter on the design of some existing applications. It is written for computer scientists and covers such matters as the modules that make up the implementation. The material is clearly presented, and, although Microcosm is very much the focus of the book, there are useful comparisons to other systems, and good references to the literature. The last chapter, which covers ongoing research, gives an impression of the wide spread of issues that the Microcosm team are tackling. There is material on authoring in the context of CSCW (computer-supported collaborative work), the problems of trying to retrieve material from a multimedia database, the use of agents, and matters concerning digital publishing and seamless interfaces to electronic journals.

The term "hypermedia" is often used as a synonym for "hypertext", but the Microcosm project has always focussed on true multimedia aspects; there is thus plenty of discussion in the book about such issues as linking between points within video or sound files. A key characteristic of Microcosm is that the links should be separate from the documents they apply to, rather than embedded in documents as they are in web pages. This approach has advantages and disadvantages, a big disadvantage being the problem of keeping the links in synchronisation with the document when the document is changed. However, as one moves towards true multimedia, the advantages become stronger and the idea of embedding links in documents becomes less tenable.

Almost all the initial ideas for hypertext came from North America.

Over the past decade, however, Europe has been at the forefront of advances in hypertext, most notably, of course, with the World Wide Web. The Microcosm work is part of this, and the book's discussion of the so-called Dexter Model perhaps gives a clue to why the intellectual lead switched towards Europe. At the end of the 1980s, hypertext researchers in the US scored a spectacular own-goal: they tried to standardise before the underlying concepts were understood. The Dexter Model was that attempted standard, and it doubtless acted as a straitjacket for US researchers for some time thereafter. The Europeans, who were largely excluded from this activity, gained as a result.

The book is liberally peppered with the likes of "Microcosm is the only system capable of providing the functionality required" or Microcosm's "infinitely more powerful hypertext model". The steady hard sell becomes tedious after a while, and is likely to focus the reader on one of the book's weaknesses: it would have been better had it contained a chapter on experience with Microcosm applications, with quantitative measures of the advantages and disadvantages of Microcosm, to support the claims made.

Peter Brown is professor of computer science at the University of Kent at Canterbury. For the past academic year he has been working at the Xerox Research Centre Europe.

Rethinking Hypermedia: The Microcosm Approach

Author - Wendy Hall, Hugh Davis and Gerard Hutchings
ISBN - 0 7923 9679 0
Publisher - Kluwer
Price - £60.50
Pages - 216
