When the mathematician Claude E. Shannon published his classic paper "A Mathematical Theory of Communication" in the Bell System Technical Journal in 1948, he influenced a wide range of scientific disciplines. Shannon introduced a theory that takes into account the varying probabilities with which different signal types occur. Each signal's probability determines its so-called surprisal value: the less probable a signal, the more information its occurrence conveys. A key feature of the theory is that the information transferred by a signal (measured in bits) depends on what the receiver already knew before the signal arrived.
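For readers unfamiliar with the arithmetic, the relationship between probability and surprisal can be sketched in a few lines of Python (a minimal illustration of the standard definitions, not anything taken from Hailman's book):

```python
import math

def surprisal(p):
    """Surprisal (self-information) of an event with probability p, in bits."""
    return -math.log2(p)

# A signal that occurs half the time carries 1 bit;
# a rarer signal (here, probability 1/8) carries more.
print(surprisal(0.5))    # 1.0
print(surprisal(0.125))  # 3.0

def entropy(probs):
    """Shannon entropy: the average surprisal over a signal distribution."""
    return sum(p * surprisal(p) for p in probs)

# Two equally likely signals average 1 bit per signal.
print(entropy([0.5, 0.5]))  # 1.0
```

The surprisal of a signal thus grows as its probability shrinks, which is what ties the statistics of a signalling system to the information it can transmit.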
Information theory predicts how signal systems should be designed to optimise information transmission. Not surprisingly, researchers have tried to apply it to animal communication. Jack Hailman's book Coding and Redundancy: Man-made and Animal-evolved Signals gives a comprehensive overview of different types of signals and what information they can carry. He compares animal signals to those used by humans. There is little mention of human language, but Hailman considers common man-made signals such as traffic lights, lighthouses and bar codes.
The book concentrates on how to calculate the bit-value of different signal types and on the kinds of redundancy found in signalling systems. With a few exceptions, Hailman does not explain what these values can be used for. The reason is that information theory is of only limited use for analysing animal communication systems if we do not know what an animal extracts from a received signal and what information it had before receiving it. Hailman acknowledges this problem and applies the theory only as far as it will go. He understands its limitations and does not seriously attempt to use bit-values to compare different animal communication systems.
It is in the nature of complex communication that some of the parameters needed to apply information theory are hard to determine. The theory was designed to improve man-made communication codes, and it requires knowing how many possible alternatives there are to each signal. In a man-made code, the designer has this information.
For example, it is easy to determine how many alternatives there are to the signals presented by a traffic light. But in naturally selected communication systems, the number of alternatives is far less clear. Even for human language, it is not obvious how many possible sentences could open a conversation. How, then, could we know it for complex animal communication systems?
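The traffic-light case shows why a known alternative set matters. A brief sketch, assuming the usual three states and, for simplicity, that each is equally likely:

```python
import math

# A traffic light has a small, fully known set of alternative signals
# (assumed here: red, amber, green). If each state were equally likely,
# one observation would carry log2(3) bits of information.
states = ["red", "amber", "green"]
bits_if_equiprobable = math.log2(len(states))
print(round(bits_if_equiprobable, 2))  # 1.58
```

The calculation is trivial precisely because the designer fixed the alternatives in advance; for a naturally evolved signal repertoire, `len(states)` is the unknown quantity.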
For this reason, most past and current animal behaviour research uses more feasible ways of analysing the meaning of complex communication signals. These are based on signal-detection theory and on correlations between signal reception and receiver reactions under different conditions. Such methods require no assumptions about what an animal knows before it receives a signal, and they have proven more fruitful.
In summary, Hailman's book uses human-made signals well to explain the application of information theory. It also gives some good examples of simple animal communication systems that can be analysed in the same way. Its examples from more complex systems, however, remain vague and are less convincing.
The book can therefore be recommended to anyone interested in simple signalling systems, but others may find it a somewhat dry read. In its scope, the book clearly demonstrates the limitations of the use of information theory in the analysis of naturally evolved communication systems.
Coding and Redundancy: Man-made and Animal-evolved Signals
By Jack P. Hailman. Harvard University Press. 257pp, £25.95. ISBN 97806740954. Published 6 June 2008