"The brain is waking and with it the mind is returning. It is as if the Milky Way entered upon some cosmic dance." Fifty years since the physiologist Charles Sherrington conjured this vision of brain activity, quoted in Stairway to the Mind, the nature of the neuronal dance still eludes us in its mysterious association with consciousness. Until recently, neuroscientists monitored and catalogued the biological "machinery/clockwork" of the brain without offending or interesting anyone outside the charmed, bench-bound circle. Now their work is generating increasingly vigorous debate. But how are the veterans in the field, the philosophers, reacting to the increasingly common concept of a "science of consciousness"?
Despite the rich diet of scientific forays served up on the mind, no obvious, widely accepted strategy has emerged for relating the objective to the subjective. One problem is that although scientists have meticulously dissected the objective world of the physical brain, they have for the most part given scant attention to the features of phenomenological experience in a comparable vein. In Conscious Experience, more than 20 philosophers (with the conspicuous exceptions of Thomas Nagel and John Searle) have their say.
The collection of papers in Conscious Experience redresses this imbalance: the focus is emphatically on the subjective. The book is divided into eight sections, each prefaced with an introduction of several pages that summarises the main issues so that each section of some three papers can be taken as a module. In addition to references cited in the normal way, at the end of each contribution there is also an excellent selective bibliography on "Consciousness in philosophy, cognitive science and neuroscience: 1970-1995", compiled by Thomas Metzinger and David Chalmers. The entire bibliography is some 45 pages long and will provide an invaluable starting point for anyone looking at areas normally outside their purview.
As might be expected from such a large collection of different papers, no single question, let alone any single answer or theory, emerges. None the less, four broad strategies can be discerned which would encompass many of the contributions. One tack is to attempt a form of deconstruction, reminiscent of classic scientific strategy, on the nature of phenomenal consciousness. Accordingly, Guven Guzeldere, Norton Nelkin and Michael Tye each discuss types of consciousness, whereas David Rosenthal explores its distinction from nonconscious states. There is no clear consensus as to how relevant (or otherwise) differences between different conscious states actually are. However, several valuable points emerge. For example, Diana Raffman makes a strong case for "subjective facts", while Joseph Levine points out the important distinction between role player and role when considering consciousness. Just because consciousness might have survival value and have played an important role in evolution, for example, does not mean that this is the actual nature of consciousness. The questions of the role of consciousness and of its nature are clearly distinct. In general, this section tends to sort out questions rather than to deliver any new answers.
Leaving aside evolutionary roles and indeed the distinction of self-consciousness (discussed by Nelkin), the most basic question to tackle is the nature of the "simple" subjective experience. The authors tend to emphasise one of two possible aspects of this issue. The first is the nature of the individual's consciousness as different from any other's, and certainly the nature of the consciousness of one species as opposed to that of another; Nagel's famous bat is predictably resurrected repeatedly. An alternative line is to dwell not so much on how different one consciousness is from another, but more on the fact that one is having any experience at all in the first place.
It is this aspect of phenomenal consciousness that is picked up in the second theme, namely how subjective states can be mapped on to physical events in the brain. Both David Papineau and Colin McGinn adopt a pessimistic stance, in that they fail to find a ready way of establishing a correlation between specific mental and physical events. McGinn points out the lack of spatial, and indeed temporal, correspondence between the physical and the mental world.
But although there is no discernible matching, a relationship clearly must exist. For example, patients with damage to the frontal lobes can exhibit "source amnesia". Source amnesia does not affect memory for facts, but only for those aspects of memory relating to time and place. We can see from this dissociation that the brain, more specifically the frontal region, must in some way normally be able to impose space-time co-ordinates on mental functioning. Hence it is not so much that there is no relation between physical and mental, but rather that it must be more complex than a one-to-one correlation between specific events and specific neurons.
This is the view adopted by Robert Van Gulick, who argues for a more holistic scheme for the physical substrate of consciousness, beyond specific correlations. It is an approach that is both possible to develop and that also has the right flavour to it. For example, Papineau dismisses correlating the sensation of pain with mere activation of C-fibres (the pathways that convey pain signals into the brain). However, neuroscientists know that there is far more to the physiology of pain than simply activating C-fibres. The effects of morphine, for example, are not so much to suppress pain sensation as to make it seem irrelevant. Hence consciousness of pain can clearly be modified, as opposed to abolished, by the brain's own opiate system (the target for morphine). By studying these changes in consciousness as a correlate of neurochemical action over wide areas of brain, brain researchers might eventually be able to track down how changes in consciousness correlate with changes in brain states.
In this vein, Metzinger pursues more global brain conditions, citing the oft-quoted work of physiologist Wolf Singer, who demonstrated neuronal synchronisation in pattern recognition. The idea that certain assemblies of neurons are correlated with certain states of consciousness is one that is gaining widespread support. However, concentrating on just one feature of such assemblies, such as neuronal oscillation, smacks of the quest for the magic bullet, as well as falling into the trap of confusing the role player (oscillating neurons) with the role (consciousness).
A similar complaint could be made against the role-playing microtubules promoted by Roger Penrose and Stuart Hameroff and criticised by Patricia Churchland and Rick Grush, who point out that such a humdrum cellular feature is even more reductionist than the computationalists' neural networks that Penrose rejects. Indeed, the tone of their article is slanted generally towards fighting the computationalists' corner. It is a shame, incidentally, that Penrose and Hameroff's reply to this critique, though cited, was not included in this volume. Zeal in identifying the physical factor in consciousness also bumps up against the difficulty of establishing sufficiency over necessity: as far as I know, no single neuronal mechanism has as yet been identified which functions only during consciousness and on its own causes consciousness. It seems more likely that a constellation of factors will contribute to the formation of transient neuronal assemblies that are in turn correlated with conscious states.
Michael Tye's Ten Problems of Consciousness likewise turns on the issue of relating the subjective to the objective. Each "problem" is an aspect of phenomenal consciousness and includes the questions posed by specific observations such as blindsight and the "felt location" of, say, a pain in the leg, as well as broader concepts such as inverted and absent qualia, not to mention the actual mechanisms for qualia (the subjective sensation of, for example, redness).
In a concise and easy-to-read style, aided by boxed summaries and even cartoons, Tye demonstrates how many of the problems posed by these ten issues can be resolved by revising the classic distinction between higher-order and "raw" phenomenal consciousness. He argues for a merging of these two levels of consciousness, whereby the seemingly more disembodied "cognitive" consciousness is in fact derived from the more specific, concrete instances that constitute phenomenal consciousness.
Central to Tye's thesis is the rather misleading acronym "Panic". Brain processes are Poised for use, yet Abstract in the sense that they do not correspond one-to-one with a particular physical object; they are Nonconceptual, as Raffman also shows in Conscious Experience, in that they must be experienced directly and are "perspectival", hence always subjective; and lastly they have Intentional Content, because we are always conscious of something ("representational" consciousness), even when indulging in some "higher-order" cognitive state such as happiness. Since the subjective higher-order state is drawn from instances peculiar to the processing of each individual brain, Tye preserves the personal subjective quality of consciousness while embracing its objective physical basis. However, if brain states and phenomenal experiences are indeed two sides of the same coin, this "perspectival physicalist" stance throws little light on the nature of any possible common coinage: instead of a Rosetta stone, Tye sees merely a gap between irreducibly different concepts that will never be closed.
The third strand running through Conscious Experience is that of functionalism. This approach, which concentrates on the idea of systems and their interrelations as being of prime importance, is an obvious positive direction in which to move for those who do not deal in real neurons. The corollary of such an outlook is, naturally, to harness the awesome power of silicon systems and hence to contemplate conscious computers. Dieter Birnbacher shows how the usual objections to conscious nonbiological systems on grounds of technical, conceptual and nomological impossibility do not hold. Daniel Dennett eloquently puts the case for proceeding to build ever more complex artificial "brains" that will spontaneously develop an initially low-grade consciousness, while David Chalmers uses thought experiments whereby neurons are gradually exchanged for chips to try to prove that the actual material of a system is not critical to its function, in this case consciousness.
My personal objection to this line of thinking is that the baby of physiological truth is thrown out with the bath water of an imagined neuroscience. For example, Birnbacher dismisses the argument of technical impossibility as theoretically unimportant. A neuroscientist would retort that overcoming the conceptual or nomological problems by postulating that a system built like a brain will be conscious like a brain is mere tautology, and that it is the technical considerations that are, after all, the most challenging. Only when modellers come up with a strategy for accommodating, for example, the "technical" fact that dopamine is needed to make certain types of movement but in functional excess leads to schizophrenia will the exercise in artificial brain construction appear promising. Similarly, if Dennett hopes that consciousness will spontaneously evolve in his expanding artificial brain, and is happy that it should be only a modicum of consciousness as we know it, why not start with "simple" biological brains themselves, and save all the waiting and the work? After all, if Dennett's synthetic brain is truly conscious, the system will have the potential for feeling pain and thus be subject to the same ethical constraints as currently apply to experimental animals.
Finally, Chalmers uses as fuel for his argument that brain material is irrelevant the claim that substitution of chips for neurons would not result in fading or biphasic ("dancing") qualia. One problem here is the fallacious assumption, mentioned earlier, of a crude one-neuron (or one-circuit)-to-one-state correlation. The dancing qualia scenario is less likely than implied, since it rests on the assumption that an isolated circuit will be responsible for a blue quale. In real life, the brain functions holistically. In this spirit it is plausible to imagine that we are never really conscious of a red quale as such, as opposed to the more complex quale of a red flower, or even the more "holistic" quale of a garden in bloom. Hence to deride the concept of a disembodied fading colour red is not necessarily very realistic. In any event, clinical observations do suggest that consciousness can wax and wane; can "fade". It remains generally unclear how functionalism will actually help us understand consciousness. Nor do any of the three authors cited here reveal what question about consciousness a sentient robot would help us tackle that a real brain could not.
The fourth approach strikes a happy balance, being neither over-pessimistic and dismissive of true brain events nor over-zealous regarding magic bullets, be they biological or synthetic. Peter Bieri argues the case for first developing the right sort of questions; Martine Nida-Rumelin pleads for a "phenomenal" vocabulary; while Kathy Wilkes points to the term "psyche" as an example of a technical, rather than folklore, term which could be pressed into service when relating physical and mental. In short, the theme here is one of developing or discovering a Rosetta stone for relating phenomenal consciousness to scientific fact. This is surely the approach allowing the most possibilities and thus promising the greatest chance of success. First the respective phenomenal and physiological lines would have to be drawn up: Conscious Experience represents the best concerted voice to date from the philosophers' camp.
On the other side, the latest arrival to the scientists' ranks is Alwyn Scott's Stairway to the Mind. Although dealing with some very heady material, such as quantum theory and biophysics, Scott has clearly made an enormous effort, in which he is largely successful, to be intelligible to the general reader. Not only is the depth of his topics impressive, but the breadth is astonishing for such a concise book. We travel chapter by chapter from physics through chemistry to neurophysiology, via systems, to brains, then on to a survey of some 18 contributions to mind-brain theories; pausing at a survey of the contribution of anthropology, the reader finally arrives at a theory of consciousness. The distribution is slightly top-heavy, or rather bottom-heavy, in that much time is spent on the finer points of physics and chemistry, which might frustrate the reader anxious to press on towards understanding consciousness. At the same time, the critical survey of 20th-century thought is fascinating and highly relevant, although here the reader is again frustrated, this time because the material is too brief.
However, the reason for the equal allocation of chapter space to the more basic physical sciences becomes apparent as Scott's approach unfolds. The general idea behind this progress through science is one of a ladder from simpler components to ever more complex phenomena. The take-home message is that emergent properties arise at all levels of science from more "basic" components or disciplines. Hence consciousness is seen as the most complex phenomenon of all, the result of assemblies of assemblies of assemblies of neurons. In arguing this idea Scott puts a cogent case against both vitalism and nonbiological functionalism. His third alternative is appealing, though not necessarily highly original, and still rather vague. But neither of these symptoms is terminal to a theory: in any case, the main point of Scott's book is to show how the physical world is composed of nested levels of emergent properties, and hence how consciousness can escape the snares of the computationalist as well as of the spiritualist.
But the road ahead still divides. The physicist and astronomer David Darling does not so much attempt to pin consciousness down to the physical brain as to let it escape. In After Life he draws on near-death experiences to posit, as do physicalist philosophers and neuroscientists, that the subjective and objective are "all one": but in Darling's view, such unity is only appreciated after death, once consciousness has been freed from the confines of the physical brain that "hobble" it to a restricted and egocentric state. An immediate objection to this thesis is that if the ego restricts our consciousness, do simpler brains with less ego, namely those of neonates and animals, have "more" consciousness? Such a scenario seems unlikely. Although After Life offers interesting anecdotes and reassurance concerning immortality of a sort, the omniscient consciousness we are promised seems an even slipperier concept, if that were possible, than our familiar, phenomenal consciousness. There is little here to help either scientists such as Scott or the philosophers contributing to Conscious Experience. Only if the brain is seen not as a prison but as a window will philosophers and scientists have the greatest chance of dialogue, and the greatest chance of progress.
Susan Greenfield is a lecturer, department of pharmacology, University of Oxford, and author of Journey to the Centers of the Mind (1995). There will be four lectures on consciousness at University College London on February 7 (Jeffrey Gray), February 14 (Semir Zeki), February 28 (David Oakley) and March 13 (David Elwell). For details see the Databank section of our website (http://thesis.newsint.co.uk).
Editor - Thomas Metzinger
ISBN - 0 907845 10 X and 05 3
Publisher - Imprint Academic
Price - £34.00 and £21.00
Pages - 544