Scientists must foster the public's trust in order to improve understanding of their work, argues John Durant
In the midst of unprecedented scientific and technological progress our culture is beset by doubts and uncertainties. As the historian Eric Hobsbawm points out in his magisterial review of the 20th century, Age of Extremes, "No period in history has been more penetrated by and more dependent on the natural sciences than the 20th century. Yet no period, since Galileo's recantation, has been less at ease with it".
It does not seem difficult to work out what Hobsbawm means. Ours is the century of astonishing scientific and technological progress coupled with appalling scientific and technological peril. It is the century of relativity theory and radionuclides, of plastics and plastic explosives, of antibiotics and atomic bombs, of DNA and DDT, of agro-chemicals and Agent Orange. The two most brutal totalitarian ideologies of the 20th century - Nazism and Stalinism - both exploited science and technology for their ends. No wonder, then, that Hobsbawm should find our century uniquely uneasy about science.
As the fruits of science have cascaded from laboratories into homes, hospitals and high streets, so scientific expertise has been steadily elevated to a position of presumed authority not far short of that traditionally accorded to kings and priests. As the sociologist Anthony Giddens has observed, the proliferation of "expert systems" - systems of technical or professional accomplishment that organise large areas of our lives - is a defining characteristic of modernity; and in our modern culture, science has become the purest form of the expert system. This is important because what the public expects of experts is reliable advice; and what experts demand of the public is trust. We are required to trust experts because, under the terms of modernity, none of us has, or can hope to have, direct access to all the specialist knowledge upon which expertise depends; but the flip-side of the coin of trust is doubt.
Giddens says: "Science has long maintained an image of reliable knowledge which spills over into an attitude of respect for most forms of technical specialism. At the same time, lay attitudes to science and to technical knowledge generally are typically ambivalent. This is an ambivalence that lies at the core of all trust relations, whether it be trust in abstract systems or individuals. For trust is only demanded where there is ignorance - either of the knowledge claims of technical experts or of the thoughts and intentions of intimates upon which a person relies. Yet ignorance always provides grounds for scepticism or at least caution."
Here, then - this time from an eminent sociologist, rather than an eminent historian - we have words like ambivalence, scepticism and caution that are expressive of unease about science; but now the unease of which we are speaking is not a response to particular abuses of science but rather a structural response to the very position of science in modern society. We are required to trust scientists and scientific expertise, and by the same token, we are inclined to be cautiously sceptical about them. We look for signs that science is delivering what it promises; but we are alert to the possibility of mistakes. For we do not have access to all of the evidence on which scientific judgements are made; and so what else can we do?
Recent research illustrates sceptical public attitudes to scientific expertise. A couple of years ago, for instance, our Science Museum research group received a grant from the European Commission to investigate British public attitudes towards the Human Genome Project, the international programme aimed at first mapping and then sequencing all of the genes in the human genome. Interviewing groups of six or seven people at a time, we found ambivalence about this whole area - it was seen simultaneously as an area of great promise and of concern. On the side of promise, there were the prospects for the better understanding and treatment of genetic disease; and on the side of concern, there was the spectre of eugenics and the multiple difficulties associated with the availability of ever increasing amounts of personal genetic information.
A similar ambivalence shows through in a study of the way the British press has reported science and technology over the past 50 years. The Media Monitor Project contains a random sample of daily and Sunday national newspaper articles from 1946 to 1990, collected and coded on a standard set of measures.
The codes include measures of the attitude ("evaluative tone") adopted by journalists towards the science that they report. A positive evaluative tone means that the reporter concentrates on the advantages of science; a negative tone means that the emphasis falls on its costs. The study shows that for the postwar period there is a significant trend from positive to negative. It seems that newspaper coverage of science has moved from the celebratory to the critical, and then back towards what can only be described as the ambivalent. Similar results have been obtained in a study of media coverage of science in Germany.

This impression of scepticism is confirmed by a recent study of public perceptions of biotechnology in the European Union. In Denmark and Germany the public judges the risks associated with recombinant DNA technology to be fairly high; but despite this, the Danish public seems more willing than the Germans to support its continued use. The obvious question is: why should people who share the view that a technology is risky differ in their willingness to see it developed?
One possible answer is given by the results of another question in the study, which asked respondents to rank the degree of confidence that they felt in different social and political institutions to provide them with reliable information. Across the EU this question produced rather striking results, with environmental and consumer groups emerging as the most trusted institutions, and industry, trade unions and government emerging as the least trusted. However, the data for Denmark and Germany show that on average the Danes have much greater trust than the Germans in the reliability of their public authorities as sources of information about biotechnology. This sort of survey gives a glimpse of something important in the changing relationship between science and the public: a relationship that is not just to do with knowledge and ignorance, but also with trust and distrust, and with the various compounds of the two that are best described as ambivalence.
Think for a moment of recent national debates about, say, the safety of eating cheese, eggs, or British beef; or recent international debates about the safety of civil nuclear power, the problem of global warming, or the fate of the Brent Spar oil platform in the North Sea. In these and many other cases, arguments about risk increasingly dominate public debates about particular sciences and technologies. And whenever questions of risk move to centre stage, we find alongside them questions about trust and confidence. One important reason why professional and lay estimates of risk so often differ is that lay perceptions frequently embody intuitive assessments of the trustworthiness of the particular institutions responsible for the safe management of risk. In this sense the concept of risk dissolves the boundaries between science and the wider society; for technical and social judgements are both equally relevant to lay risk assessment.
What does all of this mean for the subject we study - the public understanding of science? I believe that we must take seriously the reality of public unease about science and technology, and this means that we must take seriously the issue of trust. Our agenda for the public understanding of science is dominated by the twin aims of inspiring interest and fostering learning. I suggest that alongside them we should add the aim of cultivating trust between scientists and non-scientists. For trust is the crucial medium of exchange in our society: with it, almost anything is possible; without it, almost nothing can be done. Constructing a new agenda for the public understanding of science around the notion of trust involves learning to think in an entirely different way. Rather than thinking of the public as "the great unwashed", we need to think of it as constituting an arena in which scientists and non-scientists meet as equals to consider questions honestly. Rather than thinking of understanding as formal knowledge - the sort of thing that students are taught in class - we need to think of it as mutual appreciation between equals who have respect for one another's various viewpoints. And rather than thinking of science as a closed body of definitive truths handed down to the public from on high, we need to think of it again as "public knowledge"; as a body of evolving findings open to public scrutiny.
I am not suggesting that everyone should have a point of view about the structure of nylon, or the physiological function of alcohol dehydrogenase. My point is that when all the completely unproblematic findings of the day-before-yesterday have been dealt with, we are still left with a great deal of science which is in the public domain precisely because it is problematic. It is just this kind of science that the public understanding of science movement needs to engage with more closely, because it is of the greatest public concern.
There will be sceptics who find the ideal of public participation in science absurdly utopian. Is it really possible, they will ask, to engage the public in serious debate and decision-making about some of the most complex matters facing our society? Surely, they will say, we must leave these things to the experts? I would argue that in our society, the ideals of democracy and justice rest on a fundamental faith in the ability of the public to cope, even in the face of the most complicated issues. We do not argue against elections or trials by jury on the grounds of the supposed incompetence of voters or jurors, for the very good reason that to do so would undermine the foundations of democracy itself. For the same reasons, we should not argue against public participation in science. Moreover, interest in public participation in science is growing in the industrialised world; and wherever the ideal has been put into practice, the experience has been positive. At the Public Agenda Foundation in Washington, for instance, John Doble has done fascinating experiments involving comparison of the ways in which representative samples of scientists and non-scientists deal with complex policy issues such as the global warming threat and the safe disposal of solid waste. His verdict? "The public's judgement about both issues . . . is strikingly similar to the scientists' views. Further, the few areas of divergence seem rooted more in value differences than in expertise." What is significant is that Doble found no evidence that his non-scientific respondents were seriously handicapped by their lack of technical knowledge.
A little closer to home, the Danish Parliament has pioneered a new form of public participation in science: the consensus conference. A consensus conference is a dialogue between lay people and experts in which a panel of lay volunteers conducts an investigation of a scientific or technological issue, cross-examines experts, and arrives at a point of view which is published and presented at a press conference. Since 1987, the Danish Board of Technology has run a series of consensus conferences on subjects such as human molecular genetics, food irradiation and childlessness. Lay panel reports have been presented to the Danish Parliament, and in several cases they have influenced the course of public debate and policy-making. It is worth recalling here that the Danish public has a relatively high level of confidence in its public authorities. It is an interesting question how far the consensus conference initiative of the past eight years may have helped to bring about this enviable situation.
The public understanding of science movement is an important response to growing public ambivalence about science and technology. Much good has been done, but we need initiatives that take seriously the public's concerns about science's impact on society. Such initiatives will only be undertaken by risk-takers: by scientists, for example, who are confident enough of their cause to engage with the public about aspects of their work that are important precisely because they are morally, legally, socially or politically vexed. Public participation is the next great challenge for the public understanding of science.
John Durant is professor of public understanding of science, Imperial College.