Why does alcohol pose an acceptable risk while GM foods call for caution? It is all about trust and control, says Nick Pidgeon.
Risk, and the way people perceive and respond to it, has over the past five years or so reached the top of the public policy agenda. We can all point to examples where people are apparently unconcerned by some very serious public health problems, such as alcohol or smoking. On the other hand, some hazards that appear to pose a comparatively low risk according to some experts, such as applications of genetics research, are the focus of considerable controversy and concern.
A first point to make is, of course, that the statistical assessment of risk is relatively easy to conduct for everyday activities that many people are routinely engaged in, such as smoking, but much more difficult for many of the controversial risks that stretch far into the future or of which we have little experience. Accordingly, the risks from technological advances may be fraught with considerable and profound uncertainty - until the technology is developed, by which time it may be too late to prevent some unforeseen consequences - or surrounded by disagreement among experts.
Regarding public understanding of risk, the research evidence from two decades of social science work on "risk perception" shows quite clearly that people have a good grasp of what is likely to kill them tomorrow, but when asked about risk they bring other factors into account. For example, risks are seen as less acceptable, or in need of more attention, if they are difficult to control personally - for example, flying - or imposed on people without their consent - for example, industrial facilities.
Risk acceptability can also vary with people's cultural or demographic characteristics (there are many "publics", each with differing views on risk), or in relation to judgements about fundamental values, such as whether the risks and benefits are unevenly distributed across a society in either space or time. A particular problem comes when activities bring present benefits but pose risks to future generations.
Sociological work, in particular, has highlighted an important aspect of people's risk concerns - that of trust or distrust in risk-managing institutions. Again, matters here are far from simple, and we have yet to develop a sufficiently deep understanding of this issue. For example, surveys show that some professionals, such as doctors or the emergency services, are consistently accorded high levels of trust across society. Politicians, perhaps not surprisingly, are often accorded very poor levels of trust, while government agencies that manage risk on our behalf can have quite varied trust profiles depending on their perceived function and performance history.
In the United Kingdom and Europe, trust questions have been central to people's concerns about BSE, and, more recently, genetic modification of foodstuffs. Some sociologists would argue that our reliance on expertise and institutionalised risk management has become a defining feature of a modern "risk society". Reading the risk stories that constantly appear across all sections of the media, one could hardly disagree, and it is also clear that there are many issues today (for example, the privacy consequences of the increasing use of large, interconnected databases) that would not in the past have been labelled as "risk issues".
Trust is important for a number of reasons, not least because it shapes how we receive the many "risk communications" we encounter daily. Most obviously, results from persuasive communication research show that if we do not trust the messenger we may not believe the message. And institutional trust (accorded to governments or business) may be lost following a serious incident or disaster, particularly if a cover-up, or anything less than a full effort to learn the lessons, is suspected.
It is for these reasons that many commentators argue that we need to go beyond simple efforts to "educate" people about risks, or to improve public understanding of science. In particular, in mapping the future path of the more socially contested risk questions (such as human genetics research and its development), we may need a form of analytic-deliberative process that allows open interrogation of the basic science of the matter, together with an acknowledgement that people's values and beliefs about risk and trust should count. This has been tried for a range of technologies in forms such as citizens' juries and consensus conferences. Whatever the vehicle and the issue, it is important to recognise that society's decisions about risk will often have to involve a judicious blend of sound science and public values. This is the real risk challenge of the new millennium.
Nick Pidgeon is director of the centre for environmental risk at the school of environmental sciences, University of East Anglia. This article is based on a presentation given to the British Association for the Advancement of Science in London.