There is a four in a million chance that an aeroplane will land on your head. Is this an acceptable risk? Ayala Ochert reports on the US and the UK's conflicting approaches to risk
Ten years ago, John Graham had a life-altering experience. No bright lights appeared in the sky, nor did he go through a spiritual awakening. Rather, Graham heard that the United States Congress was considering a law that nearly shattered his faith in his country's system of government. The law, had it been passed, would have required the immediate shutdown of any factory causing an extra cancer risk of one person in a million over a lifetime.
"People do not realise that a baby born today has not one but four chances in a million of being killed on the ground by a crashing aeroplane," says Graham, appalled at legislators' ignorance of risk assessment.
Inspired to fulfil what he saw as a "broad social need to help people understand risk", Graham set up the Center for Risk Analysis at Harvard University. Apart from educating decision-makers, the centre is devoted to creating the best tools for risk assessment to produce the most accurate numbers for any given risk.
So, what number does he think represents a "safe threshold"? Not one in a million, or even one in 1,000 - there is no magic number, Graham says. Each risk must be taken on its own terms. "There is no logical rationale for picking a particular number."
The questions may be complex, but Graham has absolute faith in the science of risk analysis to break them down and provide clear-cut answers. He says that untrained lay people tend to lump together "facts" and "values", while scientists are trained to keep them apart. While those in the field of risk assessment work on getting more accurate statistics about particular risks (the facts), those in the field of risk evaluation study people's preferences, which risks they most fear - their values. Risk analysts ask members of the public to rate quality of life by assigning a value between 0 and 1 for any particular condition, where 0 is the worst health (or death) and 1 is perfect health. If people regard a disease such as cancer with more dread than heart disease, that can be factored into the analysis.
"We don't ask lay people to assess the probabilities, we ask them to assess the outcomes. Then we combine the two," Graham says. Any departure from this approach "murders statistical lives and leads to more death and human misery than is necessary".
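The "combine the two" step Graham describes can be sketched as a simple expected-value calculation: analysts supply the probabilities (the facts), the public supplies quality-of-life weights between 0 and 1 (the values), and the two are multiplied. This is only an illustrative sketch of that idea; the function name and figures are hypothetical, not from the Harvard centre's actual methodology.

```python
def expected_quality_loss(probability, quality_weight):
    """Expected quality-adjusted loss for one outcome.

    probability    -- chance of the outcome occurring (a 'fact')
    quality_weight -- public's rating of life with the condition,
                      where 0 is worst health (or death) and
                      1 is perfect health (a 'value')
    """
    return probability * (1.0 - quality_weight)

# Two hypothetical hazards with the same probability but different
# public dread, expressed as quality-of-life ratings:
cancer = expected_quality_loss(probability=1e-6, quality_weight=0.3)
heart_disease = expected_quality_loss(probability=1e-6, quality_weight=0.5)

# The more-dreaded condition (lower quality weight) produces the
# larger expected loss, so it ranks as the bigger risk even though
# the raw probabilities are identical.
assert cancer > heart_disease
```

On this scheme, public dread of a disease changes its ranking without anyone being asked to second-guess the statistics themselves.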
Not everyone agrees that risks can be so neatly packaged and measured. Sheila Jasanoff, professor of science and public policy at Harvard, argues that rigid separation of "facts" and "values" is unrealistic. She believes no part of the risk analysis process is so sacred that it should be kept from the public, whose participation should be sought at every point, including the earliest stages, when decisions are being made about what questions to ask to frame the nature of a risk. That does not happen now, she says, because analysts are blinkered and focus only on coming up with the best numbers. "Some people are caught up in the love of technical analysis, so they don't recognise the institutional limitations," she claims.
Graham's and Jasanoff's offices may be separated only by the Charles River, which runs between Boston and Cambridge, but intellectually it might as well be the Atlantic. Graham typifies the US approach to risk, while Jasanoff's style has a more European flavour.
"Right now there is an impasse between the two big models of 'risk' and 'precaution'. Americans are saying that we can understand risks on the basis of sound science and risk analysis. Europeans say that is an empty assertion, that there are too many unknowns and therefore we should adopt a precautionary standpoint," Jasanoff says.
When it comes to involving the public in evaluating risks such as the BSE scare, she says, Europe has been slow to catch up with the US, which has for many years opened its political, if not its scientific, doors to citizens. But with the recent public outcry in Europe over genetically modified foods, that is beginning to change.
While he agrees that public consultation is crucial to risk analysis, Graham warns of the dangers of going too far down the populist route. Even in a democracy, he argues, people's feelings should not always be taken into account. "When you get into the business of how people feel, if you don't have any grounding in actual probability, then I think you are on a slippery slope."
Graham is also critical of the European model of precaution, which he says is more rhetoric than substance - safety standards in the US are often tougher than they are in Europe. "When I went to France and Germany recently, I was amazed. There was all this talk of bio-engineered foods, yet there they were sitting in bars blowing cigarette smoke in each other's faces!"