A web of disasters waiting to occur

The Next Catastrophe
July 20, 2007

Yale University organisational sociologist Charles Perrow was shocked by the 1979 near-catastrophic meltdown at the Three Mile Island nuclear power station near Harrisburg, Pennsylvania. In 1984, he published a book called Normal Accidents as a result.

He looked about him and realised there were many complex systems in our modern lives - jumbo jets, nuclear power stations, chemical factories - that were simply bound to produce disasters because of cascading failures in day-to-day management, maintenance, design and environment. Small errors and failures accumulate and the concatenation results in "surprising" disasters. His 1998 second edition of Normal Accidents was able to take into account the heart-rending examples of the disasters at Bhopal in India and Chernobyl in Ukraine.

Now, Perrow brings us his thoughts about The Next Catastrophe. With Hurricane Katrina and the events of September 11, 2001, behind us, is this book simply a third edition of his earlier thinking, something more or something less?

I regret to say that this book is a disappointment. His earlier analysis brought Perrow to a startling and important conclusion - a conclusion that meant that Normal Accidents was almost never cited by mainstream disaster planners in the US. This was that there are systems so complex that human beings should not build them.

For purists, however, there is another part of his argument. "Tight coupling" describes systems whose components are so interdependent that a small change in one propagates rapidly through the others. Thus, complex and tightly coupled systems - such as a jumbo jet composed of more than a million pieces - are simply too much for human ingenuity to use, day in and day out, without a failure. This view was not popular in the boardrooms of Boeing, Westinghouse, General Electric and DuPont - or in government circles.

The sequel will not be ignored by the same people, as Perrow jumps on the terrorism bandwagon. But, more to the point, his conclusion is not at all radical. In the face of three kinds of hazard - natural, technological and terrorist - what we need to do, he argues, is "reduce the size of targets". There is no more talk of abandoning complex and tightly coupled systems.

In fairness, the critical edge is still there, but one has to dig for it. To begin with, the sections on the history and sad rise and decline of Fema (the Federal Emergency Management Agency) are solid and pull no punches. The Fema debacle after Hurricane Katrina was a logical outcome of its organisational capture by the Department of Homeland Security.

Organisational sociologist Perrow is on his own very competent turf here. He also insists that the US and the rest of the world are far more at risk from natural hazards and those created by technology (such as the chemical and nuclear industries) than from terrorism. This is certainly not a view designed to please the incumbent administration in Washington since it rode to power by manipulating public fear of terrorism and will doubtless keep the colours flying ("orange" and "red" alert status) through to the next election.

So why does Perrow think that the US needs to "reduce targets"? It is because of concentration. Population and hazardous industrial production and storage are concentrated spatially in large urban regions. Economic and political control is concentrated. Much strategic infrastructure such as electricity and information systems is concentrated in the hands of a few large corporations: hence the 2003 power outage, triggered by the failure of FirstEnergy Corporation in Ohio, that cut electricity to 50 million people in the eastern US and Canada, and the threat of computer viruses to those who depend on the Microsoft "monoculture".

So far, so good. But Perrow fails to probe deeper. Why is there growing concentration of economic and political power? He doesn't say, and ultimately delivers a relatively shallow and undemanding analysis, either uninterested in or unaware of a vast literature on risk that has gone far deeper. His bibliography relies on journalism with a sprinkling of scholarly works by such figures as Dennis Mileti and Enrico Quarantelli. However, he neglects, for example, thoughtful essays commissioned by the US Social Science Research Council on Hurricane Katrina ( http://understandingkatrina.ssrc.org/ ) or the work associated with the UN's International Strategy for Disaster Reduction ( http://www.unisdr.org ). He is also seemingly unaware of the excellent history of US disaster management by historian Ted Steinberg ( Acts of God: The Unnatural History of Natural Disaster in America, 2000).

Buy the book if you are a risk junkie or are worried about what Ulrich Beck calls "the risk society" in his eponymous 1992 book. It is true that Perrow turns a good phrase, and without doubt he is correct that the US chemical industry - among others - poses a greater danger than terrorism. But books like this do not bring about change. If you are seriously concerned about the next catastrophe, take a walk around your child's school or your neighbourhood with open eyes; speak with teachers, parents and neighbours. Together you'll identify hazards and vulnerabilities yourselves and can go to work on them in the political arena.

Ben Wisner is adviser to the UN University Institute for Environment and Human Security, Bonn.

The Next Catastrophe: Reducing our Vulnerabilities to Natural, Industrial and Terrorist Disasters

Author - Charles Perrow
Publisher - Princeton University Press
Pages - 353
Price - £18.95
ISBN - 9780691129976
