Complex systems and risk management


This post on risk assessment by Brian Randell, dated 3 Mar 1988, seems very pertinent to the current discussion, even though it's over 11 years old. Found it at http://catless.ncl.ac.uk/Risks/6.37.html. (Quotes from the book he mentions are enclosed in double quotes; all other material is Randell's.) If this has already been mentioned on this forum, I'm not aware of it.

I've set off the most telling paragraph (IMO) in boldface.

I've recently been reading Normal Accidents, by Charles Perrow (Basic Books, New York, 1984)... [snip- (tc)] ...although it contains few explicit references to computers, and is written from the viewpoint of a social rather than a computer scientist, I thought the following quotes from it might be of interest:

"Complex systems are characterized by:

"* proximity of parts or units that are not in a production sequence;

"* many common mode connections between components (parts, units or subsystems) not in a production sequence;

"* unfamiliar or unintended feed-back loops;

"* many control parameters with potential interactions;

"* indirect or inferential information sources; and

"* limited understanding of some processes."

"Complex systems are not necessarily high risk systems with catastrophic potential; universities, research and development firms, and some government bureaucracies are complex systems . . ."

"In complex systems, not only are unanticipated interdependencies more likely to emerge because of a failure of a part or a unit, but those operating the system (or managing it) are less likely, because of specialized roles and knowledge, to predict, note, or be able to diagnose the interdependency before the incident escalates into an accident."

"On the whole, we have complex systems because we don't know how to produce the output through linear systems. If these complex systems have catastrophic potential, then we had better consider alternative ways of getting the product, or abandoning the product entirely."

"Tight coupling is a mechanical term meaning that there is no slack or buffer or give between two items. What happens in one directly effects what happens in the other....Elaborating the concept as used by organizational theorists will allow us to examine the responsiveness of systems to failures, or to shocks. Loosely coupled systems, whether for good or ill, can incorporate shocks and failures and pressure for change without destabilization. Tightly coupled systems will respond more quickly to these perturbations, but the response may be disastrous. Both types of systems have their virtues and vices."

"Since failures occur in all systems, means to recovery are critical. One should be able to prevent an accident, a failure of a part or a unit, from spreading. All systems design-in safety devices to this end. But in tightly coupled systems, the recovery aids are largely limited to deliberate, designed-in aids, such as engineered-in safety devices..."

The above quotations are from the main analytical chapter in the book. Subsequent chapter titles are: 'Petrochemical Plants', 'Aircraft and Airways', 'Marine Accidents', 'Earthbound Systems: Dams, Quakes, Mines and Lakes', and 'Exotics: Space, Weapons and DNA'. The final chapter is entitled 'Living with High Risk Systems', from which the following quotes come:

"I propose using our analysis to partition the high-risk systems into three categories. The first would be systems that are hopeless and should be abandoned because the inevitable risks outweigh any reasonable benefits (nuclear weapons and nuclear power); the second, systems that we are unlikely to be able to do without but which could be made less risky by considerable effort (some marine transport), or where the expected benefits are so substantial that some risks should be run, but not as many as we are now running (DNA research and production). Finally, the third group includes those systems which, while hardly self-correcting in all respects, are self-correcting to some degree and could be further improved with quite modest efforts (chemical plants, airlines and air traffic control, and a number of systems which we have not examined carefully but should mention here, such as mining, fossil fuel power plants, highway and automobile safety). The basis for these recommendations rests not only with the system accident potential for catastrophic accidents, but also the potential for component failure accidents. I think the recommendations are consistent with public opinions and public values."

"My recommendations must be judged wrong if the science of risk assessment as currently practiced is correct. Current risk assessment theory suggests that what I worry about most (nuclear power and weapons) has done almost no harm to people, while what I would leave to minor corrections (such as fossil fuel plants, auto safety, and mining) has done a great deal of harm."

This leads on to a very interesting critique of risk assessment, from which I have extracted:

"While not as dangerous as the systems it analyzes, risk assessment carries its own risks ..."

"When societies confront a new or explosively growing evil, the number of risk assessors probably grows - whether they are shamans or scientists. I do not think it an exaggeration to say that their function is not only to inform and advise the masters of these systems about the risks and benefits, but also, should the risk be taken, to legitimate it and to reassure the subjects."

"This is a very sophisticated field. Mathematical models predominate; extensive research is conducted ... yet it is a narrow field, cramped by the monetarization of social good."

"The risk assessors, then, have a narrow focus that all too frequently (but not always) conveniently supports the activities elites in the public and privare sector think we should engage in. For most, the focus is on dollars and bodies, ignoring social and cultural criteria. The assessors do not distinguish risks taken for private profits from those taken for private pleasures or needs, though the one is imposed, the other to some degree chosen; they ignore the question of addiction, and the distinction between active risks, where one has some control, and passive risks; they argue for the importance of risk but limit their endorsement of approved risks to the corporate and military ones, ignoring risks in social and political matters."



-- Tom Carey (tomcarey@mindspring.com), October 14, 1999

Answers

From the Annotated Bibliography of Metaphor and Cognitive Science, published at http://metaphor.uoregon.edu/annbib.htm:

[snip]

Perrow, Charles. (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.

This incredible book examines how, in highly complex systems such as a nuclear reactor or a research lab, the interaction of multiple component failures can cause a normal (or system) accident, which can be a disaster when the systems are not just complex and interactive but tightly coupled. Tight coupling is characteristic of systems like the nuclear power plant or air traffic system, without much slack or alternative ways of fixing something, while the research lab, without production pressures, deadlines and schedules, is loosely coupled. Perrow's book is extraordinarily rich in detailed examples of accidents in the nuclear power industry, the petrochemical industry, the airplane and air traffic industry, ship accidents, modifying the ecosystem (dams, quakes, lakes, and mines), the space program, biotechnology and the nuclear weapons industries. This is an example of the best sort of empirical work, where the theory simply leaps forward from the detailed presentation of data.

Then, in summing up his work, Perrow uses his analysis to pose serious questions not just about the social benefits of high-risk technologies, but about the rationality of risk assessment. Is it right to assess risk in terms of bare numbers and statistics (as the professional risk analysts often do), or does the fact that ordinary people consistently and reliably assess risk differently than the experts suggest that the experts might be overlooking something? This is the old debate about the usefulness of expert knowledge from Plato's Protagoras: are we to place blind faith in the expert possessor of the expert knowledge? Can an expert be wrong? Much of the evidence from cognitive psychology, and some of the best work on risk assessment (such as the Slovic and Fischhoff work Perrow cites), recognizes that ordinary people reason differently than experts, and the difference is reliable, numerically measurable and predictable, and perhaps explainable. But it is explainable only in terms which do not lend themselves easily to numbers; instead, it is explainable only in terms of a thick description (cf. Geertz), as opposed to the thin quantitative descriptions of the experts.

This suggests to Perrow, and to me, that there is more than one kind of rational assessment: the absolute and objective rationality of the experts, achieved by standing outside the problem; a bounded (or limited) rationality which, after admitting our cognitive abilities are limited, suggests that the experts' numbers need to be supplemented with appropriate heuristics to overcome these deficiencies; and social or cultural rationality, which takes into account the messy and hard-to-describe logic of our ordinary reasoning and the limits on our cognitive abilities (cf. pp. 316-323). The former two types, Perrow suggests, are thin rationalities; only the third is a thick rationality.
[/snip]

and

From Emergency Service Multi-System Disruption and Recovery During Catastrophic Events, published at http://www.emergency.com/emrchaos.htm:

[snip]
Normal Accident Theory explains that systems are so complex that accidents are inevitable. However, High Reliability Theory, as recent studies aboard U.S. Navy aircraft carriers and F-14 squadrons show, explains that complexity can mitigate risk. High Reliability Organizations use organizational structure to adapt to and then mitigate the uncommon yet catastrophic event.
[/snip]

-- Critt Jarvis (critt@critt.com), October 14, 1999.
