Learning Lab Denmark - INCUBEus

Normal Accident Theory

When we try to understand the causes of technological accidents, it often turns out to be very difficult to pinpoint exactly what went wrong. The reason is that technologies are intrinsically complex and depend on many things working closely together: Materials and components of varying quality are structured into tightly engineered sub-systems, which are operated by error-prone humans within organisational structures that are rarely optimal, and which in turn are subject to production pressures and all kinds of managerial manoeuvring.

Failure in just one part (material, sub-system, human, or organisation) may coincide with the failure of an entirely different part, revealing hidden connections, neutralised redundancies, bypassed firewalls, and random occurrences for which no engineer or manager could reasonably plan.

This is what 'Normal Accident Theory' is about: When a technology has become sufficiently complex and tightly coupled, accidents are inevitable and therefore in a sense 'normal'.
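
The inevitability claim can be made concrete with a back-of-the-envelope calculation. The sketch below is purely illustrative and uses made-up numbers (the component count, operating hours, and per-hour failure probability are assumptions, not figures from Perrow): even when every individual part is highly reliable, a system with thousands of parts running for thousands of hours will see some part fail with near certainty, so safety ultimately depends on whether such failures can interact.

```python
# Probability that at least one of n independent components fails
# during t operating periods, given per-period failure probability p.
# All numbers below are illustrative assumptions, not real data.
def p_any_failure(n: int, t: int, p: float) -> float:
    return 1 - (1 - p) ** (n * t)

# A hypothetical plant: 5,000 components, one year (8,760 hours) of
# operation, each component failing with probability 1e-7 per hour.
print(f"{p_any_failure(5_000, 8_760, 1e-7):.3f}")  # ~0.99
```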

Accidents such as Three Mile Island, and a number of others, all began with a mechanical or other technical mishap that then spun out of control through a series of technical cause-and-effect chains, because the operators involved could not stop the cascade or unwittingly did things that made it worse. Apparently trivial errors can thus cascade through the system in unpredictable ways and produce disastrous results.
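
The cascade mechanism can likewise be sketched as a toy Monte Carlo model (a hypothetical illustration, not Perrow's own analysis): components suffer small independent mishaps, and in a tightly coupled system each failed component can drag down its neighbours before anyone intervenes. Varying a single coupling parameter shows how the rate of system-wide accidents climbs from essentially zero to commonplace.

```python
import random

def accident_rate(n: int = 100, p_fail: float = 0.02, coupling: float = 0.0,
                  accident_size: int = 20, trials: int = 10_000) -> float:
    """Fraction of trials in which independent mishaps cascade into a
    'system accident' (accident_size or more failed components).
    All parameter values are illustrative assumptions."""
    rng = random.Random(0)
    accidents = 0
    for _ in range(trials):
        # Small, independent initiating faults.
        failed = {i for i in range(n) if rng.random() < p_fail}
        # Tight coupling: each newly failed component may knock out a
        # neighbour; the cascade repeats until it stops spreading.
        frontier = set(failed)
        while frontier:
            spread = set()
            for i in frontier:
                for j in ((i - 1) % n, (i + 1) % n):
                    if j not in failed and rng.random() < coupling:
                        spread.add(j)
            failed |= spread
            frontier = spread
        if len(failed) >= accident_size:
            accidents += 1
    return accidents / trials

for c in (0.0, 0.3, 0.6, 0.9):
    print(f"coupling={c:.1f}  system-accident rate={accident_rate(coupling=c):.3f}")
```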

This way of analysing technology has normative consequences: If potentially disastrous technologies, such as nuclear power or biotechnology, cannot be made entirely 'disaster proof', we must consider abandoning them altogether.

Charles Perrow, the author of Normal Accident Theory, came to the conclusion that "some technologies, such as nuclear power, should simply be abandoned because they are not worth the risk".

This political statement has made Normal Accident Theory highly controversial, and the main body of research has since concentrated on how to make organisations and high-risk technologies more reliable, i.e. 'disaster proof', so that the politically and democratically important discussion of whether to allow specific technologies need not be taken up.

Further reading
Perrow, C. (1984/1999). Normal Accidents: Living with High-Risk Technologies. Princeton, NJ: Princeton University Press.

Sagan, S.D. (1993). The Limits of Safety: Organizations, Accidents, and Nuclear Weapons. Princeton, NJ: Princeton University Press.
