As our society has become increasingly dependent on technology, the gap between scientists and the public has deepened. Since the 1960s, the public debate on science and technology has been characterised by, on one side, scientific experts who regarded the public as irrational and ignorant about science, and on the other, ordinary people who found themselves surrounded by technology they had little chance of understanding or influencing.
If we assume the Foucauldian perspective that knowledge is power (Foucault 1977), it is clear that the gap between experts and laypeople is a power relation. This gap will only deepen as time goes by, because laypeople tend to withdraw from discussing scientific subjects that they feel ignorant about, and so public ignorance becomes a self-fulfilling prophecy (Michael 1996).
From a normative perspective, the public should be consulted about dangerous technologies. Traditional risk communication has used the 'deficit model' (Wynne 1996), which views the public mechanistically - as an empty bottle to be filled with information. Other communication models have used surveys to identify the gaps in laypeople's scientific knowledge.
These approaches ignore the psychological aspects of ignorance. Studies have shown that some people are proud of being ignorant - they define themselves in opposition to science - while others see their scientific ignorance as part of their social position (Michael 1996). Thus, risk communication is not just a matter of transferring facts to a homogeneous public.
Paul Slovic (1986) emphasises that the risk communicator must be able to appreciate public attitudes and perceptions and use them in their communication. With Hazard Cards we deliberately try to eliminate the authoritative power relation between science and laypeople. This is done by prioritising the common knowledge of the layperson. Furthermore, with the design of the cards, and the opportunity for users to create their own cards, we aim to facilitate dialogue about technology in society.
Michael, M. (1996). "Ignoring Science: Discourses of Ignorance in the Public Understanding of Science". In: Irwin, A. and Wynne, B. (eds.), Misunderstanding Science? The Public Reconstruction of Science and Technology. Cambridge University Press, Cambridge.
Slovic, P. (2000). The Perception of Risk. Earthscan Publications Ltd., London and Sterling, VA.
Jaeger, C.C., Renn, O., Rosa, E.A., Webler, T. (2001). Risk, Uncertainty, and Rational Action. Earthscan Publications Ltd., London and Sterling, VA.
Otway, H., Wynne, B. (1989). "Risk Communication: Paradigm and Paradox". Risk Analysis, 9(2), pp. 141-146.
A. Air Transport
C. Land Transport
D. Marine Transport
E. Bridges and Dams
F. Oil Tankers
G. Chemical Industry
H. Medical Industry
I. Nuclear Industry
Quantitative Risk Assessment
Normal Accident Theory
High Reliability Organisations
Fear Factor (0-10)
Media Effect (0-100)
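The categories and numeric attributes above suggest the structure of a single card. A minimal sketch of such a record is below, in Python; the class name, field names, and validation logic are illustrative assumptions, not the project's actual implementation - only the category names and the 0-10 and 0-100 ranges come from the list above.

```python
from dataclasses import dataclass

# Categories taken from the list above (the source skips the letter B).
CATEGORIES = {
    "Air Transport", "Land Transport", "Marine Transport",
    "Bridges and Dams", "Oil Tankers", "Chemical Industry",
    "Medical Industry", "Nuclear Industry",
}

@dataclass
class HazardCard:
    """Hypothetical record for one Hazard Card."""
    title: str
    category: str
    fear_factor: int   # 0-10, per the attribute list above
    media_effect: int  # 0-100, per the attribute list above

    def __post_init__(self):
        # Reject values outside the ranges printed on the cards.
        if self.category not in CATEGORIES:
            raise ValueError(f"unknown category: {self.category}")
        if not 0 <= self.fear_factor <= 10:
            raise ValueError("fear_factor must be in 0-10")
        if not 0 <= self.media_effect <= 100:
            raise ValueError("media_effect must be in 0-100")

# Example of a user-created card, as the text invites users to make:
card = HazardCard("Reactor incident", "Nuclear Industry",
                  fear_factor=9, media_effect=95)
```

Validating the two scores at construction time mirrors the fixed scales on the printed cards, so user-created cards stay comparable with the original deck.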