Irrational (but Reproducible) Risk Research
Date: Tue, 8 Feb 94 12:59:02 PST
Subject: Irrational (but Reproducible) Risk Research
From: "email@example.com" <firstname.lastname@example.org>
From: Paul (other info lost)
From: Phil Agre (email@example.com)
* Irrational Risk Research
The 1 Feb 1994 New York Times (science section) includes an article by Daniel
Goleman entitled "Hidden rules often distort ideas of risk". It's about a set
of social psychological ideas about "perceptions" of risk that become
newsworthy about once a year despite never seeming to change. These include:
* Risks that are imposed loom larger than those that are voluntary.
* Risks that seem unfairly shared are also seen as more hazardous.
* Risks that people can take steps to control are more acceptable
than those they feel are beyond their means to control.
* Natural risks are less threatening than man-made ones.
* Risks that are associated with catastrophes are especially frightening.
* Risks from exotic technologies cause more dread than do those
involving familiar ones.
The article reports a spectrum of views about the best explanation of these
results and the best policies to deal with them. This spectrum might be
categorized as follows:
Conservative: People are irrational, so forget 'em.
Moderate: People are irrational, but we can persuade them.
Liberal: People are irrational, but hey, everyone has faults,
so let's humor them a little.
The common element, of course, is a view of ordinary people as irrational
because their rankings of the risks from various technologies are
considerably different from those of the experts.
What somehow never ceases to amaze me is that all three approaches neglect a
perfectly obvious explanation, which is that people distrust the institutions
that seek to reassure them about unfamiliar technologies, having been
repeatedly and egregiously lied to in the past by many of those same
institutions (they were feeding plutonium to *whom*?), and they resent living
in a world dominated by such institutions, so they refuse to acknowledge the
claims of any technological project that has not been organized and evaluated
in a democratic way. (The article does remark that people don't trust the
numbers, but that's apparently because people irrationally fail to weigh the
nuclear power plants that blow up against all the ones that don't.) Probably
that's too simple, but it explains the data much more straightforwardly than
the known alternatives.
The interesting sociological question is how this feat of conceptual
constriction actually *works*. Does this explanation literally never occur to
the people who do this research and write these articles? How can that be?
Is it a conscious PR thing? That would be disappointing in a way (too
straightforward), but it's certainly true enough with numerous other issues.
Clearly, as articles on the science pages so often conclude, further research
is needed.
Phil Agre, UCSD
© 1994 Peter Langston