The risk game

https://doi.org/10.1016/S0304-3894(01)00248-5

Abstract

In the context of health, safety, and environmental decisions, the concept of risk involves value judgments that reflect much more than just the probability and consequences of the occurrence of an event. This article conceptualizes the act of defining and assessing risk as a game in which the rules must be socially negotiated within the context of a specific problem. This contextualist view of risk provides insight into why technical approaches to risk management often fail with problems such as those involving radiation and chemicals, where scientific experts and the public disagree on the nature of the risks. It also highlights the need for allowing the interested and affected parties to define and play the game, thus emphasizing the importance of institutional, procedural, and societal processes in risk management decisions.

Introduction

The practice of risk assessment has steadily increased in prominence during the past several decades as risk managers in government and industry have sought to develop more effective ways to meet public demands for a safer and healthier environment. Dozens of scientific disciplines have been mobilized to provide technical information about risk, and billions of dollars have been expended to create this information and distill it in the context of risk assessments.

Ironically, as our society and other industrialized nations have expended this great effort to make life safer and healthier, many in the public have become more, rather than less, concerned about risk. These individuals see themselves as exposed to more serious risks than were faced by people in the past, and they believe that this situation is getting worse rather than better. Nuclear and chemical technologies (except for medicines) have been stigmatized by being perceived as entailing unnaturally great risks [1]. As a result, it has been difficult, if not impossible, to find host sites for disposing of high-level or low-level radioactive wastes, or for incinerators, landfills, and other chemical facilities.

Public perceptions of risk have been found to play an important role in determining the priorities and legislative agendas of regulatory bodies such as the Environmental Protection Agency (EPA), much to the distress of agency technical experts who argue that other hazards deserve higher priority. The bulk of EPA’s budget in recent years has gone to hazardous waste primarily because the public believes that the cleanup of Superfund sites is the most serious environmental threat that the country faces. Hazards such as indoor air pollution are considered more serious health risks by experts but are not perceived that way by the public [2].

Great disparities in monetary expenditures designed to prolong life may also be traced to public perceptions of risk. As noteworthy as the large sums of money devoted to preventing a statistical fatality from exposure to radiation and chemical toxins are the relatively small sums expended to prevent a fatality from mundane hazards such as automobile accidents. Other studies have shown that serious risks from natural disasters such as floods, hurricanes, and earthquakes generate relatively little public concern and demand for protection [3], [4].

Such discrepancies are seen as irrational by many harsh critics of public perceptions. These critics draw a sharp dichotomy between the experts and the public. Experts are seen as purveying risk assessments, characterized as objective, analytic, wise, and rational — based upon the real risks. In contrast, the public is seen to rely upon perceptions of risk that are subjective, often hypothetical, emotional, foolish, and irrational (see, e.g. [5], [6]). Weiner [7] defends the dichotomy, arguing that “This separation of reality and perception is pervasive in a technically sophisticated society, and serves to achieve a necessary emotional distance …” (p. 495).

In sum, polarized views, controversy, and overt conflict have become pervasive within risk assessment and risk management. Frustrated scientists and industrialists castigate the public for behaviors they judge to be based on irrationality or ignorance. Members of the public feel similarly antagonistic toward industry and government. A desperate search for salvation through risk-communication efforts began in the mid-1980s — yet, despite some localized successes, this effort has not stemmed the major conflicts or reduced much of the dissatisfaction with risk management. This dissatisfaction can be traced, in part, to a failure to appreciate the complex and socially determined nature of the concept “risk”. In the remainder of this paper I shall illustrate this complexity and point toward the need for new definitions of risk and new approaches to risk management.

Section snippets

The need for a new perspective

New perspectives and new approaches are needed to manage risks effectively in our society. Social science research has provided some valuable insights into the nature of the problem that, without indicating a clear solution, do point to some promising prescriptive actions.

For example, early studies of risk perception demonstrated that the public’s concerns could not simply be blamed on ignorance or irrationality. Instead, research has shown that many of the public’s reactions to risk can be …

The subjective and value-laden nature of risk assessment

Attempts to manage risk must confront the question: “What is risk?”. The dominant conception views risk as “the chance of injury, damage, or loss” [10]. The probabilities and consequences of adverse events are assumed to be produced by physical and natural processes in ways that can be objectively quantified by risk assessment. Much social science analysis rejects this notion, arguing instead that risk is inherently subjective [11], [12], [13], [14], [15], [16]. In this view, risk does not …

Technical solutions to risk conflicts

There has been no shortage of high-level attention given to the risk conflicts described in the introduction to this paper. One prominent proposal by Breyer [25] attempts to break what he sees as a vicious circle of public perception, congressional overreaction, and conservative regulation that leads to obsessive and costly preoccupation with reducing negligible risks as well as to inconsistent standards among health and safety programs. Breyer sees public misperceptions of risk and low levels …

Acknowledgements

Preparation of this paper was supported by the Alfred P. Sloan Foundation, the Electric Power Research Institute, and the National Science Foundation under Grant Nos. 91-10592 and SBR 94-122754.

Reprinted from ‘The risk game’, by [32]. Copyright 1998 by Elsevier Science Limited. Reprinted with permission.

References (32)

  • R. Gregory et al., Org. Behav. Human Decision Process. (1993)
  • P. Slovic, Reliability Eng. Syst. Safety (1998)
  • R. Gregory et al., Am. Sci. (1995)
  • U.S. Environmental Protection Agency (U.S. EPA), Unfinished Business: A Comparative Assessment of Environmental...
  • R.I. Palm, Natural Hazards: An Integrative Framework for Research and Planning, Johns Hopkins, Baltimore, ...
  • H. Kunreuther, J. Risk Uncert. (1996)
  • R.L. DuPont, Nuclear Phobia — Phobic Thinking About Nuclear Power, The Media Institute, Washington, DC, ...
  • V.T. Covello, W.G. Flamm, J.V. Rodricks, R.G. Tardiff, The Analysis of Actual Versus Perceived Risks, Plenum Press, New...
  • R.F. Weiner, Risk Anal. (1993)
  • P. Slovic, Science (1987)
  • P. Slovic, Risk Anal. (1993)
  • N. Webster, Webster’s New Twentieth Century Dictionary, 2nd Edition, Simon and Schuster, New York, ...
  • S.O. Funtowicz, J.R. Ravetz, in: S. Krimsky, D. Golding (Eds.), Social Theories of Risk, Praeger-Greenwood, Westport, ...
  • S. Krimsky, D. Golding, Social Theories of Risk, Praeger-Greenwood, Westport, CT, ...
  • H. Otway, in: S. Krimsky, D. Golding (Eds.), Social Theories of Risk, Praeger-Greenwood, Westport, CT, 1992, pp. ...
  • N. Pidgeon, C. Hood, D. Jones, B. Turner, R. Gibson, in: Royal Society Study Group (Ed.), Risk: Analysis, Perception...