A methodology for modeling operator errors of commission in probabilistic risk assessment

https://doi.org/10.1016/0951-8320(94)90082-5

Abstract

This paper describes a methodology for incorporating operator errors of commission (EOCs) in nuclear power plant probabilistic risk assessments (PRAs). The methodology draws on information from the plant PRA, the operating procedures, the plant configuration in terms of systems and functions, and physical and thermal-hydraulic data. Combined with a set of performance influencing factors (PIFs), this information yields an initial condition set, which is fed into the methodology's primary tool, the Human Interaction TimeLINE (HITLINE), to systematically generate sequences of human actions, including errors. Screening is applied to combinations of hardware failures, instrument failures, and PIFs to select those combinations that meet criteria developed for the purpose; the criteria are based on whether an operator action or inaction causes a transition from one event tree (ET) branch to another.
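To make the screening step concrete, the following minimal Python sketch enumerates combinations of hardware failures, instrument failures, and PIFs and keeps only those that satisfy an ET-branch-transition criterion. All names, data structures, and the criterion itself are illustrative assumptions, not the paper's actual implementation.

```python
from itertools import product

# Hypothetical illustration of the screening step: a combination is kept
# only if the operator action (or inaction) it implies would move the
# scenario from one event-tree (ET) branch to another.
hardware_failures = ["none", "aux_feedwater_pump_A"]
instrument_failures = ["none", "steam_generator_level_indicator"]
pifs = [{"time_pressure": "high"}, {"time_pressure": "low"}]

def causes_et_branch_transition(hw, inst, pif):
    """Placeholder for the screening criterion described in the abstract."""
    return inst != "none" and pif["time_pressure"] == "high"

initial_condition_sets = [
    (hw, inst, pif)
    for hw, inst, pif in product(hardware_failures, instrument_failures, pifs)
    if causes_et_branch_transition(hw, inst, pif)
]
print(initial_condition_sets)
```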

The strategy of using mapping tables for all major implementation steps decomposes the analysis into two separately assigned parts: the scenario-dependent (error) likelihoods, and the adjustors that modify those likelihoods to account for the influence of scenario-independent PIFs. While developing the HITLINE, the methodology uses mapping tables to generate a set of PIFs from the relevant information about the plant and the emergency operating procedures (EOPs). This PIF set is then used to select predetermined weights and adjustors, from which the final weight assigned to each branch at a given branching point is computed.
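A minimal sketch of the two-part mapping-table scheme follows, assuming hypothetical table contents: one table maps plant and EOP information to a PIF set, and a second supplies the predetermined adjustors that scale a scenario-dependent base likelihood into the final branch weight.

```python
# Illustrative mapping tables; the entries are assumptions for the example,
# not the paper's actual tables.
pif_mapping_table = {
    # (plant condition, EOP step type) -> PIF set
    ("degraded_heat_sink", "verification_step"): {"workload": "high", "ambiguity": "low"},
    ("nominal", "control_step"): {"workload": "low", "ambiguity": "low"},
}

adjustor_table = {
    # scenario-independent multipliers applied to the base likelihood
    ("workload", "high"): 2.0,
    ("workload", "low"): 1.0,
    ("ambiguity", "low"): 1.0,
}

def branch_weight(base_weight, plant_condition, eop_step):
    """Final branch weight = scenario-dependent base likelihood times the
    adjustors selected by the scenario's PIF set."""
    pif_set = pif_mapping_table[(plant_condition, eop_step)]
    weight = base_weight
    for pif_item in pif_set.items():
        weight *= adjustor_table[pif_item]
    return weight

print(branch_weight(0.01, "degraded_heat_sink", "verification_step"))  # 0.02
```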

Quantification at each branching point is performed through multiple factors that systematically assign the weights for the different actions. Dependencies are carried from one branching point to the next through operator-related variables, such as the operator's diagnosis and expectations about plant behavior. The size of the HITLINE is managed by applying merging, truncation, and termination rules at each time step.
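The merging and truncation rules might look like the sketch below, in which each branch is summarized by hypothetical operator-state variables (diagnosis, expectation about plant behavior): branches below an assumed weight cutoff are truncated, and branches with identical operator states are merged by summing their weights.

```python
from collections import defaultdict

# Assumed cutoff for the truncation rule; the paper's actual threshold and
# state variables are not given in the abstract.
TRUNCATION_CUTOFF = 1e-6

def step_hitline(branches):
    """branches: list of (operator_state, weight) tuples, where operator_state
    is a hashable summary such as (diagnosis, expected_plant_behavior)."""
    merged = defaultdict(float)
    for state, weight in branches:
        if weight < TRUNCATION_CUTOFF:   # truncation rule
            continue
        merged[state] += weight          # merging rule for identical states
    return list(merged.items())

branches = [
    (("loss_of_feedwater", "level_recovering"), 0.4),
    (("loss_of_feedwater", "level_recovering"), 0.1),   # merged with the above
    (("sgtr_misdiagnosis", "level_recovering"), 5e-7),  # truncated
]
print(step_hitline(branches))
```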

Branches that reach similar end states with respect to the ET are terminated and their weights combined. Combination is also applied across HITLINEs constructed for different initial condition sets of a given initiating event. Incorporating the results back into the ET either reorders the sequence probabilities or adds operator-related top events to the ET. The methodology is demonstrated with a hypothetical example.
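Finally, a hedged sketch of the combination step: end-state weights that map to the same ET branch are summed across HITLINEs built for different initial condition sets. Weighting each HITLINE by an initial-condition-set frequency is an assumption made here for illustration; the abstract does not state how the combination is normalized.

```python
from collections import Counter

def combine_end_states(hitlines_with_freq):
    """hitlines_with_freq: iterable of (ic_set_frequency, end_states) pairs,
    where end_states is a list of (et_branch, weight) tuples."""
    totals = Counter()
    for freq, end_states in hitlines_with_freq:
        for et_branch, weight in end_states:
            totals[et_branch] += freq * weight
    return dict(totals)

# Two HITLINEs for the same initiating event, with assumed frequencies.
hitline_a = (0.6, [("core_damage_seq_3", 2e-5), ("ok", 0.99998)])
hitline_b = (0.4, [("core_damage_seq_3", 5e-6), ("ok", 0.999995)])
print(combine_end_states([hitline_a, hitline_b]))  # core_damage_seq_3: 1.4e-05
```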


    Current Address: Lab. for Measurement and Control, Department of Mechanical Engineering, University of Delft, Mekelweg 2, 2628 CD Delft, The Netherlands.
