Safety understandings among crane operators and process operators on a Norwegian offshore platform
Introduction
In recent years, both the media and the Norwegian authorities have paid considerable attention to safety and safety culture in the petroleum industry. In line with this focus, various actors have implemented initiatives intended to improve safety.
The company studied here has introduced a general principle requiring all employees to demand a stop in any work operation they find dangerous.1 This paper focuses on how process operators and crane operators define dangerous work situations. A number of studies show that cultural, structural and psychological factors influence definitions of dangerous situations (e.g., Johannesen and Olsen, 2003, Hale and Glendon, 1987, Vaughan, 1997, Perrow, 1984). In this paper, the cultural basis for defining work situations as dangerous is studied. There is a vast literature on (safety) culture in organizations, which generally understands culture as the attitudes, beliefs and values shared by groups of people. Some students of culture apply a Geertzian perspective and refer to culture as shared “systems of meaning”: values, beliefs, symbols, etc., which are interrelated and form a meaningful pattern (Richter, 2001, p. 26). I understand culture in line with the Geertzian perspective, and refer to culture as “ways of thinking”, a concept inspired by Richter’s “systems of meaning”.
The first aim of the paper is to present certain ways of thinking mobilized by two different work groups on a Norwegian offshore platform when interpreting, negotiating and defining work situations as dangerous. These ways of thinking are then compared.
Various students of culture in organizations push for a focus on work groups (e.g., Alvesson, 2002, Hale, 2000, Gherardi et al., 1998). Alvesson claims that members of work groups tend to develop similar understandings of their work (Alvesson, 2002, p. 13).
The second aim of the paper is to analyze the discernments required for defining certain situations as dangerous. A number of studies show that definitions of dangerous situations invariably require discretion (Vaughan, 1997, Vaughan, 2002, Johannesen and Olsen, 2003).
Section snippets
The two work groups
The two work groups are involved in very different work processes. The process operators (POs) work with closed systems that in many ways resemble Perrow’s (1984) notion of interactive complexity. Unlike the crane operators (COs), the POs are not in a position to see the dangers related to their work directly, and therefore have to infer dangers indirectly by means of computers and alarms.
The POs (16 persons on the day shift) work both in the control room (2–3 POs) and alone at
The literature on (safety) culture in organizations
The 1980s saw a huge increase in the amount of literature on culture in organizations (Alvesson and Berg, 1992, p. 9). Organizational theorists use expressions such as “the cultural turn” and “the culture boom” to describe the trends in organizational analysis at that time (Thompson and McHugh, 2002: 1991). These expressions illustrate that the focus of the prevailing literature on organizations shifted from economy and structure to culture in the 1980s. A consequence of the “cultural turn” is that
Culture depicted as ways of thinking
The cultural anthropologist Clifford Geertz is often depicted as the most important source of inspiration for interpretive studies of cultural phenomena in organizations (Richter, 2001, Hatch, 2001). Geertz defines culture as “(…) an ordered system of meaning and of symbols, in terms of which social interaction takes place” (Geertz, 1993, p. 144). He views culture as a prerequisite for thoughts and actions (Haukelid, 1998, Haukelid, 2001, Haukelid, 2006). Geertz stresses that culture must be
Discernments required for definitions of dangerous situations
To employ discretion is to consider particular situations in the light of general conditions, to discern the relevant elements of particular situations. Judgments can be made on the basis both of explicit rules and procedures and of professional skill. Discernments in the latter sense can occur both as analytical considerations and as tacit knowledge. Tacit knowledge is an implicit form of knowledge that is not codified, but based on experience (Polanyi, 2000). Dreyfus and Dreyfus (1988) describe the
Interviews and analysis
The quantitative approach is the most popular in studies of safety culture (Glendon and Stanton, 2000, p. 209). The greatest advantage of quantitative research designs is that they produce representative results that can be generalized. Despite this, I chose to conduct qualitative interviews, in order to identify relatively coherent understandings and ways of thinking in relation to dangerous situations.
The data presented in this paper are a result of 14 formal
Metaphors/cultural expressions
The COs’ “best practice, enclosure A: roles and responsibilities” states that the operator of lifting arrangements must not participate in lifting operations if he or she feels physically or mentally unfit. Only one of the COs referred to this rule when talking about situations where the CO was unfit to operate the crane.
We have an open environment. If a new CO is doing something on the boat, and he doesn’t manage it, because he is inexperienced, or it is not his day or something. (…) Everyone
Discussion
The two ways of thinking presented in this paper share an essential characteristic: both the COs and the POs primarily attribute hazards to the individual. They understand their own safety contribution in relation to, for instance, technological and organizational factors, but their focus is on human error as a source of danger. The COs seem to place weight on inattention, which I have analytically labeled as a lack of “flow” (Dreyfus and Dreyfus, 1988, p. 40). The POs emphasized the hazard potential
Conclusions
Through the presentation and analysis of the work groups’ ways of thinking, I have attempted to depict how COs and POs think and act in relation to safety and danger. The shared ways of thinking illustrate how the work groups conceive of two typical hazards in their work operations and how they act to prevent these dangerous situations. The ways of thinking seem to direct attention to factors of key importance to safety.
The shared ways of thinking regarding work hazards
Acknowledgement
I am very grateful to the Norwegian Research Council’s Petromaks program, which has funded this research.
References (40)
- et al., Assessing safety culture in offshore environments, Safety Science (2000)
- Managing safety in the workplace: an attribution theory analysis and model, Journal of Safety Research (1994)
- et al., Perspectives on safety culture, Safety Science (2000)
- Editorial: culture’s confusions, Safety Science (2000)
- et al., Integration, differentiation and ambiguity in safety cultures, Safety Science (2004)
- Cultural Perspectives on Organizations (1993)
- Organisasjonskultur og Ledelse [Organizational Culture and Leadership] (2002)
- et al., Corporate Culture and Organizational Symbolism (1992)
- et al., Sociological Paradigms and Organizational Analysis: Elements of the Sociology of Corporate Life (1985)
- et al., Safety culture: philosopher’s stone or man of straw?, Work and Stress (1998)
- Mind over Machine: The Power of Human Intuition and Expertise in the Era of the Computer
- Interviewing: the art of science
- The Interpretation of Cultures
- What do you mean by safety? Conflicting perspectives on accident causation and safety management in a construction firm, Journal of Contingencies and Crisis Management
- Toward a working theory of culture
- Individual Behavior in the Control of Danger
- Organisasjonsteori [Organization Theory]
- En historie om risiko: antropologiske betraktninger om sikkerhet, bedriftskultur og ledelse i norsk oljevirksomhet [A history of risk: anthropological reflections on safety, corporate culture and management in the Norwegian oil industry], TMV skriftserie