Article

Safety Culture Oversight: An Intangible Concept for Tangible Issues within Nuclear Installations

Benoît Bernard
Bel V, 1070 Brussels, Belgium
Submission received: 9 August 2018 / Revised: 21 September 2018 / Accepted: 6 October 2018 / Published: 9 October 2018
(This article belongs to the Special Issue Nuclear Safety)

Abstract

Traced back to the Chernobyl Accident analysis (INSAG-1), the concept of safety culture is regarded as a central phenomenon influencing behaviors and values within high-risk organisations. Many studies have already been conducted on safety culture within nuclear installations. Describing a model designed to capture and assess safety culture observations, this paper intends to highlight the role of safety culture within the overall regulatory nuclear safety oversight, and to show how intangible cultural elements can lead to the identification of tangible safety issues.

1. Introduction

Safety culture is “a relatively stable, multidimensional, holistic construct shared by (groups of) organisational members that supplies a frame of reference and which gives meaning to and/or is typically revealed in certain practices” [1]. Constructed over the years, these frames of reference are made of multiple layers [2]; they drive people’s actions and shape perceptions related to safety. As a critical dimension ensuring success or causing failure in nuclear organisations [3,4,5,6,7,8,9,10,11,12,13], safety culture has generated numerous studies in recent years [14] (a Google Scholar search for the keyword “safety culture” returned 3,540,000 articles, 1,620,000 of which were identified for the period 2008–2018; search performed in September 2018).
From a regulatory perspective, safety culture oversight is also a major concern, aimed at capturing weak signals or deep-seated cultural issues that could lead to an event. Consisting of interpretations, assumptions and beliefs guiding behaviors towards risks, safety culture is a fundamental dimension of safety to be addressed by a regulatory body [15].
Safety culture is also a safety dimension that is almost impossible to regulate at a distance: a regulator cannot impose a good safety culture in the same way it can impose a rule. In addition, since safety culture is defined as a set of deeply rooted frames of reference, overseeing it implies a deep knowledge of field practices, of operators’ ways of thinking and of the basic assumptions that people act upon. In other words, the building blocks of safety culture are mainly intangible elements. On the one hand, safety culture is a critical dimension for safety; on the other hand, it remains difficult to discern. How far, then, can a regulatory body influence the safety culture of a nuclear installation?
The aim of this paper is, therefore, to shed light on a method used to capture and assess safety culture from a nuclear regulatory body perspective, in this case in Belgium. More precisely, the key idea of the paper is to show that intangible cultural elements [2] can lead to the identification of tangible safety issues to be addressed by a licensee. For that purpose, we first describe a model designed to capture and assess safety culture observations: which elements of culture should we focus on, how do we observe safety culture, is it quantifiable, and how do we assess it? Secondly, we show how this model can be applied to a Nuclear Power Plant (NPP) case study. Finally, we focus on the potential impacts of safety culture assessment on safety oversight.

2. Capturing Safety Culture

2.1. A Model Based on Safety Culture Observations

Safety culture can be addressed through different means, such as questionnaires, document reviews, focus groups, interviews and observations [16]. The model proposed in this paper is based on a “Safety Culture Observations” (SCO) process applied for several years within the Belgian TSO (Technical Safety Organisation), Bel V. As a subsidiary of the FANC (Federal Agency for Nuclear Control), Bel V carries out the surveillance of the Belgian nuclear installations within the frame of the Belgian laws and regulations. Bel V provides technical support concerning nuclear safety assessments (Safety Evaluation Reports), performs conformity checks of new plants or modifications, and carries out inspections in existing installations. The Belgian Regulatory Body is composed of the FANC and Bel V.
This model is fed by field observations provided by inspectors or safety analysts during any contact with a licensee (inspections, meetings, phone calls…). These observations are recorded in an observation sheet (e.g., an Excel sheet) aimed at describing factual and contextual issues. The observations are thereafter linked to safety culture attributes based on IAEA (International Atomic Energy Agency) standards [17].
Operationally speaking, the observation sheet template provides a homogeneous framework for recording information about the facility, the type of intervention during which the observation was made (inspection, meeting, etc.), the topic (matter of inspection/discussion) and the date of observation. More fundamentally, a safety culture observation also implies a description of the context, the identification of safety culture attributes, and an argumentation developing the reasons why the observed fact is linked to safety culture. As an important feature, observations can be positive or negative.
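By way of illustration, the sketch below (in Python) shows how such an observation sheet entry could be represented as a structured record. The field names, types and sample values are assumptions made for this example; they do not reproduce the actual Bel V template.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class SafetyCultureObservation:
    """One hypothetical SCO sheet entry; field names are illustrative only."""
    facility: str                  # installation concerned
    intervention: str              # inspection, meeting, phone call, ...
    topic: str                     # matter of inspection/discussion
    observed_on: date              # date of the observation
    context: str                   # factual and contextual description
    attributes: List[str] = field(default_factory=list)  # IAEA safety culture attributes
    argumentation: str = ""        # why the fact is linked to safety culture
    positive: bool = False         # observations can be positive or negative

# Example entry, loosely based on the confirmation-bias illustration below
example = SafetyCultureObservation(
    facility="NPP unit X",
    intervention="inspection",
    topic="alarm handling in the main control room",
    observed_on=date(2018, 3, 14),
    context="Operator cleared an alarm without checking the alarm card.",
    attributes=["questioning attitude", "rigorous and prudent approach"],
    argumentation="Possible confirmation bias: the stated explanation was not verified.",
    positive=False,
)
```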
In other words, safety culture observations are fully integrated into the inspectors’ daily practices. Performing an SCO is in fact an opportunity to capture and record Human and Organisational Factors (HOF) issues which are not always addressed within an inspection report. As an illustration, the following SCO example shows that an observation can raise issues related to human performance. More precisely, in this example the SCO identifies what is called a “confirmation bias”, i.e., the tendency to search for information in a way that confirms one’s preexisting beliefs or hypotheses.
“During an inspection within the main control room, an alarm occurs. The Main Control Room operator directly clears the alarm without checking the alarm card. The operator explains to the inspector that the alarm was related to maintenance works on a system. Nevertheless, the operator is unable to describe the technical links between the maintenance intervention and the alarm. After a short investigation the inspector found that the link between the maintenance intervention and the alarm was not relevant”.
Performing an SCO is also an opportunity to gain more insight into a situation. As an illustration, the following SCO example shows that an observation helps to make assumptions about “the way people do things around here”.
“Requested checklists related to the use of hot cells are not systematically completed. That remark has already been made several times to operators by the nuclear authority”.
Obviously, a single observation is not enough to provide an overall cultural picture, but in this last case we can observe some shortcomings regarding procedure adherence and use, and regarding the capacity of the organisation to take into account comments from the authority. In addition, this observation also gives an indirect insight into the way the management line or the health physics department fulfil their respective roles. These gaps are therefore valuable findings to be investigated further during future inspections.

2.2. Methodological Aspects of Safety Culture Observations

2.2.1. Seeking Visible and Invisible Elements

The INSAG-4 [18] states that “Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance”.
As a main strength, this definition highlights an important feature of safety culture, i.e., its two fundamental sides: safety culture is both structural (organisational structure, roles and responsibilities, documentation, policy statements…) and attitudinal (perceptions, social norms, ways of thinking, and patterns of behavior). This means that safety culture observations must take into account different types of activities at different levels:
  • Individual level: elements such as questioning attitude, individual awareness, accountability, reporting, rigorous and prudent approach…
  • Group level: elements such as communication, teamwork, decision making, supervision, peer check…
  • Organisational level: elements such as definition of responsibilities, definition and control of practices, qualification and training, review functions, management commitment, procedures, safety policies, resources…
In addition, according to Schein’s model [19], this also implies that safety culture observations must take into account not only visible “artefacts” (system or material elements and behaviors) but also tacit “espoused values” (guiding principles such as goals, beliefs, norms) and deep-seated “basic assumptions” (the basis that people act upon). Using the iceberg metaphor (Figure 1), we can easily understand that culture has both visible and invisible sides.
First, “artefacts” are material representations such as safety guidance pocket books, charters or the workspace, and other manifestations that include behaviors, rituals, dress code or the manner in which people interact. Second, “espoused values” are values adopted and supported by an organisation through general statements (such as “Safety first”) or statements concerning teamwork, decision-making or reporting practices. Third, according to Schein’s model, the deepest layer of culture is the underlying assumptions, i.e., the taken-for-granted and unconscious beliefs that determine perceptions and behaviors. These shared assumptions are implicitly understood within an organisation, often unquestioned and deeply grounded in practices that resulted from a learning process.
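Purely as an illustration of how these distinctions could be encoded alongside each observation, the sketch below tags an observation with one of the three levels listed above and one of Schein’s three layers. The enumerations and the tagging step are assumptions made for this example, not part of the published SCO process.

```python
from enum import Enum

class ObservationLevel(Enum):
    INDIVIDUAL = "individual"          # questioning attitude, accountability, ...
    GROUP = "group"                    # communication, teamwork, peer check, ...
    ORGANISATIONAL = "organisational"  # responsibilities, training, procedures, ...

class ScheinLayer(Enum):
    ARTEFACT = "artefact"                  # visible: documents, behaviours, rituals
    ESPOUSED_VALUE = "espoused value"      # stated goals, beliefs, norms
    BASIC_ASSUMPTION = "basic assumption"  # taken-for-granted, unconscious beliefs

# Tagging an observation with both dimensions makes explicit which part of the
# "iceberg" the observed fact belongs to, and at which level it was captured.
tags = (ObservationLevel.GROUP, ScheinLayer.ARTEFACT)
print(f"Observed at the {tags[0].value} level, visible as an {tags[1].value}")
```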
Bearing this in mind, the tacit and invisible dimensions cannot be observed directly and are complex to address. Since cultural aspects are, most of the time, “submerged”, safety culture is mainly observed through artefacts and espoused values. Nevertheless, clues about the deepest layers can be drawn out from the safety culture observations [13]. The challenge is that the closer we can get to the submerged layers, the deeper we can explore safety culture.

2.2.2. Observations Are Descriptive Rather than Normative

As a result, the meaning of a safety culture observation does not appear spontaneously: to some extent, the positive or negative side of an observation is not definable at first sight. By contrast, standards and guidelines in the field develop various lists of key attributes indicating what a good safety culture is. (The concept was first introduced in the IAEA INSAG-1 (1986) and further expanded in INSAG-3 (1988) and INSAG-4 (1991). Following these IAEA publications, several other documents have been published in order to enhance safety culture through key issues to be observed (INSAG-15, 2002), surveys or self-assessment methods to be implemented (TECDOC-1321, 2002; TECDOC-1329, 2002) or the identification of safety culture development stages (SRS-11, 1998). In addition, the GS-R-3 (2006) and GS-G-3.1 (2006) standards draw out the five main characteristics describing safety culture. According to GSR Part 2 (2016), safety culture assessment is now a requirement. We can also note the WANO and INPO position (INPO 12-012, rev. 1, 2013) concerning the key dimensions of safety culture.)
Many factors, such as a questioning attitude, trust between management and operators or cross-functional teamwork, are among the attributes commonly considered as characteristics of a strong safety culture. Conversely, warning signs of a weak safety culture can be identified from factors such as the lack of a systematic approach, insufficient reporting practices or a resource mismatch.
The normative view of safety culture does give a useful framework for defining what safety culture should be. However, what constitutes a good or a bad safety culture is not so clear-cut in the workplace. For instance, a statement such as a lack of “compliance with regulations, rules and procedures” is obviously significant but, from a safety culture point of view, it is more important to understand why people did not follow the rule: are we facing a case of bad behavior, or rather a bad rule?
A question also arises as to why operators did not comply: are we facing an understanding problem (lack of training, knowledge of the work process…) or a procedure fitness problem (adaptation of the procedure to a specific task)? At the group level, we can raise issues concerning the legitimized level of compliance within a group (department, team, plant…). In terms of management, the questions could be oriented towards the commitment of management, the leadership style or the supervision practices.
Therefore, adopting a “why approach”, safety culture observations are not “black or white” and need to be understood in a systemic way. As an illustration, the following SCO example shows that the positive or negative sides have to be evaluated carefully.
“A manager of the Operation department goes on the field after work hours in order to check all work in progress. Some gaps are observed and reported by the manager to the team”.
In the first instance, this fact reflects the commitment of this manager and the continuous improvement capacity of the system but, at the same time, the observation raises the issue of the effective field presence of the management when daily work is carried out. Thus, safety culture observations provide valuable data, but they mostly make sense when considered in relation to one another.

2.2.3. Toward a Deep Understanding of the Workplace

Observations focus on facts—i.e., information based on real occurrences: behaviors, statements, discrepancies…—but should also take into account the context, i.e., the workplace elements.
The first objective is therefore to answer the “What happened?” question. An observable fact can be either organisational (a resource mismatch, a backlog, a staffing problem…) or behavioural (a statement concerning cooperation or communication, a lack of verification or communication, a relevant decision, a disregard for rules…). Second, an observation also has to be enriched with answers to other generic questions (who, where, when…) in order to describe the workplace situation as fully as possible: the operation or activity, the people involved (function, department, organisation…), the problem to be solved, the documents used or not, the management role, the communicational context (one-way communication, participation…), work conditions (stress, workload…), etc.
In this line of thinking, providing an observation is not only about establishing a link between a statement and a dedicated attribute. The important point is to describe what is behind the link, seeking to shed light on the underlying reasons why, for example, rules were ignored. This means that observations are not context-free: what is at stake is a deep understanding of the workplace situation.
As a case in point, we can relate a fictitious, and somewhat caricatured, example of an observation describing the fact (§1) and, afterwards, the organisational and behavioural context (§2):
(§1) “During a routine inspection in the main control room of unit X, a discrepancy was observed between the level of tank ICS C07 (Intermediate Cooling System), indicating 86%, and the X-DOC-15 procedure, which references a Technical Specifications criterion of 56% < N < 80% (TS 16.XXX)”.
At this stage of the observation, a simple focus on the fact as described could lead an inspector or a safety analyst to identify a compliance issue. We are then facing a classic statement driven by a compliance-based approach. However, safety culture observations imply going further.
(§2) “The observation was made at the beginning of the morning shift in the control room. The unit was operating at full power. Questioned about the tank level, the operator in charge stated that he was not aware of this indication: ‘I rarely take this level into account. It’s not in my procedure. We do not check it systematically’. The chief operator quickly opened the Technical Specifications and stated that the maximum tank level was not reported in the TS; only the minimum level was reported”.
Taking stock of this example, various directions could be followed. On the one hand, the operator did not take ownership or show a questioning attitude concerning the check of the tank level. On the other hand, playing his supervisory role (perhaps a bit late), the chief operator showed his involvement. Therefore, linking an observation to an attribute must not be considered as an end, but rather as a starting point for further questions. As stated, safety culture observations require a systemic approach: a central question is to know what this observation tells us about the system.

3. Assessing Safety Culture Observations

3.1. Safety Culture Assessment through a Quantitative Approach

From a functionalist perspective [20,21,22], culture is something that the organisation has. Safety culture is then a set of behaviors, attributes, processes or policies ensuring that safety is an overriding priority. Considered as an ideal to which organisations should aspire, (a good) safety culture is established when a set of features is implemented. On the one hand, this ideal should be adapted to serve the organisation; on the other hand, it implies that management plays a major role as the initiator of safety culture shaping.
Within this top-down approach, safety culture can then be engineered. A common way to quantitatively assess safety culture is to apply survey methods, such as questionnaires [23], and to identify the general attributes of a strong or good safety culture [24,25]. Some research within this standpoint directs attention to safety climate, which can be defined as a “snapshot” [26], a manifestation of safety culture. Following the seminal work of Zohar [27], many authors have therefore attempted to determine factors reflecting safety culture or climate [4,6,28]. Self-completion questionnaires are useful tools for capturing perceptions about safety and exploring differences between groups or organisational levels [14]. These instruments are also appropriate for providing a baseline for further comparison over time. However, results obtained through quantitative methods may be limited to a snapshot of an organisation’s safety climate [29], i.e., to explicit measures influenced by a set of factors such as organisational circumstances or socially desirable response strategies [30]. In other words, culture can hardly be expressed through numbers only: firstly, a focus on quantitative results could lead to a superficial description of the culture of an organisation and, secondly, numerical abstractions or calculations tell us very little about dynamic human processes, i.e., the way people solve their problems in the field [31].
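As a minimal sketch of what such a survey-based snapshot typically amounts to (not a method used in this paper), the example below averages Likert-scale responses per group and per climate factor. Group names, factor names and scores are invented for illustration only.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical Likert responses (1-5), keyed by (group, climate factor)
responses = {
    ("operations", "management commitment"): [4, 5, 3, 4],
    ("operations", "reporting practices"): [2, 3, 2, 3],
    ("maintenance", "management commitment"): [3, 3, 4, 2],
    ("maintenance", "reporting practices"): [4, 4, 5, 3],
}

# Mean score per (group, factor): the kind of "snapshot" a questionnaire yields
scores = {key: round(mean(values), 2) for key, values in responses.items()}

# Regroup by factor to compare groups, as survey studies commonly do
by_factor = defaultdict(dict)
for (group, factor), score in scores.items():
    by_factor[factor][group] = score

for factor, per_group in by_factor.items():
    print(factor, per_group)
```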

3.2. Safety Culture Assessment through a Qualitative Approach

Conversely, from an interpretive perspective, culture is something the organisation is. Considered as a shared pattern of meanings constructed within social groupings, safety culture defines beliefs—what is safe or dangerous [32]—and motivates and legitimizes behaviors through a shared repertoire of positively- and negatively-loaded meanings [33], or enables collective identity [34]. In contrast with the previous perspective, culture is a bottom-up phenomenon emerging through interactions within groups grounded in a specific context of technology [35].
This perspective is therefore reluctant to adopt an instrumental treatment of the concept or to seek generic features of a good safety culture [13]. Interpretive studies of safety culture focus on thick descriptions of work activities, actors’ meanings and occupational culture [36,37,38,39,40]. However, except for some scholars [41,42], there have been few attempts to adopt this kind of ethnographic approach in the nuclear field. Its main pitfalls are that qualitative data could lead to overgeneralisation from a small number of findings, or remain focused on the observer’s area of expertise.

3.3. Safety Culture Oversight from a Regulatory Perspective

According to the assessment model developed, safety culture observations are analysed through a four-dimension model structured along two axes (see Figure 2). First, safety culture observations may concern “organisational processes” (processes, procedures and documentation, the interfaces between departments…) or “behavioural” issues (ways of doing, norms, attitudes…). Second, safety culture observations may concern “managerial” issues (what is said and done by managers) or “workplace practices” (what is done in the field).
At the intersection of these two axes, we find four dimensions (Management system, Leadership, Human Performance and Learning) that reflect the different “building blocks” of safety culture:
  • Management system: within this dimension we can find safety culture elements such as safety policies, work process, procedures, interfaces… The main issue here is to assess the level of integration of safety within the management system and related documentation;
  • Leadership: within this dimension we can find safety culture elements such as commitment, decision making, supervision... The main issue here is to assess the level of managers’ involvement regarding operations management;
  • Human performance: within this dimension we can find safety culture elements such as a questioning attitude, compliance, team skills, situation awareness… The main issue here is to assess the consistency between field practices and human performance principles as well as the adaptation capabilities of field operators;
  • Learning: within this dimension we can find safety culture elements such as reporting or assessment practices, knowledge transfer, continuous improvement… The main issue here is to assess the learning capabilities of the organisation.
During the assessment step, the four dimensions are used to gather observations showing similarities (clustering step). (Concerning the assessment side, a “Safety Culture Coordinator” (SCC) provides a set of evaluations aimed at identifying early signs of safety problems (through quarterly monitoring) and deep-rooted cultural issues (through annual and pluriannual assessments). As a result of these evaluations, it may be decided to analyse a plant’s performance in more detail in order to understand the underlying causes of a problem, or to focus inspections on specific aspects. On a yearly basis, a detailed safety culture assessment report is released and a synthesis is presented and discussed with the licensee.) For instance, safety culture observations related to the managers’ commitment are taken together in order to understand how deeply commitment is demonstrated. Considering all the observations related to the behaviours of the management line, we can build up an overall view of leadership. The same process is then applied to the other dimensions. It is worth mentioning that some observations may be used in several dimensions.
Safety culture observations are then assessed through these four key dimensions, which allow the emphasis to be placed on specific safety culture attributes and, adopting an overall view, the major cultural traits of a nuclear installation to be identified. As a holistic approach, the main method is to understand the connections: firstly, the connections between observations in order to form relevant clusters at the level of each dimension; secondly, the connections between clusters in order to draw a cultural picture at the level of the nuclear installation.
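A minimal sketch of this clustering step is shown below, assuming each recorded observation already carries one or more dimension tags. The dimension names follow the model described above; the observations, tags and counting logic are illustrative assumptions, not the actual assessment tooling.

```python
from collections import defaultdict

DIMENSIONS = ("Management system", "Leadership", "Human performance", "Learning")

# Hypothetical observations: (dimensions concerned, positive?, short summary)
observations = [
    (("Leadership",), True, "Manager checks work in progress after hours"),
    (("Management system", "Human performance"), False, "Hot cell checklists not completed"),
    (("Learning",), False, "Recurrent event not analysed in depth"),
]

# Clustering step: an observation can feed several dimensions
clusters = defaultdict(list)
for dims, positive, summary in observations:
    for dim in dims:
        clusters[dim].append((positive, summary))

# Per-dimension balance of positive and negative observations, one input
# to the qualitative reading of each cluster
for dim in DIMENSIONS:
    obs = clusters.get(dim, [])
    positives = sum(1 for is_positive, _ in obs if is_positive)
    print(f"{dim}: {len(obs)} observations ({positives} positive, {len(obs) - positives} negative)")
```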

4. Applying the Model: An NPP Case Study

The case study presented concerns an NPP characterised by several infringements of nuclear regulations related to compliance with the plant Technical Specifications (OL&C). For example, some of the violated conditions were related to time delays in performing tests or in bringing systems back into full compliance with the plant Technical Specifications. Other events were related to erroneous valve positions.
Several of these events triggered reactive inspections by the Regulatory Body. The licensee was requested to provide immediate and short-term actions supporting compliance, as well as a longer-term action plan aimed at achieving a cultural change. These plans, validated by the Regulatory Body, covered a large set of dimensions, including HOF issues such as leadership, training management or the ergonomics of procedures. In essence, the actions taken were dedicated to increasing operators’ vigilance, reinforcing the leaders’ presence in the field and improving the methods addressing HOF in event analysis. In short, this set of measures aimed at reinforcing safety culture. The Regulatory Body followed up the implementation phase and performed a set of dedicated inspections aimed at assessing the effectiveness of the scheduled actions. Among these inspections, two specific safety culture inspections (within two years) were conducted, based on qualitative interview techniques, in order to evaluate the depth of the changes undertaken.
To some extent, this case shows strong similarities with a recent situation that occurred at the “Arkansas Nuclear One” (ANO) NPP in 2014 (USNRC. ANO-NRC Supplemental Inspection Report. https://www.nrc.gov/docs/ML1616/ML16161B279.pdf (retrieved on 10 August 2017)). In terms of safety culture, it was identified that the most significant causes of declining performance were ineffective change management with respect to resource reductions, and leadership behaviors. The licensee had reduced resources across its fleet in 2007 and 2013, but it did not adequately consider the unique staffing needs of ANO created by having two units with different designs. Additionally, an unexpected increase in employee attrition between 2012 and 2014 caused a loss of experienced personnel, which led to a reduced capacity to accomplish work and an increased need for training and supervision. In both cases, the regulator ensured a follow-up of the licensee’s action plans, conducted a set of inspections and performed a safety culture assessment.
Regarding our case, we performed a safety culture assessment on the basis of 199 observations (SCO) gathered over a three-year period, following the method described in the previous chapters. As explained above, these safety culture observations were grouped according to the four dimensions presented. “Artefacts” and manifestations of “espoused values” were then interpreted in order to arrive at a description of shared underlying assumptions. The following elements synthesize the main findings of the assessment.
Through the lens of the management system, most of the observations relate to a lack of adherence to procedures and to discrepancies in major work processes. In addition, a large set of observations shows a poor use of prescribed forms as well as a lack of rigor in document management.
Concerning the leadership dimension, positive observations demonstrate the involvement of upper management in the improvement of safety. In addition, some examples of conservative decision-making or transparency towards the regulator reflect the safety commitment of field managers. Nevertheless, a set of observations raises the question of the effectiveness of the managers’ field presence: managers were indeed in the field, but mainly when problems occurred. This implies a management style that could be characterised as “management by exception”.
As regards the human performance dimension, a set of observations shows weaknesses in rule compliance, in questioning attitude and in the quality of work interventions. The underlying reasons for these weaknesses are grounded in a lack of ownership (regarding processes, corrective actions, respect of time delays or peer-checks) and in a routinisation process (the force of habit and the normalisation of long-standing practices).
Regarding the learning dimension, some positive observations relate to the licensee’s capacity to investigate technical root causes and to perform deep analyses. However, some reluctance was also observed to tackle recurrent events, particularly when they are rooted in organisational or human issues. More significantly, a large set of observations shows weaknesses in the implementation of effective corrective actions, questioning the capacity of the licensee to conduct in-depth changes.
It is also important to note that these findings emerged recurrently over several assessments, indicating deep-seated issues. Therefore, adopting a holistic view, we can draw out cultural traits for each of the four dimensions (see Figure 3):
  • Management system: Loss of meaning regarding rules;
  • Leadership: Lack of effective field presence and leadership by exception;
  • Human performance: Lack of ownership and routinisation of practices;
  • Learning: Insufficient capacity for in-depth changes.
However, a cultural analysis also needs a holistic view. In other words, these four dimensions are not isolated features but rather tightly connected elements of a larger cultural system influencing the way people think and act within an installation. Regarding our case study, we can see that the four dimensions are strongly linked: in a nutshell, the lack of an effective field presence by managers has contributed to a loss of meaning regarding rules. Slowly, people came to consider work activities as routines and no longer demonstrated a strong sense of ownership. This also implies some weaknesses in the capacity of the organisation to ensure continuous improvement.
In other words, the identified traits should demonstrate an internal consistency (i.e., strong links between dimensions) in order to draw a relevant cultural picture. It is also important to note that this picture is not only a snapshot but can also be used to anticipate, as far as possible, potential evolutions or consequences. For instance, these findings can be connected to the work of Snook [43], who identified four evolutionary states of socio-technical systems, each entailing a specific safety situation:
  • “Designed organization”: firstly, the global rules are followed;
  • “Engineered organization”: these rules can be considered as not necessary because their usefulness is no longer perceived;
  • “Applied organization”: it appears that local rules take precedence in daily practices;
  • “Failure”: ultimately it is the whole system that becomes vulnerable.
In the same way, the cultural picture identified could lead to adverse effects. If the traits are not kept under control, these discrepancies can lead to what Snook called a “practical drift”, a slow and insidious drift causing an uncoupling between the written rules and the actual practices in the field.

5. Conclusions: Tangible Effects of Safety Culture Oversight

The aim of this paper was to present safety culture observations as a valuable framework for reporting, recording and assessing cultural factors that impact safety. As an underlying and prevailing framework, safety culture shapes, in a stable way, how people perceive a situation, make sense of it and act. In other words, observing safety culture outcomes implies adopting a global (i.e., holistic) point of view: facts or statements drawn out during specific interactions with licensees (meetings, inspections, assessments, walk-downs, informal contacts…) are part of a broader human system. Considering the individual, group and organisational levels, an observation must then be a tool that connects artefacts with deeper cultural layers.
From a regulatory body’s perspective, the implementation of safety culture oversight is also an opportunity to capture safety issues that are sometimes poorly addressed (e.g., leadership style, capacity to change, workforce perceptions…). As a result, Human and Organisational Factors topics [44] can be better integrated within the technical inspection programme. In other words, based on the findings of a safety culture assessment, a regulatory body has a better view of the strengths and weaknesses of a nuclear installation. Indeed, the assessment highlights areas (practices, competences, equipment, departments…) in need of attention.
In addition, a safety culture observations process opens new avenues for the regulatory body’s oversight strategy. Through a cultural analysis as described in this paper, a regulatory body gains valuable insight into critical safety issues to be addressed by the licensee and can therefore verify the capability of the licensee to provide appropriate actions to tackle these issues.
It is important to note that the results of a safety culture assessment are not only “metaphors”: for a licensee, they are also operational leverage points for change and for safety improvements. Within the model, the licensee obviously retains the prime responsibility for safety, but the regulator has an opportunity to promote safety culture enhancements, identify topics to be improved and monitor the directions taken by the licensee. Therefore, safety culture assessment findings are no longer intangible, but have rather become tangible safety aspects to be managed.

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Guldenmund, F.W. The Nature of Safety Culture: A Review of Theory and Research. Saf. Sci. 2000, 34, 215–257. [Google Scholar] [CrossRef]
  2. Guldenmund, F.W. (Mis)understanding Safety Culture and its Relationship to Safety Management. Risk Anal. 2010, 30, 1466–1480. [Google Scholar] [CrossRef] [PubMed]
  3. Lee, T. Assessment of Safety Culture at a Nuclear Reprocessing Plant. Work Stress 1998, 12, 217–237. [Google Scholar] [CrossRef]
  4. Lee, T.; Harrison, K. Assessing Safety Culture in Nuclear Power Stations. Saf. Sci. 2000, 34, 61–97. [Google Scholar] [CrossRef]
  5. Wilpert, B.; Itoigawa, N. Safety Culture in Nuclear Power Operations; Taylor and Francis: London, UK; New York, NY, USA, 2001. [Google Scholar]
  6. Harvey, J.; Erdos, G.; Bolam, H.; Cox, M.A.A.; Kennedy, J.N.P.; Gregory, D.T. An Analysis of Safety Culture Attitudes in a Highly Regulated Environment. Work Stress 2002, 16, 18–36. [Google Scholar] [CrossRef]
  7. Findley, M.; Smith, S.; Gorski, J.; O’Neil, M. Safety Climate among Job Positions in a Nuclear Decommissioning and Demolition Industry: Employees’ Self-reported Safety Attitudes and Perceptions. Saf. Sci. 2007, 45, 875–889. [Google Scholar] [CrossRef]
  8. Mengolini, A.; Debarberis, L. Safety Culture Enhancement through the Implementation of IAEA Guidelines. Reliab. Eng. Syst. Saf. 2007, 92, 520–529. [Google Scholar] [CrossRef]
  9. Reiman, T.; Pietikainen, E.; Oedewald, P.; Gotcheva, N. System Modelling with the DISC Framework: Evidence from Safety-critical Domains. Work 2012, 41, 3018–3025. [Google Scholar] [PubMed]
  10. Mariscal, M.A.; Garcia Herrero, S.; Toca Otero, A. Assessing Safety Culture in the Spanish Nuclear Industry through the Use of Working Groups. Saf. Sci. 2012, 50, 1237–1246. [Google Scholar] [CrossRef]
  11. Garcia Herrero, S.; Mariscal, M.A.; Gutierrez, J.M.; Toca Otero, A. Bayesian Network Analysis of Safety Culture and Organizational Culture in a Nuclear Power Plant. Saf. Sci. 2013, 53, 82–95. [Google Scholar] [CrossRef]
  12. Rollenhagen, C.; Westerlund, J.; Naswall, K. Professional Subcultures in Nuclear Power Plants. Saf. Sci. 2013, 59, 78–85. [Google Scholar] [CrossRef]
  13. Schöbel, M.; Klostermann, A.; Lassalle, R.; Beck, J.; Manzey, D. Digging Deeper! Insights from a Multi-Method Assessment of Safety Culture in a Nuclear Power Plant Based on Schein’s Culture Model. Saf. Sci. 2017, 95, 38–49. [Google Scholar] [CrossRef]
  14. Van Nunen, K.; Li, J.; Reniers, G.; Ponnet, K. Bibliometric Analysis of Safety Culture Research. Saf. Sci. 2018, 108, 248–258. [Google Scholar] [CrossRef]
  15. Antonsen, S.; Nilsen, M.; Almklov, P.G. Regulating the Intangible. Searching for Safety Culture in the Norwegian Petroleum Industry. Saf. Sci. 2017, 92, 232–240. [Google Scholar] [CrossRef]
  16. IAEA Safety Reports Series. Performing Safety Culture Self-Assessments; IAEA: Vienna, Austria, 2016; No. 83. [Google Scholar]
  17. Bernard, B. Safety Culture as a Way of Responsive Regulation: Proposal for a Nuclear Safety Culture Oversight Model. Int. Nucl. Saf. J. 2014, 3, 1–11. [Google Scholar]
  18. IAEA. Safety Culture; INSAG-4; IAEA: Vienna, Austria, 1991. [Google Scholar]
  19. Schein, E.H. Organizational Culture and Leadership: A Dynamic View; Jossey Bass: San Francisco, CA, USA, 1985. [Google Scholar]
  20. Glendon, A.I.; Stanton, N.A. Perspectives on Safety Culture. Saf. Sci. 2000, 34, 193–214. [Google Scholar] [CrossRef]
  21. Richter, A.; Koch, C. Integration, Differentiation and Ambiguity in Safety Cultures. Saf. Sci. 2004, 42, 703–722. [Google Scholar] [CrossRef]
  22. Naevestad, T.-O. Mapping Research on Culture and Safety in High-Risk Organizations: Arguments for a Sociotechnical Understanding of Safety Culture. J. Conting. Crisis Manag. 2009, 7, 126–136. [Google Scholar] [CrossRef]
  23. Smith-Crowe, K.; Burke, M.J.; Landis, R.S. Organizational Climate as a Moderator of Safety Knowledge-Safety Performance Relationship. J. Organ. Behav. 2003, 24, 861–876. [Google Scholar] [CrossRef]
  24. Do Nascimento, C.S.; Andrade, D.A.; De Mesquita, R.N. Psychometric model for safety culture assessment in nuclear research facilities. Nucl. Eng. Des. 2017, 314, 227–237. [Google Scholar] [CrossRef]
  25. Warszawska, K.; Kraslawski, A. Method for Quantitative Assessment of Safety Culture. J. Loss Prev. Process Ind. 2016, 42, 27–34. [Google Scholar] [CrossRef]
  26. Cox, S.J.; Flin, R. Safety Culture: Philosopher’s Stone or a Man of Straw? Work Stress 1998, 12, 189–201. [Google Scholar] [CrossRef]
  27. Zohar, D. Safety Climate in Industrial Organisations—Theoretical and Applied Implications. J. Appl. Psychol. 1980, 65, 96–102. [Google Scholar] [CrossRef] [PubMed]
  28. Flin, R.; Mearns, K.; O’Connor, P.; Bryden, R. Measuring Safety Climate: Identifying the Common Features. Saf. Sci. 2000, 34, 177–192. [Google Scholar] [CrossRef]
  29. Guldenmund, F.W. The Use of Questionnaires in Safety Culture Research—An Evaluation. Saf. Sci. 2007, 45, 723–743. [Google Scholar] [CrossRef]
  30. Marquardt, N.; Gades, R.; Robelski, S. Implicit Social Cognition and Safety Culture. Hum. Factors Ergon. Manuf. Serv. Ind. 2012, 22, 213–234. [Google Scholar] [CrossRef]
  31. Hopkins, A. Studying Organisational Cultures and their Effects on Safety. Saf. Sci. 2006, 44, 875–889. [Google Scholar] [CrossRef]
  32. Vaughan, D. The Challenger Launch Decision. Risky Technologies, Culture and Deviance at NASA; Chicago University Press: Chicago, IL, USA, 1996. [Google Scholar]
  33. Reiman, T.; Oedewald, P. Measuring Maintenance Culture and Maintenance Core Task with CULTURE-questionnaire—A Case Study in the Power Industry. Saf. Sci. 2004, 42, 859–889. [Google Scholar] [CrossRef]
  34. Gherardi, S.; Nicolini, D.; Odella, F. What Do You Mean by Safety? Conflicting Perspectives on Accident Causation and Safety Management in a Construction Firm. J. Conting. Crisis Manag. 1998, 6, 202–213. [Google Scholar] [CrossRef]
  35. Rochlin, G.I. Safe Operation as a Social Construct. Ergonomics 1999, 42, 1549–1560. [Google Scholar] [CrossRef]
  36. Atak, A.; Kingma, S. Safety Culture in an Aircraft Maintenance organisation: A View from the Inside. Saf. Sci. 2011, 49, 268–278. [Google Scholar] [CrossRef]
  37. Antonsen, S. The Relationship between Culture and Safety on Offshore Supply Vessels. Saf. Sci. 2009, 47, 1118–1128. [Google Scholar] [CrossRef]
  38. Naevestad, T.-O. Safety Understandings among Crane Operators and Process Operators on a Norwegian Offshore Platform. Saf. Sci. 2008, 46, 520–534. [Google Scholar] [CrossRef]
  39. Farrington-Darby, T.; Pickup, L.; Wilson, J.R. Safety Culture in Railway Maintenance. Saf. Sci. 2005, 43, 39–60. [Google Scholar] [CrossRef]
  40. Brooks, B. Not Drowning, Waving! Safety Management and Occupational Culture in an Australian Commercial Fishing Port. Saf. Sci. 2005, 43, 795–814. [Google Scholar] [CrossRef]
  41. Perin, C. Shouldering Risks. The Culture of Control in the Nuclear Power Industry; Princeton University Press: Princeton, NJ, USA; Oxford, UK, 2005. [Google Scholar]
  42. Bourrier, M. Organizing Maintenance Work at Two American Nuclear Power Plants. J. Conting. Crisis Manag. 1996, 4, 104–112. [Google Scholar] [CrossRef]
  43. Snook, S.A. Friendly Fire: The Accidental Shootdown of US Black Hawks over Northern Iraq; Princeton University Press: Princeton, NJ, USA, 2000. [Google Scholar]
  44. Bernard, B. Comprendre les Facteurs Humains et Organisationnels. Sûreté nucléaire et organisations à risques; EDP-Sciences: Paris, France, 2014. [Google Scholar]
Figure 1. Adapted depiction of Schein’s layered model.
Figure 2. Key dimensions for safety culture assessment.
Figure 3. Cultural picture of the installation.
