Safety Culture: What is it and how can we build a Pro-active Safety Culture in the NHS?

10th December 2023

 

“If eternal vigilance is the price of liberty, then chronic unease is the price of safety.” Reason (1997)

With hindsight it is almost always possible to identify warning signs which went unheeded. This article discusses how healthcare might implement a safety culture capable of predicting and avoiding accident risks.


Introduction

There are many different definitions of safety culture. The following are a selection:

  • ‘a combination of the attitudes, values and perceptions that influence how something is actually done in the workplace, rather than how it should be done’ (HSE)
  • ‘characteristics and attitudes’ of people and the organisation (INSAG, 1991)
  • a ‘safety-related facet of organisational culture’ focused on the ‘norms and values’ relating to safety within organisations (Noort et al, 2016)
  • ‘The way we do things around here’ (CBI, 1990)

These definitions emphasise that safety culture is not simply a set of beliefs but relates to the actions and norms that those attitudes and beliefs inspire. This is reflected in the recent healthcare definition of safety culture, which includes emphasis on collaboration and co-construction, psychological safety, teamwork and continuous learning around safety risks (NHSE, 2022a).

Corporate safety values are commonly stated clearly as part of safety culture implementation; however, there can be less focus on tackling undesirable safety-related actions and norms.

Implementing safety as a value therefore goes beyond inclusion in mission statements: values can be learned from others, and everyday experiences will either reinforce or weaken that value. If an organisation has goals that conflict with its safety values, belief in its commitment to safety as a value will be weakened (NHSE, 2022a). Within healthcare, safety as a value is often articulated; however, targets that conflict with the prioritisation of safe, thorough care are common.

The various elements of a safety culture capable of predicting and avoiding accident risk, referred to here as a ‘proactive safety culture’, will be considered, and potential methods and approaches for implementing one will be discussed, both in general and in relation to their application within healthcare. Finally, Reason’s statement, “If eternal vigilance is the price of liberty, then chronic unease is the price of safety”, will be revisited.

Proactive safety culture

An organisational safety culture concerned only with response to incidents could be termed an immature, reactive safety culture, whereas a safety culture capable of predicting and avoiding accidents could be thought of as a more mature, proactive safety culture (Boskeljon-Horst et al, 2023). A proactive safety culture could include concepts of collaborative working, shared knowledge around safety, incident reporting and analysis, how risk is communicated, and proactive measurement techniques (Noort et al, 2016). A proactive safety culture therefore has many facets and includes features of a Just Culture (Dekker, 2016), a learning culture and a reporting culture, which help create the conditions for this approach.

Measurement of Safety Culture

A first step in the implementation of a proactive safety culture could be to measure the existing safety culture, with the intention of building on strengths and identifying and responding to elements that are reactive, immature or misaligned. This process could also be useful for monitoring the progress of implementation. Potential methods include safety attitude surveys, safety culture workshops, safety management audits and analysis of safety performance indicators, such as data on feedback to employees on safety improvement ideas (RSSB, 2011). In general, surveys explore worker perceptions and experience of organisational commitment to safety, the resources allocated to safety, incident reporting processes, the extent to which safety is a collaborative process and how the organisation communicates around safety. Industry-specific surveys are widely used, for example the Air Traffic Management (ATM) safety culture survey (Reader et al, 2015), the Manchester Patient Safety Framework (NPSA, 2006) and the annual NHS staff survey, which incorporates safety culture dimensions (NHSE, 2022b).
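
As an illustration of how survey results might be turned into trackable indicators, the following sketch aggregates Likert-scale responses into per-dimension scores. The dimension names and items are hypothetical and not drawn from any specific validated instrument.

```python
# Minimal sketch: scoring a safety culture survey, assuming 1-5 Likert items
# grouped into dimensions. Items and dimensions are hypothetical examples.
from statistics import mean

# Each respondent's answers, keyed by item id
# (1 = strongly disagree ... 5 = strongly agree)
responses = [
    {"reporting_1": 4, "reporting_2": 3, "leadership_1": 2, "leadership_2": 3},
    {"reporting_1": 5, "reporting_2": 4, "leadership_1": 3, "leadership_2": 2},
]

DIMENSIONS = {
    "incident reporting": ["reporting_1", "reporting_2"],
    "leadership commitment": ["leadership_1", "leadership_2"],
}

# Mean score per dimension across respondents: a simple indicator that
# could be tracked between survey rounds to monitor progress
for dimension, items in DIMENSIONS.items():
    scores = [r[i] for r in responses for i in items]
    print(f"{dimension}: {mean(scores):.2f}")
```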

However, there are recent concerns around the predictive value of safety culture assessment, with some studies finding a negative correlation with incident findings (Boskeljon-Horst et al, 2023; Antonsen, 2009). Questions raised include whether a questionnaire can adequately capture such a complex concept. The terms ‘safety climate’ and ‘safety culture’ are often used interchangeably, but a distinction may be drawn between them, for example by considering that safety culture is to ‘personality’ as safety climate is to ‘mood’, emphasising that climate is readily observable and changeable, whereas culture is more enduring in nature (Noort et al, 2016). Safety ‘culture’ tools may therefore measure elements of safety climate (RSSB, 2011).

Accident reporting systems

Knowledge of an organisation’s incident profile can assist with focusing proactive efforts on those processes with the most significant or frequent unwanted outcomes, and ensures those involved in safety incidents are cared for appropriately. An accident reporting system should also include feedback processes, increasing meaning and ongoing motivation around reporting, and requires high levels of psychological safety and belief in a Just Culture. Organisations such as rail, with mature accident reporting systems, use tools (Ryan et al, 2009) to track improvement in reporting and listening culture and the conditions which support this.

In blame cultures, including much of healthcare, reporting systems can be weaponised, with staff ‘threatening’ to report if, for example, attendance by a doctor at a particular ward is not prioritised. Additionally, teams shielding themselves from anticipated punitive treatment may collectively agree not to report. The new Patient Safety Incident Response Framework (NHSE, 2022c) mandates many elements required for successful voluntary reporting systems and represents an opportunity to move healthcare towards a proactive safety culture. Practical elements, such as the usability of reporting system software, should also be considered.

Accident models

Consideration of accident definitions and accident models can be helpful when implementing a proactive safety culture. An early accident definition (Arbous & Kerrich, 1951), which recognised the multi-causal nature of accidents and system failure, and implied the requirement for a systems-based approach, contributed to later accident models such as the General Accident Scenario (GAS) (Wagenaar et al, 1990) and more recent systems-based literature (Carayon et al, 2015).

Wagenaar’s GAS included the role of ‘Psychological Precursors’, such as perceptions, attitudes and motivations, which align with concepts inherent in safety culture. Understanding that these ‘Psychological Precursors’ within an organisation can be altered by non-safety-related policy is essential. An example is the impact of rewarding prompt operating theatre start times in isolation, which alters attitudes and motivations around the safety of pre-operative tasks and erodes safety culture.

Wagenaar’s GAS also included ‘latent failures’, emphasising that these belong to the organisation rather than the individual. Tools such as Tripod DELTA (Hudson et al, 1994) have been developed to aid identification of latent failures and can be used proactively. The General Failure Types (GFT) within the tool map to the whole work system, providing a systems-based proactive approach. These tools are not widely used in healthcare, where quantification of contribution may be problematic, as discussed later. Wagenaar’s GAS also includes ‘failure of defences’, and this consideration of the potential detection of, and recovery from, incidents is relevant to the implementation of a proactive safety culture.

Similar elements are incorporated in the Swiss Cheese Model of Organizational Vulnerability (Reason, 1997), which has been instrumental to the understanding of proactive safety culture in healthcare and the acceptance of latent failures. Potential organisational latent failures, for example targets or poorly constructed policy, remain less well understood.

The ergonomic model has three main elements: the Person, the Task and the Environment. It promotes consideration of the interactions between these elements and therefore that incidents are system failures. In healthcare, the Systems Engineering Initiative for Patient Safety, SEIPS, is a broadly similar model (Carayon et al, 2014).

Proactive Tools

Approaches and tools that have been specifically designed for a proactive approach to safety include the Probabilistic Safety Assessment (PSA) process. In technical systems, PSA is the process of defining and quantifying hazards and the risk they present, deciding the acceptability of that risk and reducing the risk where required. If humans are included in the system, then a form of Human Reliability Assessment (HRA), which also covers error recovery and mitigation, needs to be included within the process (Kirwan, 2015). This approach is established in many safety-critical industries and should form an essential part of a proactive safety culture. Given the predominantly human contribution to the healthcare system, HRA might be expected to be embedded, but the technique is not widely used (Lyons et al, 2004).

A key step in an HRA is the definition of the problem, followed by a form of Hierarchical Task Analysis (HTA). HTA could be of particular value within healthcare, as many activities and processes are under-specified, and the process of collaboratively constructing an HTA could prompt debate and highlight areas of ambiguity or variability (Sujan et al, 2020). There follows a process of Human Error Identification (HEI), often employing taxonomies of behaviour or error modes, as sketched below. This can necessitate binary judgements on purely external or observable errors that may not reflect the realities of teamwork in complex socio-technical systems such as healthcare. Some versions of HEI, for example TRACEr, designed for ATM, also include cognitive error, potential mechanisms and consideration of performance influencing factors to reflect complexity (Shorrock & Kirwan, 2002). The next step, representing the potential errors via fault and event trees, may again be more challenging in healthcare.
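
The sketch below illustrates the general idea under stated assumptions: an HTA expressed as a nested structure, annotated with candidate error modes in the style of an HEI taxonomy. The pre-operative task breakdown and the error modes are hypothetical illustrations, not a validated analysis.

```python
# Minimal sketch: an HTA as a nested structure with HEI-style error-mode
# annotations. Tasks and error modes below are hypothetical examples.
from dataclasses import dataclass, field

@dataclass
class Task:
    description: str
    subtasks: list["Task"] = field(default_factory=list)
    error_modes: list[str] = field(default_factory=list)  # from an HEI taxonomy

hta = Task("0. Confirm patient ready for theatre", subtasks=[
    Task("1. Check patient identity",
         error_modes=["check omitted", "wrong patient record opened"]),
    Task("2. Verify consent form",
         error_modes=["check omitted", "outdated form accepted"]),
    Task("3. Confirm fasting status",
         error_modes=["information not sought", "misheard response"]),
])

def list_error_modes(task: Task, prefix: str = "") -> None:
    """Walk the HTA and print each subtask with its candidate error modes."""
    print(prefix + task.description, task.error_modes or "")
    for sub in task.subtasks:
        list_error_modes(sub, prefix + "  ")

list_error_modes(hta)
```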

If a quantitative output is required, as in nuclear industry regulation, then a stage of Human Error Quantification (HEQ), the estimation of the probability of an error occurring, is included, as in the Technique for Human Error Rate Prediction (THERP) (Swain & Guttmann, 1983). Human Error Probabilities (HEPs) may be refined by multipliers linked to performance influencing factors, which takes account of context to an extent. However, there are difficulties in ensuring the accuracy of probabilities across different contexts and industries, uncertainties around the data informing the denominator, and potential for significant variation within and between individuals in complex systems. These concerns are especially relevant to HEQ in healthcare, where applying SPAR-H (Standardised Plant Analysis Risk – Human Reliability Analysis) (Sands et al, 2015) resulted in a probability of human error approaching one for every task considered (Sujan et al, 2020). This is at variance with the observation that, despite error-prone situations in healthcare, the majority of the time errors do not occur, or at least not with the high probability HEQ might predict. This, in addition to the difficulty of specifying complex healthcare team activity, might mean that HEQ is not a realistic aim within healthcare. Other proactive approaches could still be valuable, for example Healthcare Failure Mode and Effects Analysis (HFMEA) (DeRosier et al, 2002), which assigns agreed scores to the various aspects considered, or qualitative techniques such as SHERPA, a systematic human error reduction and prediction approach (Embrey, 1986), which has been used successfully in healthcare, for example in theatres (Phipps et al, 2008).
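
A minimal sketch of the HEQ step follows, assuming SPAR-H-style nominal HEPs adjusted by performance shaping factor multipliers. The multiplier values in the example are hypothetical, and applying the bounding formula in every case is a simplification of the published method.

```python
# Illustrative sketch of SPAR-H-style Human Error Quantification.
# Nominal HEPs below follow commonly cited SPAR-H values; the multipliers
# in the example are hypothetical, chosen for illustration only.

NOMINAL_HEP = {"diagnosis": 1e-2, "action": 1e-3}

def adjusted_hep(task_type: str, psf_multipliers: list[float]) -> float:
    """Adjust a nominal HEP by performance shaping factor multipliers.

    Uses the SPAR-H composite-PSF adjustment, which bounds the result
    within [0, 1]; applied here to every case as a simplification.
    """
    nhep = NOMINAL_HEP[task_type]
    composite = 1.0
    for m in psf_multipliers:
        composite *= m
    return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)

# Hypothetical example: a diagnosis task under extreme time pressure (x10)
# and high stress (x5)
print(adjusted_hep("diagnosis", [10, 5]))  # ~0.336
```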

Once potential errors have been identified and perhaps quantified, the potential impact and risk associated with each error can be described, existing mitigations considered, and a decision reached around acceptability. An organisation could then proactively consider how unacceptable risk levels could be reduced, use the HRA process to inform redesign, and then consider the impact of these Error Reduction Measures on the system. In many industries, the concept of reducing risk to ‘As Low As Reasonably Practicable’ is used as a guide, for example in rail (HSE, 2001). In healthcare, this concept is generally not accepted, with ‘zero harm’ being a common goal.

Accident prevention systems

Established accident prevention system frameworks (Wilson, 1980) should guide organisations on the infrastructure and processes necessary to implement a proactive safety culture. Wilson’s framework incorporates a work system based on the ergonomic model, as already discussed. In healthcare, this element could be represented prospectively by knowledge of the work system using SEIPS; the potential ‘hazard’ could likewise be illustrated by interactions and outcomes within SEIPS.

Consideration of the infrastructure to support the various methods of accident prevention within the framework could align with prospective safety culture implementation. Post-accident services can illustrate tangible prioritisation of safety. Prioritising design to prevent accidents demonstrates a shift away from reflexively blaming workers, and collaboration in design processes can further enhance safety culture. Propaganda campaigns should be relevant, with the equipment needed to facilitate compliance provided; in healthcare, for example, handwashing campaigns alongside frequently empty gel dispensers could weaken safety culture. Multi-professional team training, such as simulation training, can increase awareness of other roles, promotes a coherent proactive safety message and is increasingly recognised within healthcare (Ockenden, 2022).

Legislation and regulation also have a role (Management of Health and Safety at Work Regulations 1992), and the lack of regulatory drive for a proactive approach within healthcare is influential. Inspection and audit as methods of accident reduction are context- and culture-dependent, and often capture Work-as-Disclosed rather than Work-as-Done (Shorrock, 2017). Acknowledging previous punitive associations, and emphasising that systems should support workers, may be helpful. A positive example in healthcare is the ‘Fresh Eyes’ initiative in midwifery supervision (NICE, 2022).

Standard Operating Procedures (SOPs) are useful for clarity of task and represent ‘best practice’, although if unachievable they can have a negative impact. SOPs are often stored on a computer system, and frontline staff may have little opportunity to read them or low awareness of their existence. SOPs can also be out of date, unsuited to emergencies and are often unwieldy documents, having been frequently amended. In a blame culture, fear of deviation may constrain valuable performance variability, resulting in unwanted outcomes. Additionally, fear of non-compliance also impacts on willingness to report.

Protection of workers should only be required when it is impossible to remove the hazard or the people from the hazard and a design-based approach is preferable. Reliance on weak controls such as warning signs can undermine statements that safety is a ‘key concern’. Similarly, inadequate availability or poor design of PPE may impact worker perceptions around commitment to safety. A proactive safety culture should include measures to ensure availability, acceptability and usability of PPE and be alert to unintended consequences, such as communication difficulties when wearing FFP3 masks.

Risk Management

Risk assessment includes the identification of hazards and estimation of the likelihood and severity of harm. A risk rating is then generated, informing a Risk Action Plan to proactively manage risk as part of a proactive safety culture in stable systems and contexts, such as technical equipment or highly specified and repeatable procedures (Hollnagel, 2012a). In healthcare’s typically poorly specified systems, however, the accuracy of such ratings is questionable.
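
As a concrete example of risk rating, here is a minimal sketch of the kind of 5x5 likelihood-by-consequence matrix commonly used for healthcare risk registers; the band thresholds are typical examples rather than any specific trust’s policy.

```python
# Illustrative 5x5 risk matrix (likelihood x consequence). The scales are
# typical of healthcare risk registers; band thresholds are example values.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3,
              "likely": 4, "almost certain": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3,
               "major": 4, "catastrophic": 5}

def risk_rating(likelihood: str, consequence: str) -> tuple[int, str]:
    """Return a numeric rating and a band to inform a Risk Action Plan."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        band = "extreme"
    elif score >= 8:
        band = "high"
    elif score >= 4:
        band = "moderate"
    else:
        band = "low"
    return score, band

print(risk_rating("likely", "major"))  # (16, 'extreme')
```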

In healthcare, conditions, including demands and resources, can change rapidly, and staff must be constantly aware of the possibility of failure (Hollnagel, 2012a). The acceptance of goal conflicts, and curiosity about the ‘local ingenuity’ through which safety is achieved, could perhaps provide more meaningful insight than traditional approaches (Boskeljon-Horst et al, 2022), which are of limited predictive value in such systems. The successful implementation of a proactive safety culture may be more dependent on the level of resilience of the system than on the ability to predict accidents. However, a systems-based qualitative approach, drawing on existing HRA methods and coupled with resilience engineering concepts, may be a way forward (Hollnagel, 2012a; Sujan et al, 2020).

Resilience engineering

Resilience has been defined as the ‘intrinsic ability of a system to adjust its functioning… so that it can sustain required operations under both expected and unexpected conditions’ (Hollnagel et al, 2011). A proactive safety culture could therefore be seen in terms of strengthening this resilience engineering approach. This is of relevance where work is complex and difficult to specify, where performance variability is seen as an attribute, and where traditional HRA methods are problematic, as in healthcare (Sujan et al, 2020).

The role of resilience engineering has additional synergy with a proactive safety culture, as it moves thinking away from retrospective, linear cause-and-effect models and a reductionist approach, towards a deeper understanding of how a range of outcomes emerges from the same complex work system. This could result in the proactive strengthening of resilient elements, enhancing the ability to adapt, adjust, and actively anticipate and prevent unwanted outcomes. Additionally, the resilience model (Hollnagel et al, 2006) has three main elements of anticipation, attention and response, which is a familiar medical training approach.

Resilience engineering may therefore be a better ‘fit’ for implementation of proactive safety culture in healthcare than methods relying on high levels of specification and stable work systems. Currently, resilience-based methods ready for direct application to the workplace are elusive. Practical ‘resilience’ suggestions for the healthcare domain include learning from everyday work and associated enabling performance variability, a focus on frequent rather than severe events, allocating time for reflection, learning and sharing as a norm, remaining ‘sensible to the possibility of failure’ and imaginative around potential prevention or mitigation strategies (Hollnagel, 2012a).

This aligns with earlier thoughts that safety is a ‘continuously emerging property’ (Reason, 1997).

Organisations have ‘variability in activity but stability in cognition’ (Weick, 1987), perhaps indicating that a stable ‘resilience’ mindset, at both organisational and individual level, might be the key to a proactive safety culture, rather than a particular method. The Functional Resonance Analysis Method (FRAM) has been developed to understand variability in the system (Hollnagel, 2012b) and is useful for modelling the impact of variability and the coupling of processes across the work system. FRAM therefore has the flexibility to model complex adaptive socio-technical systems such as healthcare (Sujan et al, 2020), although experience with FRAM in healthcare is mostly limited to research-based activity.
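
To make the idea of coupling and resonance concrete, the sketch below represents two FRAM functions by their six aspects (input, output, precondition, resource, time, control). The medication-administration example and its couplings are hypothetical illustrations, not a validated FRAM model.

```python
# Minimal sketch of FRAM-style modelling: functions described by their six
# aspects, with couplings where one function's output feeds another's aspects.
from dataclasses import dataclass, field

@dataclass
class FramFunction:
    """A FRAM function described by its six aspects."""
    name: str
    inputs: list[str] = field(default_factory=list)
    outputs: list[str] = field(default_factory=list)
    preconditions: list[str] = field(default_factory=list)
    resources: list[str] = field(default_factory=list)
    time: list[str] = field(default_factory=list)
    control: list[str] = field(default_factory=list)

prescribe = FramFunction(
    name="Prescribe medication",
    inputs=["patient assessment"],
    outputs=["prescription"],
    control=["prescribing policy"],
)
administer = FramFunction(
    name="Administer medication",
    inputs=["prescription"],  # coupled to the output of 'prescribe'
    preconditions=["patient identity checked"],
    resources=["nurse availability", "drug stock"],
    time=["drug round schedule"],
    outputs=["dose given"],
)

# Variability in 'prescribe' (e.g. a late or ambiguous prescription) can
# resonate through 'administer' and produce a range of outcomes.
```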

Conclusion

Reason’s statement would seem to hold true for many contexts, as a proactive safety culture incorporates many facets and mechanisms for improving safety by being constantly alert to ‘warning signs’ and predicting ‘disaster’. Reason later described a state of ‘twitchy mindfulness’, like that of a grey squirrel, as an example of this ‘chronic unease’ (Reason, 2008). Prediction mechanisms are less certain for complex socio-technical systems such as healthcare, where difficulty in specifying work and the adaptive nature of the systems under analysis present particular challenges (Sujan et al, 2020). Perhaps ‘unease’ might be reframed as a more positive concept reflecting a ‘resilience mindset’ for individuals and organisations. In such a proactive safety culture, staff are alert to risky situations, but rather than fleeing, squirrel-like, whenever danger is sensed, they are empowered and facilitated to reflect on everyday work, learn, and use imagination and skill to adapt, adjust, and thereby ‘make’ safety in multiple small ways.

Dr Jennifer Blair 2023

Please get in touch if you would like to discuss any training needs, or take a look at our website for our full range of healthcare training, such as PSIRF, and consultancy services.