Taking the human factor into account when setting SIL requirements

Accident investigations consistently show that the human factor plays a major role. In many large accidents, the enquiry will point to organizational factors, leadership focus, procedures and training as important elements in a complex picture involving both human and technological factors. In the oil and gas industry it has been found that more than half of the gas leaks detected offshore are caused by human factors and errors made during operation, maintenance or startup. On the other hand, humans may also act as a safeguard: an operator may choose to shut down a unit that is behaving suspiciously before any dangerous situation occurs, a driver may slow down to avoid relying heavily on the ABS system when braking on icy roads, an electrician may suggest replacing a discolored socket that is otherwise functioning well. All of these are human actions that lower risk. The human thus always enters the risk picture and can both enhance and threaten the safety of an asset, depending on leadership, training, organizational maturity and attitudes. How do we deal with this in the context of safety integrity levels?

There are many practices. Thorough methodologies exist for analyzing human performance as part of barrier systems, such as human reliability analysis (HRA), developed first in the nuclear industry but now commonplace in many sectors (petroleum, chemical industry, aviation and transport). At the other end of the spectrum are the extremes of assuming that “humans always fail to do the correct thing” or that “humans always do the right thing”. When performing a SIL allocation analysis using typical methods such as layers of protection analysis (LOPA) or risk graph (both described in IEC 61511), an important question to consider is: can the bad outcome be avoided by human intervention? In many cases humans can intervene, and then we need a notion of how reliable the human is. Human performance is influenced by many factors, and these factors are analyzed in depth within the HRA framework. In a LOPA, a very detailed analysis of the human contribution is usually out of scope, and a simpler approach is taken. However, there are some important questions we can borrow from the HRA toolbox that will help us place more trust in the numbers we use in the LOPA, or in the credit we give this barrier element in the risk graph:

  • Is the operator well-trained and is the task easy to understand?
  • Does the operator have the necessary experience?
  • Does the organization have a positive safety culture?
  • Are there many tasks to handle at once and no clear priorities?
  • Is the situation stressful?
  • Does the operator have time to comprehend the situation, analyze the next action and execute before it is too late?
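
The screening questions above can be turned into a simple go/no-go checklist. As a minimal sketch (all field and function names here are illustrative assumptions, not taken from IEC 61511 or any HRA method), the conservative rule is: credit the operator only when every factor is favorable and there is enough time to act.

```python
from dataclasses import dataclass


@dataclass
class OperatorContext:
    """Answers to the HRA-style screening questions (hypothetical field names)."""
    well_trained: bool            # trained for the scenario, task easy to understand
    experienced: bool             # has the necessary experience
    positive_safety_culture: bool # organization has a positive safety culture
    clear_priorities: bool        # False if many competing tasks with no clear priority
    low_stress: bool              # situation not overwhelmingly stressful
    available_minutes: float      # time from alarm to "too late"


def may_credit_operator(ctx: OperatorContext, min_minutes: float = 15.0) -> bool:
    """Conservative screening: credit human intervention only if every factor
    is favorable and the operator has at least min_minutes to comprehend,
    analyze and execute the action."""
    return all([
        ctx.well_trained,
        ctx.experienced,
        ctx.positive_safety_culture,
        ctx.clear_priorities,
        ctx.low_stress,
        ctx.available_minutes >= min_minutes,
    ])
```

A single unfavorable answer, or too little time, removes the credit entirely, which matches the conservative spirit of a shortcut analysis.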

In many cases the operator will be well trained for exactly the accident scenarios in question. If the alarm system is designed correctly, there will also be clear alarm prioritization and helpful messages, but it is always worth challenging this assumption because the quality of alarm design varies a lot in practice. The situation is almost always stressful if the consequence of the accident is grave and there is some confusion about what is happening, but training can do wonders here by turning the response into reflex operating steps; think of basic field-skills training in the military. The last question is always important: does the operator have enough time? What counts as enough time is hard to pin down with a fixed limit; for simple situations 10-15 minutes may be sufficient, whereas for more complex situations a full hour might be needed for human intervention to be a trustworthy barrier element. Companies may have different guidelines regarding these factors, and it should always be considered whether those guidelines are in line with current knowledge of human performance. If credit is given to the operator, no reaction time shorter than 15 minutes should be allowed in the analysis. For unusual scenarios, as is the case for “low-demand” safety functions, a PFD for the human intervention lower than 0.1 (10%) should not be used.
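As a rough sketch of how these two limits feed into the LOPA arithmetic (the function names are illustrative, and the simple multiplicative model assumes the protection layers are independent): the mitigated event frequency is the initiating event frequency times the product of the layers' PFDs, with the human layer capped by the 15-minute and 0.1 rules above.

```python
def human_ipl_pfd(claimed_pfd: float, available_minutes: float) -> float:
    """PFD to credit for human intervention as an independent protection layer.
    Enforces the conservative limits discussed above: no credit at all if the
    operator has less than 15 minutes, and never a PFD better than 0.1."""
    if available_minutes < 15:
        return 1.0  # no risk reduction credited
    return max(claimed_pfd, 0.1)


def mitigated_frequency(initiating_freq: float, ipl_pfds: list[float]) -> float:
    """Standard LOPA arithmetic: initiating event frequency multiplied by the
    PFDs of all (assumed independent) protection layers."""
    freq = initiating_freq
    for pfd in ipl_pfds:
        freq *= pfd
    return freq


# Example: initiating event at 0.1/year, one SIF with PFD 0.01, plus operator
# intervention with 30 minutes available. The optimistic claim of 0.05 for the
# operator is floored at 0.1, giving a result of roughly 1e-4 events per year.
f = mitigated_frequency(0.1, [0.01, human_ipl_pfd(0.05, 30)])
```

Note how the floor changes the answer: the claimed 0.05 would have halved the mitigated frequency, but the conservative cap keeps the operator's contribution to a risk reduction factor of at most 10.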

Giving credit to human intervention in SIL allocation is good practice, but the credit given should be realistic in light of what we know about how humans react in these situations. Due to the large uncertainty, especially when performing a “quick-and-dirty” shortcut analysis as discussed above, conservative values for human error probability should be assumed.

Also note that when a human action is included as an “independent protection layer” in a LOPA, the integrity of the entire barrier system depends on this action as well. To stay in control of barrier integrity, the company must therefore carefully manage the underlying factors such as organizational maturity, safety leadership and competence management. Increased attention to these factors in internal hazard reviews could also improve safety performance; perhaps the number of accidents with human error as a root cause could be significantly reduced through more structured inclusion of human elements in barrier management thinking.
