Overfilling of vessels can occur in any process that handles liquid. It is of primary concern when the liquids are hazardous, with the potential to impact personnel, the environment, assets, or the business. A review of articles on overflow incidents shows that they continue to occur and that human error is a significant factor.
A study of 242 storage tank incidents between 1960 and 2003 in North America, Asia and Europe showed that 30 percent of the incidents were due to operational and maintenance error. Overfilling was identified as the most frequent type of incident in the operational error category. A further survey of 330 incidents showed human error as the cause of 72 of them. A review of the incident data set for the 21st century showed that the human-factor contribution increased to 35 percent for the period 2000-2013, compared with its share over 1961-2013 (Hemmatian, 2014).
Waite (2013) highlighted 10 tank overfilling incidents that occurred between 1972 and 2009; eight of the 10 incidents resulted in fatalities, and one resulted in 43 injuries. These incidents are similar in nature to refinery incidents in which columns have been flooded, resulting in loss of containment through the relief system. Waite highlights the need to challenge and review operator mental models, such as reliance on pressure or liquid head without understanding their relationship to density, or reliance on out-flow information from a vessel to confirm that liquid is not accumulating, and to address these models adequately during Process Hazard Analysis with a special focus on human factors (Waite, 2013). While these analyses point toward human error, a more focused review of the human and organizational factors that influence such errors will enable prevention of similar incidents.
Alarms are intended to prompt operators to take the actions necessary to prevent undesired consequences. Thus, alarms are often considered a layer of protection during Process Hazard Analysis (PHA) or Safety/Hazard Review. In an unmanaged alarm system, lower priority alarms can obscure critical alarms and delay the operator's response to them. The identification of alarms as protection against process hazards such as a vessel overflow, the design of the alarms, and the design of the Human Machine Interface (HMI) are important steps in management of the alarm system, as described in ISA-18.2 and IEC 62682. Without due consideration of the human factors necessary to ensure their effectiveness, alarms give only a false sense of security.
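To make the flood problem concrete, the following is a minimal illustrative sketch (not from the paper or any specific DCS product) of how an alarm summary display might sort active alarms so that a critical alarm is not buried under a flood of low-priority ones. The priority tiers loosely follow ISA-18.2 conventions; the tag names and data structures are hypothetical.

```python
from dataclasses import dataclass
from enum import IntEnum

class Priority(IntEnum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3
    CRITICAL = 4

@dataclass
class Alarm:
    tag: str            # hypothetical tag, e.g. "TK-101.LAHH"
    priority: Priority
    timestamp: float    # seconds since start of shift

def alarm_summary(active, top_n=10):
    """Return the top_n alarms, highest priority first (newest first
    within a priority), so critical alarms surface above the flood."""
    return sorted(active, key=lambda a: (-a.priority, -a.timestamp))[:top_n]

# A flood of 50 low-priority maintenance alarms plus one critical
# tank high-high level alarm raised in the middle of the flood.
flood = [Alarm(f"PUMP-{i}.FAULT", Priority.LOW, float(i)) for i in range(50)]
flood.append(Alarm("TK-101.LAHH", Priority.CRITICAL, 12.0))

top = alarm_summary(flood, top_n=5)
assert top[0].tag == "TK-101.LAHH"  # the critical alarm is not obscured
```

Prioritized annunciation of this kind is only one part of alarm management; as the paper notes, it must be paired with rationalization and human-factors review to be effective.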
In several process industry incidents, the designed layer of protection relying on operator response to an alarm failed. There are several mechanisms for this failure relating to systems or human factors. Among the human factors are several situational awareness demons, such as attentional tunnelling, data overload and misplaced salience, as identified by Endsley (2011).
This paper will examine several overflow incidents. It will look at the causes specifically related to the failure of alarms during the different stages of operator response (detect, diagnose and respond) and during design and maintenance activities, along with the human factors involved and the related situation awareness demons. It will also explore how these factors can be addressed during the facility life cycle, especially during alarm identification and rationalization, when crediting alarms adequately during Layer of Protection Analysis (LOPA), and during incident investigation. The incidents presented in this paper are from a major chemical manufacturing company and from the wider chemical industry. Each incident is mapped to alarm failure mechanisms and situational awareness demons.
One of the incident analyses is presented below:
Incident 10: Tank Overflow
During a start-up of a process unit, a tank of organic solvent above its flash point overflowed, producing a vapor cloud that did not ignite. The level rise generated at least two different alarms prior to the overflow. Two other units in the span of control of the same control room operator were conducting maintenance tests that generated a significant number of alarms. The operator did not respond to the tank-level alarms. A field operator reported the vapor cloud.
The facts related to the alarm system failures can be stated as follows:
- The operator was distracted by the maintenance testing in the two units.
- The operator did not respond to the tank high-level alarms.
- The tank high-level alarms were not nuisance alarms.
The failures can be mapped to the following failure mechanisms:
- Diagnose Failure
  - Operator did not respond to the tank high-level alarms.
  - Operator was distracted by the maintenance testing.
  - Operator may have believed the alarms were triggered by maintenance testing.
The failures can be mapped to the following situation awareness (SA) demons:
- Attentional Tunnelling
  - Operator was distracted by the maintenance testing.
- Errant Mental Model
  - Operator did not connect the tank high-level alarm to the unit that was in start-up.
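The mapping above can be captured as structured data, which makes it straightforward to aggregate failure mechanisms and SA demons across many incidents. The sketch below is a hypothetical representation of the analysis framework, not the paper's actual tooling; the field names and category lists are illustrative assumptions.

```python
from dataclasses import dataclass

# Response-stage failure mechanisms and SA demons named in the paper.
STAGES = ("detect", "diagnose", "respond")
SA_DEMONS = ("attentional tunnelling", "data overload",
             "misplaced salience", "errant mental model")

@dataclass
class AlarmFailureAnalysis:
    incident: str
    facts: list            # established facts about the alarm system failure
    stage_failures: dict   # response stage -> contributing facts
    sa_demons: dict        # SA demon -> contributing facts

incident_10 = AlarmFailureAnalysis(
    incident="Incident 10: Tank Overflow",
    facts=[
        "Operator was distracted by maintenance testing in two units",
        "Operator did not respond to the tank high-level alarms",
        "Tank high-level alarms were not nuisance alarms",
    ],
    stage_failures={
        "diagnose": [
            "Operator may have believed the alarms were triggered "
            "by maintenance testing",
        ],
    },
    sa_demons={
        "attentional tunnelling": [
            "Operator was distracted by the maintenance testing"],
        "errant mental model": [
            "Operator did not connect the tank high-level alarm "
            "to the unit in start-up"],
    },
)

# Only recognized stages and demons may be used as mapping keys.
assert set(incident_10.stage_failures) <= set(STAGES)
assert set(incident_10.sa_demons) <= set(SA_DEMONS)
```

Recording each incident in a common structure like this supports the paper's aim of asking consistent questions during investigations and comparing human-factor patterns across the incident set.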
Overflow incidents are common, and human error is a significant contributing factor. In this paper, we have highlighted the need to ask the right questions during incident investigations in order to learn about and understand the human factors related to alarm failures. We have demonstrated the importance of understanding these human factors and addressing them adequately during the alarm life cycle using an alarm failure analysis framework. Improvements can be made by ensuring that situation awareness demons and potential failure mechanisms are addressed adequately in the project life cycle when alarms are identified as safeguards and credited as protection layers.