The normalcy bias, or normality bias, refers to a mental state people enter when facing a possible problem or disaster. It causes people to underestimate both the likelihood of the problem or disaster occurring and its possible effects, which often results in a failure to prepare adequately. The assumption underlying the normalcy bias is that because a fire, explosion, or other disaster has never occurred, it never will. The bias also leaves people unable to cope with a disaster once it does occur: people with a normalcy bias have difficulty reacting to something they have not experienced before. People also tend to interpret warnings in the most optimistic way possible, seizing on any ambiguities to infer a less serious situation.
In our business of helping manufacturers in the combustible dust processing industries prevent fires and explosions, we have seen the implications of this bias firsthand. Many times I am told, "We have never had an explosion," with the implication that they never will. These facilities may have all of the ingredients of the Fire Triangle or Explosion Pentagon, yet have never had all of the right ingredients together at the right time. And even though these manufacturers use or create combustible dust as a byproduct of their process, and may even have a history of fires, many times they do not connect the dots. It may help to understand that if you use or create combustible dust and have potential ignition sources, then you have the ingredients for a fire, and every fire is simply a failed explosion!
Processes are not static; they change over time. Material mixes change or deviate. Products or equipment are swapped out with "like kind" replacements that may not trigger a management-of-change review. Machinery wears. People change. Housekeeping varies. Many times production personnel do not understand the implications of even small changes to the process, the product, or housekeeping. Maintenance personnel may not understand the implications of deviating from safety procedures, or of exchanging or modifying parts, machinery, or safety equipment. And this is where fires and explosions happen, and people get hurt.
From safetymattersblog: "Normalization of a Deviation"
These are the words of John Carlin, Vice President at the Ginna Nuclear Plant, referring to a situation in the past where chronic water leakages from the reactor refueling pit were tolerated by the plant’s former owners.
The quote is from a piece reported by Energy & Environment Publishing’s Peter Behr in its ClimateWire online publication titled, “Aging Reactors Put Nuclear Power Plant ‘Safety Culture’ in the Spotlight” and also published in The New York Times. The focus is on a series of incidents with safety culture implications that have occurred at the Nine Mile Point and Ginna plants now owned and operated by Constellation Energy.
The recitation of events and the responses of managers and regulators are very familiar. The drip, drip, drip is not the sound of water leaking but the uninspired give and take of the safety culture dialogue that occurs each time there is an incident or series of incidents that suggest safety culture is not working as it should.
Managers admit they need to adopt a questioning attitude and improve the rigor of decision making; ensure they have the right "mindset"; and corporate promises "a campaign to make sure its employees across the company buy into the need for an exacting attention to safety." Regulators remind the licensee, "The nuclear industry remains ... just one incident away from retrenchment...," but must be wondering why these events are occurring when NRC performance indicators for the plants and INPO rankings do not indicate problems. Pledges to improve safety culture are put forth earnestly and (I believe) in good faith.
The drip, drip, drip of safety culture failures may not be cause for outright alarm or for questioning the fundamental safety of nuclear operations, but it does highlight what seems to be a condition of safety culture stasis - a standoff of sorts where significant progress has been made but problems continue to arise, and the same palliatives are applied. Perhaps more significantly, the continued evolution of thinking regarding safety culture has plateaued. Peaking too early is a problem in politics and sports, and so it appears to be in nuclear safety culture.
This is why the remark by John Carlin was so refreshing. For those not familiar with the context of his words, "normalization of deviation" is a concept developed by Diane Vaughan in her exceptional study of the space shuttle Challenger accident. Readers of this blog will recall that we are fans of her book, The Challenger Launch Decision, in which a mechanism she identifies as "normalization of deviance" is used to explain the gradual acceptance of performance results that fall outside normal acceptance criteria. Most frightening, an organization's standards can decay and no one even notices. How this occurs and what can be done about it are concepts that should be central to current considerations of safety culture.
For further thoughts from our blog on this subject, refer to our posts dated October 6, 2009 and November 12, 2009. In the latter, we discuss the nature of complacency and its insidious impact on the very process that is designed to avoid it in the first place.