November 14, 2013 | Mark Paradies

Root Cause Analysis Tip: Why Do Root Cause Analysis?

© Copyright 2013 By System Improvements Inc. Used by Permission.



Have you thought about why you do root cause analysis? What is your purpose? I ask because many people go through the motions of root cause analysis without asking this essential question.

For most people, the purpose of root cause analysis is to stop major accidents by finding their root causes and fixing them. Obviously, we must analyze the root causes of fatalities and serious injuries. But waiting for a serious accident before acting to prevent a fatality or serious injury is like shutting the barn door after the cow has escaped.

Instead of waiting for a major accident, we need to learn from smaller incidents that warn us about a big accident just around the corner. Thus, root cause analysis of these significant warning events is a great idea.

The same philosophy applies to other types of adverse events that you want to prevent: quality issues, equipment failures, production upsets, and environmental releases. You want to use root cause analysis to learn from the minor events to prevent the major ones.

This seems obvious. But why do so many companies seem to wait to learn from major accidents? And why do so many others waste tremendous time and money investigating incidents that don’t have the potential to cause a serious loss? Read on for ideas…


Many companies seem to wait for big accidents before they decide to make serious changes to the way they manage safety. They think they are doing everything needed to be safe. They may even have evidence (like decreasing lost time injury/medical treatment rates) that they are improving. But when a major accident happens, the investigation reveals multiple missed opportunities to learn from minor incidents before the major accident. That makes me wonder … Why aren't they learning?

I’ve seen eight reasons why major companies fail to learn. These reasons can occur separately or rolled up together as a “culture issue.” They include:


Near-Misses Not Reported

If you don’t find out about small problems, you will wait until big problems happen to react. Often people don’t report near-misses because they are unofficially discouraged from doing so. This can include being punished for self-reporting a mistake or being assigned to fix a problem when it is reported. Even the failure to act when a problem is reported can be demotivating.


Hazards Not Recognized

Another reason that near-misses/hazards are not reported (and therefore not learned from) is that they aren’t even recognized as a reportable problem. I remember an operator explaining that he didn’t see an overflow of a diesel fuel tank as a near-miss; rather, he saw it as a “big mess.” No report means that no one learned until the diesel caught fire after a subsequent spill (a big accident).


Shortcuts Become a Way of Life (standards not enforced)

This is sometimes called the “normalization of deviation.” If shortcuts (breaking the rules) become normal, people won’t see shortcuts as reportable near-misses. Thus, the bad habits continue until a big accident occurs.

[Image: the Deepwater Horizon rig at night]

Process Safety Not Understood

We’ve built a whole course around this cause of big accidents (The 2-Day Best Practices for Reducing Serious Injuries & Fatalities Using TapRooT® Course). When management doesn’t understand the keys to process safety, they reward the wrong management behavior only to suffer the consequences later.


Ineffective Root Cause Analysis

If a problem is reported but is inadequately analyzed, odds are that the corrective actions won’t stop the problem’s recurrence. This leaves the door open to future big accidents.


Inadequate Corrective Actions

I’ve seen it before … Good root cause analysis and poor corrective action. That’s why we wrote the Corrective Action Helper® module for the TapRooT® Software. Do you use it?

Corrective Actions Not Implemented

Yes. People do propose good corrective actions only to see them languish – never to be implemented. And the incidents continue to repeat until a big accident happens.

Trends Not Identified

If you aren’t solving problems, the evidence should be in the incident statistics. But you will only see it if you use advanced trending tools. We teach these once a year at the pre-Summit 2-Day Advanced Trending Techniques Course.
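To make the trending point concrete, here is a minimal sketch of one generic statistical technique for spotting a real change in incident counts: a c-chart (a Poisson control chart). This is not the proprietary trending method taught in the TapRooT® course, just an illustration of the kind of tool that separates signal from noise; the monthly counts below are made up.

```python
# A c-chart flags months whose incident counts fall outside 3-sigma limits
# computed from the historical average. For Poisson-distributed counts,
# the variance equals the mean, so sigma = sqrt(mean).
import math

def c_chart_limits(counts):
    """Center line and 3-sigma control limits for a c-chart of event counts."""
    c_bar = sum(counts) / len(counts)
    sigma = math.sqrt(c_bar)              # Poisson assumption: variance == mean
    ucl = c_bar + 3 * sigma               # upper control limit
    lcl = max(0.0, c_bar - 3 * sigma)     # lower limit can't go below zero
    return c_bar, lcl, ucl

def out_of_control(counts):
    """Return the indices of months whose counts breach the control limits."""
    _, lcl, ucl = c_chart_limits(counts)
    return [i for i, c in enumerate(counts) if c > ucl or c < lcl]

# Illustrative data: nine quiet months, then a spike worth investigating.
monthly_near_misses = [4, 5, 3, 6, 4, 5, 4, 3, 5, 12]
print(out_of_control(monthly_near_misses))  # -> [9]
```

A spike that breaches the upper limit is evidence of a real change, not normal month-to-month variation, and deserves a closer look.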



Another problem that I’ve seen is companies overreacting. Instead of ignoring problems (waiting for the big accident), they become hyperactive. They try to prevent even minor incidents that never could become fatalities or serious injuries. I call this the “Investigating Paper Cuts” syndrome.

Why is overreacting bad? Because you waste resources trying to prevent problems that aren’t worth preventing. This usually leads to a backlog of corrective actions, many of which have very little return on investment potential. Plus you risk losing the few critical improvements that are worthwhile in “the sea of backlog.” Thus, an improvement program that isn’t properly focused can be a problem.


You need to truly understand the risks presented by your facility and focus your safety program on the industrial and process safety efforts that could prevent fatalities and serious injuries. Don’t overlook problems or make the mistake of trying to prevent every minor issue. Focus proactively on your major risks and reactively on incidents that could have become major accidents. Leave the rest to trending.
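The triage rule above can be sketched as a simple severity-based decision. The function names, categories, and the 1-to-5 severity scale are my own illustrative assumptions, not part of any formal standard: the key idea is that an event's *potential* consequence, not just its actual outcome, drives the level of investigation, and "paper cuts" go to trending only.

```python
# Hypothetical triage sketch: route each reported event by the worst of its
# actual and potential severity. The scale (1 = trivial ... 5 = fatality)
# and response categories are illustrative assumptions.

def triage(actual_severity, potential_severity):
    """Decide the response to an incident report."""
    worst = max(actual_severity, potential_severity)
    if worst >= 4:
        return "root cause analysis"    # serious actual or potential loss
    if worst == 3:
        return "apparent cause review"  # moderate potential, lighter look
    return "trend only"                 # paper cuts: count them, don't investigate

print(triage(1, 5))  # minor outcome but fatality potential -> "root cause analysis"
print(triage(1, 1))  # a true paper cut -> "trend only"
```

Note that a near-miss with high potential severity gets the same full analysis as an actual serious injury, which is exactly the "learn before the big accident" philosophy.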


“An ounce of prevention is worth a pound of cure.”
Benjamin Franklin
