Author Archives: Mark Paradies

Avoid Big Problems By Paying Attention to the Small Stuff

Posted: May 16th, 2018 in Accidents, Courses, Performance Improvement, Pictures, Root Cause Analysis Tips, TapRooT

Almost every manager has been told not to micro-manage their direct reports. So the advice above:

Avoid Big Problems By Paying Attention to the Small Stuff

may sound counter-intuitive.

Perhaps this quote from Admiral Rickover, leader of the most successful organization to implement process safety and organizational excellence, might make the concept clearer:

The Devil is in the details, but so is salvation.

When you talk to senior managers who lived through a major accident (the type that gets bad national press and results in a management shakeup), they never saw it coming.

A Senior VP at a utility told me:

It was like I was walking along on a bright sunny day and
the next thing I knew, I was at the bottom of a deep dark hole.

They never saw the accident coming. But they should have. And they should have prevented it. But HOW?

I have never seen a major accident that wasn’t preceded by precursor incidents.

What is a precursor incident?

A precursor incident is an incident that has low to moderate consequences but could have been much worse if …

  • One or more Safeguards had failed
  • It was a bad day (you were unlucky)
  • You decided to cut costs just one more time and eliminated the hero who kept things from getting worse
  • The sequence had changed just a little (the problem occurred on night shift or other timing changed)

These types of incidents happen more often than people like to admit. Thus, they give management the opportunity to learn.
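
To make the definition concrete, here is a minimal sketch (plain Python with hypothetical field names and severity scales, not a TapRooT® tool) of how a reporting system might flag an incident as a precursor worth a full investigation.

```python
from dataclasses import dataclass

@dataclass
class Incident:
    description: str
    actual_severity: int          # 0 = no consequence ... 4 = fatality / major release
    worst_credible_severity: int  # how bad it could have been if one more Safeguard had failed
    safeguards_remaining: int     # Safeguards that still stood between the event and the worst case

def is_precursor(incident: Incident) -> bool:
    """Flag low/moderate-consequence incidents that could credibly have been much worse."""
    low_actual = incident.actual_severity <= 2
    high_potential = incident.worst_credible_severity >= 3
    thin_margin = incident.safeguards_remaining <= 1
    return low_actual and high_potential and thin_margin

# Example: a dropped load that missed a worker only because they had stepped away.
near_miss = Incident("Dropped load landed in an open walkway", 0, 4, 1)
print(is_precursor(near_miss))  # True -> candidate for advanced root cause analysis
```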

What is the response by most managers? Do they learn? NO. Why? Because the consequences of the little incidents are insignificant. Why waste valuable time, money, and resources investigating low-consequence incidents? As one Plant Manager said:

If we investigated every incident, we would do nothing but investigate incidents.

Therefore, a quick-and-dirty root cause analysis is performed (think 5-Whys), and a few easy corrective actions that don’t really change anything are implemented.

The result? It looks like the problem goes away. Why? Because big accidents usually have multiple Safeguards and they seldom fail all at once. It’s sort of like James Reason’s Swiss Cheese Model…


The holes move around and change size, but they don’t line up all the time. So, if you are lucky, you won’t be there when the accident happens. Maybe the small incidents repeat, but a big accident hasn’t happened (yet).

To prevent the accident, you need to learn from the small precursor incidents and fix the holes in the cheese or add additional Safeguards to prevent the major accidents. The way you do this is by applying advanced root cause analysis to precursor incidents. Learn from the small stuff to avoid the big stuff. To avoid:

  • Fatalities
  • Serious injuries
  • Major environmental releases
  • Serious customer quality complaints
  • Major process upsets and equipment failures
  • Major project cost overruns

Admiral Rickover’s seventh rule (of seven) was:

The organization and members thereof must have the ability
and willingness to learn from mistakes of the past.

And the mistakes he referred to were both major accidents (which didn’t occur in the Nuclear Navy when it came to reactor safety) and precursor incidents.

Are you ready to learn from precursor incidents to avoid major accidents? Then stop trying to take shortcuts to save time and effort when investigating minor incidents (low actual consequences) that could have been worse. Start applying advanced root cause analysis to precursor incidents.

The first thing you will learn is that identifying the correct answer once is a whole lot easier than finding the wrong answer many times.

The second thing you will learn is that when people start finding the real root causes of problems and do real root cause analysis frequently, they get much better at problem solving and performance improves quickly. The effort required is less than doing many poor investigations.

Overall, you will learn that the process pays for itself when advanced root cause analysis is applied consistently. Why? Because the “little stuff” that isn’t being fixed is much more costly than you think.

How do you get started?

The fastest way is by sending some folks to the 2-Day TapRooT® Root Cause Analysis Course to learn to investigate precursor incidents.

The 2-Day Course is a great start. But some of your best problem solvers need to learn more. They need the skills necessary to coach others and to investigate significant incidents and major accidents. They need to attend the 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Training.

Once you have the process started, you can develop a plan to continually improve your improvement efforts. Your organization will become willing to learn. You will prove how valuable these tools are and be on your way to becoming best in class.

Rome wasn’t built in a day, but you have to get started to see the progress you need to achieve. Start now and build on success.

Would you like to talk to one of our TapRooT® Experts to get even more ideas for improving your root cause analysis? Contact us by CLICKING HERE.

Root Cause Analysis Tip: Why Did The Robot Stop? (Comparing 5-Why Results with TapRooT® Root Cause Analysis Results)

Posted: May 9th, 2018 in Pictures, Root Cause Analysis Tips, TapRooT

Find the Root Cause

I hear people say that 5-Whys is a good root cause analysis system for “simple” incidents. So, I thought I would show a simple incident that was provided as an example by a very experienced 5-Why user and compare it to the analysis that would be performed using TapRooT®.

Taiichi Ohno, the father of the Toyota Production System and the creator of the 5-Why method of root cause analysis, is the source of the example – a robot failure. He used the example to teach employees the 5-Why technique while he was at Toyota. Here is the example as he described it…

1. Why did the robot stop?
   – The circuit has overloaded, causing a blown fuse.

2. Why did the circuit overload?
   – There was insufficient lubrication on the bearings, so they locked up.

3. Why was there insufficient lubrication on the bearings?
   – The oil pump on the robot is not circulating sufficient oil.

4. Why is the pump not circulating sufficient oil?
   – The pump intake is clogged with metal shavings.

5. Why is the intake clogged with metal shavings?
   – Because there is no filter on the pump.

For Mr. Ohno, that was the end of the root cause process: Install a filter and get back to work. But this isn’t even the start of the root cause analysis process in TapRooT®.

Let’s look at this incident using TapRooT® and see how 5-Whys compares to the advanced TapRooT® Root Cause Analysis System.

TapRooT® Root Cause Analysis

TapRooT® is more than a tool. It is a systematic process with embedded tools to help an investigator find and fix the root causes of a problem. It starts with either the TapRooT® 5-Step Process for low-to-medium risk incidents or the TapRooT® 7-Step Process for major investigations. The 5-Step Process is shown below…

To start investigating the problem, one gathers evidence and draws a SnapCharT® (shown below being drawn by a team in a TapRooT® 2-Day Root Cause Analysis Course).

Notice that the 5-Whys that Mr. Ohno asked in the example above turned out to be mainly the sequence of events leading up to the failure in the SnapCharT® (shown below).

The SnapCharT® makes the example event easier to understand than the 5-Why example above. Plus, the SnapCharT® goes beyond the 5-Whys by indicating that there was no low oil pressure alarm.

In TapRooT®, if the investigator decides that there is more to learn, the investigator continues to collect evidence (grows the SnapCharT®) to expand his/her understanding of what happened. A good TapRooT® Investigator would have several areas to look at.

First, what happened to the filter? Was it forgotten during maintenance or was it never designed into the system?

Next, where did the metal shavings come from? Metal shavings in a lube oil system are unusual. What was the source?

The new information provides a fairly complete understanding of what happened and is shown on the SnapCharT® below.

Notice that in TapRooT®, we complete the collection of evidence about what caused the metal shavings and what caused the filter to be missing. These were significant issues that were left out of the 5-Why analysis. This type of omission is common in 5-Why analyses – even when experts apply 5-Whys. Thus, the problem isn’t with the investigator or their training – it is embedded in the 5-Why system.

Causal Factors

Once one understands what happened, the third step is to identify the Causal Factors that, if eliminated, would have stopped the accident from occurring or reduced the seriousness of the incident. A simple technique called Safeguard Analysis is used to do this. The four Causal Factors for the Robot Stops incident were identified as:

  1. Mechanic A uses cloth to cover openings in system.
  2. Mechanic A does not report metal shaving contamination.
  3. Mechanic B does not install oil filter.
  4. Operator does not know oil pressure is low.

Where Mr. Ohno had only one root cause, TapRooT® has already identified four Causal Factors. Each of these Causal Factors could have multiple root causes, so TapRooT® is already highlighting one of the weaknesses of 5-Whys: it usually focuses on a single cause and misses additional causes (and the corrective actions needed for the root causes that aren’t identified).
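
To make the structural difference visible, here is a minimal sketch (plain Python with illustrative names, not TapRooT® software) contrasting a single 5-Why chain with the same incident modeled as multiple Causal Factors, each of which can carry its own root causes.

```python
# A 5-Why analysis is a single chain: one answer per "why" and one endpoint.
five_why_chain = [
    "Robot stopped",
    "Circuit overloaded, blowing a fuse",
    "Insufficient lubrication on the bearings",
    "Oil pump not circulating sufficient oil",
    "Pump intake clogged with metal shavings",
    "No filter on the pump",
]

# The same incident modeled as several Causal Factors, each of which may have
# its own root causes. Only the factor analyzed later in this article has root
# causes filled in here; the others are still open.
causal_factors = {
    "Mechanic A uses cloth to cover openings in system": [],
    "Mechanic A does not report metal shaving contamination": [],
    "Mechanic B does not install oil filter": [],
    "Operator does not know oil pressure is low": ["Displays NI", "Errors not detectable"],
}

print("5-Why endpoints to fix: 1")
print(f"Causal Factors to analyze: {len(causal_factors)}")
for factor, roots in causal_factors.items():
    print(f"  - {factor}: {len(roots)} root cause(s) identified so far")
```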

TapRooT® Root Causes

In the fourth step of the TapRooT® 5-Step Process, each Causal Factor is analyzed using the Root Cause Tree® to guide the investigator to the Causal Factor’s root causes. The tree is described in detail in the TapRooT® Book (CLICK HERE for info).

For this example, we won’t show the entire analysis of all four Causal Factors using the Root Cause Tree® and Dictionary. For people who would like to know more about the 15-question Human Performance Troubleshooting Guide and the way the tree is used to help investigators find causes beyond their current knowledge, we recommend attending a 2-Day or 5-Day TapRooT® Course.

However, we will describe the analysis of the Causal Factor “Operator doesn’t know oil pressure is low.”

This starts out on the tree as a Human Performance Difficulty that leads us to the Human Performance Troubleshooting Guide. When asking the 15 Questions, two questions get a “yes” for this Causal Factor and guide us to the Human Engineering, Procedures, and Training Basic Cause Categories on the back side of the Root Cause Tree®.

Copyright © 2015 by System Improvements, Inc.
Used by permission. Duplication prohibited.

In analyzing these categories, no causes are found in the Procedures or Training Basic Cause Categories. However, two root causes are found to be applicable in the Human Engineering Basic Cause Category (above).

Thus, it was determined that the operator needed an oil pressure display/alarm (Displays NI root cause) to make detection of the problem possible (Errors Not Detectable root cause). If the display/alarm had been present, the robot could have been stopped and repaired before the bearings were damaged, and the incident would have been significantly less severe.

The corrective action for these two root causes would be to install a bearing lube oil pressure indicator and a low bearing lube oil pressure alarm to notify the operator of impending equipment problems before the bearings lock up.
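
As a simple illustration of that corrective action, here is a minimal sketch of the alarm logic (Python; the setpoint and readings are hypothetical values chosen for the example, not data from the incident).

```python
LOW_OIL_PRESSURE_SETPOINT_PSI = 10.0  # hypothetical alarm setpoint

def check_bearing_lube_oil(pressure_psi: float) -> str:
    """Return an alarm state the operator can act on before the bearings lock up."""
    if pressure_psi < LOW_OIL_PRESSURE_SETPOINT_PSI:
        return "LOW BEARING LUBE OIL PRESSURE - stop the robot and investigate"
    return "NORMAL"

# Example: pressure falls as the clogged pump intake starves the bearings of oil.
for reading_psi in (14.2, 11.5, 8.7):
    print(f"{reading_psi:5.1f} psi -> {check_bearing_lube_oil(reading_psi)}")
```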

After analyzing just one Causal Factor using the TapRooT® Root Cause Tree® we have found that even an expert like Taiichi Ohno could miss important root causes when using 5-Whys. But there is more. There are still three more Causal Factors to analyze (and then Generic Causes – an optional technique in the 5-Step Process).

Why would you use a root cause tool with known, proven weaknesses? Why would you risk lives, your corporate reputation, and large sums of money on an inferior approach to problem solving? If something is worth fixing, it is worth fixing it right! Learn and apply TapRooT® Advanced Root Cause Analysis to find the real root causes of problems and effectively fix them. Attend an upcoming course to learn more.

Admiral Rickover’s 7 Rules

Posted: May 9th, 2018 in Performance Improvement, TapRooT

Hyman Rickover 1955

Rule 1. You must have a rising standard of quality over time, and well beyond what is required by any minimum standard.
Rule 2. People running complex systems should be highly capable.
Rule 3. Supervisors have to face bad news when it comes, and take problems to a level high enough to fix those problems.
Rule 4. You must have a healthy respect for the dangers and risks of your particular job.
Rule 5. Training must be constant and rigorous.
Rule 6. All the functions of repair, quality control, and technical support must fit together.
Rule 7. The organization and members thereof must have the ability and willingness to learn from mistakes of the past.

Are you using advanced root cause analysis to learn from past mistakes? Learn more about advanced root cause analysis by CLICKING HERE.

Newest Aircraft Carrier Breaks Down During Sea Trials

Posted: May 8th, 2018 in Current Events, Equipment/Equifactor®, Video

USS Ford underway for sea trials …

An article in Popular Mechanics said that the USS Ford had to return early from sea trials because of an overheating thrust bearing on one of the four main engines. Bloomberg reported that:

“inspection of the parts involved in the January 2018 incident revealed improperly machined gears at GE’s facility in Lynn, Massachusetts as the ‘root cause.'”

Is “improperly machined gears” a root cause? That would be a Causal Factor and the start of a root cause analysis in the TapRooT® System. And why wasn’t the “improper” machining detected prior to installation and sea trials?

Here is some footage of sea trials (including a brief glimpse of one main shaft turning).

Hazards and Targets

Posted: May 7th, 2018 in Accidents, Current Events, Human Performance, Performance Improvement, Pictures, Root Causes, TapRooT

Most of us probably would not think of this as an on-the-job Hazard … a giraffe.


But African filmmaker Carlos Carvalho was killed by one while working in Africa making a film.


 Do you have unexpected Hazards at work? Giant Asian hornets? Grizzly bears? 

Or are your Hazards much more common? Heat stroke. Slips and falls (gravity). Traffic.

Performing a thorough Safeguard Analysis before starting work and then trying to mitigate any Hazards is a good way to improve safety and reduce injuries. Do your supervisors know how to do a Safeguard Analysis using TapRooT®?

Cyber Attack Root Cause Analysis

Posted: May 4th, 2018 in investigation, Root Cause Analysis Tips, Root Causes, Video

(if you can’t see the video, here’s a link)

Yes … it happened right here in Knoxville! A cyber attack on the county computer system on election night!

What is the root cause? The county is having an outside contractor look into it.

Can you use the TapRooT® Root Cause Analysis System to do a root cause analysis of a cyber security attack? Yes! People have been doing it for decades.

What is Senior Leadership’s Role in Root Cause Analysis?

Posted: May 2nd, 2018 in Performance Improvement, TapRooT


Senior leadership wants root cause analysis to uncover the fixable root causes of significant accidents and precursor incidents AND to recommend effective fixes to stop repeat incidents.

But what does senior leadership need to do to make sure that happens? What is their role in effective root cause analysis? Here’s a quick list:

  • Best root cause system
  • Insist that it is used
  • Be involved in reviews
  • Insist on timely implementation of fixes
  • Check status of the implementation of fixes
  • Use trends to manage
  • Steer system to be more proactive

Let’s look at these in slightly more detail.

Best root cause system: Because so much is riding on the effective performance of a root cause system, leaders know that second-best systems aren’t good enough. They don’t want to bet their company’s future on someone asking why five times. That’s why they feel assured when their team uses advanced root cause analysis to find and fix the real root causes of problems. CLICK HERE to find out more about advanced root cause analysis.

Insist that it is used: One common theme in companies that get the most from their performance improvement programs is that senior leaders ASK for investigations. When they see a problem, they insist that the advanced root cause analysis process is used to get to the root causes and develop effective fixes. When middle management and employees see senior leadership asking for investigations and root cause analysis, they want to be involved to help the company improve.

Be involved in the reviews: When senior leaders ask for investigations, it’s only logical that they would want to review the outcome of the investigation they asked for. But it goes beyond being present. Senior management knows what to look for and how to make the review process a positive experience. People often get rewarded for good investigations. When the review process is a positive experience, people want to participate and have pride in their work.

Insist on timely implementation of fixes/Check status of the implementation of fixes: You might not believe this, but I’ve seen many examples of companies that performed root cause analysis, developed fixes, and then were very slow to implement them. So slow that the incident repeated itself, sometimes several times, before any fix was implemented. Good senior leadership insists on prompt implementation of fixes and makes sure they are kept up to date on the progress of implementation.

Use trends to manage: Good root cause analysis efforts produce statistics that can help leaders manage. That’s why senior leadership understands the use of advanced trending techniques and gets reports on the latest root cause trends.
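
As a rough illustration of what managing with trends can look like, here is a minimal sketch (Python; the investigation records and category names are invented for the example) that counts root cause categories across investigations so leaders can see where problems cluster. This is just a simple frequency count, not the advanced trending techniques referred to above.

```python
from collections import Counter

# Hypothetical root cause categories recorded across recent investigations.
investigation_root_causes = [
    ["Displays NI", "Errors not detectable"],
    ["Enforcement NI"],
    ["Displays NI"],
    ["Training NI", "Displays NI"],
]

trend = Counter(cause for causes in investigation_root_causes for cause in causes)
for cause, count in trend.most_common():
    print(f"{cause}: {count}")
# A category that keeps recurring is a signal for management attention and,
# potentially, a review for Generic Causes and a system-level fix.
```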

Steer system to be more proactive: Would you rather wait for an accident or incident to find your next improvement opportunity? Or would you rather target audits and assessments and have them apply advanced root cause analysis to develop effective improvements? The best senior leaders know the right answers to these questions.

That’s it! Senior leaders use proactive improvement and investigations of precursor incidents and major accidents (which rarely happen) to find where improvement needs to happen. They are involved with the system and use it to keep their company ahead of the competition. They are updated about the status of fixes and current trends. They reward those who make the system work.

Does that sound like your facility? Or do you have an improvement opportunity?

Press Release: CSB to Investigate Husky Refinery Fire

Posted: April 26th, 2018 in Accidents, Current Events, investigation, Pictures


Washington, DC, April 26, 2018 –  A four-person investigative team from the U.S. Chemical Safety Board (CSB) is deploying to the scene of an incident that reportedly injured multiple workers this morning at the Husky Energy oil refinery in Superior, Wisconsin. The refinery was shutting down in preparation for a five-week turnaround when an explosion was reported around 10 am CDT.

According to initial reports, several people were transported to area hospitals with injuries. There have been no reports of fatalities. Residents and area schools near the refinery were asked to evacuate due to heavy smoke.

The CSB is an independent, non-regulatory federal agency charged with investigating serious chemical incidents. The agency’s board members are appointed by the president and confirmed by the Senate. CSB investigations look into all aspects of chemical accidents, including physical causes such as equipment failure as well as inadequacies in regulations, industry standards, and safety management systems.

The Board does not issue citations or fines but does make safety recommendations to plants, industry organizations, labor groups, and regulatory agencies such as OSHA and EPA. Visit the CSB website at www.csb.gov.

Here is additional coverage of the fire …


http://www.kbjr6.com/story/38049655/explosion-injuries-reported-at-husky-energy-superior-refinery?autostart=true

How many precursor incidents did your site investigate last month? How many accidents did you prevent?

Posted: April 25th, 2018 in Accidents, investigation, Performance Improvement, Pictures, Root Cause Analysis Tips, TapRooT

A precursor incident is an incident that could have been worse. If another Safeguard had failed, if the sequence had been slightly different, or if your luck had been worse, the incident could have been a major accident, a fatality, or a significant injury. These incidents are sometimes called “hipos” (High Potential Incidents) or “potential SIFs” (Significant Injury or Fatality).

I’ve never talked to a senior manager who thought a major accident was acceptable. Most claim they are doing EVERYTHING possible to prevent them. But many senior managers don’t require advanced root cause analysis for precursor incidents. Incidents that didn’t have major consequences get classified as low-consequence events. People ask “Why?” five times and implement ineffective corrective actions. Sometimes these minor-consequence (but high-potential-consequence) incidents don’t even get reported. Management is letting precursor incidents continue to occur until a major accident happens.

Perhaps this is why I have never seen a major accident that didn’t have precursor incidents. That’s right! There were multiple chances to identify what was wrong and fix it BEFORE a major accident.

That’s why I ask the question …

“How many precursor incidents did your site investigate last month?”

If you are doing a good job identifying, investigating, and fixing precursor incidents, you should prevent major accidents.

Sometimes it is hard to tell how many major accidents you prevented. But the lack of major accidents will keep your management out of jail, off the hot seat, and sleeping well at night.

Keep Your Managers Out of These Pictures

That’s why it’s important to make sure that senior management knows about the importance of advanced root cause analysis (TapRooT®) and how it should be applied to precursor incidents to save lives, improve quality, and keep management out of trouble. You will find that the effort required to do a great investigation with effective corrective actions isn’t all that much more work than the poor investigation that doesn’t stop a future major accident.

Want to learn more about using TapRooT® to investigate precursor incidents? Attend one of our 2-Day TapRooT® Root Cause Analysis Courses. Or attend a 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Course and learn to investigate precursor incidents and major accidents. Also consider training a group of people to investigate precursor incidents at a course at your site. Call us at 865-539-2139 or CLICK HERE to send us a message.

Is April just a bad month?

Posted: April 24th, 2018 in Accidents, Courses, Pictures, Summit

I was reading a history of industrial/process safety accidents and noticed that all the following happened in April:

Texas City nitrate explosion

April 16, 1947 – 2,200 tons of ammonium nitrate detonates in Texas City, Texas, destroying multiple facilities and killing 581 people.

Deepwater Horizon

April 20, 2010 – A blowout, explosions, and fire destroy the Deepwater Horizon, killing 11. This was the worst oil spill in US history.

West, Texas

April 17, 2013 – 10 tons of ammonium nitrate detonates in West, Texas, destroying most of the town and killing 15 people.

Maybe this is just my selective vision making a trend out of nothing, or maybe Spring is a bad time for process safety? I’m sure it is a coincidence, but it sure seems strange.

Do you ever notice “trends” that make you wonder … “Is this really a trend?”

The best way to know is to apply our advanced trending techniques. Watch for our new book coming out this Summer and then plan to attend the course next March prior to the 2019 Global TapRooT® Summit.

Are you ready for quality root cause analysis of a precursor incident?

Posted: April 17th, 2018 in Accidents, investigation, Pictures, Root Cause Analysis Tips, TapRooT

Many companies use TapRooT® to investigate major accidents. But investigating a major accident is like closing the barn door after the horse has bolted.

What should you be doing? Quality investigations of incidents that could have been major accidents. We call these precursor incidents. They could have been major accidents if something else had gone wrong, another safeguard had failed, or you were “unlucky” that day.

How do you do a quality investigation of a precursor incident? TapRooT® of course! See the Using the Essential TapRooT® Techniques to Investigate Low-to-Medium Risk Incidents book.


Or attend one of our TapRooT® Root Cause Analysis Courses.

Mark Paradies Speaks about Root Cause Analysis at the Gas Processor’s Association

Posted: April 16th, 2018 in Current Events


Attending the GPA Conference in Austin?

Then attend the Safety Committee Meeting on Tuesday (1:30 – 5:30) and hear Mark Paradies talk about common root cause analysis problems and how you can solve them.

Where to Start When Finding Root Causes

Posted: April 11th, 2018 in investigation, Pictures, Root Cause Analysis Tips, TapRooT

I had someone ask me the other day …

“Where do I start when finding root causes?”

To me, the answer was obvious. You need to understand what happened BEFORE you can understand why it happened.

That’s why the TapRooT® System starts by developing a SnapCharT® of what happened.

Here is a simple example.

Someone sprains their ankle while walking to their car in the parking lot.

What is the root cause?

You might think the obvious answer is …

“They didn’t have their eyes on path!”

But you are jumping to conclusions! You don’t know what happened. So start here…


You are starting to develop the story of what happened. You keep working on the story until you have clearly defined Causal Factors …


That’s a lot more information! It isn’t as simple as “eyes on path.”

Now you are ready to start identifying the root causes of each of the four Causal Factors.

So, that’s where you need to start to find root causes!

Scientific Method and Root Cause Analysis

Posted: April 4th, 2018 in Human Performance, investigation, Investigations, Performance Improvement, Root Cause Analysis Tips, TapRooT


I had someone tell me that the ONLY way to do root cause analysis was to use the scientific method. After all, this is the way that all real science is performed.

Being an engineer (rather than a scientist), I had a problem with this statement. After all, I had done or reviewed hundreds (maybe thousands?) of root cause analyses and I had never used the scientific method. Was I wrong? Is the scientific method really the only or best answer?

First, to answer this question, you have to define the scientific method. And that’s the first problem. Some say the scientific method was invented in the 17th century and was the reason that we progressed beyond the dark ages. Others claim that the terminology “scientific method” is a 20th-century invention. But, no matter when you think the scientific method was invented, there are a great variety of methods that call themselves “the scientific method.” (Google “scientific method” and see how many different models you can find. The one presented above is an example.)

So let’s just say the scientific method that the person was insisting was the ONLY way to perform a root cause analysis required the investigator to develop a hypothesis and then gather evidence to either prove or disprove the hypothesis. That’s commonly part of most methods that call themselves the scientific method.

What’s the problem with this hypothesis-testing model? People don’t do it very well. There’s even a scientific term for the problem that people have disproving their hypothesis. It’s called CONFIRMATION BIAS. You can Google the term and read for hours. But the short description of the problem is that when people develop a hypothesis that they believe in, they tend to gather evidence to prove what they believe and disregard evidence that is contrary to their hypothesis. This is a natural human tendency – think of it like breathing. You can tell someone not to breathe, but they will breathe anyway.

What did my friend say about this problem with the scientific method? That it could be overcome by teaching people that they had to disprove all other theories and also look for evidence that disproves their own theory.

The second part of this answer is like telling people not to breathe. But what about the first part of the solution? Could people develop competing theories and then disprove them to prove that there was only one way the accident could have occurred? Probably not.

The problem with developing all possible theories is that your knowledge is limited. And, of course, how long would it take if you did have unlimited knowledge to develop all possible theories and prove or disprove them?

The biggest problem that accident investigators face is limited knowledge.

We used to take a poll at the start of each root cause analysis class that we taught. We asked:

“How many of you have had any type of formal training
in human factors or why people make human errors?”

The answer was always less than 5%.

Then we asked:

“How many of you have been asked to investigate
incidents that included human errors?”

The answer was always close to 100%.

So how many of these investigators could hypothesize all the potential causes for a human error and how would they prove or disprove them?

That’s one simple reason why the scientific method is not the only way, or even a good way, to investigate incidents and accidents.

Need more persuading? Read these articles on the problems with the scientific method:

The End of Theory: The Data Deluge Makes The Scientific Method Obsolete

The Scientific Method is a Myth

What Flaws Exist Within the Scientific Method?

Is the Scientific Method Seriously Flawed?

What’s Wrong with the Scientific Method?

Problems with “The Scientific Method”

That’s just a small handful of the articles out there.

Let me assume that you didn’t read any of the articles. Therefore, I will provide one convincing example of what’s wrong with the scientific method.

Isaac Newton, one of the world’s greatest mathematicians, developed the universal law of gravity. Supposedly he did this using the scientific method. And it worked on apples and planets. The problem is, when atomic and subatomic matter was discovered, the “law” of gravity didn’t work. There were other forces that governed subatomic interactions.

Enter Albert Einstein and quantum physics. A whole new set of laws (or maybe you called them “theories”) that ruled the universe. These theories were proven by the scientific method. But what are we discovering now? Those theories aren’t “right” either. There are things in the universe that don’t behave the way that quantum physics would predict. Einstein was wrong!

So, if two of the smartest people around – Newton and Einstein – used the scientific method to develop answers that were wrong but that most everyone believed … what chance do you and I have to develop the right answer during our next incident investigation?

Now for the good news.

Being an engineer, I didn’t start with the scientific method when developing the TapRooT® Root Cause Analysis System. Instead, I took an engineering approach. But you don’t have to be an engineer (or a human factors expert) to use it to understand what caused an accident and what you can do to stop a future similar accident from happening.

Being an engineer, I had my fair share of classes in science. Physics, math, and chemistry are all part of an engineer’s basic training. But engineers learn to go beyond science to solve problems (and design things) using models that have limitations. A useful model can be properly applied by an engineer to design a building, an electrical transmission network, a smartphone, or a 747 without understanding the limitations of quantum mechanics.

Also, being an engineer, I found that the best college course I ever had for understanding accidents wasn’t an engineering course. It was a course on basic human factors. A course that very few engineers take.

By combining the knowledge of high reliability systems that I gained in the Nuclear Navy with my knowledge of engineering and human factors, I developed a model that could be used by people without engineering and human factors training to understand what happened during an incident, how it happened, why it happened, and how it could be prevented from happening again. We have been refining this model (the TapRooT® System) for about thirty years – making it better and more usable – using the feedback from tens of thousands of users around the world. We have seen it applied in a wide variety of industries to effectively solve equipment and human performance issues to improve safety, quality, production, and equipment reliability. These are real world tests with real world success (see the Success Stories at this link).

So, the next time someone tells you that the ONLY way to investigate an incident is the scientific method, just smile and know that they may have been right in the 17th century, but there is a better way to do it today.

If you don’t know how to use the TapRooT® System to solve problems, perhaps you should attend one of our courses. There is a basic 2-Day Course and an advanced 5-Day Course. See the schedule for public courses HERE. Or CONTACT US about having a course at your site.

Who Invented Operational Excellence?

Posted: March 28th, 2018 in Documents, Performance Improvement, Pictures, Quality

Who Invented Operational Excellence?

Admiral Hyman G. Rickover


As a Navy Nuclear Power trained officer, I experienced the rigors of achieving operational excellence first hand.

Rickover explained that there was a series of principles that helped the Nuclear Navy achieve excellence, but the top three were:

  1. Total Responsibility
  2. Technical Competence
  3. Facing the Facts

Read about these three principles in a series of articles that I wrote:

http://www.taproot.com/archives/54027

 Rickover lived out this quote:


He fought against the lax standards that the Navy practiced and implemented a system of excellence to run the Navy’s nuclear reactors.

You might think that he would be praised and lauded by the Navy for his success. Instead, he had to fight every inch of the way to steer a course true to his principles. And the oldest Admiral ever was fired by the youngest Secretary of the Navy ever. Sometimes that’s how Washington politics works.

Want to read more about Rickover’s life and how he developed his concepts of operational/process excellence? Read his semi-official biography (written by the official Nuclear Navy historian Francis Duncan) Rickover – The Struggle for Excellence. (Picture of the book at the top of the page.)

McD’s in UK Fined £200k for Employee Injured While Directing Traffic

Posted: March 27th, 2018 in Accidents, Current Events


An angry motorist hits a 17-year-old employee who is directing traffic and breaks his knee. Normally, you would think the road-rage driver would be at fault. But a UK court fined McDonald’s £200,000.

Why? It was a repeat incident. Two previous employees had been hurt while directing traffic. And McDonald’s didn’t train the employees how to direct traffic.

What do you think? Would a good root cause analysis of the previous injuries and effective corrective actions have prevented this accident?

Root Cause Analysis Audit Idea

Posted: March 22nd, 2018 in Investigations, Performance Improvement, Pictures, Root Cause Analysis Tips


In the past couple of years has your company had a major accident?

If it did, did you check to see if there were previous smaller incidents that should have been learned from, and if the corrective actions from those incidents should have prevented the major accident?

I don’t think I have ever seen a major accident that didn’t have precursors that could have been learned from to improve performance. The failure to learn and improve is a problem that needs a solution.

In the TapRooT® root cause analysis of a major accident, the failure to fix previous precursor incidents should get you to the root cause of “corrective action NI” if you failed to implement effective corrective actions from the previous investigations.

If this idea seems like a new idea at your facility, here is something that you might try. Go back to your last major accident. Review your database to look for similar precursor incidents. If there aren’t any, you have identified a problem. You aren’t getting good reporting of minor incidents with potential serious consequences.

If you find previous incidents, it’s time for an audit. Review the investigations to determine why the previous corrective actions weren’t effective. This should produce improvements to your root cause analysis processes, training, reviews, …
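
Here is a minimal sketch of that audit idea (Python; the incident records and the crude keyword match are invented for illustration and stand in for a real database query): pull the incidents that look like precursors to the major accident and list them for review.

```python
# Hypothetical incident records from a site database.
incidents = [
    {"id": 101, "description": "Small hydrocarbon leak at pump P-3 flange", "year": 2016},
    {"id": 117, "description": "Pump P-3 seal replaced after repeated vibration alarms", "year": 2017},
    {"id": 142, "description": "Forklift near miss in warehouse", "year": 2017},
]

# Keywords describing the equipment and failure mode involved in the major accident.
major_accident_keywords = {"pump", "p-3", "leak", "seal"}

def looks_like_precursor(record: dict) -> bool:
    """Crude keyword match against the major accident's equipment and failure mode."""
    words = set(record["description"].lower().split())
    return bool(words & major_accident_keywords)

precursors = [r for r in incidents if looks_like_precursor(r)]
if not precursors:
    print("No similar precursor incidents found - check whether minor incidents are being reported.")
for r in precursors:
    print(f"Audit investigation #{r['id']} ({r['year']}): why didn't its corrective actions work?")
```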

Don’t wait for the next big accident to improve your processes. You have all the data that you need to start improvements today!

What does bad root cause analysis cost?

Posted: March 7th, 2018 in Accidents, Performance Improvement, Pictures, Root Cause Analysis Tips, Training


Have you ever thought about this question?

An obvious answer is $$$BILLIONS.

Let’s look at one example.

The BP Texas City refinery explosion was extensively investigated and the root cause analysis of BP was found to be wanting. But BP didn’t learn. They didn’t implement advanced root cause analysis and apply it across all their business units. They didn’t learn from smaller incidents in the offshore exploration organization. They didn’t prevent the BP Deepwater Horizon accident. What did the Deepwater Horizon accident cost BP? The last estimate I saw was $22 billion. The costs have probably grown since then.

I would argue that ALL major accidents are at least partially caused by bad root cause analysis and not learning from past experience.

EVERY industrial fatality could be prevented if we learned from smaller precursor incidents.

EVERY hospital sentinel event could be prevented (and that’s estimated at 200,000 fatalities per year in the US alone) if hospitals applied advanced root cause analysis and learned from patient safety incidents.

Why don’t companies and managers do better root cause analysis and develop effective fixes? A false sense of saving time and effort. They don’t want to invest in improvement until something really bad happens. They kid themselves that really bad things won’t happen because they haven’t happened yet. They can’t see that investing in the best root cause analysis training leads to excellent performance and saves money.

Yet that is what we’ve proven time and again when clients have adopted advanced root cause analysis and paid attention to their performance improvement efforts.

The cost of the best root cause analysis training and performance improvement efforts is a drop in the bucket compared to any major accident. It is even small compared to the cost of repeat minor and medium-risk incidents.

I’m not promising something for nothing. Excellent performance isn’t free. It takes work to learn from incidents, implement effective fixes, and stop major accidents. Then, when you stop having major accidents, you can be lulled into a false sense of security that causes you to cut back your efforts to achieve excellence.

If you want to learn advanced root cause analysis with guaranteed training, attend one of our upcoming public TapRooT® Root Cause Analysis Training courses.

Here is the course guarantee:

Attend the course. Go back to work and use what you have learned to analyze accidents,
incidents, near-misses, equipment failures, operating issues, or quality problems.
If you don’t find root causes that you previously would have overlooked
and if you and your management don’t agree that the corrective actions that you
recommend are much more effective, just return your course materials/software
and we will refund the entire course fee.

Don’t be “penny wise and pound foolish.” Learn about advanced root cause analysis and apply it to save lives, prevent environmental damage, improve equipment reliability, and achieve operating excellence.

Is Having the Highest Number of Serious Incidents Good or Bad?

Posted: March 6th, 2018 in Accidents, Current Events, Medical/Healthcare, Pictures


I read an interesting article about two hospitals in the UK with the highest number of serious incidents.

On the good side, you want people to report serious incidents. Healthcare has a long history of under-reporting serious incidents (sentinel events).

On the good side, administrators say they do a root cause analysis on these incidents.

On the bad side, the hospitals continue to have these incidents. Shouldn’t the root cause analysis FIX the problems, so that the serious incidents become steadily fewer and less severe?

Maybe they should be applying advanced root cause analysis?

Highlights from the 2018 Global TapRooT® Summit

Posted: March 5th, 2018 in Performance Improvement, Pictures, Summit, Video


Coming up with a highlight reel from the 2018 Global TapRooT® Summit is almost impossible. I always think that the Summit just can’t get any better and then we outdo ourselves planning the next one. Here are some highlights followed by the top six items attendees shared that they needed to do better at their facilities.

First, a video to share the experience…

By the way, the next Global TapRooT® Summit is scheduled for March 11-15, 2019, in Houston (Montgomery, TX) at the La Torretta Lake Resort & Spa (picture below).


Now for my impressions of the highlights …

First, the Keynote Speakers were outstanding.

We started with Dr. Carol Gunn who gave an inspiring talk about medical errors and how to encourage error reporting and effective investigations. Carol is a TapRooT® User and medical doctor … she knows what she is talking about.

Next, we had an UNBELIEVABLE talk by Boaz Rauchwerger who told us all to take a positive approach to improvement.


The final keynote on Wednesday was Inky Johnson. How can I explain how he inspired us? There was a long line of people who just came up to thank Inky and get their picture taken with him. If you don’t know Inky’s story, watch it below for motivation to accomplish more in life.

 

After Inky’s keynote, Carl Dixon entertained us at the Summit Reception. Here I am singing Proud Mary with him…

 

On Thursday, I was the opening keynote and concluded with a TapRooT® Implementation Gap Analysis. People shared where they needed to improve their TapRooT® implementation. What were the top 6 items they needed to do better?

  1. Use advanced root cause analysis (TapRooT®) for both reactive and proactive investigations.
  2. Use an investigation rewards program more effectively.
  3. Guide their improvement programs through management’s use of performance measures and advanced trending techniques.
  4. Proactive improvement that drives improvement success (tie with 5).
  5. Develop a leadership succession plan.
  6. Communicate improvement accomplishments successfully.

These items will help us plan sessions for next year’s Summit.


Vincent Phipps closed out the day with his discussion of the four personality types and how to use them to communicate more effectively.

On Friday, we started with the session that helped attendees develop plans to fill their program gaps (building on the gap analysis performed on Thursday).


Then, I (Mark Paradies) interviewed Mike Williams about his experiences when the Deepwater Horizon experienced a blowout, explosion(s), and fire. Wow! What an experience. People were sitting on the edge of their seats as Mike answered my questions and all the questions from the Summit participants. We actually ran over the finish time by 15 minutes as people asked interesting questions. The lessons learned from this one session about emergency response, investigations, and safety were … UNBELIEVABLE! I learned several things about the accident that I didn’t know from the various reports (CSB, Presidential Commission, BP, or Coast Guard) that added to my understanding of the Causal Factors and root causes. There was also an important lesson for investigators about empathy and PTSD after a major accident.

And that’s just a summary of the Keynote Speakers’ impact. There were also some great Best Practice Sharing sessions and speakers. My favorite was the TapRooT® Users Share Best Practices Session (soon you will be seeing videos from this session sharing some of the best practices).

Here are what some TapRooT® Users had to say about their 2018 Global TapRooT® Summit experience…

 

What a Summit! Hope to see you in Houston in 2019!

Big Fines for Safety Incidents in the UK

Posted: February 27th, 2018 in Accidents, Current Events, Pictures


£1.1m fine for ejection seat manufacturer after Red Arrows pilot killed. Click here.

£120k fine after employee injured by circular saw. Click here.

£1.4m fine for Tata Steel after crane crushes worker. Click here.

If you have facilities in the UK, are you doing all you can to avoid HSE issues? You should consider improving your root cause analysis to improve your efforts to stop accidents. Learn about advanced root cause analysis by CLICKING HERE. Then attend one of these public courses in the UK and Europe.

Great Courses! Getting Ready for the 2018 Global TapRooT® Summit…

Posted: February 26th, 2018 in Pictures, Summit, Training


There are over a hundred people attending various TapRooT® Courses in Knoxville on Monday and Tuesday before the 2018 Global TapRooT® Summit that starts on Wednesday.


There are people taking the 2-Day TapRooT® Root Cause Analysis Training …


and several advanced courses teaching trending, software, interviewing, human factors, safety culture, Equifactor®, and risk management.


And the set-up is progressing for Wednesday’s 2018 Global TapRooT® Summit General Session kick-off. Looking forward to seeing you there!

Comments from a TapRooT® Course Participant

Posted: February 23rd, 2018 in Courses, TapRooT

Malcolm Gresham, one of our instructors from Australia, sent me this video of a long-time TapRooT® User who was attending a course for refresher training. Click the link below to watch a .mov (QuickTime) video.

CourseCritique.mov

Thanks for your comments, Michael, and you have a great memory! Keep reading the definition! Hope to see you at the Global TapRooT® Summit one of these years!

Nuclear Plant Fined $145,000 for “Gun-Decked Logs”

Posted: February 21st, 2018 in Current Events, Human Performance, Performance Improvement, Pictures

When I was in the Navy, people called it “gun-decking the logs.”

In the Navy this means that you falsify your record keeping … usually by just copying the numbers from the previous hour (maybe with slight changes) without making the rounds and taking the actual measurements. And if you were caught, you were probably going to Captain’s Mast (disciplinary hearing).

The term “gun-decking” has something to do with the “false” gun deck that was built into British sailing ships of war to make them look like they had more guns. Sometimes midshipmen would falsify their navigation training calculations by using dead reckoning to calculate their position rather than using the Sun and the stars. This might have been called “gun-decking” because the gun deck is where they turned their homework over to the ship’s navigator to be reviewed.


What happened at the Nuke Plant? A Nuclear Regulatory Commission inspector found that 13 operators had gun-decked their logs. Here’s a quote from the article describing the incident:

“An NRC investigation, completed August 2017, found that on multiple occasions during the three-month period, at least 13 system operators failed to complete their rounds as required by plant procedures, but entered data into an electronic log indicating they had completed equipment status checks and area inspections,” the NRC said in a statement.

What was the corrective action? The article says:

“The plant operator has already undertaken several corrective actions, the NRC said, including training for employees, changes in the inspection procedures and disciplinary measures for some staff.”

Hmmm … training, procedures, and discipline. That’s the standard three corrective actions. (“Round up the usual suspects!”) Even problems that seem to be HR issues can benefit from advanced root cause analysis. Is this a Standards, Policy, and Administrative Controls Not Used issue? Is there a root cause under that Near Root Cause that needs to be fixed (for example, Enforcement Needs Improvement)? Or is discipline the right answer? It would be interesting to know all the facts.

Want to learn to apply advanced root cause analysis to solve these kinds of problems? Attend one of our 5-Day TapRooT® Advanced Root Cause Analysis Courses. See the upcoming public courses by CLICKING HERE. Or CLICK HERE to contact us about having a course at your site.
