Category: Human Performance
If you are a TapRooT® User, you may think that the TapRooT® Root Cause Analysis System exists to help people find root causes. But there is more to it than that. TapRooT® exists to:
- Save lives
- Prevent injuries
- Improve product/service quality
- Improve equipment reliability
- Make work easier and more productive
- Stop sentinel events
- Stop the cycle of blaming people for system-caused errors
And we are accomplishing our mission around the world.
Of course, there is still a lot to do. If you would like to learn more about using TapRooT® Root Cause Analysis to help your company accomplish these things, get more information about TapRooT® HERE or attend one of our courses (get info HERE).
If you would like to learn how others have used TapRooT® to meet the objectives laid out above, see the Success Stories at:
I’ve heard many high-level managers complain that they see the same problems happen over and over again. They just can’t get people to find and fix the problems’ root causes. Why does this happen, and what can management do to overcome these issues? Read on to find out.
1. BLAME IS THE NUMBER ONE REASON FOR BAD ROOT CAUSE ANALYSIS
Why? Because people who are worried about blame don’t fully cooperate with an investigation. They don’t admit their involvement. They hold back critical information. Often this leads to mystery accidents. No one knows who was involved, what happened, or why it happened.
As Bart Simpson says:
“I didn’t do it.”
“Nobody saw me do it.”
“You can’t prove anything.”
Blame is so common that people take it for granted.
Somebody makes a mistake and what do we do? Discipline them.
If they are a contractor, we fire them. No questions asked.
And if the mistake was made by senior management? Sorry … that’s not how blame works. Blame always flows downhill. At a certain senior level, management becomes blessed. Only truly horrific accidents like the Deepwater Horizon or Bhopal get senior managers fired or jailed. Then again, maybe even those accidents aren’t bad enough to bring discipline to senior management.
Think about the biggest economic collapse in recent history – the housing collapse of 2008. What senior banker went to jail?
But be an operator who pushes the wrong button, or a mechanic who doesn’t lock out a breaker while working on equipment, and you may be fired or have the feds come after you to put you in jail.
Talk to Kurt Mix. He was a BP engineer who deleted a few text messages from his personal cell phone AFTER he had turned it over to the feds. He was among the few people connected with the Deepwater Horizon who faced criminal charges. Or ask the two BP company men who represented BP on the Deepwater Horizon and faced years of criminal prosecution.
How do you stop blame and get people to cooperate with investigations? Here are two best practices.
A. Start Small …
If you are investigating near-misses that could have become major accidents and you don’t discipline people who spill the beans, people will learn to cooperate. This is especially true if you reward people for participating and develop effective fixes that make the work easier and their jobs less hazardous.
Small accidents just don’t have the same cloud of blame hanging over them, so if you start small, you have a better chance of getting people to cooperate even if a blame culture has already been established.
B. Use a SnapCharT® to facilitate your investigation and report to management.
We’ve learned that using a SnapCharT® to facilitate an investigation and to show the results to management reduces the tendency to look for blame. The SnapCharT® focuses on what happened and “who did it” becomes less important.
Often, the SnapCharT® shows that there were several things that could have prevented the accident and that no one person was strictly to blame.
What is a SnapCharT®? Attend any TapRooT® Training and you will learn how to use them. See:
2. FIRST ASK WHAT NOT WHY
Ever see someone use 5-Whys to find root causes? They start with what they think is the problem and then ask “Why?” five times. Unfortunately, this easy method often leads investigators astray.
Why? Because they should have started by asking what before they asked why.
Many investigators start asking why before they understand what happened. This causes them to jump to conclusions. They don’t gather critical evidence that may lead them to the real root causes of the problem. And they tend to focus on a single Causal Factor and miss several others that also contributed to the problem.
How do you get people to ask what instead of why?
Once again, the SnapCharT® is the best tool for getting investigators focused on what happened, finding the incident’s details, and identifying all the Causal Factors plus the information about each Causal Factor that the investigator needs to find each problem’s root causes.
3. YOU MUST GO BEYOND YOUR CURRENT KNOWLEDGE
Many investigators start their investigation with a pretty good idea of the root causes they are looking for. They already know the answers. All they have to do is find the evidence that supports their hypothesis.
What happens when an investigator starts an investigation by jumping to conclusions?
They ignore evidence that is counter to their hypothesis. This problem is called confirmation bias, and it has been demonstrated in many scientific studies.
But there is an even bigger problem for investigators who think they know the answer. They often don’t have the training in human factors and equipment reliability to recognize the real root causes of each of the Causal Factors. Therefore, they only look for the root causes they know about and don’t get beyond their current knowledge.
What can you do to help investigators look beyond their current knowledge and avoid confirmation bias?
Have them use the SnapCharT® and the TapRooT® Root Cause Tree® Diagram when finding root causes. You will be amazed at the root causes your investigators discover that they previously would have overlooked.
How can your investigators learn to use the Root Cause Tree® Diagram? Once again, send them to TapRooT® Training.
The TapRooT® Root Cause Analysis System can help your investigators overcome the top 3 reasons for bad root cause analysis. And that’s not all. There are many other advantages for management and investigators (and employees) when people use TapRooT® to solve problems.
If you haven’t tried TapRooT® to solve problems, you don’t know what you are missing.
If your organization faces:
- Quality Issues
- Safety Incidents
- Repeat Equipment Failures
- Sentinel Events
- Environmental Incidents
- Cost Overruns
- Missed Schedules
- Plant Downtime
You need to apply the best root cause analysis system: TapRooT®.
Learn more at:
And find the dates and locations for our public TapRooT® Training at:
Here is a video that discusses some root cause tips, common problems with root cause analysis, and how TapRooT® can help. I hope you enjoy!
Like what you see? Why not join us at the next course? You can see the schedule and enroll HERE
This incident notice is from the UK Rail Accident Investigation Branch (RAIB) about an overspeed incident at Fletton Junction, Peterborough, on 11 September 2015.
At around 17:11 hrs on 11 September 2015, the 14:25 hrs Virgin Trains East Coast passenger train service from Newcastle to London King’s Cross passed through Fletton Junction, near Peterborough, at 51 mph (82 km/h), around twice the permitted speed of 25 mph (40 km/h). This caused the carriages to lurch sideways, resulting in minor injuries to three members of staff and one passenger.
It is likely that the train driver had forgotten about the presence of the speed restriction because he was distracted and fatigued due to issues related to his family. Lineside signs and in-cab warnings may have contributed to him not responding appropriately as he approached the speed restriction and engineering controls did not prevent the overspeeding. Neither Virgin Trains East Coast, nor the driver, had realised that family-related distraction and fatigue were likely to be affecting the safety of his driving. Virgin Trains East Coast route risk assessment had not recognised the overspeeding risks particular to Fletton Junction and Network Rail had not identified that a speed limit sign at the start of the speed restriction was smaller than required by its standards.
The incident could have had more serious consequences if the train had derailed or overturned. The risk of this was present because the track layout was designed for a maximum speed of 27 mph (43 km/h).
As a consequence of this investigation, RAIB has made five recommendations. Two addressed to Virgin Trains East Coast relate to enhancing the management of safety critical staff with problems related to their home life, and considering such issues during the investigation of unsafe events.
A recommendation addressed to Virgin Trains East Coast and an associated recommendation addressed to Network Rail relate to assessing and mitigating risks at speed restrictions.
A further recommendation to Network Rail relates to replacement of operational signage when this is non-compliant with relevant standards.
The RAIB report also includes learning points relating to managing personal problems that could affect the safety performance of drivers. A further learning point, arising because of a delay in reporting the incident, stresses the importance of drivers promptly reporting incidents which could have caused track damage. A final learning point encourages a full understanding of the effectiveness of safety mitigation provided by infrastructure and signalling equipment.
For more information see:
This post shares a “Call Back” report from the Aviation Safety Reporting System that is applicable far beyond aviation.
In this case, the pilot was fatigued and just wanted to “get home.” He had a “finish the mission” focus that could have cost him his life. Here’s an excerpt:
I saw nothing of the runway environment…. I had made no mental accommodation to do a missed approach as I just knew that my skills would allow me to land as they had so many times in past years. The only conscious control input that I can recall is leveling at the MDA [Rather than continuing to the DA? –Ed.] while continuing to focus outside the cockpit for the runway environment. It just had to be there! I do not consciously remember looking at the flight instruments as I began…an uncontrolled, unconscious 90-degree turn to the left, still looking for the runway environment.
To read about this near-miss and the lessons learned, see:
A lot of bad days start with bad decisions. For example, when you decide to take a selfie with a 4-foot rattlesnake… (Read story.)
The following is a video of a fatal accident. The vehicle drove around a tow truck sent to block the underpass and past a worker waving his arms to stop her. She drove into water about 17 feet deep. DON’T watch the video if it will upset you. For others, hopefully you can use this to teach others to avoid standing water during flooding.
A colleague at a recent Rail Safety conference pointed me to this article on how to change people’s behavior on rail lines in London. How do we influence people to:
– put trash in trash bins
– be courteous while playing music
– keep feet off the train seats
They’ve tried signs and warnings. I think we can all agree those have limited effect. There are audible reminders. The escalators in Atlanta Airport talk to you continuously on the way down to the trains.
Here are some other (gentler) ways the London Underground is trying to influence passengers to do what is required.
Normalization of Excellence – The Rickover Legacy – 18 Other Elements of Rickover’s Approach to Process Safety
March 31st, 2016 by Mark Paradies
The previous three articles discussed Rickover’s “key elements” for achieving safety in the Navy’s nuclear program.
In addition to these three keys that Rickover testified to Congress about, he had 18 other elements that he said were also indispensable. I won’t describe them in detail, but I will list them here:
- Conservatism of Design
- Robust Systems (design to avoid accidents and emergency system activation)
- Redundancy of Equipment (to avoid shutdowns and emergency actions)
- Inherently Stable Plant
- Full Testing of Plant (prior to operation)
- Detailed Preventive/Predictive Maintenance Schedules Strictly Adhered To
- Detailed Operating Procedures Developed by Operators, Improved with Experience, and Approved by Technical Experts
- Formal Design Documentation and Management of Change
- Strict Control of Vendor Provided Equipment (QA Inspections)
- Formal Reporting of Incidents and Sharing of Operational Experience
- Frequent Detailed Audits/Inspections by Independent, Highly Trained/Experienced Personnel that Report to Top Management
- Independent Safety Review by Government Authorities
- Personal Selection of Leaders (looking for exceptional technical knowledge and good judgment)
- One Year of Specialized Technical Training/Hands-On Experience Prior to 1st Assignment
- Advanced Training for Higher Leadership Positions
- Extensive Continuing Training and Requalification for All Personnel
- Strict Enforcement of Standards & Disqualification for Violations
- Frequent Internal Self-Assessments
Would you like to review what Rickover had to say about them? See his testimony here:
Now, after this description of the excellence of Rickover’s program, you might think there was nothing to be improved. However, I think the program had three key weaknesses. They are:
- Blame Orientation (Lack of Praise)
- Fatigue
- Need for Advanced Root Cause Analysis
Let me talk about each briefly.
BLAME ORIENTATION

The dark side of a high degree of responsibility was a tendency to blame the individual when something went wrong. Also, success wasn’t celebrated; it was expected. The result was burnout and attitude problems. This led to a fairly high turnover rate among the junior leaders and enlisted sailors.
FATIGUE

Want to work long hours? Join the Nuclear Navy! Eighteen-hour days, seven days a week, were normal at sea. In port, three-section duty (a 24-hour day every third day) was normal. This meant that you NEVER got a full weekend. Many errors were made due to fatigue. I remember a sailor who was almost killed performing electrical work because of actions that just didn’t make sense. He had no explanation for his errors (there were multiple), and he knew better because he was the person who trained everyone else. But he had been working over 45 days straight with a minimum of 12 hours per day. Was he fatigued? It never showed up in the incident investigation.
ADVANCED ROOT CAUSE ANALYSIS
Root Cause Analysis in the Nuclear Navy is basic. Assign smart people and they will find good “permanent fixes” to problems. And this works … sometimes. The problem? The Nuke Navy doesn’t train sailors and officers how to investigate human errors. That’s where advanced root cause analysis comes in. TapRooT® has an expert system that helps people find the root causes of human error and produce fixes that stop the problems. Whenever I hire a Navy Nuke to work at System Improvements, they always tell me they already know about root cause analysis because they did that “on the boat.” But when they take one of our courses, they realize how much they still had to learn.
Read Part 7: Statement of Admiral Rickover in front of the Subcommittee on Energy Research and Production of the Committee on Science and Technology of the US House of Representatives – May 24, 1979
If you would like to learn more about advanced root cause analysis, see our course offerings:
And sign up for our weekly newsletter:
I read a recent study that stated only 17.4% of ambulatory care nurses (surveyed) comply with all 9 precautions for infection control. First, for those who didn’t click above to see how the data was collected, I will let you know that this is SELF-REPORTED data (glean what you will from that tidbit of information). I would bet that compliance is actually much lower, which is even greater cause for concern.
To be fair, we are evaluating 9 different precautions. I will say that any statement that includes “wash hands” has low compliance … just walk into any men’s restroom to see that. But what drives this? Let’s examine the items, because it must be impossible to complete all of these, right? Here is the list:
- Provide care considering all patients as potentially contagious
- Wash hands after removing gloves
- Avoid placing foreign objects on my hands
- Wear gloves when exposure of my hands to bodily fluids is anticipated
- Avoid needle recapping
- Avoid disassembling a used needle from a syringe
- Use a face mask when anticipating exposure to air-transmitted pathogens
- Wash hands after providing care
- Discard used sharp materials into sharp containers
As a non-healthcare professional, I don’t see a whole lot to disagree with. I mean, you are working with sick people: washing hands, wearing gloves, watching out for bodily fluids … can’t argue with that. So why do we have such low compliance? And remember, this is “infection control,” so it’s about keeping healthcare professionals, other patients, and staff safe.
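To get a feel for why full compliance with all nine precautions is so rare, consider how per-precaution compliance multiplies down across nine items. Here is a quick sketch; the assumption that the precautions are followed independently with equal probability is mine, not the study’s:

```python
# If each of the 9 precautions were followed independently with the same
# probability p, the chance of following all 9 would be p**9.
# Working backward from the reported 17.4% full compliance:
full_compliance = 0.174
n_precautions = 9

per_item = full_compliance ** (1 / n_precautions)
print(f"Implied per-precaution compliance: {per_item:.1%}")  # about 82%

# Even 95% per-item compliance leaves all-nine compliance well short of 100%:
print(f"Full compliance at 95% per item: {0.95 ** n_precautions:.1%}")  # about 63%
```

In other words, roughly 82% per-item compliance is enough to drop all-nine compliance to 17.4%, so a low headline number doesn’t necessarily mean nurses are ignoring most of the precautions most of the time.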
Well, on our Root Cause Tree® we have a root cause under Management System->SPAC Not Used named “No way to implement.” I bring this up simply to examine what we request in this list versus the very, very dynamic environment in the hospital.
Can it be reasonable (except for the human self-preservation gene) to expect all of these to happen when working to save a coding patient? Or in a situation when an ER has very high census with multiple traumas (a situation I witnessed myself yesterday)?
I guess the answer truly is no. We are providing a SPAC that, as written, is reasonable but can be difficult to implement at certain times. Thus the very honest self-reported numbers.
Interestingly enough, I know the TapRooT®ers out there are all saying, “Hey dude, this is more of an Enforcement NI thing” (you know you just did that, don’t act like you didn’t), but is it really Enforcement NI? I don’t believe in any way, shape, or form that you could enact an enforcement mechanism for all nine of these things, all at the same time, and still give healthcare professionals the ability to perform timely patient care. The process would be so burdensome that it would crumble under its own weight, and patient care would suffer.
So is 17.4% compliance enough? Probably not, but let’s also remember what we are asking people to do for that compliance. The number may not be acceptable, or palatable, but it is what we can expect based on what is asked of these courageous folks working in this very difficult environment.
What do you think? Leave your comments below.
If this topic interests you, check out our medical track at the 2016 Global TapRooT® Summit. Breakout sessions include:
- 7 Deadly Sins of Human Performance
- TapRooT® Changes for the Medical Community
- Human Error Causes of Quality Problems
- Writing TapRooT® Driven Preventative & Corrective Actions Workshop
- Anatomy of a Medical Investigation & more!
GO HERE to download a .pdf brochure!
“Easier than making a mistake” … now that is good Human Engineering!
While listening to a radio commercial recently, I heard the announcer say, “Easier than making a mistake!” As a TapRooT® Root Cause Instructor with a quality and human engineering (Human Factors) background, all I could think about was mistake-proofing: Poka-yoke.
The concept was formalized, and the term adopted, by Shigeo Shingo as part of the Toyota Production System. It was originally described as baka-yoke, but as this means “fool-proofing” (or “idiot-proofing”) the name was changed to the milder poka-yoke. (From Wikipedia)
Now, I did not learn about Dr. Shigeo Shingo during my Human Factors studies, even though a large part of the training dealt 100% with design and usability, from products to controls to graphical user interfaces. On the flip side, Human Factors and usability were rarely discussed during my Lean Six Sigma certification either, even though Poka-yoke was covered.
Why are two major interacting topics such as Human Factors and Poka-yoke kept in isolation, depending on where and what you study? Simple: shared best practices and industry secrets are not always the norm.
Where can you learn about both topics? In San Antonio, Texas during our TapRooT® Summit Week August 1-5.
In the pre-summit 2-Day TapRooT® Quality Process Improvement Facilitator Course, we cover the error of writing weak preventive or corrective actions that are not based on the actual root causes found, and the cost of failing to understand and optimize mistake-proofing, both of which affect your success in continuous process improvement.
For those that need a deeper understanding of why mistake-proofing should be considered, you should look into signing up for the 2-Day Understanding and Stopping Human Error Course.
If you read last week’s article about stopping the normalization of deviation with the normalization of excellence, you are ready to start learning to apply Rickover’s philosophies of excellence to achieve amazing performance.
In his testimony to Congress, he starts out explaining the nuclear program and the success that had been achieved to that point. He then explains that there is no simple formula for this success. Rather, what we now recognize as a management system is an “integrated whole of many factors.” He emphasizes that these factors cannot be used individually; they must all be used together. He says, “Each element depends on all the other elements.”
So before I start explaining the individual elements, heed Rickover’s advice:
“The problems you face cannot be solved by specifying compliance with one or two simple procedures.
Reactor safety requires adherence to a total concept wherein all elements are recognized
as important and each is constantly reinforced.”
If you aren’t in the nuclear industry, you can replace the words “reactor safety” with “process safety” or maybe even “patient safety” to apply Rickover’s philosophies to your industry.
The first three elements that Rickover explains are:
- Technical Competence
- Facing the Facts
This really is the core of Rickover’s management philosophy and I will explain each in detail.
Rickover believed that to manage a high risk enterprise (nuclear power plant, refinery, offshore drilling platform, or other high risk ventures) you have to fundamentally understand the technical aspects of the job. This was NOT an overview of how things worked. It was a detailed understanding of the science, chemistry, physics, and engineering behind the processes.
The requirement for technical knowledge didn’t stop with the operations manager or plant manager. It went all the way up to the CEO/President level. The higher you were on the org chart, the better your technical knowledge was supposed to be.
“At Naval Reactors, I take individuals who are good engineers and make them into managers.
They do not manage by gimmicks but rather by knowledge, logic, common sense, and hard work.”
All the managers (officers) in the Naval Nuclear Power Program went through a rigorous screening process. First, they were selected from the top portions of good engineering programs at universities across the US. Non-engineering majors were also considered if they had excellent grades in physics, calculus, and chemistry. Everyone selected then went to Naval Reactors headquarters, where they took a technical test to evaluate their abilities. Tough engineering, math, chemistry, and physics questions were asked on a non-multiple-choice test where the work to reach the answer had to be shown.
The next day the candidates were put through several technical interviews by high level Department Heads at Naval Reactors. The candidates were asked to solve tough real-life scenarios and apply their technical skills to real-world problems.
Finally, each candidate had the now-famous Admiral Rickover interview. Rickover reviewed the candidate’s academic performance, test results, and interview performance, and then asked some of his famous style of questions to evaluate how the candidate reacted under pressure.
My test and interviews went well enough until my Rickover interview. Before the interview, you sat in a room listening to a continuous lecture on what you should and should not do when in the presence of the Admiral. You were informed that you would be accompanied into the room by a senior officer who would sit directly behind you. That you should not address the Admiral until he spoke to you (he was a busy man who didn’t need to be interrupted). That you should take your seat in the chair in front of his desk and wait for him to address you. That when he asked you questions, you should answer directly, and that “No excuse, sir!” was not an answer. If he asked you a question, he wanted to know the answer … not an excuse. That a direct answer was “Yes Sir!” or “No Sir!” If he asked you if you were married, “Yes Sir” would be a good answer, but that didn’t mean he wanted to hear about your sex life.
Before too long, my name was called and I departed with my escort for the Admiral’s office. When I entered his office, I was shocked. Spartan would be a generous description of the accommodations. The old furniture was probably WWII surplus. Rickover’s desk was piled with neatly stacked folders full of paper, and he was working on something with his head down.
He was a tiny, wiry-looking old man. (He was probably in his late seventies, and I was in my senior year of college, so to me he looked ancient.)
There was an old-looking wooden chair directly in front of his desk. I sat down and immediately noticed that one of the legs was shorter than the rest. The chair naturally tipped back and forth. You could either lean forward and hold the chair tipped forward, or lean back and hold it tipped back. I leaned back, trying to maintain a straight posture.
I watched Rickover as he worked. He was busy reviewing paperwork and occasionally signing something as he moved files from one stack to another. Finally he stopped and took a file from a different stack and started looking at it. I thought, “That’s my file.”
A minute or two later he looked up at me and said:
“Midshipman Paradies, I see here you got a lot of Cs in your studies at the University of Illinois,
can you tell me why you did so poorly?”
My first thought was … “I’m sure glad he didn’t ask me about that D or E.” I certainly didn’t want to mention drinking beer and playing football and basketball, so I responded:
“Well Admiral, Electrical Engineering at the University of Illinois is a difficult curriculum and that’s all the better I could do.”
I didn’t know that this was one of his standard questions, and he was looking for you to make excuses. I also didn’t know that Rickover had a MS in Electrical Engineering and that he thought it was a tough curriculum.
He said, “OK.”
He closed the file and looked me in the eyes and asked:
“Midshipman Paradies, are you married?”
That was one of the questions that they warned us to answer directly. I said, “No Sir.”
He asked, “Are you engaged?” I said, “Yes Sir.”
He continued to look me directly in the eyes (a very penetrating stare) and asked:
“Has your fiancé ever told you that you are good looking?”
That question caught me totally off guard. Here was this shrunken, bent-over old admiral asking me about being good looking … where was he going with this?
I answered, “Yes Sir.”
“What do you think she meant?”
I was at a total loss. What did she mean? Who knows. I certainly didn’t want to say that I didn’t know. So I said,
“Well, I guess Admiral that she liked the way I looked.”
He said, “No Midshipman Paradies, you are wrong.”
I then gave my best answer of the day. I said, “Yes Sir.”
“What she meant was that she wanted to marry you.
When you go back, will you ask your fiancé what she meant and
send me a letter and tell me what she says?”
I said, “Yes Sir.”
He said, “Get out of my office,” and pointed toward the door.
That was the end of my interview. I had passed. And I did go back and ask my fiancé what she meant and wrote the Admiral and told him what she said.
You might think that writing the letter wasn’t important. But it was. The Admiral’s staff kept track of every letter that he was owed. When I returned for my Engineer’s Exam and checked in with my orders, the woman behind the desk asked, “Do you have any outstanding correspondence with the Admiral?” I said, “No.” She looked in a folder and said, “That’s correct, you sent the letter that you owed the Admiral in 1978.”
My interview was rather straightforward compared to the stories I’ve heard about other Midshipmen. Perhaps the favorite one I heard was from a friend of mine we’ll call Midshipman F.
Midshipman F was a History major. He had taken calculus, chemistry, physics, and other technical subjects and had done quite well. Rickover asked him: if he wanted to be in the Naval Nuclear Program, why didn’t he get a technical degree? He responded that to understand the world, history was important.
Rickover then started to tell him that he had wasted his time with history classes. Rickover bet that Midshipman F didn’t know anything about history and asked him questions about history (which were “current events” to Rickover). After listening to Midshipman F answer some history questions, Rickover told him that he was “stupid,” didn’t know anything about history, and should go stand in the closet.
Midshipman F went over to the closet and opened the door, but there was already someone in the closet who looked like a senior officer. He stepped in, shut the door, and they both stood there in the dark and didn’t say a word.
After what Midshipman F said seemed like forever, his escort came over, opened the door, and told Midshipman F that the Admiral would like to talk to him again. They went back to arguing over history, and Midshipman F was kicked out of the Admiral’s office twice to go sit in the “penalty box” (another very small room with a chair where you would be sent when the Admiral wanted to make you cool your heels). Midshipman F never gave up his argument about the importance of history and was eventually allowed into the Admiral’s program.
But whenever F would tell the story, he ended it with:
“I’ve always wondered whatever happened to the other guy who was in the closet.”
When you were accepted into the Nuclear Navy, you had to complete a year of extremely difficult technical training before you reported to your first ship. The competition was tough. Of the 100 people in my class, many had Master’s degrees in engineering. One guy had a “photographic memory.” He could remember everything that was written on the board and everything the instructor said, verbatim. Not only could he do that, but he did it while doing the homework from the previous lecture. I had the third lowest GPA of anyone in the class and was immediately assigned to “remedial study.”
We had 7 hours of class a day with an hour off for lunch. I used my lunch time to study, and usually put in an additional 5 hours each night and another 12 hours on the weekend. I had to keep study logs recording how I applied my time in 5-minute intervals. I did well, graduating in the top 10 students in the class. Others did less well … 10 students failed out in the first 6 months. After 6 months, you were assigned to a nuclear prototype plant (an actual naval reactor built ashore to test the design) and went through advanced classes and qualification to be an engineering officer of the watch (EOOW). For this qualification, you worked shift work with mandatory 12-hour days of watch standing, studying, and “check-outs” from qualified personnel. Again, I did well and was the second officer in my class to qualify at the prototype, 5 weeks before the end of the six-month tour. One individual failed out of our prototype training (failed to qualify in the six-month time span).
Why is it important to know about this pre-ship education? Because it gives someone who did not go through the program some idea of the technical knowledge that Rickover expected before anyone was allowed to go to sea and qualify to run one of “his” reactors.
And this was just the start of Technical Competency.
Once at sea there were never-ending qualifications and continuing training. Drills. And annual “ORSE Board” inspections with level of knowledge exams and interviews.
An officer had to pass another level of Technical Competency called the Engineer’s Exam. To prepare for this exam, the officer was supposed to study in his “spare time” and learn everything there was to know about the design and technical specifications of the reactor plant and systems on his ship. Any topic from the start of Nuclear Power School to current operating problems to any potential equipment failure or system on his ship was fair game for the 8 hour test (with 30 minutes off for lunch) and three technical interviews. You couldn’t get less than a 3.0 score (of 4.0 total) on any section of the exam and couldn’t get less than a 3.2 of 4.0 total on the test. All the questions were essays or engineering calculations. And you had to write fast to complete the exam. During the interview portion of the exam on the next day, any of the interviewers could flunk you if they didn’t like the answers to their questions. Once again, at the end of the process, you had an interview with the Admiral. If you passed, you were then allowed to be assigned as the Engineering Officer for one of Rickover’s ships. People did fail, and their career in the Nuclear Navy was over.
Finally, before you were allowed to become a Commanding Officer on one of Rickover’s ships, you had to qualify for command at sea and then go through “Charm School.” Charm School was an assignment to Rickover’s staff where you studied advanced topics and went through a series of brutal interviews with Rickover and his staff members. It was probably someone going through Charm School who was standing in Rickover’s closet when Midshipman F opened the door. Successfully completing Charm School (in some number of months to a year) got you your ticket to your very own “boat” (submarine) or nuclear powered surface ship as the Commanding Officer. Again, there were people who did not get Rickover’s blessing to command a nuclear powered ship. And despite over a decade of service, there was no appeal. If you didn’t have what it takes … you were out.
Most officers also managed to get an advanced degree somewhere along their career.
That’s Technical Competency.
I’ve worked at Du Pont. As a consultant, I’ve visited many refineries, chemical plants, and oil companies. The only things that come close to the technical competency required in the Nuclear Navy are the qualification programs in the commercial nuclear industry and the training program for astronauts at NASA. However, neither program has the advanced technical training requirements for senior level executives.
Read Part 4: Normalization of Excellence – the Rickover Legacy – Responsibility
If it is written down, it must be followed. This means it must be correct… right?
Discussion triggers for lack of compliance that I often see are:
- Defective products or services
- Audit findings
- Rework and scrap
So the next questions that I often ask when compliance is “apparent” are:
- Do these defects happen when standards, policies and administrative controls are in place and followed?
- What were the root causes for the audit findings?
- What were the root causes for the rework and scrap?
In a purely compliance-driven company, I often hear these answers:
- It was a complacency issue
- The employees were transferred … sometimes right out the door
- The employee was retrained and the other employees were reminded why it is important to do the job as required.
So is compliance in itself a bad thing? No, but compliance with poor processes just means consistently poor output.
Should employees be able to question current standards, policies and administrative controls? Yes, at the proper time and in the right manner. Please note that in cases of emergencies and process work stop requests, the time is most likely now.
What are some options to removing the blinders of pure compliance?
GOAL (Go Out And Look)
- Evaluate your training and make sure it matches the workers’ and the task’s needs at hand. Many compliance issues start with forcing policies downward without GOAL from the bottom up.
- Don’t just check off the audit checklist for compliance’s sake, GOAL
- Immerse yourself with people who share your belief to Do the Right Thing, not just the written thing.
- Learn how to evaluate your own process without the pure Compliance Glasses on.
If you see yourself acting on the suggestions above, this would be a perfect Compliance Awareness Trigger to join us at our 2016 TapRooT® Summit week August 1-5 in San Antonio, Texas.
There is no Normalization of Deviation. Deviation IS NORMAL!
If you don’t think that is true, read this previous article:
In 1946, Admiral Rickover was one of a small group of naval officers who visited the Manhattan Project in Oak Ridge, Tennessee, to learn about nuclear power and to see if there were ways to apply it in the US Navy. He had the foresight to see that it could be applied as propulsion for submarines – freeing subs from the risky proposition of having to surface to recharge their batteries.
But even more amazing than his ability to see how nuclear power could be used, to form a team with exceptional technical skills, and to research and develop the complex technologies that made this possible … he saw that the normal ways that the Navy and industrial contractors did things (their management systems) were not robust enough to handle the risk of nuclear technology.
Rickover set out to develop the technology to power a ship with the atom and to develop the management systems that would assure excellence. In PhD research circles these new ways of managing are often called a “high performance organization.”
Rickover’s pursuit of excellence was not without cost. It made him a pariah in naval leadership. Despite his accomplishments, Rickover would have been forced out of the Navy if it had not been for staunch support from key members of Congress.
Why was Rickover an outcast? Because he would not compromise on nuclear safety, and his management philosophies were directly opposed to the standard techniques used throughout the Navy (and most industrial companies).
What is the proof that his high performance management systems work? Over 60 years of operating hundreds of naval nuclear reactors ashore and at sea without a single process safety accident (reactor meltdown). And his legacy continues even after he left as head of the Nuclear Navy. The culture he established is so strong that it has endured for 30 years!
Compare that record to the civilian nuclear power industry, refinery process safety incidents, or offshore drilling major accidents. You will see that Rickover developed a truly different high performance organization that many with PhDs still don’t understand.
In his organization, deviation truly was abnormal.
What are the secrets that Rickover applied to achieve excellence? They aren’t secret. He testified to his methods in front of Congress and his testimony is available at this link:
What keeps other industries from adopting Rickover’s management systems to achieve equally outstanding performance in their industries? The systems Rickover used to achieve excellence are outside the experience of most senior executives, and applying the management systems REQUIRES focused persistence from the highest levels of management.
To STOP the normalization of deviation, the CEOs and Presidents of major corporations would have to insist on and promote the Normalization of Excellence that is outlined in Rickover’s testimony to Congress.
Sometimes Rickover’s testimony to Congress may not be clear to someone who has not experienced life in the Nuclear Navy. Therefore, I will explain (translate from Nuclear Navy terminology) what Rickover meant and provide readers with examples from my Nuclear Navy career and from industry.
Read Part 3: Normalization of Excellence – The Rickover Legacy – Technical Competency
Or click on this link if the video does not play …
Yes, I have written and spoken about normalization of deviation before. But today I hope to convince you that normalization of deviation DOES NOT EXIST.
What is “Normalization of Deviation”? (Or sometimes referred to as normalization of deviance.) In an interview, Diane Vaughan, Sociology Professor from Ohio State University, said:
Social normalization of deviance means that people within the organization become so much accustomed to a deviant behaviour that they don’t consider it as deviant, despite the fact that they far exceed their own rules for the elementary safety.
Let’s think about this for a moment. What defines deviation or deviance?
To deviate means to depart from a set path.
In a country, the government’s rules or laws set the path.
In a company, management usually sets the path. Also, management usually conforms to the regulations set by the government.
To say that deviation has become “normalized” implies that the normal course of doing business is to follow the rules, regulations, and laws.
Let’s look at a simple example: the speed limit.
When Jimmy Carter was President, he convinced Congress to pass a national speed limit of 55 miles per hour. He did this because there was an oil shortage and he declared the conservation of oil to be the “moral equivalent of war.”
Violating the speed limit became a national pastime. Can you remember the popular songs? Sammy Hagar sang “I Can’t Drive 55.” There was C.W. McCall singing “Convoy.”
And the most famous of all? The movie “Smokey and the Bandit” with Burt Reynolds and Sally Field …
How many laws did they break?
One might think that this is just a rare example. The rule (55 mph speed limit) was just too strict and people rebelled.
Take a minute to consider what you observe … who breaks the rules?
You can probably remember a famous example for each of these classes of people above where rule breaking was found to be common (or at least not so uncommon).
You might ask yourself … Why do people break the rules? If you ask people why they break the rules you will hear the following words used in their replies:
- Rules are just for the inexperienced
- Just once
- They were just guidelines
- Everybody does it
My belief is that rule breaking is part of human nature. We often work the easiest way, the quickest way, the way with the least effort, to get things done.
Thus deviation from strict standards is NOT unusual. Deviation is NORMAL!
Thus there is no “Normalization of Deviation” … the abnormal state is getting everyone to follow strict rules:
- To follow the procedure as written
- To always wear PPE
- To follow the speed limit
- To pay every tax
- To never sleep on the job (to stop nodding off on the back shift or at a boring meeting)
Thus, instead of wondering why “Normalization of Deviation” exists and treating it like an abnormal case, we should see that we have to do something special to get the abnormal state of a high performance organization to exist.
What we want is the abnormal state of an extremely high performance organization, established through strict codes of behavior that are outside our normal experience (or human nature).
How do you establish this high performance organization with high compliance with strict standards? That’s a great question and the topic for another article that I will write in the future.
Read Part 2: Stop Normalization of Deviation with Normalization of Excellence
Watch this video and see what you think …
Do you like quick, simple tips that add value to the way you work? Do you like articles that increase your happiness? How about a joke or something to brighten your day? Of course you do! Or you wouldn’t be reading this post. But the real question is, do you want MORE than all of the useful information we provide on this blog? That’s okay – we’ll allow you to be greedy!
A lot of people don’t know we have a company page on LinkedIn that also shares all those things and more. Follow us by clicking the image below that directs to our company page, and then clicking “Follow.”
We also have a training page where we share tips about career/personal development as well as course photos and information about upcoming courses. If you are planning to attend a TapRooT® course or are looking for candidates with root cause analysis skills, click the image below that directs to our training page and then click “Follow.”
Thank you for being part of the global TapRooT® community!
When is a safety incident a crime? Would making it a corporate crime improve corporate and management behavior?
July 29th, 2015 by Mark Paradies
I think we all agree that a fatality is a very unfortunate event. But it may not be a criminal act.
When one asks after an accident if a crime has been committed, the answer depends on the country where the accident occurred. A crime in China may not be a crime in the UK. A crime in the UK may not be a crime in the USA. And a crime in the USA may not be a crime in China.
Even experts may disagree on what constitutes a crime. For example, University of Maryland Law Professor Rena Steinzor wrote an article on her blog titled: “Kill a Worker? You’re Not a Criminal. Steal a Worker’s Pay? You Are One.” Her belief is that Du Pont and Du Pont’s managers should have faced criminal prosecution after an accident at their LaPorte, Texas, facility. She cited behavior by Du Pont’s management as “extraordinarily reckless.”
OSHA Chief David Michaels disagrees with Professor Steinzor. He is quoted in a different article as saying during a press conference that Professor Steinzor’s conclusions and article are, “… simply wrong.”
The debate should raise a significant question: Is making an accident – especially a fatal accident – a corporate crime a good way to change corporate/management behavior and improve worker safety?
Having worked for Du Pont back in the late 1980s, I know that management was very concerned about safety. They really took safety to heart. I don’t know if that attitude changed as Du Pont transformed itself to increase return on equity … Perhaps they lost their way. But would making poor management decisions a crime make Du Pont a safer place to work?
Making accidents a crime would definitely make performing an accident investigation more difficult. Would employees and managers cooperate with ANY investigation (internal, OSHA, or criminal) IF the outcome could be a jail sentence? I can picture every interviewee consulting with their attorney prior to answering an investigator’s question.
I believe the lack of cooperation would make finding and fixing root causes much more difficult. And finding and fixing the root causes of accidents is extremely important when trying to improve safety. Thus, I believe increased criminalization of accidents would actually work against improving safety.
I believe that Du Pont will take action to turn around safety performance after a series of serious and sometimes fatal accidents. I think they will do this out of concern for their employees. I don’t think the potential for managers going to jail would improve the odds that this improvement will occur.
What do you think? Do you agree or disagree? Or better yet, do you have evidence of criminal proceedings improving or hindering safety improvement?
Let me know by leaving a comment below.