Category: Human Performance

Where did you eat this weekend? (or, why do companies continue to not learn from their mistakes?)

July 24th, 2017 by

Happy Monday. I hope everyone had a good weekend and got recharged for the week ahead.

Every few weeks, I get a craving for Mexican food. Maybe a sit-down meal with a combo plate and a Margarita, maybe Tex-Mex or maybe traditional. It’s all good.

Sometimes, though, a simple California Style Burrito does the trick. This weekend was one of those weekends. Let’s see, what are my choices…? Moe’s, Willy’s, Qdoba, Chipotle?

Chipotle? What??!!!

Unfortunately, Chipotle is back in the news. More sick people. Rats falling from the ceiling. Not good.

It seems like we have been here before. I must admit I did not think they would survive last time, but they did. What about this time? In today’s world of social media, we shall see.

For those of us in safety or quality, the story is all too familiar. The same problem keeps happening. Over and over… and over.

So why do companies continue not to learn from their mistakes? A few possible reasons:

  • They don’t care
  • They are incompetent
  • They don’t get to true root causes when investigating problems
  • They write poor corrective actions
  • They don’t have the systems in place for good performance or performance improvement

TapRooT® can help with the last three. Please join us at a future course; you can see the schedule and enroll HERE.

So, what do you think? Why do companies not learn from their mistakes? Leave comments below.

By the way, my burrito from Moe’s was great!

“Human Error” by Maintenance Crew is “Cause” of NYC Subway Derailment. Two Supervisors Suspended Without Pay.

June 29th, 2017 by


The New York Daily News says that a piece of track was left between the rails during track repairs on the NYC subway system. That loose track may have caused the derailment of an eight-car train.

The rule is that any piece of track shorter than 19.5 feet must either be bolted down or removed. It seems that others say the actual “practice” is somewhat different. This piece of track was only 13.5 feet long and was not bolted down.

But don’t worry. Two supervisors have been suspended without pay. And workers are riding the rails looking for other loose equipment between them. Problem solved. Human error root cause fixed…

Construction Safety: Human Cost, OSHA Fines and Lawsuits…

June 5th, 2017 by

Each year, about 900 construction workers do not come home to their families after work, so safety on construction work sites must be taken seriously.

AGC, the Associated General Contractors of America, recently published a study together with Virginia Tech, “Preventing Fatalities in the Construction Industry.” There are some interesting findings:

  • Dangerous Lunch Hour: construction site fatalities peak at noon, and are much lower on Fridays than Monday through Thursday
  • Small Contractors (fewer than 9 employees) are overrepresented in the statistics, with a fatality rate of 26 per 100,000 workers
  • Fully 1/3 of fatalities are from falls, and about 29% are from Transportation incidents with, e.g., trucks or pickups
  • More experienced workers are not safer: fatalities start increasing after age 35 and keep growing, so that 65-year-olds are at the highest risk
  • Industrial projects are the most dangerous, followed by Residential and Heavy construction projects

The consequences of a fatality are devastating. There is a great human cost: families have to deal with grief as well as financial issues. For the company there may be OSHA fines, lawsuits, and criminal investigations. There really is no excuse for a builder not to have an active safety program, no matter how small the company.

Basic safety activities include providing and checking PPE and fall protection, correct use of scaffolding and ladders, ongoing safety training, check-ins, and audits. It is also a good idea to actively promote a safety culture, and to use a root cause analysis tool to investigate accidents and near misses and prevent them from happening again.

The TapRooT® Root Cause Analysis methodology is a proven way of getting to the bottom of incidents and coming up with effective corrective actions. The focus is on human performance, and on how workers can be separated from hazards like electricity, falls, or moving equipment.

We can organize on-site training, or you can start by signing up for a public course. We offer the 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Training as well as the introductory 2-Day TapRooT® Root Cause Analysis Training class.

Be proactive, do not let preventable accidents catch up with you… call us today!

#TapRooT_RCA #safety

Monday Motivation: Modify your dreams or magnify your skills!

June 5th, 2017 by

  You must either modify your dreams or magnify your skills. – Jim Rohn

“Dream big,” they say.

“If you can dream it, you can become it,” they say.

It’s the season of high school and college graduations, and success clichés are in the air.  And, to be fair, there is a certain amount of vision that can be gleaned from inspirational quotes.  But there is more to reaching success in your career than simply having a dream.  Don’t settle and modify your dreams.  You can bridge the gap between where you are now and where you want to be by magnifying your skills.

We can help you do just that!

If you want to magnify your leadership skills, read the TapRooT® Root Cause Analysis Leadership Lessons book.

If you want to magnify your skills of conducting fast, simple investigations, read the Using the Essential TapRooT® Techniques to Investigate Low-to-Medium Risk Incidents book or attend our 2-day TapRooT® Root Cause Analysis Training.  We have made major strides in making TapRooT® easy to use. We even have a new five-step process for doing a low-to-medium risk incident investigation.

If you want to magnify your skills of conducting major investigations, learn the whole TapRooT® process and tools for investigating high potential and high risk incidents by reading the TapRooT® Root Cause Analysis for Major Investigations book or attending our 5-day TapRooT® Advanced Root Cause Analysis Team Leader Training.  The book and course explain the entire 7-step TapRooT® System and all the TapRooT® Tools.

If you want to get ahead of accidents, incidents, and quality issues, then magnify your proactive/audit skills by reading the TapRooT® Root Cause Analysis for Audits and Proactive Performance Improvement book.

Don’t settle for less than what you want to do with your career.  Magnify your skills!

Time for Advanced Root Cause Analysis of Special Operations Sky Diving Deaths?

May 31st, 2017 by

[Screenshot: Navy Times article]

Click on the image above for a Navy Times article about the accident at a recent deadly demonstration jump over the Hudson River.

Perhaps it’s time for a better root cause analysis of the problems causing these accidents?

Root Cause Tip: “Enforcement Needs Improvement” – You Can’t Train Obedience/Compliance/Positive Behavior

May 26th, 2017 by

This is a quick clarification to stop a definite no-no in poorly developed corrective actions.

You find evidence during your root cause analysis to support the root cause “Enforcement NI” based on the following statements from your Root Cause Tree® Dictionary for a particular causal factor:

  • Was enforcement of the SPAC (Standards, Policies, Administrative Controls) seen as inconsistent by the employees?
  • Has failure to follow SPAC in the past gone uncorrected or unpunished?
  • Did management fail to provide positive incentives for people to follow the SPAC?
  • Was there a reward for NOT following the SPAC (for example: saving time, avoiding discomfort)?
  • When supervisors or management noticed problems with worker behavior, did they fail to coach workers and thereby leave problems similar to this causal factor uncorrected?

But if you then create a corrective action to retrain, remind, and reemphasize the rules, directed at the employee or, on rare occasions, the immediate supervisor, your investigation started on track and jumped the tracks at the end.

Now, I am okay with an alert going out to the field for critical-to-safety or critical-to-operations issues as a reminder of a key care-about, but that does not fix the issues identified with the evidence above. If you use Train/Re-Train as a corrective action, then you imply that the person must not have known how to perform the job in the first place. If that were the case, root causes under the Basic Cause Category of “Training” should have been selected.

Training covers the person’s knowledge, skills and abilities to perform a specific task safely and successfully. Training does not ensure that proper task performance is sustained; acknowledgement, reward, and discipline from supervision, senior leadership, and peers ensure acceptance and sustainment of correct task behaviors.

Don’t forget, it is just as easy for supervision to ignore unsafe behavior as it is for an employee to deviate from a task (assuming the task was doable in the first place). Reward and discipline apply to changing supervision’s behavior as well.

Something else to evaluate: if the root cause Enforcement NI shows up frequently, make sure that you are not closing the door prematurely on the Root Cause Tree® Dictionary Near Root Causes of:

  • Oversight/Employee Relations (Audits should be catching this and the company culture should be evaluated).
  • Corrective Actions (If you tried to fix this issue before, why did it fail?).

Remember, you can’t train obedience/compliance/positive behavior. Finally, if you get stuck on developing a corrective action for Enforcement NI or any of our root causes, stop and read your Corrective Action Helper®.

Learn more by attending one of our upcoming TapRooT® Courses or just call 865.539.2139 and ask a question if you get stuck after being trained.

Healthcare Professionals! Please come visit the TapRooT® Booth at the NPSF Conference

May 10th, 2017 by

If you are coming to the conference (May 17 – 19), please stop by and see us at Booth 300; Per Ohstrom and I will both be there.

Of course TapRooT® can help you with patient safety and reducing Sentinel Events. But there are many more ways to use TapRooT® in your hospital:

Improve Employee Safety and reduce injuries

Improve Quality, reduce human error, and make your processes more efficient

We hope to see you there. We have a free gift for the first 500 people, so don’t miss out!

Are You Writing the Same Corrective Actions?

April 17th, 2017 by

Repeating the same corrective actions over and over again defeats the purpose of a quality root cause analysis investigation. If you spend the time investigating and digging deeper to find the REAL root cause, you should write the most effective corrective actions you can to ensure it was all worth the resources put into it. Instructor & Equifactor® and TapRooT® Expert, Ken Reed, talks about corrective actions and how to make them new and effective for each root cause.

 

Take a TapRooT® Root Cause Analysis course today to learn our effective and efficient RCA methodology. 

Why Does TapRooT® Exist?

March 28th, 2017 by


If you are a TapRooT® User, you may think that the TapRooT® Root Cause Analysis System exists to help people find root causes. But there is more to it than that. TapRooT® exists to:

  • Save lives
  • Prevent injuries
  • Improve product/service quality
  • Improve equipment reliability
  • Make work easier and more productive
  • Stop sentinel events
  • Stop the cycle of blaming people for system-caused errors

And we are accomplishing our mission around the world.

Of course, there is still a lot to do. If you would like to learn more about using TapRooT® Root Cause Analysis to help your company accomplish these things, get more information about TapRooT® HERE or attend one of our courses (get info HERE).

If you would like to learn how others have used TapRooT® to meet the objectives laid out above, see the Success Stories at:

http://www.taproot.com/archives/category/success-stories

Top 3 Reasons for Bad Root Cause Analysis and How You Can Overcome Them…

February 7th, 2017 by


I’ve heard many high-level managers complain that they see the same problems happen over and over again. They just can’t get people to find and fix the problems’ root causes. Why does this happen, and what can management do to overcome these issues? Read on to find out.

 

1. BLAME

Blame is the number one reason for bad root cause analysis.

Why?

Because people who are worried about blame don’t fully cooperate with an investigation. They don’t admit their involvement. They hold back critical information. Often this leads to mystery accidents. No one knows who was involved, what happened, or why it happened.

As Bart Simpson says:

“I didn’t do it.”
“Nobody saw me do it.”
“You can’t prove anything.”

Blame is so common that people take it for granted.

Somebody makes a mistake and what do we do? Discipline them.

If they are a contractor, we fire them. No questions asked.

And if the mistake was made by senior management? Sorry … that’s not how blame works. Blame always flows downhill. At a certain senior level, management becomes blessed. Only truly horrific accidents like the Deepwater Horizon or Bhopal get senior managers fired or jailed. Then again, maybe even those accidents aren’t bad enough to bring discipline to senior management.

Think about the biggest economic collapse in recent history – the housing collapse of 2008. What senior banker went to jail?

But be an operator who makes a simple mistake like pushing the wrong button, or a mechanic who doesn’t lock out a breaker while working on equipment? You may be fired, or the feds may come after you to put you in jail.

Talk to Kurt Mix. He was a BP engineer who deleted a few text messages from his personal cell phone AFTER he had turned it over to the feds. He was the only person off the Deepwater Horizon who faced criminal charges. Or ask the two BP company men who represented BP on the Deepwater Horizon and faced years of criminal prosecution. 

How do you stop blame and get people to cooperate with investigations? Here are two best practices.

A. Start Small …

If you are investigating near-misses that could have become major accidents and you don’t discipline people who spill the beans, people will learn to cooperate. This is especially true if you reward people for participating and develop effective fixes that make the work easier and their jobs less hazardous. 

Small accidents just don’t have the same cloud of blame hanging over them, so if you start small, you have a better chance of getting people to cooperate even if a blame culture has already been established.

B. Use a SnapCharT® to facilitate your investigation and report to management.

We’ve learned that using a SnapCharT® to facilitate an investigation and to show the results to management reduces the tendency to look for blame. The SnapCharT® focuses on what happened and “who did it” becomes less important.

Often, the SnapCharT® shows that there were several things that could have prevented the accident and that no one person was strictly to blame. 

What is a SnapCharT®? Attend any TapRooT® Training and you will learn how to use them. See:

TapRooT® Training


2. FIRST ASK WHAT NOT WHY

Ever see someone use 5-Whys to find root causes? They start with what they think is the problem and then ask “Why?” five times. Unfortunately, this easy method often leads investigators astray.

Why?

Because they should have started by asking what before they asked why.

Many investigators start asking why before they understand what happened. This causes them to jump to conclusions. They don’t gather critical evidence that may lead them to the real root causes of the problem. And they tend to focus on a single Causal Factor and miss several others that also contributed to the problem. 

How do you get people to ask what instead of why?

Once again, the SnapCharT® is the best tool to get investigators focused on what happened, find the incident’s details, and identify all the Causal Factors, along with the information about each Causal Factor that the investigator needs to identify each problem’s root causes.


3. YOU MUST GO BEYOND YOUR CURRENT KNOWLEDGE

Many investigators start their investigation with a pretty good idea of the root causes they are looking for. They already know the answers. All they have to do is find the evidence that supports their hypothesis.

What happens when an investigator starts an investigation by jumping to conclusions?

They ignore evidence that is counter to their hypothesis. This problem is called:

Confirmation Bias

It has been proven in many scientific studies.

But there is an even bigger problem for investigators who think they know the answer. They often don’t have the training in human factors and equipment reliability to recognize the real root causes of each of the Causal Factors. Therefore, they only look for the root causes they know about and don’t get beyond their current knowledge.

What can you do to help investigators look beyond their current knowledge and avoid confirmation bias?

Have them use the SnapCharT® and the TapRooT® Root Cause Tree® Diagram when finding root causes. You will be amazed at the root causes your investigators discover that they previously would have overlooked.

How can your investigators learn to use the Root Cause Tree® Diagram? Once again, send them to TapRooT® Training.

THAT’S IT…

The TapRooT® Root Cause Analysis System can help your investigators overcome the top 3 reasons for bad root cause analysis. And that’s not all. There are many other advantages for management and investigators (and employees) when people use TapRooT® to solve problems.

If you haven’t tried TapRooT® to solve problems, you don’t know what you are missing.

If your organization faces:

  • Quality Issues
  • Safety Incidents
  • Repeat Equipment Failures
  • Sentinel Events
  • Environmental Incidents
  • Cost Overruns
  • Missed Schedules
  • Plant Downtime

You need to apply the best root cause analysis system: TapRooT®.

Learn more at: 

http://www.taproot.com/products-services/about-taproot

And find the dates and locations for our public TapRooT® Training at:

 http://www.taproot.com/store/Courses/


The 7 Secrets of Root Cause Analysis – Video

December 12th, 2016 by

Hello everyone,

Here is a video that discusses some root cause tips, common problems with root cause analysis, and how TapRooT® can help. I hope you enjoy!

Like what you see? Why not join us at the next course? You can see the schedule and enroll HERE.

Monday Accident & Lessons Learned: Overspeed at Fletton Junction

September 19th, 2016 by

This incident notice is from the UK Rail Accident Investigation Branch (RAIB) about an overspeed incident at Fletton Junction, Peterborough, on 11 September 2015.

At around 17:11 hrs on 11 September 2015, the 14:25 hrs Virgin Trains East Coast passenger train service from Newcastle to London King’s Cross passed through Fletton Junction, near Peterborough, at 51 mph (82 km/h), around twice the permitted speed of 25 mph (40 km/h). This caused the carriages to lurch sideways, resulting in minor injuries to three members of staff and one passenger.

It is likely that the train driver had forgotten about the presence of the speed restriction because he was distracted and fatigued due to issues related to his family. Lineside signs and in-cab warnings may have contributed to him not responding appropriately as he approached the speed restriction, and engineering controls did not prevent the overspeeding. Neither Virgin Trains East Coast nor the driver had realised that family-related distraction and fatigue were likely to be affecting the safety of his driving. Virgin Trains East Coast’s route risk assessment had not recognised the overspeeding risks particular to Fletton Junction, and Network Rail had not identified that a speed limit sign at the start of the speed restriction was smaller than required by its standards.


The incident could have had more serious consequences if the train had derailed or overturned. The risk of this was present because the track layout was designed for a maximum speed of 27 mph (43 km/h).

As a consequence of this investigation, RAIB has made five recommendations. Two, addressed to Virgin Trains East Coast, relate to enhancing the management of safety critical staff with problems related to their home life, and to considering such issues during the investigation of unsafe events.

A recommendation addressed to Virgin Trains East Coast and an associated recommendation addressed to Network Rail relate to assessing and mitigating risks at speed restrictions.

A further recommendation to Network Rail relates to replacement of operational signage when this is non-compliant with relevant standards.

The RAIB report also includes learning points relating to managing personal problems that could affect the safety performance of drivers. A further learning point, arising because of a delay in reporting the incident, stresses the importance of drivers promptly reporting incidents which could have caused track damage. A final learning point encourages a full understanding of the effectiveness of safety mitigation provided by infrastructure and signalling equipment.

For more information see:

R142016_160801_Fletton_Jn

Monday Accident & Lessons Learned: Human Error That Should Not Occur

July 25th, 2016 by

This week’s Monday Accident & Lessons Learned shares a “Call Back” report from the Aviation Safety Reporting System that is applicable far beyond aviation.

In this case, the pilot was fatigued and just wanted to “get home.” He had a “finish the mission” focus that could have cost him his life. Here’s an excerpt:

I saw nothing of the runway environment…. I had made no mental accommodation to do a missed approach as I just knew that my skills would allow me to land as they had so many times in past years. The only conscious control input that I can recall is leveling at the MDA [Rather than continuing to the DA? –Ed.] while continuing to focus outside the cockpit for the runway environment. It just had to be there! I do not consciously remember looking at the flight instruments as I began…an uncontrolled, unconscious 90-degree turn to the left, still looking for the runway environment.

To read about this near-miss and the lessons learned, see:

http://asrs.arc.nasa.gov/docs/cb/cb_436.pdf


What does a bad day look like?

May 17th, 2016 by


A lot of bad days start with bad decisions.  For example, when you decide to take a selfie with a 4-foot rattlesnake… (Read story.)

Monday Accident & Lessons Learned: What Not To Do During Flooding

May 16th, 2016 by

The following is a video of a fatal accident. The driver drove around a tow truck sent to block the underpass and past a worker waving his arms to stop her. She drove into water about 17 feet deep. DON’T watch the video if it will upset you. For others, hopefully you can use this to teach others to avoid standing water during flooding.

What Does a Bad Day Look Like?

May 3rd, 2016 by

When you’re late for work because a truck fell on you, perhaps?


Understanding & Stopping Human Error – EXCLUSIVE 2-Day Course

April 29th, 2016 by

This course is scheduled for August 1-2, 2016 in San Antonio, Texas just prior to the 2016 Global TapRooT® Summit.

LEARN ABOUT TEN MORE EXCLUSIVE PRESUMMIT COURSES! (Click here.)

Nudging Human Behavior in the Rail Industry

April 20th, 2016 by


A colleague at a recent Rail Safety conference pointed me to this article on how to change people’s behavior on rail lines in London. How do we influence people to:

  • put trash in trash bins
  • be courteous while playing music
  • keep feet off the train seats

They’ve tried signs and warnings. I think we can all agree those have limited effect. There are audible reminders. The escalators in Atlanta Airport talk to you continuously on the way down to the trains.

Here are some other (gentler) ways the London Underground is trying to influence passengers to do what is required.

What does a bad day look like?

April 12th, 2016 by

And it could have been much worse!

Normalization of Excellence – The Rickover Legacy – 18 Other Elements of Rickover’s Approach to Process Safety

March 31st, 2016 by


The previous three articles discussed Rickover’s “key elements” to achieving safety in the Navy’s nuclear program. They are:

  1. Technical Competence
  2. Total Responsibility
  3. Facing the Facts

In addition to these three keys that Rickover testified to Congress about, he had 18 other elements that he said were also indispensable. I won’t describe them in detail, but I will list them here:

  1. Conservatism of Design
  2. Robust Systems (design to avoid accidents and emergency system activation)
  3. Redundancy of Equipment (to avoid shutdowns and emergency actions)
  4. Inherently Stable Plant
  5. Full Testing of Plant (prior to operation)
  6. Detailed Preventive/Predictive Maintenance Schedules Strictly Adhered To
  7. Detailed Operating Procedures Developed by Operators, Improved with Experience, and Approved by Technical Experts
  8. Formal Design Documentation and Management of Change
  9. Strict Control of Vendor Provided Equipment (QA Inspections)
  10. Formal Reporting of Incidents and Sharing of Operational Experience
  11. Frequent Detailed Audits/Inspections by Independent, Highly Trained/Experienced Personnel that Report to Top Management
  12. Independent Safety Review by Government Authorities
  13. Personal Selection of Leaders (looking for exceptional technical knowledge and good judgment)
  14. One Year of Specialized Technical Training/Hands-On Experience Prior to 1st Assignment
  15. Advanced Training for Higher Leadership Positions
  16. Extensive Continuing Training and Requalification for All Personnel
  17. Strict Enforcement of Standards & Disqualification for Violations
  18. Frequent Internal Self-Assessments

Would you like to review what Rickover had to say about them? See his testimony here:

Rickover Testimony

Now, after this description of the excellence of Rickover’s program, you might think there was nothing to be improved. However, I think the program had three key weaknesses. They are:

  1. Blame Orientation (Lack of Praise)
  2. Fatigue
  3. Need for Advanced Root Cause Analysis

Let me talk about each briefly.

BLAME ORIENTATION

The dark side of a high degree of responsibility was a tendency to blame the individual when something went wrong. Also, success wasn’t celebrated; it was expected. The result was burnout and attitude problems. This led to a fairly high turnover rate among the junior leaders and enlisted sailors.

FATIGUE

Want to work long hours? Join the Nuclear Navy! Eighteen-hour days, seven days a week, were normal when at sea. In port, three-section duty (a 24-hour day every third day) was normal. This meant that you NEVER got a full weekend. Many errors were made due to fatigue. I remember a sailor who was almost killed performing electrical work because of actions that just didn’t make sense. He had no explanation for his errors (they were multiple), and he knew better because he was the person who trained everyone else. But he had been working over 45 days straight with a minimum of 12 hours per day. Was he fatigued? It never showed up in the incident investigation.

ADVANCED ROOT CAUSE ANALYSIS

Root Cause Analysis in the Nuclear Navy is basic. Assign smart people and they will find good “permanent fixes” to problems. And this works … sometimes. The problem? The Nuke Navy doesn’t train sailors and officers how to investigate human errors. That’s where advanced root cause analysis comes in. TapRooT® has an expert system that helps people find the root causes of human error and produce fixes that stop the problems. Whenever I hire a Navy Nuke to work at System Improvements, they always tell me they already know about root cause analysis because they did that “on the boat.” But when they take one of our courses, they realize how much they still had to learn.

Read Part 7:  Statement of Admiral Rickover in front of the Subcommittee on Energy Research and Production of the Committee on Science and Technology of the US House of Representatives – May 24, 1979

If you would like to learn more about advanced root cause analysis, see our course offerings:

COURSES

And sign up for our weekly newsletter:

NEWSLETTER

Is 17% Compliance Good Enough in Healthcare?

March 11th, 2016 by

Why do we have such low compliance?

Read a recent study that stated only 17.4% of ambulatory care nurses (surveyed) comply with all 9 precautions for infection control. Now first, for those who didn’t click above to see how the data was collected, I will let you know that this is SELF-REPORTED data (glean what you will from that tidbit of information). I would bet that compliance is actually much lower, causing even greater concern.

To be fair, we are evaluating 9 different precautions. I will say that any statement that includes “wash hands” has low compliance … just walk into any men’s restroom to see that. But what would drive this? Let’s examine the items, because it must be impossible to complete all of these, right? Here is the list:

  1. Provide care considering all patients as potentially contagious
  2. Wash hands after removing gloves
  3. Avoid placing foreign objects on my hands
  4. Wear gloves when exposure of my hands to bodily fluids is anticipated
  5. Avoid needle recapping
  6. Avoid disassembling a used needle from a syringe
  7. Use a face mask when anticipating exposure to air-transmitted pathogens
  8. Wash hands after providing care
  9. Discard used sharp materials into sharp containers

As a non-healthcare professional, I don’t see a whole lot I disagree with. I mean, you are working with sick people: washing hands, wearing gloves, watching out for bodily fluids… can’t argue with that. So why do we have such low compliance? And remember, this is “infection control,” so it is about keeping healthcare professionals, other patients, and staff safe.

Well, on our Root Cause Tree® we have a root cause under Management System->SPAC Not Used named “No way to implement.” I bring this up simply to examine what we request in this list versus the very, very dynamic environment of a hospital.

Is it reasonable (except for the human self-preservation gene) to expect all of these to happen when working to save a coding patient? Or when an ER has a very high census with multiple traumas (a situation I witnessed myself yesterday)?

I guess the answer truly is no. We are providing a SPAC that, as written, is reasonable but can be difficult to implement at certain times. Thus, the very honest self-reported numbers.

Interestingly enough, I know the TapRooT®ers out there are all saying, “Hey dude, this is more of an Enforcement NI thing” (you know you just did that, don’t act like you didn’t), but is it really Enforcement NI? I don’t believe in any way, shape, or form that you could enact an enforcement mechanism for all nine of these things, all at the same time, and still give healthcare professionals the ability to perform timely patient care. The process would be so burdensome that it would crumble under the weight of its own scrutiny, and patient care would suffer.

So is 17.4% compliance enough? Probably not, but let’s also remember what we are asking people to do for that compliance. The number may not be acceptable, or palatable, but it is what we can expect based on what is asked of these courageous folks working in this very difficult environment.

What do you think? Leave your comments below.

If this topic interests you, check out our medical track at the 2016 Global TapRooT® Summit.  Breakout sessions include:

  • 7 Deadly Sins of Human Performance
  • TapRooT® Changes for the Medical Community
  • Human Error Causes of Quality Problems
  • Writing TapRooT® Driven Preventative & Corrective Actions Workshop
  • Anatomy of a Medical Investigation & more!

GO HERE to download a .pdf brochure!

Error Proofing: Learn Best Practices and Industry Secrets

March 11th, 2016 by

[Image: error-proofing example]

What is the error proofing here?

 

“Easier than making a mistake” … now that is good Human Engineering!

While listening to a radio commercial recently, I heard the announcer say, “Easier than making a mistake!” As a TapRooT® Root Cause Instructor with a quality and human engineering (Human Factors) background, all that I could think about is mistake-proofing, Poka-yoke.

The concept was formalized, and the term adopted, by Shigeo Shingo as part of the Toyota Production System. It was originally described as baka-yoke, but as this means “fool-proofing” (or “idiot-proofing”), the name was changed to the milder poka-yoke. (From Wikipedia)

Now, I did not learn about Dr. Shigeo Shingo during my Human Factors studies, even though a large part of the training dealt 100% with design and usability, from products to controls to graphical user interfaces. On the flip side, Human Factors and usability were rarely discussed during my Lean Six Sigma certification either, even though Poka-yoke was covered.

Why are two closely related topics such as Human Factors and Poka-yoke kept in isolation, depending on where and what you study? Simple: shared best practices and industry secrets are not always the norm.

Where can you learn about both topics? In San Antonio, Texas during our TapRooT® Summit Week August 1-5.

In the pre-summit 2-Day TapRooT® Quality Process Improvement Facilitator Course, we cover the error of writing weak preventative or corrective action items that are not based on the actual root causes found, and of not understanding and optimizing the mistake-proofing that will impact your success in continuous process improvement.

For those who need a deeper understanding of why mistake-proofing should be considered, look into signing up for the 2-Day Understanding and Stopping Human Error Course.

Normalization of Excellence – The Rickover Legacy – Technical Competency

March 10th, 2016 by

If you read last week’s article about stopping the normalization of deviation with the normalization of excellence, you are ready to start learning to apply Rickover’s philosophies of excellence to achieve amazing performance.

In his testimony to Congress, he starts out explaining the nuclear program and the success that had been achieved to that point. He then explains that there is no simple formula to achieve this success. Rather, what we now recognize as a management system is an “integrated whole of many factors.” He emphasizes that these factors cannot be used individually, but rather must all be used together. He says, “Each element depends on all the other elements.”

So before I start explaining the individual elements, heed Rickover’s advice:

“The problems you face cannot be solved by specifying compliance with one or two simple procedures.
Reactor safety requires adherence to a total concept wherein all elements are recognized
as important and each is constantly reinforced.” 

 If you aren’t in the nuclear industry, you can replace the words “reactor safety” with “process safety” or maybe even “patient safety” to apply Rickover’s philosophies to your industry.

The first three elements that Rickover explains are:

  • Technical Competence
  • Responsibility
  • Facing the Facts

This really is the core of Rickover’s management philosophy and I will explain each in detail.

Technical Competency

Rickover believed that to manage a high-risk enterprise (nuclear power plant, refinery, offshore drilling platform, or other high-risk venture) you have to fundamentally understand the technical aspects of the job. This was NOT an overview of how things worked. It was a detailed understanding of the science, chemistry, physics, and engineering behind the processes.

The requirement for technical knowledge didn’t stop with the operations manager or plant manager. The technical knowledge requirement went all the way up to the CEO/President level. The higher you were on the org chart, the better your technical knowledge was supposed to be.

Rickover said:

“At Naval Reactors, I take individuals who are good engineers and make them into managers.
They do not manage by gimmicks but rather by knowledge, logic, common sense, and hard work.” 

All the managers (officers) in the Naval Nuclear Power Program went through a rigorous screening process. First, they were selected from the top portions of good engineering programs from universities across the US. Non-engineering majors were also considered if they had excellent grades in physics, calculus, and chemistry. All people selected then went to Naval Reactors headquarters, where they took a test to evaluate their technical abilities. Tough engineering, math, chemistry, and physics questions were asked on a non-multiple-choice test where the work to achieve the answer had to be shown.

The next day the candidates were put through several technical interviews by high-level Department Heads at Naval Reactors.  The candidates were asked to solve tough real-life scenarios and apply their technical skills to real-world problems.

Finally, each candidate had the now-famous Admiral Rickover interview. Rickover reviewed the candidate’s academic performance, test results, and interview performance and then asked some of his famous style of questions to evaluate how the candidate reacted under pressure.

My test and interviews went well enough until my Rickover interview. Before the interview, you sat in a room listening to a continuous lecture on what you should and should not do when in the presence of the Admiral. You were informed that you would be accompanied into the room by a senior officer who would sit directly behind you. That you should not address the Admiral until he spoke to you (he was a busy man who didn’t need to be interrupted). That you should take your seat in the chair in front of his desk and wait for him to address you. That when he asked you questions, you should answer directly and that “No excuse sir!” was not an answer. If he asked you a question he wanted to know the answer … not an excuse. That a direct answer was “Yes Sir! or No Sir!” If he asked you if you were married, “Yes Sir” would be a good answer but that didn’t mean that he wanted to hear about your sex life.

Before too long, my name was called and I departed with my escort for the Admiral’s office. When I entered his office, I was shocked. Spartan would be a generous description of the accommodations. The old furniture was probably WWII surplus. Rickover’s desk was piled full of neatly stacked folders full of paper, and he was working on something with his head down.

He was a tiny, wiry-looking old man. (He was probably in his late seventies, and I was in my senior year of college, so to me he looked ancient.)


There was an old-looking wooden chair directly in front of his desk. I sat down and immediately noticed that one of the legs was shorter than the rest. The chair naturally tipped back and forth. You could either lean forward and have the chair sit still tipped forward, or lean back and hold the chair back. I leaned back, trying to maintain a straight posture.

I watched Rickover as he worked. He was busy reviewing paperwork and occasionally signing something as he moved files from one stack to another. Finally he stopped and took a file from a different stack and started looking at it. I thought, “That’s my file.”

A minute or two later he looked up at me and said:

“Midshipman Paradies, I see here you got a lot of Cs in your studies at the University of Illinois,
can you tell me why you did so poorly?”

My first thought was … “I’m sure glad he didn’t ask me about that D or E.” I certainly didn’t want to mention drinking beer and playing football and basketball, so I responded:

“Well, Admiral, Electrical Engineering at the University of Illinois is a difficult curriculum and that’s all the better I could do.”

I didn’t know that this was one of his standard questions, and that he was looking for you to make excuses. I also didn’t know that Rickover had an MS in Electrical Engineering and that he thought it was a tough curriculum.

He said, “OK.”

He closed the file and looked me in the eyes and asked:

“Midshipman Paradies, are you married?”

That was one of the questions that they warned us to answer directly. I said, “No Sir.”

He asked, “Are you engaged?”  I said, “Yes Sir.”

He continued to look me directly in the eyes (a very penetrating stare) and asked:

 “Has your fiancée ever told you that you are good looking?”

That question caught me totally off guard. Here was this shrunken, bent-over old admiral asking me about being good looking … where was he going with this?

I answered, “Yes Sir.”

He asked,

 “What do you think she meant?”

I was at a total loss. What did she mean? Who knows? I certainly didn’t want to say that I didn’t know. So I said,

“Well, I guess, Admiral, that she liked the way I looked.”

He said, “No Midshipman Paradies, you are wrong.”

I then gave my best answer of the day. I said, “Yes Sir.”

He said,

“What she meant was that she wanted to marry you.
When you go back, will you ask your fiancée what she meant and
send me a letter and tell me what she says?”

I said, “Yes Sir.”

He said, “Get out of my office,” and pointed toward the door.

That was the end of my interview. I had passed. And I did go back and ask my fiancée what she meant and wrote the Admiral and told him what she said.

You might think that writing the letter wasn’t important. But it was. The Admiral’s staff kept track of every letter that he was owed. When I returned for my Engineer’s Exam and checked in with my orders, the woman behind the desk asked, “Do you have any outstanding correspondence with the Admiral?” I said, “No.” She looked in a folder and said, “That’s correct, you sent the letter that you owed the Admiral in 1978.”

My interview was rather straightforward compared to the stories I’ve heard about other Midshipmen. Perhaps my favorite was from a friend of mine we’ll call Midshipman F.

Midshipman F was a History major. He had taken calculus, chemistry, physics, and other technical subjects and had done quite well. Rickover asked him why, if he wanted to be in the Naval Nuclear Program, he didn’t get a technical degree. He responded that to understand the world, history was important.

Rickover then started to tell him that he had wasted his time with history classes. Rickover bet that Midshipman F didn’t know anything about history and asked him questions about history (which were “current events” to Rickover). After listening to Midshipman F answer some history questions, Rickover told him that he was “stupid” and didn’t know anything about history, and to go stand in the closet.

Midshipman F went over to the closet and opened the door, but there was already someone in the closet who looked like a senior officer. He stepped in, shut the door, and they both stood there in the dark and didn’t say a word.

After what Midshipman F said seemed like forever, his escort came over, opened the door, and told Midshipman F that the Admiral would like to talk to him again. They went back to arguing over history, and Midshipman F was kicked out of the Admiral’s office twice to go sit in the “penalty box” (another very small room with a chair where you would be sent when the Admiral wanted to make you cool your heels). Midshipman F never gave up his argument about the importance of History and was eventually allowed into the Admiral’s program.

But whenever F would tell the story, he ended it with:

I’ve always wondered whatever happened to the other guy who was in the closet.

When you were accepted into the Nuclear Navy, you had to complete a year of extremely difficult technical training before you reported to your first ship. The competition was tough. Of the 100 people in my class, many had Master’s degrees in engineering. One guy had a “photographic memory.” He could remember everything that was written on the board and everything the instructor said, verbatim. Not only could he do that, but he did it while doing the homework from the previous lecture. I had the third lowest GPA of anyone in the class and was immediately assigned to “remedial study.”

We had 7 hours of class a day with an hour off for lunch. I used my lunch time to study and usually put in an additional 5 hours each night and another 12 hours on the weekend. I had to keep study logs showing how I applied my time in 5-minute intervals. I did well, graduating in the top 10 students in the class. Others did less well … 10 students failed out in the first 6 months. After 6 months, you were assigned to a nuclear prototype plant (an actual naval reactor that had been built ashore to test the design) and went through advanced classes and qualification to be an engineering officer of the watch (EOOW). For this qualification, you worked shift work with mandatory 12-hour days of watch standing, studying, and “check-outs” from qualified personnel. Again, I did well and was the second officer in my class to qualify at the prototype, 5 weeks before the end of the six-month tour. One individual failed out of our prototype training (failed to qualify in the six-month time span).

Why is it important to know about this pre-ship education? Because it gives someone who did not go through the program some idea of the technical knowledge that Rickover expected before anyone was allowed to go to sea and qualify to run one of “his” reactors.

And this was just the start of Technical Competency.

Once at sea, there were never-ending qualifications and continuing training. Drills. And annual “ORSE Board” inspections with level-of-knowledge exams and interviews.

An officer had to pass another level of Technical Competency called the Engineer’s Exam. To prepare for this exam, the officer was supposed to study in his “spare time” and learn everything there was to know about the design and technical specifications of the reactor plant and systems on his ship. Any topic, from the start of Nuclear Power School to current operating problems to any potential failure of equipment or systems on his ship, was fair game for the 8-hour test (with 30 minutes off for lunch) and three technical interviews. You couldn’t get less than a 3.0 score (of 4.0 total) on any section of the exam and couldn’t get less than a 3.2 of 4.0 total on the test. All the questions were essays or engineering calculations. And you had to write fast to complete the exam. During the interview portion of the exam the next day, any of the interviewers could flunk you if they didn’t like the answers to their questions. Once again, at the end of the process, you had an interview with the Admiral. If you passed, you were then allowed to be assigned as the Engineering Officer for one of Rickover’s ships. People did fail, and their careers in the Nuclear Navy were over.

Finally, before you were allowed to become a Commanding Officer on one of Rickover’s ships, you had to qualify for command at sea and then go through “Charm School.” Charm School was an assignment to Rickover’s staff where you studied advanced topics and went through a series of brutal interviews with Rickover and his staff members. It was probably someone going through Charm School who was standing in Rickover’s closet when Midshipman F opened the door. Successfully completing Charm School (in some number of months to a year) got you your ticket to your very own “boat” (submarine) or nuclear-powered surface ship as the Commanding Officer. Again, there were people who did not get Rickover’s blessing to command a nuclear-powered ship. And despite over a decade of service, there was no appeal. If you didn’t have what it takes … you were out.

Most officers also managed to get an advanced degree somewhere along their career.

That’s Technical Competency.

I’ve worked at Du Pont. As a consultant, I’ve visited many refineries, chemical plants, and oil companies. The only things that come close to the technical competency required in the Nuclear Navy are the qualification programs in the commercial nuclear industry and the training program for astronauts at NASA. However, neither program has the advanced technical training requirements for senior-level executives.

Read Part 4:  Normalization of Excellence – the Rickover Legacy – Responsibility

Does A Good Quality Management System equate to Compliance?

March 8th, 2016 by


If it is written down, it must be followed. This means it must be correct… right?

Discussion triggers for lack of compliance that I often see are:

  • Defective products or services
  • Audit findings
  • Rework and scrap

So the next questions that I often ask when compliance is “apparent” are:

  • Do these defects happen when standards, policies, and administrative controls are in place and followed?
  • What were the root causes for the audit findings?
  • What were the root causes for the rework and scrap?

In a purely compliance-driven company, I often hear these answers:

  • It was a complacency issue
  • The employees were transferred…. Sometimes right out the door
  • The employee was retrained, and the other employees were reminded why it is important to do the job as required.

So is compliance in itself a bad thing? No, but compliance with poor processes always just means poor output.

Should employees be able to question current standards, policies, and administrative controls? Yes, at the proper time and in the right manner. Please note that in cases of emergencies and work-stop requests, the time is most likely now.

What are some options to removing the blinders of pure compliance?

GOAL (Go Out And Look)

  • Evaluate your training and make sure it matches the workers’ and the task’s needs at hand. Many compliance issues start with forcing policies downward without GOAL from the bottom up.
  • Don’t just check off the audit checklist for compliance’s sake: GOAL.
  • Immerse yourself with people who share your belief in doing the right thing, not just the written thing.
  • Learn how to evaluate your own process without the pure Compliance Glasses on.

If you see yourself acting on the suggestions above, this would be a perfect Compliance Awareness Trigger to join us at our 2016 TapRooT® Summit week, August 1-5, in San Antonio, Texas.

Go here to see the tracks and pre-summit sessions that combat the Compliance Barriers.
