Category: Human Performance
Watch two children explain their morning routine using a process flow chart and a control chart.
If you do not have a knowledgeable kindergartner hanging around to help you, I would recommend attending the following this April during our TapRooT® Summit Week:
Advanced Trending Techniques
TapRooT® Quality/Six Sigma/Lean Advanced Root Cause Analysis Training http://www.taproot.com/taproot-summit/pre-summit-courses#TapRooTSixSigma
Process Quality and Corrective Action Programs
The Associated Press reports, “Dutch bus drivers to test fatigue warning.”
See the article at:
Cook found alive in wreckage three days after ship sinks…
If you answered a hornet’s nest, you are correct!
It’s the first one I’ve ever seen in person in the wild (with real, live hornets buzzing in and out).
What does this have to do with root cause analysis?
Practice the skills you learn in a TapRooT® class by analyzing everyday situations. In this example, let’s look at Energy – Safeguard – Target.
What is the ENERGY?
I guess I would call it a biological source of Energy – HORNETS!
What is the TARGET?
Anything that disturbs the nest. It could have been me if I moved any closer.
What are the SAFEGUARDS that protected me from the hornets?
In this case, the only safeguard was my own awareness when walking through the woods.
That’s a pretty weak human performance safeguard. But this time it worked!
Should I have removed the hazard? No way! That’s much more risk than just leaving the area and remembering where the nest is.
How many “awareness” safeguards do you depend on at work? Is that really good enough? Should you be removing the hazards?
That’s your root cause analysis tip to think about for today!
Want to learn more about TapRooT®, advanced root cause analysis, and Energy – Safeguard – Target Analysis (we call it Safeguard Analysis)? Then attend a 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Course. See the upcoming worldwide public course schedule at:
When many people first see the “enforcement NI” root cause on the TapRooT® Root Cause Tree®, they think that the obvious corrective action would be discipline. But those who read the TapRooT® Corrective Action Helper® know that there are reasons for people violating rules and that the only way to truly change behavior is to address the reasons why the rules are being broken.
The article, “The Five Pitfalls of Discipline in Safety,” clearly presents reasons why enforcement might need improvement. Click on the link and see if you agree.
Here’s a link to a LinkedIn article about motivation:
The article starts with:
Ten years ago, I was managing a team of talented marketers at Yahoo! when something unexpected happened. In a one-on-one meeting, a woman on my team said to me, “I wanted you to know that if I ever do a really good job, just pay me more money. I don’t care about recognition or awards and I’m not motivated by praise. If I do well, just give me a bonus or pay me more.”
By Chris Vallee
I was an aircraft mechanic in the USAF when this incident occurred. The aftermath of the F-15 crash and pilot fatality continued with an Airman’s suicide, a loss felt by many.
While I knew the basics, I only recently found a follow-up report and wanted to share it. The information is taken directly from the article, without my paraphrase. Here is the website.
An Air Force review board has partly cleared the name of an F-15 mechanic who committed suicide in 1996 rather than face a court-martial for a fatal repair error.
Evidence showed that TSgt. XXXXXX did not perform the botched control rod maintenance at issue, although he did check the work and found nothing wrong.
In addition, several previous incidents in which other mechanics made the same mistakes should have alerted the Air Force to a potential problem, according to the board.
“We did not think XXXX was totally free of all responsibility,” said Lee Baseman, chairman of the correction board. “But it was our view that he was unduly carrying the burden for a series of missteps that went back at least 10 years.”
In May 1995, XXXX and TSgt. YYYYYY were carrying out maintenance on an F-15C based at Spangdahlem AB, Germany, when YYYYY accidentally crossed flight control rods while reinstalling them. XXXX did not catch the miscue, which made the airplane impossible to control in the air. It subsequently crashed, killing Maj. Donald G. Lowry Jr. (Great GUY!!)
Air Force authorities charged XXXX and YYYYY with dereliction of duty and negligent homicide. XXXXX shot himself in October 1996 during a break in court proceedings. Commanding officers then accepted YYYYY’s request for administrative separation, on grounds that the interests of the service would be best served by bringing the tragic case to a swift conclusion.
Similar crossed-rod cases occurred at least twice before the Spangdahlem crash, noted the review board: once in 1986 and again in 1991. But in both instances the problem was caught before takeoff.
In its conclusions, the board stated, “After the Black Hawk shootdown [in 1994], the demand for accountability for this accident may have been pursued with such zeal as to leave fairness and equity behind. The fatal crash was a tragedy waiting to happen, yet the decedent was singled out to pay for an accident that could have been prevented anywhere along the ‘chain of events’ had any of the numerous individuals involved made different decisions.
“Most disturbing was the way the Air Force leadership allowed this case to be handled. The Air Force’s representatives resisted the inclusion of potentially exculpatory evidence from the review and report and managed to have a good deal of it excluded from consideration in the pending trial.”
Following the death of Lowry, the Air Force took steps to prevent such a mix-up from happening again. The control rods are now color-coded to ensure proper installation, and the maintenance technical manual warns against the mistake. All flight control systems must now be checked any time the control rods undergo maintenance.
Ref: Journal of the Air Force Association, June 1998 Vol. 81, No.5, Peter Grier
I find it hard to believe that this question needs to be asked. Of course fatigue can lead to human error. This has been proven over and over again. And doctors are human.
I read an article in the Poughkeepsie Journal that had the following quote:
“While surgeons interviewed in a 2011 Georgia Regents University study believe fatigue has an effect on their ‘emotions, cognitive capability, and fine-motor skills,’ few of them said it has a large effect on patient safety.”
Obviously surgeons, just like many other workers, convince themselves that they can use willpower to overcome the real physical limitations that fatigue creates, limitations that lead to increased errors and patient harm.
Of course, everyone has some degree of fatigue in life. But the level of fatigue we are talking about goes above and beyond what could be deemed acceptable.
Is it possible to predict when someone will be too fatigued to produce reliable results? Yes. We worked with Circadian Technologies to help create their Fatigue Accident Causation Testing System (FACTS). Learn more about it at this link:
I believe fatigue is one of the most underreported causes of accidents and patient safety incidents. Why? Because people don’t understand how insidious fatigue is as an accident cause.
No matter what industry you are in, if you would you like to learn more about fatigue as a cause of human error, attend the 2014 Global TapRooT® Summit and hear Bill Sirois, COO at Circadian Technologies, present “Fatigue and Human Performance – The Tell-Tail Signs of Fatigue Related Mistakes” in the Human Error Reduction and Behavior Change Track. You will return to work with a heightened awareness of the risks presented when people go beyond their limits of fatigue.
The story continues to get more complicated in the train wreck in Lac-Megantic, Canada.
Initially people wondered about sabotage or the firefighting efforts on the engine that caught fire, disabling the train on a hillside. The Chairman of the Montreal, Maine & Atlantic Railway, Edward Burkhardt, blamed the train’s engineer for not setting enough hand brakes to hold the train in place. A quote from a story in the Globe and Mail said:
“… the outspoken chairman told reporters he believes that employee Tom Harding – the engineer responsible for the train that night – did not turn enough of the handbrakes that might have prevented the tragedy.”
Here’s raw video of the fire after the crash that is believed to have killed 48 people in Lac-Megantic.
The Canadian Transportation Safety Board Chair, Wendy Tadros, cautioned against blaming any one person for the accident. In her experience (and ours as well), major accidents are usually the result of a series of events with several causal factors and failed safeguards. In addition to the initiating event (the fire on the train that disabled it) and the next contributing event (the failure of the brakes to hold the train), the story said that the Canadian TSB will be looking “… at the way MM&A operates, its adherence to safety standards and other issues that might have contributed to the crash.“
Our greatest weakness lies in giving up. The most certain way to succeed is always to try just one more time.
From despair.com …
There are too many major accidents due to failures in process safety. These accidents go beyond the regulations written by OSHA and EPA (and the regulators in other countries). They go beyond the chemical industry and include the nuclear industry, oil exploration and production, fertilizer storage and distribution, grain elevators (and other dust explosion examples), aviation, shipping, utilities, and even hospitals.
How can these accidents be prevented? First one has to understand process safety and fatality prevention. Unfortunately, many senior managers don’t understand it. And that’s why Mark Paradies started giving talks about this topic at the TapRooT® Summit. Unfortunately, even though the Summits are well attended, thousands need to hear what Mark has to say, but don’t get the chance. That’s why we decided to post links to some of Mark’s Summit talks here.
Of course, attending the sessions at the TapRooT® Summit is much better than looking at slides and watching videos. But the information in these talks needs greater dissemination to help prevent major accidents around the world. Therefore, we’ve selected video clips, slides from Mark’s talks, and Admiral Rickover’s testimony before Congress after TMI (written remarks) to provide an overview of some of the concepts that senior managers need to consider to prevent major process safety accidents.
Here are the links:
Mark’s General Session Talk About Fatality Prevention from the 2013 Summit
I know this is a lot of information and the videos are long, but the lives lost each year are a preventable tragedy. Please pass this information on to those who you think may need it.
For those who would like to get Mark to talk to your senior management about management’s role in process safety and how the lessons from Admiral Rickover apply to your facilities, call us at 865-539-2139 or e-mail us by CLICKING HERE.
A creative man is motivated by the desire to achieve, not by the desire to beat others.
~ Ayn Rand
From the Motivation Hacker blog …
Monday Accident & Lessons Learned: Accidents at Intersections Reduced After Red Light Cameras Removed
April 29th, 2013 by Mark Paradies
Here’s a link to the story in the Houston Chronicle:
The story says that:
“In the five months after Houston voters forced city officials to turn off a camera surveillance system that fined motorists for running red lights, traffic accidents at those 50 intersections with 70 cameras have decreased 16 percent, according to recently released data.”
There were lots of reasons given by officials for this unexpected outcome, everything from “the weather was good” to “the cameras had trained people to be safer.”
The interesting statistic that no one mentioned was that rear-end collisions usually increase when red light cameras are installed because people slam on their brakes when a light turns red, to avoid a ticket, and get rear-ended.
There are at least two lessons that I think you can learn from this article.
1. People don’t know how to trend infrequently occurring accident statistics.
In this case, no one on either side of the argument used advanced trending techniques to prove their point. Instead, they chose the statistics that best fit their argument and claimed that those stats proved their point.
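To see why a 16 percent swing can be meaningless for rare events, consider a simple c-chart, one of the basic control charts for per-period counts. The sketch below uses made-up monthly accident counts (hypothetical numbers, not the Houston data from the article) and checks whether a 16 percent lower month even falls outside the chart’s control limits.

```python
import math

def c_chart_limits(baseline_counts):
    """Compute c-chart center line and 3-sigma control limits
    from a baseline of per-period event counts."""
    c_bar = sum(baseline_counts) / len(baseline_counts)   # average count
    sigma = math.sqrt(c_bar)            # Poisson counts: variance = mean
    ucl = c_bar + 3 * sigma             # upper control limit
    lcl = max(0.0, c_bar - 3 * sigma)   # lower control limit (floor at 0)
    return lcl, c_bar, ucl

# Twelve months of hypothetical baseline accident counts (NOT real data)
baseline = [41, 38, 45, 39, 44, 40, 42, 37, 43, 41, 39, 46]
lcl, c_bar, ucl = c_chart_limits(baseline)

# A month with a "16 percent decrease": signal, or common-cause variation?
new_month = round(c_bar * 0.84)
in_control = lcl <= new_month <= ucl
print(f"LCL={lcl:.1f}, CL={c_bar:.1f}, UCL={ucl:.1f}, "
      f"new count={new_month}, within limits={in_control}")
```

With these numbers the 16 percent lower month still sits well inside the 3-sigma limits, so a control chart would call it normal variation, not proof that removing the cameras made intersections safer. That is exactly the kind of check neither side of the argument applied.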
2. Sometimes corrective actions can have unintended consequences.
Several times in the past we’ve discussed red light cameras as an enforcement tool and the consequences that the tool could have on accident statistics. Our general opinion is that the cameras would be great for raising revenue but would do little to improve safety. For the reasons above, rear-end collisions were an unintended consequence of red light cameras that tended to increase accident rates at intersections where the devices were installed. Anyone looking to improve performance should learn that your corrective actions may have consequences other than the ones you intend them to have!
Psychology Today says …
Motivation is literally the desire to do things. It’s the difference between waking up before dawn to pound the pavement and lazing around the house all day. It’s the crucial element in setting and attaining goals—and research shows you can influence your own levels of motivation and self-control. So figure out what you want, power through the pain period, and start being who you want to be.
Here’s the Merriam-Webster Online Dictionary definition of “behavior”:
1. a : the manner of conducting oneself
b : anything that an organism does involving action and response to stimulation
c : the response of an individual, group, or species to its environment
2 : the way in which someone behaves; also : an instance of such behavior
3 : the way in which something functions or operates
Another definition of “behavior” that I think management has in their heads is:
“Any action or decision that an employee makes that management,
after the fact, decides was wrong.”
Why do I say that management uses this definition? Because I often hear about managers blaming the employee’s bad behavior for an accident.
For example, an employee is hurrying to get a job done and makes a mistake. That’s bad behavior!
What if an employee doesn’t hurry? Well, we yell at them to get going!
And what if they hurry and get the job done without an accident? We reward them for being efficient and a “go-getter.”
Management doesn’t usually see their role in making a “behavior” happen.
Behavior should NEVER be the end of a root cause analysis. Behavior is a fact. Just like a failed engine is a fact when a race car “blows its engine.”
Of course, a good root cause analysis should look into the causes for a behavior (a mistake) and uncover the reasons for the mistake and, if applicable, the controls that management has over behavior and how those controls failed when an accident occurred.
A bad decision or a human error that we call a “behavior” isn’t the end of the investigation … it is just the beginning!
TapRooT® helps investigators go beyond the symptoms (the behaviors) and find the root causes that management can fix. Some of the most difficult behaviors to fix are those so ingrained in the organization that people can’t see any other way to work.
For example, the culture of cost saving/cutting at BP was so ingrained that even after the explosions and deaths at the Texas City Refinery, BP didn’t (couldn’t?) change its culture – at least not in the Gulf of Mexico exploration division – before they had the Deepwater Horizon accident. At least that is what I see in the reports and testimony that I’ve reviewed after the accident.
And with smaller incidents, it is even harder to get some managers’ attention and show them how they are shaping behavior. But at least TapRooT® tries, by providing guidance for analyzing human errors that leads to true root causes (not just symptoms).
Want to find out more about TapRooT® and behavior? Attend one of our 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Courses. You’ll see how TapRooT® helps you analyze behavior issues in the exercises on the second day of the training. And you will learn much more. For a public 5-Day Course near you see:
CLICK HERE to go to their web site to see more videos and order gloves.
Does the value a nation places on life influence its health & safety standards versus the risk the government allows?
Health & Safety Standards = f(Value of Life-Risk Allowed)
Mark Paradies, President of System Improvements and co-creator of the TapRooT® System, will be speaking at the IOSH Conference in Spotlight Theatre 2 on Tuesday, February 26, and Wednesday, February 27.
His topics are:
Tuesday: 13:20 – 13:50 – Spotlight Theatre 2
BP Deepwater Horizon & BP Texas City Accidents: Two Lessons That You May NOT Have Learned
Much has been published about the BP Deepwater Horizon and Texas City Refinery accidents. But there are still some important lessons learned that people may be overlooking. Mark Paradies, root cause analysis expert, will share insights into two lessons learned that have not received much attention yet are important to safety improvement.
Wednesday: 11:20-11:50 – Spotlight Theatre 2
Fixing The Safety Pyramid & Stopping Major Accidents
Several articles have been published criticizing Heinrich’s Safety Pyramid and blaming its weaknesses for the gap between the decline in safety statistics and the continuing rate of serious injuries, including fatalities. Mark Paradies will share insight into the Safety Pyramid and explain why fatality prevention needs a revised model and new approaches to achieve across-the-board safety performance improvements.
Hope to see you there!
This lesson learned is from a regulatory incident.
I remember talking to a nuclear industry VP who had a troubled plant (NRC regulatory issues). He said that he was blindsided by their problems. It was as if the day before he was walking along on a beautiful day and the next morning he woke up at the bottom of a deep dark hole. The change seemed almost instantaneous … without warning.
How does a leading company go from excellence to disaster? It isn’t that there weren’t warning signs. The signs were there but management missed them.
The fastest way to get in trouble is to start thinking that you are so good that you don’t need to pay attention to small problems. That you can economize on improvement without experiencing performance declines. That your cost saving efforts will NOT lead to field personnel placing more emphasis on production and less on safety and quality.
The switch from a performance improvement focus to a cost-cutting focus can seem like a small change – a minor variation. But when the problems start – when you wake up at the bottom of the deep dark hole – you will say the same thing that the nuclear VP said:
If I’d known how bad
this was going to be,
I would have paid any
amount of money to avoid it.
Don’t find yourself at the bottom of the deep dark hole.
Keep your focus on performance improvement.
Learn best practices that others use to make their programs better every year.
Where can you learn these practices? At the 2013 Global TapRooT® Summit in Gatlinburg, TN!
Summit week is March 18-22. Register now to ensure your choice of the pre-Summit Courses.
Get complete Summit info including the complete Summit schedule at:
Remember, there is no time like the present to avoid a disaster!