Monday Accident & Lessons Learned: Incident Report from the UK Rail Accident Investigation Branch: Tram running with doors open on London Tramlink, Croydon
Posted: April 7th, 2014 in Accidents, Current Events, Human Performance, Investigations, Pictures
There were eight recommendations made by the UK RAIB. Here’s a summary of the investigation:
On Saturday 13 April 2013 between 17:33 and 17:38 hrs, a tram travelling from West Croydon to Beckenham Junction, on the London Tramlink system, departed from Lebanon Road and Sandilands tram stops with all of its doors open on the left-hand side. Some of the doors closed automatically during the journey, but one set of doors remained open throughout the incident. The incident ended when a controller monitoring the tram on CCTV noticed that it had departed from Sandilands with its doors open, and arranged for the tram to be stopped. Although there were no casualties, there was potential for serious injury.
The tram was able to move with its doors open because a fault override switch, which disables safety systems such as the door-traction interlock, had been inadvertently operated by the driver while trying to resolve a fault with the tram. The driver did not close and check the doors before departing from Lebanon Road and Sandilands partly because he was distracted from dealing with the fault, and partly because he did not believe that the tram could be moved with any of its doors open. The design of controls and displays in the driving cab contributed to the driver’s inadvertent operation of the fault override switch. Furthermore, breakdowns in communication between the driver and the passengers, and between the driver and the controller, meant that neither the driver nor the controller were aware of the problem until after the tram left Sandilands.
The RAIB has made eight recommendations. Four of these are to Tram Operations Ltd, aimed at improving the design of tram controls and displays, as well as training of staff on, and processes for, fault handling and communications. Two recommendations have been made to London Tramlink, one (in consultation with Tram Operations Ltd) relating to improving cab displays and labelling and one on enhancing the quality of the radio system on the network. One recommendation is made to all UK tram operators concerning the accidental operation of safety override switches. The remaining recommendation is to the Office of Rail Regulation regarding the provision of guidance on ergonomics principles for cab interface design.
For the complete report, see:
Five days of panic. 140,000 residents voluntarily evacuate. Fourteen years of clean-up.
The 35th anniversary of the Three Mile Island Nuclear Disaster.
On the midnight shift on March 28, 1979, things started to go wrong at TMI. A simple instrument problem started a chain of events that led to a core meltdown.
I can still remember that morning.
I was learning to operate a nuclear plant (S1W near Idaho Falls, ID) at the time. I was in the front seat of the bus riding out to the site. The bus driver had a transistor radio on and the news reported that there had been a nuclear accident at TMI. They switched to a live report from a farmer across the river. He said he could smell the radiation in the air. Also, his cows weren’t giving as much milk.
Years later, I attended the University of Illinois while also being an Assistant Professor (teaching midshipmen naval weapons and naval history). I was the first in a new program that was a cooperative effort between the Nuclear Engineering and Psychology Departments to research human factors and nuclear power plants. My advisor and mentor was Dr. Charles O. Hopkins, a human factors expert. In 1981-1982, he headed a group of human factors professionals who wrote a report for the NRC on what they should do to more fully consider human factors in nuclear reactor regulation.
As part of my studies I developed a course on the accident at TMI and published my thesis on function allocation and automation for the next generation of nuclear power plants.
So, each year when the anniversary of the accident comes around I think back to those days and how little we have learned (or should I say applied) about using good human factors to prevent industrial accidents.
WHAT ARE HUMAN PERFORMANCE TOOLS?
Over the past decade, best practices and techniques have been developed to “stop” or manage human error. They were developed mainly in the US nuclear industry and vary in content/name by the consultant/organization that offers them. Common tools include:
- Procedure Use*
- Place Keeping*
- Pre-Job Brief*
- Post-Job Brief
- Peer Checking*
- Time Out
- Rule of Three
- 3-Way Communication*
- Observation & Coaching*
- Questioning Attitude
- Attention to Detail
- Error Traps/Precursors
Here are some links to learn more about the tools above:
Also, if you plan on attending the 2014 Global TapRooT® Summit, attend Mark Paradies’ talk on human performance tools to learn more about these tools.
The asterisk (*) techniques above have always been included on the Root Cause Tree® (part of the TapRooT® System) because they are supported by established human factors research. Post-Job Briefs are also a well-established best practice that isn’t included on the Root Cause Tree® because it would occur after an incident or as part of the normal performance improvement program.
WHAT’S WRONG WITH HUMAN PERFORMANCE TOOLS?
Some of the techniques seem like excellent best practices (paying attention, having a questioning attitude, STAR, and Time Out), but I haven’t been able to find scientific human factors research that supports their use. For example, the “Rule of Three” is supposedly supported by research in the aviation industry that three yellow lights (conditions that are worrisome but not enough to prevent a flight) are equal to one red light (a flight no-go indicator – for example, weather that doesn’t meet the flight minimums).
Because they seem like good ideas, you may decide to adopt them, but they may not work as intended in all cases. After all, research hasn’t tested their limits.
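The arithmetic behind the “Rule of Three” can be sketched as a simple go/no-go check. This is purely illustrative: the function name, the example conditions, and the decision logic below are my own assumptions, not anything drawn from the aviation research or from any published tool.

```python
# Illustrative sketch of the "Rule of Three" go/no-go logic: three "yellow"
# (worrisome) conditions are treated as equivalent to one "red" (no-go)
# condition. All names here are hypothetical examples.

YELLOWS_PER_RED = 3  # the rule's conversion factor


def flight_decision(red_conditions, yellow_conditions):
    """Return 'no-go' if any red condition exists, or if enough yellow
    conditions accumulate to equal a red; otherwise return 'go'."""
    effective_reds = len(red_conditions) + len(yellow_conditions) // YELLOWS_PER_RED
    return "no-go" if effective_reds >= 1 else "go"


# Two yellows: still a go.
print(flight_decision([], ["crosswind near limit", "late departure"]))
# Three yellows add up to one red: no-go.
print(flight_decision([], ["crosswind near limit", "late departure", "minor radio fault"]))
# Any single red is an immediate no-go.
print(flight_decision(["weather below minimums"], []))
```

Even written out this plainly, the rule’s limits are obvious: it treats all yellow conditions as equally weighty, which is exactly the kind of untested assumption the paragraph above warns about.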
The final technique, Error Traps/Precursors seems to violate a couple of human factors principles and therefore should only be used with caution.
ERROR TRAPS / PRECURSORS
The concept behind Error Traps/Precursors is that certain human conditions are indicators of impending human error. If a person can self-monitor to detect the “error likely” human condition, he/she can then apply an appropriate human performance tool to avoid (stop) the impending error. For example, if you notice that you are rushing, you could apply STAR.
What are these human conditions? The selection varies depending on the consultant that presents the technique, but they commonly include:
- High Workload
- New Tasks
- First Time
- New Technique
A problem with this technique is that the person performing work must self-monitor to detect the human condition to self-trigger action. I’ve never seen research that people are particularly good at self-monitoring to detect any human condition. And even if they were, the list seems to indicate that people would be constantly self-triggering. By this list, people are always just about to make a mistake. (To err is human?)
Constant self-monitoring points to another human factors limitation. The human brain automatically apportions a very limited resource – attention. Your brain continuously, subconsciously decides what to pay attention to and what to ignore. Your brain decides what sounds are important and which ones are noise. Your brain may decide that motion in the visual field deserves more attention than a stationary object. Or that a sharp pain is more important than a faint touch.
In times of crisis or when one is busy, your ability to pay attention is stressed. Imagine yourself driving on ice. You are so focused on the feel of the road and preventing sliding that you don’t have enough attention left over to even have a casual conversation.
Even when you are not stressed, if you self-monitor your state, you are stealing attention from some other task. What faint signal might you miss?
All of the Human Performance Tools have a common limitation. They are weak corrective actions. They are 5’s or 6’s on the TapRooT® hierarchy of controls. Rules, procedures, and training are all attempts at improving human performance. And the human may be your weakest safeguard. If your human performance improvement program is based on the weakest safeguards, what should you expect?
This doesn’t mean that you should not try proven human performance tools. It means that you should try to adopt stronger safeguards and understand the limitations of human performance tools and, at a minimum, implement defense in depth to ensure adequate performance.
On August 14, 2013, UPS Airlines Flight 1354 crashed and burst into flames short of the runway on approach to Birmingham–Shuttlesworth International Airport. Both pilots of the cargo plane were pronounced dead at the scene of the crash.
Two years ago, the Federal Aviation Administration issued new rules aimed at ensuring airline pilots have sufficient rest. The draft regulations would have included cargo airlines, but the final rules exempted them, citing cost.
Read the rest of the story on The Washington Post.
Learn more about tell-tale signs of fatigue-related mistakes at the 2014 Global TapRooT® Summit. Summit speaker Bill Sirois, Senior Vice President and Chief Operating Officer for Circadian Technologies, will be speaking about fatigue and human performance.
Fatigue in the workplace is difficult to measure, and it is even more difficult to identify as a causal factor of accidents and injuries. However, fatigue does contribute to human error, including errors in judgment, risk-taking behaviors, and clouded decision-making, and it degrades the ability to handle stress and reaction time.
Join us for the Human Error Reduction and Behavior Change track, April 9 – 11, 2014 in Horseshoe Bay, Texas, to hear this talk.
LEARN MORE on the Summit website.
REGISTER NOW for the Human Error Reduction and Behavior Change track.
The following is the text of a speech delivered in 1982 by Admiral Hyman G. Rickover – the father of the Nuclear Navy – at Columbia University. Rickover’s accomplishments as the head of the Nuclear Navy are legendary: he took the first power-producing submarine nuclear reactor from scratch to operation in just three years, and he created a program that has guaranteed process safety (nuclear safety) for over 60 years (zero nuclear accidents).
I am reprinting this speech here because I believe that many do not understand the management concepts needed to guarantee process safety. We teach these concepts in our “Reducing Serious Injuries and Fatalities Using TapRooT®” pre-Summit course. Since many won’t be able to attend this training, I wanted to give all an opportunity to learn these valuable lessons by posting this speech.
- – -
Human experience shows that people, not organizations or management systems, get things done. For this reason, subordinates must be given authority and responsibility early in their careers. In this way they develop quickly and can help the manager do his work. The manager, of course, remains ultimately responsible and must accept the blame if subordinates make mistakes.
As subordinates develop, work should be constantly added so that no one can finish his job. This serves as a prod and a challenge. It brings out their capabilities and frees the manager to assume added responsibilities. As members of the organization become capable of assuming new and more difficult duties, they develop pride in doing the job well. This attitude soon permeates the entire organization.
One must permit his people the freedom to seek added work and greater responsibility. In my organization, there are no formal job descriptions or organizational charts. Responsibilities are defined in a general way, so that people are not circumscribed. All are permitted to do as they think best and to go to anyone and anywhere for help. Each person then is limited only by his own ability.
Complex jobs cannot be accomplished effectively with transients. Therefore, a manager must make the work challenging and rewarding so that his people will remain with the organization for many years. This allows it to benefit fully from their knowledge, experience, and corporate memory.
The Defense Department does not recognize the need for continuity in important jobs. It rotates officers every few years both at headquarters and in the field. The same applies to their civilian superiors.
This system virtually ensures inexperience and nonaccountability. By the time an officer has begun to learn a job, it is time for him to rotate. Under this system, incumbents can blame their problems on predecessors. They are assigned to another job before the results of their work become evident. Subordinates cannot be expected to remain committed to a job and perform effectively when they are continuously adapting to a new job or to a new boss.
When doing a job—any job—one must feel that he owns it, and act as though he will remain in the job forever. He must look after his work just as conscientiously, as though it were his own business and his own money. If he feels he is only a temporary custodian, or that the job is just a stepping stone to a higher position, his actions will not take into account the long-term interests of the organization. His lack of commitment to the present job will be perceived by those who work for him, and they, likewise, will tend not to care. Too many spend their entire working lives looking for their next job. When one feels he owns his present job and acts that way, he need have no concern about his next job.
In accepting responsibility for a job, a person must get directly involved. Every manager has a personal responsibility not only to find problems but to correct them. This responsibility comes before all other obligations, before personal ambition or comfort.
A major flaw in our system of government, and even in industry, is the latitude allowed to do less than is necessary. Too often officials are willing to accept and adapt to situations they know to be wrong. The tendency is to downplay problems instead of actively trying to correct them. Recognizing this, many subordinates give up, contain their views within themselves, and wait for others to take action. When this happens, the manager is deprived of the experience and ideas of subordinates who generally are more knowledgeable than he in their particular areas.
A manager must instill in his people an attitude of personal responsibility for seeing a job properly accomplished. Unfortunately, this seems to be declining, particularly in large organizations where responsibility is broadly distributed. To complaints of a job poorly done, one often hears the excuse, “I am not responsible.” I believe that is literally correct. The man who takes such a stand in fact is not responsible; he is irresponsible. While he may not be legally liable, or the work may not have been specifically assigned to him, no one involved in a job can divest himself of responsibility for its successful completion.
Unless the individual truly responsible can be identified when something goes wrong, no one has really been responsible. With the advent of modern management theories it is becoming common for organizations to deal with problems in a collective manner, by dividing programs into subprograms, with no one left responsible for the entire effort. There is also the tendency to establish more and more levels of management, on the theory that this gives better control. These are but different forms of shared responsibility, which easily lead to no one being responsible – a problem that often inheres in large corporations as well as in the Defense Department.
When I came to Washington before World War II to head the electrical section of the Bureau of Ships, I found that one man was in charge of design, another of production, a third handled maintenance, while a fourth dealt with fiscal matters. The entire bureau operated that way. It didn’t make sense to me. Design problems showed up in production, production errors showed up in maintenance, and financial matters reached into all areas. I changed the system. I made one man responsible for his entire area of equipment—for design, production, maintenance, and contracting. If anything went wrong, I knew exactly at whom to point. I run my present organization on the same principle.
A good manager must have unshakeable determination and tenacity. Deciding what needs to be done is easy, getting it done is more difficult. Good ideas are not adopted automatically. They must be driven into practice with courageous impatience. Once implemented they can be easily overturned or subverted through apathy or lack of follow-up, so a continuous effort is required. Too often, important problems are recognized but no one is willing to sustain the effort needed to solve them.
Nothing worthwhile can be accomplished without determination. In the early days of nuclear power, for example, getting approval to build the first nuclear submarine—the Nautilus—was almost as difficult as designing and building it. Many in the Navy opposed building a nuclear submarine.
In the same way, the Navy once viewed nuclear-powered aircraft carriers and cruisers as too expensive, despite their obvious advantages of unlimited cruising range and ability to remain at sea without vulnerable support ships. Yet today our nuclear submarine fleet is widely recognized as our nation’s most effective deterrent to nuclear war. Our nuclear-powered aircraft carriers and cruisers have proven their worth by defending our interests all over the world—even in remote trouble spots such as the Indian Ocean, where the capability of oil-fired ships would be severely limited by their dependence on fuel supplies.
The man in charge must concern himself with details. If he does not consider them important, neither will his subordinates. Yet “the devil is in the details.” It is hard and monotonous to pay attention to seemingly minor matters. In my work, I probably spend about ninety-nine percent of my time on what others may call petty details. Most managers would rather focus on lofty policy matters. But when the details are ignored, the project fails. No infusion of policy or lofty ideals can then correct the situation.
To maintain proper control one must have simple and direct means to find out what is going on. There are many ways of doing this; all involve constant drudgery. For this reason those in charge often create “management information systems” designed to extract from the operation the details a busy executive needs to know. Often the process is carried too far. The top official then loses touch with his people and with the work that is actually going on.
Attention to detail does not require a manager to do everything himself. No one can work more than twenty-four hours each day. Therefore to multiply his efforts, he must create an environment where his subordinates can work to their maximum ability. Some management experts advocate strict limits to the number of people reporting to a common superior—generally five to seven. But if one has capable people who require but a few moments of his time during the day, there is no reason to set such arbitrary constraints. Some forty key people report frequently and directly to me. This enables me to keep up with what is going on and makes it possible for them to get fast action. The latter aspect is particularly important. Capable people will not work for long where they cannot get prompt decisions and actions from their superior.
I require frequent reports, both oral and written, from many key people in the nuclear program. These include the commanding officers of our nuclear ships, those in charge of our schools and laboratories, and representatives at manufacturers’ plants and commercial shipyards. I insist they report the problems they have found directly to me—and in plain English. This provides them unlimited flexibility in subject matter—something that often is not accommodated in highly structured management systems—and a way to communicate their problems and recommendations to me without having them filtered through others. The Defense Department, with its excessive layers of management, suffers because those at the top who make decisions are generally isolated from their subordinates, who have the first-hand knowledge.
To do a job effectively, one must set priorities. Too many people let their “in” basket set the priorities. On any given day, unimportant but interesting trivia pass through an office; one must not permit these to monopolize his time. The human tendency is to while away time with unimportant matters that do not require mental effort or energy. Since they can be easily resolved, they give a false sense of accomplishment. The manager must exert self-discipline to ensure that his energy is focused where it is truly needed.
All work should be checked through an independent and impartial review. In engineering and manufacturing, industry spends large sums on quality control. But the concept of impartial reviews and oversight is important in other areas also. Even the most dedicated individual makes mistakes—and many workers are less than dedicated. I have seen much poor work and sheer nonsense generated in government and in industry because it was not checked properly.
One must create the ability in his staff to generate clear, forceful arguments for opposing viewpoints as well as for their own. Open discussions and disagreements must be encouraged, so that all sides of an issue will be fully explored. Further, important issues should be presented in writing. Nothing so sharpens the thought process as writing down one’s arguments. Weaknesses overlooked in oral discussion become painfully obvious on the written page.
When important decisions are not documented, one becomes dependent on individual memory, which is quickly lost as people leave or move to other jobs. In my work, it is important to be able to go back a number of years to determine the facts that were considered in arriving at a decision. This makes it easier to resolve new problems by putting them into proper perspective. It also minimizes the risk of repeating past mistakes. Moreover if important communications and actions are not documented clearly, one can never be sure they were understood or even executed.
It is a human inclination to hope things will work out, despite evidence or doubt to the contrary. A successful manager must resist this temptation. This is particularly hard if one has invested much time and energy on a project and thus has come to feel possessive about it. Although it is not easy to admit what a person once thought correct now appears to be wrong, one must discipline himself to face the facts objectively and make the necessary changes—regardless of the consequences to himself. The man in charge must personally set the example in this respect. He must be able, in effect, to “kill his own child” if necessary and must require his subordinates to do likewise. I have had to go to Congress and, because of technical problems, recommended terminating a project that had been funded largely on my say-so. It is not a pleasant task, but one must be brutally objective in his work.
No management system can substitute for hard work. A manager who does not work hard or devote extra effort cannot expect his people to do so. He must set the example. The manager may not be the smartest or the most knowledgeable person, but if he dedicates himself to the job and devotes the required effort, his people will follow his lead.
The ideas I have mentioned are not new—previous generations recognized the value of hard work, attention to detail, personal responsibility, and determination. And these, rather than the highly-touted modern management techniques, are still the most important in doing a job. Together they embody a common-sense approach to management, one that cannot be taught by professors of management in a classroom.
I am not against business education. A knowledge of accounting, finance, business law, and the like can be of value in a business environment. What I do believe is harmful is the impression often created by those who teach management that one will be able to manage any job by applying certain management techniques together with some simple academic rules of how to manage people and situations.
Why Are the Major, Steady Declines in Minor and Recordable Injuries Not Seen to the Same Extent in Major Accident (Fatality) Statistics?
Posted: December 26th, 2013 in Courses, Human Performance, Performance Improvement
Why are the major, steady declines in minor and recordable injuries not seen to the same extent in major accident (fatality) statistics? Mark Paradies has new insight into the phenomenon and has used it to develop systematic methods to stop major accidents by using TapRooT® both reactively and proactively.
Register for Reducing Serious Injuries & Fatalities Using TapRooT®, a 2-Day Pre-Summit Course scheduled for April 7-8, 2014 in Horseshoe Bay, Texas.
The course highlights three major sources of major accidents:
* industrial hazards
* process safety and
* driving safety.
Learn new ideas to revolutionize your fatality/major accident prevention programs and start down the road to eliminating major accidents.
Learn more about the Summit: http://www.taproot.com/taproot-summit
Register for this 2-day course and the Summit and save $200!
Watch two children explain their morning routine using a process flow chart and a control chart.
If you do not have a knowledgeable kindergartner hanging around to help you, I would recommend attending the following this April during our TapRooT® Summit Week:
Advanced Trending Techniques
TapRooT® Quality/Six Sigma/Lean Advanced Root Cause Analysis Training http://www.taproot.com/taproot-summit/pre-summit-courses#TapRooTSixSigma
Process Quality and Corrective Action Programs
The Associated Press reports, “Dutch bus drivers to test fatigue warning.”
See the article at:
Cook found alive in wreckage three days after ship sinks…
If you answered a hornet’s nest, you are correct!
It’s the first one I’ve ever seen in person in the wild (with real, live hornets buzzing in and out).
What does this have to do with root cause analysis?
Practice the skills you learn in a TapRooT® class by analyzing everyday situations. In this example, let’s look at Energy – Safeguard – Target.
What is the ENERGY?
I guess I would call it a biological source of Energy – HORNETS!
What is the TARGET?
Anything that disturbs the nest. It could have been me if I moved any closer.
What are the SAFEGUARDS that protected me from the hornets?
In this case, the only safeguard was my own awareness when walking through the woods.
That’s a pretty weak human performance safeguard. But this time it worked!
Should I have removed the hazard? No way! That’s much more risk than just leaving the area and remembering where the nest is.
How many “awareness” safeguards do you depend on at work? Is that really good enough? Should you be removing the hazards?
That’s your root cause analysis tip to think about for today!
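The Energy – Safeguard – Target walkthrough above can be sketched as a tiny data model. To be clear, this is my own illustrative sketch: the class names, the numeric strength scale, and the method are hypothetical assumptions for the hornet’s nest example, not part of the TapRooT® System.

```python
# A minimal sketch of Energy - Safeguard - Target (Safeguard Analysis)
# applied to the hornet's nest example. The 1-6 strength scale loosely
# mirrors a hierarchy of controls: low numbers = strong engineered
# controls, high numbers = weak human-action safeguards. All names and
# values are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Safeguard:
    name: str
    strength: int  # 1 = strong engineered control ... 6 = weak human action


@dataclass
class Hazard:
    energy: str
    target: str
    safeguards: list

    def weakest_safeguard(self):
        """Find the weakest safeguard (highest number on the scale).
        This is the analysis question: are you depending on awareness alone?"""
        return max(self.safeguards, key=lambda s: s.strength)


nest = Hazard(
    energy="hornets (a biological energy source)",
    target="anyone who disturbs the nest",
    safeguards=[Safeguard("personal awareness while walking", strength=6)],
)

weakest = nest.weakest_safeguard()
print(f"Weakest safeguard: {weakest.name} (strength {weakest.strength})")
```

Run against the example, the model makes the point of the post explicit: the only safeguard between the energy and the target is a strength-6 human-performance safeguard, which is exactly the kind of dependence the closing questions ask you to reconsider at work.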
Want to learn more about TapRooT®, advanced root cause analysis, and Energy – Safeguard – Target Analysis (we call it Safeguard Analysis)? Then attend a 5-Day TapRooT® Advanced Root Cause Analysis Team Leader Course. See the upcoming worldwide public course schedule at:
When many people first see the “enforcement NI” root cause on the TapRooT® Root Cause Tree®, they think that the obvious corrective action would be discipline. But those who read the TapRooT® Corrective Action Helper® know that there are reasons for people violating rules and that the only way to truly change behavior is to address the reasons why the rules are being broken.
The article, “The Five Pitfalls of Discipline in Safety,” clearly presents reasons why enforcement might need improvement. Click on the link and see if you agree.
Here’s a link to a LinkedIn article about motivation:
The article starts with:
Ten years ago, I was managing a team of talented marketers at Yahoo! when something unexpected happened. In a one-on-one meeting, a woman on my team said to me, “I wanted you to know that if I ever do a really good job, just pay me more money. I don’t care about recognition or awards and I’m not motivated by praise. If I do well, just give me a bonus or pay me more.”
By Chris Vallee
I was an aircraft mechanic in the USAF when this incident occurred. The aftermath of the F-15 crash and pilot fatality continued with an airman’s suicide, a loss to many.
While I knew the basics, I only recently found a follow-up report and wanted to share it. The information is taken directly from the article, as is, without my paraphrase. Here is the website.
An Air Force review board has partly cleared the name of an F-15 mechanic who committed suicide in 1996 rather than face a court-martial for a fatal repair error.
Evidence showed that TSgt. XXXXXX did not perform the botched control rod maintenance at issue, although he did check the work and found nothing wrong.
In addition, several previous incidents in which other mechanics made the same mistakes should have alerted the Air Force to a potential problem, according to the board.
“We did not think XXXX was totally free of all responsibility,” said Lee Baseman, chairman of the correction board. “But it was our view that he was unduly carrying the burden for a series of missteps that went back at least 10 years.”
In May 1995, XXXX and TSgt. YYYYYY were carrying out maintenance on an F-15C based at Spangdahlem AB, Germany, when YYYYY accidentally crossed flight control rods while reinstalling them. XXXX did not catch the miscue, which made the airplane impossible to control in the air. It subsequently crashed, killing Maj. Donald G. Lowry Jr. (Great GUY!!)
Air Force authorities charged XXXX and YYYYY with dereliction of duty and negligent homicide. XXXXX shot himself in October 1996 during a break in court proceedings. Commanding officers then accepted YYYYY’s request for administrative separation, on grounds that the interests of the service would be best served by bringing the tragic case to a swift conclusion.
Similar crossed-rod cases occurred at least twice before the Spangdahlem crash, the review board noted: once in 1986 and again in 1991. But in both instances the problem was caught before takeoff.
In its conclusions, the board stated, “After the Black Hawk shootdown [in 1994], the demand for accountability for this accident may have been pursued with such zeal as to leave fairness and equity behind. The fatal crash was a tragedy waiting to happen, yet the decedent was singled out to pay for an accident that could have been prevented anywhere along the ‘chain of events’ had any of the numerous individuals involved made different decisions.
“Most disturbing was the way the Air Force leadership allowed this case to be handled. The Air Force’s representatives resisted the inclusion of potentially exculpatory evidence from the review and report and managed to have a good deal of it excluded from consideration in the pending trial.”
Following the death of Lowry, the Air Force took steps to prevent such a mix-up from happening again. The control rods are now color-coded to ensure proper installation, and the maintenance technical manual warns against the mistake. All flight control systems must now be checked any time the control rods undergo maintenance.
Ref: Journal of the Air Force Association, June 1998, Vol. 81, No. 5, Peter Grier
I find it hard to believe that this question needs to be asked. Of course fatigue can lead to human error. This has been proven over and over again. And doctors are human.
I read an article in the Poughkeepsie Journal that had the following quote:
“While surgeons interviewed in a 2011 Georgia Regents University study believe fatigue has an effect on their ‘emotions, cognitive capability, and fine-motor skills,’ few of them said it has a large effect on patient safety.”
Obviously surgeons, just like many other workers, convince themselves that they can use willpower to overcome the real physical limitations that fatigue creates: limitations that lead to increased errors and patient harm.
Of course, everyone has some degree of fatigue in life. But the level of fatigue we are talking about goes above and beyond what could be deemed acceptable.
Is it possible to predict when someone will be too fatigued to produce reliable results? Yes. We worked with Circadian Technologies to help create their Fatigue Accident Causation Testing System (FACTS). Learn more about it at this link:
I believe fatigue is one of the most underreported causes of accidents and patient safety incidents. Why? Because people don't understand how insidious fatigue is as an accident cause.
No matter what industry you are in, if you would like to learn more about fatigue as a cause of human error, attend the 2014 Global TapRooT® Summit and hear Bill Sirois, COO of Circadian Technologies, present "Fatigue and Human Performance – The Tell-Tale Signs of Fatigue-Related Mistakes" in the Human Error Reduction and Behavior Change Track. You will return to work with a heightened awareness of the risks created when people push beyond the limits that fatigue imposes.
The story continues to get more complicated in the train wreck in Lac-Megantic, Canada.
Initially, people wondered about sabotage, or about the firefighting efforts after an engine fire that left the train disabled on a hillside. The chairman of the Montreal, Maine & Atlantic Railway, Edward Burkhardt, blamed the train's engineer for not setting enough hand brakes to hold the train in place. A story in The Globe and Mail reported:
“… the outspoken chairman told reporters he believes that employee Tom Harding – the engineer responsible for the train that night – did not turn enough of the handbrakes that might have prevented the tragedy.”
Here’s raw video of the fire after the crash that is believed to have killed 48 people in Lac-Megantic.
The Canadian Transportation Safety Board Chair, Wendy Tadros, cautioned against blaming any one person for the accident. In her experience (and ours as well), major accidents are usually the result of a series of events with several causal factors and failed safeguards. In addition to the initiating event (the fire on the train that disabled it) and the next contributing event (the failure of the brakes to hold the train), the story said that the Canadian TSB will be looking "… at the way MM&A operates, its adherence to safety standards and other issues that might have contributed to the crash."
Our greatest weakness lies in giving up. The most certain way to succeed is always to try just one more time.
From despair.com …
There are too many major accidents due to failures in process safety. These accidents go beyond the regulations written by OSHA and EPA (and the regulators in other countries). They go beyond the chemical industry and include the nuclear industry, oil exploration and production, fertilizer storage and distribution, grain elevators (and other dust explosion examples), aviation, shipping, utilities, and even hospitals.
How can these accidents be prevented? First, one has to understand process safety and fatality prevention. Unfortunately, many senior managers don't understand it. That's why Mark Paradies started giving talks about this topic at the TapRooT® Summit. But even though the Summits are well attended, thousands of people who need to hear what Mark has to say don't get the chance. That's why we decided to post links to some of Mark's Summit talks here.
Of course, attending the sessions at the TapRooT® Summit is much better than looking at slides and watching videos. But the information in these talks needs greater dissemination to help prevent major accidents around the world. Therefore, we've selected video clips, slides from Mark's talks, and Admiral Rickover's testimony before Congress after TMI (written remarks) to provide an overview of some of the concepts that senior managers need to consider to prevent major process safety accidents.
Here are the links:
Mark’s General Session Talk About Fatality Prevention from the 2013 Summit
I know this is a lot of information and the videos are long, but the lives lost each year are a preventable tragedy. Please pass this information on to those who you think may need it.
For those who would like to get Mark to talk to your senior management about management’s role in process safety and how the lessons from Admiral Rickover apply to your facilities, call us at 865-539-2139 or e-mail us by CLICKING HERE.
A creative man is motivated by the desire to achieve, not by the desire to beat others.
~ Ayn Rand
From the Motivation Hacker blog …
Monday Accident & Lessons Learned: Accidents at Intersections Reduced After Red Light Cameras Removed
Posted: April 29th, 2013 in Accidents, Human Performance, Performance Improvement
Here’s a link to the story in the Houston Chronicle:
The story says that:
“In the five months after Houston voters forced city officials to turn off a camera surveillance system that fined motorists for running red lights, traffic accidents at those 50 intersections with 70 cameras have decreased 16 percent, according to recently released data.”
Officials gave lots of reasons for this unexpected outcome, everything from "the weather was good" to "the cameras had trained people to be safer."
The interesting statistic that no one mentioned was that rear-end collisions usually increase when red light cameras are installed: to avoid a ticket, drivers slam on their brakes when a light turns red and get rear-ended.
There are at least two lessons that I think you can learn from this article.
1. People don’t know how to trend infrequently occurring accident statistics.
In this case, no one on either side of the argument used advanced trending techniques to prove their point. Instead, they chose the statistics that best fit their argument and claimed that those stats proved their point.
2. Sometimes corrective actions can have unintended consequences.
Several times in the past we've discussed red light cameras as an enforcement tool and the consequences the tool could have on accident statistics. Our general opinion is that the cameras are great for raising revenue but do little to improve safety. Rear-end collisions were an unintended consequence of red light cameras, one that tended to increase accident rates at the intersections where the devices were installed. So everyone looking to improve performance should learn that corrective actions may have consequences other than the ones you intend!
Psychology Today says …
Motivation is literally the desire to do things. It’s the difference between waking up before dawn to pound the pavement and lazing around the house all day. It’s the crucial element in setting and attaining goals—and research shows you can influence your own levels of motivation and self-control. So figure out what you want, power through the pain period, and start being who you want to be.