December 18, 2019 | Mark Paradies

Which Human Performance Tools Work and Which Ones Don’t?


Human Performance Tools: Good, Bad, or Ugly?

In the past two decades, the Nuclear Industry has developed tools to try to improve human performance. The goal? Achieve error-free operations. I’ve been interested in these tools and how they work.

This article provides a list of the tools and a critique of one of the tools. It also provides a link to a course where you can learn about all of the tools and other methods to stop human error and improve human performance.

List of Human Performance Tools

The tools, sometimes called HU or HuP (Human Performance) tools or HPT (Human Performance Technology), have many variations. Some were initiated by the military, especially the Nuclear Navy, and many were further applied in the commercial nuclear industry at the insistence of the Institute of Nuclear Power Operations (INPO) in Atlanta. The tools have taken on a life of their own as they have spread to other industries by way of consultants who added to and modified the techniques. Thus, even though we will try to provide the most accurate description possible, the descriptions may vary from the practices you learned from a particular consultant. Despite the variation, the basics of the tools are usually similar and include:

  • Procedure Use/Adherence
  • Place Keeping
  • Independent Verification
  • Three-Way Communications
  • Pre-Job Brief/Personal Safety Assessment
  • Observation and Coaching
  • Post-Job Brief
  • Concurrent Verification
  • Questioning Attitude
  • STAR
  • Time Out (Stop When Unsure)
  • Attention to Detail
  • Management of Change
  • Error Traps and Precursors
  • Validate Assumptions
  • Do Not Disturb Sign
  • Conservative Decision-Making

Which HU Tools Work and Which Ones Don’t

I find it strange that there hasn’t been a research project to verify the effectiveness of the above-mentioned human performance tools as a set. Of course, there is evidence that some of the tools are effective. For example, Procedure Use/Adherence and Place Keeping are fairly well proven as effective.

However, for about half of the techniques, I haven’t been able to find any research that shows the tool’s effectiveness. Even worse, some of the techniques seem to violate basic human factors principles.

Questioning Attitude

Let’s look at an example: Questioning Attitude. (The following is taken from the book, Stopping Human Error and is copyrighted material used by permission.)

The U.S. Nuclear Regulatory Commission defines a Questioning Attitude as:

“…when individuals avoid complacency and continuously
challenge existing conditions and activities in order to identify
discrepancies that might result in an error or inappropriate action.”

After an accident, an investigator can usually point to the lack of a Questioning Attitude as a contributor to the accident. Obviously, if the operator or mechanic had a Questioning Attitude, they would have avoided complacency, identified the discrepancy, and prevented the error or inappropriate action (just as the NRC definition says).

It would be nice if people could continuously monitor their work environments and proactively detect errors and stop accidents from happening. In some cases, this happens. Workers catch mistakes and prevent accidents. But how reliable are people at continuously questioning, challenging assumptions, and detecting anomalies? That depends.

Research shows that the brain consistently makes corrections to fill in blank spots and make sense of what we see. This happens literally when the brain fills in the blind spot in your vision (see Being Wrong: Adventures in the Margin of Error, Kathryn Schulz, HarperCollins Publishers, NY, 2010).

I’ve tried to find reliable human factors research showing that humans can be trained to deny their human nature and become robotic, error-proof checkers who constantly verify and re-verify the accuracy of their own actions and the actions of those around them. I can’t find any such research.

What can I find? The brain can detect errors. It senses when something is wrong. But there is a problem. Humans aren’t consistent. For example, they catch one typo and miss the next.

In research on detecting spreadsheet errors (yes, people are paid to detect errors in spreadsheets), several researchers found that the odds of an individual inspector finding a seeded error were 50% to 60%. (See “Does Spell-Checking Software Need a Warning Label?” Communications of the ACM 48(7), pp. 82-86, July 2005, by D.F. Galletta, A. Durcikova, A. Everard, and B. Jones, and the Human Error Website by Ray Panko, 2007.) Ray Panko at the University of Hawaii compiled a table of error-finding probabilities from this research.

A sample of the data he presented about catching errors includes:

Work – Error Detection Rate (%)

  • Postal workers typing ZIP codes – 50%
  • Programmers finding coding errors – 43%
  • Nuclear workers detecting execution errors in a simulator exercise – ~ 66%
  • Nuclear workers detecting misdiagnosis errors in a simulator exercise – 0%
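A small illustrative calculation (my own sketch, not from Panko’s data) shows why adding independent checkers helps but can never guarantee detection: if each reviewer independently catches an error with probability p, the chance that at least one of n reviewers catches it is 1 − (1 − p)^n. Note the big caveat: real reviewers share blind spots and expectations, so they are not truly independent, and actual performance is usually worse than this idealized model suggests.

```python
def detection_probability(p: float, n: int) -> float:
    """Probability that at least one of n INDEPENDENT reviewers catches
    an error, given each catches it with probability p."""
    return 1 - (1 - p) ** n

# One reviewer at the ~50% rate reported for seeded spreadsheet errors:
print(detection_probability(0.5, 1))  # 0.5 - a coin flip

# Three independent reviewers at the same per-person rate:
print(detection_probability(0.5, 3))  # 0.875 - better, but still not 100%
```

Even under the optimistic independence assumption, three reviewers at a 50% individual rate still miss one error in eight, which is why “someone should have caught it” is a poor accident explanation.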

Perhaps those nuclear operators needed a better questioning attitude!

In the paper, Thinking is Bad: Implications of Human Error Research for Spreadsheet Research and Practice, Ray Panko writes:

The second fundamental thing to understand from human error research is that making large spreadsheets error free is theoretically and practically impossible and that even reducing errors by 80% to 90% is extremely difficult and will require spending about 30% of project time on testing and inspection. Trusting “eyeballing” and “checking for reasonableness” to reduce errors to a reasonable level flies in the face of a century of human error research and is a testament to the profound human tendency to overestimate their ability to control their environment even when they cannot.

Yes, you can catch some errors. Yes, it is good to try to catch all the errors. You may even be good at it (perhaps 66%). But you won’t be perfect. And when you are imperfect, someone will say:

“You should have had a questioning attitude.”

Questioning Attitude Example

Here is a healthcare example from the U.S. NRC newsletter Safety Culture Trait Talk (Issue 3, December 2014):

“A hospital was conducting a cancer treatment with a high-dose rate brachytherapy remote after loading system using an iridium-192 source. Just prior to the cancer treatment, the hospital had replaced the source and upgraded the software.

When entering the data into the treatment system, the medical physicist was unable to electronically transfer the patient’s treatment plan from the planning system to the treatment system due to an error message. After several failed attempts by staff, the medical physicist entered the treatment plan manually into the treatment system’s control console, rather than question why he was seeing the error message.

Due to a bug in the software upgrade, the treatment system software created an unexpected source step size change in the treatment parameters. When the medical physicist entered the data manually for the source dwell times, the software automatically changed the entered data to the default parameters for the source step size. The medical physicist faced an unexpected condition with the software error and failed to recognize the change in the source step size.

The patient was then treated with a mispositioned source.

The medical physicist failed to verify that the treatment computer system was correct after data entry and prior to treatment. In addition, the hospital failed to follow its procedure of performing an independent review of the treatment plan prior to patient treatment.

As a result, the patient received a radiation dose to tissue outside the treatment area and an underdose to the treatment site.

This scenario illustrates equipment (software) errors as the initial precipitating event. Had the medical physicist used a questioning attitude, he could have identified the equipment failure and the hospital could have corrected this failure before treating the patient.”

Thus, every error – even computer errors – should be detected by a Questioning Attitude! (At least according to this article from an NRC newsletter.)

When Should You Have a Questioning Attitude?

When do human performance tool experts say you should use a Questioning Attitude? Here’s a list I compiled from different sources:

  • When you feel uncertain
  • When your gut tells you there is something wrong
  • When something unanticipated happens
  • When you are using the human performance tool STAR
  • When making important decisions
  • When you are confused or in doubt
  • When you find something missing
  • When someone says, “We’ve always done it this way.”
  • When you encounter inconsistencies
  • Before doing something important (like a critical procedure step)
  • When plans and procedures don’t agree
  • When plant conditions are different than you expected
  • When you encounter something new/different
  • When you hear: “I assume” or “probably” or “I think …”
  • When normal work processes fail/don’t produce the expected results

That’s a nice list. Peter Parker (Spider-Man) would add:

When my spidey senses are tingling!

When do we recommend having a Questioning Attitude? Always!

The only time everyone is sure you should have had a Questioning Attitude is AFTER an accident because:

Hindsight is 20/20.

And that is the secret to this human performance tool.

The Secret Problem

After an accident, it is easy to point out where you should have noticed an error, asked a question, or doubted your senses. But, in the heat of battle, while the clock is ticking, when you have lots to do and only eight hours to do it, spotting all the errors and correcting them is NEVER a sure thing. Thus, you are doomed by fate to be caught without your Questioning Attitude when you need it the most (at least that’s what the Inspector from the NRC will say).

So, have a Questioning Attitude but don’t think of it as a magic error reduction tool.

Use your Questioning Attitude on anyone who tells you that it is a reliable error reduction technique. Question their belief that it is a foolproof tool to stop human errors.

In addition, management and supervision can’t assess the use of a Questioning Attitude because they don’t know when the worker should be actively questioning something. Penalizing someone for not having a Questioning Attitude after an incident looks like and is … BLAME.

We recommend that people have a Questioning Attitude, but we don’t see this as a foolproof tool to stop human errors.

Want to Learn More About Human Performance Tools and Stopping Human Error?

In the book, Stopping Human Error, we cover the research and effectiveness of all the Human Performance tools. In addition, we’ve built this book and many other topics into our course:

Stopping Human Error

To see the next scheduled public Stopping Human Error Course, CLICK HERE.

Why should you attend the course? Ask these questions:

  • Do you want to achieve excellent human performance?
  • Would you like to understand the methods you can apply to effectively stop major accidents and incidents by “stopping” human error?
  • Do you need to understand which human performance improvement techniques are most effective and which ones are counterproductive (yes, some techniques don’t work and may even cause people to hide errors)?

If any of the above questions received a “Yes!” answer, you should attend this course.

The instructors will help you understand:

  • The causes of human error
  • Human factors design best practices
  • Methods to find error-likely situations
  • CHAP (Critical Human Action Profile)
  • Human Performance Improvement Technology (Human Performance Tools)
  • Designing your human performance improvement program

You will leave this course with a clear understanding of methods to improve human performance and a plan to apply those methods at your company to achieve great gains in safety, quality, or operational and maintenance performance (all of which depend on human performance).

Participants will receive the book, Stopping Human Error, a $99.95 value, as part of the course materials. They will also receive a certificate of completion and a 90-day subscription to TapRooT® VI Software, our dynamic cloud-based software that computerizes the Equifactor® and TapRooT® Root Cause Analysis Techniques.

See the schedule for upcoming public Stopping Human Error courses HERE. Or contact us to schedule a course at your site.


9 Replies to “Which Human Performance Tools Work and Which Ones Don’t?”

  • Mark Paradies says:

    I’m surprised that someone hasn’t left a comment defending the technique “Questioning Attitude.”

    • john says:

      I can only agree with what you’ve written Mark. People can and will make mistakes even with a Questioning Attitude!

  • Scott Luscombe says:

    Questioning attitude is more accurately “Avoid making assumptions.”
    The health care example shows a typical human-machine interface (HMI) situation where the human believes they are correct and the machine is incorrect.
    As much as we believe computers and machines are better than humans, we still don’t trust them and often assume they are faulty. We don’t make the effort to accurately interpret the error and often make misleading assumptions to fill in the gap.

    • Mark Paradies says:


      Thanks for your comment.

      I think the person in this example was, at least partially, correct.

      The machine would not work (transfer the treatment numbers) so the person input them manually. However, the machine changed the numbers when the person hit enter.

      The person was mistaken in that they didn’t have another human check the numbers once they were entered.

  • Bill Brown says:

    Perhaps no one has come to the defense of ‘Questioning Attitude’ because we all know that ‘having a questioning attitude’ is NOT per se a ‘tool.’ (I recall someone – either Rob Fisher or Tony Muschara – saying exactly that to an ‘HPI’ class a long time ago.)
    Yes, ‘questioning attitude’ is on the (INPO) list. But any behaviorist worth his salt (I got all my education at one of the last bastions of behaviorism) can spot the difference between an attitude versus things one actually does – i.e., things that could reasonably be referred to as ‘tools.’ ‘Questioning attitude’ differs from most of the other ‘tools’ in that the title does not plainly refer to what ought to be done in order to reduce error (compare ‘procedure use and adherence’ or ‘stop when unsure’). Fortunately, some answers to the question “What would I actually do to reduce error?” can be found in the Background and – especially – the Accepted Practices sections of the INPO discussion of ‘Questioning Attitude.’ Which is what one would expect.
    I agree with you about the pointlessness of using terms like ‘questioning attitude’ or ‘complacency’ uncritically. Kahneman offers a good discussion of these manifestations of the hindsight bias in his book – in a chapter titled ‘The Illusion of Understanding.’ Enough said. But in defense of the NRC (or at least the hypothetical inspector) – I think they know a bit more about error than the colloquial use of ‘questioning attitude’ in a newsletter would suggest. For example, when I talk about verification tools I refer to a 1995 NRC report that illustrated the importance of countering the effects of expectancy in independent verification of stereotactic coordinates in radiation therapy as follows:
    “An impressive double blind checking routine would consist of one person setting the shot coordinates from the prescriptions, which are left unknown to the checkers. Each of two checkers separately records their inspection of the set coordinates. Then both checks are compared to each other and the prescription. If there is any discrepancy among the records, the coordinates are reset and the checking procedure is repeated.” [NUREG CR-6323, p.42]
    [Full disclosure: More recently, John Wreathall, Laura Militello and I conducted research for the NRC on human error in radiation therapy (NUREG-2170). The original purpose of the project was to help NRC inspectors understand human errors in radiation therapy.]
    By the way, I’M surprised that someone hasn’t left a comment questioning the idea that human error can be “stopped.”

  • Maile Shrewsberry says:

    Civilization advances by extending the number of important operations which we can perform without thinking about them. — Alfred North Whitehead (Introduction to Mathematics)

  • Tim Irving says:

    Questioning Attitude in my experience is more of a free pass for people to challenge each other and the status quo without fear of repercussion because it is an acceptable part of the organization’s “culture”. Listing it as an HU tool is that free pass. Getting people past the hesitancy of asking questions & challenging each other is a huge step-change in many organizations and the starting point for an HU program.

  • Mark Paradies says:

    Above, Bill Brown suggested that a “double-blind checking routine” might reduce human errors. In my experience, it helps, BUT people tend to get complacent and may pencil-whip the exercise, which defeats the whole purpose. Whenever a double-blind checking routine is proposed, it should also include frequent auditing, which almost makes it a check of the double-blind check.
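    The checking routine Bill quoted from NUREG CR-6323 can be sketched in a few lines of code. This is a minimal illustration of the logic only; the function name, the tuple representation of shot coordinates, and the retry limit are my own assumptions, not part of the NRC report.

    ```python
    def double_blind_check(prescription, checker_a, checker_b, max_rounds=3):
        """Accept set coordinates only when two independent checks agree
        with each other AND with the prescription (per NUREG CR-6323)."""
        for _ in range(max_rounds):
            a = checker_a()  # each checker records their inspection separately
            b = checker_b()  # neither checker sees the other's record
            if a == b == prescription:
                return True  # both checks match the prescription: proceed
            # any discrepancy: the coordinates are reset and the check repeats
        return False  # persistent disagreement: stop and escalate

    # Hypothetical usage, with coordinates as (x, y, z) tuples:
    plan = (12.5, 40.0, 7.5)
    print(double_blind_check(plan, lambda: (12.5, 40.0, 7.5),
                             lambda: (12.5, 40.0, 7.5)))  # True
    print(double_blind_check(plan, lambda: (12.5, 40.0, 7.5),
                             lambda: (12.5, 41.0, 7.5)))  # False
    ```

    Note that the routine only flags disagreement; it cannot stop the complacency and pencil-whipping described above, which is why the auditing step still matters.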
