How To Lie With Statistics
One of my favorite “business” books is an old one: How to Lie With Statistics by Darrell Huff. Written in 1954 (before I was born), it is just as applicable today (maybe more so) as it was the day it was written.
Whenever I think about the book, I also think about my first job after I left the Navy. I was in a performance improvement group and I went to a presentation about incident statistics by the manager in charge of their incident database. This was an annual presentation to senior executives and the theme of his talk was that human error was the biggest cause of incidents and we needed to improve human performance. That’s what his statistics said.
Since this was my focus, I thought … “Great! We will be solving one of the biggest problems around!”
We dived into creating a human performance program but, by the next year, we hadn’t accomplished much. We had made some starts, met major resistance, and had plans for more efforts, but we really hadn’t made much impact. I knew that his presentation would show that problems persisted.
Instead, I was surprised. He didn’t mention human performance or human error. He had a new theme. The main problem for improvement was equipment reliability. That’s what the statistics said.
I thought … “How can this be?”
We had not made significant progress solving the human error causes. How could the statistics change without changes to the system?
So, I got access to the data and started comparing it with the previous years’ data.
What did I find?
- BAD DATA. The measurement system was corrupt. There wasn’t a good way to ensure that incidents were being investigated thoroughly, and the results could be manipulated. Thus, the person interpreting the data could make it say whatever he or she wanted it to say.
- BAD GRAPHS. There wasn’t a standard set of graphs that management looked at year after year. A new set of graphs was presented at each annual review. No one wanted to admit that progress was slow (or non-existent) … so, they created a new crisis (area for improvement) each year and just declared victory over the old problem because it wasn’t the biggest problem any more (even though it really had NOT disappeared).
- BAD QUESTIONS. Management didn’t ask the right questions. For example, management should have asked …
  - “What happened to the human error problems you told us about last year?”
  - “Show us the graphs that demonstrate the improvement.”
  - “What were the programs that caused this improvement?”
They should have known about the lack of progress over the year and been familiar with the incidents that kept happening over and over again.
But they weren’t.
They just waited for the next year’s presentation and were glad that so much progress was being made.
Leaders needed a good measurement system and graphs that helped them improve performance. That’s why we created this course.
Mark Paradies (President of System Improvements) and Chris Vallee (a Six Sigma Black Belt) will teach you how to measure performance and how to use trends to find problems and prove that performance has improved. When combined with the TapRooT® System for root cause analysis, this becomes a potent performance improvement program.
This course is being held March 11-12, prior to the 2019 Global TapRooT® Summit in Houston, TX.
Don’t miss out. Register for both the course and the Summit TODAY.