Under Scrutiny

Abstract: Quality practitioners have for many years used cause and effect analysis to find the root causes of problems, but these techniques often fail to stop problems caused by human error. Problem solvers may want to consider a new approach to clarify misconceptions in applying commonly taught root cause analysis techniques. A new definition of root cause based not on cause and effect, but on an absence of best practices, results in a structured process that allows problem solvers to find the real cause of the human errors and equipment failures that result in most quality …




Excellent information.
--Michael Hoffman, 11-07-2014


Gary:

It seems pretty obvious my article didn't convince you to reconsider your beliefs about 5-Whys and the use of cause-and-effect ... but it did get under your skin! But before I go any further, I would like to apologize. You seem to think your article is the source (or, as you put it, the "primary" source) of all the misconceptions I wrote about. It is NOT. I did not mean to imply that. I just used your article as an example of two particular misconceptions.

However, as you have made several points about my article, let me address each point separately ...

1. "Only tool" misconception. I started out that section as follows:

"Many quality professionals believe cause and effect (the infinite chain-of-causation philosophical model1) is the only (or perhaps preferred) method to find root causes, maybe because cause and effect is taught in most Six Sigma courses."

Please note, this was not listed as a misconception in your article or by you. Many people involved in quality think cause-and-effect is the only way to find root causes. I've had them tell me that face-to-face. Therefore, I think my point stands as a common misconception, even though it is not a misconception by you, but by others.

2. Jefferson Monument example. I reread the example, the section of the "Flip the Switch" paper on "sphere of influence," and how I used your example to demonstrate the problem of cause-and-effect analysis not leading beyond the investigator's knowledge. I continue to reach the same conclusion I published in the article. If I'm missing something here, I think the only way to solve it is a long discussion over a beer.

3. Single Cause Misconception. I don't really understand your argument here. Didn't both of your examples have a single chain of causation? That's the point I made. I used your examples to demonstrate a common misconception that there is only a single chain of causation. I didn't imply that you think there is only a single cause or single chain of causation ... just that your examples only had a single chain of causation. Therefore, I still think this demonstrates my point. Did I miss something?

Finally, I'm not sure the debate you propose is the best way for people to learn about the problems applying 5-Whys and cause-and-effect. I would suggest a better way to learn might be to try techniques other than 5-Whys and Fishbone Diagrams (the most frequently written about Six Sigma and quality root cause analysis tools, and tools you wrote two articles about before I published my article) and see if the investigator discovers problems they didn't find using cause-and-effect. If they need ideas about the techniques to try, I would recommend they read my blog (http://www.taproot.com/blog) for suggestions.

Perhaps we will meet some day at an industry conference, at the TapRooT Summit or if I get to Minneapolis or you get to Knoxville. I promise to buy the first round of beers and to listen to your arguments about the goodness of why 5-Whys works before I say anything about the research I have done.

Best Regards,
Mark
--Mark Paradies, 08-26-2010


This article cited some statements from my article, Flip the Switch, and used them as the primary source of "misconceptions" to make some points. Although there are some good points in this article, unfortunately none of the "misconceptions" about my article is true. I originally wasn't going to correct them, since I believed people would be able to sort them out if they read my article series. Yet a recent inquiry I received showed that people may take statements from this article as truth without verifying them against the cited articles. It seems worthwhile, then, to put out some corrections, as encouraged by the QP editor.

Here is a rundown of the statements (all false) about my work in this article:

"Only tool" misconception: Many quality professionals believe cause and effect (the infinite chain-of-causation philosophical model1) is the only (or perhaps preferred) method to find root causes:
- False. See the first RCA technique introduced in my article 2 below, the Is/Is Not technique, which "is a comparative method. It has not been formally classified as a cause-effect analysis tool."

Can't go beyond current knowledge: Another misconception demonstrated in a QP article, "Flip the Switch," which included an example of the Jefferson Monument dirtied by birds.
- False. No specifications were given either way. (See the details of my discussion of this case.) The subject is addressed through the "sphere of influence" discussion in my series.

Single-cause misconception: Another common misconception is the error of identifying a single cause. This was demonstrated in the two cause-and-effect examples presented in "Flip the Switch."
- False. It's a self-guided "tunnel vision" (quoting this article), a laughable non-issue. See my article 2 for details.

In conclusion, unfortunately, all of the assumed misconceptions about my work are incorrect and false. My overall conclusion is that this article didn't intend to contradict my views; the author merely wants to use my article to draw attention. The core of my view concerns the true meaning of root cause: there is no "real" root cause; all root causes are subjectively chosen to serve that role. The author didn't even touch this sometimes controversial subject. Beyond this core, all other points are just talking points, with no real debate there. The one new thing in this article is the human factor aspect.

A takeaway from this self-inflated "Scrutiny" is that you don't have to elevate yourself by denying others. A healthy debate needs to be constructive, amending or enhancing rather than trashing each other, and an author needs to read a bit more, including the references, before making statements or assumptions. (So do readers.) Due to space limitations, an article can't address everything. For things not addressed, it's much better to simply state your own opinion instead of making false assumptions about others.

I strongly encourage readers to read through my publication series on RCA before forming any opinions, and I welcome any constructive debate and opinions. The more attention people pay to this subject, the clearer any misconceptions will become.

Gary Jing's publication series on RCA:
1. Flip the Switch: http://www.asq.org/quality-progress/2008/10/quality-tools/flip-the-switch.html
2. Digging for the Root Cause: http://www.asq.org/pub/sixsigma/past/volume7-issue3/index.html
3. Webcast: http://www.sixsigmaiq.com/video.cfm?id=917

Gary Jing
http://www.linkedin.com/in/ggaryjing

--Gary Jing, 08-24-2010


Excellent article. The remarks on confirmation bias are spot on, and it's difficult to get people out of that box.
--P. Collinsworth, 08-21-2010


It's a misconception to believe that 5 Whys traces only one root cause issue. For example, asking, "Why did this machine stop working?" will ultimately lead to the series of events that caused the machine failure; had any one event been removed, the resulting issue would not have occurred. 5 Whys analysis can then be used on each event in the string to drill down to the breakdown within the system that allowed each event to occur.
--Steve, 08-19-2010


We use a modified 5-why tool that we call a 5x3, where we have separate paths for specific root cause, root cause for lack of detection and systematic root cause to address some of the weaknesses described in the article. We then seek corrective actions for each path.

We also have an issue with exploring multiple causes within a path but have been taught for practicality's sake to only follow the most obvious cause. However, we do full 30, 60 and 90-day reviews to determine if a different path needs to be explored. This fits with our lean program, which is teaching us to plan less (in the engineering offices) and try and check more (at the Gemba).
--Ryan Carlton, 08-18-2010


Very good article. It's about time we changed the way we approach RCAs. As a DNV certified lead auditor, I was cautioned to use the 80/20 rule: 80% of the time it's the system, not the people, and even if the evidence initially points to human error, ask what part of the system allowed the error.

I would like to add one question to the 5 whys list: What's changed? I've seen plenty of ghost chasing by not asking that question first. If the sum of all of your inputs equals your output, what's changed? Answering this first has kept a lot of false starts from happening.
--Adam Sienko, 08-18-2010


When I teach root cause analysis, in the fishbone diagram, I say all possible causes must be listed by category (including personnel) and that a team puts together the diagram (operator, design and process engineer, etc). Then narrow it down to the most likely possibilities. This is not new stuff.
--Tom Kuczek, 08-18-2010


The article is very good. After considering the five whys and the human factor, I believe we are missing one point in our consideration: the path from the initial product to the final product requirements. This article provides very good details we must consider, such as first establishing what the requirement is and what we have as the raw product. Good article.
--HARDIK DAVE, 08-18-2010


Very intriguing and informative article. I forwarded it to our VP of quality.
--Thomas O'Brien, 08-18-2010


We are asked to use the practice of "ask why five times" whenever we have quality issues with processes and procedures. The article provides additional quality "tools" to add to the mix. Additional tools mean additional continual professional training for quality professionals. Quality continues to be ongoing, changing and evolving.
--Al Foster, 08-18-2010


Interesting article, especially about confirmation bias, but as already stated, the author fails to heed Dr. Deming's point that 90%+ of problems are caused by the system, not the people who do the work. If you keep that in mind as you read, the article is helpful. Deming said you need profound knowledge to improve (http://www.youtube.com/watch?v=STTwZGNvLmM), which consists of 1) understand variation, 2) understand psychology (which this article adds to), 3) understand theory of knowledge (again, this article helps), and 4) understand systems concepts.
--Ken, 08-18-2010


Thought I would "comment" on some of the comments!

1. If you want to know more about the method I mentioned, go to our blog site (www.taproot.com/wordpress) and see the "How does TapRooT Work?" article under the popular posts.

2. I agree with Deming about system causes. I just think these cause human performance issues.

3. I don't think the "interaction" view of multiple causes is what I have in mind. I'm thinking more of multiple opportunities to stop the progression of events over time. Sometimes this means interactions, but other times it means there were setup factors, mistakes, and failures to catch mistakes that led up to the final interaction.

Thanks everyone for reading what I wrote!

Mark Paradies
President
System Improvements
--Mark Paradies, 04-21-2010


Interesting article. But I disagree with the statement that human performance issues (human errors) cause most quality problems. I recall Dr. Deming estimating that 85% of the problems are in the system. If we fix the system, the employees will not experience as many problems. If employees believe they are responsible for the majority of the problems, they will not be as willing to identify problems.
--Philip Heinle, 04-18-2010


The author makes some very good points regarding unstructured root cause analysis that ring true with personal experience. The concept of confirmation bias is very helpful in explaining a number of biased root cause investigations I have recently observed.
--Mike Marchlik, 04-15-2010


The article seems very well suited to my work as a consultant in the education sector, and it comes at a good time.
--Roberto, 04-14-2010


This is really good food for thought, as I often find myself struggling a bit to connect causes and effects. I even asked two of my project teams to read the article to get them thinking, too. I would like to see more explanations or examples of the New Method #3, built-in expert systems to find root causes.
--Jon Beals, 04-13-2010


A good article for someone who knows nothing about the subject like me.
--Frank Dorsett, 04-13-2010


Great article on a subject that is relevant to many of us who try to utilize root cause analysis in our operations, as well as to answer the complaints of our customers.
--Marlowe Wicks, 04-12-2010


The section called "Single-cause misconception" would have been much clearer if the concept of interaction had been defined and explained (e.g., the mine explodes when methane and oxygen are present AND a spark also occurs). Dorian Shainin's methods of variation research using Multivari Analysis and Factorial Experiments can discover interactions that the simplistic methods described in the article cannot.
--Tom Woods, 04-10-2010

