"Guru Guide" (November 2010) is an excellent article about the trendsetting leaders of quality. I am grateful for the opportunity to learn about their commitment and journey to make a difference.
It is very inspiring to read their philosophic words and approaches that made things happen in the world of quality. I want to thank the QP staff for putting together this excellent article, especially for members like me who are new to the quality world.
Setting an example
The lesson learned from these and many other unsung gurus of quality is that, as Henry Wadsworth Longfellow said, "Lives of great men all remind us we can make our lives sublime and, departing, leave behind us footprints on the sands of time." So let us do our bit and leave at least a faint memory of these men’s efforts in the minds of our peers, staff and students.
The words of wisdom from Natalia Scriabina, Romayne Smith Fullerton and Burjor Mehta ("Word Power," October 2010) struck a chord with me.
On my first trip to Japan in 1981 to learn its secrets of total quality, I became acquainted with the Japanese word baka-yoke. The usual translation is "foolproof," as in foolproof your machines and processes. This is a great idea that is now in wide use in quality and lean as mistake-proofing or error-proofing.
But I don't use it, because I soon learned that in Japan, owing to baka-yoke’s negative connotation, it had generally been replaced by poka-yoke, best translated as the more neutral term "fail-safing." On a personal note, my own efforts to get people to switch from "mistake-proofing" to "fail-safing" have been a failure.
A second example: On an early trip to Kodak, I learned that a wise consultant had convinced Kodak managers that the use of "error" or "mistake" was likely to raise defensive walls. In almost any case, the consultant said, the neutral word "mishap" fits well.
Ever since, my own writings have been heavy on mishap and devoid of error or mistake—if you can believe that.
The importance of reports
"Brace for Impact" (October 2010) is a very interesting article that truly points out the need for well-written nonconformity reports in all types of audits.
The article dovetails well with the three-part nonconformity write-up required by most registrars, in which the "statement of nonconformity" portion should communicate why the finding should be addressed. Well-written nonconformities should answer the question, "So what?" for top management. Thanks for the article.
Redefining root cause
To be of value, root cause analysis should focus primarily on identifying solutions to system design flaws, thereby preventing accidents and failures. It shouldn’t focus on identifying causes, root or otherwise.
Root causes are selected by the analyst or management. Hopefully, the selected causes are based on data that have been critically analyzed. All too often, however, the causes are based on biases and a lack of critical thinking. In addition, human performance and error factors are frequently overlooked or treated superficially in causal analysis.
Questions such as, "How could he not have seen that was going to happen?" or "How could they have been so irresponsible and unprofessional?" are retrospective. The only reason they can be answered is that the outcome is already known. They are the result of hindsight bias, which includes a tendency to oversimplify the complexity and ignore the uncertainties of the circumstances people faced when the problem occurred.
When failure occurs, reactions tend to focus on the people proximal to the accident—those closest in time and space to the mishap. This is also called focusing on the sharp end. By doing so, you miss underlying contributors to the event.
More than likely, you’ll be able to describe in detail what could have been done to prevent the mishap. But this is a counterfactual representation of the past. Instead of focusing on what actually happened and seeking to understand why it made sense for people to do what they did, all you’re doing is proving that another course was open to them.
All of these common approaches to causal analysis allow you to judge the actions of those involved, to point out what they should have done and to condemn them for what they failed to do to prevent the mishap.
When those actions don’t yield the desired result (and they won’t—remember, we started with a bad outcome), it’s only a small leap to judge not only the actions, but also the very character of those involved. This is an example of something called the fundamental attribution error.
Research shows that humans operate as pattern matchers who, when necessary, become pattern completers and assumption makers. This is necessary for efficient functioning, but it can get you into trouble in event investigation and analysis.
Human beings are so skilled at filling in the gaps in their understanding with plausible assumptions that they are not even aware they’re doing it. As a result, people tend to be overconfident in the validity of their understanding. Many assumptions are incorrect, and sometimes that fact will be material, which means you will come to the wrong conclusions.
A related tendency is the illusion of common sense—the sense that everyone (or at least everyone who is reasonable, intelligent and informed) shares your understanding and perspective. When you approach event analysis with the belief that what is common sense to you is the right way to see the world, you may fail to understand why people’s actions made sense to them at the time. You will likely be judgmental in your approach and will not uncover important and actionable causes of the event.
When it comes to causal analysis, organizations tend to be satisfied with a simplified, linear and proximal set of causes, even though studies of accidents in complex socio-technical systems show that causes must be sought more broadly and deeply. You expect to quickly get to the bottom of things, get the forms filled out, fix the problem identified and get back to work.
Invariably, this leads to solutions aimed at the people involved on the front line—retrain them, supervise them better or just fire them—rather than at the conditions that led to the event.
Los Alamos, NM