Editor's note: For letters about the Charles Quesenberry article "Statistical Gymnastics" (August 1998, p. 77), turn to the special Statistics Roundtable column.

Career Corner Should Have Been on Cover

I was extremely interested in Greg Hutchins' "Quality at Work" article (Career Corner, September 1998, p. 95). While what he says makes sense, quite frankly I'm astonished that it needed to be written.

My "Quality 101" training/experience has always told me that the purpose of a quality focus is to improve an organization's performance, whether to meet customer needs or serve a larger community. If we don't understand the business (and what that business is trying to accomplish) how can we, with a straight face, say that our work is doing anything worthwhile?

I think the larger issue that Hutchins alludes to goes far beyond individual career management. Specifically, the quality profession needs to take a hard look at itself and ask, "Do we add value?"

We all get a kick out of Dilbert, but let's be honest--doesn't a little part of us say, "Gee, the guy's right...this is stupid!" Hutchins' last paragraph ("More strategic, less tool oriented....") applies more, I believe, to the profession as a whole than to individual professionals.

To put things into a little perspective, I am (I like to think) a businessperson who focuses on quality. I've been through several reorganizations, restructurings, and rightsizings (or downsizings), not always surviving, either. I agree with Hutchins but believe his article should have been in bold type on the cover of Quality Progress. It's time to wake up and smell the coffee.

Sam Ireland
Director of Performance Excellence-Corporate Strategy
American Red Cross


Eliminate the Directories

I have been receiving Quality Progress for a little less than a year now, and I have grown increasingly disappointed with certain content of the magazine.

It appears that sometimes you must not have enough actual content to put forth an issue, and you seem to be increasingly relying on these listings, directories, and surveys to fill the pages of your magazine.

I was an electronic technician for nine years and have recently moved to a new company, becoming the management representative. I had come to enjoy and anticipate my monthly edition of your magazine. I thoroughly enjoyed the One Good Idea column and always turned to that page first. I naturally have been disappointed that this feature seems to have vanished from your publication.

I originally was overwhelmed with the amount of advertising for what always seemed to be the same product in a different package. I was able to get through this to the articles that I relied on to keep me informed about what was going on in the world of quality. Now all I seem to get are lists, directories, and surveys.

Let's get back to the issue of quality and get away from the lists, directories, and surveys. I need you guys to talk about quality, not about salaries. I mean, who wants to be more depressed about what they are not getting paid, anyway? I am counting on you to come through for me in December.

Jesse Rogers
Everett Charles Technologies
Orlando, FL

Editor's Response

Our December issue, with eight solid feature articles, the regular departments (including a return of One Good Idea), and no directories, should have pleased the writer of the preceding letter.

One of the decisions made during Quality Progress' editorial and graphic redesign project, which has been going on for several months and will be unveiled in March, has been to eliminate the software directory (after 15 years), services directory (11 years), and education directory (eight years). Input from our readers had told us they wanted more articles and fewer of these listings.

The salary survey, however, remains our most popular issue of the year and will certainly continue to appear.

Most of the information in the software, services, and education directories now appears on ASQ's Virtual Quality network on the Internet. Readers can access the site through http://www.asq.org.

As for One Good Idea, the months in which it doesn't appear are months in which we have not received a submission that has been approved by our review board. We are aware that it is extremely popular, and urge our readers to submit articles for the column.


FMEA Article More of a "How Not to Do It"

You recently ran an interesting article entitled "How to Use FMEA to Reduce the Size of Your Quality Toolbox" (November 1998, p. 97).

The examples show the extended use of failure mode effects analysis (FMEA) for evaluating possible environmental hazards. The author, Willy Vandenbrande, presents a number of examples that call into question the fundamental use of the FMEA tool itself.

Figure 3 is typical of the loose logic and language that often nullify the value to be gained from the use of FMEAs. Normally, a close, logical connection between the failure mode, effect, and cause is required for the entry to make sense.

The effect is stated as "water pollution by cooling fluid," while the mode is a "broken or loose hose." A long chain of important unstated events exists between this mode and the resultant effect.

In the typical machine tool environment in which the example is set, the machine system collects the cooling water and continuously recycles it in normal use. The missing steps in the chain of events are:

  1. Water escapes the normal catch basin and recirculation system.
  2. The leak remains unknown or undetected.
  3. The water travels away from the machine location, bypassing normal protections and drains.
  4. The water reaches the surrounding groundwater in large enough quantity to register as the environmental effect of water pollution.

This long chain of critical events is ignored, thus negating the FMEA entry. If an occurrence number were applied to this necessary chain of events, it would be a 1 or 2, instead of the number the article suggests. The loose logic and critical unstated events demonstrate that the FMEA entry has little real-world value and is unlikely to lead to appropriate actions to prevent the undesired effect of water pollution.
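
To put a rough number on that point, consider the following sketch, in which each link in the chain is given an illustrative probability (none of these figures or rating thresholds come from the article; they are invented to show the arithmetic). When independent events must all occur, their probabilities multiply, and the occurrence rating for the combined chain falls sharply:

```python
# Minimal, illustrative sketch of the occurrence argument. All probabilities
# and rating thresholds are invented; they are not taken from the article.

def occurrence_rating(probability: float) -> int:
    """Map a probability to a 1-10 occurrence score (hypothetical scale)."""
    thresholds = [
        (1e-1, 10), (5e-2, 9), (2e-2, 8), (1e-2, 7), (5e-3, 6),
        (2e-3, 5), (1e-3, 4), (5e-4, 3), (1e-4, 2),
    ]
    for limit, score in thresholds:
        if probability >= limit:
            return score
    return 1

# Hypothetical per-step probabilities for the chain described above.
chain = {
    "hose breaks or loosens": 5e-2,
    "leak escapes the catch basin and recirculation system": 1e-1,
    "leak remains undetected": 2e-1,
    "fluid reaches the groundwater in quantity": 5e-2,
}

combined = 1.0
for p in chain.values():
    combined *= p

print(f"combined probability of the full chain: {combined:.1e}")
print(f"occurrence rating, hose failure alone:  {occurrence_rating(5e-2)}")
print(f"occurrence rating, full chain:          {occurrence_rating(combined)}")
```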

A second example, illustrated by Figure 5, exhibits the same major logic leaps where the potential generation of molybdenum dust is tied directly to soil contamination without a reasonable, necessary, and direct chain of events.

Another noticeable logic error exists in the severity table shown in Figure 2. Each entry really has a two-part description, one for an immediate company impact and one for potential indirect long-term environmental damage. This table is seriously flawed as presented. It needs further clarification and rework to be of use to an FMEA team.

In the third example, the generation of SO2 gas leads to an environmental impact of bad smell. A "bad smell" is not an environmental impact in the same sense as the soil contamination listed in an earlier entry. This entry should be restated.

The last area of concern is the introduction of the term "contribution," which attempts to place a probability on the long, convoluted, and often unstated chain of events noted in the first two comments. This extra factor would be unnecessary if clear and complete statements of the failure modes, effects, and causes had been made in the first place.

The upshot of this article is a set of examples of the FMEA problems one can easily create by not doing a good job with the tool. A few simple rules for generating FMEA entries help prevent this logic confusion.

The first is to say what you mean as precisely as possible and not use vague words and phrases. The second is to watch the logic and avoid long or convoluted connections between cause, mode, and effect; these need to form a tight, closely linked chain of events.

You cannot skip or omit important and relevant events in the logic chain. The author appears to violate these two simple rules in almost all of his examples. Creating new, complex contribution factors, even if based on a standard such as ISO 14001, does not fill the logic holes. The article is more of a "how not to do simple FMEAs" for anyone who desires a clear, concise, and objective outcome.

James McLinn, reliability consultant
Hanover, MN

Author's Response

The chain of events James McLinn describes for the first example is valid for a leak in a machine in certain surroundings. It is hypothetical, however, and not necessarily valid for the risk and effect of this specific damaged hose in this specific machine at this specific location.

As for scoring occurrence, it is my practice to score the occurrence of each cause that can create the failure mode separately. Scoring the chain of events instead would lead to the same occurrence score regardless of the cause of the hose damage.

Using a two-part description for severity is very common; it is not a logical error but an increase in efficiency. As an example, I refer to the scoring guideline for severity in process failure mode effects analysis (FMEA) shown on page 35 of the QS-9000 guideline. The important thing is to have people on the team who can judge the severity of the effect for each part of the description. The alternative is to score severity twice and to calculate two risk priority numbers or environmental priority numbers, with little added value.
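
A rough sketch of that trade-off, with invented scores and the usual risk-priority-number product (severity x occurrence x detection), might look like this:

```python
# Minimal sketch, with made-up scores, of the trade-off described above:
# score severity once against a two-part (company / environmental) description,
# or score it twice and carry two priority numbers (RPN = S x O x D).

occurrence, detection = 4, 6                     # hypothetical scores
severity_company, severity_environment = 5, 7    # two parts of one description

# Option A: one severity score, taken as the worse of the two impacts.
severity = max(severity_company, severity_environment)
print(f"single RPN: {severity * occurrence * detection}")             # 168

# Option B: score severity twice and calculate two priority numbers.
print(f"company RPN: {severity_company * occurrence * detection}, "   # 120
      f"environmental RPN: {severity_environment * occurrence * detection}")  # 168
```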

McLinn is obviously correct that the introduction of the contribution factor in Figure 5, wrongly labeled as controls, adds complexity. It was not included in the original setup but was added after tests with the environmental practitioners. It relates closely to terminology they know and use and that appears in local environmental regulations. In addition, they found the analysis easier to use with it, since it allowed them to quantify impacts and helped set priorities for action.

Finally, it is not my intention to present the definitive way of using FMEA to judge environmental impacts, but simply one way in which it can be done. The method presented works, but readers are free to adapt it creatively to suit their specific needs.

Willy Vandenbrande


TQM Without Statistics Like a Rowboat Without Oars

It was fascinating to follow the debate on the merits of statistics vs. management that filled your correspondence column for months on end. On this side of the Atlantic, too, the management hand in the quality game tends to be overplayed at the expense of statistics.

In Great Britain, the Institute of Management publishes a total quality management (TQM) checklist, which mentions a range of quality techniques. Only one, histograms, is statistical. All the other statistical techniques that are important in quality are ignored: sampling plans, control charts, analysis of variance, correlation and regression, design of experiments, and synthesis of variance.
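
For readers who have not met these tools, a minimal sketch of just one of them, with invented measurements and the standard control chart factors for subgroups of five, looks like this:

```python
# Minimal sketch of one of the tools named above: Shewhart X-bar and R
# control chart limits for subgroups of five. The measurements are invented;
# A2 = 0.577, D3 = 0, and D4 = 2.114 are the standard factors for n = 5.

subgroups = [
    [10.2, 10.1, 9.9, 10.0, 10.3],
    [9.8, 10.0, 10.1, 10.2, 9.9],
    [10.1, 10.3, 10.0, 9.8, 10.0],
]

A2, D3, D4 = 0.577, 0.0, 2.114

xbars = [sum(g) / len(g) for g in subgroups]       # subgroup averages
ranges = [max(g) - min(g) for g in subgroups]      # subgroup ranges
xbarbar = sum(xbars) / len(xbars)                  # grand average
rbar = sum(ranges) / len(ranges)                   # average range

print(f"X-bar chart: centre {xbarbar:.3f}, "
      f"UCL {xbarbar + A2 * rbar:.3f}, LCL {xbarbar - A2 * rbar:.3f}")
print(f"R chart:     centre {rbar:.3f}, "
      f"UCL {D4 * rbar:.3f}, LCL {D3 * rbar:.3f}")
```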

We must get back to first principles. Variability is at the heart of every quality problem. It has to be measured, analyzed, and controlled. The only available calculus of variability is that provided by statistical method based on probability theory.

Papering the walls with statistical control charts does not, of itself, solve any problems. Management involvement is essential. Managers who do not understand statistics are not competent in quality. Good intentions alone are not sufficient.

The TQM problem is aggravated by a widespread lack of statistical competence among present-day engineers and managers, because statistics was absent from the curriculum in their formative academic years.

We must raise a new generation of young engineers and operations managers who work together instead of conducting a civil war in the field of quality.

S.J. Morrison
East Riding of Yorkshire, England


QP Survey Sounds Fishy to Reader

The December Quality Progress article "Are Your Surveys Only Suitable for Wrapping Fish?" (Ken Miller, p. 47) drew a huge grin from this reader when he reached the QP survey question asked at the conclusion of the article (and, indeed, at the end of all QP articles): "What did you think about this article?" I have to conclude that the QP survey has the unmistakable odor of aged cod!

Michael J. Paul
Oshkosh, WI

Editor's Response

Reader Michael J. Paul will be happy to know that the survey box at the end of each Quality Progress article will be eliminated when we unveil our redesigned magazine in March. We will, however, continue to survey readers through the mail using a questionnaire that includes open-ended questions.


Letters to the editor (no more than 500 words) should be sent to Editor, ASQ/Quality Progress, 611 E. Wisconsin Ave., P.O. Box 3005, Milwaukee, WI 53201-3005, fax (414) 272-1734, e-mail sdaniels@asq.org. Please include address and daytime phone number. Due to space restrictions, Quality Progress is not able to publish all letters. Quality Progress reserves the right to edit letters for space and clarity. QP



