
QP MAILBAG

Innovative Thinking Is Model for All

I want to thank Ray Anderson for his excellent article in the February 2004 issue (“Climbing Mount Sustainability,” p. 32). This type of innovative thinking serves as a model for those of us diligently trying to drive the sustainability agenda in our companies. One day, the current management philosophy of minimum regulatory compliance will become much as the idea of no regulatory compliance is to us now—a dull memory.

I especially liked the fact that Anderson included a suggested reading list to help others think about the importance of sustainability. I plan to read each book.

JAY RICHARD
SNF–Polychemie
New Orleans, LA
jayr@snfhc.com

Interface’s Efforts Are Inspiring

Ray Anderson’s article “Climbing Mount Sustainability” is not only nicely constructed and well written, it is also timely. The sustainability approach and efforts made by Interface are inspiring. If we can persuade other corporations to come to the same understanding and take action as Interface has, there will be hope for future generations of mankind.

Thanks to Anderson and ASQ for an excellent article.

STEVE JOHNSON
Lanier Worldwide
sjohns10@lanier.com

Hunter’s “Statistics Roundtable” Is a Gem

The “Statistics Roundtable: Improving an Unstable Process” article by J. Stuart Hunter (February 2004, p. 68) is a gem. For many years we have been learning how to use statistical methods by reading Hunter’s examples, which always include a key graph, a table of data for our own graphical analysis, some theory, some teasing and appropriate references.

Hunter’s illustration of the simplest evolutionary operation approach to an autocorrelated data problem (a hot topic on Six Sigma bulletin boards) avoids the usual time series modeling and residuals chart approaches and gets right to improving the process by taking action in a safe way and learning how the process responds. As in his other articles, Hunter moves us into the use of more tools in the statistical bag of tricks by simply graphing the data and discussing an appropriate case with a minimum of math heroics. Very sneaky.

MICHAEL CLAYTON
mclayton2000@hotmail.com

Hubri-Doobri-Doobri Sounds About Right

The concept of Six Sigma is simply the shrinking of the process variation to half the design tolerance while allowing the mean to shift as much as 1.5σ from the target. If the process mean can be controlled to within 1.5 standard deviations of the target, a maximum of 3.4 defects per million can be expected.
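For readers who want to check that arithmetic, here is a minimal sketch. It assumes a normally distributed characteristic, specification limits six standard deviations from the target and a worst-case 1.5σ shift of the mean toward one limit; the library call is simply one convenient way to evaluate the normal tail probability.

```python
# Minimal sketch: reproducing the oft-quoted 3.4 defects per million.
# Assumptions (not from the letter): normally distributed characteristic,
# spec limits at +/- 6 sigma from target, worst-case 1.5 sigma mean shift.
from scipy.stats import norm

shift = 1.5   # assumed long-term mean shift, in sigma units
spec = 6.0    # distance from target to either spec limit, in sigma units

# After the shift, the nearer spec limit sits 6 - 1.5 = 4.5 sigma away.
p_near = norm.sf(spec - shift)        # P(Z > 4.5)
p_far = norm.cdf(-(spec + shift))     # P(Z < -7.5), negligible (~1e-14)

dpmo = (p_near + p_far) * 1e6
print(round(dpmo, 1))                 # ~3.4 defects per million
```

Only the nearer specification limit contributes meaningfully; the far-limit term is many orders of magnitude smaller.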

And voilà! A whole new industry has popped up to provide Six Sigma-related services, and organizations have spent millions of dollars on Six Sigma training and projects. So far, reports indicate organizations that have used the Six Sigma methodology have rarely achieved 3.4 defects per million, and the promised savings have seldom materialized.

Define, measure, analyze, improve, control and design for Six Sigma are similar to plan-do-check-act combined with statistical process control. They are commonsense methodologies for continuous process improvement that use well-known tools, such as quality function deployment, Pareto charts, control charts, fishbone diagrams, failure mode and effects analysis and design of experiments.

Organizations will have to use any hubri-doobri-doobri methodology (Rick L. Edgeman and David Bigio, “Six Sigma in Metaphor: Heresy or Holy Writ,” January 2004, p. 25) that improves customer satisfaction and reduces their cost of quality to maintain their competitiveness in the market.

G. GARY BAGDASARIAN
gazar.bagdasarian@ps.ge.com

Ford Should Know Nothing Good Is Free

The “Keeping Current” article “Ford Asks Exec To Solve Healthcare Crisis” (February 2004, p. 18) makes two erroneous statements.

First, the reason Ford and other U.S. automobile manufacturers spend thousands of dollars per vehicle on healthcare has nothing to do with any national healthcare crisis. Rather, it reflects a shortsighted trade-off in labor relations. Years ago, management won labor peace by agreeing to fund healthcare for retired employees, presumably until they reached Medicare age. Now the companies—and their customers—are paying the price.

Second, CEO William Clay Ford Jr.’s claim that private employees (or did he mean employers?) don’t bear this cost in countries with government-funded medical care implies that governments generate healthcare without extracting taxes from their governed. Sorry, Mr. Ford, but you should know nothing good is free.

DONALD J. MIRATE, M.D.
Mirate Eye Center
Valdosta, GA
drmirate@friendsforvision.com

Justifying a Higher Price

Johannes Freiesleben’s article “How Better Quality Affects Pricing” (February 2004, p. 48) clearly shows what quality economics is all about. Until they really understand these concepts, quality professionals should read this article over and over again.

In the early 1960s, I had a case history in Portland, OR, that taught me the quality value concepts. A company team made up of the quality, marketing and engineering departments conferred with end product users in their work area. They asked about the in-field use (out of the box) performance of the product, how long it stayed at a useful performance level without the need for a touch-up and how long it took before the product needed repair or wore out.

These new data produced information beyond that gathered in discussions with dealers or from performance use testing of the company’s and its competitors’ end products. They allowed the company to substantially raise its prices without losing customers. This would be a great story if it stopped here.

The company now knew the relative value the end user placed on each performance quality characteristic. This added knowledge allowed each characteristic to drive R&D effort on a cost-benefit basis. Once the R&D work increased performance, the company could increase the price again. The improved design quality and outgoing quality level could then be sold to informed users.

WILLIAM (RAY) VANDERZANDEN
ASQ Fellow, retired
Portland, OR
radioray@alveus.com

Article Title Technically, Literally Wrong

I am empathic toward the points Irena Ograjenšek and Poul Thyregod make in their article “Qualitative vs. Quantitative Methods” (January 2004, p. 82). However, I believe the title of the article is technically and literally wrong. My research of statistics books such as Juran’s Quality Control Handbook, fourth edition (McGraw-Hill, 1988), indicates there are no such things as qualitative and quantitative methods. There are qualitative and quantitative data, research, assessment and chemistry, but no qualitative method.

I am open to new ideas and perhaps this is something new I can learn about, but no step-by-step methods were described in the article.

Also, the authors never reference qualitative methods in the article. Looking at the title, I was hoping to learn about qualitative methods, too. I do not disagree with the literary use of “quantitative methods” to collectively describe methods such as statistical methods. However, I think the title was misleading, nonsensical and should not have been used.

J.P. RUSSELL
QualityWBT Center for Education
Gulf Breeze, FL
jpr@qualitywbt.com

Correction

  • Please do not send birthday wishes for Joseph M. Juran to jdefeo@juran.com (“Keeping Current,” March 2004, p. 16). Instead, go to http://www.juran.com for more information.
