Where Did Those Numbers Come From?
I recently received my first issue of Quality Progress and am enjoying it. However, I got a little confused by a section on page 37 of the article "The Tip of the Iceberg" (Joseph A. DeFeo, May 2001, p. 29). The author states, "When ±3 sigma of the process that produces a part is within specification, there will be 66,807 defects per million parts produced. If each defect costs $1,000 to correct, the total cost of poor quality is $66,807,000. When an organization improves the process within ±4 sigma, there will be only 6,210 defects per million at a cost of $6,210,000. At ±5 sigma the cost of defects declines to $233,000 per million."
The article did not state what type of distribution the data fall under or what mathematical rules apply, and I could not arrive at the same numbers. Statistical Quality Control gives the percentage of values that will fall outside the limits for normal distributions, under Camp-Meidell conditions and when nothing is known about the distribution (Tchebycheff's inequality).1 For example, for ±3 sigma the values are approximately 0.27%, ≤4.9% and ≤11.1%. For ±4 sigma the values are approximately 0.006%, ≤2.8% and ≤6.3%.
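As a sketch (not part of the original letter), the three figures quoted above can be reproduced directly: the exact two-tailed normal probability, the Camp-Meidell bound 1/(2.25k²) for unimodal symmetric distributions, and Tchebycheff's bound 1/k² for arbitrary distributions. The function names here are illustrative, not from any cited source.

```python
from math import erf, sqrt

def normal_outside(k):
    """Exact fraction outside +/-k sigma for a normal distribution."""
    return 2 * (1 - 0.5 * (1 + erf(k / sqrt(2))))

def camp_meidell_bound(k):
    """Upper bound for unimodal, symmetric distributions: 1/(2.25 k^2)."""
    return 1 / (2.25 * k**2)

def tchebycheff_bound(k):
    """Upper bound for any distribution (Tchebycheff's inequality): 1/k^2."""
    return 1 / k**2

for k in (3, 4):
    print(f"k={k}: normal {normal_outside(k):.4%}, "
          f"Camp-Meidell <= {camp_meidell_bound(k):.1%}, "
          f"Tchebycheff <= {tchebycheff_bound(k):.1%}")
```

Running this reproduces the letter's values: roughly 0.27%, 4.9% and 11.1% at k = 3, and 0.006%, 2.8% and 6.3% at k = 4.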
Am I missing something simple?
1. Eugene Grant and Richard Leavenworth, Statistical Quality Control, seventh edition (New York: McGraw-Hill, 1996).
JOHN D. MOODY
Newport News, VA
Author's Response: Thank you for your response to my recent article "The Tip of the Iceberg." Yes, as you suggest, you are missing something.
No mathematical rules apply to account for the differences between your correct numbers and the numbers used in my article. It has been a convention in the Six Sigma model as defined by Mikel Harry to postulate a long-term shift in a process mean of 1.5 sigma.1
The Six Sigma model assumes shifts and drifts in the mean of a distribution will occur for a number of reasons. This can cause nonrandom shifts in the distribution. Harry also states it is almost impossible to quantitatively predict when the changes in the distribution will occur.2 A solution proposed by A. Bender and now supported by Harry allows for nonrandom shifts and drifts by subtracting 1.5 sigma from the current short-term value.3
The 1.5 sigma shift is used as a theoretical compensatory offset in the mean to generally account for dynamic long-term nonrandom variations in process centering. It represents the average amount of change in a typical process over many cycles of that process.
To reach my numbers (which account for the long-term shift), subtract 1.5 sigma from your current, short-term sigma value, look up the cumulative normal probability for the resulting z value, subtract that probability from 1 and multiply by 1 million.
For example, 3 sigma - 1.5 sigma = 1.5 sigma. A z value of 1.5 corresponds to a cumulative probability of 0.9332. Subtract this from 1 (1 - 0.9332 = 0.0668) and multiply by 1 million. This equals 66,800, or 66,807 when more decimal places are carried.
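The steps above can be sketched in a few lines of code (the function name is illustrative): subtract the 1.5 sigma shift, take the one-sided normal tail probability beyond the resulting z value and scale to defects per million opportunities.

```python
from math import erf, sqrt

def dpmo(sigma_level, shift=1.5):
    """One-sided defects per million after subtracting the long-term shift."""
    z = sigma_level - shift
    phi = 0.5 * (1 + erf(z / sqrt(2)))  # cumulative normal probability at z
    return (1 - phi) * 1_000_000

for s in (3, 4, 5):
    print(f"{s} sigma -> {dpmo(s):,.0f} DPMO")
```

This reproduces the article's figures: 66,807 at 3 sigma, 6,210 at 4 sigma and 233 at 5 sigma.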
Thank you for your interest.
1. Mikel J. Harry, The Vision of Six Sigma: A Roadmap for Breakthrough (Phoenix: Six Sigma Publishing Co., 1994).
3. A. Bender, "Statistical Tolerancing as it Relates to Quality Control and the Designer," ASQC Automotive Division Newsletter, 1975.
Article Should Define Term "Proportions"
If the term "proportions" does not mean "the number of defective parts in a large sample of products or parts," then Figure 2 in the article "Help Wanted: One Statistician Consultant" ("Emerging Sectors," Sandra Strasser and Thomas R. Hall, M.D., April 2001, p. 114) is incorrect.
A P-chart is based on the binomial distribution, and values such as percent on time and undefined proportions are not binomial random variables. So for the type of data in Figure 1 or 2, the correct control chart is most likely a chart of individuals based on moving ranges of size two (an X-MR control chart).
Also, the limits in Figure 2 are at 0 and 1. What was the sample size used to collect the data? With a very large sample size and p̄ = 0.55, the data could be modeled by the normal distribution.
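To illustrate the sample-size point (a sketch, not from the letter; the function name is hypothetical): the standard three-sigma binomial limits p̄ ± 3√(p̄(1 − p̄)/n) collapse to 0 and 1 when n is small, which is one way limits like those in Figure 2 can arise.

```python
from math import sqrt

def p_chart_limits(p_bar, n):
    """Three-sigma binomial control limits, clipped to [0, 1]."""
    half_width = 3 * sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half_width), min(1.0, p_bar + half_width)

for n in (5, 50, 500):
    lcl, ucl = p_chart_limits(0.55, n)
    print(f"n={n}: LCL={lcl:.3f}, UCL={ucl:.3f}")
```

At n = 5 the limits clip to exactly 0 and 1; only as n grows do they tighten around p̄.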
DONALD S. ERMER
Departments of Industrial
and Mechanical Engineering
University of Wisconsin-Madison
'Up Front' Column Misses the Point
I think the comments from Joseph M. Juran in Debbie Phillips-Donaldson's "Up Front" column ("Juran on the Future of Quality," June 2001, p. 6) miss the point. People in other sectors are hungry for some real information on quality, yet we continue to fail to listen to what they need. Instead, we provide it in our own language, in our way and at our time. If we really are going to reach other sectors, are we willing to learn what is going on in those sectors?
ETHAN J. MINGS
Good Job Merging Deming Philosophy and Standards
I recently had an opportunity to read R. Dan Reid's article "From Deming to ISO 9000:2000" in the June 2001 issue of QP ("Standards Outlook," p. 66). Reid did a great job merging the Deming philosophy and quality standards, without employing the usual chronological format.
I have enjoyed all the articles he has written for Quality Progress.
ROBERT D. ZACIEWSKI
Quality Network Implementation
Avoid Problems By Anticipating Them
I would like to commend Randall Goodden on his article in the June 2001 issue ("How a Good Quality Management System Can Limit Lawsuits," p. 55). I had just returned from conducting a meeting of our plant quality managers from around the world when I read the article, which reinforces the message that we had given during the meeting.
My company has long espoused the position that our quality system is our best means to address product safety and liability prevention. We have incorporated this aspect of the business responsibility directly into our quality system, and through the years it has proven its value.
As Goodden states, the best way to avoid problems is to anticipate and prevent them, and this is the precise intent of a good quality system. While no system is perfect, a strong quality system that is integrated throughout a business provides the substance required to deal with both the foreseen and unforeseen.
FRED D. MUNDT
Praxair Surface Technologies
In the June 2001 "Quality Web Watch" (p. 24), www.freequality.com should be www.freequality.org.