
A Math Error in January's One Good Idea?

In his article "Process Capability: Understanding the Concept," Jack Meagher (January 2000, p. 136) states, "For example, the motorcycle running 2 feet away from the barrier would have a capability of 2 because (5-2)/1.5=2." The math is wrong in this statement. If the motorcycle is 2 feet away from the barrier, it would be (5-3)/1.5=1.3333. Meagher's math indicates that the motorcycle is running 2 feet off center, or 3 feet from the barrier.

In the next example he states, "The compact car moving the same 2 feet toward one of the barriers would have a capability of 1 as (5-2)/3=1." This is a correct statement. The only other thing I noticed as unusual is that the semi is 9.5 feet wide. In Michigan, you need special permits to drive a vehicle or load more than 8.5 feet wide. I realize he probably used 9.5 to make the math easier, but even 10/8.5 = 1.176, which rounds to a Cp of 1.18.
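Janes' arithmetic maps directly onto the standard Cp and Cpk formulas. A minimal sketch of the lane analogy (the function names and the mapping to the analogy are my gloss, not from either letter or article):

```python
def cp(lane_width, vehicle_width):
    # Cp analog: total tolerance width over total process spread
    return lane_width / vehicle_width

def cpk(half_lane, offset, half_vehicle):
    # Cpk analog: distance from the vehicle's position to the nearest
    # barrier, over half the vehicle width
    return (half_lane - abs(offset)) / half_vehicle

# Motorcycle (3 ft wide, half-width 1.5) running 2 ft off center:
print(cpk(5, 2, 1.5))         # (5-2)/1.5 = 2.0
# The same motorcycle 2 ft from the barrier (3 ft off center):
print(cpk(5, 3, 1.5))         # (5-3)/1.5 ~ 1.33
# Compact car (6 ft wide) moved 2 ft toward a barrier:
print(cpk(5, 2, 3))           # (5-2)/3 = 1.0
# An 8.5 ft wide semi in a 10 ft lane, centered:
print(round(cp(10, 8.5), 2))  # 1.18
```

As the examples show, Meagher's "(5-2)/1.5=2" holds only for a vehicle 2 feet off center, which is exactly Janes' point.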

I do agree that this analogy is helpful because most everyone has "been there, done that."

MIKE JANES
Hillsdale, MI 
mikej@hillsdale.epcorp.com
  


Author's Response

Please note the following correction to the article "Process Capability: Understanding the Concept" in the January issue of Quality Progress.

The 3rd paragraph from the end should have read "... 2 feet away from the centerline ...", not "... the barrier."

Somewhere in the editing and approval process at ASQ, "centerline" was changed to "barrier" and I did not pick up the modification.

JACK R. MEAGHER
Peterborough, NH 
jmeagher@nhbb.com
 


Defect Probability of T ±4.5 Sigma Is 6.8 ppm, Not 3.4 ppm

In J. Stuart Hunter's article, "Statistical Process Control" (December 1999, p. 54), he states that, "The Six Sigma approach to total quality management has as one of its technical objectives the establishment of processes that produce parts per million defectives while allowing the mean µ to drift from the target T by as much as ±1.5 sigma. A process standard deviation is therefore required to be small enough to meet the specification limits of T ±4.5 sigma."

Nevertheless, the defect probability of T ±4.5 sigma is 6.8 ppm, not 3.4 ppm. Therefore, I think that the phrase "allowing the mean µ to drift from the target T by as much as ±1.5 sigma" should actually state "allowing the mean µ to drift from the target T by as much as +1.5 sigma or -1.5 sigma."

Also, "T ±4.5 sigma" should be "T +4.5 sigma, T -7.5 sigma" or "T +7.5 sigma, T -4.5 sigma." This means that the mean µ can drift to the left side 1.5 sigma or drift to the right side 1.5 sigma. So, the defect probability of T +4.5 sigma, T -7.5 sigma is 3.4 ppm, and the defect probability of T +7.5 sigma, T -4.5 sigma is 3.4 ppm.
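Cheng's figures are easy to verify from standard normal tail probabilities; a quick sketch using only the Python standard library:

```python
import math

def upper_tail_ppm(z):
    # P(Z > z) for a standard normal variable, in parts per million
    return 0.5 * math.erfc(z / math.sqrt(2)) * 1e6

# Centered process with specs at T +/- 4.5 sigma: both tails count.
print(round(2 * upper_tail_ppm(4.5), 1))   # 6.8 ppm

# Mean shifted +1.5 sigma with specs at T +/- 6 sigma: the near spec
# sits 4.5 sigma from the mean, the far spec 7.5 sigma away.
near = upper_tail_ppm(4.5)   # ~3.4 ppm
far = upper_tail_ppm(7.5)    # negligible
print(round(near + far, 1))  # 3.4 ppm
```

This confirms the letter: symmetric ±4.5 sigma limits give 6.8 ppm, while the shifted-mean case (one tail at 4.5 sigma, the other at 7.5 sigma) gives the familiar 3.4 ppm.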

JERRY CHENG
Hsinchu, Taiwan 
jerrycheng@mxic.com.tw
 


Author's Response

Both Jerry Cheng and I agree that the Six Sigma approach allows for motion in the mean to occur over the region T ±1.5 sigma while still keeping the probability of defects, items falling beyond the specifications (T ±6 sigma), very low.

To argue the details of computing probabilities in ppm requires one to accept the assumptions of exact normality and independence. Even when the assumptions are not exactly met, the Six Sigma approach remains valuable because we can still be confident that the probability of defects will remain very low.

Suppose motion in the mean could be perfectly eliminated and the assumptions of normality and independence exactly met. The probability of an item falling beyond the specs (T ±6 sigma) is now approximately 1 in 1 billion! Or is it 1 in 2 billion?

The Six Sigma approach is useful because it is robust to assumptions and recognizes motion in the mean. The Box-Jenkins approach is similarly robust and proposes to reduce motion in the mean by informing an operator when and how much to adjust a process. The essential motivation is not framed in ppm, but in the willingness to immediately employ available data to adjust a process to reduce variability.

J. STUART HUNTER
Princeton, NJ  
stuhunt@bellatlantic.net
 


Scarcity of Raw Materials Isn't a Real Problem

I take strong exception to some comments made in Greg Hutchins' article, "Help Your Career and the Environment," in the January 2000 issue of Quality Progress (p. 116). According to Marsha Willard, "More people are demanding more stuff, while the earth's ability to provide those things is declining." This is both true and false. People are demanding more stuff, but we are becoming more efficient in our use of stuff. Also, if Willard's statement were entirely true, we would be seeing a scarcity of some materials with concurrently increasing prices. There are no scarcities, and real prices for raw materials are declining.

Willard also states, "If an organization keeps on its current course, it can count on higher resource costs, higher disposal costs, more environmental regulations ... " Disposal costs are somewhat masked by the fact that they are often partially paid for with tax money. When true costs are calculated, they will probably show a decline due to our increased efficiency. As for environmental regulation, this is invariably true. Legislators seem to be unaware of Deming's Plan, Do, Study, Act (PDSA) cycle and appear to operate under their own Act, Act, Act, Act cycle.

According to Darcy Hitchcock later in the article, "Even many oil companies are now accepting global climate change as a key strategic issue, moving away from fossil fuels to renewable energy sources." While global climate change as a result of human activity is far from proven, the issue manages to stay in the public eye due to the efforts of certain zealous environmental groups who need to keep the issue stirred up so the contributions flow in. But Hitchcock is wrong when she says oil companies are moving away from fossil fuels. Oil companies make their money selling fossil fuels. A few oil companies, however, have jumped on the anti-coal bandwagon as a means to:

1. Appear to be more environmentally friendly.

2. Drum up more business for their own products--oil and gas.

Renewable energy resources are uneconomical, at least without generous tax subsidies, and will remain so for the foreseeable future.

Earlier in the article, Willard states that environmental sustainability is a response to a supply and demand problem. There is no problem and never has been. The predictions of disaster put forth by Thomas Robert Malthus 200 years ago are baseless. The alarmists refuse to get their facts straight, and the more often their predictions appear in print, the more some people believe in impending environmental disaster.

We've got real problems, and Deming's PDSA cycle provides an excellent process for dealing with them. Scarcity of raw materials and ever-increasing costs are not real problems, and a publication such as Quality Progress shouldn't waste space publicizing them.

HARRY E. NAGEL
Fort Worth, TX 
harrynagel@upr.com
 


Recalibration Schedules Are Preventive Tools

I felt the need to make a comment concerning a QP Mailbag letter published in the December 1999 issue (Alex T.C. Lau, "Calibration Is Always Subject To Variation," p. 8). The letter was written in regard to an article in the September 1999 issue by Philip Stein ("Avoiding Calibration Overkill," p. 74).

In the letter, Lau states, "I always advocate that recalibration should only be done when there are data that suggest the process has gone out of control." This statement is out of focus. First, recalibration schedules are used as preventive tools rather than corrective tools. A gage may need recalibration or adjustment even though it is still within an acceptable tolerance range.

There should also be some kind of verification at more frequent intervals than scheduled recalibrations in order to confirm gage status. To say that recalibrations should be done only when data suggest the process has gone out of control is like saying, "If it ain't broke, don't fix it." I think we have evolved beyond that old management policy.

CARLOS QUINTERO
Cidra, Puerto Rico 
carlosq@caribe.net
 


Author's Response

Perhaps I need to clarify what I mean when I say, " ... the data suggest the process has gone out of control." What I mean is that the data from regular verification of the gage using accepted standards suggest that the measurement process has gone out of control. This is based on the control charting of the verification data as opposed to the data from the manufacturing process. With this clarification, I trust you can see that this is not too different from your belief in making verifications at more frequent intervals.
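One way to picture Lau's clarification is to chart each verification reading of the gage against control limits computed from those readings. A minimal individuals-chart sketch (the function and the sample data are illustrative, not from the letters):

```python
def control_limits(measurements):
    # Shewhart individuals-chart limits: center line +/- 3 sigma-hat,
    # where sigma-hat = (average moving range) / d2, with d2 = 1.128
    # for moving ranges of size 2.
    mean = sum(measurements) / len(measurements)
    moving_ranges = [abs(b - a) for a, b in zip(measurements, measurements[1:])]
    sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma_hat, mean, mean + 3 * sigma_hat

# Repeated verification readings of a gage against a reference standard:
readings = [10.0, 10.1, 9.9, 10.05, 9.95]
lcl, center, ucl = control_limits(readings)
print(lcl, center, ucl)
```

A verification reading falling outside these limits would be the "data suggesting the measurement process has gone out of control" that, in Lau's scheme, triggers recalibration.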

ALEX T.C. LAU
Toronto, Ontario 
alex.lau@esso.com
 


Confidence in Competence Of Testing Labs Is Needed

The National Association of Testing Authorities (NATA), Australia, noted with interest the October 1999 article (Keeping Current, "ISO 9000 Used in Fight Against Drug Abuse in Sports," p. 20) about the doping control standard developed by the International Anti-Doping Arrangement (IADA). We, NATA, believe that it presents only part of the picture in relation to the adoption of international standards in the pursuit of drug-free sports.

The adoption of a quality management system conforming to ISO 9002 needs to be complemented by the adoption of third party accreditation to ISO/IEC Guide 25 (soon to be replaced by ISO/IEC 17025) of doping control laboratories to demonstrate the technical competence of those laboratories. The new IOC medical code for 2000 requires accreditation to ISO Guide 25 as the prerequisite for gaining IOC accreditation to do the tests. Currently, three doping control laboratories are accredited: two by NATA and one by the United Kingdom Accreditation Service.

While it is important that there is a quality system in place covering the collection of samples and their handling and transport under a demonstrable chain of custody, there needs to be confidence in the technical competence of the laboratories testing the samples if there is to be confidence in the test results. Accreditation of the doping control laboratories for compliance with ISO/IEC Guide 25 provides this confidence.

HELEN LIDDY
North Melbourne, Australia


The First Millennium Began With Year 1

I want to take issue with one point in an otherwise excellent Up Front column (Miles Maguire, "A Question of Timing," December 1999, p. 6). The choice of 0 or 1 as a starting point for the first millennium was not a matter of whim or error on the part of Denis the Short. The only choice that would allow consistent numbering for time before and after the first year of the first millennium was to use 1 as the name of the first year.

On the one hand, if Denis had chosen 0 as the name of the first year, the first millennium of the present era would extend from 0 to 999, and there would be no good answer to whether a year 0 also belonged to the previous era. This is simply because a year 0 could not be in two different millennia. It would lead to an inconsistent numbering scheme for the beginnings and ends of the millennia.

If, on the other hand, the years began with 1, then the millennia run from 1 to 1000 and 1001 to 2000 in our present era and from 1 to 1000 (think of this as -1 to -1000) and 1001 to 2000 (-1001 to -2000) in the pre-Christian era.

A thermometer has a 0 mark on it. However, if the temperature is 0.5°, we don't mean that the temperature is 0.5 of 0°, but that it is 0.5 of 1°. In a similar manner, the halfway point of the first year should not be 0.5 of year 0, but 0.5 of year 1.

I am well aware of the helpful practice in computer programming of starting some array numbering schemes with 0 so that the elements of the array are numbered, for example, -5, -4, -3, -2, -1, 0, 1, 2, 3, 4, 5. When we come to speak of such an array, we say its first element has an index of -5. For this concept to carry over to the year numbering problem, we would have to number the years from the true first year--that is, the year when time began with the Big Bang. And even then we would never say that the Big Bang happened in year 0, would we?
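Newland's 1-based scheme is easy to state in code. A small sketch (a hypothetical helper, assuming years numbered from 1 with no year 0):

```python
import math

def millennium(year):
    # Millennium number under 1-based year numbering (no year 0):
    # years 1-1000 are the 1st millennium, 1001-2000 the 2nd;
    # negative (pre-Christian era) years mirror symmetrically.
    if year == 0:
        raise ValueError("there is no year 0 in this scheme")
    sign = 1 if year > 0 else -1
    return sign * math.ceil(abs(year) / 1000)

print(millennium(1))      # 1
print(millennium(1000))   # 1
print(millennium(2000))   # 2
print(millennium(-1))     # -1
print(millennium(-1000))  # -1
```

Under this scheme the year 2000 falls in the second millennium, and the numbering runs consistently in both directions from year 1, which is the letter's point.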

RONALD NEWLAND, CSQE
Austin, TX 
rnewland@hbfgroup.com
 


Personal Advertisements Impersonate Article

It appears that "21 Voices for the 21st Century" (Quality Progress, January 2000, p. 31) was inadvertently considered an article. The 10-page piece contains little more than old, restated quality principles and 21 classified ads.

WILLIAM S. RICHARDSON
Houston, TX


'21 Voices for the 21st Century' Article Is Amusing

I would like to thank you for the very humorous piece in the January 2000 issue of Quality Progress, "21 Voices for the 21st Century" (QP Editors, p. 31). Although I am certain the article was not intended to be humorous, it certainly turned out that way.

While it failed to fulfill its stated intent of providing some illumination concerning the direction of the quality profession in the future, it was illustrative of a growing and very grave problem all of us face. The emergence of pseudointellectualism and its subsequent acceptance and deification threaten to bury original thinking for all time. I call the problem pseudointellectualism because "intellectuals" try to pass off a medley of recombinant textbook snippets, buzzwords and rhetorical catch phrase rubbish as original, insightful thought. Most of the people in the article hold advanced university degrees; 11 of them hold doctorates. I would have thought that at some point in their educational journeys they would have been required to demonstrate the ability to synthesize original ideas from what they've learned.

Now I'm sure that you will try to dismiss me as a frustrated, bitter person who takes potshots at learned people. This is not the case--I applaud anybody who achieves academic excellence. My problem is with the media, which seem to search out supposed experts who, when it comes down to it, really have nothing to say, but continue to speak nevertheless. Take, for example, this quote from the article, "They also need to stay up to speed with the opportunities the World Wide Web provides." One usually expects to receive this level of prescient, sage advice from a TV weather person. We've all seen it: when the temperature begins to climb into the 90s, the serious-faced weather person will come on and tell us all to wear light, loose-fitting clothing, stay in the shade and drink plenty of water.

What is my point? Simply this: I wish to put an end to the blind acceptance of this pseudointellectual babble as anything approaching significance. Too often people will just nod along to anything that sounds insightful rather than expending the mental effort to examine it critically and make their own decision as to whether it has merit. The effect of this is the slow but steady denigration of critical thought. Just because someone holds a doctorate does not place him or her above reproach. In closing I would like to ask that QP not become a contributor to this most opprobrious and pernicious assault on true intellect by giving voice and credence to these or any other specious representatives of academia.

KENNETH J. KLEIN
Englewood, OH 
kklein@siscom.net
 


Answers to 'Preventive, Corrective Case Study'

I have received queries from a number of Quality Progress readers regarding the answers to the five questions at the end of my previous QP Mailbag letter "Preventive, Corrective Case Study" (January 2000, p. 8).

Before giving the answers, I would like to share the general definitions of corrective and preventive actions. A corrective action is an action taken to prevent the reoccurrence of a nonconformance. It is an action taken after a nonconformance has already occurred. A preventive action is an action taken to prevent the occurrence of a nonconformance. It is an action taken before a nonconformance occurs. The key words separating corrective and preventive actions are reoccurrence and occurrence.

For example, suppose there are two machines, A and B, that produce the same product. Once upon a time, a nonconformance was noticed in the product produced by machine A. An investigation was performed in order to locate the root cause, and subsequent action was taken to prevent another occurrence of the same nonconformance. This is called a corrective action. Now, based on what happened to machine A, the same action is taken on machine B even though a nonconformance has not yet occurred. This action taken on machine B prevents the occurrence of nonconformance and is, therefore, called a preventive action.

As for the answers to the five questions asked in the case study, I want to acknowledge that they are based on personal opinion. The answers are as follows:

1. What was the nonconformity? The plague.

2. What was the root cause of nonconformity? Untidiness and the spread of disease through rats.

3. What was the disposition action?

a. Admit all plague infected people into the hospitals and nursing homes.

b. Ban public assembly.

c. Close public meeting places, such as cinemas.

d. Vaccinate everyone in Surat.

4. What was the corrective action?

a. The cleaning movement and education of the public about cleanliness.

b. Review of the garbage disposal system by competent authorities.

5. What was the preventive action?

a. Sealing the borders of Surat and prohibiting people from migrating.

b. Promoting cleanliness throughout India.

c. Trying to keep districts clean through local efforts.

P.K. SRIVASTAVA
Sultanate of Oman 
srivastavapk@hotmail.com
 


The CQE Approach Does Differ From Six Sigma

I read Paul Ipolito's letter ("Six Sigma Lacks Breakthrough Thinking," January 2000, p. 13) and thought he represented the thoughts of many certified quality engineers (CQEs). When I read one of the articles on Six Sigma that was published in Quality Progress last year, I immediately noticed the similarity between the CQE body of knowledge and the Six Sigma Black Belt course contents.

However, in my opinion, the two major differences between the CQE approach and Six Sigma breakthrough strategy are:

1. The selection of projects that have a direct, measurable impact on the bottom line.

2. The direct linkage of the projects to customer satisfaction.

Thus, the Six Sigma approach allocates the best trained resources (Black Belts) to the projects that will have a very high impact on customer satisfaction and the bottom line. CQEs may tend to concentrate their attention on the problems that appear important to them.

I do agree with Ipolito that CQEs who regularly practice the tools to solve important problems are playing the role of a Black Belt. They just don't call it that, and they're not making the Six Sigma consultants rich. After all, we had been making improvements way before we ever heard of Six Sigma!

HEMANT P. URDHWARESHE
CQE, Certified Quality Manager
India 
hpurdhwareshe@hotmail.com
  


Author's Response

Thank you for your response, but I must point out a flaw in your logic and state my belief that there is nothing new in regard to Six Sigma.

Any project I work on must have a bottom line impact and it must increase customer satisfaction. Otherwise, I am forced to wonder why I'm devoting resources to it.

Please accept my response to your comments in the spirit of good, thought-provoking discussion.

PAUL A. IPOLITO
CQT, CQA, CQE, Certified Quality Manager
Rochester, NY 
paul.ipolito@lightnin.gensig.com
 

We welcome your letters. Send them to EDITOR, ASQ/QUALITY PROGRESS, 611 E. WISCONSIN AVE., PO BOX 3005, MILWAUKEE, WI 53201-3005; or e-mail them to editor@asq.org. Please include address, daytime phone number and e-mail address. Whenever possible, the e-mail addresses will be included with published letters. Due to space restrictions, Quality Progress will publish a selection of letters in the magazine. All letters will be published on QP Forum, or you can post your comment on QP Forum directly at www.asqnet.org. We reserve the right to edit letters for space and clarity.

On the Lighter Side, 'QP' Introduces Mr. Pareto Head

