
STATISTICS ROUNDTABLE

Play it Again, Sam

by Lynne B. Hare

People who know a lot about old movies say Humphrey Bogart, who played Rick Blaine in Casablanca, never really said, “Play it again, Sam” to Dooley Wilson, who played Sam. But the saying continues to stick. There is something comfortably appealing about the smoky bar and the sulking attitude Bogart portrays while listening to a mournful, philosophical “As Time Goes By” repeated throughout the movie.

Rick takes solace in the familiar song, booze and surroundings, and so do we. You can tell because, long after the movie has ended, men go around for several days saying things like “Play it again, Sam,” and “Here’s looking at you, kid,” with a slight lisp until they and their kids grow tired of it.

There is comfort in repetition. I’ve noticed on the industrial side that some people appear to take great joy, not merely solace, in reworking product already produced. Once, years ago, I visited a glass factory and noticed the employees seemed unperturbed about casting defective bottles off the manufacturing line. “Isn’t that expensive?” I asked, seeing large quantities being reworked and thinking prevention would be more efficient.

“No. It doesn’t cost anything,” came the answer. “We just melt them down and start over again.” Now I was a visitor, and I did not want to wear out my welcome by pressing the point, but it sure looked expensive to me. After all, some of the work invested in creating the defective units had to be repeated, and that must have cost something. Sensing my disbelief, my host said, “Actually, by reprocessing, we can run our ovens at a lower temperature, and that saves fuel.”

“Now just a darn minute!” I thought, but I bit my tongue.

As it turns out, I wasn’t able to convince that supplier to consider another way of looking at the process despite subsequent discussions of first-run yield, of rework being the most expensive ingredient, and of the laws of conservation of matter and energy. I even facetiously suggested the company could make glass bottles with the ovens at room temperature if it reprocessed enough times.

Unfortunately, it was all to no avail. The company refused to leave its comfort zone, and I wonder if that’s one reason there aren’t as many glass suppliers now as there were 25 years ago.

The Case Against Rework

There are a number of reasons why you should work to reduce or eliminate rework. First, it’s simply too expensive. Consider the flow diagram in Figure 1.


Here, raw materials are processed at stage one, and the value added is marked by a $. The product is checked. If it is rejected, $ is lost, and the material goes back to stage one. If accepted, it goes on to stage two where more value ($$) is added. Again the product is checked and, if rejected this time, $$ are lost, and the product returns to stage one. Successful product flow continues to stage three, adding yet more value ($$$), and so on through as many stages as necessary to generate a finished product.
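To put rough numbers on that flow, here is a small simulation sketch in Python. It is my illustration rather than anything from the column: the three stage costs, the 5% reject rates and the rule that a rejected unit re-incurs every stage cost from stage one onward are assumed values, chosen only to show how a late rejection throws away the value added up to that point.

import random

# Hypothetical stage costs and reject rates for the three-stage flow above.
STAGE_COST = [1.0, 2.0, 3.0]      # value added ($, $$, $$$) at stages one to three
REJECT_RATE = [0.05, 0.05, 0.05]  # chance the check after each stage rejects the unit

rng = random.Random(42)

def cost_of_one_good_unit():
    """Total value spent, rework included, to get one unit through all stages."""
    spent = 0.0
    stage = 0
    while stage < len(STAGE_COST):
        spent += STAGE_COST[stage]    # value added at this stage
        if rng.random() < REJECT_RATE[stage]:
            stage = 0                 # rejected: back to stage one
        else:
            stage += 1                # accepted: on to the next stage
    return spent

ideal = sum(STAGE_COST)  # cost if nothing is ever rejected
average = sum(cost_of_one_good_unit() for _ in range(10_000)) / 10_000
print(f"rework-free cost: ${ideal:.2f}   average cost with rework: ${average:.2f}")

Even with these modest reject rates, the average cost per good unit lands noticeably above the rework-free total, and the gap grows quickly as reject rates climb.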

Other costs may also be involved. For example, there may be costs associated with melting, separating or grinding rejected intermediate material. Some organizations inadvertently hide these costs by scheduling loss allowances in the manufacturing budget. Inefficiencies within those loss allowances are mostly ignored because they are considered part of the cost of doing business.

Zero-based budgeting might be more appropriate here because it would shine a light on opportunities for improvement and corresponding cost savings, revealing the hidden plant.

Another cost encountered with rework is the cost of the technical effort for reprocessing or discarding rejected material. Technical resources are better allocated to the prevention of rework than to figuring out what to do with it once it is created.

The bottom line is that organizations with less rework gain a competitive advantage from lower manufacturing costs. Rework isn’t free. As a matter of fact, it may be the most expensive ingredient because the product is made twice.

Second, permitting rework can reward shoddy work, especially if the financial system provides loss allowances. If a certain amount of lax behavior is allowed, it will come to be expected, and then it will become the norm. But norms have their variation and their extremes, and human nature will push the bounds of acceptance toward the extremes, all of which leads to slowly expanding tolerances for increasing rework.

The third reason concerns traceability. If batches contain 10% rework, then the current batch contains 10% of its predecessor plus 1% of the batch before that plus 0.1% of the batch before that and so on. A contaminant can be present, at least at low levels, for a long time in reworked batches. Maintaining traceability in such a system is no easy task.
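A few lines of arithmetic, sketched here in Python, make the point. The 10% carryover comes from the example above; the assumption that the contaminant entered only a single batch is mine, added for illustration.

# Hypothetical sketch: a contaminant enters batch 0 only, and every later
# batch is blended with 10% rework carried over from the batch before it.
CARRYOVER = 0.10   # fraction of each batch that is rework from the prior batch
level = 1.0        # contaminant level in batch 0, in arbitrary units

for batch in range(1, 8):
    level *= CARRYOVER   # each batch retains 10% of the one before it
    print(f"batch {batch}: contaminant at {level:.7f} of its original level")

The contaminant never quite reaches zero; it just shrinks by a factor of 10 per batch, which is why the traceability trail cannot simply stop at the previous batch.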

That problem carries over to the fourth point in the case against the use of rework. Go back and reread W. Edwards Deming’s 14 points.1 Notice how they are all about the reduction of variation. Now take a look at materials used in Six Sigma education. They are all about reducing variation, too. When you reduce variation, two things happen, and they’re both good:

  1. The consumer’s second experience with the product is more like the first. It builds consumer confidence and, therefore, repeat sales.
  2. The production line exhibits more laminar, less turbulent, flow. It has fewer stops and less downtime. The resulting efficiency improvement goes right to the bottom line.

But if you add rework back into the process, you increase variation. You go in the opposite direction from that desired to improve quality and productivity (see sidebar, “Rework Increases Variation”).

So despite the comfortable old shoe of living with rework, of playing it again and of listening to “As Time Goes By,” we should continue to oppose the generation and use of rework. Good luck, and here’s looking at you, kid.

Rework Increases Variation

Here’s a way to show how the addition of rework increases variation. Suppose a process with 10% rework were modeled autoregressively. We would have

z_t = a_t + φa_{t−1} + φ²a_{t−2} + φ³a_{t−3} + ... ,

where z_t is the current deviation from target of some key process measure; a_t, a_{t−1}, a_{t−2}, ... are independent, random errors associated with the present and previous batches; and φ measures the correlation between successive batches (−1 < φ < 1).

It turns out the variance of z_t is Var(z_t) = σ_a²(1 + φ² + φ⁴ + φ⁶ + ...), where σ_a² is the variance of each a_t. The quantity in parentheses on the right-hand side is a sum of even powers of φ, so every term is nonnegative and the series sums to 1/(1 − φ²), which is greater than 1 whenever φ ≠ 0. Therefore, the variance of z_t is always greater than the variance of a_t. Why am I going on like this? Because it illustrates that the variation of product with rework is always greater than the variation of product without rework.
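For readers who want to see the inequality in action, here is a brief Python simulation sketch of the sidebar’s model. It is my own illustration, not part of the column: φ = 0.10 is an assumption suggested by the 10% rework example, and σ_a = 1 is an arbitrary choice of units.

import random
import statistics

# z_t = a_t + phi*z_{t-1} is the recursive form of
# z_t = a_t + phi*a_{t-1} + phi^2*a_{t-2} + ...
PHI = 0.10        # assumed batch-to-batch carryover from 10% rework
SIGMA_A = 1.0     # standard deviation of the independent batch errors
N_BATCHES = 100_000

rng = random.Random(1)
a = [rng.gauss(0.0, SIGMA_A) for _ in range(N_BATCHES)]

z = [a[0]]
for t in range(1, N_BATCHES):
    z.append(a[t] + PHI * z[t - 1])   # rework carries part of the last batch forward

print(f"Var(a) ≈ {statistics.variance(a):.4f}")
print(f"Var(z) ≈ {statistics.variance(z):.4f}")   # exceeds Var(a)
print(f"theory: sigma_a^2 / (1 - phi^2) = {SIGMA_A**2 / (1 - PHI**2):.4f}")

The simulated Var(z) comes out close to the theoretical σ_a²/(1 − φ²) and above Var(a), just as the algebra says it must.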


REFERENCE

  1. W. Edwards Deming, Out of the Crisis, The MIT Press, 2000.

LYNNE B. HARE is program director of applied statistics at Kraft Foods Research in East Hanover, NJ. He received a doctorate in statistics from Rutgers University, New Brunswick, NJ. Hare is a past chairman of ASQ’s Statistics Division and a Fellow of both ASQ and the American Statistical Association.
