
STATISTICS ROUNDTABLE

Tried and True

Organizations put statistical engineering to the test, see results

by Roger W. Hoerl and Ronald D. Snee

A review of past Statistics Roundtable columns reveals that statisticians and quality professionals are always looking for better ways to increase the breadth and effectiveness of the use of statistical thinking and methods.

In last month’s column, we introduced statistical engineering as an approach that links statistical thinking with statistical methods and provides a mechanism for improving the effectiveness of statistical thinking and methods.1

Statistical engineering works because statistical concepts, methods and tools are linked and sequenced, based on sound science and embedded in work processes with the aid of software. Embedding, which means building statistical methods into the standard operating procedures for business processes, is a powerful way of institutionalizing the proper use of statistical methods.

Statistical engineering is effective in solving problems and improving processes. What follows are several cases illustrating how statistical engineering is used and the breadth of its application.

Transactional processes

The first example of statistical engineering is improving the performance of a transactional process: the process of collecting money from those who are past due on their accounts. 

Statistical thinking and methods, as well as proper use of data, can be embedded into the operation of the process to improve performance and to institutionalize this improved approach. Table 1 illustrates some simple ways statistical thinking and methods can be embedded in such a process. The statistical tools involved include process maps, run charts, control charts, Pareto charts, measurement system analysis using attribute measurement studies, and design of experiments (DoE).

Table 1: Ways to embed statistical thinking and methods in a collections process
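To make the idea of embedding concrete, here is a minimal sketch of one tool from that list: a Pareto chart of reasons that accounts become past due. The categories and counts are invented for illustration and are not from the GE case.

```python
# Minimal Pareto-chart sketch for a collections process.
# Categories and counts are hypothetical, for illustration only.
import matplotlib.pyplot as plt

reasons = {
    "Moved, no forwarding address": 420,
    "Billing dispute": 260,
    "Financial hardship": 180,
    "Invoice never received": 90,
    "Other": 50,
}

# Sort categories by count, descending, and compute cumulative percentages.
items = sorted(reasons.items(), key=lambda kv: kv[1], reverse=True)
labels, counts = zip(*items)
total = sum(counts)
cum_pct = [100 * sum(counts[: i + 1]) / total for i in range(len(counts))]

fig, ax1 = plt.subplots()
ax1.bar(labels, counts)                # frequency bars
ax2 = ax1.twinx()
ax2.plot(labels, cum_pct, marker="o")  # cumulative-percentage line
ax2.set_ylim(0, 110)
ax1.set_ylabel("Number of past-due accounts")
ax2.set_ylabel("Cumulative percentage")
ax1.tick_params(axis="x", rotation=30)
plt.tight_layout()
plt.show()
```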

Perhaps the only method requiring some explanation is the use of DoE to improve collection strategies. In fact, DoE has many potential applications in the service industries.2 This first example is based on actual DoE applications in collections at General Electric (GE).3

In this case, GE’s credit card division had many collectors attempting to find customers who owed money on their cards but could not be found at the addresses or phone numbers on record. Many of these customers were willing to pay, but GE had no way of getting their bills to them.

Unfortunately, each collector had his or her own process, usually involving multiple external data sources, such as external credit agencies, public records or other GE credit-card files.

By applying DoE to identify the most effective collection strategy, GE was able to standardize the approach taken by collectors, resulting in additional collections of almost $3 million per year.
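GE’s actual factors and results are not published in this column, so the following sketch uses hypothetical factors (contact channel, time of day and external data source) in a two-level full factorial and estimates main effects from invented locate rates.

```python
# Hypothetical 2^3 factorial for a collections strategy study.
# Factors and responses are invented; GE's actual design is not public.
import itertools
import numpy as np

factors = ["contact_channel", "time_of_day", "data_source"]    # hypothetical
design = np.array(list(itertools.product([-1, 1], repeat=3)))  # 8 runs in coded units

# Simulated response: percentage of "lost" customers located in each run.
located_pct = np.array([12, 15, 14, 18, 22, 27, 25, 33])       # illustrative data

# Main effect of a factor: mean response at +1 minus mean response at -1.
for j, name in enumerate(factors):
    high = located_pct[design[:, j] == 1].mean()
    low = located_pct[design[:, j] == -1].mean()
    print(f"{name}: estimated main effect = {high - low:+.1f} percentage points")
```

A design like this lets every collector strategy element be tested at once, which is what allowed GE to standardize on the most effective combination.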

At GE, the contribution of statistical engineering was to identify how methods effective in one application arena could also be applied in a new arena—collections—and also to ensure the ongoing application of these methods by embedding them into standard work processes.

Managing product quality

Another tangible example of statistical engineering in the field of quality improvement is the Product Quality Management System (PQM) developed at DuPont in the 1970s. PQM is an overall framework for managing the quality of a product or service. It provides an operational system that enables marketing, R&D, production and support personnel to work together to meet increasingly stringent customer requirements.

In his 1986 American Statistical Association Invited Presidential Address, DuPont CEO Richard E. Heckert commented:

"Within two years, product quality had improved to the point of commanding a marketplace advantage, and more than $30 million had been gained in operating cost improvements. The statistically based PQM system developed for ‘Dacron’ [a polyester fiber] was expanded to other products with further contributions in earnings."4

The statistical tools embedded in the PQM approach included:

  • Product and process sampling schemes.
  • Product lot release methods.
  • Analysis of variance and variance components.
  • Cumulative sum (CuSum) control charts for process control (see the sketch following this list).
  • Shewhart control charts.
  • Interlaboratory studies of measurement methods.
  • DoE.
  • Response surface methods.
  • Data graphics.
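To make one of these entries concrete, the sketch below computes a tabular CuSum for an invented process stream. The target, standard deviation and observations are made up, and the slack value k and decision interval h use the common textbook choices of 0.5 and 5 times the process standard deviation.

```python
# Minimal tabular CuSum sketch; target, sigma and data are invented.
import numpy as np

x = np.array([10.1, 9.8, 10.3, 10.6, 10.9, 11.2, 10.8, 11.4])
target, sigma = 10.0, 0.5
k, h = 0.5 * sigma, 5.0 * sigma  # common textbook slack value and decision interval

c_plus = c_minus = 0.0
for i, xi in enumerate(x, start=1):
    c_plus = max(0.0, c_plus + (xi - target - k))    # accumulates upward drift
    c_minus = max(0.0, c_minus + (target - xi - k))  # accumulates downward drift
    flag = "  <-- out-of-control signal" if c_plus > h or c_minus > h else ""
    print(f"obs {i}: C+ = {c_plus:.2f}, C- = {c_minus:.2f}{flag}")
```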

A training system was also in place to provide DuPont staff with the needed knowledge and skills to use PQM.

Managing product quality was emphasized; the tools were not. This was evident in the role definitions of statisticians and quality professionals found in DuPont’s PQM manual in 1988, which described them as:

  • Leaders in developing their organization’s strategy for quality management.
  • Leaders in developing technology systems for quality management.
  • Participants in the business planning process.
  • Participants in problem-solving activities.
  • Leaders in initiating and implementing quality management training systems.

This manual, which goes back more than 20 years, indicates that the roles of statisticians and quality professionals change when statistical thinking and methods are embedded in the processes used to operate the business. Those using statistical engineering must provide leadership. This example shows the leadership role needed to implement statistical engineering by connecting statistical tools with statistical thinking, as depicted in Figure 1 of last month’s Statistics Roundtable column.5

The contribution of statistical engineering in this example was to integrate various techniques (statistical and otherwise), software, a training system, a manual and new roles for statisticians and quality professionals into an overall quality management system to improve quality at lower costs.

In essence, the whole was greater than the sum of the parts. This difference was the unique contribution of statistical engineering.

Streamlining experimentation

Another example involves streamlining experimentation. This application of statistical engineering was developed at DuPont in the 1960s and was offered in public workshops beginning in the 1970s.6

The strategy identifies three experimental environments: screening, characterization and optimization. The strategy sequences and links a variety of experimental designs that are used depending on the experimental environment.

The screening phase explores the effects of a large number of variables with the objective of identifying a smaller number of variables to study further in characterization or optimization experiments. Screening studies typically use fractional-factorial and Plackett-Burman designs to collect data. Follow-up screening experiments involving additional factors may be needed when results of the initial experiments aren’t promising. On several occasions, we’ve seen the screening experiment solve the problem.
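As a sketch of how such a screening design can be constructed, the code below builds a 2^(5-2) resolution III fractional factorial in coded units using the standard generators D = AB and E = AC. The five factors are placeholders, not from any specific study.

```python
# Sketch: a 2^(5-2) resolution III screening design in coded units,
# built with the standard generators D = AB and E = AC.
# The five factors are placeholders, not from any specific study.
import itertools
import numpy as np

base = np.array(list(itertools.product([-1, 1], repeat=3)))  # full factorial in A, B, C
A, B, C = base[:, 0], base[:, 1], base[:, 2]
D, E = A * B, A * C                                          # generated columns
design = np.column_stack([A, B, C, D, E])                    # 8 runs, 5 factors

print("Run  A  B  C  D  E")
for i, row in enumerate(design, start=1):
    print(f"{i:>3} " + " ".join(f"{int(v):+d}" for v in row))
```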

In the characterization phase, we try to better understand the system by estimating interactions and linear (main) effects. Here, full-factorial and fractional-factorial designs are used.

In the optimization phase, we develop a predictive model for the system to find useful operating conditions through response surface contour plots and perhaps mathematical optimization. Response surface designs are used to collect data in these studies.
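Continuing the sketch into the optimization phase, the code below fits a full quadratic model in two coded factors to a face-centered central composite design. The response values are invented; in practice, the fitted surface would then be explored with contour plots or mathematical optimization, as described above.

```python
# Sketch: fit a full quadratic model in two coded factors to a
# face-centered central composite design. Response values are invented.
import numpy as np

# Design points: 2^2 factorial, four axial points (alpha = 1), three center runs.
pts = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1],
                [-1, 0], [1, 0], [0, -1], [0, 1],
                [0, 0], [0, 0], [0, 0]], dtype=float)
y = np.array([76, 82, 80, 91, 79, 88, 78, 86, 89, 90, 88], dtype=float)

x1, x2 = pts[:, 0], pts[:, 1]
# Model: y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
X = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares fit
print("b0, b1, b2, b12, b11, b22 =", np.round(coef, 2))
```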

This screening, characterization and optimization (SCO) strategy actually embodies several substrategies, each a subset of the overall sequence:

  • Screening-characterization-optimization.
  • Screening-optimization.
  • Characterization-optimization.
  • Screening-characterization.
  • Screening.
  • Characterization.
  • Optimization.

The end result of each of these sequences is a completed project. There is no guarantee of success in a given instance, only that the SCO strategy will increase your chances of succeeding.

The strategy used depends on the characteristics of the experimental environment. These characteristics involve program objectives, the nature of the factors (Xs) and responses (Ys), the resources available, the quality of the information to be developed and the theory available to guide the experimental design and analysis. A careful diagnosis of the experimental environment along these lines can have a major effect on the success of the experimental program.7

Statistical engineering contributes by taking proven DoE techniques, which are often researched and taught in isolation from one another, and integrating them into an overall experimentation strategy. This strategy allows practitioners to achieve greater results than they could achieve using the same DoE techniques in isolation.

Improving sales and operations

James L. Hess, vice president of operations services at Leggett and Platt, described another use of statistical engineering that helped the company improve its sales and operational planning process, a critical business process.8

This example involved embedding statistical modeling and forecasting methods within the demand planning part of the process. Using data and applying proven statistical methods are integral to the business process, as is the commercial software used to support the workflows.  

The statistical tools used included Bayesian time series techniques for modeling data. These techniques were tailored to the application.

For example, if adequate data were available, seasonality was incorporated into the models. Rather than relying solely on statistical methods, the models were constructed and used to make forecasts that could be revised based on outside information, integrating statistical and knowledge-based approaches.
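Hess’s models are proprietary, so the following is only a loose illustration of the idea: a conjugate normal (Bayesian) update of a demand-level estimate, with a monthly seasonal index applied only when at least two full years of history are available. All numbers are invented.

```python
# Loose illustration only, not Leggett and Platt's proprietary models:
# a conjugate normal update of a demand-level estimate, with a monthly
# seasonal index used only when enough history exists. Numbers are invented.
import numpy as np

def update_level(prior_mean, prior_var, obs, obs_var):
    """One normal-normal Bayesian update of the demand-level estimate."""
    w = prior_var / (prior_var + obs_var)  # weight given to the new observation
    post_mean = prior_mean + w * (obs - prior_mean)
    post_var = prior_var * obs_var / (prior_var + obs_var)
    return post_mean, post_var

rng = np.random.default_rng(0)
months = np.arange(36)  # three years of invented monthly demand
history = 120 + 10 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 3, 36)

mean, var = history[:12].mean(), 25.0  # rough prior from the first year
for obs in history[12:]:
    mean, var = update_level(mean, var, obs, obs_var=9.0)

# Incorporate seasonality only if at least two full years of data exist.
if len(history) >= 24:
    season = history.reshape(-1, 12).mean(axis=0) - history.mean()
else:
    season = np.zeros(12)

next_month = len(history) % 12
print(f"Forecast for next month: {mean + season[next_month]:.1f}")
# In practice the forecast would then be revised with outside
# (knowledge-based) information, as the article describes.
```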

The main benefits came from reduced inventories at the same customer service level and from improved production efficiency. This freed cash and reduced the expenses associated with carrying inventory. One of the most significant benefits was the reduced opportunity for slow-moving and obsolete inventory.

A smaller part of the business case was improved labor efficiency. While the financial impact of this approach is company confidential, Hess reported that this business process approach contributed significantly to the company’s bottom line.

In this case, the contribution of statistical engineering was embedding the methods to ensure institutionalization, integrating statistical and knowledge-based approaches, and tailoring the Bayesian time series approach so it could distinguish cases with enough data for seasonality estimation from cases without.

Underlying theory

The reasons for the effectiveness of statistical engineering can be seen when you consider the theory underlying its use. We believe there are at least five aspects of this theory:

  1. A system or strategy is needed to guide the effective use of statistical tools.
  2. The impact of statistical thinking and methods can be increased by integrating several statistical tools, enabling practitioners to deal with highly complex issues that cannot be addressed with any one method.
  3. Linking and sequencing the use of statistical tools speeds the learning of the approach, thereby increasing the impact of the method.
  4. Embedding statistical thinking and tools into daily work institutionalizes their application.
  5. Viewing statistical thinking and methods from an engineering context provides a clear focus on problem solving to the benefit of humankind.

Over time, as statistical engineering is used, this theory will be challenged and revised as more is discovered. As a result, the breadth, depth and effectiveness of statistical thinking and methods in practice will increase—an objective on which the vast majority of statisticians and quality professionals can agree.

Realizing the promise

Statistical engineering provides a thought process that enables you to integrate, link and sequence statistical tools in a way that facilitates embedding them in the processes and systems used to run the business. As a result, the benefits of the problem solution or process improvement are more sustainable.

Broad use of statistical engineering will move us closer to realizing the promise of statistical thinking and methods. Leadership by statisticians and quality professionals is required to apply statistical engineering effectively. Taking on that leadership will result in impactful roles for those who accept the challenge.


References

  1. Roger W. Hoerl and Ronald D. Snee, "Closing the Gap," Quality Progress, May 2010, pp. 52-53.
  2. J. Ledolter and A.J. Swersey, Testing 1-2-3: Experimental Design With Applications in Marketing and Sales Operations, Stanford Business Books, 2007.
  3. Gerald J. Hahn, Necip Doganaksoy and Roger W. Hoerl, "The Evolution of Six Sigma," Quality Engineering, Vol. 12, No. 3, 2000, pp. 317-326.
  4. Richard E. Heckert, "Presidential Invited Address," American Statistical Association Annual Meeting, Chicago, 1986.
  5. Hoerl and Snee, "Closing the Gap," see reference 1.
  6. Ronald D. Snee, "Raising Your Batting Average," Quality Progress, December 2009, pp. 64-68.
  7. Ibid.
  8. Author communication with James L. Hess, 2009.


© 2010 Roger W. Hoerl and Ronald D. Snee


Roger W. Hoerl is manager of GE Global Research’s applied statistics lab. He has a doctorate in applied statistics from the University of Delaware in Newark. He is an ASQ fellow and a recipient of the ASQ Shewhart Medal and Brumbaugh Award. Hoerl is also an Academician in the International Academy for Quality.

Ronald D. Snee is president of Snee Associates LLC in Newark, DE. He has a doctorate in applied and mathematical statistics from Rutgers University in New Brunswick, NJ. Snee has received the ASQ Shewhart and Grant medals, and is an ASQ fellow and an Academician in the International Academy for Quality.

