
Statistical Methods in Quality Improvement



Quality Glossary Definition: Statistics

Statistics is defined as a field that involves tabulating, depicting, and describing data sets.

Statistical methods in quality improvement are defined as the use of collected data and quality standards to find new ways to improve products and services. They form a formalized body of techniques that characteristically attempt to infer the properties of a large collection of data.

The use of statistical methods in quality improvement takes many forms, including:

Hypothesis Testing

Two hypotheses are evaluated: a null hypothesis (H0) and an alternative hypothesis (H1). The null hypothesis is a "straw man" used in a statistical test. The conclusion is to either reject or fail to reject the null hypothesis.
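
To make the decision rule concrete, the sketch below runs a one-sample t-test in Python with SciPy and compares the resulting p-value to a significance level. The fill-weight data, the 250 g target, and the 0.05 threshold are all assumptions made for illustration, not values from any real process.

    from scipy import stats

    # Hypothetical fill weights (grams) sampled from a filling process
    sample = [251.8, 249.6, 252.3, 250.9, 248.7, 251.2, 250.4, 249.9]

    # H0: the process mean equals the 250 g target; H1: it does not
    t_stat, p_value = stats.ttest_1samp(sample, popmean=250.0)

    alpha = 0.05  # chosen significance level
    if p_value < alpha:
        print(f"p = {p_value:.3f} < {alpha}: reject H0")
    else:
        print(f"p = {p_value:.3f} >= {alpha}: fail to reject H0")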

Regression Analysis

Determines a mathematical expression describing the functional relationship between one response and one or more independent variables.
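
For example, a simple linear regression fits an expression of the form y = b0 + b1*x relating a single response to one independent variable. The temperature and hardness values below are invented for illustration; this minimal sketch uses SciPy's linregress to estimate the coefficients.

    from scipy import stats

    # Hypothetical data: oven temperature (x) and part hardness (y)
    temperature = [160, 165, 170, 175, 180, 185, 190]
    hardness = [41.2, 42.8, 44.1, 45.9, 47.2, 48.8, 50.1]

    # Fit hardness = intercept + slope * temperature
    fit = stats.linregress(temperature, hardness)

    print(f"hardness ~ {fit.intercept:.2f} + {fit.slope:.3f} * temperature")
    print(f"R^2 = {fit.rvalue**2:.3f}, p-value for the slope = {fit.pvalue:.2e}")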

Statistical Process Control (SPC)

Monitors, controls, and improves processes through statistical techniques. SPC identifies when processes are out of control due to special cause variation (variation caused by special circumstances, not inherent to the process). Practitioners may then seek ways to remove that variation from the process.
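
As a small illustration, an individuals (X) chart places control limits three estimated standard deviations on either side of the process mean, with the standard deviation estimated from the average moving range; points outside the limits suggest special cause variation. The measurements below are hypothetical, and the sketch omits the companion moving-range chart and the other common out-of-control rules.

    import numpy as np

    # Hypothetical individual measurements from a process
    x = np.array([10.2, 9.8, 10.1, 10.4, 9.9, 10.0, 10.3, 9.7, 10.1, 11.6, 10.0, 9.9])

    # Estimate short-term variation from the average moving range (d2 = 1.128 for n = 2)
    sigma_hat = np.abs(np.diff(x)).mean() / 1.128

    center = x.mean()
    ucl = center + 3 * sigma_hat  # upper control limit
    lcl = center - 3 * sigma_hat  # lower control limit

    print(f"CL = {center:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
    for i, value in enumerate(x, start=1):
        if value > ucl or value < lcl:
            print(f"Point {i} ({value}) is outside the limits: possible special cause")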

Design and Analysis of Experiments

Planning, conducting, analyzing, and interpreting controlled tests to evaluate the factors that may influence a response variable.
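
As one simple case, a full-factorial experiment with two factors at two levels (a 2x2 design) lets each main effect be estimated as the average response at the factor's high level minus the average at its low level. The factors and responses below are made up to illustrate the arithmetic.

    import numpy as np

    # Hypothetical 2x2 factorial: coded levels (-1 = low, +1 = high)
    # Factors: A = cure temperature, B = cure time; response = bond strength
    A = np.array([-1, +1, -1, +1])
    B = np.array([-1, -1, +1, +1])
    strength = np.array([18.3, 24.1, 20.7, 29.5])

    # Main effect = mean response at the high level minus mean at the low level
    effect_A = strength[A == +1].mean() - strength[A == -1].mean()
    effect_B = strength[B == +1].mean() - strength[B == -1].mean()
    # The interaction effect uses the product of the coded columns
    effect_AB = strength[A * B == +1].mean() - strength[A * B == -1].mean()

    print(f"Effect of A (temperature): {effect_A:.2f}")
    print(f"Effect of B (time): {effect_B:.2f}")
    print(f"A x B interaction: {effect_AB:.2f}")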

However, there are practical and statistical considerations to keep in mind when choosing which statistical method to use.

Example Considerations When Using Statistics/Statistical Methods

The practice of employing a small, representative sample to make an inference about a wider population originated in the early part of the 20th century. William S. Gosset, more commonly known by his pseudonym "Student," needed to take small samples from a brewing process to understand particular quality characteristics. The statistical approach he derived (now called a one-sample t-test) was subsequently built upon by R. A. Fisher and others.

Jerzy Neyman and E. S. Pearson developed a more complete mathematical framework for hypothesis testing in the 1920s. This framework included concepts now familiar to statisticians (illustrated in the sketch after this list), such as:

  • Type I error: Incorrectly rejecting the null hypothesis
  • Type II error: Incorrectly failing to reject the null hypothesis
  • Statistical power: The probability of correctly rejecting the null hypothesis
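
One way to build intuition for these quantities is simulation: repeatedly draw samples, run the test, and count how often H0 is rejected. When H0 is really true, that rejection rate estimates the Type I error; when the alternative is true, it estimates power, and one minus that rate estimates the Type II error. The sample size, effect size, and number of trials below are arbitrary choices for this sketch.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(seed=1)
    alpha, n, trials = 0.05, 20, 5000

    def rejection_rate(true_mean):
        """Fraction of simulated one-sample t-tests (H0: mean = 0) that reject H0."""
        rejections = 0
        for _ in range(trials):
            sample = rng.normal(loc=true_mean, scale=1.0, size=n)
            _, p = stats.ttest_1samp(sample, popmean=0.0)
            if p < alpha:
                rejections += 1
        return rejections / trials

    print(f"Estimated Type I error (true mean = 0): {rejection_rate(0.0):.3f}")
    power = rejection_rate(0.7)
    print(f"Estimated power (true mean = 0.7): {power:.3f}")
    print(f"Estimated Type II error rate: {1 - power:.3f}")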

Fisher’s Analysis of Variance (ANOVA) procedure provides the statistical engine through which many analyses are conducted, such as gage repeatability and reproducibility studies and other designed experiments. ANOVA has proven to be a very helpful tool for determining how variation can be attributed to the factors under consideration.
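
As a brief illustration, a one-way ANOVA splits the variation in a response into variation between factor levels and variation within them, and its F statistic tests whether the level means differ. The three hypothetical operator groups below stand in for the kinds of factors examined in a gage R&R study.

    from scipy import stats

    # Hypothetical measurements of the same part by three operators
    operator_1 = [10.1, 10.3, 9.9, 10.2, 10.0]
    operator_2 = [10.6, 10.8, 10.5, 10.7, 10.9]
    operator_3 = [10.0, 10.2, 10.1, 9.8, 10.1]

    # One-way ANOVA: H0 says all operator means are equal
    f_stat, p_value = stats.f_oneway(operator_1, operator_2, operator_3)
    print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
    if p_value < 0.05:
        print("At least one operator mean appears to differ")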

W. Edwards Deming and others have criticized the indiscriminate use of statistical inference procedures, noting that erroneous conclusions may be drawn unless one is sampling from a stable system. The type of statistical study being performed should therefore be a key consideration when reviewing data.

Statistics Resources

You can also search articles, case studies, and publications for statistics resources.

Books

Statistics For Six Sigma Black Belts

Continuous Improvement, Probability, And Statistics

Practical Engineering, Process, And Reliability Statistics

Applied Statistics Manual

Articles

Statistical Learning Methods Applied to Process Monitoring: An Overview and Perspective (Journal of Quality Technology) While the research on multivariate statistical process monitoring tools is vast, the application of these tools for big data sets has received less attention. In this expository paper, we give an overview of the current state of data-driven multivariate statistical process monitoring methodology.

Rethinking Statistics For Quality Control (Quality Engineering) As methods used for statistical process control (SPC) become more sophisticated, it becomes apparent that the required tools have not been included in courses that teach statistics in quality control. A basic description of these tools and their applications is provided.

Case Studies

Setting Appropriate Fill Weight Targets With Statistical Engineering (Quality Engineering) A high-level business need was addressed via the development of a solution for setting appropriate targets for product filling processes. This required the calculation of the probability of meeting corporate and regulatory requirements under more realistic assumptions.

Statistical Engineering to Stabilize Vaccine Supply (Quality Engineering) Reliable vaccine supply is a critical public health concern. In this case study, statistical engineering was applied to a complex problem in vaccine production.

Webcasts

More Than Statistics: How Successful Manufacturers Are Adopting a Philosophy of Quality Sponsor Joe Humm, a senior member of Sparta Global Sales Operations, walks through the Deming model, shares a quality philosophy that must infiltrate an entire business ecosystem to ensure long-term success, and provides three case study examples of its application.
