Webinars by the Statistics Division
Recordings of these webinars are available on our YouTube channel, ASQStatsDivision.
Central Composite Design and the US Bowling Congress (Webinar)
Scott Sterbenz and Nicki Brose
The United States Bowling Congress is tasked with setting rules and regulations for the sport of bowling, including acceptable equipment specifications. Faced with a potential specification change, Scott Sterbenz and Nicki Brose utilized a Central Composite Response Surface Design to study the impact of such a change and ended up discovering a previously unstudied phase of ball motion that could have drastically changed the sport.
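For readers unfamiliar with the design the webinar is built around: a central composite design combines the 2^k factorial corners, 2k axial (star) points, and replicated center points. A minimal sketch in plain NumPy (the function, its defaults, and the run counts are illustrative only, not taken from the USBC study):

```python
import numpy as np
from itertools import product

def central_composite(k, alpha=None, n_center=4):
    """Central composite design for k factors, in coded units.

    Rows: 2^k factorial corners at +/-1, 2k axial points at +/-alpha,
    and n_center replicated center points. alpha defaults to the
    rotatable value (2^k)**0.25.
    """
    if alpha is None:
        alpha = (2 ** k) ** 0.25          # rotatable design
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha          # star point on the low side
        axial[2 * i + 1, i] = alpha       # star point on the high side
    center = np.zeros((n_center, k))
    return np.vstack([corners, axial, center])

design = central_composite(2)
print(design.shape)  # 4 corners + 4 axial + 4 center points -> (12, 2)
```

The axial points are what let the fitted response surface estimate pure quadratic curvature, which is why this class of design can reveal behavior a two-level factorial would miss.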
Statistical Engineering in Business Management (Webinar)
Forrest W. Breyfogle
Organizations often report performance metrics using a table of numbers, pie charts, stacked bar charts, red-yellow-green scorecards with variance to goals, and/or time series charts. These measurement formats often lack statistical rigor and do not quantify the impact of process variability on the reported metrics. Using this form of historical data reporting when running a business is not unlike driving a car by only looking at the rear-view mirror – an unhealthy practice.
Described is a Statistical Engineering business management approach in which a unique predictive organizational performance measurement and improvement system is used to orchestrate efforts for addressing today's challenges. This system provides an integrated methodology whereby organizations can "look out their car's windshield" and, if they don't like what they see, make adjustments through targeted Lean Six Sigma and other improvement efforts that benefit the business as a whole. Companies can benefit from this statistics-based predictive scorecarding methodology through a reduction in organizational firefighting: stoplight scorecards often treat common-cause variability as though it were special cause, which is no different from what occurred in W. Edwards Deming's red-bead experiment. In addition, this Statistical Engineering approach to business management contains a detailed roadmap that describes how to use analytics to determine, among other things, where improvement efforts should focus and how those efforts should be carried out so that the enterprise as a whole benefits.
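The common-cause/special-cause distinction at the heart of this argument can be illustrated with an individuals control chart, the simplest predictive alternative to a goal-based scorecard. In the sketch below (the metric and all numbers are invented, not from the webinar), natural process limits computed from the average moving range flag only the genuinely shifted point:

```python
import numpy as np

def individuals_limits(x):
    """Individuals (I) chart limits from the average moving range.

    Limits = mean +/- 2.66 * MRbar, where 2.66 = 3 / d2 for
    subgroups of size 2. Points outside the limits signal
    special-cause variation; everything inside is routine
    common-cause noise that a red-yellow-green goal threshold
    would misread as a real change.
    """
    x = np.asarray(x, dtype=float)
    mr_bar = np.abs(np.diff(x)).mean()    # average moving range
    center = x.mean()
    return center - 2.66 * mr_bar, center + 2.66 * mr_bar

# Hypothetical monthly metric: stable noise plus one real shift at the end
data = [52, 48, 50, 51, 49, 53, 47, 50, 52, 48, 50, 65]
lcl, ucl = individuals_limits(data)
special = [v for v in data if v < lcl or v > ucl]
print(special)  # [65] -- only the shifted point is flagged
```

A fixed goal line at, say, 51 would have turned half of these stable months "red" even though the process never changed until the final point.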
Getting There: Using Data Mining to Navigate Toward Success (Webinar)
With "analytics" becoming a hot buzzword, and with sophisticated software and large data sets readily available, data mining, that is, the application of sophisticated analysis routines to large data sets in hopes of finding useful information such as patterns in the data, has become a topic of increased interest. What exactly is data mining? Through a series of examples at a fairly basic level, I will try to answer that question, showing that data mining is really a collection of tools, some new and many others repackaged older ideas.

Often a decision, such as whether a potential customer will default on a loan, is the item of interest, and various "features" such as debt-to-income ratio, age, and number of years employed are the predictor variables available to inform the decision. Data mining is then used to discover the relationship between the predictors and the target response (default or not) so that informed decisions result. This is called "supervised learning" because a data set containing those predictor variables is available for customers with known default history. When no target variable is available, data mining becomes clustering of the data based only on the observed features. For example, we might be able to segment our market into a few groups of customers, with each group consisting of similar customers in terms of these measured features. Clustering of documents based on content is another example. Not much theory will be discussed; mostly the presentation will be overviews and examples.
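The unsupervised side of the distinction described above can be sketched with a plain k-means routine. Everything here is hypothetical and for illustration only (feature names, numbers, and the two synthetic segments are invented, not from the presentation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical customer features: [debt-to-income ratio, years employed],
# drawn from two invented, well-separated segments
group_a = rng.normal([0.2, 12.0], [0.03, 1.0], size=(20, 2))
group_b = rng.normal([0.6, 3.0], [0.03, 1.0], size=(20, 2))
X = np.vstack([group_a, group_b])

def kmeans(X, k=2, iters=50, seed=1):
    """Plain k-means: alternate assigning each point to its nearest
    centroid and recomputing each centroid as its cluster's mean."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0)
                              if (labels == j).any() else centroids[j]
                              for j in range(k)])
    return labels, centroids

labels, centroids = kmeans(X)
print(np.bincount(labels, minlength=2))  # sizes of the discovered segments
```

No target variable was used: the grouping emerges from the features alone, which is exactly what distinguishes clustering from the supervised loan-default case.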
Sampling Supermarket Statistics (Webinar)
November 17, 2011
What started as a straightforward statistical analysis for a large North American wholesale grocer became a true application of statistical engineering. The team at Straight Line Performance Solutions designed and delivered a solution for a corporate audit team that provided new ways of looking at how they measured and tracked customer quality, what standard practices they followed, and how they staffed their auditing facilities to maximize their value. The application involved multiple statistical techniques and touched all levels inside and outside the organization, from warehouse auditors to the C-suite to their customers.
Practical Aspects of Algorithmic Design of Physical Experiments (Webinar)
October 12, 2011
Due to operational or physical considerations, standard (i.e., canned) response surface and mixture designs often prove unsuitable for actual experimentation. In such cases an algorithmic design is required. I will explore various mathematical properties useful for evaluating alternative algorithmic designs. To assess "goodness of design," such evaluations must consider the model choice, specific optimality criteria (e.g., D, IV), precision of estimation (fraction of design space), the number of runs (required precision), testing for lack of fit, and so forth. These issues are considered at a practical level, keeping the actual experimenters in mind. This brings to the forefront such considerations as subject-matter knowledge (first principles), factor choice, and the feasibility of the experimental design.
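The D-criterion mentioned above can be made concrete with a toy example. The sketch below (illustrative only, not the presenter's method) scores candidate 6-run designs for a full quadratic model in two factors by a scaled determinant of the information matrix, and finds the best subset of a 3x3 candidate grid by brute force; real algorithmic-design software uses exchange algorithms rather than exhaustive search:

```python
import numpy as np
from itertools import product, combinations

def model_matrix(design):
    """Expansion for a full quadratic model in two factors:
    1, x1, x2, x1*x2, x1^2, x2^2."""
    x1, x2 = design[:, 0], design[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

def d_criterion(design):
    """Per-run D-efficiency measure: det(X'X / n)^(1/p)."""
    X = model_matrix(design)
    n, p = X.shape
    det = np.linalg.det(X.T @ X / n)
    return max(det, 0.0) ** (1 / p)   # clamp tiny negative round-off

candidates = np.array(list(product([-1.0, 0.0, 1.0], repeat=2)))  # 3^2 grid

# Exhaustive search over all 6-run subsets of the candidate grid
best = max(combinations(range(9), 6),
           key=lambda idx: d_criterion(candidates[list(idx)]))
print(candidates[list(best)])
```

Swapping the key function for an IV- or A-style criterion changes which runs are chosen, which is precisely why the webinar stresses matching the criterion to the experiment's goals.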
Understanding the Difference Equals Making Better Decisions (slides)
Wayne R. Fischer
The differences (or fluctuations) within a series or group of data are collectively called "variation." Participants will learn what variation is, why and how it occurs, and how to present it so that it can be measured, analyzed, and interpreted properly – not only for reporting results and improving process performance and outcomes, but also for raising staff morale and job satisfaction.
Control Chart Dashboards
Steven S. Prevette
This webinar explores how to use SPC-based dashboards effectively. Many balanced scorecards employ stoplight charts to judge performance, a process easily automated with software. The operation is quick, efficient, and potentially harmful. Harmful? We'll examine why.
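One way the harm shows up, sketched below with simulated data (the metric, the goal, and all numbers are invented for illustration), is that a fixed red/green goal threshold repeatedly flags ordinary common-cause variation that 3-sigma process limits would leave alone:

```python
import numpy as np

rng = np.random.default_rng(42)

# A stable process: hypothetical on-time-delivery %, mean 92, sd 2,
# month after month with no real change
months = rng.normal(92.0, 2.0, size=36)

# Typical stoplight rule: red below a goal of 90, green at or above it
stoplight_red = months < 90.0

# SPC view: lower 3-sigma natural process limit from the data itself
lcl = months.mean() - 3 * months.std(ddof=1)
spc_signal = months < lcl

print(int(stoplight_red.sum()), int(spc_signal.sum()))
```

The stoplight rule sends managers chasing several "red" months of pure noise, while the control chart correctly reports a stable, predictable process – the firefighting pattern this webinar examines.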
Control Charting and Process Capability Statements: Issues and Resolution
Forrest W. Breyfogle
Statistical Engineering Webinar
Roger Hoerl and Ron Snee
Much has been written about how statisticians can be more impactful and influential as a profession. One potential opportunity recently proposed is that society may need us to function more as an engineering discipline in the future, rather than solely as a pure science. One can define engineering as the study of how to best utilize scientific and mathematical principles for the benefit of mankind. In other words, engineers do not focus on advancement of the fundamental laws of science, but rather on how existing science might be best utilized for practical benefit, i.e., putting the "parts" together in novel ways rather than inventing new "parts".
The recent performance of the IBM computer "Watson" on the game show Jeopardy! is one example of an engineering, rather than a scientific, breakthrough. This is not to say that engineers do not perform research or develop theory. Rather, it suggests that engineers' theoretical developments tend to be oriented towards the question of how to best utilize known science to benefit society. If this need for an emphasis on statistical engineering in addition to statistical science is real, then one could argue that our ability to make this transition will largely determine our future vitality as a discipline. The presenters will discuss the need for an enhanced focus on statistical engineering, provide an operational definition, and give tangible examples of its application. They will share their thoughts on how statistical engineering should be integrated with statistical theory, applied statistics, statistical methods, and statistical thinking, in order to view the statistics discipline as a system.