Strategic Structure—The Big Picture

Framework for holistic improvement with lean Six Sigma 2.0

by Roger W. Hoerl and Ronald D. Snee

Some time has passed since Six Sigma was introduced and lean Six Sigma was developed. Meanwhile, the world and its needs have changed. In last month’s Statistics Spotlight, we argued that it is time for quality improvement to adopt a new paradigm—one of holistic improvement using what we call lean Six Sigma 2.0.1

This approach must incorporate various improvement methods for different types of problems, integrate them into an overall holistic improvement system, and address lean Six Sigma’s other limitations. Now, let’s examine what such a system might look like. But first, let’s review three recent changes in the world that have significant implications for continuous improvement: large, complex and unstructured problems; the emergence of big data analytics; and the growing importance of risk management.

Large, complex and unstructured problems

Since the emergence of Six Sigma in about 1987, there has been a growing awareness that some problems are too large, complex and unstructured to be solved with traditional problem-solving methods, including lean Six Sigma. Applications in such areas as genomics, public policy and national security often present significant challenges, even in terms of precisely defining the specific problem to be solved.2

Similarly, the system in place to approve new pharmaceuticals in the United States involves a series of clinical trials and analyses, guided by significant subject matter knowledge, such as in identifying likely drug interactions. No single experimental design or statistical analysis results in a new approved pharmaceutical. The system must balance the need for public safety with the urgent need for new medications to combat emerging diseases such as the Ebola or Zika viruses. The problem is complex.

Such problems are different from those that can be solved through routine problem solving, or even through lean Six Sigma. Some attributes of these types of problems are shown in Table 1.

Table 1

Here’s a closer look at each of these attributes.

In terms of size, the problem is simply too large to be solved with any one method. Several tools, and perhaps several different disciplines, are required to address the full scope of the problem. It cannot be resolved in the three to six months a lean Six Sigma project typically takes.

The problem has significant complexity—not only technical issues, but also political, legal or organizational challenges. Typically, technical problems cannot be addressed without understanding and addressing nontechnical challenges.

The problem itself is not well defined, at least not initially. Many of General Electric’s original Six Sigma projects faced this issue, which led to adding the define step in the define, measure, analyze, improve and control process. Large, complex and unstructured problems, however, often have an even greater lack of structure initially, requiring even more up-front effort to define and structure the problem.

Most textbook problems in virtually all quantitative disciplines—from statistics to mechanical engineering to econometrics—come with canned data sets. The data, even when real, are typically of unquestioned quality and are presented in statistics texts as a random sample—something rarely achieved in practice. Researchers and practitioners who must collect their own data understand the challenge of obtaining high-quality data. For many significant problems, the existing data are either inadequate or come from disparate sources of differing quality and quantity.

Most textbook problems have a single correct answer, too. Many real problems also have a single correct answer. Even in online data competitions, such as those on kaggle.com, there is typically an objective metric, such as residual standard error when predicting a holdout data set, used to define the best model. Complex problems, however, rarely have a correct solution that you could look up in a reference text. They are too big, too complicated and too constrained.
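To make the contrast concrete, here is a minimal sketch of the kind of objective holdout metric used in such competitions. The function, data and model names are hypothetical illustrations, not from any actual competition:

```python
import math

def rmse(predictions, actuals):
    """Root-mean-squared error on a holdout set: the sort of single
    objective metric that lets a competition declare a 'best' model."""
    squared_errors = [(p - a) ** 2 for p, a in zip(predictions, actuals)]
    return math.sqrt(sum(squared_errors) / len(squared_errors))

# Hypothetical holdout data and two competing models' predictions
actuals = [3.0, 5.0, 7.0, 9.0]
model_a = [2.5, 5.5, 6.5, 9.5]   # off by 0.5 on every point
model_b = [3.0, 4.0, 8.0, 9.0]   # off by 0 or 1

print(rmse(model_a, actuals))  # 0.5
print(rmse(model_b, actuals))  # about 0.707
```

With such a metric, model A objectively "wins." Large, complex and unstructured problems offer no analogous scoreboard: there is no holdout set against which a public-policy or drug-approval strategy can be scored.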

Given these issues, it’s impossible to theoretically derive correct solutions to large, complex and unstructured problems. Rather, a unique strategy must be developed to attack each specific problem, based on its unique circumstances. Theory and experience will help, but an overall strategy involving multiple tools applied in some logical sequence, and perhaps integration of multiple disciplines—especially computer science—will be required.

We argue that a solution must be engineered using known science, often including statistics. Textbooks and academic courses across the quantitative disciplines do not discuss strategy enough, if at all. Engineering solutions to large, complex and unstructured problems requires a problem-solving mindset guided by theory and experience.

Statistical engineering has been proposed as an overall approach to developing a strategy to attack such problems.3 Statistical engineering is defined as: "The study of how to best utilize statistical concepts, methods and tools, and integrate them with information technology and other relevant disciplines, to achieve enhanced results."4 Statistical engineering is not a problem-solving method per se, such as lean Six Sigma, but rather a discipline. A generic statistical engineering framework to attack large, complex and unstructured problems5 is shown in Figure 1.

Figure 1

After the high-impact problem has been identified, it must be properly structured, and significant time and effort are typically required to understand the context of the problem. Large, complex problems have defied solution for a reason. A thorough understanding of the context of the problem is critical to finding a solution.

Big data analytics

Recent IT advances have led to a revolution in the ability to acquire, store and process data. Data are being collected at an ever-increasing pace—through social media, online transactions and scientific research.

At the beginning of this millennium, Thomas H. Davenport and Jeanne Harris foretold the potential impact that data analytics might have in the business world.6 Shortly thereafter, Netflix announced a $1 million prize for anyone who could develop a model to predict their movie ratings at least 10% better than their current model.7 The website kaggle.com emerged as a host to online data analysis competitions, becoming what might be called the "eBay of analytics."

But there is a dark side to big data analytics. The initial success and growth of big data led many to believe that the combination of large data sets and sophisticated analytics would guarantee success.8 Unfortunately, this naïve approach has proven false, with several highly publicized failures of big data.9

Our point is not to disparage the potential of big data analytics, but rather to point out that sophisticated analytics have not replaced the need for critical thinking and fundamentals. Studying algorithm development is important and quite useful in practice. Such study, however, does not replace the study of the problem-solving process, statistical engineering or continuous improvement principles.

Big data and data science are frequently discussed in the context of analytics, statistics, machine learning or computer science. Big data is rarely discussed in the context of continuous improvement. Massive data sets provide unique opportunities to make improvements—assuming, of course, they contain the data needed to solve the problem at hand, and also that subsequent models go through proper vetting. Newer, more sophisticated analytics provide additional options for attacking problems, particularly large, complex and unstructured problems. Big data analytics provide a significant opportunity for expanding the scope and impact of continuous improvement initiatives.

Business in an increasingly dangerous world

The world certainly seems to be a more dangerous place, especially for business, than in the past. Clearly, concerns over terrorism are not restricted to military or government institutions.

Terrorism is not the only cause for concern of businesses from a risk management viewpoint. Identity theft is now a billion-dollar criminal enterprise in the United States alone. A more modern phenomenon is the hacking of computer systems to obtain confidential information. The hack of Target’s credit card database not only allowed 40 million credit card numbers to be stolen, but also did irreparable harm to Target’s image.10

Businesses in the 21st century face unique security challenges in addition to traditional business risks such as major lawsuits, environmental disasters and catastrophic product failures. Risk management has therefore become an even more critical business priority, and the cost of failure can be high. Risk management must be considered an integral element within a holistic improvement system.

New paradigm needed

It now should be clear that lean Six Sigma, even in the more modern form of version 1.3,11 is insufficient to address today’s business improvement needs. We have reached the point that there are too many problems unaddressed by lean Six Sigma 1.3 to ignore.12 A new paradigm, or way of thinking about improvement, will be required to make significant progress going forward.

Lean Six Sigma is excellent at what it was designed to do: solve medium-sized "solution unknown" problems. Version 1.3 also incorporates innovation efforts, new product and process design, as well as lean concepts and methods. The supportive infrastructure developed for Six Sigma is the best and most complete continuous improvement infrastructure developed to date.13 If this same infrastructure can be applied to a holistic version of lean Six Sigma, one that addresses the limitations noted earlier, the resulting improvement system would surpass the established system.

Addressing all of the limitations previously noted would not be a minor upgrade, but rather a fundamental redesign based on a much broader paradigm. It would be version 2.0, not simply version 1.4.

What paradigm would be required to develop lean Six Sigma 2.0? The answer is clear: We need a holistic paradigm of improvement—that is, a system that is not based on a particular method, whether it be Six Sigma, lean, Work-out or something else. Rather, it must start with the totality of improvement work needed and develop a suite of methods and approaches that would enable the organization to address all the improvement work identified.

A holistic paradigm reverses the typical way of thinking about improvement. Traditionally, books, articles and conference presentations on improvement focus on a particular method and typically promote that method over others, at least for specific types of problems. It’s easy to find books on Six Sigma, lean or TRIZ, for example.

But there are few, if any, books on improvement per se. With a holistic paradigm, the focus is not on methods, but the improvement work and the problems to be solved. Only after the problems have been identified and diagnosed are methods discussed. The individual methods can be applied to the specific problems for which they are most appropriate. That is, holistic improvement is tool agnostic—the tools are "hows" and not "whats." Improvement is our what and the focus of our efforts.

Framework for holistic improvement system

The high-level framework for a holistic improvement system is shown in Figure 2. All improvement begins with the business and organizational context, which defines overall improvement needs and opportunities. Needs and opportunities also greatly depend on leadership, provided by the organization’s management as defined in the organization’s strategy. Clearly, holistic improvement is a strategic approach.

Figure 2

The holistic improvement system has three critical building blocks:

  1. Quality by design, which focuses on innovation and developing new businesses, products and processes.
  2. Breakthrough improvement, which encompasses most of what would traditionally be considered continuous quality improvement.
  3. Quality and process management systems, which is the defensive aspect of quality—that is, managing processes with excellence to avoid errors and mistakes, and maintaining process control.

These building blocks are linked and sequenced as shown in Figure 2. The outputs of the building blocks are impactful and sustainable results, which in turn enhance the business and organizational context. The cycle continues.

Sample methods and approaches used within the building blocks are shown in Table 2. The methods selected and the sequence of use depend on the problem or opportunity being addressed. Other methods and approaches can be added as needed.

Table 2

Addressing improvement from a holistic perspective puts the focus where it must be: on the problem and improvement need, with the methods being an important but secondary consideration.

As a result, the impact and bottom-line results of an improvement system are increased and sustained over time.


  1. Ronald D. Snee and Roger W. Hoerl, "Time for Lean Six Sigma 2.0?" Quality Progress, May 2017, pp. 50-53.
  2. Roger W. Hoerl and Ronald D. Snee, "Statistical Engineering: An Idea Whose Time Has Come," American Statistician, 2017.
  3. Ibid.
  4. Roger W. Hoerl and Ronald D. Snee, "Moving the Statistics Profession Forward to the Next Level," American Statistician, 2010, Vol. 64, No. 1, p. 12.
  5. Alexa DiBenedetto, Roger W. Hoerl and Ronald D. Snee, "Solving Jigsaw Puzzles," Quality Progress, June 2014, pp. 50-53.
  6. Thomas H. Davenport and Jeanne G. Harris, Competing on Analytics: The New Science of Winning, Harvard Business Review Press, 2007.
  7. Xavier Amatriain and Justin Basilico, "Netflix Recommendations: Beyond the 5 Stars, Part I," Netflix Tech Blog, April 6, 2012, http://techblog.netflix.com/2012/04/netflix-recommendations-beyond-5-stars.html.
  8. Roger W. Hoerl, Ronald D. Snee and Richard D. De Veaux, "Applying Statistical Thinking to ‘Big Data’ Problems," Wiley Interdisciplinary Reviews: Computational Statistics, July/August 2014, pp. 221-232.
  9. David Lazer, Ryan Kennedy, Gary King and Alessandro Vespignani, "The Parable of Google Flu: Traps in Big Data Analysis," Science, March 14, 2014, Vol. 343, pp. 1,203-1,205.
  10. CNNMoney Staff, "Target: 40 Million Credit Cards Compromised," CNNMoney, Dec. 19, 2013, http://money.cnn.com/2013/12/18/news/companies/target-credit-card.
  11. Snee and Hoerl, "Time for Lean Six Sigma 2.0?" see reference 1.
  12. Thomas S. Kuhn, The Structure of Scientific Revolutions, University of Chicago Press, 1962.
  13. Ronald D. Snee and Roger W. Hoerl, Leading Six Sigma: A Step-by-Step Guide Based on Experience With GE and Other Six Sigma Companies, Financial Times/Prentice Hall, 2003.

© 2017 Roger W. Hoerl and Ronald D. Snee

Roger W. Hoerl is the Brate-Peschel assistant professor of statistics at Union College in Schenectady, NY. He has a doctorate in applied statistics from the University of Delaware in Newark. Hoerl is an ASQ fellow, a recipient of ASQ’s Shewhart Medal and Brumbaugh Award, and an academician in the International Academy for Quality.

Ronald D. Snee is president of Snee Associates LLC in Newark, DE. He has a doctorate in applied and mathematical statistics from Rutgers University in New Brunswick, NJ. Snee has received ASQ’s Shewhart, Grant and Distinguished Service Medals. He is an ASQ fellow and an academician in the International Academy for Quality.
