Integrating Improvement Initiatives: Connecting Six Sigma for Software, CMMI®, Personal Software Process (PSP)(SM), and Team Software Process (TSP)(SM) - ASQ


Gary A. Gack and Kyle Robison

Six Sigma is an approach to product and process improvement that has gained wide acceptance and has delivered large business benefits across many industries. As application of this framework spreads to software development one must consider how Six Sigma relates to, and can be integrated with, other improvement initiatives and models already in use or under consideration.

This article describes Six Sigma and several widely used software improvement initiatives in terms of their relationships to one another – how they are similar, and how they are different. This foundation provides a backdrop for an illustration of how one organization, LSI Logic Storage Systems, has defined the connections between several initiatives currently under way or under consideration.

Developing and communicating a clear statement of the connections between various initiatives has been found to be a critical factor for successful deployment of any set of improvements. Articulating the connections and distinctions between initiatives prevents confusion and helps to avoid conflicting priorities and expectations among those involved.

Key words: goal setting and deployment; organizational leadership; Personal Software Process (PSP)(SM); program performance and process effectiveness; risk management; SEI Capability Maturity Model Integrated (CMMI)®; Six Sigma; standards, specifications, and models; Team Software Process (TSP)(SM).

Software has long been one of the most difficult challenges faced by many businesses. The rate of failure has been high, rework and “cost of poor quality” (CoPQ) consume a large share of software resources, and yet software is critical to success in every segment of our economy. The cost of hardware technology has decreased sharply, and quality has increased by orders of magnitude. The cost and quality of software technology, however, have not seen comparable improvements.

Many different responses to these problems have evolved in recent years, including those discussed here and many others, such as ISO 9001 and ISO/IEC 12207, which will not be examined here. In response to these diverse approaches, many organizations find themselves somewhat conflicted and confused as to what is best and what should come first. The authors’ goal is to offer some clarity as to how these different approaches relate to one another, and to provide an example of how a set of potentially overlapping initiatives were rationalized and connected.

Six Sigma Overview
Six Sigma (6s) is a multifaceted approach to business improvement. It includes a philosophy, set of metrics, set of improvement frameworks, and a toolkit.

The Six Sigma philosophy is to improve customer satisfaction through defect elimination and prevention and, as a result, to increase business profitability. “Defects” are defined in terms of the customer’s (not engineer’s) viewpoint. The business profitability motive is crucial; improvement for improvement’s sake, without positive impact on the bottom line, does not align with the Six Sigma philosophy.

Six Sigma originated at Motorola in the mid-1980s, and was originally targeted at manufacturing operations. The basic methodology was known as DMAIC (define, measure, analyze, improve, and control) and its intended use was improvement of existing products and processes.

At first glance, it sounds much like the plan-do-check-act cycle that originated with Walter A. Shewhart in the 1930s (Shewhart 1931). So what was new and different? Six Sigma did not really introduce new tools, but a different focus—not just manufacturing, not just quality, but also a support infrastructure and an emphasis on results measurement and an explicit financial tie to bottom-line results. Six Sigma also introduced an increased emphasis on control (sustaining the gains), and an extensive use of data, statistics, and metrics. In addition, Six Sigma introduced the term “Black Belt” to refer to dedicated full-time process improvement specialists who drive projects.

As Six Sigma evolved through the 1990s it was increasingly recognized that in many instances quality and cost problems were rooted in the design of products and processes, and sometimes could not be “improved out.” This realization led to the definition of a new branch of the Six Sigma methodology that came to be known as “Design for Six Sigma” (DFSS).

The DFSS approach, often referred to as “DMADV” (define, measure, analyze, design, and verify), has come to be used when one is designing new products and/or processes.

The phenomenal success of Six Sigma in manufacturing and transactional environments, as demonstrated by General Electric, Motorola, American Express, and many others (Harry and Schroeder 2000), has led to a dramatic increase in the number of organizations considering application of Six Sigma to the elusive and intangible world of software.

Six Sigma projects begin and end with business considerations. Project selection and tracking focus on maximizing the benefit delivered to the business bottom line. While there may be plenty of fundamental metrics and statistics en route, Six Sigma project success is measured in financial terms. “Process maturity” is not of interest in itself—the focus is on quantitatively measured business benefits.

Success of Six Sigma in software requires more than just an understanding of the Six Sigma philosophy and tools (which can be gained from traditional Six Sigma training); it also requires learning how the tools and philosophy apply to the specific business area being addressed. This fact is behind the emergence of Six Sigma training tailored to transactional/service environments vs. traditional training rooted in manufacturing. To maximize the application and benefit of Six Sigma concepts, case studies, tools, practice problems, and assignments used in the training need to be appropriate to the domain in which Six Sigma is to be applied.

This need for training tailored to the intended environment of use is even more critical in software, because learning is maximized when the problems and examples are directly relevant to the student’s immediate needs and because software is “different.” But first, consider the business justification, since without that, one never gets the critical management support one needs.

The Six Sigma Value Proposition
Perhaps the most important distinction between Six Sigma and other approaches to process improvement in software lies in its almost obsessive preoccupation with financially measured business results. Six Sigma caters primarily to the concerns of the CEO and CFO—process maturity is not viewed as a business benefit in and of itself. Six Sigma is a “show me the money” proposition. At first glance, those in the technical community might look slightly askance at this blatantly commercial perspective—but give it the benefit of the doubt for a moment. Perhaps there is something here that caters to the needs and desires of software practitioners as well.

Experience with Six Sigma has demonstrated, in many business and industry segments, that the payoff can be quite substantial, but that it is also critically dependent on how it is deployed (see, for example, Eckes 2001). The importance of an effective deployment strategy is no less in software than in manufacturing or transactional environments. While a discussion of culture change implications is beyond the scope of this article, it is clearly a very important issue, especially as resistance to change is often high in software organizations, as many practitioners have seen a long series of “silver bullets” but few dead wolves.
An effective Six Sigma deployment begins at the top with executive training that is designed to ensure a clear linkage between corporate strategic imperatives and the Six Sigma program. Assuming for a moment that one can establish clear linkages between Six Sigma and software process improvements—imagine what a refreshing circumstance this could create! Software practitioners understood and supported by executive management! Costs and schedule improvements evaluated based on economic payback! Initiatives sustained after the latest reorganization!

Executive juices flow at the prospect of improved financial results, and Six Sigma can deliver. Experience demonstrates that, on average, each Six Sigma Black Belt can complete four to six projects per year at an average financial benefit of $150,000 per project, and Green Belts can complete two to three projects per year at an average financial benefit of perhaps $75,000 each—a small deployment (15 Green Belts, 15 Black Belts) produces a total benefit of around $8,000,000 in the first year, at a typical cost of less than $2,000,000— a four-to-one return in the first year and greater thereafter. (Results will vary significantly from project to project, but the results indicated here are well within the typical range.) How many software process improvement efforts produce comparable results?
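The arithmetic behind these deployment figures can be sketched as a back-of-envelope model. The per-belt project counts and dollar values are the article's stated averages; the 60 percent first-year ramp-up factor is an assumption (training and pilot projects slow year one) chosen so the steady-state averages roughly reproduce the article's $8 million first-year figure.

```python
# Back-of-envelope model of first-year Six Sigma deployment benefit.
# Per-belt project counts and dollar values are the article's averages;
# the 0.6 first-year ramp-up factor is an assumption.

def deployment_benefit(n_black, n_green,
                       bb_projects=5, bb_value=150_000,
                       gb_projects=2.5, gb_value=75_000,
                       ramp=0.6):
    """Estimated first-year benefit in dollars."""
    steady_state = (n_black * bb_projects * bb_value +
                    n_green * gb_projects * gb_value)
    return steady_state * ramp

first_year = deployment_benefit(15, 15)
cost = 2_000_000  # typical first-year cost per the article
print(f"first-year benefit ~ ${first_year:,.0f}")
print(f"return ratio ~ {first_year / cost:.1f}:1")
```

With the stated averages this yields roughly $8.4 million against a $2 million cost, consistent with the four-to-one first-year return described above.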

What’s Different About Software?
Although there are many ways in which software development differs from manufacturing or transactional/service applications of Six Sigma, the authors believe the following are the most significant in the context of this article:

  • Software projects are very risky. A survey of 8,000 large U.S. software projects (Standish Group Chaos Report 2001) indicates an average cost overrun of 90 percent, an average schedule overrun of 120 percent, and cancellation of 25 percent of large projects due to some combination of delays, budget overruns, or poor quality. Six Sigma tools and methods can reduce these risks dramatically.
  • Requirements failures (reflecting needs not originally recognized or correctly understood, leading to substantial and costly rework late in the software development cycle) are associated with 80 percent of failed (late or cancelled) software projects (Jones 1994). The DFSS methodology, with its associated toolset, can greatly improve timely discovery of latent or “hidden” requirements, clarify the customer importance associated with each requirement, determine measures of functionality and associated cost/benefit trade-offs, and balance identified alternatives with the “voice of the business.”
  • “Expectations” failures (incorrect and overly optimistic estimates, leading to long delays and large cost overruns) are a factor in 65 percent of failed software projects (Jones 1994). Six Sigma statistical tools, such as regression analysis, can be applied to the development and refinement of software cost, schedule, and delivered quality forecasting.
  • “Execution” failures (leading to poor software quality, heavily back-loaded costs, and very high levels of rework—commonly 40 percent of total cost) are a factor in 60 percent of failed software projects (Jones 1994). Defect cost analysis scorecards and Rayleigh effort and defect modeling tools provide mechanisms to analyze the cost and quality dynamics of software projects that enable accurate forecasting of the cost benefit of proposed process improvements.
  • Software is mysterious. Improvements are not evident unless good intermediate metrics are in place. It’s not like manufacturing where measurements and processes are well understood and changes are quick to evaluate. Cycle times for software development are typically several months or even years long, and repeatability is not a common concept as it is in manufacturing and transactional areas.

CMMI Overview
The Capability Maturity Model Integrated (CMMI), a product of the Carnegie Mellon Software Engineering Institute, is an evolution and combination of the original Software Capability Maturity Model and the Systems Engineering Maturity Model. This model describes a series of maturity levels (originally derived from and analogous to Crosby’s “Quality Maturity Grid” [Crosby 1979]) related to process areas (PAs) defined by the model.

The principal changes introduced by the CMMI relative to its predecessors (Software and Systems Engineering CMM) are: 1) certain aspects of the systems engineering and software models are integrated into a single model; 2) both a continuous view (in which maturity of specific PAs is rated individually, as in the original systems engineering model and in the ISO 15504 [SPICE] model) and a staged view (as in the original Software CMM) are accommodated; and 3) measurement is introduced at level 2 rather than at level 4 as in the original Software CMM. From the perspective of Six Sigma, this early introduction of measurement is by far the most important refinement.

When considered from a Six Sigma perspective, CMMI can be viewed as a definition of industry best practices—a set of candidate improvements. In this view, Six Sigma provides the “why” (the business case that leads to selection of specific improvements to be implemented in the context of a Six Sigma project), and also the “how” in the sense that the Six Sigma “measure” and “analyze” activities provide the framework and tools needed to develop the business case that will be acceptable to and supported by the CFO. In the authors’ view, the “why” and the “how” are not within the scope of the CMMI. Hence, Six Sigma overarches and draws upon the CMMI.

The DFSS aspect of Six Sigma can be understood as a front-end extension to the best practices defined by the CMMI—it deals with finding and articulating latent or hidden requirements that might otherwise be missed through the application of tools such as needs/context analysis, KJ analysis, Kano analysis, Conjoint analysis, and other Six Sigma tools not within the scope of the CMMI. DFSS activities typically begin before and overlap requirements management and other PAs.

PSP Overview
“Developing software products involves more than just stringing programming instructions together and getting them to run on a computer. It requires meeting customer requirements at an agreed upon cost and schedule. To be successful, software engineers need to consistently produce high-quality programs on schedule and at their planned cost. This book shows you how to do this. It introduces the Personal Software Process (PSP), which is a guide to using disciplined personal practices to do superior software engineering” (Humphrey 1997).

While space does not permit a complete description of every aspect of the PSP, the authors think there are several key features that are particularly pertinent to the focus of this article.

  • Using the PSP requires software engineers to track their time according to a standard. Time tracking is fundamental to any realistic approach to improvement in software development, as personnel time is by far the largest element of cost. If one does not know how long things took before and after process changes, he or she clearly cannot know if cost improvement has really occurred.
  • PSP also requires that software engineers estimate the size of software products to be produced, and record actual size upon completion of the work products. Size is also a fundamental metric that is essential to any improvement process.
  • In addition, PSP requires that the software engineer record calendar time for each development process activity. This metric is essential to understanding the cycle time impact of process changes.
  • PSP emphasizes the importance of planning, and uses actual vs. estimated outcomes to engender a learning cycle based on feedback. This learning cycle leads to improved ability to estimate, and hence to reduction of “expectations” failures.
  • PSP is a defined process that specifies the deliverables to be produced, quality assurance activities to be performed, and other aspects of the software development process—hence, a repeatable process. This foundation is a desirable prerequisite to application of Six Sigma for Software—a consistent process is necessary for learning and improvement. PSP is one way to create this foundation.
  • PSP requires the recording and tracking of defects, and hence enables prediction and evaluation of yield at each step in the process (often called “phase containment effectiveness” in software)—not just final yield (often called “total containment effectiveness” in software).
  • The PSP emphasizes use of formal work product reviews using checklists (sometimes referred to as peer reviews or Fagan style inspections). This process enables the recording of defects at each phase, and makes it possible to understand and manage subprocess yields.

While there are many other specifics of the PSP, the above gives an overview of the main features. Taken together, all of these attributes of the PSP may be understood as Six Sigma for Software enablers. Followed as prescribed, the PSP will provide some of the data that can be analyzed using the Six Sigma tool set, and will lead directly to improved performance. There are, however, many Six Sigma for Software tools and techniques that are not within the scope of the PSP.
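The phase containment bookkeeping that PSP-style defect logging enables can be illustrated with a small sketch. The phase names and defect counts below are hypothetical; the point is the mechanics of phase yield (defects removed in a phase as a fraction of defects present entering it) versus total containment effectiveness.

```python
# Sketch of phase containment effectiveness from PSP-style defect logs.
# Phase yield = defects removed in a phase / defects present entering it.
# All defect counts below are hypothetical.

phases = [
    # (phase, defects injected, defects removed)
    ("design",        20,  0),
    ("design review",  0, 12),
    ("code",          30,  2),
    ("code review",    0, 25),
    ("unit test",      0,  8),
]

escaped = 0
for name, injected, removed in phases:
    present = escaped + injected
    phase_yield = 100 * removed / present if present else 0.0
    escaped = present - removed
    print(f"{name:>13}: present {present:3d}, removed {removed:3d}, "
          f"phase yield {phase_yield:5.1f}%")

total_injected = sum(inj for _, inj, _ in phases)
total_removed = sum(rem for _, _, rem in phases)
print(f"total containment effectiveness: "
      f"{100 * total_removed / total_injected:.1f}%")
```

In this hypothetical data, 3 of 50 injected defects escape to the customer, so total containment effectiveness is 94 percent even though no single phase yield exceeds about 73 percent, which is why subprocess yields, not just final yield, need to be visible.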

TSP Overview
“The TSP is a fully defined and measured process that teams can use to plan their work, execute their plans, and continuously improve their software development process. The Team Software Process (TSP) is defined in a series of process scripts that describe all aspects of project planning and product development. The process includes team role definitions, defined measures, and the postmortem process.” (Humphrey 2002)

The TSP is an extension of, and incorporates, the PSP—essentially, PSP scaled up for teams. The current version is intended for relatively small teams (3-15 members), and a version for teams of up to about 150 members is being used on a limited basis. All of the metrics and quality assurance activities associated with PSP are also incorporated into the TSP. The TSP addresses the intent of most of the SEI/CMMI PAs, although it does not fully achieve all key practices of each PA.

Like the PSP, the TSP establishes a defined process foundation and produces useful data that can be analyzed and leveraged with the Six Sigma for Software toolkit. Limited results available suggest that use of the TSP improves software development effectiveness.
Key features of the TSP that are pertinent to Six Sigma include:

  • An explicit process improvement activity called the process improvement proposal, or PIP. The PIP process shows engineers how to record improvement ideas while doing development work. These proposals can be used as a starting point for Six Sigma DMAIC projects—potential benefit of alternate proposals may be evaluated using Six Sigma methods and thereby prioritized for action.
  • TSP teams make quantitative quality plans for defects injected and removed by process step. They also determine quality goals for a family of quality measures that are used in tracking quality performance. During the work, the engineers gather quality data and track their performance against the quality plan. This approach is consistent with and supportive of the control plans that result from the “C” phase of a DMAIC project.
  • The TSP has been designed to be independent of the languages, tools, or methods used. As a consequence, Six Sigma tools are not specifically called for by the TSP. Where organizations use Six Sigma tools and methods, these would be explicitly called for when the team developed its customized TSP project process. This process would then provide the explicit coupling needed between the development and quality processes to ensure that the developers used the quality methods when required. Similarly, the DMAIC improve and control phase outcomes of a Six Sigma project would explicitly define the links to TSP processes and metrics, thereby ensuring effective integration and avoidance of duplication of effort.
  • Six Sigma implementation implies organizations must have a defined and measured operational process and that engineers must follow that process and routinely gather the required data as they work. PSP and TSP methods provide explicit guidance on how to build a defined and measured process that fits the project and that the developers will use.
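The quality-plan tracking described above can be sketched as a simple plan-versus-actual comparison. The phase names, defect counts, and the ±30 percent tolerance band are all illustrative assumptions, not part of the TSP definition.

```python
# Hypothetical TSP-style quality-plan tracking: compare actual defect
# removal per phase against plan and flag phases outside a tolerance band.
# Counts and the 30% threshold are assumptions for illustration.

plan = {"design review": 12, "code review": 25, "unit test": 8}
actual = {"design review": 5, "code review": 24, "unit test": 9}
tolerance = 0.30  # flag deviations beyond +/-30% of plan

flags = []
for phase, planned in plan.items():
    got = actual[phase]
    deviation = (got - planned) / planned
    if abs(deviation) > tolerance:
        flags.append(phase)
    status = "FLAG" if phase in flags else "ok"
    print(f"{phase:>13}: plan {planned:2d}, actual {got:2d}, "
          f"deviation {deviation:+.0%} [{status}]")
```

Here the design review removed far fewer defects than planned, the kind of early in-process signal that, in DMAIC terms, a control plan exists to catch before the defects escape to test or to the customer.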

It is worth noting that the SEI has taken 13 years to develop these processes, and organizations that introduce Six Sigma for Software without capitalizing on this work may have to reinvent much of what the SEI has done. At a minimum, the PSP and TSP provide an excellent starting place with respect to definition of a mature software process that can effectively leverage the potential of Six Sigma.

Similarly, Six Sigma can provide the linkage to the business and facilitate the sustained executive management support essential to success. Many organizations today have within the executive ranks persons who have experienced the tremendous leverage that Six Sigma can bring. These individuals speak the Six Sigma language, understand and respect its potential, and are much more likely to support improvement initiatives that are framed and justified as Six Sigma projects.

On the other hand, few general management executives outside the software function are familiar with CMMI, PSP, or TSP, which are designed to “speak” to software engineers and managers. When seeking support of those who control the funding it is very helpful to speak a language they understand.

PSP and TSP are software development process definitions (some might call them methodologies) that are compatible with a wide range of software development concepts such as spiral development, object-oriented development, and various other sets of techniques, each with certain advantages in modeling and describing requirements and designs for software systems. One way of viewing this relationship is to consider PSP/TSP as the process framework within which specific techniques may be invoked to describe or model the work products being produced.

Six Sigma for Software, on the other hand, is not a software development process definition—rather it is a far more generalized process for improving processes and products. Although a few elements of the Six Sigma for Software toolkit are invoked within the PSP/TSP framework (for example, regression analysis for development of estimating models), there are many other tools available in the Six Sigma for Software toolkit that are not suggested or incorporated in PSP/TSP. While PSP/TSP refers to and may employ some statistical techniques, specific training in statistical thinking and methods generally is not a part of PSP/TSP, whereas that is a central feature of Six Sigma for Software.

Whereas Six Sigma for Software incorporates the DFSS approach to improving the feature/function/cost trade-off in definition and design of the software product, this aspect is not addressed by CMMI/PSP/TSP. Tools such as KJ analysis, quality function deployment, conjoint analysis, design of experiments, and many others have high-leverage applications in the software world, but are not specifically addressed by CMMI/PSP/TSP.

In summary, the authors believe that CMMI/PSP/TSP are among the potential choices of software development process definition that can lead to improved software project performance. They also believe that the full potential of the data produced by these processes cannot be fully leveraged without applying the more comprehensive Six Sigma for Software toolkit.

The relation of Six Sigma for Software to CMMI/PSP/TSP can be best understood as a difference in level of abstraction.

  • Six Sigma for Software might be used to objectively evaluate the overall effect of CMMI/PSP/TSP on software product quality, cost, and cycle time as compared to an alternative approach, perhaps one of the “agile” process definitions such as extreme programming or Ken Schwaber’s “Scrum” (Schwaber and Beedle 2002).

The relation of Six Sigma for Software to CMMI/PSP/TSP might also be characterized as a difference in goals, in which the goals of CMMI/PSP/TSP may be a subset of those associated with Six Sigma for Software.

  • The primary goals of CMMI/PSP/TSP are continuous improvement in the performance of software development teams in terms of software product cost, cycle time, and delivered quality.
  • The goals of Six Sigma for Software may include the goals of CMMI/PSP/TSP, but do not specify any particular process definition to achieve those goals. In addition, Six Sigma for Software may be applied to achieve many other business objectives, such as improved customer service after delivery of the software, or improved customer satisfaction and value realization from the software product feature set delivered. Six Sigma for Software applies to the software process, the software product, and to balancing the voice of the customer and the voice of the business to maximize overall business value resulting from processes and products.
  • An additional distinction is that Six Sigma is typically applied to selected projects, while CMMI/PSP/TSP are intended for all projects. Six Sigma may, for example, be used to plan and evaluate pilot implementation of CMMI/PSP/TSP, or alternatives thereto, while CMMI/PSP/TSP can provide an orderly and defined vehicle to institutionalize the lessons learned from Six Sigma projects.

The most fundamental tenet of Six Sigma is that one must “manage by fact.” This view is consistent with that of TSP/PSP, but it has not yet been established that PSP/TSP is the best alternative in every context, only that it is better than some alternatives. Six Sigma for Software can help organizations find the solution(s) that are truly optimal for each unique circumstance.

A Case Study: LSI Logic Storage Systems
LSI Logic Storage Systems, Inc. (LSI-SSI) designs and manufactures high-reliability, large-capacity RAID controllers. These products are software-intensive embedded systems, and much of the value-add and product differentiation is produced by the software element of the product. Typically, software determines time to market, which is a very important competitive consideration, as is product reliability.

As a technology leader, LSI-SSI has long placed great importance on quality and continuous improvement. They are ISO certified, have extensive experience with the application of total quality management (TQM), and began to apply Six Sigma to manufacturing and logistics aspects of the business about two years ago. More recently, as a consequence of significant successes in the initial deployment of Six Sigma, LSI decided to deploy Six Sigma to software engineering activities as well.

Early in the process of planning the deployment of Six Sigma to software engineering, the LSI team recognized that they faced two important challenges: 1) how to adapt Six Sigma training that was designed for manufacturing environments to suit software engineering, and 2) how to explain to everyone involved how Six Sigma would relate to and coexist with other improvement initiatives already started or being considered. The authors’ focus here is on the second issue, but they think it important to emphasize that the first issue is also critical—most Six Sigma software deployments fail when that issue is not effectively addressed.

Six different, but related, initiatives needed to be considered and integrated: 1) a strategic marketing initiative known as “market leadership” (Ryans et al. 2000); 2) a project/program management process known as “critical chain” (Goldratt 1999); 3) Six Sigma product/process improvement (DMAIC); 4) DFSS; 5) CMMI; and 6) PSP/TSP.

Market leadership, to simplify a bit, is an approach to strategic marketing planning that begins at the highest level. This initiative focuses on identifying target markets and segments, understanding the requirements of those markets and segments at a high level, assessing the competitive landscape in terms of threats and opportunities, and evaluating the company’s capabilities and channels with respect to capability to service identified markets and segments. This initiative further considers potential differentiation, and attempts to understand and quantify risks in each segment.

Critical chain is much more than can be adequately explained in the space here, so perhaps we can simply say that it is a new and potentially very powerful way to think about project and program management. It brings some new and important thinking about how to balance risk against the need for the shortest possible cycle times. It is one way to actualize the intent of the project planning and project tracking CMMI PAs.

DFSS is that aspect of Six Sigma concerned with designing new products and processes, as opposed to improving something that already exists.

DFSS employs voice of the customer tools such as needs and context analysis, use cases and measures, KJ analysis, Kano analysis, and various tools for feature importance ranking and prioritization. Further, DFSS attempts to balance the voice of the customer with voice of the business considerations such as time to market, delivered quality, risk, warranty cost, and so forth.
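As an illustration of one of these voice-of-the-customer tools, the following sketches a simplified Kano classification. Each requirement is scored by a pair of survey answers (how the customer feels if the feature is present, and if it is absent) and mapped to a Kano category. The answer codes, the lookup table, and the example features are illustrative assumptions; the full Kano evaluation table is more nuanced than this subset.

```python
# Simplified Kano-analysis lookup (illustrative subset, not the full
# evaluation table). Answer codes: L=like, M=must-be, N=neutral,
# W=live-with, D=dislike. Key = (functional answer, dysfunctional answer).

KANO = {
    ("L", "D"): "one-dimensional",  # more is better; drives satisfaction
    ("M", "D"): "must-be",          # expected; absence dissatisfies
    ("N", "D"): "must-be",
    ("W", "D"): "must-be",
    ("L", "N"): "attractive",       # delighter; often a latent requirement
    ("L", "W"): "attractive",
    ("L", "M"): "attractive",
    ("N", "N"): "indifferent",      # candidate for de-scoping
}

# Hypothetical survey responses for three candidate features.
responses = {
    "automatic failover": ("M", "D"),
    "web admin console":  ("L", "N"),
    "colored status LEDs": ("N", "N"),
}

for feature, answer_pair in responses.items():
    category = KANO.get(answer_pair, "questionable")
    print(f"{feature:>18}: {category}")
```

The "attractive" category is where DFSS earns its keep: these are the latent requirements customers would not volunteer but that differentiate the product, while "indifferent" features are candidates for trade-off against the voice of the business.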

DFSS emphasizes forecasting product and process capability before product detailed design and construction in order to be proactive rather than reactive. Properly executed, DFSS leads to many fewer requirements changes during development, higher customer satisfaction, improved competitiveness, and, consequently, higher market share and profitability.

Six Sigma DMAIC is that aspect of Six Sigma concerned with improving existing processes or products—for example, the software development process itself or a particular legacy system. This view of Six Sigma will often leverage the recommendations and metrics of specific CMMI PAs in the measure-analyze-improve phases of a Six Sigma project. For example, a Six Sigma DMAIC project might define as its primary objective (success metric) a reduction in software development cost, and as a secondary metric improvement in delivered quality as measured by post-release defects. To accomplish those objectives the project might introduce an improvement consistent with the Peer Reviews PA. This improvement as a Six Sigma project will differ from a CMMI initiative in that it will place greater emphasis on the business case and on controlling and sustaining the change after it is introduced.
The authors have previously discussed PSP/TSP, so this article now turns to explaining how these several initiatives connect and support one another in this particular situation, recognizing that other ways of looking at this might make sense in a different context.

Six Sigma Frameworks
There are currently two main Six Sigma frameworks:

DMAIC (define-measure-analyze-improve-control), is used to improve and optimize existing processes and products.

DFSS (Design for Six Sigma) is used to design new products and processes. It is also used to redesign existing processes and products that have been optimized but still do not meet performance goals (that is, the desired sigma level). The latter case is believed to frequently occur when moving from a 5-sigma level of performance to a 6-sigma level. While DMAIC enjoys a relatively high degree of consistency across industry, DFSS is much more varied in its implementation. An example of the key DFSS steps is define-measure-analyze-design-verify.

LeanSigma, another approach to Six Sigma, may emerge as a third framework. However, lean techniques vary significantly across industry, and many organizations are implementing them within their existing DMAIC or DFSS frameworks.

Connecting the Initiatives
First consider how PSP/TSP relates to CMMI. The authors’ view is that PSP/TSP is simply one way to substantially satisfy the attributes of all or most of the CMMI PAs—not the only way, but certainly a very viable “how to do it” option.

Similarly, as mentioned earlier, critical chain can be understood as one way to substantially satisfy the attributes of the project planning and project tracking PAs—again, not the only way, but a good way. So far one can see connections that relate to different levels of abstraction—CMMI “contains” PSP/TSP, certain PAs “contain” critical chain.
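For readers unfamiliar with critical chain, its core planning mechanic can be sketched briefly: safety margins are stripped from individual task estimates and pooled into a single project buffer at the end of the chain. The 50 percent cut and buffer rule shown here is Goldratt's simplest heuristic, and the numbers are illustrative only, not from the article.

```python
def critical_chain_plan(safe_estimates, cut=0.5):
    """Pool individual task safety into one shared project buffer.

    Each padded ("safe") estimate is cut to an aggressive estimate
    (here, half), and a project buffer of half the resulting chain
    length is appended: Goldratt's 50 percent rule of thumb.
    """
    aggressive = [est * cut for est in safe_estimates]
    chain = sum(aggressive)
    buffer = chain / 2
    return chain, buffer

# Four tasks padded to 10 days each: a 20-day chain plus a 10-day buffer
chain, buffer = critical_chain_plan([10, 10, 10, 10])
print(chain, buffer)  # 20.0 10.0
```

Tracking buffer consumption against chain completion then serves the project-tracking role the article associates with the corresponding PAs.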

When thinking about the connection between Six Sigma DFSS and DMAIC, one can visualize a temporal relationship and a tendency for these views to occupy different quadrants of the Six Sigma space, as illustrated by Figure 1. The relationship is temporal in the sense that one clearly cannot apply DMAIC to a product or process that does not yet exist, so in that sense DFSS comes first, although many products and processes exist that were not created using the DFSS approach.

Hence, the boundary between DFSS and DMAIC is "fuzzy" in practice. When products or processes were created using DFSS, we will already have a great deal of valuable information and context that can be revisited to advantage when we later start a DMAIC project. When that is not the case, we may need to reach back into the DFSS space from within a DMAIC project to create what is missing.

The boundary is also fuzzy in the sense that DFSS tends to focus externally and strategically, while DMAIC has a tendency to focus internally and tactically. Broadly speaking, DFSS projects are often more closely connected to the voice of the customer, while DMAIC projects are often more closely tied to the voice of the business—as with every generalization, there are exceptions and border conditions.

As illustrated in Figure 2, one can consider market leadership as a “front end” to DFSS—with market leadership we select the universe we will live in, with DFSS we design the space ship we will use to explore our universe. With DMAIC we might improve the fuel consumption of our rocket. PSP/TSP might be viewed as analogous to our choice of engine technology, while critical chain might be comparable to the thrust control system that governs our engine.

While every situation is different, the authors have found it is always helpful to recognize that one cannot simply pile one initiative on top of another and expect to get results. It is always important to survey realistically all of the things that compete for the attention of the people involved, and to provide a clear explanation in order to avoid the "oh no, not another initiative" syndrome we have all experienced.

Thinking through how things connect, who will work on what, and realistic appraisal of the effort and expected returns will reduce resistance, clarify responsibilities, and improve outcomes.

The ultimate goal of all process and product improvement approaches is to help people be more effective and efficient in whatever they do. Six Sigma, CMMI, PSP, TSP, and any other methodologies people may incorporate in their work are not ends in themselves but means. Progressive organizations, such as Raytheon, think of "Six Sigma" (although it may go by another name) as an overarching "umbrella" under which are found a large set of tools and ideas that professionals can draw upon to do their work. Early on, it can be helpful to clarify relationships and connections among different methods and tools, but in the end all are most effective when blended into "how we do our jobs."

CMMI®, Personal Software Process(SM), and Team Software Process(SM) are marks of Carnegie Mellon University's Software Engineering Institute.

References

Crosby, Philip. 1979. Quality Is Free: The art of making quality certain. New York: New American Library.

Eckes, George. 2001. Making Six Sigma last: Managing the balance between cultural and technical changes. New York: John Wiley and Sons.

Goldratt, Eliyahu. 1999. Critical chain. Great Barrington, Mass.: North River Press.

Harry, Mikel, and Richard Schroeder. 2000. Six Sigma: The breakthrough management strategy revolutionizing the world’s top corporations. New York: Doubleday.

Humphrey, Watts. 1997. Introduction to the Personal Software Process. Reading, Mass.: Addison-Wesley.

Humphrey, Watts. 2002. Winning with software. Reading, Mass.: Addison-Wesley.

Jones, Capers. 1994. Assessment and control of software risks. Englewood Cliffs, N.J.: Prentice Hall.

Ryans, Adrian, Roger More, Donald Barclay, and Terry Deutscher. 2000. Winning market leadership: Strategic market planning for technology-driven businesses. Toronto: John Wiley and Sons Canada.

Schwaber, Ken, and Mike Beedle. 2002. Agile software development with Scrum. Englewood Cliffs, N.J.: Prentice Hall.

Shewhart, Walter A. 1931. Economic control of quality of manufactured product. New York: Van Nostrand.

Standish Group. 2001. CHAOS report.

Gary Gack is a cofounder of Six Sigma Advantage, a firm dedicated to training and coaching in the application of Six Sigma to software development and information technology (IT). He is coauthor of Six Sigma Foundation, Green Belt, and Black Belt training programs tailored to software and IT audiences.

Gack has more than 40 years of experience in the software and IT industry with extensive large-scale project and program management, including teams with more than 200 developers. He has owned and managed several software/consulting businesses, and has extensive experience with software process assessments using the SEI/CMM, ISO 15504, and various proprietary methods. Gack is the author of numerous articles dealing with IT project management, IT process improvement, cost accounting and metrics, and software quality assurance. He has consulted with leading companies in the United States, Canada, and Europe.

Kyle Robison is currently a Six Sigma Black Belt for LSI Logic Storage Systems, Inc. in Wichita, Kansas. He has been involved in Six Sigma for the past three years in various roles in Six Sigma deployment in manufacturing, software engineering, and marketing. Robison graduated with a bachelor of science degree in electrical engineering from Oklahoma State University and is currently pursuing a master's degree in business administration from Friends University. He has held many positions within LSI over the last 20 years, including firmware development engineer, program manager, product marketing engineer, and operations program manager before his involvement in Six Sigma.
