What Makes a Six Sigma Project Successful?

by Joseph G. Voelkel

Suppose you just completed a Six Sigma project on which you were either a Champion, a belt (Master Black Belt, Black Belt or Green Belt) or a team member. You are now in a lessons learned meeting and have been asked to give your opinion on the project. You can be totally honest at this meeting; your voice will be electronically modified, and your notes will be transcribed, so flinching is not a concern. You are asked:

  1. Do you think the project was successful? Please explain your answer and include how you define “successful.”
  2. If it was successful, what were the contributors to its success? If it wasn’t successful, what were the contributors to its lack of success?

The people asking you these questions are very interested in your opinion and insight, so they want comprehensive answers.

Contributors to Success

Six Sigma emphasizes a data-driven, process-oriented approach known as the DMAIC (define, measure, analyze, improve, control) roadmap. As a professional statistician, I naturally want to believe that being both data driven and process oriented is very important. But what really makes a particular project successful? Was it the DMAIC process itself? Was it the leadership qualities of the Black Belt running the project? Was it one piece of information that had been buried? If so, was it an embarrassingly simple discovery or the result of an intense and sophisticated investigation?

My experience using problem-solving approaches to make improvements both precedes and includes Six Sigma projects. My experience with Six Sigma projects has largely come from teams working on their first projects, while my experience with other projects has primarily come from my role as a statistical consultant or quality advisor1 or indirectly through the experiences of my students. This experience has most often taken place in the hard sciences, and my terminology below reflects that.

Seven Observations

Based on projects that have had some measure of success, I have gathered seven main observations.

1. The DMAIC order works. It makes logical sense, but that is not why it works. How it works can best be seen by looking at problems encountered when it is not followed. For example, one team jumped into a project and later found out it had never agreed on a definition of the problem or the boundaries under which it would work. Frustration and dissension followed. In another example, a team jumped into a designed experiment, which could have been a quick and powerful way to solve the problem. But the team discovered none of the factors were active and only later realized the large noise from the measurement system buried any signal these factors might have been sending.

2. Good leadership may be critical. Suppose a project leader was placed in charge of a team whose members were also members of the company’s union. At the first meeting, the members sat with folded arms and cynical looks because they had already been down many bad roads with the company. By the end of the project, however, the members were totally supportive of the project leader. Why? Because he led by example and had a serious and overriding commitment to the process and team members, not to management.

3. First-order scientific or engineering principles usually don’t solve the problem. If they did, the problem would probably not have reached the Six Sigma project stage. For example, consider a project that aimed to reduce the squealing of metallic brake pads. To solve the problem, metallurgists used first-order principles to recommend changing the kind of iron used and some other chemistries and then holding tighter tolerances. While the recommendations were reasonable, the squealing problem remained, now with higher scrap rates. It took a data-driven approach to eventually solve the problem.

4. Soft tools, such as value stream mapping, process mapping and failure mode and effects analysis, can be valuable, but not when the work is done in a meeting room. According to Bill Murray in the movie What About Bob? “There are two types of people in this world: those who like Neil Diamond and those who don’t.” In that statement, Murray gave us an analogy for the two types of engineers in this world: those who like to sit in a meeting room to theorize what’s going on and those who like to go to the workplace and find out.

A related example on the Stanford Graduate School of Business’ website says:

The Canon executive couldn’t find the English word. He wanted to express the belief of Japanese managers that when something goes wrong, the only solution is to get to the site to see what’s happening. “We have a phrase in English,” volunteered Sam Wood, assistant professor of manufacturing and technology. “We say you have to go to the gemba.” The Japanese managers roared with laughter. Gemba, the Japanese word for the scene of the action, was precisely the word they had been trying to translate.2

5. There is little connection between formal education and the ability to come up with good ideas. This is directly related to point three. Those closest to the action often know more about the process than many of the people above them on the organizational chart. They frequently have knowledge the process or design engineer may not.

I was once in a four-hour meeting with bright engineers who had complex theories for why a measurement system process had recently not been working. After two requests, the engineers agreed to let me look at the process itself. I watched the process for a while and then asked the operator if he knew what the problem was. He said he did. I asked him if he’d tell me, and he did. I asked him why he didn’t tell anyone before. He grinned at me and said, “No one else asked.”

6. The interplay between theory and data is like the chicken and the egg: Which came first? W. Edwards Deming emphasized the need to have a theory, even a hunch, before starting to solve a problem when he wrote, “Experience without theory teaches nothing.”3 However, the common phrase “letting the data talk to you” suggests using data first to generate theories.

The answer to which comes first can be found in Figure 1 (see p. 66): There is no first. In some projects, broad-based, passive data collection may generate theories where virtually none existed. An example of this is collecting three samples in a row from each of four eight-cavity machines, once an hour for an eight-hour shift, over seven days. In other projects, a well-planned, theory-based experimental design, say a fractional factorial design, may narrow down theories.
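To get a feel for how much passive data a plan like the one above produces, it can help to enumerate it. The sketch below uses the counts from the example (four machines, eight cavities, three consecutive samples, eight hours, seven days); the code itself is only an illustrative enumeration, not something from the article:

```python
from itertools import product

# Passive data collection plan from the example in the text:
# 3 consecutive samples from each cavity of 4 eight-cavity machines,
# once an hour for an 8-hour shift, over 7 days.
DAYS, HOURS, MACHINES, CAVITIES, REPEATS = 7, 8, 4, 8, 3

# Each observation is identified by where and when it was taken.
plan = [
    {"day": d, "hour": h, "machine": m, "cavity": c, "repeat": r}
    for d, h, m, c, r in product(
        range(DAYS), range(HOURS), range(MACHINES), range(CAVITIES), range(REPEATS)
    )
]

print(len(plan))  # 7 * 8 * 4 * 8 * 3 = 5376 observations
```

A layout like this is what lets the passive data "talk": with day, hour, machine, cavity and repeat recorded for every observation, the variation can later be partitioned across those sources to suggest theories.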

7. The Pareto principle wins out. A uniform principle (if it existed) would say many small parts play roughly equal roles in the improvement. That would hold true if each of the DMAIC phases contributed between 15 and 25% of the overall success of the project. The Pareto principle usually wins out, however: one of the DMAIC phases generally contributes most of the project’s success.
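The contrast between the two principles can be made concrete with a toy calculation. The phase contributions below are invented for illustration, not measured values from any project:

```python
# Hypothetical contributions (in percent) of each DMAIC phase to a
# project's success. The uniform case spreads credit evenly; the Pareto
# case concentrates most of it in a single phase (here, measure).
uniform = {"define": 20, "measure": 20, "analyze": 20, "improve": 20, "control": 20}
pareto = {"define": 5, "measure": 70, "analyze": 10, "improve": 10, "control": 5}

def top_phase_share(contributions):
    """Fraction of total success attributable to the single biggest phase."""
    return max(contributions.values()) / sum(contributions.values())

print(top_phase_share(uniform))  # 0.2 -- no phase dominates
print(top_phase_share(pareto))   # 0.7 -- one phase carries most of the success
```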

Your Experience

Now it’s your turn. Think about the more recent projects on which you have worked. For each, answer these questions:

  1. Do you think the project was successful? Please explain your answer and include how you define “successful.”
  2. If it was successful, what were the contributors to its success? If it wasn’t successful, what were the contributors to its lack of success?

OK, maybe your voice won’t be electronically modified, and your notes won’t be transcribed, but you can achieve the same effect by sending your comments to me at sixsigmathinker@yahoo.com. I plan to organize them and, together with the editors of QP, post them online at www.asq.org/pub/qualityprogress. Please tell me something interesting, but do not send me anything that is confidential or can link your company to the project. Also, let me know if you’d prefer I use your name or a specific handle, in case you wish to remain anonymous.


  1. Peter Scholtes, Brian Joiner and Barbara Streibel, The Team Handbook, third edition, Joiner/Oriel, 2003.
  2. Stanford Graduate School of Business, “Notes From the Gemba,” The Stanford Business Main Page, December 1993, www.gsb.stanford.edu/community/bmag/sbsm622/gemba.html.
  3. W. Edwards Deming, Out of the Crisis, MIT Center for Advanced Engineering Study, 1986, p. 317.

JOSEPH G. VOELKEL is graduate program chair and associate professor at the John D. Hromi Center for Quality and Applied Statistics at the Rochester Institute of Technology in Rochester, NY. He earned a doctorate in statistics at the University of Wisconsin-Madison. Voelkel is a Fellow of ASQ.
