Taz Daughtrey, computer science department, James Madison University, and editor of Software Quality Professional
Software-based systems have often seemed a breed apart in the eyes of both customers and quality professionals. How often do we see these systems fail and realize that consumers wouldn't tolerate such imperfection in tangible goods or well-defined services? Is it because of a culture of "technically sweet" solutions, whose deficiencies are tolerated because of the gee-whiz nature of the technology?
The wonder is that so many organizations have done as well as they have for so long with software-intensive systems that have been developed, operated and maintained in essentially a craftsman mode. The standard rating of organizations developing software (a model adapted from Phil Crosby's five stages of quality awareness) scores the majority of software suppliers at the lowest of its five levels, where ad hoc activities must depend on strong personalities and heroic actions.
We can scarcely be content with such immaturity. From financial transactions to infrastructure control to automobile performance, our society has become critically reliant on all manner of software-dependent systems. Indeed, the Six Sigma enterprise itself, with its data-gathering and computational requirements, stands or falls on the availability and accuracy of software tools.
A data-driven approach to analyzing the root causes of business problems may well provide the catalyst to remind us that computationally intensive systems are not acquired or used in isolation from the larger business processes and market environment of an organization.
Software Quality Professional published its first article on this topic in December 2001: "Six Sigma for Internet Application Development" by H. James Harrington and Tom McNellis (www.asq.org/pub/sqp/past/vol4_issue1/sixsigma.html). ASQ's Software Division included Robert Stoddard's informative presentation "Six Sigma for Software" in its sponsored track at the May 2002 Annual Quality Congress.
We are learning that measurement systems must look at total system lifecycle costs and focus on actions that minimize these costs. In a cost of quality model, the costs to control quality are planned, scheduled and budgeted.
On the other hand, the costs of not controlling quality result from failures that seem to happen at the most inconvenient times and that often are budget busters. Planning and implementing process improvement are deliberate investments. When these measures prove inadequate, an organization enters the unscripted realm of crisis response, disaster recovery and reputation repair.
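As an illustration of this partition (all categories and dollar figures below are hypothetical, invented for the sketch, not drawn from any organization's data), a cost of quality model separates the planned costs of controlling quality from the unplanned costs of failing to control it:

```python
# Hypothetical cost of quality breakdown (all figures illustrative).
# Control costs are planned, scheduled and budgeted;
# failure costs arrive unscripted and are often budget busters.
costs = {
    "prevention": 40_000,         # training, process improvement
    "appraisal": 60_000,          # reviews, testing, audits
    "internal_failure": 30_000,   # rework on defects caught before release
    "external_failure": 120_000,  # field failures, crisis response, reputation repair
}

control = costs["prevention"] + costs["appraisal"]
failure = costs["internal_failure"] + costs["external_failure"]
total = control + failure

print(f"Control costs: ${control:,}")
print(f"Failure costs: ${failure:,}")
print(f"Failure share of total: {failure / total:.0%}")
```

In a pattern like this invented one, where failure costs dominate the total, the model makes the case for shifting spending toward the planned, deliberate side of the ledger.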
The primary challenge I see is the application of statistical analysis to the proper figure of merit for software quality. Six Sigma practitioners must identify useful metrics and use them to identify areas needing improvement and to validate process changes.
Software product defect counts are not fully satisfactory as a measure of quality or a target for process improvement. Defects (a static property) may or may not lead to unacceptable system behavior (a dynamic property). It is the frequency and severity of these failures that truly measure quality, yet failure data are a lagging indicator when we need a leading indicator.
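To make the static/dynamic distinction concrete, here is a minimal sketch (with invented failure records and an invented severity scale, purely for illustration) contrasting a raw defect count with a failure-based figure of merit that weights observed failures by severity and normalizes by exposure time:

```python
# Invented operational failure records: (severity, count),
# where severity runs 1 = cosmetic .. 4 = critical.
failures = [(1, 12), (2, 5), (3, 2), (4, 1)]

defect_count = 31          # static property: defects found in the product (illustrative)
hours_of_operation = 500   # exposure time over which the failures were observed

# Dynamic figure of merit: severity-weighted failures per operating hour.
weighted_failures = sum(severity * count for severity, count in failures)
failure_intensity = weighted_failures / hours_of_operation

print(f"Static defect count: {defect_count}")
print(f"Severity-weighted failure intensity: {failure_intensity:.3f} per hour")
```

Two products with the same defect count can show very different failure intensities, which is why the dynamic measure is the better quality signal, even though it only becomes available after the fact.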
I look forward to the continued maturation of Six Sigma as the methodology is turned upon itself for refinement. Its application to software development can create a virtuous cycle: improved tools provide improved insights, which in turn yield even better tools.