Let’s Get Statistical
by Dale K. Gordon
Many of us who spend considerable time working with and implementing quality management system (QMS) standards are usually in a mode I like to call “system think.”
We look at the requirements of the standards in terms of how the interrelated processes of an organization work together to achieve the desired method of control and how closed loop activities ensure customer requirements are fulfilled.
But one question keeps coming back: Where’s the hard evidence that these systems are effective, or that the standard adequately describes what is really required of a well-functioning QMS?
The overall assumption, of course, is that a fully functioning, management driven, continuously improving, effectively used QMS will afford both the organization and the customer some measure of assurance the products or services produced will meet the contractual and functional requirements the customer expects.
To assure customers and other interested parties that an organization’s QMS is properly and effectively implemented, many organizations have used the certification/registration process afforded by an accredited assessment process. This assessment is generally supported by an agreed upon set of rules that begins with the International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) guidelines for the assessment of organizations to ISO 9000 standards.
The primary guides are ISO/IEC Guide 61:1996, which sets criteria for bodies operating accreditation systems for certification/registration bodies, and ISO/IEC Guide 62:1996, which sets criteria for bodies operating assessment and certification/registration of organizations’ QMSs.
These guides are then detailed and expanded into further sets of standards, requirements, processes, procedures and agreements used by accreditation bodies, certification/registration bodies and auditors who perform the assessments of organizations that want to obtain certification/registration in accordance with the ISO 9001 QMS standard or a similar sector specific QMS standard. These include QS-9000 for the automotive industry, TL 9000 for telecommunications and AS9100 for aerospace.
No accurate count of total worldwide registrations has been completed, but it has been reported that worldwide ISO 9001 registrations have exceeded the half-million mark.1
Several questions come to mind related to all these assessments being performed by qualified auditors using standardized processes and procedures for doing the assessments on a worldwide basis:
- Which clauses of the standard are organizations having the most trouble complying with?
- Are there problems with understanding the requirements of the standard(s) that lead to noncompliance?
- What is the variability in the assessment process, including training and capabilities of the persons doing the assessments?
- What about the standard should be changed to improve organizational compliance and effectiveness?
To answer these questions we could look at the ISO 9001 standard and determine which clauses of the standard have a significant number of auditor findings written up and begin to analyze the systemic causes for these failures of organizations to be in compliance.
Within ISO 9001, 23 major clauses (4.1 through 8.5) that define and outline the requirements for a fully functioning QMS should be assessed to determine compliance. Clause 1.2 (application) says only the clauses in section 7 may be excluded from a QMS.
The most common exclusion is clause 7.3 (design and development), taken when the organization does not perform a design activity and only manufactures or provides a service.
But here we run into a problem obtaining these data since most certification/registration bodies execute a confidentiality agreement with their clients, and the data become buried in reports and files. Wouldn’t this information be invaluable to ISO technical committee 176 (the committee responsible for developing the ISO 9000 standards) on an aggregate basis for comparison to the standard to determine understanding and compliance?
Aerospace Industry Action
The aerospace industry began looking at these questions and has developed a database to collect the information and results of assessments within the industry and its supporting supply chain.
The industry’s AS9100 is based on ISO 9000 and is also published as EN9100, JIS Q 9100 and other identifiers by other entities in other parts of the world. All these versions are equivalent and under the control of the International Aerospace Quality Group (IAQG), comprised of more than 70 major aerospace companies and organizations worldwide.
The IAQG-created database for the assessment results is called OASIS (Online Aerospace Supplier Information System) and is found at www.iaqg.org/oasis. The database is maintained by the Society of Automotive Engineers on behalf of the IAQG.
While similar to one used by the Quality Excellence for Suppliers of Telecommunications Forum, the IAQG database is not as robust in collection and segmentation of the data within the industry. However, because the aerospace assessment process uses a mandatory checklist and requires the recording of results (including findings from assessments), some analysis of the data is possible and yields some interesting results.
Analyzing the Data
Since July 2003, the OASIS database has collected information on the assessments of more than 2,000 aerospace organizations, and the numbers are expected to grow significantly. A simple Pareto chart of the results reveals the top 10 findings by ISO 9001 clause number written by certification/registration body assessors for this period (see Figure 1).
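The kind of Pareto summary behind Figure 1 can be sketched in a few lines of Python. The clause labels follow ISO 9001, but the finding counts below are illustrative placeholders, not actual OASIS figures:

```python
from collections import Counter

# Hypothetical finding records: each entry is the ISO 9001 clause
# cited by an assessor (illustrative data, not actual OASIS results).
findings = (["4.2"] * 180 + ["7.5"] * 150 + ["8.2"] * 120 +
            ["7.4"] * 90 + ["6.2"] * 40 + ["7.2"] * 30)

counts = Counter(findings)
total = sum(counts.values())

# Pareto ordering: rank clauses by finding count and report each
# clause's share of the total findings.
for clause, n in counts.most_common():
    print(f"Clause {clause}: {n} findings ({n / total:.0%})")
```

Ranking the clause counts in descending order and expressing each as a share of the total is all a Pareto analysis requires; a bar chart of the same ordering would reproduce the shape of Figure 1.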
While the aerospace QMS standard is heavily supplemented, similar to other sector specific standards, it still builds on and expands the basic ISO 9001 QMS requirements.
To say the aerospace industry information would be indicative of the other 498,000 registrations to ISO 9001 or other sector specific standards would probably be too big a leap of faith and certainly not statistically valid. But, looking at the data certainly suggests many possible avenues of investigation for beginning to analyze the results and drawing conclusions about the areas that may be common problems. It may also lead to some questions about the QMS assessment process and those performing it.
One item immediately apparent in this crude analysis is that 55% of all the reported findings are confined to four major clauses of the standard:
- 4.2—documentation requirements.
- 7.5—production and service provision.
- 8.2—monitoring and measurement.
To those in the aerospace industry, it probably comes as no surprise that these clauses are the top four and account for more than 50% of all findings. All four clauses present what an auditor might term “target rich environments” within any aerospace company, given the complexity of the product, the regulatory nature of the industry, the heavy dependence on documentation and the integral role of the supply chain in producing airplanes and complex weapon systems.
Another explanation may be that most of the assessors who observe and record these findings typically come from manufacturing or quality careers in the aerospace industry. During an assessment they tend to gravitate toward the areas most relevant to those functions.
Another interesting view of the data is what they show about expected areas of emphasis of a QMS assessment with respect to product quality and the meeting of customer needs or requirements.
Most quality and business writings and gurus, from Walter Shewhart to Peter Drucker, tell us that any process of providing goods and services considered of good quality and as meeting the customers’ needs begins with understanding customer requirements.
The process then moves into the design and development of goods and services (product realization) that fulfill the customers’ needs or wants at a level that meets their specifications or what the market will accept.
Even the ISO 9001 standard has within it an example of a process model in the form of the plan, do, check, act cycle. Figure 2 (p. 85), ISO’s model of a process based QMS, clearly shows the inputs from the customer as the value added activity feeding product realization.
Yet the data from the assessments show that of the top 10 clauses for which findings were issued during an assessment, the one related to customer processes (7.2—customer related processes) was low on a percentage basis and ranked number 10 overall. That could mean any of the following:
- Aerospace organizations are doing a good job of understanding and incorporating customers’ requirements.
- The standard is sufficiently vague about what is really expected here.
- The assessors do not have sufficient access or visibility to design requirements or statements of work that are part of the requirements customers impose on an organization.
Additionally, the next step in the product realization stage after planning—design and development—did not make the top 10 of the total findings issued. This could lead you to conclude that most QMS improvement effort is focused on the manufacturing and inspection areas of an organization.
The data, of course, would be skewed depending on the number of assessed organizations that omitted this clause as a requirement in their scope of registration. However, there is also the question of what I’ll call “auditor bias” with respect to covering the QMS. Only additional examination of the data would determine whether this is a meaningful factor.
While many people could speculate all day long as to the causes and the changes that would improve overall compliance or produce a more balanced approach to assessments, would it not be more beneficial to ground this discussion in evidence?
I suggest examining some hard data and looking at some objective evidence to determine how effectively organizations are adopting QMS requirements as a whole. Or, we might even do some organizational segmentation to examine differences for different types of commodities.
I pose these questions as a challenge to those of us concerned about and interested in the effectiveness and implementation of robust QMSs in our organizations. As a participant in this process in the aerospace industry, I know we are actively working to generate the data, analyze them and answer some of these questions.
1. Ashok M. Thakkar, “The Rise and Fall of ISO 9001?” Quality Digest, November 2004.
DALE K. GORDON is vice president of quality for MPC Products in Skokie, IL. He is an ASQ Fellow, past chair of the American Aerospace Quality Group and one of the writers of the AS9100 aerospace standard. Gordon earned a bachelor’s degree in industrial engineering from General Motors Institute (now Kettering University) in Flint, MI, and an MBA from Butler University in Indianapolis.