
STANDARDS OUTLOOK

Output Really Does Matter

by John E. “Jack” West

Auditing is a key component of systems that provide confidence in organizations’ competence, ability and honesty in meeting requirements. For decades we have been using audits for this purpose, and it can be argued that the concept has generally worked well.

Positive audit results usually reflect reality and generate confidence. This is true whether it relates to finances, environmental issues, quality or other areas.

Why then are we sometimes disappointed to learn that an organization with a quality management system (QMS) certified to ISO 9001 has major problems delivering products that meet customer requirements?

Similar questions can be asked about financial, health and safety, and environmental auditing. The basic question is this: Can QMS audits reliably predict future QMS performance?

Perhaps more broadly we might ask whether QMS auditing ever can provide complete assurance that customer requirements will be met. Certainly the generic answer for many years has been a qualified no.

The product certification process indicates that the rigorous way to know a product conforms is to have a lab inspect or test a sample using agreed-upon criteria. This philosophy tells us that if the sample meets the criteria, we should be able to rely on the producer’s QMS to ensure consistent conformity over time. ISO 9001, with its focus on control of processes, is a great tool to help organizations consistently meet requirements.

Limitations of Product Inspection and Testing

Of course, for some products the risks associated with potential failures drive us to test or inspect every item. But for the most part, we rely on more limited inspection and testing plus the controls of the QMS. Here’s the rub: Even for supposedly high-reliability, high-risk items, it is generally impossible to inspect or test for everything that could cause failure.

Anyone who has ever used failure mode and effects analysis (FMEA) has examples of this. Perhaps we are forced by practicality and economics to focus on potential failure causes that have high risk priority numbers (RPN).
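
To make that prioritization concrete, here is a minimal sketch (not from the article) of the conventional FMEA arithmetic: each failure mode gets an RPN equal to severity × occurrence × detection, each rated 1 to 10, and attention goes first to the highest numbers. The failure modes and ratings below are invented purely for illustration.

```python
# Minimal FMEA prioritization sketch: rank failure modes by risk priority
# number (RPN = severity x occurrence x detection, each rated 1-10).
# The failure modes and ratings below are hypothetical.

failure_modes = [
    # (description,            severity, occurrence, detection)
    ("coating contains toxin",       10,          2,         8),
    ("small part detaches",           9,          4,         5),
    ("label misprint",                3,          6,         2),
]

ranked = sorted(
    ((desc, s * o * d) for desc, s, o, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)

for desc, rpn in ranked:
    print(f"RPN {rpn:4d}  {desc}")
```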

Even for simple products and services, all of this seems technical and complicated. In my experience, most real product failures are caused by simple things that could have easily been avoided during design or production.

In fact, my experience also includes many cases in which recalls and catastrophes have been prevented because a lowly quality auditor asked a simple question that seemed stupid to those listening. We need more QMS auditors to ask these basic questions.

Reliance on Records

Audits, like lab tests of samples, are snapshots in time. Only a review of records can show what happened before the audit. So we are forced to rely on records that might or might not be designed to reveal problems and that might or might not have been competently and honestly completed.

Most audit systems are quite loose in determining how much recorded history is reviewed to determine process stability. Audits of stable processes can verify past stability and predict future stable performance. Audits of processes that are either known to be unstable or not known to be stable see performance only at the time of the audit. They offer no ability to predict future success or failure of the process.
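
What "known to be stable" can mean in practice is worth spelling out. The following is a minimal sketch, not from the article, that assumes the recorded history reduces to a time-ordered list of measurements: it applies the conventional individuals-chart limits (mean ± 2.66 × average moving range) and flags any point that falls outside them.

```python
# Minimal stability check over recorded history (individuals chart).
# Assumes the records boil down to a list of measurements in time order.
# Limits use the conventional I-chart constant 2.66 (3 / d2 for n = 2).

def stability_check(values):
    mean = sum(values) / len(values)
    moving_ranges = [abs(a - b) for a, b in zip(values[1:], values[:-1])]
    mr_bar = sum(moving_ranges) / len(moving_ranges)
    lcl = mean - 2.66 * mr_bar
    ucl = mean + 2.66 * mr_bar
    out_of_control = [(i, v) for i, v in enumerate(values) if v < lcl or v > ucl]
    return mean, (lcl, ucl), out_of_control

# Hypothetical history pulled from inspection records:
history = [10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 11.4, 10.1, 10.0, 10.2]
mean, limits, signals = stability_check(history)
print(f"mean={mean:.2f}, limits={limits[0]:.2f}..{limits[1]:.2f}, signals={signals}")
```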

Expert auditors can recognize a great system quickly, and easily find the holes in any marginal system. But such expertise is not universal, and expert auditors seldom want to work for the third-party auditor’s day rate. Organizations also aren’t eager to assign their best and brightest people to conduct internal audits.

Even with expert auditors and really good systems, all it takes to cause product failure or recall is for the organization to fail to meet one key characteristic.

Ask the Right Questions

Auditors need to ask probing but basic questions. For a toy manufacturer, the question might be, “How do you know that coating is nontoxic?” or “How do you ensure very young kids can’t swallow these small parts?”

The representative of the organization being audited might think the answer is “Isn’t it obvious?” But when pressed for objective evidence, the organization might be forced to change its coating or parts design.

When this happens, chalk up another audit-produced success. Auditors who ask such questions should be encouraged. The ones who force preventive change probably should be designated heroes.

A story related to me by a course attendee a couple of years ago illustrates the importance of watching for the little things (see “Auditing Success Story” sidebar). Stories like this abound when auditors use the right approach to each audit they conduct.

Careful Planning Needed

This means auditors need to carefully plan each audit. Even if an auditor has audited the same organization several times, conditions and pressures change. Successful auditors spend time planning so the right questions get answered.

Data analysis in preparation for audits is often inadequate to identify the areas with the greatest improvement opportunity. In fact, the data might not be made available and, in many cases, the auditor might not know to ask for it.

Aggregation of hard data (for example, customer feedback, internal data and cost data) with audit observations (not findings) is often inadequate for defining areas in which the data and observations taken together demonstrate problems or show excellent performance. ISO 9001 requires a lot more data and analysis than prior QMS standards, so auditors should ask for data and use it.

Rigorous system audits can provide confidence that the system has successfully produced conforming product in the past, and audits can also be good at finding problems. But without a great deal of analysis, it is hard to determine what effect the problems will have on future product—even if conditions don’t change.

Prediction and Change

The predictive nature of audit results is questionable since change is occurring all around (and within) the system. Control of changes to the system (ISO 9001, clause 5.4.2 b) is quite difficult to audit under the best of circumstances and can be a source of strong rebuttals—even heated arguments—from auditees. So this area is often not pursued aggressively.

Sometimes the apparently minor problems have the most significant future negative impact. For example, if a job is normally done by temporary workers who are trained each day, it would seem to be a minor issue if the work is being done by the most experienced permanent employee on the day of the audit.

But after the customer complains, we could discover that this experienced person has never been trained to do the particular work involved. Such minor things might even go unreported and only become a big deal after a major customer issue is raised.

A situation like this could easily lead to the conclusion that audits by themselves are not likely to be good predictors of future system performance. In my experience, however, organizations that focus their systems on achieving capable processes for product realization and that have strong top management involvement seem to have better long-term records of success.

In these situations, audits are an ideal way to ensure ongoing process stability and control—as long as the auditors know how to look for signs of deterioration.

Some argue that if this type of observation could be validated, criteria could be developed to determine a fairly small number of system elements that are predictors of future success. But this has proven elusive because organizations that do these critical few things well tend to also do the rest of the system well.

Suggestions for Successful QMS Audit Programs

Perhaps sustainable success is possible only by doing everything well. What then should those who manage QMS audit programs do? Here are several ideas:

  • Select the right people—the best and brightest—to do the auditing. Pick people with inquisitive minds and enough product knowledge to ask the right basic questions.
  • Provide time prior to audits for the auditor or team to collect and analyze operational data on the areas and processes to be audited. This analysis and planning should help focus the audit. Part of the audit objective should be to verify the areas being audited are themselves collecting and analyzing the right data.
  • Have auditors aggressively pursue issues related to change. This means that before the audit starts, auditors need to have studied the change-related pressures on the area to be audited and the nature of changes going on there. Never acquiesce to complaints such as, “This is not a good time for the audit. We are in the midst of change. Come back in six months.” The midst of change might be the best time to audit.
  • Provide time during or after the audit for analysis of the auditor or audit team’s observations against the operational data collected in preparation for the audit. Ask whether the organization already knows about the concerns and is working on them. This should help focus findings on new issues that are not being addressed.
  • Train auditors to focus on obtaining facts and supporting data. When possible, get auditors to state issues in terms of potential cost savings or monetary risk. Teach them enough statistics that they can conduct sufficient sampling and recognize whether processes are statistically in control and capable (see the capability sketch after this list).
  • Review or discuss auditors’ observations with them before they write up their findings. This can be a tricky issue because the auditors could get the notion that you want to restrain their freedom. So, organizations should ask probing questions and remember that the objective is to get the stupid questions asked, not to explain the answers.
  • Perhaps most important of all, focus observations on the product, customer or business performance, no matter where in the QMS they occur. For some requirements, such as those related to product realization, the link to the customer is obvious. But QMS standards have requirements in many areas, such as management responsibility and resource management, that are often looked at out of context. Put these requirements into the context of how the organization ensures conforming product, customer satisfaction and business performance. What will be the potential effect on the customer if this situation goes unchanged? For example, management-review records often mainly report past activities or status of projects. In such cases, the auditor should ask how management review is being used to identify needed product improvements that could increase sales. Any issue related to business performance and customers can get top managers excited about changing the status quo.
  • Worry less about whether auditors are experts in a particular QMS standard. Worry more about how well the organization is meeting customer requirements and satisfying customers than about how it conforms to every little requirement of the applicable QMS standard.
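
On the statistics point in the list above, here is a minimal capability-check sketch, not from the article: the conventional Cp and Cpk indices computed from a sample of measurements against hypothetical specification limits. The rule-of-thumb threshold mentioned in the comment is a common industry convention, not a requirement of any QMS standard.

```python
import statistics

def capability(values, lsl, usl):
    """Conventional process capability indices from a sample of measurements."""
    mean = statistics.mean(values)
    sigma = statistics.stdev(values)                  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)                    # potential capability (spread only)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)   # actual capability (spread and centering)
    return cp, cpk

# Hypothetical measurements against a 9.5-10.5 specification:
sample = [10.1, 10.0, 10.2, 9.9, 10.1, 10.0, 10.3, 10.1, 10.0, 10.2]
cp, cpk = capability(sample, lsl=9.5, usl=10.5)
print(f"Cp={cp:.2f}, Cpk={cpk:.2f}")   # a Cpk well below ~1.33 usually warrants questions
```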

It is said, correctly I think, that 80% of what businesses do is common to all, while 20% differs by individual industry and organization. So, all auditors should be able to do a good job with the 80%. The trap lies in the 20%.

The bottom line is that audit programs need to sweat the basic little details that impact customers. This requires great auditors.

Auditing Success Story

An organization had outsourced the manufacture of some components to a company in a developing country.

The process capability for making one of the components was extremely poor. The manufacturing engineer decided to install a second piece of equipment to perform a rework operation because the primary operation had a defect rate approaching 50%.

This new equipment worked fine, but the engineer was not satisfied because the rework took three times longer than the primary operation.

To understand why the primary operation was not capable of meeting requirements without rework, the engineer asked the operator to prepare a control chart of operation results.

The person running this small cell had not only been trained to make control charts, but she had also learned about records of conformity. An auditor from the purchasing company watched the operator and discovered she was charting results only after rework was completed. Naturally, this defeated the intended purpose of the chart.

What is significant is that both the engineer and the operator’s supervisor had noticed the data were all well within specification, and no one had asked the operator how this could be.

After the auditor reported her findings to the plant manager, she concluded the organization had a major communications problem between supervisors and operators and between the manufacturing engineers and production.

It took a while for this to be addressed and fixed, but this one little auditing observation resulted in a complete turnaround in that plant’s methods of communicating and operating. —J.W.


JOHN E. “JACK” WEST is a management consultant and business advisor. He served on the board of examiners for the Malcolm Baldrige National Quality Award from 1990 to 1993 and is past chair of the U.S. technical advisory group to ISO technical committee 176 and lead delegate to the committee responsible for the ISO 9000 family of quality management standards. He is an ASQ fellow and co-author of several ASQ Quality Press books.

