EXPERT ANSWERS

Driving service quality

Q: How do you drive quality for an IT department in the service sector, such as at a bank?

Zaheer Ahmed
Riyadh, Saudi Arabia

A: Many will say that to drive quality in IT, you must use methods such as Capability Maturity Model Integration (CMMI), lean Six Sigma or the project management body of knowledge. Yet while many organizations struggle over which improvement method to use, their operational areas’ service delivery to internal and external clients continues to feel the impact of poor quality.

If you are interested in driving quality within an IT department in a bank, there are three avenues to focus on:

  1. The problems.
  2. Moving from problem containment to true root cause.
  3. Leveraging a maturity model.

First, focus on the problems, not the method. Using a hybrid, pragmatic approach to method selection is a good way to proceed. In addition, you may want to leverage metrics to help you identify where to focus.

Many IT service areas have metrics, but few of those metrics are linked to known problem areas for clients. Sometimes, IT departments don’t measure what matters to their clients. In a previous career at a major bank, my department was getting feedback from the retail area that network operations were causing major downtime. When the systems went down, branch staff members had to fall back on hard-copy ledgers, which created staffing problems.

There was an effort to normalize the actual downtime data across all of the retail branches. This had not been done before and turned out to be extremely beneficial. After that metric was produced and shared, the tone of the conversation changed. When the retail bank saw how small the actual downtime per branch was, the IT department was able to become proactive and begin efforts to reduce the downtime even further. It would have been difficult to change the relationship with that client without that metric.
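For readers who want to reproduce that kind of normalization, a minimal sketch follows in Python. All branch names and figures here are hypothetical; the point is simply to turn raw outage records into a per-branch metric that can be compared and shared.

# Minimal sketch (hypothetical data): turning raw outage records into a
# normalized downtime metric that can be compared across retail branches.
from collections import defaultdict

# Each record: (branch_id, downtime_minutes) for one outage in the period.
outages = [
    ("BR-001", 42), ("BR-002", 15), ("BR-001", 8),
    ("BR-003", 95), ("BR-002", 27),
]
branch_count = 120   # total branches in scope (assumed)
period_days = 30     # length of the reporting period (assumed)

total_minutes = sum(minutes for _, minutes in outages)
per_branch = defaultdict(int)
for branch, minutes in outages:
    per_branch[branch] += minutes

avg_per_branch = total_minutes / branch_count
print(f"Average downtime: {avg_per_branch:.2f} min/branch "
      f"({avg_per_branch / period_days:.3f} min/branch/day)")
worst = max(per_branch, key=per_branch.get)
print(f"Highest recorded downtime: {worst} ({per_branch[worst]} min)")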

It’s important to ensure that metrics are linked to problem areas. In the past, operational leaders would approach a continuous improvement team and say, "We need your help." That’s honorable, but knowing which critical problems to focus on is important.

In that vein, if you happen to be on the receiving end of IT service (Who isn’t these days?), ask yourself, "What is keeping me up at night?" or "What is my No. 1 problem?" Then, ask yourself, "Do I have metrics or goals relevant to those problems?" Chances are, the answer will be no. If the answer is no, don’t be discouraged. This is a common problem and a great way to find out where to focus quality or improvement efforts.

Second, moving from problem containment to true root cause is another great place to focus. Many IT shops stop at containment and fail to move to true root cause. Containment is generally defined as the effort that stops the bleeding and restores service, but that doesn’t address the root cause.

Improvement begins only when you systematically look for true root cause. Team-oriented problem solving is useful here because it forces the distinction between containment and true root cause.1

Third, using a maturity model can be helpful for getting started on implementing quality in any area. The notion of a maturity model is that it forces users to come to grips with where they are on their journey. If you have a resource on your team with an in-depth quality background, he or she should be able to help leverage this tool.

Quality maturity models are broken down into levels, each characterized by traits of your current and future states of quality. The first levels have few characteristics associated with them. As you move up the levels, the information under each level increases and becomes more granular.

As an example, metrics should be an element of the maturity model:

  • Level one: metrics are ad hoc or nonexistent.
  • Level two: metrics are present, but purely reactive in nature.
  • Level three: metrics are present and used as needed.
  • Level four: metrics are integrated into each area and supported by measurement system analysis.
  • Level five: metrics are used proactively, drawing on data from both internal (client needs) and external (benchmarking) perspectives.
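To make the idea concrete, here is a minimal sketch of how a team might record and score that metrics element. The level wording paraphrases the list above; the assess helper and the gap calculation are illustrative assumptions, not part of any standard model.

# Illustrative sketch: recording the metrics element of a quality
# maturity model. Level wording paraphrases the article; the gap
# calculation is an assumed, simple scoring scheme.
METRICS_MATURITY = {
    1: "Ad hoc or nonexistent",
    2: "Present, but purely reactive",
    3: "Present and used as needed",
    4: "Integrated into each area, supported by measurement system analysis",
    5: "Proactive use of internal (client needs) and external (benchmarking) data",
}

def assess(element: str, observed: int, target: int) -> str:
    """Report the current level and the gap to the target level."""
    return (f"{element}: level {observed} ({METRICS_MATURITY[observed]}); "
            f"gap to target = {target - observed}")

print(assess("Metrics", observed=2, target=4))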

Using these tips, pull a small team together and get started. Your efforts will quickly be noticed and appreciated.

Keith Wagoner
AVP Partner Solutions
Lincoln Financial Group
Greensboro, NC

Reference

  1. Keith Wagoner, "8D Solutions," Quality Progress, November 2009, pp. 8-9.

Evaluating equipment

Q: How do you evaluate the capability of equipment such as pumps, heat exchangers and furnaces?

Rajendra Prasad Yalamanchily
Secunderabad, India

A: Capability evaluation of equipment deals with studying the process and output of the equipment with respect to its ability to fulfill its intended purpose. This concept is related to process validation, which is defined in the U.S. Food and Drug Administration’s quality system regulation as "establishing by objective evidence that a process consistently produces a result or product meeting its predetermined specifications."1 Therefore, this answer addresses the phases of process validation to ensure that equipment is designed correctly and that it has the required initial, short-term and long-term capability.

Qualification is another related, more broadly used term with which some may be more familiar. The sequence presented in Figure 1 consists of design qualification (DQ), installation qualification (IQ), operational qualification (OQ) and performance qualification (PQ).

Figure 1: Qualification sequence (DQ, IQ, OQ and PQ)

When equipment arrives from a vendor or internal fabricator, the first question that needs to be answered is, "Is this the equipment that was ordered?" There is no sense proceeding with the subsequent forms of qualification if this design qualification is not performed. The equipment should be inspected thoroughly for overall appearance and damage. The vendor’s records that arrive with the equipment also should be inspected, including:

  • Functional, technical and performance specifications.
  • Engineering drawings.
  • User’s manual.
  • Machine guarding and safety features.
  • Results of tests performed at the vendor (such as temperature profile verification and pump output versus setpoints).
  • Certificate of conformance covering any of the above, as required by contract.

Once there is assurance—in appearance at least—that the equipment is what it is supposed to be, the equipment can be installed and further evaluation can begin. A process validation guidance document defines installation qualification as "establishing by objective evidence that all key aspects of the process equipment and ancillary system installation adhere to the manufacturer’s approved specification and that the recommendations of the supplier of the equipment are suitably considered."2 The question being answered here is, "Is the equipment installed correctly?" The following tasks are typically performed on equipment at IQ:

  • Assign an asset number or other unique identification.
  • Determine installation requirements and connect power sources and utilities.
  • Evaluate and add guards and safety features beyond the original design, per internal requirements.
  • Establish any environmental controls such as cleanliness, temperature, humidity, vibration and lighting.
  • Verify all machine controls and settings work as intended. This is the first hint of the task of establishing statistical capability. If things don’t work at least once, more comprehensive subsequent evaluations will likely fail.
  • Develop a calibration log and verify calibration can be executed.
  • Determine the preventive maintenance tasks and schedule, and affix appropriate tracking stickers.
  • Also consider potential corrective maintenance with a spare parts list and a plan for quickly obtaining disposable and wear-out parts.
  • Write manufacturing procedures for the care, cleaning and use of the equipment.
  • Complete whatever else may be on your organization’s installation checklist.

Up to this point in the determination of equipment capability, most of the data are qualitative, or yes/no checklist, data. In the steps that follow, more quantitative, or numerical, analysis can be conducted. Another transition at this point in the qualification process is from collecting data on the equipment itself to collecting data on product processed by the equipment. A couple of examples:

  1. An annealing oven for high-performance plastic products should be validated to provide a consistent temperature profile, then validated to produce product that meets required stress levels.
  2. A pump used in cast nylon manufacturing should be validated to provide correct flow rates for reactants, then validated to produce product that meets tensile strength requirements.

The next phase is operational qualification. Here, the equipment is run at combinations of upper and lower operating limits, sometimes referred to as worst case conditions. The question being answered at this point is, "What is the short-term capability at the process limits?" If applicable, raw materials encompassing the entire range of critical characteristics also should be used in these studies. Examples of potentially relevant raw material properties include viscosity, density, particle size and chemical composition.

OQ provides the opportunity to use basic and advanced statistical tools. Design of experiments is particularly helpful, including simple factorial designs that allow for a process model to be developed and interactions to be determined. The statistical demonstration of an entire experimental design space meeting the process requirements is a powerful argument for the capability of the equipment. Another advantage of the process model is that predictions can be made regarding combinations of variables that are not explicitly combined in the experiment.
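As a hedged illustration of that approach, the sketch below fits a simple two-factor factorial model with an interaction term using Python’s statsmodels library. The factors, coded levels and response values are hypothetical, loosely mirroring the annealing oven example above.

# Sketch of a 2x2 full factorial OQ analysis (hypothetical data).
# Coded levels: -1 = lower operating limit, +1 = upper operating limit.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.DataFrame({
    "temp":   [-1, -1, +1, +1, -1, -1, +1, +1],   # setpoint temperature
    "time":   [-1, +1, -1, +1, -1, +1, -1, +1],   # dwell time
    "stress": [48.2, 50.1, 53.4, 58.9, 47.9, 49.8, 53.0, 58.3],  # response
})

# Fit main effects plus the temp:time interaction.
model = smf.ols("stress ~ temp * time", data=data).fit()
print(model.summary())

# The fitted model also predicts combinations not explicitly run,
# as long as they stay inside the design space.
print(model.predict(pd.DataFrame({"temp": [0.5], "time": [0.0]})))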

Analysis of variance (ANOVA) is another useful tool. For an oven, for example, thermocouples can be placed at strategic locations and temperature profile data collected over time at different setpoints. ANOVA can be used to detect hot and cold spots. In addition, statistical analysis can be performed to determine oven locations of relatively high variation. There also may be opportunities to use correlation and regression. For a pump, for example, plots of output versus setpoint can determine areas of concern with respect to deviation from expected values.
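A minimal sketch of the oven example, assuming hypothetical thermocouple readings, might look like this using scipy’s one-way ANOVA:

# One-way ANOVA across thermocouple locations (hypothetical readings).
from scipy import stats

front  = [180.2, 180.5, 179.9, 180.4, 180.1]
center = [181.0, 181.3, 180.8, 181.1, 180.9]
rear   = [179.1, 178.8, 179.4, 179.0, 179.2]

f_stat, p_value = stats.f_oneway(front, center, rear)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A small p-value points to hot or cold spots; follow up with pairwise
# comparisons and a check for locations with relatively high variation.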

Lastly, PQ is executed to answer the question, "What is the long-term capability at standard operating conditions?" The equipment is run at its optimal conditions, perhaps near the center of the combination of conditions that comprised the OQ. Data are collected for an extended time, at least long enough for all reasonable sources of variation to occur. A common rule of thumb, when relevant to the situation, is to run three production lots. One practice is to obtain data on product from three lots of raw material run by three different operators during three different shifts. Enough data should be collected to determine true stability and capability in a time-series analysis. These data can form the baseline for setting control chart limits as the transition is made from equipment qualification to standard production and monitoring.
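To illustrate, here is a minimal sketch of the kind of analysis a PQ report might contain: a long-term capability index and individuals control chart limits. The measurements and specification limits are hypothetical; the formulas are the standard ones.

# Sketch (hypothetical data and spec limits): long-term capability (Ppk)
# and individuals-chart limits from PQ data.
import numpy as np

measurements = np.array([10.02, 9.98, 10.05, 10.01, 9.97, 10.03,
                         10.00, 9.99, 10.04, 10.02, 9.96, 10.01])
lsl, usl = 9.85, 10.15   # assumed specification limits

mean, s = measurements.mean(), measurements.std(ddof=1)
ppk = min(usl - mean, mean - lsl) / (3 * s)
print(f"Ppk = {ppk:.2f}")

# Individuals (X) chart limits from the average moving range
# (2.66 = 3 / d2, with d2 = 1.128 for moving ranges of size two).
mr_bar = np.abs(np.diff(measurements)).mean()
print(f"X-chart limits: {mean - 2.66 * mr_bar:.3f} to {mean + 2.66 * mr_bar:.3f}")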

In PQ, a statistical analysis decision must be made regarding whether to "pool" the data from different production runs. ANOVA can be used to determine whether there are statistically significant run-to-run differences. If there are, each run should be analyzed individually against the requirements to ensure capability is consistently established.
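A hedged sketch of that pooling check, with hypothetical data from three runs:

# One-way ANOVA to decide whether three PQ runs can be pooled
# (hypothetical measurements).
from scipy import stats

run1 = [10.01, 10.03, 9.99, 10.02, 10.00]
run2 = [10.02, 10.04, 10.01, 10.03, 10.02]
run3 = [9.97, 9.99, 9.98, 10.00, 9.98]

f_stat, p_value = stats.f_oneway(run1, run2, run3)
if p_value < 0.05:
    print(f"p = {p_value:.4f}: significant run-to-run differences; "
          "analyze each run individually against the requirements.")
else:
    print(f"p = {p_value:.4f}: no significant run-to-run difference; "
          "pooling the runs is reasonable.")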

Throughout all the phases of validation, protocols should be written that clearly identify the objective, procedures and acceptance criteria. Afterward, reports tied directly to the protocols should contain all data, checklists and completed forms. Reports also should contain an analysis of all the information and a conclusion against the acceptance criteria. Any exceptions or deviations from the plan as portrayed in the protocol should be documented and evaluated for their relevance to the conclusions of the report.

All of these activities, taken together, should achieve the goal of process validation to demonstrate that the process, including qualified equipment, consistently produces a result or product meeting its predetermined specifications.

Scott A. Laman
Senior manager, quality engineering and risk management
Teleflex Inc.
Reading, PA

References

  1. U.S. Food and Drug Administration, Quality System Regulation, 21 CFR Part 820, Medical devices—current good manufacturing practice—final rule, section 820.3 (z)(1).
  2. Global Harmonization Task Force, Study Group 3, GHTF/SG3/N99-10:2004 (Edition 2)—Quality management systems—process validation guidance, January 2004.
