Back to Basics
The Big Picture
Don’t be misled by productivity measurements
by Janet Bautista Smith
Business competition is an unending race with multifaceted measures of success. One success factor is productivity: the ratio of value added (such as services rendered or products produced) to the associated cost. Productivity may be one of the simplest metrics to capture, but if it is misinterpreted, it can send a company spiraling into a failure abyss.
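The productivity ratio described above can be sketched in a few lines of Python. The function name and the dollar figures are illustrative assumptions, not from the article:

```python
def productivity(value_added: float, cost: float) -> float:
    """Productivity as the ratio of value added (e.g., the value of
    services rendered or products produced) to the associated cost."""
    if cost <= 0:
        raise ValueError("cost must be positive")
    return value_added / cost

# Hypothetical example: $120,000 of product value at $100,000 cost
print(productivity(120_000, 100_000))  # 1.2
```

A ratio above 1.0 looks healthy on its own, which is exactly why the article pairs it with a countermeasure such as error rate.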
Error rate—the occurrence of nonconformance or rejection—is one countermeasure that may provide a reality check on a process’s productivity. Speed and volume do not necessarily equate to quality or efficiency if the error rate is skyrocketing.
Productivity and error rate, although generally effective as counterbalancing metrics, can be flawed if the error rate is out of sight, tucked away and undetectable in the “hidden factory” under the invisible cloak of suboptimization. Suboptimization harbors a multitude of problems buried in the system, such as rework or scrap data that is not properly identified, or the misuse of skills to maximize selected areas while jeopardizing other processes.
If this scenario exists, productivity measurements will mislead stakeholders into believing all is well.
Productivity is a measurement tool like a double-edged sword and should be used with caution. The numbers can be impressive yet insensitive to the deeper operational issues afflicting the system daily.
A workforce unit may appear so highly productive that its quantitative results outrun its quality level, opening a gap filled by error rate and suboptimization, as shown in Figure 1.
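The gap between volume and quality can be made visible by reporting both numbers together. A minimal sketch, with hypothetical unit counts and a function name of my own choosing:

```python
def quality_adjusted_output(units_produced: int, units_rejected: int):
    """Raw output can mask a rising error rate; report good units and
    error rate together so speed and volume are weighed against quality."""
    error_rate = units_rejected / units_produced
    good_units = units_produced - units_rejected
    return good_units, error_rate

# A "highly productive" period: 1,000 units shipped, but 150 rejects
good, rate = quality_adjusted_output(1_000, 150)
print(good, rate)  # 850 good units at a 15% error rate
```

Tracked this way, a record-setting output month with a climbing error rate reads as a warning rather than a success.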
Error rate is probably one of the simplest indicators of a system’s performance prior to change deployment. But if left unchecked, error rate is like a mushroom and can multiply by spreading its spores and clogging the system flow. Early recognition of failure modes will aid in timely intervention to prevent impending catastrophe.
Suboptimization is like an odorless gas that can envelop the system, spreading discreetly under the cover of process noise and misleading the interpretation of productivity numbers. For example, a manager may focus on a noncritical but highly visible process, pulling skilled resources from other areas (thus depleting those areas of needed talent) to push for superstar productivity numbers.
At first glance, it would seem productivity or a key process indicator (KPI) has been optimized at the highest level. But the long-term effect will be gradual deterioration of the areas that should have benefited from the diverted talent and resources.
How do you recognize suboptimization early enough to prevent it? There is no simple answer because this element is a moving target that is sometimes difficult to quantify, so it may remain incognito. But here are some simple data-mining questions to validate decision making when using KPIs to identify improvement opportunities:
- Is the process being measured one of the critical points in the value stream map? If not, why was this process selected for KPI measurement?
- Is the process or resource capability compatible with the desired output? If there is consistent fluctuation, it may warrant further investigation.
- Do the measurements trigger actions for improvement? If not, why was this metric selected?
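The questions above could be framed as simple automated checks against a KPI record. This is only a sketch; the field names and the capability threshold are illustrative assumptions, not part of the article:

```python
def validate_kpi(kpi: dict) -> list:
    """Flag KPIs that may reflect suboptimization rather than
    genuine improvement (all field names are hypothetical)."""
    warnings = []
    if not kpi.get("on_critical_path"):
        # Question 1: is this a critical point in the value stream map?
        warnings.append("process is not a critical point in the value stream")
    if kpi.get("output_std_dev", 0) > kpi.get("capability_limit", float("inf")):
        # Question 2: is capability compatible with the desired output?
        warnings.append("output fluctuation exceeds process capability")
    if not kpi.get("triggers_actions"):
        # Question 3: does the metric actually drive improvement?
        warnings.append("metric does not trigger improvement actions")
    return warnings

flags = validate_kpi({"on_critical_path": True,
                      "output_std_dev": 2.5,
                      "capability_limit": 1.0,
                      "triggers_actions": False})
print(flags)
```

Any nonempty result is a prompt to investigate the KPI, not an automatic verdict; the thresholds would have to come from the process's own history.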
At face value, productivity or any other performance metric may not show the big picture. Primary metrics need continuous validation to ensure a balanced scorecard approach. KPI selection and deployment should consider the voice of the process and the voice of the customer to truly understand the risks and benefits involved.
Janet Bautista Smith is the director of quality and continuous improvement at ProTrans International in Indianapolis. She earned a bachelor’s degree in chemical engineering from the University of Santo Tomas in Manila, Philippines. Bautista Smith is a senior member of ASQ and an ASQ-certified quality manager, auditor, engineer and Six Sigma Black Belt. Bautista Smith presented the session “Lean Express” at ASQ’s 2010 Lean and Six Sigma Conference, was a lean-based auditing workshop instructor at the 2010 and 2011 ASQ Audit Division Conferences and is the author of Auditing Beyond Compliance (ASQ Quality Press, 2012).