I do not have decades of data from which to produce a trend. I have only my perceptions. So I offer the following as food for thought and comment. I observe a change in how we judge the limits of our capabilities, and in what we accept as good enough versus poor. Too often, industry in general struggles to control its processes tightly enough to sustain performance so close to the limits we set. As a result, we create quality and performance problems in spite of ourselves.
At the risk of dating myself, when I went through engineering school the accepted practice for ensuring that a design would stand up to unpredicted stress, abuse, or challenge was to engineer with a safety factor. Depending upon the criticality of the design and the risk to safety, a safety margin of 0.5 to 2 times the nominal requirement was generally added.
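As a simple illustration of the idea (the numbers and the function below are my own hypothetical sketch, not drawn from any particular design code), a safety margin is just extra capacity engineered in beyond the worst load you expect to see:

```python
def required_capacity(expected_load, safety_margin):
    """Size a design for the expected load plus a safety margin.

    A margin of 0.5 means the design must withstand 1.5 times the
    expected load; a margin of 2 means 3 times the expected load.
    """
    return expected_load * (1.0 + safety_margin)

# Hypothetical bracket expected to see 12 kN, designed with a 0.5 margin:
print(required_capacity(12.0, 0.5))  # -> 18.0 kN of design capacity
```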
As I became more experienced, more precise methods came into play. Six Sigma, and specifically the discipline of Design for Six Sigma, introduced statistical methods for predicting variation and errors and for testing design limits. It let us use statistical tolerancing to optimize designs that would not have been acceptable by traditional methods. It also let us set design parameters against performance limits with far more precision than assumption and guesswork ever allowed.
The science of mapping performance distributions told us exactly where we could set our process controls to ensure virtually defect-free results. It is a better method than assuming or guessing at an appropriate safety factor.
Similarly, the Lean methodology, with its directive to eliminate waste and variation, taught us to improve processes incrementally, along with the designs that feed them, by carefully observing each process’s performance constraints and setting design parameters around what could be controlled most easily and most reliably. Likewise, the practice of Statistical Process Control ensures that we not only maintain our performance levels but also anticipate changes or problems before they become quality or performance issues.
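For readers who have never built a control chart, here is a minimal sketch of how SPC limits are commonly derived for an individuals chart; the function and the moving-range estimate it uses are my own illustration of the general technique, not a prescription from any particular standard:

```python
import statistics

def individuals_chart_limits(samples):
    """Return (LCL, center line, UCL) for a Shewhart individuals chart.

    Sigma is estimated from the average moving range divided by the
    d2 constant for subgroups of two (1.128), the usual I-MR approach.
    """
    center = statistics.mean(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma_est = statistics.mean(moving_ranges) / 1.128
    return center - 3 * sigma_est, center, center + 3 * sigma_est

# Hypothetical coating-thickness readings in microns:
print(individuals_chart_limits([23.1, 22.8, 23.4, 23.0, 22.7, 23.2]))
```

Points outside those limits, or sustained runs drifting toward one of them, are exactly the early warnings the methodology is meant to surface.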
In short, a wide variety of practices and methodologies enable us to pursue optimization with greater sophistication than designing to arbitrary safety factors or setting controls well within limits. Progress is good.
Through it all, interestingly, the safety factor agenda remained. For example, Six Sigma is based on the observation that the spread of a normal distribution more or less fits within three standard deviations (or “sigma”) to either side of its mean. To account for long-term variation over time, however, the rule of thumb is to set the process mean six standard deviations, or sigma, from the nearest performance limit. The fundamental principle of Six Sigma is a safety factor.
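The arithmetic behind that rule of thumb is easy to check. The sketch below assumes a normal distribution and the conventional 1.5-sigma long-term shift cited in the Six Sigma literature; it compares the expected defect rate when the nearest limit sits three sigma from the mean versus six:

```python
import math

def defects_per_million(sigmas_to_limit, long_term_shift=1.5):
    """One-sided defects per million opportunities for a normally
    distributed output whose mean sits `sigmas_to_limit` standard
    deviations from the nearest spec limit, after the mean drifts
    toward that limit by `long_term_shift` sigma over the long run."""
    z = sigmas_to_limit - long_term_shift
    tail_probability = 0.5 * math.erfc(z / math.sqrt(2))
    return tail_probability * 1_000_000

print(defects_per_million(3))  # ~66,800 DPMO: a 3-sigma cushion erodes badly
print(defects_per_million(6))  # ~3.4 DPMO: the familiar Six Sigma figure
```

Whatever one makes of the 1.5-sigma convention, the point stands: the extra headroom beyond the natural three-sigma spread is precisely a safety factor, only derived statistically rather than assumed.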
The challenge the ideal poses, unfortunately, is that it is very difficult and costly to drive performance in every aspect of our business to such a standard. It also requires extraordinary discipline and effort to measure our performance and keep it under control. Many businesses never achieve such ideal control. I’m led to understand that even Motorola, the birthplace of the Six Sigma methodology, no longer maintains the discipline that made it so envied. It is very demanding.
Thus, I perceive the phenomenon I mentioned in the opening paragraph. We all know and understand now that there are more accurate and sophisticated ways of optimizing our designs and processes than the arbitrary safety factors popular in decades past, and we seek out the optimal capabilities of those designs and processes. But we don’t always maintain the control needed to keep performance acceptable, because doing so is very difficult.
Don’t misunderstand my message or my point. Optimization is good. Our intent is correct. Not only is optimization necessary to compete, it is also a responsible use of resources. By all means we should continue to optimize our performance.
What I do want to caution us against is neglecting an appropriate safety factor. Let’s use a finish process on component parts as an example.
A business serves customers by applying a finish coat to component parts that the customers supply. If too much finish is applied, parts might not assemble or perform correctly, and material is wasted. If too little finish is applied, the coat is not durable or resistant enough to environmental influences to perform properly, and there is a quality issue. The optimal setting for the finish process is therefore the thinnest coat that is still thick enough to perform.
Either through science or trial and error, the business sets its process parameters and controls as close as possible to that ideal threshold. Unfortunately, controlling a process such as a finish process, which is influenced by a great many parameters (chemical concentrations, color mixes, cleanliness of base materials, temperature), requires great discipline and effort.
If an environmental control breaks down or underperforms, if one person on one shift skips a step, or if someone waits too long to recharge the chemical bath because production demands are spiking and time seems more important, the performance of the finish process changes. If there is no safety factor to absorb these events, defects are introduced.
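A simple capability calculation makes the trade-off concrete. The sketch below is purely illustrative; the thicknesses, limits, and drift are invented numbers, not data from any real finishing line. It shows how a process centered close to its lower limit goes from a fraction of a percent of rejects to several percent after a modest drift, while a process with more cushion barely notices:

```python
import math

def fraction_too_thin(mean, sigma, lower_limit):
    """Share of normally distributed coat thicknesses falling below
    the lower spec limit, i.e., coats too thin to be durable."""
    z = (mean - lower_limit) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2))

LOWER_LIMIT = 20.0  # hypothetical minimum acceptable thickness, microns
SIGMA = 1.0         # hypothetical process standard deviation, microns

# Centered only 3 sigma above the limit to save material, then the bath
# degrades and the mean drifts down by 1.5 microns:
print(fraction_too_thin(23.0, SIGMA, LOWER_LIMIT))        # ~0.1% rejects while in control
print(fraction_too_thin(23.0 - 1.5, SIGMA, LOWER_LIMIT))  # ~6.7% rejects after the drift

# The same drift with a 6-sigma cushion barely registers:
print(fraction_too_thin(26.0 - 1.5, SIGMA, LOWER_LIMIT))  # ~3.4 per million
```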
I perceive that these kinds of quality escapes and losses of process control are becoming more frequent, and that many factors across industry contribute to them:
- Economic stress and strong competition drive us to push our limits, perhaps farther than we reasonably should
- The stress of maintaining measurements and controls with a minimum of resources drives us to relax our concentration
- The common phenomenon of outsourcing processes strains our ability to know and understand the true performance of processes that affect our quality and performance
- Many have attempted to build a culture of process improvement and optimization, and have established business models accordingly, but have not yet succeeded in making that culture take hold
- Sometimes we simply forget how important it is
More than a century of industrialized production has taught us, repeatedly, that doing something right the first time and producing a quality output for our customers is less expensive in the long run than correcting mistakes and shipping poor quality. The lesson is easy to understand, but it is also easily lost within the complexities of business and production-process interactions.
Take a good look at your performance, your limits, and your process controls this week as you go about your duties. See if your perception agrees with mine: that too many processes are gambling too close to their performance limits for actual performance to stay within acceptable bounds. Whether you agree or not, take some time to be sure that your own processes operate with a safety factor appropriate to protect your own performance.
Optimization is a good and valuable quest. Sophisticated means of establishing the optimal setting are better than arbitrary safety factors, though the effort can be difficult to sustain. Regardless, the point of optimization is lost when we push the limits of our capabilities too far and begin introducing defects. Instead, choose the right way for your organization to establish appropriate safety factors and stay within performance limits.
Stay wise, friends.
If you like what you just read, find more of Alan’s thoughts at www.bizwizwithin.com