It’s probably not too controversial to suggest that software can accelerate quality management capabilities. In fact, it sounds a lot like common sense. But in the world of quality management, common sense is a starting point rather than a destination. The truth resides in the final numbers, and it’s critical to bring them to bear when judging the effectiveness of any approach.
So how does the use of quality management software affect overall business performance? The relationship seems to be unequivocally positive. In a 2011 survey, the Aberdeen Group found that “best-in-class manufacturers” had substantially higher quality management software usage rates than everyone else (83% compared to 64%).
Given their impact on corporate performance, it’s no shock to find a plethora of commercially available quality management software options on the market. In fact, quality management software systems come in many combinations of product scope, platform, industry-focus, integration capabilities, and other functional dimensions.
Despite the variety, quality management systems do have one thing in common. They all pull functional solutions from the same rich tradition of quality management thinking. It’s a management tradition that takes the objectives of testability and improvement seriously. And, unsurprisingly, some particularly robust answers have emerged to the question of which quality tools provide the most value. It’s these core functional structures, or tools, that tend to be implemented again and again in quality management software programs.
To understand which of these tools offer the most practical benefit, I spoke with a Six Sigma master black belt and got his opinion.
Mike Goeden is the Director of Quality Control at Rexnord, a leading manufacturer of industrial process and motion control and water management products based in Milwaukee. Before digging into his top quality management software tools, Mike spoke with me about the nature of the challenge presented by ensuring strong quality management.
After I had double-checked my working understanding of quality control (QC) measures as “reactive and testing-based” and quality assurance (QA) as “proactive and process optimization-oriented,” Mike noted that technology needs to support both objectives. But he explained that while QC tactics play an important role, the method of “make something, give it to the quality department, and they inspect it” is an outdated, “1980’s style” approach to quality management, if it’s not paired with a strong commitment to QA.
Mike suggested that manufacturers are best-served to put the majority of their quality management efforts toward quality assurance. When I asked how he would define QA, Mike provided his canonical definition as “controlling the process inputs to get an expected output.”
Discussing his company’s software investment, Mike noted that Rexnord uses a combination of commercial software products for quality management, including MQ1 from Cebos and SAP.
Mike described the scope of their usage of quality management software this way: “We utilize our software to track all of our quality data. It captures our product and process quality issues. We continuously analyze and chart this data to identify trends in our processes.” The symbiotic relationship between monitoring results and optimizing processes was something that came up throughout our conversation. These are the specific tools Mike said he relies on for his quality management work (in no particular order of importance):
One of the fundamental goals of effective quality assurance is to minimize risk. According to Mike, conducting a “process failure modes effects analysis (pFMEA)” offers the ability to “look at your risk of what could go wrong and [determine] how do you prevent that from happening.” Essentially, a pFMEA provides a systematic approach to registering risks. In a pFMEA, each risk receives a rating based on severity, frequency, and detectability. The approach also allows for a consistent way to define corrective actions.
For more info: The American Society for Quality website offers a useful discussion that identifies when to use an FMEA and demonstrates a sample procedure.
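The scoring behind a pFMEA can be sketched in a few lines of code. The sketch below multiplies the three ratings into a Risk Priority Number (RPN), a common convention for ranking failure modes; the failure modes, the 1–10 scales, and the `rpn` helper are all illustrative assumptions, not details from Rexnord’s actual analysis.

```python
def rpn(severity, occurrence, detectability):
    """Risk Priority Number: the product of the three 1-10 ratings."""
    return severity * occurrence * detectability

# Hypothetical failure modes with severity/occurrence/detectability ratings.
failure_modes = [
    {"mode": "bearing misalignment", "severity": 8, "occurrence": 3, "detectability": 4},
    {"mode": "coolant leak",         "severity": 5, "occurrence": 6, "detectability": 2},
    {"mode": "torque out of spec",   "severity": 7, "occurrence": 2, "detectability": 7},
]

# Rank failure modes by RPN so corrective actions target the highest risks first.
ranked = sorted(
    failure_modes,
    key=lambda fm: rpn(fm["severity"], fm["occurrence"], fm["detectability"]),
    reverse=True,
)

for fm in ranked:
    score = rpn(fm["severity"], fm["occurrence"], fm["detectability"])
    print(f'{fm["mode"]}: RPN={score}')
```

Ranking by RPN makes the prioritization consistent: a hard-to-detect, severe failure can outrank a more frequent but easily caught one.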
Process control plans are another Six Sigma-derived tool that Mike indicated he utilizes for quality assurance. While the pFMEA looks at processes starting with individual risks, the building block of the process control plan is each individual process “input.” A process control plan provides a structure to document the process step inputs, the metrics that indicate whether each process step is “in control,” and the defined actions to take if a process step is “out of control.”
For more info: The DMAICTools.com website provides an extensive collection of free Six Sigma related file templates, including a sample control plan with annotated commentary on the purpose of each cell.
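A control plan is essentially structured data, so it maps naturally onto a simple record type. The sketch below models one plan entry per process input, with control limits and a reaction plan; every process step, limit, and reaction plan here is a hypothetical example, not content from an actual Rexnord control plan.

```python
from dataclasses import dataclass

@dataclass
class ControlPlanEntry:
    process_step: str     # the process input being controlled
    characteristic: str   # the metric that shows the step is "in control"
    low: float            # lower control limit
    high: float           # upper control limit
    reaction_plan: str    # the defined action if the step is "out of control"

    def in_control(self, measurement: float) -> bool:
        return self.low <= measurement <= self.high

# Hypothetical plan entries for illustration.
plan = [
    ControlPlanEntry("heat treat", "furnace temperature (C)", 840.0, 860.0,
                     "quarantine batch; recalibrate furnace"),
    ControlPlanEntry("final grind", "shaft diameter (mm)", 24.98, 25.02,
                     "stop line; notify quality engineer"),
]

# Check the latest reading for each step against its control limits.
readings = {"heat treat": 852.0, "final grind": 25.05}

for entry in plan:
    if not entry.in_control(readings[entry.process_step]):
        print(f"{entry.process_step} out of control -> {entry.reaction_plan}")
```

Keeping the reaction plan alongside the limits is the point of the tool: when a reading drifts out of control, the response is already defined rather than improvised.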
Relying exclusively on after-the-fact product inspection as a means of quality control has two significant but related drawbacks. It is fundamentally reactive, occurring only after defects have resulted in sub-standard items, and that reactivity makes the approach expensive at scale. Statistical process control (SPC) is an attempt to overcome these limitations. SPC seeks to identify variations within processes themselves, before they result in the manufacture of many non-conforming items. To accomplish this, SPC software integrates process data, establishes conformance metrics, and utilizes alerts and monitors to trigger the remediation of “out of control” results. Graphical control charts are an important part of SPC monitoring and analysis capabilities.
For more info: There are many SPC products on the market. Cebos offers SPC functionality in their quality management program, MQ1.
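The core calculation behind an SPC control chart can be shown in a short sketch. This version establishes a center line and ±3-standard-deviation control limits from baseline measurements, then flags new readings that fall outside them; the measurements are made up, and real SPC software typically uses more refined limit estimates (e.g., moving ranges) and additional run rules.

```python
import statistics

# Baseline measurements from a period when the process was known to be stable.
baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.1, 10.0, 9.9, 10.1, 10.0]

mean = statistics.mean(baseline)      # center line
sd = statistics.stdev(baseline)       # process standard deviation estimate
ucl = mean + 3 * sd                   # upper control limit
lcl = mean - 3 * sd                   # lower control limit

# New measurements are monitored against the established limits.
new_readings = [10.0, 10.1, 11.2, 9.9]
alerts = [(i, x) for i, x in enumerate(new_readings) if not (lcl <= x <= ucl)]

for i, x in alerts:
    print(f"reading {i}: {x} is out of control (limits {lcl:.2f}-{ucl:.2f})")
```

Note that the limits come from the stable baseline, not from the data being monitored; that separation is what lets a single drifting reading stand out instead of inflating the limits around itself.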
Quality managers like Mike Goeden are continually making decisions about the prioritization of risks in process optimization work. One of the tools that Mike indicated he uses to “gain visibility on how to reduce and eliminate variation in the process” is the Pareto chart.
The power of a Pareto chart lies in its ability to show at a glance which factor among a group is most significant. The trick of a Pareto chart is that it uses two y-axes. The first plots a standard bar graph, with the largest factor placed on the left and each factor thereafter graphed in successively diminishing size moving to the right. The second plots a cumulative percentage line, communicating how much of the total each successive factor accounts for.
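The arithmetic behind the chart is simple to sketch: sort the categories descending, then track a running cumulative percentage. The defect categories and counts below are hypothetical, chosen only to illustrate the typical “vital few” pattern.

```python
# Hypothetical defect counts by category.
defect_counts = {
    "scratches": 52,
    "misalignment": 31,
    "porosity": 10,
    "discoloration": 5,
    "other": 2,
}

# Sort categories from largest to smallest (the left-to-right bar order).
ordered = sorted(defect_counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(defect_counts.values())

# The cumulative percentage is what the second y-axis plots.
cumulative = 0
for category, count in ordered:
    cumulative += count
    pct = 100 * cumulative / total
    print(f"{category}: {count} ({pct:.0f}% cumulative)")
```

In this made-up data the first two categories account for 83% of all defects, which is exactly the kind of prioritization signal the chart is designed to surface.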
Equipment and preventative maintenance software is often viewed through the lens of asset management, so finding it mentioned in a list of top quality tools may be surprising. But its inclusion makes sense.
Equipment performance can affect quality outcomes in both subtle and substantial ways. For example, deteriorating machine performance can require frequent calibrations that drive up costs incrementally over time. At the other extreme, a full breakdown can immediately jeopardize QM metrics like delivery rates, time to fill, and what Mike considered the single most important quality management measurement of all: customer satisfaction.
For more info: To check out the SAP Enterprise Asset Management system–which includes preventative maintenance capabilities–visit SAP.com. For more on preventative maintenance systems in general, check out our software guide on Computerized Maintenance Management Software.
My thanks to Mike Goeden of Rexnord for sharing his time and his insights on quality management tools. To find out more about Rexnord, visit their website here. For a further discussion of quality management software, including a list of popular commercial options, don’t miss our software guide on the topic.