An interview with Milosz Majta of Apriso
The idea that we can separate out a process, task, or problem and fix it up with a singular approach is tantalizing. It even works, in some cases and to some degree.
Take the case of manufacturing operations software. For years, manufacturing software developers offered a bevy of different applications, each neatly organized and designed to address a discrete area such as maintenance management, quality management, or inventory control.
Applications focused on a single function offer a major improvement over manual approaches in terms of efficiency, ease of use, and analytical opportunity. But they are also limited by one inconvenient reality: the lines between functional areas in real manufacturing environments are rarely as cleanly drawn as the application names suggest.
In a live production environment, processing an order means understanding which equipment is operational and available, consulting a bill of materials, and monitoring the resulting product to ensure quality objectives are met. The point is that the tasks in the manufacturing process are fluidly interconnected.
Manufacturing software developers are increasingly taming the complexity of managing interconnected functions with comprehensive, cross-functional products that leverage a common platform to help manufacturers meet overall quality and efficiency goals.
One of the developers offering a set of manufacturing operations management tools integrated across a unified technology platform is Apriso. Recently, I connected with Product Manager of Quality & Maintenance, Milosz Majta, to understand more about integrated manufacturing software in general and specifically to find out about the opportunities offered by tying together computerized maintenance management software (CMMS) and quality management software (QMS) functionality.
Milosz: A massive “digitization” is occurring across manufacturing, which began with the product design departments, as evidenced by the profound advances now possible within CAD, CAM and product lifecycle management applications. This growing digital world of design is now making its way down to the shop floor - bringing with it a host of new operations management systems to convert all these digital designs into real products. The recent price drop of 3D printers is just one example of how production processes are being significantly impacted by this transformation.
Milosz: Given all the investment in new applications and IT systems to better integrate shop floor operations with product design, it makes sense that each of the other activities done on the shop floor, such as inventory management, quality monitoring and improvement, and equipment maintenance, must now keep up, or else those vendor applications will soon go the way of the abacus. Part of this transition to a digital world means interconnectivity is a given. In a digital world, no machine, application or employee can reasonably function in a “silo”. Instead, they must work seamlessly together, so as to enable adaptive operations that can change quickly to meet new opportunities, or quickly stop out-of-spec production to minimize the potential cost of recalls or poor quality. An example here would be an SPC alert automatically triggering a calibration order based on the process trending out-of-spec.
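The SPC-to-calibration scenario Milosz describes can be sketched in a few lines. The sketch below is illustrative only, not Apriso's implementation: it applies a classic SPC run rule (seven consecutive points on one side of the center line) and opens a calibration order when the rule fires. The names `CalibrationOrder` and `on_new_measurement` are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class CalibrationOrder:
    # Hypothetical stand-in for a maintenance-order record in a CMMS.
    equipment_id: str
    reason: str

def run_rule_violation(samples, center, window=7):
    """Classic SPC run rule: `window` consecutive points on one side of
    the center line suggest the process is drifting, even before any
    single point crosses a control limit."""
    if len(samples) < window:
        return False
    recent = samples[-window:]
    return all(x > center for x in recent) or all(x < center for x in recent)

def on_new_measurement(equipment_id, samples, center, orders):
    """Evaluate the latest measurements and open a calibration order
    automatically when the run rule fires."""
    if run_rule_violation(samples, center):
        orders.append(CalibrationOrder(
            equipment_id,
            reason="SPC run rule: 7 consecutive points on one side of center line",
        ))
    return orders
```

In a real deployment the rule set would be richer (Western Electric rules, control limits derived from process capability studies) and the order would be routed through the CMMS workflow rather than appended to a list, but the trigger logic has this shape.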
Example: A performance dashboard featuring preventive maintenance orders displays the connection points between quality and equipment intelligence.
Milosz: As they say, you can’t measure what you can’t track. ERP plays an important role in being the system of record. This means that every transaction must be recorded and preserved in this system to ensure corporate records are reflected accurately and with appropriate transparency to meet audit, regulatory and investor requirements. Shop floor systems, however, are completely different in that they must operate with the highest speed and effectiveness in an environment that is often running 24/7.
Given this level of stress on these IT systems, an appropriate architecture must be deployed - one that simultaneously ensures tight integration with operations processes, the ability to quickly change those processes, and the ability to effectively extract reporting data from those activities, providing the management team with intelligence for continuous process improvement.
Batch-style data uploading is an effective approach to keeping the systems of record aligned with the systems of operations. This alignment is best managed when each of your operations systems is seamlessly orchestrated - such as being run from a single platform or foundation. This type of layout implies a common data model whereby records and programming logic are shared across functions - as well as different locations - such that all of your manufacturing operations can run seamlessly, as if operating on a single, global plant floor.
Milosz: We always want to be proactive versus reactive. No one wants to find out that a machine required lubrication after it stopped. In fact, many maintenance organizations are measured by the ratio between reactive and proactive maintenance. Likewise, you don’t want to find out about a quality issue after you have already shipped 100,000 units to your most important customer!
How do you become more proactive? Well, often that is accomplished with intelligence that can be captured while a machine is running. Operators have enough knowledge about their machines to know if a particular performance specification starts to trend outside its normal operating range. Such activity could be indicative of a future issue. What if this intelligence could be instantly captured, processed and then used to alert these operators whenever any such “indicative” behavior begins? That could go a long way towards increasing equipment uptime, or improving quality. This type of scenario is now possible. And the more systems and pieces of equipment are connected together, the more powerful and beneficial such advance notifications become.
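One simple way to operationalize "alert when a specification trends out of its normal operating range" is to derive a normal band from historical readings and flag live readings that fall outside it. This is a minimal sketch of that idea, assuming a mean ± k·σ band; real systems would use validated control limits and richer trend rules.

```python
from statistics import mean, stdev

def baseline_limits(history, k=3.0):
    """Derive a normal operating band from historical readings:
    mean +/- k standard deviations (k=3 by convention)."""
    mu, sigma = mean(history), stdev(history)
    return mu - k * sigma, mu + k * sigma

def check_reading(value, limits):
    """Return an operator alert string for an out-of-band reading,
    or None when the reading is within the normal band."""
    lo, hi = limits
    if value < lo or value > hi:
        return f"ALERT: reading {value} outside normal band [{lo:.2f}, {hi:.2f}]"
    return None
```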
Milosz: There isn’t really one answer to this question. Predictive intelligence is based on building the right models from historical data and predicting future performance, so the more comprehensive and relevant the data you use to build the model, the more reliable the model will be. It also depends on what you are making and what process is involved. But what I can tell you is that whatever your process is, you can always get better. And, after a few years of honing your craft, understanding the intelligence that is available and actively using it for performance improvement, you can then turn this knowledge into a significant competitive advantage - an advantage that cannot be easily copied or replicated without going through the same learning curve you went through. In the end, your competitors might never catch up, provided you continue to move forward with your performance improvement journey!
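To make "building models from historical data to predict future performance" concrete, here is the simplest possible version of the idea: fit a trend line to historical wear readings and extrapolate when the trend will cross a failure threshold. This is purely illustrative; production predictive-maintenance models typically use far richer features and methods.

```python
def fit_trend(xs, ys):
    """Least-squares line through historical (time, wear) points.
    Returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

def predict_crossing(slope, intercept, threshold):
    """Estimate the time at which the fitted trend crosses a failure
    threshold - i.e., when to schedule maintenance proactively."""
    if slope <= 0:
        return None  # not degrading toward the threshold
    return (threshold - intercept) / slope
```

The point Milosz makes still holds: the model is only as good as the breadth and relevance of the historical data behind it, which is why the accumulated operational data itself becomes the hard-to-copy asset.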
For more insights from Milosz, make sure to check out the Manufacturing Transformation blog, where he is a frequent contributor. If you are actively considering purchasing manufacturing software, or just looking to better understand some of the opportunities available, feel free to explore our manufacturing platforms, CMMS, and quality management software guides.