What Production Data is Necessary to Drive Your Industry 4.0 Agenda?

  • April 25, 2017
  • Feature

By Mathew Daniel, Vice President of Operations, Sciemetric Instruments

Data, data and more data. It’s the hot topic in manufacturing today, with all the hype and anxiety around Industry 4.0 and the Industrial Internet of Things. How should it be collected, managed and used effectively?

Even more fundamental, what kind of data matters in the first place? Not all data is the same. Let’s examine what production data is necessary to achieve new benchmarks for quality, yield and cost efficiency in the modern smart factory.


Data-driven decision making that just isn’t granular enough

Data collection in the industry began decades ago with Manufacturing Resource Planning (MRP) systems for scheduling materials, parts and people. These systems evolved into Enterprise Resource Planning (ERP) systems that added sales and finance to the equation. Then came Manufacturing Execution Systems (MES), which sparked a more comprehensive focus on manufacturing processes and efficiencies by capturing data on how machines and people affect product quality and throughput.

A typical MES tracks and documents the transformation of raw materials into finished goods, with the goal of understanding how current conditions on the plant floor can be optimized to improve production output.

MES systems typically gather pass/fail and/or scalar data. Scalar data consists of isolated data points from each process or test on a production line, such as max torque or final spindle angle. Some stations may be just binary pass/fail propositions, indicated by a green light for pass and a red light for fail.
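A scalar limit check of this kind can be sketched in a few lines. The station names, limits and values below are illustrative, not taken from any specific MES:

```python
def scalar_check(value, low, high):
    """Pass/fail decision from a single isolated data point."""
    return low <= value <= high

# Hypothetical examples: max torque (Nm) and final spindle angle (degrees)
results = {
    "max_torque": scalar_check(2.05, 1.8, 2.2),       # in spec -> green light
    "final_angle": scalar_check(372.0, 350.0, 370.0), # out of spec -> red light
}
print(results)
```

Note that each decision rests on one number per station; everything that happened between the start and end of the cycle is invisible to this check.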

Just as ERP and MES significantly improved data collection, analysis and decision-making, the fourth industrial revolution known as Industry 4.0 takes data collection for manufacturing to a whole new level. This is not to suggest that statistical process control (SPC) and scalar data are obsolete. They remain important tools for monitoring the health of a production line and spotting issues as they arise. Industry 4.0 adds the capability to capture complete sensor signature data in production real time (within one cycle) to quickly trace the root cause of an issue.

The falling cost of sensors and data acquisition systems, network topology and throughput, and multi-terabyte class storage make collecting the full sensor waveform, or digital process signature, of each cycle from a line’s process and test stations very feasible. Powerful and advanced off-the-shelf analytical tools make this data easy to analyze together with all your other data sets – scalar, machine vision image files and so forth – in one central database.


The practical difference between scalar and signature data

With scalar data alone, a part’s pass or fail is based on only a few isolated data points captured during the process. This leaves sizeable gaps in time between the start and end of a process. If an anomaly occurs during one of these gaps, the resulting product defects can go undetected. Faulty units can make their way downstream and fail end-of-line tests or, worse still, fail in the hands of a consumer. Productivity (and profitability) losses can mount as first-time yields drop and the cost of scrap and rework climbs.

For example, during a sealant dispensing operation, air bubbles in the line may have interrupted the evenness of the bead. The dispense passes machine vision inspection, but later down the line, after this part has been matched with its mate, the assembly fails its leak test due to a flaw in the seal. Those bubbles are all too easy to miss if only scalar data is being collected. There is no way to go back and trace the root cause because engineers lack the “video instant replay” that will reveal what happened through every millisecond of dispense, assembly and test.

Signature analysis provides that replay, capturing hundreds of thousands of data points instead of just a few. It’s like a full video of the event, versus isolated snapshots.

If we take that example of the dispense operation again, signature analysis allows the operator to spot the “blow-out” that may occur after the dispense. Trapped air within a material that is applied under high pressure, and that is typically not compressible, will rapidly expand when exposed to the plant’s ambient air pressure. By sampling, for example, the dispense pressure for the entire cycle, the flaw is caught on the spot.
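The contrast can be illustrated with a short sketch. The pressure trace and limits below are synthetic, chosen only to show how a full-cycle check catches a dip that a single end-of-cycle scalar reading misses:

```python
# Dispense pressure (bar) sampled over one full cycle; the dip at index 3
# represents a trapped-air "blow-out" mid-dispense (illustrative values).
dispense_pressure = [4.0, 4.1, 4.1, 2.3, 4.0, 4.1]

LOW, HIGH = 3.8, 4.4

# Scalar check: only the final reading is inspected -> passes
scalar_pass = LOW <= dispense_pressure[-1] <= HIGH

# Signature check: every sample over the cycle must stay in limits -> fails
signature_pass = all(LOW <= p <= HIGH for p in dispense_pressure)

print(scalar_pass, signature_pass)  # True False
```

The same part passes the scalar test and fails the signature test, which is exactly the gap described above.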

This makes for much more precise pass/fail decisions in real time on the line, but leveraging data for Industry 4.0 requires more.

A Process Signature Example: In this particular process, two parts failed because their scalar values (peak torque) exceeded the upper specification limit of 2.2 Nm. With the waveform data, we also notice that the process signature for one part did not follow the same pattern as the batch of “normal” parts, even though it passed. A torque value that stays high could indicate contamination, misalignment or an oversized part, and would be worth investigating, especially if it ends up as a warranty issue. A “peak-to-peak” or “envelope” method may have been a better way to check for proper assembly.
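A minimal sketch of an envelope check follows, using synthetic data: upper and lower limit curves are built from a batch of known-good signatures, and a candidate waveform must stay between them at every sample. The margin and traces are assumptions for illustration:

```python
# Signatures (e.g. torque in Nm over the press cycle) from known-good parts.
good_parts = [
    [0.2, 0.8, 1.6, 2.0, 1.9],
    [0.3, 0.9, 1.7, 2.1, 2.0],
    [0.2, 0.7, 1.5, 2.0, 1.9],
]
MARGIN = 0.2  # illustrative allowance around the good-part band

# Build the envelope: per-sample min/max across the good batch, plus margin.
upper = [max(col) + MARGIN for col in zip(*good_parts)]
lower = [min(col) - MARGIN for col in zip(*good_parts)]

def envelope_check(signature):
    """Pass only if every sample stays inside the envelope."""
    return all(lo <= s <= hi for s, lo, hi in zip(signature, lower, upper))

# This part's peak (2.1 Nm) is under the 2.2 Nm scalar limit, so a peak
# check passes it -- but its shape rises too early and fails the envelope.
suspect = [0.2, 1.4, 2.0, 2.1, 2.0]
print(envelope_check(suspect))  # False
```

Unlike the single peak-value check, the envelope flags the abnormal shape even though every scalar summary of the part looks acceptable.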

Powering Industry 4.0 with advanced data analysis

If all this data from up and down the line – waveform signatures, scalar data, machine vision images – is collected into a single database, it can be searched, correlated and visualized with algorithms. Quality engineers can then find trends and patterns that reveal the “how” and “why” of decreases in yield. This applies to any controlled process – from press fitting and leak test to rundown, crimping, weld and dispensing.

A part failure can now easily be distinguished from a test malfunction. The quality team can spot anomalies that require further investigation, pinpoint where problems occur during a process, and optimize test stations by understanding how to shorten the test.
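The cross-referencing this enables can be sketched simply. In practice this would be a database join across stations keyed by serial number; here plain dictionaries with invented serial numbers and values stand in:

```python
# End-of-line scalar results, keyed by serial number (illustrative data).
scalar_results = {
    "SN-1001": {"station": "leak_test", "passed": False},
    "SN-1002": {"station": "leak_test", "passed": True},
}

# Upstream dispense-pressure signatures for the same parts.
dispense_signatures = {
    "SN-1001": [4.0, 4.1, 2.3, 4.0],  # mid-cycle dip: likely root cause
    "SN-1002": [4.0, 4.1, 4.1, 4.0],
}

# For every leak-test failure, pull up the upstream signature by serial number.
failed = [sn for sn, r in scalar_results.items() if not r["passed"]]
for sn in failed:
    print(sn, "upstream dispense trace:", dispense_signatures[sn])
```

Because every record carries the same serial-number key, a downstream failure can be traced back to the upstream cycle where the anomaly actually occurred.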

Best of all, you have complete traceability right down to specific parts and their serial numbers. A few dozen defective units can be tracked down without having to scrap, rework or recall thousands. The root cause of an issue can be pinpointed in minutes or hours, rather than days or weeks.

This gives a factory the capability to create a standardized global platform for quality and innovation. It’s now easy to determine if existing tooling can meet new needs by comparing the data. New control limits can be verified and easily adjusted to support new models. Process signatures from a new production line can be matched against existing ones to give a strong indication of conformance. Data from multiple plants can be compared to see trends in tooling problems. Innovations in one plant can be reliably applied to other plants, multiplying the yield gains.


Considerations for creating this architecture

For Industry 4.0, the goal is to eliminate data layers or silos through digitization and connectivity.

You may have heard the terms “Digital Thread” or “Digital Twin.” These concepts were introduced by the U.S. Air Force in 2013 as game changers in manufacturing. The former provides a framework to capture data and eliminate data silos from product design through manufacturing, service and even post-service. The latter promotes the ability to define specifications and simulate product performance through the “as-built” manufacturing phase and through the product’s life in service.

Having this level of digitization with feedback loops allows for specifications to be rapidly adjusted to minimize potentially costly product quality issues, lower costs by reducing unnecessarily tight product specifications, improve customer perceptions and loyalty, and rapidly introduce product enhancements.

But it all relies on the completeness and reliability of your data. Here are some best practices to consider:

First, have confidence in the data: The data must be trusted. There is nothing worse than sitting in a data review meeting where the discussion is about the data and not the product. Build confidence by making sure the process generating the data is accurately and reliably pushing data out. Baked into this strategy should be an understanding of the network topology (to manage inevitable failures at various points in the network path), methods for caching and throughput management, and disaster recovery and redundancy.

Analysis within “Production Real Time”: There is nothing as important as getting your data into your analytics data repository in production real time, which we define as within one cycle of when the process cycle ended. This allows dashboards and smart alerting systems to quickly notify staff if processes are drifting out of control.
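A drift alert of this kind can be sketched as a per-cycle callback. The rolling window size, target and tolerance below are assumptions, not values from any particular line:

```python
from collections import deque

# Rolling window of the last 10 cycles' peak torque (illustrative scalars).
window = deque(maxlen=10)
TARGET, TOLERANCE = 2.0, 0.15  # Nm; assumed control-limit parameters

def on_cycle_complete(peak_torque):
    """Called within one cycle of cycle end; returns True if an alert fires."""
    window.append(peak_torque)
    mean = sum(window) / len(window)
    if abs(mean - TARGET) > TOLERANCE:
        print(f"ALERT: rolling mean {mean:.2f} Nm drifting out of control")
        return True
    return False
```

Because the check runs as each cycle's data lands in the repository, an operator is alerted while the drifting process is still making parts, not after a batch has already gone downstream.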

Standardize the data model: Having a standard data model for your plant floor systems greatly simplifies implementation and helps assure project success by leveraging standard off-the-shelf reporting. For instance, if your data model standard requires traceability to a specific part by serial number (highly recommended), this will ensure all tooling suppliers program their logic controllers accordingly, or have the appropriate barcode readers/scanners or part-marking systems in place.
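One possible shape for such a standardized record is sketched below. The field names are illustrative, not a published standard; the point is that every station emits the same structure, keyed by serial number:

```python
from dataclasses import dataclass, field

@dataclass
class ProcessRecord:
    """One cycle's worth of data from any station, in a common shape."""
    serial_number: str   # traceability key shared by every data set
    station_id: str
    cycle_end_utc: str   # ISO 8601 timestamp of cycle completion
    passed: bool
    scalars: dict = field(default_factory=dict)    # e.g. {"max_torque": 2.05}
    signature: list = field(default_factory=list)  # full waveform samples

# Hypothetical record from a press station.
rec = ProcessRecord("SN-1001", "press_03", "2017-04-25T14:02:11Z",
                    True, {"max_torque": 2.05}, [0.2, 0.8, 1.6, 2.0])
```

With one record shape across all stations and suppliers, off-the-shelf reporting tools can query any process by serial number without per-station translation logic.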


In summary

“Industry 4.0”, “big data” and “data analytics” are not futuristic “hope to achieve some day” concepts. They are redefining the competitive landscape of global manufacturing today.

Manufacturers can chart a reasonable upgrade path using proven data collection and analysis tools that rely on collecting the digital process signature and correlating this data set with scalar, vision and other forms of data. For relatively modest investments, they can realize substantial gains in quality, efficiency and profitability.

It’s time to elevate the conversation about how to make the most of your data.


About the Author

Mathew Daniel is Vice President of Operations at Sciemetric Instruments, where he manages service and installation, product development, and manufacturing and quality. Mat oversees many of the manufacturing data and analytics implementations provided to large manufacturers, helping them to organize and maximize a return from their production data.
