Advanced Reporting Strategies for Layered Process Audit Data


October 29, 2019


Organizations just getting started with layered process audits typically focus on basic metrics like audit completion rate and overdue corrective actions. As they progress along their digital transformation journey, however, they can extract more value from these specialized audits by asking more complex questions of the data.

Executed correctly, layered process audits (LPAs) take place every shift, generating hundreds of data points daily. Multiply that out over the course of a year, and it’s clear that manufacturers have some pretty robust data sets at their fingertips.

Managing these vast quantities of data can be difficult when relying on spreadsheets and other manual data analytics tools. An automated LPA platform integrated with business intelligence tools like Tableau and Power BI makes it much simpler (even if many analyses still require having an analyst or data scientist on staff).

With that in mind, let’s look at some examples of how manufacturers can leverage deeper insights from the volume of data they collect daily.

>> Find out how your LPA program compares to industry benchmarks with our free 2019 State of LPA Report

Data Segmentation

One common advanced reporting technique you can apply to your layered process audit data is segmentation. By slicing up your layered process audit data in different ways, you can get a fresh perspective on problems and how to address them.

For example, you might analyze corrective actions by the percentage closed on time, segmenting them according to how long employees have been conducting LPAs.

The resulting data can provide insight into where bottlenecks in your corrective action process are occurring, where your greatest risks are and which plants might need additional support. In this kind of analysis, you might find that employees with the most experience have lower on-time completion rates, which could (among other things) reflect low buy-in from more experienced workers.
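As a sketch of this kind of segmentation, the snippet below groups corrective-action records by how long the auditors have been conducting LPAs and computes an on-time closure rate per experience tier. The records, tier names and values are all invented for illustration; a real analysis would pull these from your LPA platform.

```python
from collections import defaultdict

# Hypothetical corrective-action records: (auditor experience tier, closed on time?)
# These values are illustrative only.
records = [
    ("<1 year", True), ("<1 year", True), ("<1 year", False),
    ("1-3 years", True), ("1-3 years", False), ("1-3 years", True),
    ("3+ years", False), ("3+ years", False), ("3+ years", True),
]

totals = defaultdict(lambda: [0, 0])  # tier -> [closed on time, total]
for tier, on_time in records:
    totals[tier][1] += 1
    if on_time:
        totals[tier][0] += 1

for tier, (closed, total) in totals.items():
    print(f"{tier}: {closed / total:.0%} of corrective actions closed on time")
```

The same grouping logic extends to any segment: plant, region, shift or management layer.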

Organizations can use data segmentation on a wide variety of LPA data, such as:

  • Analyzing ranges of nonconformance rates to identify priority regions or groups of plants needing help.
  • Identifying clusters of plants that have both high audit pass rates and a high rate of defects.
  • Correlating LPA metrics with other key quality targets such as scrap, defects and complaints to develop leading metrics and best practices.

Statistical Analysis

Statistical analysis is another area where organizations can dive deeper with their LPA data in order to drive actionable insights. Statistical functions you might work with include:

  • Mean: Looking at averages in areas like audit duration, completion rate, pass rate and nonconformance rate can tell you about LPA effectiveness in different plants. You can also examine whether one mean is significantly different than the group’s, such as whether a plant has a significantly better audit completion rate and might have best practices to share with the organization.
  • Standard deviation: Analyzing the standard deviation, or how far spread out the data is around the mean, can provide critical insight into how well you’ve been able to standardize LPAs in your organization. It can also tell you which outliers might need additional coaching.
  • Chi-square tests: In basic terms, a chi-square (χ²) test tells you whether the difference between expected and observed frequencies of categorical data is statistically significant (as opposed to just due to chance). With LPA data, you might look at whether a spike in a particular nonconformance type is significant or just natural variation.
  • R-squared and regression analysis: R² is a statistic that describes how much of the variation in one numerical variable is explained by another. A higher R² means stronger predictive power, helping you identify variables you might use to develop leading metrics. For instance, you might find that audit completion rate is a strong predictor of scrap rates.
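As a minimal example of the chi-square idea above, the sketch below compares one month's observed nonconformance counts against the counts expected from a historical baseline, using only the Python standard library. The counts and categories are invented; the 7.81 critical value is the standard threshold for alpha = 0.05 with 3 degrees of freedom.

```python
# Hypothetical nonconformance counts by type for one month vs. the counts
# expected from the historical baseline (illustrative numbers only).
observed = [24, 8, 6, 6]     # e.g. missing PPE, wrong torque, label error, other
expected = [14, 10, 10, 10]

# Chi-square goodness-of-fit statistic: sum of (observed - expected)^2 / expected
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Critical value for alpha = 0.05 with 3 degrees of freedom is about 7.81.
print(f"chi2 = {chi2:.2f}, significant at the 5% level: {chi2 > 7.81}")
```

For simple linear regression, R² is just the square of the Pearson correlation coefficient, so the same standard-library tools cover that case as well.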

Predictive Analysis

Predictive analysis is one area where business intelligence tools provide a big advantage, letting you drag and drop a trend line onto any set of time-series data. Teams can then ask what a number will look like at a given point in the future (say, three or six months out), helping them prioritize where to act now.

Time-series analysis is also a critical part of developing leading metrics, helping predict where nonconformances might happen so you can step in before problems occur. For instance, if a drop in audit completion rate tends to precede a decrease in yield, you can monitor audit completion rate and intervene as soon as it starts to slip.
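The trend-line idea can also be reproduced outside a BI tool with an ordinary least-squares fit. This sketch fits a line to six months of audit completion rates and projects it three and six months ahead; all numbers are invented for illustration.

```python
# Hypothetical monthly audit completion rates (illustrative values only)
months = [1, 2, 3, 4, 5, 6]
completion = [0.96, 0.94, 0.93, 0.90, 0.89, 0.87]

n = len(months)
mean_x = sum(months) / n
mean_y = sum(completion) / n

# Ordinary least-squares slope and intercept for the trend line
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(months, completion))
         / sum((x - mean_x) ** 2 for x in months))
intercept = mean_y - slope * mean_x

for future in (9, 12):  # three and six months past the last data point
    projected = slope * future + intercept
    print(f"month {future}: projected completion rate {projected:.1%}")
```

A projection like this flags a downward drift early, so teams can intervene before completion rates, and the yield metrics they lead, deteriorate further.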

Predicting where problems are likely to occur can also help organizations develop more effective poka-yoke or error-proofing processes. And when you can keep human error from taking place, that’s when your LPA data become especially powerful at improving quality.

Data segmentation, statistical analysis and time-series analysis are just three of many reporting techniques you can apply to LPA data to unlock deeper quality insights. Integrating LPA data with business intelligence tools makes these techniques accessible with minimal in-house analytics staff, so organizations of all sizes can get more from the data they’re already collecting.