Data, and all that jazz

As we all try to figure out just exactly what Industry 4.0 and Quality 4.0 mean, it is not an exaggeration to say “Data is your most valuable asset.” Yet we all struggle to actually get a benefit from this data, and data integrity is an area of intense regulatory concern.

To truly have value our data needs to be properly defined, relevant to the tasks at hand, structured such that it is easy to find and understand, and of high-enough quality that it can be trusted. Without that we just have noise.
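Those four properties can be checked mechanically. Here is a minimal sketch of "fit for use" checks against a hypothetical batch-record structure; the field names and limits are illustrative only, not any particular system's schema:

```python
# A minimal sketch of "fit for use" data checks. Field names and the
# pH range are invented for illustration.

def check_record(record, required=("batch_id", "ph", "tested_on")):
    """Return a list of data-quality issues found in one record."""
    issues = []
    # Properly defined: every expected field must be present and populated.
    for field in required:
        if field not in record or record[field] in (None, ""):
            issues.append(f"missing field: {field}")
    # Trustworthy: values must fall in a plausible range.
    ph = record.get("ph")
    if isinstance(ph, (int, float)) and not (0 <= ph <= 14):
        issues.append(f"implausible pH: {ph}")
    return issues

records = [
    {"batch_id": "B001", "ph": 7.1, "tested_on": "2023-04-01"},
    {"batch_id": "B002", "ph": 71.0, "tested_on": "2023-04-02"},  # transcription slip
    {"batch_id": "B003", "ph": 6.9, "tested_on": ""},
]
for r in records:
    print(r["batch_id"], check_record(r))
```

Rules this simple won't catch everything, but without even this baseline the records are, as above, just noise.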

Apply principles of good master data management and data integrity. Ensure systems are appropriately built and maintained.

Understand why data matters, how to pick the right metrics, and how to ask the right questions of data. Understanding correlation vs. causation, so we can decide when to act on an analysis and when not to, is critical.
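A toy illustration (my own, not from any source cited here) of why correlation alone can't justify action: a hidden confounder can drive two metrics that have no causal link to each other.

```python
# Two variables driven by the same hidden factor ("season") correlate
# strongly even though neither causes the other. All data is simulated.
import random

random.seed(42)

season = [random.gauss(0, 1) for _ in range(1000)]       # hidden confounder
ice_cream = [s + random.gauss(0, 0.3) for s in season]   # driven by season
sunburn = [s + random.gauss(0, 0.3) for s in season]     # also driven by season

def corr(xs, ys):
    """Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(f"correlation: {corr(ice_cream, sunburn):.2f}")  # strong, yet not causal
```

Acting on the correlation (banning ice cream to prevent sunburn) would accomplish nothing; only knowing the mechanism tells you where to intervene.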

In the 2013 article “Keep Up with Your Quants,” Thomas Davenport lists six questions that should be asked to evaluate conclusions obtained from data:

1. What was the source of your data?

2. How well do the sample data represent the population?

3. Does your data distribution include outliers? How did they affect the results?

4. What assumptions are behind your analysis? Might certain conditions render your assumptions and your model invalid?

5. Why did you decide on that particular analytical approach? What alternatives did you consider?

6. How likely is it that the independent variables are actually causing the changes in the dependent variable? Might other analyses establish causality more clearly?

Framing data, being able to ask the right questions, is critical to being able to use that data and make decisions. In the past it was adequate for a quality professional to have a familiarity with a few basic tools. Today it is critical to understand basic statistics. As Nate Silver advises in an interview with HBR: “The best training is almost always going to be hands on training,” he says. “Getting your hands dirty with the data set is, I think, far and away better than spending too much time doing reading and so forth.”

Understanding data is a key ability and is necessary to thrive. It is time to truly contemplate the data ecosystem as a system and stop treating it as a specialized area of the organization.

Master and Transactional Data Management

Mylan’s 483 observation states that changes were being made to a LIMS outside of the site’s change control process.

This should obviously be read in light of data integrity requirements. And it looks like in this case there was no way to produce a list of changes, which is a big audit trail no-no.
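The missing capability is worth making concrete. A minimal sketch of what an audit trail needs to capture so that a list of changes can be produced on demand; the record structure is illustrative, not any vendor's schema:

```python
# Every change records who/what/when/why, so a complete "list of changes"
# can be produced for an inspector. Structure is illustrative only.
from datetime import datetime, timezone

audit_trail = []

def change_setting(settings, key, new_value, who, reason):
    """Apply a change and append an audit-trail entry for it."""
    old_value = settings.get(key)
    settings[key] = new_value
    audit_trail.append({
        "when": datetime.now(timezone.utc).isoformat(),
        "who": who,
        "what": key,
        "old": old_value,
        "new": new_value,
        "why": reason,
    })

lims = {"spec_limit": 100}
change_setting(lims, "spec_limit", 105, "jdoe", "CC-1234 approved widening")

# The query an inspector actually asks for:
for entry in audit_trail:
    print(entry["who"], "changed", entry["what"], entry["old"], "->", entry["new"])
```

The point isn't the code, it's the invariant: no path exists that mutates the setting without writing the entry. Changes made outside that path, as in the observation above, break the invariant.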

It’s also an area where I’ve seen a lot of folks make missteps, and frankly, I’m not sure I’ve always gotten it right.

There is a real tendency to look at the use of our enterprise systems and want all actions and approvals to happen within the system. This makes sense: we want to reduce our touch points. But there are some important items to consider before moving ahead with that approach.

Change control is about assessing, handling and releasing the change, most importantly in light of its validation and regulatory impact. It serves disposition. As such, it is a good thing to streamline our changes into one system, to ensure every change gets assessed equally, gets the right level of handling it needs, and has a proper release.

Allowing a computer system to balkanize your changes, in the end, doesn’t really simplify anything. And in this day of master data management, of heavily aligned and interconnected systems, being nimble requires us to know with a high degree of certainty that when we apply a change we are applying it thoroughly.

The day of separated computer systems is long over. It is important that our change management system takes that into account and offers one-stop shopping.

Interoperability of Data

In “Industry 4.0 – Inspiration for the Pharmaceutical Industry?” on the GMP Logfile, Thomas Peither summarizes discussions from the ISPE European Annual Meeting 2018.

There is a lot to unpack in the concept of Industry 4.0. The topic on my mind is data interoperability.

“We identified a struggle with the lack of harmonised and consistent information flows in the industry,” said Volker Roeder.

I think this is a challenge that a lot of us, especially those from bigger companies that grew through acquisitions and mergers, are grappling with. Decisions made 15+ years ago in the ERP (for example) now have widespread impacts as we start aligning and integrating other systems. It is difficult to align the MES, CMMS, ERP, LIMS and QMS if they all think of master data differently. It’s even harder when one branch of the business handles master data differently than another branch, but both branches are in the same system.
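A toy example of that misalignment: the same material carries different identifiers in the ERP and the LIMS, so the systems can only be joined through an explicit cross-reference. All codes below are invented for illustration.

```python
# The same material, keyed differently in two systems. The cross-reference
# that joins them becomes master data in its own right and must be governed.
# All identifiers are invented.

erp_materials = {"MAT-000123": {"desc": "Sodium Chloride, USP"}}
lims_materials = {"NACL-USP": {"spec": "SPEC-0456"}}

xref = {"MAT-000123": "NACL-USP"}  # ERP id -> LIMS id

def joined_view(erp_id):
    """Combine ERP and LIMS views of one material via the cross-reference."""
    lims_id = xref.get(erp_id)
    if lims_id is None:
        return None  # unmapped material: the integration silently breaks here
    return {**erp_materials[erp_id], **lims_materials[lims_id]}

print(joined_view("MAT-000123"))
```

Every unmapped or inconsistently mapped identifier is a place where the "heavily aligned and talking systems" quietly stop talking, which is exactly why those 15-year-old ERP decisions keep resurfacing.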

I don’t have a magic bullet. I just think this is an area that will make Industry 4.0 a great deal more challenging than many of its proponents acknowledge. And it is becoming a bigger part of my professional life.

Read the article, it is well worth the 10 minutes spent.