Global versus Local Process and Procedure and the eQMS

Companies both large and small grapple with how and when to create standard work at the global level while still having the scalability to capture different GXP activity families and product modalities.

I’ve written before about document hierarchy and about leveling process and procedure. It is really important to level your processes, and this architecture should be deliberate and shepherded.

This really gets to the heart of work-as-imagined and work-as-prescribed, and the concept of standard work.

Benefits of Standard Work

  • Ensures all work is done according to the current best practice
  • Consistency is the essential ingredient of quality
  • Allows organizations to scale rapidly
  • Puts the focus on the process and not an individual or team
  • Makes improvements easier and faster

Global versus Local Process and Procedure in the Document Hierarchy

Most Quality Hierarchies look fairly similar.

A Document Hierarchy

Excluding the Program level (which becomes even more important), we can expand the model in the process band to account for global versus local.

Global and local process within the document hierarchy

The Quality Manual and Policies remain global, with local input, and determine the overall structure of the quality management system.

A Global Process is created when a process is predominantly task- and role-driven at the global level. It is pan-GXP, pan-modality, and pan-geography. It is the standard way of working that drives consistency across and through the organization.

A Local Process is created when a process is specific to a particular GXP, product modality, or geography.

Procedures, which describe the tasks, can be created off of either a local or a global process. When the global process has localizations (a CAPA is a CAPA, but how I build action items may differ across sites), I can build local procedures off the global process.
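To make that leveling concrete, here is a minimal sketch in Python (the class names, attributes, and the site label are my own illustration, not any particular eQMS data model) of how a global process, a local process, and a procedure can relate:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Scope(Enum):
    GLOBAL = "global"   # pan-GXP, pan-modality, pan-geography
    LOCAL = "local"     # tied to a specific GXP, modality, or geography

@dataclass
class Process:
    name: str
    scope: Scope
    parent: Optional["Process"] = None   # a local process can localize a global one

@dataclass
class Procedure:
    name: str
    process: Process               # procedures hang off a global or a local process
    site: Optional[str] = None     # set when the task detail is site-specific

# One global CAPA process drives consistency...
capa_global = Process("CAPA", Scope.GLOBAL)
# ...while a (hypothetical) site builds its own action-item procedure off of it.
capa_actions_site_a = Procedure("Build CAPA Action Items", capa_global, site="Site A")
```

The point of the parent and process links is simply that every local procedure or process traces back to one global standard.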

Document and Record Management is another good example.

This approach takes real vision among leaders to drive for consistency and simplicity. It is a core component of good system design, no matter the size of the organization.

| Principle | Description | Application for Global and Local Process |
| --- | --- | --- |
| Balance | The system creates value for multiple stakeholders. While the ideal is to develop a design that maximizes the value for all key stakeholders, the designer often has to compromise and balance the needs of the various stakeholders. | The value of standard work really shines here. |
| Congruence | The degree to which the system components are aligned and consistent with each other and with the other organizational systems, culture, plans, processes, information, resource decisions, and actions. | We gain congruence by ensuring key processes sit at the global level. |
| Convenience | The system is designed to be as convenient as possible for the participants to implement (a.k.a. user friendly). The system includes specific processes, procedures, and controls only when necessary. | The discussion around global versus local will often depend on how you define convenience. |
| Coordination | System components are interconnected and harmonized with the other (internal and external) components, systems, plans, processes, information, and resource decisions toward common action or effort. This goes beyond congruence and is achieved when the individual components of a system operate as a fully interconnected unit. | How we ensure coordination across and through an organization. |
| Elegance | Complexity versus benefit: the system includes only as much complexity as is necessary to meet the stakeholders’ needs. In other words, keep the design as simple as possible, and no more, while delivering the desired benefits. It often requires looking at the system in new ways. | Keep this in mind, as global for the sake of global is not always the right decision. |
| Human | Participants in the system are able to find joy, purpose, and meaning in their work. | Never forget. |
| Learning | Knowledge management, with opportunities for reflection and learning (learning loops), is designed into the system. Reflection and learning are built in at key points to encourage single- and double-loop learning from experience, to improve future implementation, and to systematically evaluate the design of the system itself. | Building the right knowledge management into the organization is critical to leveraging this model. |
| Sustainability | The system effectively meets the near- and long-term needs of the current stakeholders without compromising the ability of future generations of stakeholders to meet their own needs. | Ensure the appropriate tools exist to sustain, including regulatory intelligence. Long-term scalability. |
Pillars of Good System Design for Global and Local Process

Utilizing the eQMS to Drive Standard Work

The ideal state when implementing (or improving) an eQMS is to establish global processes and allow system functionality to localize as appropriate.

Leveraging the eQMS

So, for example, every CAPA is the same (identify the problem and root cause, create a plan, implement the plan, prove the implementation is effective). This is a global process. However, you want specific task detail at a lower level: GMP sites may care about certain fields more than GCP does, medical devices have specific needs, and so on. These local task-level needs can be maintained within one workflow.
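As a rough sketch of what “one workflow, local task detail” could look like (the step, field, and context names are hypothetical, not drawn from any real eQMS configuration), the global steps stay fixed while the required fields vary by context:

```python
# One workflow: the global CAPA steps are identical everywhere, while the
# required fields vary by GXP context or modality.
CAPA_WORKFLOW = {
    "steps": [  # the global process
        "identify_problem_and_root_cause",
        "create_plan",
        "implement_plan",
        "verify_effectiveness",
    ],
    "local_required_fields": {  # local task-level needs kept inside the one workflow
        "GMP": ["batch_number", "equipment_id"],
        "GCP": ["study_id"],
        "medical_device": ["design_history_file_ref", "risk_file_ref"],
    },
}

def required_fields(context: str) -> list:
    """Return the extra fields a given GXP/modality context must complete."""
    return CAPA_WORKFLOW["local_required_fields"].get(context, [])

print(required_fields("GMP"))  # ['batch_number', 'equipment_id']
```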

The Key is Fit-for-Purpose and Fit-for-Use

A fit-for-purpose process meets the requirements of the organization.

A fit-for-use process is usable throughout the lifecycle.

Globalizing and localizing processes appropriately is a key part of making both happen.

Documents and the Heart of the Quality System

A month back on LinkedIn I complained about a professional society pushing the idea of a document-free quality management system. This has got to be one of my favorite pet peeves coming from Industry 4.0 proponents, and it demonstrates a fundamental failure to understand core concepts. Frankly, it is one of the reasons why many Industry/Quality/Pharma 4.0 initiatives truly fail to deliver. Unfortunately, I didn’t follow through with my idea of proposing a session at that conference, so instead here are my thoughts.

Fundamentally, documents are the lifeblood of an organization, but paper is not. This is where folks get confused, and this confusion is also limiting us.

Let’s go back to basics, which I covered in my 2018 post on document management.

When talking about documents, we really should talk about function, not just name or type. This allows us to think more broadly about our documents and how they function as that lifeblood.

There are three types of documents:

  • Functional Documents provide instructions so people can perform tasks and make decisions safely, effectively, compliantly, and consistently. This usually includes things like procedures, process instructions, protocols, methods, and specifications. Many of these need some sort of training decision. Functional documents should have a process to ensure they stay up-to-date, especially in relation to current practices and relevant standards (periodic review).
  • Records provide evidence that actions were taken and decisions were made in keeping with procedures. This includes batch manufacturing records, logbooks, and laboratory data sheets and notebooks. Records are a popular target for electronic alternatives.
  • Reports provide specific information on a particular topic in a formal, standardized way. Reports may include data summaries, findings, and actions to be taken.

The beating heart of our quality system brings us from functional documents to records to reports in a cycle of continuous improvement.
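A minimal sketch of that cycle, assuming nothing more than the three functions described above (the names are mine and purely illustrative):

```python
from enum import Enum

class DocumentFunction(Enum):
    FUNCTIONAL = "functional"  # instructions: procedures, protocols, methods, specifications
    RECORD = "record"          # evidence: batch records, logbooks, lab notebooks
    REPORT = "report"          # formal, standardized summaries of data, findings, actions

# The cycle of continuous improvement: executing functional documents produces
# records, records are summarized into reports, and reports feed revisions of
# the functional documents.
NEXT_IN_CYCLE = {
    DocumentFunction.FUNCTIONAL: DocumentFunction.RECORD,
    DocumentFunction.RECORD: DocumentFunction.REPORT,
    DocumentFunction.REPORT: DocumentFunction.FUNCTIONAL,
}
```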

Functional documents are how we realize requirements, that is, the needs and expectations of our organization. There are multiple ways to serve up functional documents, the big three being paper, paper-on-glass, and some sort of execution system. The last, an execution system, unites function with record, which is a big chunk of its promise.

The maturation mindset is to go from mostly paper execution, to paper-on-glass, to end-to-end integration and execution, driving up reliability and driving out error. But at the heart we still have functional documents, records, and reports. The paper goes away, but the document is still there.

So how is this failing us?

Any process is a way to realize a set of requirements. Those requirements come from external (regulations, standards, etc.) and internal (efficiency, business needs) sources. We then meet those requirements through People, Procedure, Principles, and Technology. These are interlinked and strive to deliver efficiency, effectiveness, and excellence.

So this failure to understand documents means we think we can solve everything through a single technology application: an eQMS will solve problems in quality events, a LIMS in the lab, an MES in manufacturing. Each of these is a lever for change, but alone none of them can drive the results we want.

Because of the limitations of this thought process, we get systems designed for yesterday’s problems instead of systems designed with tomorrow in mind.

We get documentation systems that treat functional documents pretty much the same way we thought of them 30 years ago: as discrete things. These discrete things then interact across a gap with our electronic systems. There is little traceability, which complicates change control and makes it difficult to train experts. The funny thing is, we have the pieces, but because of the limitations of our technology we aren’t leveraging them.

The V-model approach should be applied in a risk-based manner to the design of our full system, not just to its technical aspects.

System feasibility maps to policy and governance, and user requirements allow us to trace which elements are people, procedure, principles, and/or technology. Everything then stems from there.
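As an illustration of that traceability (the classes, fields, and requirement numbering below are my own assumptions, not a prescribed model), each user requirement carries its source and the elements that realize it, so gaps become visible:

```python
from dataclasses import dataclass
from enum import Enum

class Element(Enum):  # the ways a requirement can be realized
    PEOPLE = "people"
    PROCEDURE = "procedure"
    PRINCIPLES = "principles"
    TECHNOLOGY = "technology"

@dataclass
class UserRequirement:
    identifier: str   # e.g. "UR-001" (hypothetical numbering scheme)
    text: str
    source: str       # external (regulation, standard) or internal (business need)
    realized_by: set  # the Elements that meet this requirement

def coverage_gaps(requirements):
    """Flag requirements traced to no element; these gaps complicate change control."""
    return [r.identifier for r in requirements if not r.realized_by]
```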