Risk Assessments Do Not Replace Technical Knowledge

The US Food and Drug Administration (FDA) last month warned Indian generic drugmaker Lupin Limited over three good manufacturing practice (GMP) violations at its facility in Maharashtra, India. The warning letter identified issues with the company’s written procedures for equipment cleaning, its written procedures for monitoring and controlling the performance of processing steps, and its “failure to investigate all critical deviations.”

The FDA said the company “performed multiple risk assessments with the purpose to verify whether existing cleaning procedures and practices eliminate or reduce genotoxic impurities … generated through the manufacture of [redacted] drugs after you detected [redacted] impurities in your [active pharmaceutical ingredient] API.” The company also performed risk assessments to determine whether its cleaning procedures reduced the risk of cross-contamination of intermediates and API. However, FDA said the risk assessments “lacked data to support that existing equipment cleaning procedures are effective in removing [redacted] along with residual API from each respective piece of equipment to acceptable levels.” The agency continued: “The identification of genotoxic impurities in quantities near their established limits suggests excursions are possible. All intermediates and API manufactured on non-dedicated equipment used to manufacture [redacted] drugs should be subject to validated sampling and analytical testing to ensure they are not contaminated with unacceptable levels of genotoxic impurities.”

At heart, this warning letter shows a major weakness in many companies’ approach to risk management: they use the risk assessment to replace technical inquiry, rather than as a tool to test the adequacy of their technical understanding and to manage the uncertainty around technical knowledge.

A significant aim of the current ICH Q9 draft is to address this issue, which we see happen again and again. Risk management cannot tell you whether your cleaning procedures are effective; only a validated testing scheme can. Risk management looks at the aggregate and evaluates possibilities.
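The distinction is worth making concrete. A risk assessment can prioritize where to sample, but only measured data against a pre-established acceptance limit can confirm cleaning effectiveness. The sketch below is purely illustrative (the function, equipment names, and limit are hypothetical, not from the warning letter): every piece of shared equipment must pass on data, not on assessed likelihood.

```python
def cleaning_verification(results_ppm, acceptance_limit_ppm):
    """Evaluate swab results (ppm) per equipment item against a
    pre-established acceptance limit. Every item must pass on
    measured data; a risk assessment cannot substitute for this."""
    failures = {
        equipment: value
        for equipment, value in results_ppm.items()
        if value > acceptance_limit_ppm
    }
    return {"pass": not failures, "failures": failures}

# Results near the limit still pass numerically, but, as FDA notes,
# they suggest excursions are possible and warrant continued testing.
result = cleaning_verification({"reactor-1": 0.8, "dryer-2": 1.4}, 1.0)
```

The point of the sketch: the output is driven entirely by measurements, which is exactly what the cited risk assessments lacked.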

Global versus Local Process and Procedure and the eQMS

Companies both large and small grapple with how and when to create standard work at the global level while retaining the scalability to capture different GXP activity families and product modalities.

I’ve written before about document hierarchy and about the leveling of process and procedure. It is really important to level your processes, and this architecture should be deliberate and shepherded.

This really gets to the heart of work-as-imagined versus work-as-prescribed, and the concept of standard work.

Benefits of Standard Work

  • Ensures all work is done according to the current best practice
  • Consistency is the essential ingredient of quality
  • Allows organizations to scale rapidly
  • Puts the focus on the process and not an individual or team
  • Makes improvements easier and faster

Global versus Local Process and Procedure in the Document Hierarchy

Most Quality Hierarchies look fairly similar.

A Document Hierarchy

Excluding the Program level (which becomes even more important), we can expand the model in the process band to account for global versus local.

Global and local process within the document hierarchy

The Quality Manual and Policies remain global, with local input, and determine the overall structure of the quality management system.

Global Process is created when a process is majority task and role driven at a global level. It is pan-GXP, pan-modality, pan-geography. It is the standard way of work to drive consistency across and through the organization.

Local Process is created when a process is specific to a particular GXP, product modality, or geography.

Procedure, which describes the tasks, can be created from either local or global process. When the global process has localizations (a CAPA is a CAPA, but how I build action items may differ across sites), I can build local versions off the global process.
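One way to picture this relationship (a hypothetical sketch, not a prescribed eQMS schema; the process steps and site names are illustrative): a single global process definition from which each site derives a local procedure that overrides only the steps that genuinely differ.

```python
# Hypothetical sketch: one global CAPA process, with site-level overrides.
GLOBAL_CAPA_PROCESS = {
    "identify_problem": "Document the problem statement and scope",
    "root_cause": "Perform root cause analysis",
    "action_items": "Create and assign action items",
    "effectiveness": "Verify the implementation is effective",
}

# A local procedure overrides only what actually differs at that site.
SITE_OVERRIDES = {
    "site_a": {"action_items": "Create action items per Site A practice"},
}

def local_procedure(site):
    """Global process plus local deltas: a CAPA is a CAPA everywhere,
    but how action items are built may differ across sites."""
    return {**GLOBAL_CAPA_PROCESS, **SITE_OVERRIDES.get(site, {})}
```

A site with no overrides simply inherits the global process unchanged, which is the behavior you want: localization by exception, not by copy.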

For an example, consider Document and Record Management.

This approach takes real vision among leaders to drive for consistency and simplicity. This activity is a core component in good system design, no matter the size of the organization.

| Principle | Description | Application for Global and Local Process |
| --- | --- | --- |
| Balance | The system creates value for the multiple stakeholders. While the ideal is to develop a design that maximizes the value for all the key stakeholders, the designer often has to compromise and balance the needs of the various stakeholders. | The value of standard work really shines here. |
| Congruence | The degree to which the system components are aligned and consistent with each other and the other organizational systems, culture, plans, processes, information, resource decisions, and actions. | We gain congruence through ensuring key processes are at the global level. |
| Convenience | The system is designed to be as convenient as possible for the participants to implement (a.k.a. user friendly). System includes specific processes, procedures, and controls only when necessary. | The discussion around global versus local will often depend on how you define convenience. |
| Coordination | System components are interconnected and harmonized with the other (internal and external) components, systems, plans, processes, information, and resource decisions toward common action or effort. This is beyond congruence and is achieved when the individual components of a system operate as a fully interconnected unit. | How we ensure coordination across and through an organization. |
| Elegance | Complexity vs. benefit — the system includes only enough complexity as is necessary to meet the stakeholder’s needs. In other words, keep the design as simple as possible and no more while delivering the desired benefits. It often requires looking at the system in new ways. | Keep this in mind, as global for the sake of global is not always the right decision. |
| Human | Participants in the system are able to find joy, purpose and meaning in their work. | Never forget. |
| Learning | Knowledge management, with opportunities for reflection and learning (learning loops), is designed into the system. Reflection and learning are built into the system at key points to encourage single- and double-loop learning from experience to improve future implementation and to systematically evaluate the design of the system itself. | Building the right knowledge management into the organization is critical to leverage this model. |
| Sustainability | The system effectively meets the near- and long-term needs of the current stakeholders without compromising the ability of future generations of stakeholders to meet their own needs. | Ensure the appropriate tools exist to sustain, including regulatory intelligence. Long-term scalability. |
Pillars of Good System Design for Global and Local Process

Utilizing the eQMS

The ideal state when implementing (or improving) an eQMS is to establish global processes and allow system functionality to localize as appropriate.

Leveraging the eQMS

So, for example, every CAPA is the same (identify the problem and root cause, create a plan, implement the plan, prove the implementation is effective). This is a global process. However, one wants specific task detail at a lower level: GMP sites may care about certain fields more than GCP, medical device has specific needs, and so on. These local task-level needs can be maintained within one workflow.
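One way this "one workflow, local detail" idea tends to look in configuration (a hedged sketch; the field names and contexts are hypothetical, not any particular eQMS vendor's schema): a core field set shared by every CAPA, with context-specific fields layered on rather than separate workflows per GXP area.

```python
# Hypothetical sketch: one global CAPA workflow whose required fields
# vary by GXP context, instead of separate workflows per area.
CORE_FIELDS = ["problem", "root_cause", "plan", "effectiveness_check"]

LOCAL_FIELDS = {
    "GMP": ["batch_numbers", "equipment_ids"],
    "GCP": ["protocol_id", "site_number"],
    "medical_device": ["device_identifier", "reportability"],
}

def required_fields(context):
    """Same workflow everywhere; local task-level detail layered on.
    Unrecognized contexts fall back to the global core alone."""
    return CORE_FIELDS + LOCAL_FIELDS.get(context, [])
```

The design choice worth noting: every record, regardless of context, carries the same global backbone, which keeps aggregate reporting and trending possible across the whole organization.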

The Key is Fit-for-Purpose and Fit-for-Use

A fit-for-purpose process meets the requirements of the organization.

A fit-for-use process is usable throughout the lifecycle.

Globalizing and localizing processes is a key part of making both happen.

Pay Transparency

Let’s be honest, there are not enough quality professionals in the biotech field with the experience we need. Training and developing quality professionals is a lot of hard work that takes years.

There can be some substantial gaps in salary within an organization. Unfortunately, it is easier to get a pay raise by leaving the organization than by staying (a huge issue), even when being promoted from within. Any quality organization soon has people all over the place in salary, doing similar things at similar levels.

People talk about salaries, it is a legally protected activity, and frankly this transparency can be our friend if we let it.

As we build our quality organizations, make sure we are building pay equity, and then live it. Fight for it.

There are lots of good thoughts in this article at BioSpace: “How Salary Transparency Can Help Employers Find and Keep Top Talent.”

Computer Software Assurance Draft

The FDA published on 13-Sep-2022 the long-awaited draft of the guidance “Computer Software Assurance for Production and Quality System Software,” and you may, based on all the emails and postings, be wondering just how radical a change this is.

It’s not. This guidance is just one big “calm down people” letter from the agency. They publish these sorts of guidance every now and then because we as an industry can sometimes learn the wrong lessons.

This guidance states:

  1. Determine intended use
  2. Perform a risk assessment
  3. Perform activities to the required level

I wrote about this approach in “Risk Based Data Integrity Assessment,” and it has existed in GAMP5 and other approaches for years.

So read the guidance, but don’t panic. You are either following it already or you just need to spend some time getting better at risk assessments and creating some matrix approaches.
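A matrix approach can be as simple as mapping the first two steps to an assurance activity. The sketch below is illustrative only: the cell values are one plausible scaling of effort, not wording quoted from the draft guidance, and your own matrix should reflect your documented intended-use and risk definitions.

```python
# Hypothetical sketch of a risk-based assurance matrix: determine
# intended use, assess failure risk, then scale assurance activities.
ASSURANCE_MATRIX = {
    # key: (directly impacts product quality/safety?, failure risk)
    (True, "high"): "scripted testing with full documentation",
    (True, "low"): "limited scripted testing",
    (False, "high"): "unscripted / exploratory testing",
    (False, "low"): "vendor assurance plus record of use",
}

def assurance_activity(direct_impact, risk):
    """Steps 1 and 2 (intended use, risk) select the step 3 activity."""
    return ASSURANCE_MATRIX[(direct_impact, risk)]
```

The value of writing the matrix down is consistency: two assessors looking at the same system should land on the same level of effort.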

Data Integrity Warning Letter

In July 2022, the U.S. FDA issued a Warning Letter to the U.S. company Jost Chemical Co. after having inspected its site in January 2022. The warning letter listed four significant areas:

  • “Failure of your quality unit to ensure that quality-related complaints are investigated and resolved, and failure to extend investigations to other batches that may have been associated with a specific failure or deviation.”
  • “Failure to establish adequate written procedures for cleaning equipment and its release for use in manufacture of API.”
  • “Failure to ensure that all test procedures are scientifically sound and appropriate to ensure that your API conform to established standards of quality and purity, and failure to ensure laboratory data is complete and attributable.”
  • “Failure to exercise sufficient controls over computerized systems to prevent unauthorized access or changes to data, and failure to establish and follow written procedures for the operation and maintenance of your computerized systems.”

I offer them the above clip as a good mini-training. I recently watched the show, and my wife thought I was going to have several heart attacks.

On a serious note, please do not shortchange your efforts in data integrity.