Health of the Validation Program

The Metrics Plan for Facility, Utility, System and Equipment (FUSE) that is being developed focuses on effective commissioning, qualification, and validation (CQV) processes.

To demonstrate the success of a CQV program, we might brainstorm the following metrics (a rough calculation sketch follows the list).

Deviation and Non-Conformance Rates

  • Track the number and severity of deviations related to commissioned, qualified and validated processes and FUSE elements
  • Measure the effectiveness of CAPAs that involve CQV elements

Change Control Effectiveness

  • Measure the number of changes implemented successfully without post-implementation issues
  • Track the time taken to implement and qualify/validate changes

Risk Reduction

  • Quantify the reduction in high and medium risks identified during risk assessments as a result of CQV activities
  • Monitor the effectiveness of risk mitigation strategies

Training and Competency

  • Measure the percentage of personnel with up-to-date training on CQV procedures
  • Track competency assessment scores for key validation personnel

Documentation Quality

  • Measure the number of validation discrepancies found during reviews
  • Track the time taken to approve validation documents

Supplier Performance

  • Monitor supplier audit results related to validated systems or components
  • Track supplier-related deviations or non-conformances

Regulatory Inspection Outcomes

  • Track the number and severity of validation-related observations during inspections
  • Measure the time taken to address and close out regulatory findings

Cost and Efficiency Metrics

  • Measure the time and resources required to complete validation activities
  • Track cost savings achieved through optimized CQV approaches
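
As a rough illustration of how a couple of these could be reported, here is a minimal sketch in Python that computes a CQV-related deviation rate and severity breakdown from a deviation log. The field names, data, and per-batch normalization are hypothetical assumptions, not from any specific metrics plan.

```python
# Minimal sketch: computing a CQV-related deviation rate from a deviation log.
# All field names and values below are hypothetical/illustrative.
deviations = [
    {"id": "DEV-001", "severity": "major", "cqv_related": True},
    {"id": "DEV-002", "severity": "minor", "cqv_related": False},
    {"id": "DEV-003", "severity": "minor", "cqv_related": True},
]

batches_released = 42  # hypothetical denominator for the reporting period

cqv_deviations = [d for d in deviations if d["cqv_related"]]

# Rate normalized per batch released in the period
cqv_deviation_rate = len(cqv_deviations) / batches_released

# Severity breakdown for the CQV-related subset
severity_counts = {}
for d in cqv_deviations:
    severity_counts[d["severity"]] = severity_counts.get(d["severity"], 0) + 1

print(f"CQV-related deviation rate: {cqv_deviation_rate:.3f} per batch")
print(f"Severity breakdown: {severity_counts}")
```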

By tracking these metrics, we might be able to demonstrate a comprehensive and effective CQV program that aligns with regulatory expectations. Or we might just spend time measuring stuff that may not be tailored to our individual company’s processes, products, and risk profile. And quite frankly, will they influence the system the way we want? It’s time to pull out an IMPACT key behavior analysis to help us tailor a right-sized set of metrics.

The first thing to do is to go to first principles, to take a big step back and ask – what do I really want to improve?

The purpose of a CQV program is to provide documented evidence that facilities, systems, equipment and processes have been designed, installed and operate in accordance with predetermined specifications and quality attributes:

  • To verify that critical aspects of a facility, utility system, equipment or process meet approved design specifications and quality attributes.
  • To demonstrate that processes, equipment and systems are fit for their intended use and perform as expected to consistently produce a product meeting its quality attributes.
  • To establish confidence that the manufacturing process is capable of consistently delivering quality product.
  • To identify and understand sources of variability in the process to better control it.
  • To detect potential problems early in development and prevent issues during routine production.

The ultimate measure of success is demonstrating and maintaining a validated state that ensures consistent production of safe and effective products meeting all quality requirements. 

Focusing on the Impact is important. What are we truly concerned about for our CQV program? Based on that, we come up with two main factors:

  1. The level of deviations that stem from root causes associated with our CQV program
  2. The readiness of FUSE elements for use (project adherence)

Reducing Deviations from CQV Activities

First, we gather data: what deviations are we looking for? These are the types of root causes that we will evaluate (a rough tally sketch follows the list). Of course, your use of the 7Ms may vary; this list is meant to start the conversation.

  • Means – Automation or Interface Design Inadequate/Defective: Validated machine or computer system interface or automation failed to meet specification due to inadequate/defective design.
  • Means – Preventative Maintenance Inadequate: The preventive maintenance performed on the equipment was insufficient or not performed as required.
  • Means – Preventative Maintenance Not Defined: No preventive maintenance is defined for the equipment used.
  • Means – Equipment Defective/Damaged/Failure: The equipment used was defective or a specific component failed to operate as intended.
  • Means – Equipment Incorrect: Equipment required for the task was set up or used incorrectly, or the wrong equipment was used for the task.
  • Means – Equipment Design Inadequate/Defective: The equipment was not designed or qualified to perform the task required, or the equipment was defective, which prevented its normal operation.
  • Media – Facility Design: Improper or inadequate layout or construction of facility, area, or work station.
  • Methods – Calibration Frequency Is Not Sufficient/Deficiency: Calibration interval is too long and/or calibration schedule is lacking.
  • Methods – Calibration/Validation Problem: An error occurred because of a data collection-related issue regarding calibration or validation.
  • Methods – System/Process Not Defined: The system/tool or the defined process to perform the task does not exist.
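
To make the “gather data” step concrete, here is a minimal tally sketch. The deviation records and counts are hypothetical; the point is simply to rank root causes by frequency so the biggest contributors stand out, Pareto-style.

```python
from collections import Counter

# Hypothetical deviation records tagged with a 7M category and root cause
deviation_root_causes = [
    ("Means", "Equipment Defective/Damaged/Failure"),
    ("Methods", "Calibration/Validation Problem"),
    ("Means", "Equipment Defective/Damaged/Failure"),
    ("Media", "Facility Design"),
    ("Methods", "System/Process Not Defined"),
    ("Means", "Preventative Maintenance Not Defined"),
]

counts = Counter(deviation_root_causes)

# Rank root causes by frequency so improvement effort targets the biggest contributors
for (category, root_cause), n in counts.most_common():
    print(f"{category:8s} {root_cause:40s} {n}")
```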

Based on analysis of what is going on, we can move into a why-why technique to look at our layers.

Why 1: Why are deviations stemming from CQV events not at 0%?
Because unexpected issues or discrepancies arise after the commissioning, qualification, or validation processes.

Success factor needed for this step: Effectiveness of the CQV program

Metric for this step: Adherence to CQV requirements

Why 2 (a): Why are unexpected issues arising after CQV?
Because of inadequate planning and resource constraints in the CQV process.

Success factor needed for this step: Appropriate project and resource planning

Metric for this step: Resource allocation

Why 3 (a): Why are we not performing adequate resource planning?
Because of tight project timelines and the involvement of multiple stakeholders with different areas of expertise.

Success factor needed for this step: Cross-functional governance to implement risk methodologies that focus efforts on critical areas

Metric for this step: Risk Coverage Ratio, measuring the percentage of identified critical risks that have been properly assessed and mitigated through the cross-functional risk management process. This metric helps evaluate how effectively the governance structure is addressing the most important risks facing the organization. (A worked sketch of this ratio follows this breakdown.)

Why 2 (b): Why are unexpected issues arising after CQV?
Because of poorly executed elements of the CQV process, stemming from poorly written procedures and under-qualified staff.

Success factor needed for this step: Process improvements and training/qualification

Metric for this step: Performance to Maturity Plan
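
As a worked illustration of the Risk Coverage Ratio named above, here is a minimal sketch. The register entries and statuses are hypothetical assumptions; the ratio is simply the share of identified critical risks that have been assessed and mitigated.

```python
# Hypothetical snapshot of a critical-risk register; statuses are illustrative only.
critical_risks = [
    {"id": "R-01", "assessed": True,  "mitigated": True},
    {"id": "R-02", "assessed": True,  "mitigated": True},
    {"id": "R-03", "assessed": True,  "mitigated": False},
    {"id": "R-04", "assessed": False, "mitigated": False},
]

covered = sum(1 for r in critical_risks if r["assessed"] and r["mitigated"])

# Risk Coverage Ratio: percentage of identified critical risks that have been
# properly assessed and mitigated through the cross-functional process
risk_coverage_ratio = covered / len(critical_risks) * 100

print(f"Risk Coverage Ratio: {risk_coverage_ratio:.0f}%")  # 2 of 4 -> 50%
```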

There were some things I definitely glossed over, and forgive me for not providing numbers, but I think you get the gist.

So now I’ve identified the I (Identify) – how do we improve the reliability of our CQV program, as measured by reducing deviations. Let’s break out the rest.

IDENTIFY: The desired quality or process improvement goal (the top-level goal)
Executed for CQV: Improve the effectiveness of the CQV program by taking actions to reduce deviations stemming from verification of FUSE and process.

MEASURE: Establish the existing measure (KPI) used to confirm and report achievement of the goal
Executed for CQV: Set a target reduction of deviations related to CQV activities.

PINPOINT: Pinpoint the “desired” behaviors necessary to deliver the goal (behaviors that contribute to successes and failures)
Executed for CQV:
  • Drive good project planning and project adherence
  • Promote and coach for enhanced attention to detail where “quality is everyone’s job”
  • Encourage a speak-up culture where concerns, issues or suggestions are shared in a timely manner in a neutral, constructive forum

ACTIVATE THE CONSEQUENCES: Activate the consequences to motivate the delivery of the goal (4:1 positive to negative actionable consequences)
Executed for CQV:
  • Organize team briefings on consequences
  • Review outcomes of project health
  • Senior leadership celebrate/acknowledge
  • Acknowledge and recognize improvements
  • Motivate the team through team awards
  • Measure success on individual deliverables through a rubric

TRANSFER: Transfer the knowledge across the organization to sustain the performance improvement
Executed for CQV:
  • Create learning teams
  • Document and share lessons learned
  • Hold lunch-and-learn sessions
  • Create improvement case studies

From these two exercises I’ve now identified my lagging and leading indicators at the KPI and the KBI level.

Good Engineering Practices Under ASTM E2500

ASTM E2500 recognizes that Good Engineering Practices (GEP) are essential for pharmaceutical companies to ensure the consistent and reliable design, delivery, and operation of engineered systems in a manner suitable for their intended purpose.

Key Elements of Good Engineering Practices

  1. Risk Management: Applying systematic processes to identify, assess, and control risks throughout the lifecycle of engineered systems. This includes quality risk management focused on product quality and patient safety.
  2. Cost Management: Estimating, budgeting, monitoring and controlling costs for engineering projects and operations. This helps ensure projects deliver value and stay within budget constraints.
  3. Organization and Control: Establishing clear organizational structures, roles and responsibilities for engineering activities. Implementing monitoring and control mechanisms to track performance.
  4. Innovation and Continual Improvement: Fostering a culture of innovation and continuous improvement in engineering processes and systems.
  5. Lifecycle Management: Applying consistent processes for change management, issue management, and document control throughout a system’s lifecycle from design to decommissioning.
  6. Project Management: Following structured approaches for planning, executing and controlling engineering projects.
  7. Design Practices: Applying systematic processes for requirements definition, design development, review and qualification.
  8. Operational Support: Implementing asset management, calibration, maintenance and other practices to support systems during routine operations.

Key Steps for Implementation

  • Develop and document GEP policies, procedures and standards tailored to the company’s needs
  • Establish an Engineering Quality Process (EQP) to link GEP to the overall Pharmaceutical Quality System
  • Provide training on GEP principles and procedures to engineering staff
  • Implement risk-based approaches to focus efforts on critical systems and processes
  • Use structured project management methodologies for capital projects
  • Apply change control and issue management processes consistently
  • Maintain engineering documentation systems with appropriate controls
  • Conduct periodic audits and reviews of GEP implementation
  • Foster a culture of quality and continuous improvement in engineering
  • Ensure appropriate interfaces between engineering and quality/regulatory functions

The key is to develop a systematic, risk-based approach to GEP that is appropriate for the company’s size, products and operations. When properly implemented, GEP provides a foundation for regulatory compliance, operational efficiency and product quality in pharmaceutical manufacturing.

Invest in a Living, Breathing Engineering Quality Process (EQP)

The EQP establishes the formal connection between GEP and the Pharmaceutical Quality System it resides within, serving as the boundary between Quality oversight and engineering activities, particularly for implementing Quality Risk Management (QRM) based integrated Commissioning and Qualification (C&Q).

It should also provide an interface between engineering activities and other systems like business operations, health/safety/environment, or other site quality systems.

Here is a suggested table of contents for an Engineering Quality Process (EQP):

Table of Contents – Engineering Quality Process (EQP)

  1. Introduction
    1.1 Purpose
    1.2 Scope
    1.3 Definitions
  2. Application and Context
    2.1 Relationship to Pharmaceutical Quality System (PQS)
    2.2 Relationship to Good Engineering Practice (GEP)
    2.3 Interface with Quality Risk Management (QRM)
  3. EQP Elements
    3.1 Policies and Procedures for the Asset Lifecycle and GEPs
    3.2 Risk Assessment
    3.3 Change Management
    3.4 Document Control
    3.5 Training
    3.6 Auditing
  4. Deliverables
    4.1 GEP Documentation
    4.2 Risk Assessments
    4.3 Change Records
    4.4 Training Records
    4.5 Audit Reports
  5. Roles and Responsibilities
    5.1 Engineering
    5.2 Quality
    5.3 Operations
    5.4 Other Stakeholders
  6. EQP Implementation
    6.1 Establishing the EQP
    6.2 Maintaining the EQP
    6.3 Continuous Improvement
  7. References
  8. Appendices

Retrospective Validation Doesn’t Really Exist

A recent FDA Warning Letter really drove home a good point about the perils of ‘retrospective validation’ and how that normally doesn’t mean what folks want it to mean.

“In lieu of process validation studies, you attempted to retrospectively review past batches without scientifically establishing blend uniformity and other critical process performance indicators. You do not commit to conduct further process performance qualification studies that scientifically establish the ability of your manufacturing process to consistently yield finished products that meet their quality attributes.”

The FDA’s response here points to three truths:

  1. Validation needs to be done against critical quality attributes and critical process parameters to scientifically establish that the manufacturing process is consistent.
  2. Batch data on its own is rather useless.
  3. Validation is a continuous exercise; it is not once-and-done (or, in most people’s view, thrice-and-done).

I don’t think the current GMPs really allow the concept of retrospective validation as most people understand it (including the recipient of that warning letter). It’s probably a term that should go into the big box of Nope.


Retrospective validation as most people mean it is a type of process validation that involves evaluating historical data and records to demonstrate that an existing process consistently produces products meeting predetermined specifications.

The problem here is that this really just tells you what you were already hoping was true.

Retrospective validation has some major flaws:

  1. Limited control over data quality and completeness: Since retrospective validation relies on historical data, there may be gaps or inconsistencies in the available information. The data may not have been collected with validation in mind, leading to missing critical parameters or measurements. It rather throws out most of the principles of science.
  2. Potential bias in existing data: Historical data may be biased or incomplete, as it was not collected specifically for validation purposes. This can make it difficult to draw reliable conclusions about process performance and consistency.
  3. Difficulty in identifying and addressing hidden flaws: Since the process has been in use for some time, there may be hidden flaws or issues that have not been identified or challenged. These could potentially lead to non-conforming products or hazardous operating conditions.
  4. Difficulty in recreating original process conditions: It may be challenging to accurately recreate or understand the original process conditions under which the historical data was generated, potentially limiting the validity of conclusions drawn from the data.

What is truly called for is to perform concurrent validation.

Navigating the Evolving Landscape of Validation in Biotech: Challenges and Opportunities

The biotech industry is experiencing a significant transformation in validation processes, driven by rapid technological advancements, evolving regulatory standards, and the development of novel therapies.

The 2024 State of Validation report, authored by Jonathan Kay and funded by Kneat, provides an overview of trends and challenges in the validation industry. Here are some of the key findings:

  1. Compliance and efficiency are top priorities: Creating process efficiencies and ensuring audit readiness have become the primary goals for validation programs.
    • Compliance burden emerged as the top validation challenge in 2024, replacing the shortage of human resources, which was the top concern in 2022-2023
  2. Digital transformation is accelerating: 83% of respondents are either using or planning to adopt digital validation systems. The top benefits include improved data integrity, continuous audit readiness, and global standardization.
    • 79% of those using digital validation rely on third-party software providers
      • Does this mean that 21% of respondents are in companies that have created their own bespoke systems? Or is something else going on there?
    • 63% reported that ROI from digital validation systems met or exceeded expectations
  3. Artificial intelligence and machine learning are on the rise: 70% of respondents believe AI and ML will play a pivotal role in the future of validation.
  4. Remote audits are becoming more common: 75% of organizations conducted at least some remote regulatory audits in the past year.
  5. Challenges persist: The industry faces ongoing challenges in balancing costs, attracting talent, and keeping pace with technological advancements.
    • 61% reported an increase in validation workload over the past 12 months
  6. Industry 4.0 adoption is growing: 60% of organizations are in the early stages or actively implementing Industry/Pharma 4.0 technologies.

As highlighted in the 2024 State of Validation report and my previous blog post on “Challenges in Validation,” several key trends and challenges are shaping the future of validation in biotech:

  1. Technological Integration: The integration of AI, machine learning, and automation into validation processes presents both opportunities and challenges. While these technologies offer the potential for increased efficiency and accuracy, they also require new validation frameworks and methodologies.
  2. Regulatory Compliance: Keeping pace with evolving regulatory standards remains a significant challenge. Regulatory bodies are continuously updating guidelines to address technological advancements, requiring companies to stay vigilant and adaptable.
  3. Data Management and Integration: With the increasing use of digital tools and platforms, managing and integrating vast amounts of data has become a critical challenge. The industry is moving towards more robust data analytics and machine learning tools to handle this data efficiently.
  4. Resource Constraints: Particularly for smaller biotech companies, resource limitations in funding, personnel, and expertise can hinder the implementation of advanced validation techniques.
  5. Risk Management: Adopting a risk-based approach to validation is essential but challenging. Companies must develop effective strategies to identify and mitigate risks throughout the product lifecycle.
  6. Collaboration and Knowledge Sharing: Ensuring effective communication and data sharing among various stakeholders is crucial for streamlining validation efforts and aligning goals.
  7. Digital Transformation: The industry is witnessing a shift from traditional, paper-heavy validation methods to more dynamic, data-driven, and digitalized processes. This transformation promises enhanced efficiency, compliance, and collaboration.
  8. Workforce Development: We are a heavily experience-driven field. With 38% of validation professionals having 16 or more years of experience, there’s a critical need for knowledge transfer and training to equip newer entrants with necessary skills.
  9. Adoption of Computer Software Assurance (CSA): The industry is gradually embracing CSA processes, driven by recent FDA guidance, though there’s still considerable room for further adoption. I always find this showing up in surveys disappointing, as CSA is a bit of a racket: it is basically existing validation principles repackaged. But consultants got to consult.
  10. Focus on Efficiency and Audit Readiness: Creating process efficiencies and ensuring audit readiness have emerged as top priorities for validation programs.

As the validation landscape continues to evolve, it’s crucial for biotech companies to embrace these changes proactively. By leveraging new technologies, fostering collaboration, and focusing on continuous improvement, the industry can overcome these challenges and drive innovation in validation processes.

The future of validation in biotech lies in striking a balance between technological advancement and regulatory compliance, all while maintaining a focus on product quality and patient safety. As we move forward, it’s clear that the validation field will continue to be dynamic and exciting, offering numerous opportunities for innovation and growth.