AI/ML – In-Process Monitoring

I’m often asked where we’ll first see the real impact of AI/ML in GMP. I don’t think I’ve hidden my skepticism on the topic in the past, but people keep asking, so here’s one of the first places I think it will really impact our field.

In-Process Monitoring

AI algorithms, coupled with advanced sensing technology, can detect and respond to minute changes in critical parameters. I can, today, easily imagine a system that not only detects abnormal temperatures but also automatically adjusts pressure and pH to maintain optimal conditions, with a responsiveness not possible in today's automation systems and with continuous, real-time monitoring of every aspect of the production process. This will drive large gains in predictive maintenance and data-driven decision-making, improving product quality through early defect detection, especially in continuous manufacturing processes. A minimal sketch of what such monitoring might look like follows the list below.

AI and machine learning algorithms will increasingly empower manufacturers to analyze complex data sets, revealing hidden patterns and trends that were previously undetectable. This deep analysis will allow for more informed decision-making and process optimization, leading to significant improvements in manufacturing efficiency, including:

  • Enhancing Equipment Efficiency
    • Reduce downtime
    • Predict and prevent breakdowns
    • Optimize maintenance schedules
  • Process Parameter Optimization
    • Analyze historical and real-time data to determine optimal process parameters
    • Predict product quality and process efficiency
    • Adapt through iterative learning
    • Suggest proactive adjustments to production parameters
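
To make the responsiveness described above concrete, here is a minimal sketch of a rolling-baseline anomaly check on a temperature stream that proposes a setpoint correction. Everything here (the window size, the z-score limit, the readings) is a hypothetical illustration, not a validated control strategy.

```python
# Minimal sketch of model-assisted in-process monitoring (illustrative only).
# Assumes a stream of temperature readings; names and thresholds are hypothetical.
from collections import deque
from statistics import mean, stdev

class TemperatureMonitor:
    """Flags abnormal readings against a rolling baseline and proposes an adjustment."""

    def __init__(self, window: int = 50, z_limit: float = 3.0):
        self.readings = deque(maxlen=window)
        self.z_limit = z_limit

    def observe(self, temp_c: float) -> dict:
        alert, suggested_delta = False, 0.0
        if len(self.readings) >= 5:
            mu, sigma = mean(self.readings), stdev(self.readings)
            if sigma > 0 and abs(temp_c - mu) / sigma > self.z_limit:
                alert = True
                # Propose a correction back toward the rolling baseline.
                suggested_delta = mu - temp_c
        self.readings.append(temp_c)
        return {"alert": alert, "suggested_setpoint_delta": round(suggested_delta, 2)}

monitor = TemperatureMonitor()
for reading in [37.0, 37.1, 36.9, 37.0, 37.2, 41.5]:
    print(monitor.observe(reading))
```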

There is a lot of hype in this area, and I personally do not think we are as close as some would say. Still, we are seeing real implementations, and I think we are on the cusp of some very interesting capabilities.

Validating Manufacturing Process Closure for Biotech Utilizing Single-Use Systems (SUS)

Maintaining process closure is crucial for ensuring product quality and safety in biotechnology manufacturing, especially when using single-use systems (SUS). This approach is an integral part of the contamination control strategy (CCS). To validate process closure in SUS-based biotech manufacturing, a comprehensive method is necessary, incorporating:

  1. Risk assessment
  2. Thorough testing
  3. Ongoing monitoring

By employing risk analysis tools such as Hazard Analysis and Critical Control Points (HACCP) and Failure Mode and Effects Analysis (FMEA), manufacturers can identify potential weaknesses in their processes. Additionally, addressing all four layers of protection helps ensure process integrity and product safety. This risk-based approach to process closure validation is essential for maintaining the high standards required in biotechnology manufacturing, including meeting the requirements of Annex 1.

Understanding Process Closure

Process closure refers to the isolation of the manufacturing process from the external environment to prevent contamination. In biotech, this is particularly crucial due to the sensitivity of biological products and the potential for microbial contamination.

The Four Layers of Protection

Throughout this process it is important to apply the four layers of protection that form the foundation of a robust contamination control strategy:

  1. Process: The inherent ability of the process to prevent or control contamination
  2. Equipment: The design and functionality of equipment to maintain closure
  3. Operating Procedures: The practices and protocols followed by personnel
  4. Production Environment: The controlled environment surrounding the process

I was discussing this with some colleagues this week (preparing for some risk assessments) and was reminded that we really should put the Patient at the center, as the zero. Truer words were never spoken: the patient truly is our zeroth law, the fundamental principle of the GxPs.

Key Steps for Validating Process Closure

Risk Assessment

Start with a comprehensive risk assessment using tools such as HACCP (Hazard Analysis and Critical Control Points) and FMEA (Failure Mode and Effects Analysis). It is important to remember this is not an either/or choice but a multi-tiered approach: first determine the hazards through the HACCP, then drill down into specific failures through an FMEA.

HACCP Approach

In the HACCP we apply a systematic, preventative approach to identifying hazards in the process, with the aim of producing a documented plan to control these scenarios. A hypothetical example of what a single documented CCP entry might look like follows the steps below.

a) Conduct a hazard analysis
b) Identify Critical Control Points (CCPs)
c) Establish critical limits
d) Implement monitoring procedures
e) Define corrective actions
f) Establish verification procedures
g) Maintain documentation and records
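
For illustration only, here is what one documented CCP entry might look like once the steps above have been worked through for a SUS aseptic connection; the hazard, limits, and record names are hypothetical, not taken from any particular plan.

```python
# Illustrative (hypothetical) HACCP plan entry for a single CCP on a SUS aseptic connection.
from dataclasses import dataclass, field

@dataclass
class CriticalControlPoint:
    hazard: str
    control_measure: str
    critical_limit: str
    monitoring: str
    corrective_action: str
    verification: str
    records: list[str] = field(default_factory=list)

ccp_aseptic_connection = CriticalControlPoint(
    hazard="Microbial ingress at a single-use aseptic connector",
    control_measure="Validated sterile connector with confirmed engagement",
    critical_limit="Connector fully engaged; post-connection pressure hold passes",
    monitoring="Operator verification plus automated pressure-hold test at each connection",
    corrective_action="Reject the connection, quarantine affected material, open a deviation",
    verification="Periodic review of pressure-hold data and deviation trends",
    records=["batch record entry", "pressure-hold test result"],
)
print(ccp_aseptic_connection.hazard)
```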

FMEA Considerations

In the FMEA we look for the ways the process can fail, focusing on the SUS components, and evaluate failures at each layer of control (process, equipment, operating procedures, and environment). A short sketch of the RPN prioritization follows the list below.

  • Identify potential failure modes in the SUS components
  • Assess the severity, occurrence, and detectability of each failure mode
  • Calculate Risk Priority Numbers (RPN) to prioritize risks
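
As a simple illustration of the RPN step, the sketch below scores a few hypothetical SUS failure modes and sorts them by priority. The 1–10 scales and the scores themselves are placeholders, not recommendations.

```python
# Minimal RPN calculation for hypothetical SUS failure modes (scores are illustrative).
failure_modes = [
    {"mode": "Pinhole leak in bag film",       "severity": 9, "occurrence": 3, "detection": 5},
    {"mode": "Tubing weld failure",            "severity": 8, "occurrence": 2, "detection": 4},
    {"mode": "Aseptic connector misalignment", "severity": 9, "occurrence": 4, "detection": 3},
]

for fm in failure_modes:
    # RPN = Severity x Occurrence x Detection (higher value = higher priority)
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f'{fm["mode"]}: RPN = {fm["rpn"]}')
```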

Verification

Utilizing these risk assessments, define the user requirements specification (URS) for the SUS, focusing on critical aspects that could impact product quality and patient safety. This should include:

  • Process requirements (e.g. working volumes, flow rates, pressure ranges)
  • Material compatibility requirements
  • Sterility/bioburden control requirements
  • Leachables/extractables requirements
  • Integrity testing requirements
  • Connectivity and interface requirements

Following the ASTM E2500 approach, when we conduct the design review of the proposed SUS configuration to evaluate how well it meets the URS, we want to ensure we cover:

  • Overall system design and component selection
  • Materials of construction
  • Sterilization/sanitization approach
  • Integrity assurance measures
  • Sampling and monitoring capabilities
  • Automation and control strategy

Circle back to the HACCP and FMEA to ensure they appropriately cover critical aspects like:

  • Loss of sterility/integrity
  • Leachables/extractables introduction
  • Bioburden control failures
  • Cross-contamination risks
  • Process parameter deviations

These risk assessments define the critical control parameters and acceptance criteria that form the basis for verification testing. Through our verification plan we will have an appropriate approach to:

  • Verify proper installation of SUS components
  • Check integrity of connections and seals
  • Confirm correct placement of sensors and monitoring devices
  • Document as-built system configuration
  • Test system integrity under various operating conditions
  • Perform leak tests on connections and seals
  • Validate sterilization processes for SUS components
  • Verify functionality of critical sensors and controls
  • Run simulated production cycles
  • Monitor for contamination using sensitive detection methods
  • Verify maintenance of sterility throughout the process
  • Assess product quality attributes

The verification strategy will leverage a variety of supplier documentation and internal testing.

Closure Analysis Risk Assessment (CLARA)

Acceptance and release will include performing a detailed CLARA to:

  • Identify all potential points of contamination ingress
  • Assess the effectiveness of closure mechanisms
  • Evaluate the robustness of aseptic connections
  • Determine the impact of manual interventions on system closure

Ongoing Use

Coming out of our HACCP we will have a monitoring and verification plan, which will include several important aspects based on our CCPs. A minimal integrity-test acceptance check is sketched after the list below.

  • Integrity Testing
    • Implement routine integrity testing protocols for SUS components
    • Utilize methods such as pressure decay tests or helium leak detection
    • Establish acceptance criteria for integrity tests
  • Environmental Monitoring
    • Develop a comprehensive environmental monitoring program
    • Include viable and non-viable particle monitoring
    • Establish alert and action limits for environmental contaminants
  • Operator Training and Qualification
    • Develop detailed SOPs for SUS handling and assembly
    • Implement a rigorous training program for operators
    • Qualify operators through practical assessments
  • Change Control and Continuous Improvement
    • Establish a robust change control process for any modifications to the SUS or process
    • Regularly review and update risk assessments based on new data or changes
    • Implement a continuous improvement program to enhance process closure
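
As mentioned above, here is a minimal sketch of a pressure-decay acceptance check of the kind routine integrity testing might apply. The hold time, pressures, and decay limit are placeholders; actual criteria come from the validated test method for the specific assembly.

```python
# Hypothetical pressure-decay acceptance check for a single-use assembly.
# Limits and hold times are placeholders, not recommended acceptance criteria.
def pressure_decay_passes(p_start_mbar: float, p_end_mbar: float,
                          hold_time_s: float, max_decay_mbar_per_min: float = 1.0) -> bool:
    """Return True if the observed decay rate is within the acceptance limit."""
    decay_rate = (p_start_mbar - p_end_mbar) / (hold_time_s / 60.0)
    return decay_rate <= max_decay_mbar_per_min

# Example: 500 mbar test pressure, 498.6 mbar after a 10-minute hold (0.14 mbar/min).
print(pressure_decay_passes(500.0, 498.6, hold_time_s=600))  # True
```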

Leveraging the Four Layers of Protection

Throughout the validation process, ensure that each layer of protection is addressed:

  1. Process:
    • Optimize process parameters to minimize contamination risks
    • Implement in-process controls to detect deviations
  2. Equipment:
    • Validate the design and functionality of SUS components
    • Ensure proper integration of SUS with existing equipment
  3. Operating Procedures:
    • Develop and validate aseptic techniques for SUS handling
    • Implement procedures for system assembly and disassembly
  4. Production Environment:
    • Qualify the cleanroom environment
    • Validate HVAC systems and air filtration

Remember that validation is an ongoing process. Regular reviews, updates to risk assessments, and incorporation of new technologies and best practices are essential for maintaining a state of control in biotech manufacturing using single-use systems.

Connected to the Contamination Control Strategy

Closed systems are a key element of the overall contamination control strategy, with closed processing now accepted as the most effective contamination control risk mitigation. I might not be able to manufacture in the woods yet, but darn if I won’t keep trying.

They serve as a primary barrier, isolating the product from the surrounding environment and thereby mitigating the risk of contamination from the manufacturing environment and cross-contamination from neighboring operations.

The risk assessments leveraged during the implementation of closed systems are a crucial part of developing an effective CCS and will communicate the (ideally) robust methods used to protect products from environmental contamination and cross-contamination. This is tied into the facility design, environmental controls, risk assessments, and overall manufacturing strategies, which are the key components of a comprehensive CCS.

PDCA and OODA

PDCA (and its variants) is a pretty tried and true model for process improvement. In the PDCA model a plan is structured in four steps: P (plan), D (do), C (check), A (act). The intention is to create a structured cycle that allows the process to flow in accordance with the objectives to be achieved (P), execute what was planned (D), check whether the objectives were achieved with emphasis on verifying what went right and what went wrong (C), and identify factors of success or failure to feed a new round of planning (A).

Conceptually, the organization becomes a fast-turning wheel, endlessly learning from mistakes and optimizing processes in pursuit of strategic objectives and the maximum efficiency and effectiveness of the system.

The OODA Loop

The OODA loop, or cycle, was designed by John R. Boyd and consists of four phases: Observe, Orient, Decide, and Act (OODA).

  • Observe: Based on implicit guidance and control, observations are made regarding unfolding circumstances, outside information, and dynamic interaction with the environment (including the result of prior actions).
  • Orient: Observations from the prior stage are deconstructed into separate component
    pieces; then synthesized and analyzed in several contexts such as cultural traditions, genetic
    heritage, and previous experiences; and then combined together for the purposes of
    analysis and synthesis to inform the next phase.
  • Decide: In this phase, hypotheses are evaluated, and a decision is made.
  • Act: Based on the decision from the prior stage, action is taken to achieve a desired effect or result.

While PDCA improves a known system, making it more effective or efficient (depending on the effect to be expected), OODA strives to provide a framework for situational awareness.

Boyd’s concentration on the specific set of circumstances relevant to military situations meant for years that the OODA loop did not receive widespread interest. I have been seeing a lot of recent adaptations of the OODA loop that expand it to address the needs of operating in volatile, uncertain, complex and ambiguous (VUCA) situations. I especially like seeing it as part of resilience and business continuity.

Enhanced Decision-Making Speed and Agility

The OODA loop enables organizations to make faster, more informed decisions in rapidly changing environments. By continuously cycling through the observe-orient-decide-act process, organizations can respond more quickly to market crises, threats, and emerging opportunities.

Improved Situational Awareness

The observation and orientation phases help organizations maintain a comprehensive understanding of their operating environment. This enhanced situational awareness allows us to identify trends, threats, and opportunities more effectively.

Better Adaptability to Change

The iterative nature of the OODA loop promotes continuous learning and adaptation. This fosters a culture of flexibility and responsiveness, enabling organizations to adjust their strategies and operations as circumstances evolve.

Enhanced Crisis Management

In high-pressure situations or crises, the OODA loop provides a structured approach for rapid, effective decision-making. This can be invaluable for managing unexpected challenges or emergencies.

Improved Team Coordination and Communication

The OODA process encourages clear communication and coordination among team members as they move through each phase. This can lead to better team cohesion and more effective execution of strategies.

Data-Driven Culture

The OODA loop emphasizes the importance of observation and orientation based on current data. This promotes a data-driven culture where decisions are made based on real-time information rather than outdated assumptions.

Continuous Improvement

The cyclical nature of the OODA loop supports ongoing refinement of processes and strategies. Each iteration provides feedback that can be used to improve future observations, orientations, decisions, and actions.

Complementary Perspectives

PDCA is typically used for long-term, systematic improvement projects, while OODA is better suited for rapid decision-making in dynamic environments. Using both allows organizations to address both strategic and tactical needs.

Integration Points

  1. Observation and Planning
    • OODA’s “Observe” step can feed into PDCA’s “Plan” phase by providing real-time situational awareness.
    • PDCA’s structured planning can enhance OODA’s orientation process.
  2. Execution
    • PDCA’s “Do” phase can incorporate OODA loops for quick adjustments during implementation.
    • OODA’s “Act” step can trigger a new PDCA cycle for more comprehensive improvements.
  3. Evaluation
    • PDCA’s “Check” phase can use OODA’s observation techniques for more thorough assessment.
    • OODA’s rapid decision-making can inform PDCA’s “Act” phase for faster course corrections.
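
As a toy illustration of the second integration point above (OODA loops inside PDCA’s “Do” phase), here is a minimal sketch. The proportional adjustment rule, the replan threshold, and the function names are illustrative assumptions, not a prescribed method.

```python
# Toy sketch of nesting fast OODA iterations inside the "Do" phase of a PDCA cycle.

def ooda_iteration(observation: float, target: float) -> float:
    """Observe -> orient -> decide -> act: return a small corrective adjustment."""
    error = target - observation     # observe / orient against the target
    adjustment = 0.5 * error         # decide (simple proportional rule)
    return adjustment                # act

def pdca_cycle(plan_target: float, observations: list[float]) -> dict:
    # Plan: the target and success criterion are defined up front.
    # Do: execute, with rapid OODA corrections along the way.
    corrected = [obs + ooda_iteration(obs, plan_target) for obs in observations]
    # Check: compare the corrected results against the plan.
    residual = sum(abs(plan_target - c) for c in corrected) / len(corrected)
    # Act: decide whether to standardize or trigger a new planning round.
    return {"residual_error": round(residual, 3), "replan": residual > 0.5}

print(pdca_cycle(plan_target=7.0, observations=[6.2, 7.4, 6.8, 7.9]))
```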

Pillars of Good Data

One thing we should all agree on is that we need reliable, accurate, and trustworthy data. That is why we strive for the principles of data governance, data quality, and data integrity: three interconnected concepts that work together to create a robust data management framework.

Overarching Framework: Data Governance

Data governance serves as the overarching framework that establishes the policies, procedures, and standards for managing data within an organization. It provides the structure and guidance necessary for effective data management, including:

  • Defining roles and responsibilities for data management
  • Establishing data policies and standards
  • Creating processes for data handling and decision-making
  • Ensuring compliance with regulations and internal policies

Data governance sets the stage for both data quality and data integrity initiatives by providing the necessary organizational structure and guidelines.

Data Quality: Ensuring Fitness for Purpose

Within the data governance framework, data quality focuses on ensuring that data is fit for its intended use. This involves:

  • Assessing data against specific quality dimensions (e.g., accuracy, completeness, consistency, validity, timeliness)
  • Implementing data cleansing and standardization processes
  • Monitoring and measuring data quality metrics
  • Continuously improving data quality through feedback loops and corrective actions

Data quality initiatives are guided by the policies and standards set forth in the data governance framework, ensuring that quality efforts align with organizational goals and requirements.

Data Integrity: Maintaining Trustworthiness

Data integrity works in tandem with data quality to ensure that data remains accurate, complete, consistent, and reliable throughout its lifecycle. The ALCOA+ principles, widely used in regulated industries, provide a comprehensive framework for ensuring data integrity.

ALCOA+ Principles

Attributable: Ensuring that data can be traced back to its origin and the individual responsible for its creation or modification.

Legible: Maintaining data in a clear, readable format that is easily understandable.

Contemporaneous: Recording data at the time of the event or observation to ensure accuracy and prevent reliance on memory.

Original: Preserving the original record or a certified true copy to maintain data authenticity.

Accurate: Ensuring data correctness and freedom from errors.

Complete: Capturing all necessary information without omissions.

Consistent: Maintaining data coherence across different systems and over time.

Enduring: Preserving data for the required retention period in a format that remains accessible.

Available: Ensuring data is readily accessible when needed for review or inspection.
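
As a hedged illustration of several of these principles in one place, the sketch below shows a minimal audit-trail entry that is attributable, contemporaneous, preserves the original value, and carries an integrity checksum. The field names, example identifiers, and hashing approach are assumptions for illustration, not any specific system's design.

```python
# Illustrative audit-trail entry capturing several ALCOA+ attributes (hypothetical fields).
from dataclasses import dataclass
from datetime import datetime, timezone
import hashlib

@dataclass(frozen=True)
class AuditRecord:
    user_id: str        # Attributable: who made the change
    recorded_at: str    # Contemporaneous: when it was recorded
    action: str
    old_value: str      # Original value preserved
    new_value: str
    reason: str

    def checksum(self) -> str:
        """A simple integrity hash so later tampering is detectable."""
        payload = "|".join([self.user_id, self.recorded_at, self.action,
                            self.old_value, self.new_value, self.reason])
        return hashlib.sha256(payload.encode()).hexdigest()

entry = AuditRecord(
    user_id="jdoe",
    recorded_at=datetime.now(timezone.utc).isoformat(),
    action="update_ph_setpoint",
    old_value="7.0",
    new_value="7.2",
    reason="Corrective action from a (hypothetical) deviation record",
)
print(entry.checksum()[:16])
```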

Additional Data Integrity Measures

Security Measures: Implementing robust security protocols to protect data from unauthorized access, modification, or deletion.

Data Lineage Tracking: Establishing systems to monitor and document data transformations and origins throughout its lifecycle.

Auditability: Ensuring data changes are traceable through comprehensive logging and change management processes.

Data Consistency: Maintaining uniformity of data across various systems and databases.

Data integrity measures are often defined and enforced through data governance policies, while also supporting data quality objectives by preserving the accuracy and reliability of data. By adhering to the ALCOA+ principles and implementing additional integrity measures, organizations can ensure their data remains trustworthy and compliant with regulatory requirements.

Synergy in Action

The collaboration between these three elements can be illustrated through a practical example:

  1. Data Governance Framework: An organization establishes a data governance committee that defines policies for GxP data management, including data quality standards and security requirements.
  2. Data Quality Initiative: Based on the governance policies, the organization implements data quality checks to ensure GxP information is accurate, complete, and up-to-date. This includes:
    • Regular data profiling to identify quality issues
    • Data cleansing processes to correct errors
    • Validation rules to prevent the entry of incorrect data
  3. Data Integrity Measures: To maintain the trustworthiness of GxP data, the organization:
    • Implements access controls to prevent unauthorized modifications
    • Qualifies systems to meet ALCOA+ requirements
    • Establishes audit trails to track changes to GxP records
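
To give a flavor of the validation rules mentioned in step 2, here is a minimal sketch of a data quality check run before a record is accepted. The rule names, fields, and limits are illustrative assumptions.

```python
# Hypothetical validation rules of the kind a data quality check might enforce
# before GxP data is accepted (fields and limits are illustrative).
def validate_batch_record(record: dict) -> list[str]:
    """Return a list of data quality findings; an empty list means the record passes."""
    findings = []
    if not record.get("batch_id"):
        findings.append("Completeness: batch_id is missing")
    if record.get("ph") is not None and not (0.0 <= record["ph"] <= 14.0):
        findings.append("Validity: pH outside the physically possible range")
    if record.get("recorded_at") is None:
        findings.append("Contemporaneous: no timestamp recorded")
    return findings

print(validate_batch_record({"batch_id": "B-001", "ph": 15.2, "recorded_at": None}))
```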

By working together, these elements ensure that:

  • GxP data meets quality standards (data quality)
  • The data has a secure and unaltered lineage (data integrity)
  • All processes align with organizational policies and regulatory requirements (data governance)

Continuous Improvement Cycle

The relationship between data governance, quality, and integrity is not static but forms a continuous improvement cycle:

  1. Data governance policies inform data quality and integrity standards.
  2. Data quality assessments and integrity checks provide feedback on the effectiveness of governance policies.
  3. This feedback is used to refine and improve governance policies, which in turn enhance data quality and integrity practices.

This ongoing cycle ensures that an organization’s data management practices evolve to meet changing business needs and technological advancements.

Data governance, data quality, and data integrity work together as a cohesive system to ensure that an organization’s data is not only accurate and reliable but also properly managed, protected, and utilized in alignment with business objectives and regulatory requirements. This integrated approach is essential for organizations seeking to maximize the value of their data assets while minimizing risks associated with poor data management.

A GMP Application based on ISA S88.01

A great example of data governance in practice is applying ISA S88.01 to enhance batch control processes and improve overall manufacturing operations.

Data Standardization and Structure

ISA S88.01 provides a standardized framework for batch control, including models and terminology that define the physical, procedural, and recipe aspects of batch manufacturing. This standardization directly supports data governance efforts by:

  • Establishing a common language for batch processes across the organization
  • Defining consistent data structures and hierarchies
  • Facilitating clear communication between different departments and systems
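
As a simple illustration of the standard's procedural hierarchy, the sketch below models a hypothetical recipe as procedure, unit procedures, operations, and phases. The process names are made up, and a real implementation would live in the batch control system rather than a script; the point is the shared structure and terminology.

```python
# Toy representation of the ISA S88.01 procedural model hierarchy
# (procedure -> unit procedure -> operation -> phase); names are illustrative.
recipe_procedure = {
    "procedure": "Monoclonal antibody drug substance batch",
    "unit_procedures": [
        {
            "name": "Upstream cell culture",
            "operations": [
                {"name": "Inoculation", "phases": ["Transfer seed", "Verify cell density"]},
                {"name": "Fed-batch culture", "phases": ["Feed addition", "pH control", "Sampling"]},
            ],
        },
        {
            "name": "Harvest",
            "operations": [
                {"name": "Clarification", "phases": ["Depth filtration", "Filter integrity test"]},
            ],
        },
    ],
}

# Walking the hierarchy with consistent terminology is what supports
# shared data structures across departments and systems.
for up in recipe_procedure["unit_procedures"]:
    for op in up["operations"]:
        print(up["name"], "->", op["name"], "->", ", ".join(op["phases"]))
```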

Improved Data Quality

By following the ISA S88.01 standard, organizations can ensure higher data quality throughout the batch manufacturing process:

  • Consistent Data Collection: The standard defines specific data points to be collected at each stage of the batch process, ensuring comprehensive and uniform data capture.
  • Traceability: ISA S88.01 enables detailed tracking of each phase of the batch process, including raw materials used, process parameters, and quality data.
  • Data Integrity: The structured approach helps maintain data integrity by clearly defining data sources, formats, and relationships.

Enhanced Data Management

The ISA S88.01 model supports effective data management practices:

  • Modular Approach: The standard’s modular structure allows for easier management of data related to specific equipment, procedures, or recipes.
  • Scalability: As processes or equipment change, the modular nature of ISA S88.01 facilitates easier updates to data structures and governance policies.
  • Data Lifecycle Management: The standard’s clear delineation of process stages aids in managing data throughout its lifecycle, from creation to archival.

Regulatory Compliance

ISA S88.01 supports data governance efforts related to regulatory compliance:

  • Audit Trails: The standard’s emphasis on traceability aligns with regulatory requirements for maintaining detailed records of batch processes.
  • Consistent Documentation: Standardized terminology and structures facilitate the creation of consistent, compliant documentation.

Decision Support and Analytics

The structured data approach of ISA S88.01 enhances data governance initiatives aimed at improving decision-making:

  • Data Integration: The standard facilitates easier integration of batch data with other enterprise systems, supporting comprehensive analytics.
  • Performance Monitoring: Standardized data structures enable more effective monitoring and comparison of batch processes across different units or sites.

Continuous Improvement

Both data governance and ISA S88.01 support continuous improvement efforts:

  • Process Optimization: The structured data from ISA S88.01 compliant systems can be more easily analyzed to identify areas for process improvement.
  • Knowledge Management: The standard terminology and models facilitate better knowledge sharing and retention within the organization.

By leveraging ISA S88.01 in conjunction with robust data governance practices, organizations can create a powerful framework for managing batch processes, ensuring data quality, and driving operational excellence in manufacturing environments.

CRLs Should List the Third Party Manufacturer

It probably is good for the public interest, and frankly for the manufacturing ecosystem, for the FDA to be directed (and given the authority) to disclose the third party whose “facility-related deficiencies,” identified during a Current Good Manufacturing Practices (cGMP) inspection, result in a Complete Response Letter (CRL).

A little public shaming would probably help deal with widespread structural deficiencies amongst CDMOs.

Something certainly needs to happen; this is happening way too often.
