I don’t like the term validation deviation, preferring to use discrepancy to cover the errors or failures that occur during qualification/validation, such as when the actual results of a test step in a protocol do not match the expected results. These discrepancies can arise for various reasons, including errors in the protocol, execution issues, or external factors.
I also avoid the term deviation because I try to keep terms from becoming overused in too many ways. Choosing discrepancy moves these problems to a lower order so they can be addressed holistically.
Validation discrepancies really get to the heart of deciding whether the given system/process is fit-for-purpose and fit-for-use. As such, they need to be addressed in a timely and pragmatic way.
And, like anything else, having an effective procedure to manage them is critical.
Validation discrepancies are a great example of building problem-solving into a process.
ASTM E2500, the Standard Guide for Specification, Design, and Verification of Pharmaceutical and Biopharmaceutical Manufacturing Systems and Equipment, is intended to “satisfy international regulatory expectations in ensuring that manufacturing systems and equipment are fit for the intended use and to satisfy requirements for design, installation, operation, and performance.”
The ASTM E2500 approach is a comprehensive framework for specification setting, design, and verification of pharmaceutical and biopharmaceutical manufacturing systems and equipment. It emphasizes a risk- and science-based methodology to ensure that systems are fit for their intended use, ultimately aiming to enhance product quality and patient safety.
Despite its 17-year history, it is fair to say it is not the best-implemented standard. There are still many unrealized opportunities and some major challenges. I don’t think a single organization I’ve been in has fully aligned with it, and ASTM E2500 can feel aspirational.
Key Principles
Risk Management: The approach integrates risk management principles from ICH Q8, Q9, and Q10, focusing on identifying and mitigating risks to product quality and patient safety throughout the lifecycle of the manufacturing system.
Good Engineering Practices (GEP): It incorporates GEP to ensure systems are correctly designed, installed, and operated.
Flexibility and Efficiency: It strives for a more flexible and efficient organization of verification activities that can be adapted to each project’s specific context.
Regulatory agencies expect drugmakers to demonstrate that they know their processes and that their facilities, equipment, systems, utilities, and procedures have been established based on concrete data and a thorough risk assessment. The ASTM E2500 standard provides a means of demonstrating that all of these factors have been validated in consideration of carefully evaluated risks.
What the Standard Calls for
Four Main Steps
Requirements: Define the system’s needs and critical aspects. Subject Matter Experts (SMEs) play a crucial role in this phase by defining needs, identifying critical aspects, and developing the verification strategy.
Specification & Design: Develop detailed specifications and design the system to meet the requirements. This step involves thorough design reviews and risk assessments to ensure the system functions as intended.
Verification: Conduct verification activities to confirm that the system meets all specified requirements. This step replaces the traditional FAT/SAT/IQ/OQ/PQ sequence with a more streamlined verification process that can be tailored to the project’s needs.
Acceptance & Release: Finalize the verification process and release the system for operational use. This step includes the final review and approval of all verification activities and documentation.
Four Cross-Functional Processes
Good Engineering Practices (GEP): Ensure all engineering activities adhere to industry standards and best practices.
Quality Risk Management: Continuously assess and manage risks to product quality and patient safety throughout the project.
Design Review: Regularly review the system design to ensure it meets all requirements and addresses identified risks.
Change Management: Implement a structured process for managing system changes to ensure that all modifications are appropriately evaluated and documented.
Applications and Benefits
Applicability: The ASTM E2500 approach can be applied to new and existing manufacturing systems, including laboratory, information, and medical device manufacturing systems.
Lifecycle Coverage: It applies throughout the manufacturing system’s lifecycle, from concept to retirement.
Regulatory Compliance: The approach is designed to conform with FDA, EU, and other international regulations, ensuring that systems are qualified and meet all regulatory expectations.
Efficiency and Cost Management: By focusing on critical aspects and leveraging risk management tools, the ASTM E2500 approach can streamline project execution, reduce time to market, and optimize resource utilization.
The ASTM E2500 approach provides a structured, risk-based framework for specifying, designing, and verifying pharmaceutical and biopharmaceutical manufacturing systems. It emphasizes flexibility, efficiency, and regulatory compliance, making it a valuable tool for ensuring product quality and patient safety.
What Makes it Different?
Each category below contrasts the ASTM E2500 approach with the more traditional qualification approach.
Testing Approach
ASTM E2500: Emphasizes a risk-based approach, focusing on identifying and managing risks to product quality and patient safety throughout the manufacturing system’s lifecycle. This allows flexibility in organizing verification activities based on the specific context and critical aspects of the system.
Traditional: Typically follows a prescriptive sequence of tests (FAT, SAT, IQ, OQ, PQ) as outlined in guidelines like EU GMP Annex 15. This method is more rigid and less adaptable to the specific needs and risks of each project.
Verification vs Qualification
ASTM E2500: The term “verification” encompasses all testing activities, which can be organized more freely and rationally to optimize efficiency. Verification activities are tailored to the project’s needs and focus on critical aspects.
Traditional: Follows a structured qualification process (Installation Qualification, Operational Qualification, Performance Qualification) with predefined steps and documentation requirements.
Role of Subject Matter Experts
ASTM E2500: SMEs play a crucial role from the start of the project, contributing to the definition of needs, identification of critical aspects, system design review, and development of the verification strategy. They are involved throughout the project lifecycle.
Traditional: SMEs are typically involved at specific points in the project lifecycle, primarily during the qualification phases, and may not have as continuous a role.
Integration of Good Engineering Practices
ASTM E2500: Integrates GEP throughout the entire project lifecycle, offering greater flexibility in organizing verification activities and a more efficient, streamlined process. This can lead to reduced time to market and optimized resource utilization.
Traditional: While GEP is also important, the focus is more on the qualification steps rather than integrating GEP throughout the entire project lifecycle.
Change Management
ASTM E2500: Emphasizes early and continuous change management, starting from the supplier’s site, to avoid test duplication and ensure that changes are properly evaluated and documented.
Traditional: Formal change control typically begins later in the project, often only once qualification protocols are in place, which can lead to test duplication and late discovery of changes.
Documentation
ASTM E2500: Documentation is focused on risk management and verification activities, ensuring compliance with international regulations (FDA, EU, ICH Q8, Q9, Q10). The approach is designed to meet regulatory expectations while allowing for flexibility in documentation.
Traditional: Requires extensive documentation for each qualification step, which can be more cumbersome and less adaptable to specific project needs.
Opinion
I’m watching to see what the upcoming update to Annex 15 will do to address the difficulties some see between an ASTM E2500 approach and the European regulations. I also hope we will see an update to ISPE Baseline® Guide Volume 5: Commissioning and Qualification to align the approaches.
ISPE Baseline® Guide Volume 5
Design inputs
Impact assessment
Design Qualification
Commissioning
Multiple trial runs to get things right
IQ, OQ, PQ, and acceptance criteria
GEP scope and QA scope overlapped
Focused on documentation deliverables
Change management
ASTM E2500
Design inputs
Design review
Risk mitigation
Critical control parameters define acceptance criteria
Verification testing
Performance testing
GEP scope and QA scope have a clear boundary
Process, product quality, and patient safety
Quality by Design, design space, and continuous improvement
To be honest, I don’t think ASTM E2500, ISPE Guide 5, or anything else has the balance just right, and your program ends up being a triangulation between these and the regulations. And don’t even get me started on trying to align GAMP5 or USP <1058> or…or…or…
And yes, I do consider this part of my 3-year plan. I look forward to the challenges of a culture shift, increased SME involvement, formalization of GEPs (and teaching engineers how to write), effective change management, timely risk assessments, and comprehensive implementation planning.
Risk management plays a pivotal role in validation by enabling a risk-based approach to defining validation strategies, ensuring regulatory compliance, mitigating product quality and safety risks, facilitating continuous improvement, and promoting cross-functional collaboration. Integrating risk management principles into the validation lifecycle is essential for maintaining control and consistently producing high-quality products in regulated industries such as biotech and medical devices.
We will conduct various risk assessments in our process lifecycle—many ad hoc (static) and a few living (dynamic). Understanding how they fit together in a larger activity set is crucial.
In the Facility, Utilities, Systems, and Equipment (FUSE) space, we take the process understanding, translate it into a design, and then perform Design Qualification (DQ) to verify that the critical aspects (CAs) and critical design elements (CDEs) necessary to control risks identified during the quality risk assessment (QRA) are present in the design. This helps mitigate risks to product quality and patient safety. To do this, we need to properly understand the process. Unfortunately, we often start with design before understanding the process and then need to go back and perform rework. Too often I see the dFMEA ignored, or treated merely as an input to the pFMEA, instead of the two working together in a full risk management cycle.
The Preliminary Hazard Analysis (PHA) supports a pFMEA, which supports a dFMEA, which in turn feeds back into the pFMEA (which also benefits at this stage from a HACCP). Tools fit together to provide the approach. Tools do not become the approach.
Design and Process FMEAs
DFMEA (Design Failure Mode and Effects Analysis) and PFMEA (Process Failure Mode and Effects Analysis) are both methodologies within the broader FMEA framework used to identify and mitigate potential failures, but they focus on different aspects of development and manufacturing.
Each category below contrasts DFMEA with PFMEA.
Scope and Focus
DFMEA: Primarily scrutinizes the design to preempt flaws.
PFMEA: Focuses on processes to ensure effectiveness, efficiency, and reliability.
Stakeholder Involvement
DFMEA: Engages design-oriented teams like engineering, quality engineers, and reliability engineers.
PFMEA: Involves operation-centric personnel such as manufacturing, quality control, quality operations, and process engineers.
Inputs and Outputs
DFMEA: Relies on design requirements, product specs, and component interactions to craft a robust product.
PFMEA: Utilizes process steps, equipment capabilities, and parameters to design a stable operational process.
Stages in lifecycle
DFMEA: Conducted early in development, concurrent with the design phase; it aids in early issue detection and minimizes design impact.
PFMEA: Executed in production planning after the design is finalized, ensuring optimized operations prior to full-scale production.
Updated When
DFMEA: Updated whenever the design changes or new design-related failure modes are discovered.
PFMEA: Updated whenever the process changes, or when deviations and investigations reveal new process failure modes; it should be treated as a living document.
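Whichever FMEA is in play, the scoring mechanics are the same: each failure mode receives severity, occurrence, and detection scores, and their product, the Risk Priority Number (RPN), ranks where mitigation effort should go first. A minimal sketch (the failure modes and scores below are hypothetical):

```python
def rpn(severity, occurrence, detection):
    """Risk Priority Number: each factor is conventionally scored 1-10."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally 1-10")
    return severity * occurrence * detection

# Hypothetical failure modes from a dFMEA worksheet
failure_modes = [
    ("seal leak at filling needle", rpn(9, 3, 4)),  # 108
    ("HMI mislabels batch ID", rpn(7, 2, 2)),       # 28
]

# Rank highest risk first so mitigation effort goes where it matters most
failure_modes.sort(key=lambda fm: fm[1], reverse=True)
```

Many programs also act on any failure mode whose severity alone exceeds a threshold, regardless of RPN, since a low occurrence score should not mask a catastrophic effect.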
The design qualification phase is especially suitable for determining risks to products and patients stemming from the equipment or machine. These risks should be identified during design qualification and reflected by appropriate measures in the draft design so that the operator can effectively eliminate, adequately control, and monitor or observe them. To identify defects in the mechanical design or in the configuration of electronic systems early, and to eliminate them at low cost, it is advisable to perform the following risk analysis activities for systems, equipment, or processes:
Categorize the GMP criticality and identify the critical quality attributes and process parameters;
Categorize the requirements regarding the patient impact and product impact (for example, in the form of a trace matrix);
Identify critical functions and system elements (e.g., the definition of a calibration concept and preventive maintenance);
Investigate functions for defect recognition. This includes checking alarms and fault indications, operator error, etc. The result of this risk analysis may be the definition of further maintenance activities, a different assessment of a measurement point, or the identification of topics to include in the operating manuals or procedures.
Additional risk analyses for verifying the design may include usability studies using equipment mock-ups or preliminary production trials (engineering studies) regarding selected topics to prove the feasibility of specific design aspects (e.g., interaction between machine and materials).
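The trace matrix suggested above can start as a simple mapping from requirement IDs to impact flags and the verification tests that cover them; a few lines of code can then surface critical requirements with no coverage. A minimal sketch (requirement and test IDs are hypothetical):

```python
# Hypothetical trace matrix: requirement IDs mapped to impact flags
# and the verification tests that cover them
trace_matrix = {
    "URS-001": {"patient_impact": True, "product_impact": True, "tests": ["OQ-12"]},
    "URS-002": {"patient_impact": False, "product_impact": True, "tests": ["OQ-07", "PQ-03"]},
    "URS-003": {"patient_impact": True, "product_impact": False, "tests": []},
}

def untested_critical(matrix):
    """Requirement IDs with patient or product impact but no test coverage."""
    return [
        req_id for req_id, row in matrix.items()
        if (row["patient_impact"] or row["product_impact"]) and not row["tests"]
    ]

print(untested_critical(trace_matrix))  # → ['URS-003']
```

The same structure extends naturally to a full requirements-to-verification traceability report.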
Too often, we misunderstand risk assessments and start doing them at the most granular level. This approach allows us to right-size our risk assessments and holistically look at the entire lifecycle.
Commissioning, qualification, and validation are three distinct but interrelated processes in the pharmaceutical and biotechnology industries that ensure facilities, equipment, systems, and processes meet regulatory requirements and produce products of the desired quality. Here are the key differences:
Commissioning
Commissioning is a systematic process of ensuring that equipment, systems, and facilities are designed, installed, and functioning according to operational and engineering requirements.
It involves design reviews, installation verification, functional testing, and handover to operations.
Commissioning primarily focuses on satisfying engineering requirements and does not have direct regulatory requirements.
Qualification
Qualification is a regulated and documented process that demonstrates that equipment, systems, and facilities are installed correctly and operate as intended for their specific use.
It applies only to equipment, systems, and utilities that directly or indirectly impact product quality and patient safety.
Qualification activities include Installation Qualification (IQ), Operational Qualification (OQ), and Performance Qualification (PQ).
Regulatory authorities such as the FDA and EMA focus on qualification to ensure compliance.
Validation
Validation is a broader concept establishing documented evidence that a process consistently produces a product that meets its predetermined specifications and quality attributes.
It encompasses the entire process lifecycle, including process design, qualification of equipment/systems, and continued process verification.
Validation ensures that the equipment and systems are qualified and the entire process is controlled to produce the desired final product.
In summary, commissioning verifies engineering requirements, qualification demonstrates suitability for intended use, and validation provides a high degree of assurance that the process will consistently produce a quality product. These activities are interconnected, with commissioning often leveraged during qualification and qualification being a subset of the overall validation process.
FDA’s Framework for Process Validation
The FDA’s Process Validation Guidance is a core document outlining a lifecycle approach with three main stages:
Stage 1: Process Design
Establish a process design based on knowledge gained through development and scale-up activities.
Identify critical quality attributes (CQAs) and critical process parameters (CPPs) using risk assessment and multivariate studies like Design of Experiments (DoE).
Develop a control strategy to ensure CQAs are met.
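The multivariate studies mentioned above start from an experimental design; a two-level full-factorial layout is the simplest DoE. A sketch with hypothetical CPPs and levels:

```python
from itertools import product

# Hypothetical critical process parameters with low/high levels
cpps = {
    "temperature_C": (36.5, 37.5),
    "pH": (6.9, 7.1),
    "agitation_rpm": (80, 120),
}

def full_factorial(factors):
    """Generate every combination of factor levels as run dictionaries."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

runs = full_factorial(cpps)  # 2 levels ^ 3 factors = 8 runs
```

Real studies would randomize run order and usually add center points to detect curvature; fractional designs keep run counts manageable as the factor count grows.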
Stage 2: Process Qualification
Evaluate the process design through facility, utility, and equipment qualification.
Conduct performance qualification (PQ) by running production batches to confirm that the process design yields reproducible commercial manufacturing.
Establish scientific evidence that the process meets all defined requirements and product specifications.
Stage 3: Continued Process Verification
Maintain the validated status and monitor performance to ensure a state of control.
Conduct product quality reviews periodically to evaluate process performance.
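Continued process verification typically leans on control charting: limits are computed from a historical baseline, and new batches are flagged when they fall outside them. A minimal Shewhart-style sketch with hypothetical potency data (a production implementation would estimate sigma from moving ranges and apply the full trending rules):

```python
from statistics import mean, stdev

def control_limits(baseline, k=3):
    """Shewhart-style limits from a historical baseline: mean +/- k*stdev."""
    m, s = mean(baseline), stdev(baseline)
    return m - k * s, m + k * s

def flag_batches(new_values, baseline, k=3):
    """Return (index, value) pairs falling outside the baseline limits."""
    lcl, ucl = control_limits(baseline, k)
    return [(i, v) for i, v in enumerate(new_values) if not lcl <= v <= ucl]

# Hypothetical potency results (% of label claim): a 20-batch baseline,
# then three new commercial batches to trend against it
baseline = [99.1, 100.4, 98.7, 99.9, 101.2, 100.1, 99.5, 100.8,
            99.3, 100.0, 99.8, 100.6, 99.2, 100.3, 99.7, 100.9,
            99.4, 100.2, 99.6, 100.5]
print(flag_batches([100.1, 104.5, 99.8], baseline))  # → [(1, 104.5)]
```

Any flagged batch then feeds the discrepancy/investigation process rather than being silently absorbed into the trend.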
The guidance emphasizes using a science and risk-based approach throughout the lifecycle, leveraging process understanding and knowledge gained from development through commercial production. Effective process validation requires good planning, documented evidence, and a robust quality system.
The User Requirements are a foundational document identifying the system’s product and process requirements. These product quality-related user requirements are based on product knowledge (CQAs), process knowledge (CPPs), regulatory requirements, and organization/site quality requirements. Writing a good user requirement for quality requirements involves several critical steps to ensure clarity, specificity, and effectiveness.
Understand the User Needs
Start by thoroughly understanding the user’s needs. This involves engaging with the end users or stakeholders to gather insights about their expectations, pain points, and the context in which the system will be used. This foundational step ensures that the requirements you develop are aligned with actual user needs and business goals.
Be Specific and Use Clear Language
Requirements should be specific and clearly stated to avoid ambiguity. Use simple, direct language and avoid technical jargon unless it is widely understood by all stakeholders. Define all terms and ensure that each requirement is phrased in a way that leaves no room for misinterpretation.
Make Requirements Measurable and Testable
Each requirement should be measurable and testable. This means stating requirements so one can verify whether they have been met. For example, instead of saying, “The system should load fast,” specify, “The system should load within 3 seconds when the number of simultaneous users is less than 10,000.”
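A requirement written at that level of specificity translates directly into an automated check. A sketch where `load_dashboard` is a hypothetical stand-in for whatever operation the requirement constrains:

```python
import time

def load_dashboard(simultaneous_users):
    """Hypothetical system under test; a real check would drive the actual system."""
    time.sleep(0.01)  # placeholder for the real load
    return "ok"

def test_load_time_requirement():
    """URS: the system shall load within 3 seconds with fewer than 10,000 users."""
    start = time.perf_counter()
    load_dashboard(simultaneous_users=9_999)
    elapsed = time.perf_counter() - start
    assert elapsed < 3.0, f"load took {elapsed:.2f}s, requirement is < 3s"

test_load_time_requirement()
```

The vague version (“should load fast”) cannot be written as an assertion at all, which is a quick litmus test for whether a requirement is verifiable.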
Avoid Using Ambiguous Terms
Avoid terms open to interpretation, such as “user-friendly” or “fast.” If such terms are necessary, clearly define what they specifically mean in the context of your project. For instance, “user-friendly” might be defined as “the user can complete the desired task in no more than three clicks.”
Use the SMART Criteria
Employ the SMART criteria to ensure that each requirement is Specific, Measurable, Achievable, Relevant, and Time-bound. This approach helps set clear expectations and facilitates easier validation and verification of the requirements.
Make Requirements Concise but Comprehensive
While keeping each requirement concise and to the point is important, ensure all necessary details are included. Each requirement should be complete and provide enough detail for designers and developers to implement without making assumptions.
Prioritize Requirements
Not all requirements are equally important. Prioritize them based on their impact on the users and the business objectives. This helps manage the project scope and focuses on delivering maximum value.
It is good to categorize the user requirements here, such as:
Quality
Business
Health, Safety, and Environmental (HSE)
Review and Validate with Stakeholders
Review the requirements regularly with all stakeholders, including end-users, project managers, developers, and testers. This collaborative approach helps identify gaps or misunderstandings early in the project lifecycle.
Maintain a Living Document
Requirements might evolve as new information emerges or business needs change. Maintain your requirements document as a living document, regularly update it, and communicate changes to all stakeholders.
Use Models and Examples
Where applicable, use diagrams, mock-ups, or prototypes to complement the written requirements. Visual aids can help stakeholders better understand the requirements and provide feedback.
When writing user requirements for quality requirements, it’s crucial to avoid common pitfalls that can lead to misunderstandings, scope creep, and, ultimately, a product that does not meet the needs of the users or stakeholders. Here are some of the most common mistakes to avoid:
Ambiguity and Lack of Clarity
One of the most frequent errors in writing requirements is ambiguity. Requirements should be clear and concise, with no room for interpretation. Using vague terms like “user-friendly” or “fast” without specific definitions can lead to unmet expectations because different people may interpret these terms differently.
Incomplete Requirements
Another common issue is incomplete requirements that do not capture all necessary details or scenarios. This can result in features that do not fully address the users’ needs or require costly revisions later in development.
Overlooking Non-Functional Requirements
Focusing solely on what the system should do (functional requirements) without considering how it should perform (non-functional requirements), such as performance, security, and usability, can jeopardize the system’s effectiveness and user satisfaction.
Failure to Involve Stakeholders
Not involving all relevant stakeholders in the requirements gathering and validation process can lead to missing critical insights or requirements important to different user groups. This often results in a product that does not fully meet the needs of all its users.
Scope Creep
Without a clear definition of scope, projects can suffer from scope creep, where additional features and requirements are added without proper review, leading to delays and budget overruns. It’s important to have a well-defined project scope and a change management process in place.
Not Prioritizing Requirements
Not all requirements are equally important. Failing to prioritize requirements can misallocate resources and efforts on less critical features. Using prioritization techniques like MoSCoW (Must have, Should have, Could have, Won’t have this time) can help manage and focus efforts on what truly matters.
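MoSCoW prioritization becomes mechanical once each requirement carries a priority tag. A sketch with hypothetical requirements:

```python
from collections import defaultdict

# Hypothetical requirements tagged with MoSCoW priorities
requirements = [
    ("URS-001", "Audit trail for all GMP-relevant data", "Must"),
    ("URS-002", "Single sign-on integration", "Should"),
    ("URS-003", "Dark-mode user interface", "Could"),
    ("URS-004", "Mobile client", "Won't"),
]

def by_priority(reqs):
    """Group requirement IDs into MoSCoW buckets."""
    buckets = defaultdict(list)
    for req_id, _text, priority in reqs:
        buckets[priority].append(req_id)
    return dict(buckets)

# "Must" items define the minimum fit-for-use scope
print(by_priority(requirements)["Must"])  # → ['URS-001']
```

Keeping the priority alongside the requirement also makes scope-creep conversations concrete: a new feature must land in a bucket before it lands in the plan.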
Lack of Validation and Verification
Skipping the validation (ensuring the product meets the intended use and needs of the stakeholders) and verification (ensuring the product meets the specified requirements) processes can lead to a final product not aligned with user needs and expectations.
Poor Documentation and Traceability
Inadequate documentation and lack of traceability can lead to confusion and errors during development. Maintaining detailed documentation and traceability from requirements through to implementation is crucial to ensure consistency and completeness.
Ignoring the Importance of Clear Communication
Effective communication is essential throughout the requirements process. Miscommunication can lead to incorrect or misunderstood requirements being developed. Regular, clear communication and documentation updates are necessary to keep all stakeholders aligned.
Not Considering the Testing of Requirements
It is important to consider, during the definition phase, how requirements will be tested. This helps ensure that requirements are testable and that the final product will meet them. Planning for testing early can also highlight potential issues with the clarity or feasibility of requirements.