I think folks tend to fall into a trap when it comes to equipment and GAMP5, automatically assuming that because it is equipment it must be Category 3. Oh, how that can lead to problems.
When thinking about equipment, it is best to think in terms of “No Configuration” and “Low Configuration” software. This terminology describes software that requires little to no configuration or customization to meet the user’s needs.
No Configuration (NoCo) aligns with GAMP 5 Category 3 software, which is described as “Non-Configured Products”. These are commercial off-the-shelf software applications that are used as-is, without any customization or with only minimal parameter settings. My microwave is NoCo.
Low Configuration (LoCo) typically falls between Category 3 and Category 4 software. It refers to software that requires some configuration, but not to the extent of fully configurable systems. My PlayStation is LoCo.
The distinction between these categories is important for determining the appropriate validation approach:
Category 3 (NoCo) software generally requires less extensive validation effort, as it is used without significant modification. In many cases the testing can be implicit.
Software with low configuration may require a bit more scrutiny in validation, but still less than fully configurable or custom-developed systems.
Remember that GAMP 5 emphasizes a continuum approach rather than strict categorization. The level of validation effort should be based on the system’s impact on patient safety, product quality, and data integrity, as well as the extent of configuration or customization.
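To make that continuum concrete, here is a minimal, hypothetical sketch of how an organization might encode a first-pass rigor decision from GxP impact and configuration extent. The scoring, category names, and effort bands below are illustrative assumptions of mine, not GAMP 5 text, and any real decision would sit in a documented risk assessment.

```python
# Hypothetical first-pass heuristic for scaling validation rigor.
# Scores and labels are illustrative assumptions, not GAMP 5 text.

GXP_IMPACT = {"none": 0, "indirect": 1, "direct": 2}
CONFIGURATION = {"none": 0, "low": 1, "extensive": 2, "custom": 3}

def suggested_rigor(gxp_impact: str, configuration: str) -> str:
    """Return a rough validation-effort band for planning discussions."""
    score = GXP_IMPACT[gxp_impact] + CONFIGURATION[configuration]
    if score <= 1:
        return "minimal (lean on supplier documentation, implicit testing)"
    if score <= 3:
        return "moderate (risk-based verification of the configuration)"
    return "extensive (full configuration/custom testing with traceability)"

print(suggested_rigor("direct", "low"))        # e.g. an electronic chart recorder
print(suggested_rigor("direct", "extensive"))  # e.g. a chromatography data system
```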
When is Something Low Configuration?
Low Configuration refers to software that requires minimal setup or customization to meet user needs, falling between Category 3 (Non-Configured Products) and Category 4 (Configured Products) software. Here’s a breakdown of what counts as low configuration:
Parameter settings: Software that allows basic parameter adjustments without altering core functionality.
Limited customization: Applications that permit some tailoring to specific workflows, but not extensive modifications.
Standard modules: Software that uses pre-built, configurable modules to adapt to business processes.
Default configurations: Systems that can be used with supplier-provided default settings or with minor adjustments.
Simple data input: Applications that allow input of specific data or ranges, such as electronic chart recorders with input ranges and alarm setpoints.
Basic user interface customization: Software that allows minor changes to the user interface without altering underlying functionality.
Report customization: Systems that permit basic report formatting or selection of data fields to display.
Simple workflow adjustments: Applications that allow minor changes to predefined workflows without complex programming.
It’s important to note that the distinction between low configuration and more extensive configuration (Category 4) can sometimes be subjective. The key is to assess the extent of configuration required and its impact on the system’s core functionality and GxP compliance. Organizations should document their rationale for categorization in system risk assessments or validation plans.
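As an illustration, the entire “configuration” of a low-configuration device often amounts to a handful of supplier-defined parameter values, like the chart recorder mentioned above. The device and parameter names below are hypothetical; the point is that only values change, not functionality.

```python
# Hypothetical parameter set for a low-configuration (LoCo) electronic chart recorder.
# Only supplier-defined values are set; no workflows, logic, or interfaces are altered.
chart_recorder_config = {
    "channel_1": {
        "signal": "temperature",
        "input_range_c": (2.0, 8.0),   # e.g. cold-room monitoring
        "alarm_low_c": 2.5,
        "alarm_high_c": 7.5,
    },
    "recording_interval_s": 60,
    "report_fields": ["timestamp", "temperature", "alarm_state"],  # field selection only
}

# Verification then focuses on confirming these values against the approved specification.
channel = chart_recorder_config["channel_1"]
assert channel["input_range_c"][0] <= channel["alarm_low_c"] < channel["alarm_high_c"] <= channel["input_range_c"][1]
```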
| Attribute | Category 3 (No Configuration) | Low Configuration | Category 4 |
| --- | --- | --- | --- |
| Configuration Level | No configuration | Minimal configuration | Extensive configuration |
| Parameter Settings | Fixed or minimal | Basic adjustments | Complex adjustments |
| Customization | None | Limited | Extensive |
| Modules | Pre-built, non-configurable | Standard, slightly configurable | Highly configurable |
| Default Settings | Used as-is | Minor adjustments | Significant modifications |
| Data Input | Fixed format | Simple data/range input | Complex data structures |
| User Interface | Fixed | Basic customization | Extensive customization |
| Workflow Adjustments | None | Minor changes | Significant alterations |
| User Account Management | Basic, often single-user | Limited user roles and permissions | Advanced user management with multiple roles and access levels |
| Report Customization | Pre-defined reports | Basic formatting/field selection | Advanced report design |
| Example Equipment | pH meter | Electronic chart recorder | Chromatography data system |
| Validation Effort | Minimal | Moderate | Extensive |
| Risk Level | Low | Low to Medium | Medium to High |
| Supplier Documentation | Heavily relied upon | Partially relied upon | Supplemented with in-house testing |
Here’s the thing to be aware of: a lot of equipment these days is more Category 4 than Category 3, as manufacturers include all sorts of features such as user account management, trending, and configurable reports. And to be frank, I’ve seen too many situations where the validation of Programmable Logic Controllers (PLCs) didn’t take into account all the configuration of standard function libraries used to control specific manufacturing processes.
Your methodology needs to keep up with the technological growth curve.
Facility design and manufacturing processes are complex, multi-stage operations, fraught with difficulty. Ensuring the facility meets Good Manufacturing Practice (GMP) standards and other regulatory requirements is a major challenge. The complex regulations around biomanufacturing facilities require careful planning and documentation from the earliest design stages.
Which is why consensus standards like ASTM E2500 exist.
Central to these approaches is risk assessment, which has three primary components:
An understanding of the uncertainties in the design (which includes materials, processing, equipment, personnel, environment, detection systems, feedback control)
An identification of the hazards and failure mechanisms
An estimation of the risks associated with each hazard and failure
Folks often get tied up over which tool to use. Frankly, this is a phased approach: we start with a PHA for design, an FMEA for verification, and a HACCP/layers-of-control analysis for acceptance. Throughout, we use a bow-tie for communication.
| Aspect | Bow-Tie | PHA (Preliminary Hazard Analysis) | FMEA (Failure Mode and Effects Analysis) | HACCP (Hazard Analysis and Critical Control Points) |
| --- | --- | --- | --- | --- |
| Primary Focus | Visualizing risk pathways | Early hazard identification | Potential failure modes | Systematically identify, evaluate, and control hazards that could compromise product safety |
| Timing in Process | Any stage | Early development | Any stage, often design | Throughout production |
| Approach | Combines causes and consequences | Top-down | Bottom-up | Systematic prevention |
| Complexity | Moderate | Low to moderate | High | Moderate |
| Visual Representation | Central event with causes and consequences | Tabular format | Tabular format | Flow diagram with CCPs |
| Risk Quantification | Can include, not required | Basic risk estimation | Risk Priority Number (RPN) | Not typically quantified |
| Regulatory Alignment | Less common in pharma | Aligns with ISO 14971 | Widely accepted in pharma | Less common in pharma |
| Critical Points | Identifies barriers | Does not specify | Identifies critical failure modes | Identifies Critical Control Points (CCPs) |
| Scope | Specific hazardous event | System-level hazards | Component or process-level failures | Process-specific hazards |
| Team Requirements | Cross-functional | Less detailed knowledge needed | Detailed system knowledge | Food safety expertise |
| Ongoing Management | Can be used for monitoring | Often updated periodically | Regularly updated | Continuous monitoring of CCPs |
| Output | Visual risk scenario | List of hazards and initial risk levels | Prioritized list of failure modes | HACCP plan with CCPs |
| Typical Use in Pharma | Risk communication | Early risk identification | Detailed risk analysis | Product safety/contamination control |
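As a small illustration of the quantification row above, an FMEA commonly scores each failure mode for severity, occurrence, and detectability and multiplies them into a Risk Priority Number (RPN). The failure modes and 1–10 scores in this sketch are invented for the example, not taken from any real assessment.

```python
# Minimal FMEA sketch: Risk Priority Number = severity x occurrence x detection.
# Failure modes and 1-10 scores are invented for illustration only.
failure_modes = [
    {"mode": "Filter integrity failure", "severity": 9, "occurrence": 2, "detection": 3},
    {"mode": "Temperature excursion during hold step", "severity": 7, "occurrence": 4, "detection": 2},
    {"mode": "Wrong buffer added at set-up", "severity": 8, "occurrence": 3, "detection": 6},
]

for fm in failure_modes:
    fm["rpn"] = fm["severity"] * fm["occurrence"] * fm["detection"]

# The FMEA output noted in the table above: a prioritized list of failure modes.
for fm in sorted(failure_modes, key=lambda f: f["rpn"], reverse=True):
    print(f"{fm['mode']}: RPN {fm['rpn']}")
```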
At BOSCON this year I’ll be talking about this in fascinating detail, perhaps too much detail.
A recent FDA Warning Letter really drove home a good point about the perils of ‘retrospective validation’ and how that normally doesn’t mean what folks want it to mean.
“In lieu of process validation studies, you attempted to retrospectively review past batches without scientifically establishing blend uniformity and other critical process performance indicators. You do not commit to conduct further process performance qualification studies that scientifically establish the ability of your manufacturing process to consistently yield finished products that meet their quality attributes.”
The FDA’s response here drives home three truths:
Validation needs to be done against critical quality attributes and critical process parameters to scientifically establish that the manufacturing process is consistent (see the sketch after this list).
Batch data on its own is rather useless.
Validation is a continuous exercise; it is not once-and-done (or, in most people’s view, thrice-and-done).
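For illustration of that first point, “scientifically establishing” consistency against a CQA usually means demonstrating capability against predefined acceptance criteria, not just noting that past batches passed. The sketch below computes a simple process capability index (Cpk) for a hypothetical CQA; the specification limits and batch results are invented for the example.

```python
# Minimal sketch: process capability (Cpk) for a hypothetical CQA, e.g. assay (% label claim).
# Specification limits and batch results are invented for illustration only.
from statistics import mean, stdev

lsl, usl = 95.0, 105.0
batches = [99.2, 100.1, 98.7, 101.3, 99.8, 100.6, 99.4, 100.9, 98.9, 100.2]

mu, sigma = mean(batches), stdev(batches)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

print(f"mean={mu:.2f}  sd={sigma:.2f}  Cpk={cpk:.2f}")
# A Cpk comfortably above ~1.33 supports a claim of consistency; a list of passing batches does not.
```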
I don’t think the current GMPs really allow the concept of retrospective validation as most people want it to mean (including the recipient of that warning letter). It’s probably a term we should consign to the big box of Nope.
Retrospective validation, as most people mean it, is a type of process validation that involves evaluating historical data and records to demonstrate that an existing process consistently produces products meeting predetermined specifications.
The problem here is that this really just tells you what you were already hoping was true.
Retrospective validation has some major flaws:
Limited control over data quality and completeness: Since retrospective validation relies on historical data, there may be gaps or inconsistencies in the available information. The data may not have been collected with validation in mind, leading to missing critical parameters or measurements. It rather throws out most of the principles of science.
Potential bias in existing data: Historical data may be biased or incomplete, as it was not collected specifically for validation purposes. This can make it difficult to draw reliable conclusions about process performance and consistency.
Difficulty in identifying and addressing hidden flaws: Since the process has been in use for some time, there may be hidden flaws or issues that have not been identified or challenged. These could potentially lead to non-conforming products or hazardous operating conditions.
Difficulty in recreating original process conditions: It may be challenging to accurately recreate or understand the original process conditions under which the historical data was generated, potentially limiting the validity of conclusions drawn from the data.
What is truly called for is to perform concurrent validation.
The biotech industry is experiencing a significant transformation in validation processes, driven by rapid technological advancements, evolving regulatory standards, and the development of novel therapies.
The 2024 State of Validation report, authored by Jonathan Kay and funded by Kneat, provides an overview of trends and challenges in the validation industry. Here are some of the key findings:
Compliance and efficiency are top priorities: Creating process efficiencies and ensuring audit readiness have become the primary goals for validation programs.
Compliance burden emerged as the top validation challenge in 2024, replacing the shortage of human resources, which was the top concern in 2022-2023.
Digital transformation is accelerating: 83% of respondents are either using or planning to adopt digital validation systems. The top benefits include improved data integrity, continuous audit readiness, and global standardization.
79% of those using digital validation rely on third-party software providers
Does this mean that 21% of respondents are in companies that have created their own bespoke systems? Or is something else going on there?
63% reported that ROI from digital validation systems met or exceeded expectations
Artificial intelligence and machine learning are on the rise: 70% of respondents believe AI and ML will play a pivotal role in the future of validation.
Remote audits are becoming more common: 75% of organizations conducted at least some remote regulatory audits in the past year.
Challenges persist: The industry faces ongoing challenges in balancing costs, attracting talent, and keeping pace with technological advancements.
61% reported an increase in validation workload over the past 12 months
Industry 4.0 adoption is growing: 60% of organizations are in the early stages or actively implementing Industry/Pharma 4.0 technologies.
As highlighted in the 2024 State of Validation report and my previous blog post on “Challenges in Validation,” several key trends and challenges are shaping the future of validation in biotech:
Technological Integration: The integration of AI, machine learning, and automation into validation processes presents both opportunities and challenges. While these technologies offer the potential for increased efficiency and accuracy, they also require new validation frameworks and methodologies.
Regulatory Compliance: Keeping pace with evolving regulatory standards remains a significant challenge. Regulatory bodies are continuously updating guidelines to address technological advancements, requiring companies to stay vigilant and adaptable.
Data Management and Integration: With the increasing use of digital tools and platforms, managing and integrating vast amounts of data has become a critical challenge. The industry is moving towards more robust data analytics and machine learning tools to handle this data efficiently.
Resource Constraints: Particularly for smaller biotech companies, resource limitations in funding, personnel, and expertise can hinder the implementation of advanced validation techniques.
Risk Management: Adopting a risk-based approach to validation is essential but challenging. Companies must develop effective strategies to identify and mitigate risks throughout the product lifecycle.
Collaboration and Knowledge Sharing: Ensuring effective communication and data sharing among various stakeholders is crucial for streamlining validation efforts and aligning goals.
Digital Transformation: The industry is witnessing a shift from traditional, paper-heavy validation methods to more dynamic, data-driven, and digitalized processes. This transformation promises enhanced efficiency, compliance, and collaboration.
Workforce Development: We are a heavily experience-driven field. With 38% of validation professionals having 16 or more years of experience, there’s a critical need for knowledge transfer and training to equip newer entrants with the necessary skills.
Adoption of Computer Software Assurance (CSA): The industry is gradually embracing CSA processes, driven by recent FDA guidance, though there’s still considerable room for further adoption. I always find it disappointing when this shows up in surveys, as CSA is a bit of a racket: it is basically existing validation principles repackaged. But consultants got to consult.
Focus on Efficiency and Audit Readiness: Creating process efficiencies and ensuring audit readiness have emerged as top priorities for validation programs.
As the validation landscape continues to evolve, it’s crucial for biotech companies to embrace these changes proactively. By leveraging new technologies, fostering collaboration, and focusing on continuous improvement, the industry can overcome these challenges and drive innovation in validation processes.
The future of validation in biotech lies in striking a balance between technological advancement and regulatory compliance, all while maintaining a focus on product quality and patient safety. As we move forward, it’s clear that the validation field will continue to be dynamic and exciting, offering numerous opportunities for innovation and growth.
Organizational competencies are the skills, abilities, and knowledge that allow an organization to be successful in achieving its goals. They form the foundation of an organization’s culture, values, and strategy.
Organizational competencies can be broadly divided into two main categories:
Technical Competencies
Non-Technical Competencies (also called General Competencies)
Technical Competencies
Technical competencies are specific skills and knowledge required to perform particular jobs or functions within an organization. They are directly related to the core business activities and technical aspects of the work. For technical competencies:
They cover various fields of expertise relevant to the specific work carried out in the organization
They are at the heart of what the organization’s employees do
They allow an organization to produce products or services efficiently and effectively
They often require ongoing training and reinforcement to stay current
Non-Technical Competencies
Non-technical competencies, also known as general competencies or soft skills, are broader skills and attributes that are important across various roles and functions.
These competencies are crucial for effective interaction, collaboration, and overall organizational success.
Organizational Competencies for Validation (an example)
For an organization focusing on validation, the following competencies would be particularly relevant:
Technical Competencies
Each skill area below is described by its key aspects and by proficiency levels running from Beginner through Intermediate and Advanced to Expert.

General CQV Principles
Key aspects: Modern process validation and guidance; validation design and how to reduce variability.
Beginner: Able to review a basic protocol.
Intermediate: Able to review/approve validation document deliverables; understands the importance of a well-defined URS; able to be QEV lead on a small project.
Advanced: Able to answer questions and guide others in QEV; participates in process improvement; able to review and approve RTM/SRs; able to be QEV lead on a large project.
Expert: Trains and mentors others in QEV; leads process improvement initiatives; able to provide Quality oversight on the creation of Validation Plans for complex systems and/or projects; sets overall CQV strategy; recognized as an expert outside of JEB.

Facilities and Utilities
Key aspects: Oversee facilities, HVAC, and controlled environments; pharma water and WFI; pure steam, compressed air, and medical gases.
Beginner: Understands the principles and GMP requirements.
Intermediate: Applies the principles, activities, and deliverables that constitute an efficient and acceptable approach to demonstrating facility fitness-for-use/qualification.
Advanced: Guides the design-to-qualification process for new facilities/utilities or the expansion of existing facilities/utilities.
Expert: Able to establish best practices.

Systems and Equipment
Key aspects: Equipment, including lab equipment.
Beginner: Understands the principles and GMP requirements.
Intermediate: Applies the principles, activities, and deliverables that constitute an efficient and acceptable approach to demonstrating equipment fitness-for-use/qualification.
Advanced: Able to provide overall strategy for large projects; able to be QEV lead on complex systems and equipment.
Expert: Able to establish best practices.

Computer Systems and Data Integrity
Key aspects: Computer lifecycle, including validation.
Beginner: Understands the principles and GMP requirements.
Intermediate: Able to review CSV documents; applies a GAMP 5 risk-based approach; provides day-to-day quality oversight.
Advanced: Able to provide overall strategy for a risk-based GAMP 5 approach to computer system quality.
Expert: Able to establish best practices.

Asset Lifecycle
Key aspects: Quality oversight and decision making across the asset lifecycle: plan, acquire, use, maintain, and dispose of assets.
Beginner: Can use the CMMS to look up calibrations, calibration schedules, and PM schedules.
Intermediate: Quality oversight of asset lifecycle decisions; able to provide oversight on calibration/PM frequency; able to assess impact to the validated state for corrective work orders.
Advanced: Able to establish the asset lifecycle for new equipment classes; establishes risk-based PM and verification for new asset classes.
Expert: Establishes the asset lifecycle approach; serves as the organization’s authority on GMP requirements related to asset management in biotech facilities.

Cleaning, Sanitization and Sterilization Validation
Key aspects: Evaluate and execute cleaning practices, limit calculations, scientific rationales, and validation documents; manage the challenges of multi-product facilities in the establishment of limits, determination of validation strategies, and maintaining the validated state; differentiate the requirements for cleaning and sterilization validation when using manual, semi-automatic, and automatic cleaning technologies.
Beginner: Reviews protocols.
Intermediate: Identifies and characterizes potential residues, including product, processing aids, cleaning agents, and adventitious agents; understands sterilization principles and requirements.
Advanced: Creates, reviews, and approves scientifically sound rationales, validation protocols, and reports; manages and remediates the pitfalls inherent in cleaning after the production of biopharmaceutical and pharmaceutical products.
Expert: Defines the cleaning/sterilization validation strategy; implements a lifecycle approach to validation, ensuring continued process verification.

Quality Risk Management
Key aspects: Apply QRM principles according to ICH Q9.
Beginner: Understands basic risk assessment principles; can identify potential hazards and risks; familiar with risk matrices and scoring methods; participates in risk assessments.
Intermediate: Conducts thorough risk assessments using established methodologies; analyzes risks quantitatively and qualitatively; prioritizes risks based on likelihood and impact.
Advanced: Determines appropriate tools; establishes risk-based decision-making tools; leads complex risk assessments across multiple areas; develops new risk assessment methodologies.
Expert: Provides expert guidance on risk analysis techniques; serves as the organization’s authority on regulatory requirements and expectations related to quality risk management; builds a proactive risk culture across the organization, fostering risk awareness at all levels.

Process Validation
Key aspects: Demonstrating that the manufacturing process can consistently produce a product that meets predetermined specifications and quality attributes; understanding of GMP principles and regulatory requirements.
Beginner: Basic understanding of GMP principles and regulatory requirements.
Intermediate: Can independently write, approve, and execute validation protocols for routine processes; able to develop validation master plans and protocols; understands critical process parameters (CPPs) and critical quality attributes (CQAs).
Advanced: Expertise in designing and implementing complex validation strategies; able to troubleshoot and resolve validation issues; deep understanding of regulatory expectations and industry best practices.
Expert: Leads cross-functional validation teams for high-impact projects; develops innovative validation approaches for novel bioprocesses; serves as an organizational authority on validation matters and regulatory interactions.
Reflective learning is a powerful tool that organizations can leverage to build competency and drive continuous improvement. At its core, this approach involves actively analyzing and evaluating experiences and learning processes to enhance understanding and performance across all levels of the organization.
The process of reflective learning begins with individuals and teams taking the time to step back and critically examine their actions, decisions, and outcomes. This introspection allows them to identify what worked well, what didn’t, and why. By doing so, they can uncover valuable insights that might otherwise go unnoticed in the day-to-day rush of business activities.
One of the key benefits of reflective learning is its ability to transform tacit knowledge into explicit knowledge. Tacit knowledge is the unspoken, intuitive understanding that individuals develop through experience. By reflecting on and articulating these insights, organizations can capture and share this valuable wisdom, making it accessible to others and fostering a culture of collective learning.
To implement reflective learning effectively, organizations should create structured opportunities for reflection. This might include regular debriefing sessions after projects, dedicated time for personal reflection, or the use of learning journals. Additionally, leaders should model reflective practices and encourage open and honest discussions about both successes and failures.
It’s important to note that reflective learning is not just about looking back; it’s also about looking forward. The insights gained through reflection should be used to inform future actions and strategies. This forward-thinking approach helps organizations to be more adaptable and responsive to changing circumstances, ultimately leading to improved performance and innovation.
By embracing reflective learning as a core organizational practice, companies can create a dynamic environment where continuous learning and improvement become ingrained in the culture. This not only enhances individual and team performance but also contributes to the overall resilience and competitiveness of the organization in an ever-changing business landscape.
Implement Regular After-Action Reviews
After-action reviews (AARs), or lessons learned, are critical because they provide a structured way for teams to reflect on projects, initiatives, or events. To implement effective AARs:
Schedule them immediately after key milestones or project completions
Focus on what was planned, what actually happened, why there were differences, and what can be learned
Encourage open and honest discussion without blame
Document key insights and action items
Create a Supportive Environment for Reflection
Foster a culture that values and encourages reflection:
Provide dedicated time and space for individual and group reflection
Model reflective practices at the leadership level
Recognize and reward insights gained through reflection
By systematically implementing these practices, organizations can build a strong competency in reflective learning, leading to improved decision-making, innovation, and overall performance. Utilizing a model always helps.
Kolb’s Reflective Model
Kolb’s reflective model, also known as Kolb’s experiential learning cycle, is a widely used framework for understanding how people learn from experience. The model consists of four stages that form a continuous cycle of learning:
The Four Stages of Kolb’s Reflective Model
Concrete Experience: This is the stage where the learner actively experiences an activity or situation. It involves direct, hands-on involvement in a new experience or a reinterpretation of an existing experience.
Reflective Observation: In this stage, the learner reflects on and reviews the experience. They think about what happened, considering their feelings and the links to their existing knowledge and skills.
Abstract Conceptualization: Here, the learner forms new ideas or modifies existing abstract concepts based on their reflections. This stage involves analyzing the experience and drawing conclusions about what was learned.
Active Experimentation: In the final stage, the learner applies their new knowledge and tests it in new situations. This involves planning how to put the new learning into practice and experimenting with new approaches.
Create Opportunities for Concrete Experiences: Provide employees with hands-on learning experiences, such as job rotations, simulations, or real-world projects.
Encourage Reflection: Set up regular reflection sessions or debriefings after significant experiences. Encourage employees to keep learning journals or participate in group discussions to share their observations.
Facilitate Conceptualization: Provide resources and support for employees to analyze their experiences and form new concepts. This could involve training sessions, mentoring programs, or access to relevant literature and research.
Support Active Experimentation: Create a safe environment for employees to apply their new knowledge and skills. Encourage innovation and provide opportunities for employees to test new ideas in their work.
Integrate the Model into Learning Programs: Design training and development programs that incorporate all four stages of Kolb’s cycle, ensuring a comprehensive learning experience.
Personalize Learning: Recognize that individuals may have preferences for different stages of the cycle. Offer diverse learning opportunities to cater to various learning styles.
Measure and Iterate: Regularly assess the effectiveness of knowledge management initiatives based on Kolb’s model. Use feedback and results to continuously improve the learning process.
By incorporating Kolb’s reflective model into knowledge management practices, we can create a more holistic and effective approach to learning and development. This can lead to improved knowledge retention, better application of learning to real-world situations, and a more adaptable and skilled workforce.
Other models build on this foundation. One expands on Kolb’s work by recognizing the various responses people can have to potential learning situations. Another, Backward Design (Grant Wiggins and Jay McTighe), works in three steps: 1. identify desired results; 2. determine acceptable evidence; 3. plan learning experiences and instruction. It starts with the learning outcomes and focuses on designing effective learning experiences to reach them.
Applying the Experiential Learning Model to Validation Competencies
To apply Kolb’s experiential learning model to building an organization’s competency for validation, we can structure the process as follows:
Concrete Experience
Have employees participate in actual validation activities or simulations
Provide hands-on training sessions on validation techniques and tools
Assign validation tasks to teams in real projects
Reflective Observation
Conduct debriefing sessions after validation activities
Encourage employees to keep validation journals or logs
Facilitate group discussions to share experiences and observations
Review validation results and outcomes as a team
Abstract Conceptualization
Offer formal training on validation principles, methodologies, and best practices
Encourage employees to develop validation frameworks or models based on their experiences
Analyze validation case studies from other organizations or industries
Create validation guidelines and standard operating procedures
Active Experimentation
Implement new validation approaches in upcoming projects
Encourage employees to propose and test innovative validation methods
Set up pilot programs to trial new validation tools or techniques
Assign employees to different types of validation projects to broaden their skills
To make this process continuous and effective:
Create a validation competency framework with clear learning objectives and skill levels
Develop a mentoring program where experienced team members guide less experienced colleagues
Establish regular knowledge-sharing sessions focused on validation topics
Implement a system for capturing and disseminating lessons learned from validation activities
Use technology platforms to support collaborative learning and information sharing about validation
Regularly assess and update the organization’s validation processes based on learning outcomes
Encourage cross-functional teams to work on validation projects to broaden perspectives
Partner with external experts or organizations to bring in fresh insights and best practices
Recognize and reward employees who demonstrate growth in validation competencies
Integrate validation competency development into performance reviews and career progression paths
By systematically applying Kolb’s model, we can create a robust learning environment that continuously improves our validation capabilities. This approach ensures that employees not only gain theoretical knowledge but also practical experience, leading to a more competent and adaptable workforce.