Building a Part 11/Annex 11 Course

I have realized I need to build a 21 CFR Part 11 and Annex 11 course. I’ve evaluated some external offerings, and they lack the applicability layer, which is exactly what I am going to focus on.

Here are my draft learning objectives.

21 CFR Part 11 Learning Objectives

  1. Regulatory Focus: Understand the current regulatory focus on data integrity and relevant regulatory observations.
  2. FDA Requirements: Learn the detailed requirements within Part 11 for electronic records, electronic signatures, and open systems.
  3. Implementation: Understand how to implement the principles of 21 CFR Part 11 in both computer hardware and software systems used in manufacturing, QA, regulatory, and process control.
  4. Compliance: Learn to meet the 21 CFR Part 11 requirements, including the USFDA interpretation in the Scope and Application Guidance.
  5. Risk Management: Apply the current industry risk-based good practice approach to compliant electronic records and signatures.
  6. Practical Examples: Review practical examples covering the implementation of FDA requirements.
  7. Data Integrity: Understand the need for data integrity throughout the system and data life cycles and how to maintain it.
  8. Cloud Computing and Mobile Applications: Learn approaches to cloud computing and mobile applications in the GxP environment.

EU Annex 11 Learning Objectives

  1. General Guidance: Understand the general guidance on managing risks, personnel responsibilities, and working with third-party suppliers and service providers.
  2. Validation: Learn best practices for validation and what should be included in validation documentation.
  3. Operational Phase: Gain knowledge of data management, security, and risk minimization for computerized systems during the operational phase.
  4. Electronic Signatures: Understand the requirements for electronic signatures and how they should be permanently linked to the respective record, including time and date.
  5. Audit Trails: Learn about the implementation and review of audit trails to ensure data integrity.
  6. Security Access: Understand the requirements for security access to protect electronic records and electronic signatures.
  7. Data Governance: Evaluate the requirements for a robust data governance system.
  8. Compliance with EU Regulations: Learn how to align with Annex 11 to ensure compliance with related EU regulations.

Course Outline: 21 CFR Part 11 and EU Annex 11 for IT Professionals

Module 1: Introduction and Regulatory Overview

  • History and background of 21 CFR Part 11 and EU Annex 11
  • Purpose and scope of the regulations
  • Applicability to electronic records and electronic signatures
  • Regulatory bodies and enforcement

Module 2: 21 CFR Part 11 Requirements

  • Subpart A: General Provisions
    • Definitions of key terms
    • Implementation and scope
  • Subpart B: Electronic Records
    • Controls for closed and open systems
    • Audit trails
    • Operational and device checks
    • Authority checks
    • Record retention and availability
  • Subpart C: Electronic Signatures
    • General requirements
    • Electronic signature components and controls
    • Identification codes and passwords

Module 3: EU Annex 11 Requirements

  • General requirements
    • Risk management
    • Personnel roles and responsibilities
    • Suppliers and service providers
  • Project phase
    • User requirements and specifications
    • System design and development
    • System validation
    • Testing and release management
  • Operational phase
    • Data governance and integrity
    • Audit trails and change control
    • Periodic evaluations
    • Security measures
    • Electronic signatures
    • Business continuity planning

Module 4: PIC/S Data Integrity Requirements

  • Data Governance System
    • Structure and control of the Quality Management System (QMS)
    • Policies related to organizational values, quality, staff conduct, and ethics
  • Organizational Influences
    • Roles and responsibilities for data integrity
    • Training and awareness programs
  • General Data Integrity Principles
    • ALCOA+ principles (Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring, and Available)
    • Data lifecycle management
  • Specific Considerations for Computerized Systems
    • Qualification and validation of computerized systems
    • System security and access controls
    • Audit trails and data review
    • Management of hybrid systems
  • Outsourced Activities
    • Data integrity considerations for third-party suppliers
    • Contractual agreements and oversight
  • Regulatory Actions and Remediation
    • Responding to data integrity issues
    • Remediation strategies and corrective actions
  • Periodic System Evaluation
    • Regular reviews and re-validation
    • Risk-based approach to system updates and maintenance

Module 5: Compliance Strategies and Best Practices

  • Interpreting regulatory guidance documents
  • Conducting risk assessments
  • Our validation approach
  • Leveraging suppliers and third-party service providers
  • Implementing audit trails and electronic signatures
  • Data integrity and security controls
  • Change and configuration management
  • Training and documentation requirements

Module 6: Case Studies and Industry Examples

  • Review of FDA warning letters and 483 observations
  • Lessons learned from industry compliance initiatives
  • Practical examples of system validation and audits

Module 7: Future Trends and Developments

  • Regulatory updates and revisions
  • Impact of new technologies (AI, cloud, etc.)
  • Harmonization efforts between global regulations
  • Continuous compliance monitoring

The course will include interactive elements such as hands-on exercises, quizzes, and group discussions to reinforce the learning objectives, and it will provide practical insights for IT professionals by focusing on real-world examples from our company.

Spreadsheets in a GxP Environment

I have them, you have them, and chances are they are used in more ways than you know. The spreadsheet is a powerful and truly ubiquitous tool, and it is used in many ways in the GxP environment, which means spreadsheets need to meet their intended use and be appropriately controlled. They must perform accurately and consistently, maintain data integrity, and comply with regulatory standards such as health agency guidelines and the GxPs.

That said, it can also be really easy to over-control spreadsheets. It is important to recognize that there is no one-size-fits-all approach.

It is important to build a risk-based approach from a clear definition of the scope and purpose of an individual spreadsheet. This includes identifying the intended use, the type of data a spreadsheet will handle, and the specific calculations or data manipulations it will perform.

I recommend an approach that breaks the spreadsheet down into three major categories. This should also apply to similar tools, such as Jira, Smartsheet, or what-have-you.

Requirements by spreadsheet complexity

1. Spreadsheet functionality: Used like typewriters or simple calculators; intended to produce an approved document. Signatories should make any calculations or formulas visible or explicitly describe them, and verify that they are correct. The paper printout or electronic version, managed through an electronic document management system, is the GxP record.
   Level of verification: Control with appropriate procedural governance. The final output may be retained as a record or have an appropriate checked-by step in another document.

2. Spreadsheet functionality: A low level of complexity (few or no conditional statements, a smaller number of cells) and no Visual Basic for Applications (VBA) programs, macros, automation, or other forms of code.
   Level of verification: Control through the document lifecycle. Each use is a record.

3. Spreadsheet functionality: A high level of complexity (many conditional statements, external calls or writes to an external database, links to other spreadsheets, a larger number of cells), use of VBA, macros, or automation, and multiple users and departments.
   Level of verification: Treat under a GAMP 5 approach for configuration or even customization (Category 4 or 5).
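To make the categorization concrete, here is a minimal Python sketch of how the three categories could be encoded in an intake checklist. All attribute names and the conditional-count threshold are my own illustrative assumptions, not from any guidance.

```python
from dataclasses import dataclass

@dataclass
class SpreadsheetProfile:
    """Intake attributes for a spreadsheet; names and thresholds are illustrative."""
    produces_approved_document_only: bool  # used like a typewriter or simple calculator
    uses_code: bool                        # VBA, macros, or other automation
    conditional_count: int                 # rough count of IF/lookup logic
    external_links: bool                   # writes to databases or links to other sheets
    multi_department: bool                 # multiple users and departments

def verification_level(p: SpreadsheetProfile) -> str:
    """Map a profile to one of the three control approaches in the table above."""
    if p.produces_approved_document_only and not p.uses_code:
        return "Procedural governance; final output (or a checked-by step) is the record"
    if p.uses_code or p.external_links or p.multi_department or p.conditional_count > 10:
        return "GAMP 5 configured/custom approach (Category 4 or 5)"
    return "Control through the document lifecycle; each use is a record"

# Example: a cross-department tracker with macros lands in the highest category.
print(verification_level(SpreadsheetProfile(False, True, 25, True, True)))
```

In practice the value of such a sketch is the forced conversation at intake: if you cannot answer these questions about a spreadsheet, you do not yet know its category.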

For spreadsheets, the GxP risk classification and GxP functional risk assessment should be performed to include both the spreadsheet functionality and the associated infrastructure components, as applicable (e.g., network drive/storage location).

For qualification, there should be a succinct template to drive activities. This template should address the following parts.

1. Scope and Purpose

The validation process begins with a clear definition of the spreadsheet’s scope and purpose. This includes identifying its intended use, the type of data it will handle, and the specific calculations or data manipulations it will perform.

2. User Requirements and Functional Specifications

Develop detailed user requirements and functional specifications by outlining what the spreadsheet must do, ensuring that it meets all user needs and regulatory requirements. This step specifies the data inputs, outputs, formulas, and any macros or other automation the spreadsheet will utilize.

3. Design Qualification

Ensure that the spreadsheet design aligns with the user requirements and functional specifications. This includes setting up the spreadsheet layout, formulas, and any macros or scripts. The design should prevent common errors such as incorrect data entry and formula misapplication.

4. Risk Assessment

Conduct a risk assessment to identify and evaluate potential risks associated with the spreadsheet. This includes assessing the impact of spreadsheet errors on the final results and determining the likelihood of such errors occurring. Mitigation strategies should be developed for identified risks.
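A lightweight way to record this is a severity-by-likelihood rating per identified failure mode. A minimal sketch, assuming 1-to-3 scales, made-up class bands, and example failure modes:

```python
# Minimal spreadsheet risk assessment: rate each failure mode on severity and
# likelihood (1 = low, 3 = high) and derive a risk class. Scales, class bands,
# and failure modes are illustrative assumptions only.

RISK_CLASS = {1: "low", 2: "low", 3: "medium", 4: "medium", 6: "high", 9: "high"}

failure_modes = [
    # (description, severity, likelihood)
    ("Formula range misses appended rows", 3, 2),
    ("Manual entry outside valid range", 2, 2),
    ("Broken link to source spreadsheet", 3, 1),
]

for description, severity, likelihood in failure_modes:
    risk = RISK_CLASS[severity * likelihood]
    print(f"{description}: severity={severity}, likelihood={likelihood} -> {risk} risk")
```

High-risk items should map directly to mitigations in the template, e.g., locked formula ranges or data validation rules, which leads into the next step.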

5. Data Integrity and Security

Implement measures to ensure data integrity and security. This includes setting up access controls, using data validation features to limit data entry errors, and ensuring that data storage and handling comply with regulatory requirements.
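If the spreadsheet is Excel-based, some of these controls can be applied programmatically. The following Python sketch uses the openpyxl library to add a data-entry validation rule and lock the rest of a worksheet; the file name, sheet name, cell range, and limits are hypothetical.

```python
from openpyxl import load_workbook
from openpyxl.styles import Protection
from openpyxl.worksheet.datavalidation import DataValidation

# Hypothetical workbook and range; adjust to the actual spreadsheet under control.
wb = load_workbook("assay_calcs.xlsx")
ws = wb["Results"]

# Restrict manual entry in B2:B50 to numeric values in an assumed valid range,
# so out-of-range results are rejected at the point of entry.
dv = DataValidation(type="decimal", operator="between", formula1="0", formula2="100",
                    showErrorMessage=True, error="Value must be between 0 and 100.")
ws.add_data_validation(dv)
dv.add("B2:B50")

# Unlock only the entry cells, then protect the sheet so formulas cannot be edited.
for row in ws["B2:B50"]:
    for cell in row:
        cell.protection = Protection(locked=False)
ws.protection.sheet = True

wb.save("assay_calcs_controlled.xlsx")
```

Cell-level protection like this is one control, not a substitute for access controls on the storage location itself.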

6. Testing (IQ, OQ, PQ)

• IQ tests the proper installation and configuration of the spreadsheet.
• OQ ensures the spreadsheet operates as designed under specified conditions.
• PQ verifies that the spreadsheet consistently produces correct outputs under real-world conditions.

Remember, all one template; don’t get into multiple documents that each regurgitate all the same stuff.
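To give a flavor of what a scripted OQ-style check can look like for an Excel spreadsheet, here is a minimal Python sketch that re-opens a workbook’s cached results and compares a calculated cell against an independent recomputation. The file, cells, and checked formula are hypothetical.

```python
from openpyxl import load_workbook

# data_only=True returns the values Excel cached at last save, so the workbook
# must be recalculated and saved before this check is run.
wb = load_workbook("assay_calcs.xlsx", data_only=True)
ws = wb["Results"]

# Hypothetical check: C2 should hold the mean of B2:B4 as computed by the sheet.
inputs = [ws[f"B{r}"].value for r in range(2, 5)]
expected = sum(inputs) / len(inputs)  # independent recomputation
actual = ws["C2"].value

# The tolerance should be predefined in the protocol to suit the sheet's rounding.
assert abs(actual - expected) < 1e-9, f"OQ check failed: {actual} != {expected}"
print("OQ check passed:", actual)
```

Checks like this live inside the single template as executed test steps, not as yet another standalone document.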

Lifecycle Approach

Spreadsheets should have appropriate procedural guidance and training, and they should be under risk-based periodic review.

4th GxP Cloud Compliance Summit – September 5-7

I am looking forward to speaking at the GxP Cloud Compliance Summit in Boston in September on “Implementing a Lifecycle Risk Management Approach to the Cloud.” I’ll be discussing some of my favorite topics:

• Best practices to harness a life cycle risk management approach to protect product quality and patient data
• What a living risk assessment looks like when key parts of your IT infrastructure are maintained by cloud service providers
• How ICH Q9(R1) impacts functional and usage assessments around cloud applications

I am looking forward to meeting and discussing some of the critical questions in our heady embrace of the cloud.

Cloud-based GxP systems have shifted in the last few years from “Something I guess we should figure out” to “Well, guess we have it now” to “Well, that is all I seem to have now.” And where 5 years ago we seemed obsessed with the fine details of open vs. closed systems and what cloud-based applications are, we are now looking at much more mature questions around a risk-based strategy that evaluates and ensures appropriate controls around data integrity, privacy, and security. Through a risk-based approach, we drive activities such as auditing, change control, qualification/validation, and oversight.

I am looking forward to having this discussion with my peers and sharing best practices and experiences. It is only through this type of event that we can grow as professionals.

I hope to see you there.

Computer Software Assurance Draft

On 13-Sep-2022 the FDA published the long-awaited draft guidance “Computer Software Assurance for Production and Quality System Software,” and you may, based on all the emails and postings, be wondering just how radical a change this is.

It’s not. This guidance is just one big “calm down, people” letter from the agency. They publish this sort of guidance every now and then because we as an industry can sometimes learn the wrong lessons.

This guidance boils down to three steps:

1. Determine intended use
2. Perform a risk assessment
3. Perform activities to the required level

I wrote about this approach in “Risk Based Data Integrity Assessment,” and it has existed in GAMP 5 and other approaches for years.

So read the guidance, but don’t panic. You are either following it already, or you just need to spend some time getting better at risk assessments and creating some matrix approaches.
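A matrix approach here can be as simple as a lookup from intended use and process risk to a level of assurance activity. The Python sketch below is a minimal illustration; the categories paraphrase the draft’s general framing, and the exact mapping is my illustrative assumption, not the FDA’s table.

```python
# Toy CSA-style assurance matrix: map intended use and process risk to testing
# rigor. The mapping below is an illustrative assumption, not from the guidance.

def assurance_activity(direct_gxp_use: bool, high_process_risk: bool) -> str:
    if not direct_gxp_use:
        return "Not production/quality system software; manage under general IT controls"
    if high_process_risk:
        return "Scripted testing (robust), with documented objective evidence"
    return "Unscripted testing (ad hoc / exploratory), with a summary record"

# Example: a directly used system with low process risk gets lighter-weight assurance.
print(assurance_activity(direct_gxp_use=True, high_process_risk=False))
```

The point of the matrix is proportionality: the record you keep should match the risk, which is exactly what GAMP 5 has said for years.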

Review of Audit Trails

One of the data integrity requirements that has changed in detail as the various guidances (FDA, MHRA, PIC/S) have gone through drafts is the review of audit trails. This will also probably be one of the more controversial requirements in certain corners, as it can be seen by some as going beyond what has traditionally been the focus of good documentation practices and computer system validation.

What the guidances say

Audit trail review is similar to assessing cross-outs on paper when reviewing data. Personnel responsible for record review under CGMP should review the audit trails that capture changes to data associated with the record as they review the rest of the record (e.g., §§ 211.22(a), 211.101(c) and (d), 211.103, 211.182, 211.186(a), 211.192, 211.194(a)(8), and 212.20(d)). For example, all production and control records, which includes audit trails, must be reviewed and approved by the quality unit (§ 211.192). The regulations provide flexibility to have some activities reviewed by a person directly supervising or checking information (e.g., § 211.188). FDA recommends a quality system approach to implementing oversight and review of CGMP records.

US FDA, “Who should review audit trails?”, Data Integrity and Compliance With Drug CGMP: Questions and Answers Guidance for Industry, section 7, page 8

If the review frequency for the data is specified in CGMP regulations, adhere to that frequency for the audit trail review. For example, § 211.188(b) requires review after each significant step in manufacture, processing, packing, or holding, and § 211.22 requires data review before batch release. In these cases, you would apply the same review frequency for the audit trail.

If the review frequency for the data is not specified in CGMP regulations, you should determine the review frequency for the audit trail using knowledge of your processes and risk assessment tools. The risk assessment should include evaluation of data criticality, control mechanisms, and impact on product quality. Your approach to audit trail review and the frequency with which you conduct it should ensure that CGMP requirements are met, appropriate controls are implemented, and the reliability of the review is proven.

US FDA, “How often should audit trails be reviewed?”, Data Integrity and Compliance With Drug CGMP: Questions and Answers Guidance for Industry, section 8, page 8
Expectation 1

• Consideration should be given to data management and integrity requirements when purchasing and implementing computerised systems. Companies should select software that includes appropriate electronic audit trail functionality.
• Companies should endeavour to purchase and upgrade older systems to implement software that includes electronic audit trail functionality.
• It is acknowledged that some very simple systems lack appropriate audit trails; however, alternative arrangements to verify the veracity of data must be implemented, e.g. administrative procedures, secondary checks and controls. Additional guidance may be found under section 9.9 regarding hybrid systems.
• Audit trail functionality should be verified during validation of the system to ensure that all changes and deletions of critical data associated with each manual activity are recorded and meet ALCOA+ principles.
• Audit trail functionalities must be enabled and locked at all times and it must not be possible to deactivate the functionality. If it is possible for administrative users to deactivate the audit trail functionality, an automatic entry should be made in the audit trail indicating that the functionality has been deactivated.
• Companies should implement procedures that outline their policy and processes for the review of audit trails in accordance with risk management principles. Critical audit trails related to each operation should be independently reviewed with all other records related to the operation and prior to the review of the completion of the operation, e.g. prior to batch release, so as to ensure that critical data and changes to it are acceptable. This review should be performed by the originating department, and where necessary verified by the quality unit, e.g. during self-inspection or investigative activities.

Potential risk of not meeting expectations / items to be checked

• Validation documentation should demonstrate that audit trails are functional, and that all activities, changes and other transactions within the systems are recorded, together with all metadata.
• Verify that audit trails are regularly reviewed (in accordance with quality risk management principles) and that discrepancies are investigated.
• If no electronic audit trail system exists, a paper-based record to demonstrate changes to data may be acceptable until a fully audit-trailed system (an integrated system or independent audit software using a validated interface) becomes available. These hybrid systems are permitted where they achieve equivalence to an integrated audit trail, such as described in Annex 11 of the PIC/S GMP Guide. Failure to adequately review audit trails may allow manipulated or erroneous data to be inadvertently accepted by the Quality Unit and/or Authorised Person.
• Clear details of which data are critical, and which changes and deletions must be recorded (audit trail), should be documented.

Expectation 2

• Where available, audit trail functionalities for electronic-based systems should be assessed and configured properly to capture any critical activities relating to the acquisition, deletion, overwriting of and changes to data for audit purposes.
• Audit trails should be configured to record all manually initiated processes related to critical data.
• The system should provide a secure, computer-generated, time-stamped audit trail to independently record the date and time of entries and actions that create, modify, or delete electronic records.
• The audit trail should include the following parameters: who made the change; what was changed, incl. old and new values; when the change was made, incl. date and time; why the change was made (reason); and the name of any person authorising the change.
• The audit trail should allow for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. The system must be able to print and provide an electronic copy of the audit trail, and whether viewed in the system or in a copy, the audit trail should be available in a meaningful format.
• If possible, the audit trail should retain the dynamic functionalities found in the computer system, e.g. search functionality and export to, e.g., Excel.

Potential risk of not meeting expectations / items to be checked

• Verify the format of audit trails to ensure that all critical and relevant information is captured.
• The audit trail must include all previous values, and record changes must not obscure previously recorded information.
• Audit trail entries should be recorded in true time and reflect the actual time of activities. Systems recording the same time for a number of sequential interactions, or which only make an entry in the audit trail once all interactions have been completed, may not be in compliance with expectations for data integrity, particularly where each discrete interaction or sequence is critical, e.g. the electronic recording of the addition of 4 raw materials to a mixing vessel. If the order of addition is a CPP, then each addition should be recorded individually, with time stamps. If the order of addition is not a CPP, then the addition of all 4 materials could be recorded as a single time-stamped activity.

PIC/S, PI 041-1, “Good Practices for Data Management and Data Integrity in regulated GMP/GDP Environments” (3rd draft), section 9.4 “Audit trail for computerised systems”, page 36

Thoughts

It has long been the requirement that computer systems have audit trails and that these be convertible to a format that can be reviewed as appropriate. What these guidances are stating is:

• There are key activities captured in the audit trail. These key activities are determined in a risk-based manner.
• These key activities need to be reviewed when making decisions based on them (determine a frequency).
• The audit trail needs to be able to show the reviewer the key activity.
• These reviews need to be captured in the quality system (proceduralized, recorded).
• This is part of the validated state of your system.

So, for example, my deviation system is evaluated, and the key activity that needs to be reviewed is the decision to forward process. Quality makes this determination at several points of the workflow. The audit trail review would thus look at who made the decision, when, and whether that met the criteria. The frequency might be established as the point of disposition for any deviation still in an open state, and upon closure.

What we are being asked to do is evaluate all our computer systems and figure out which parts of the audit trail need to be reviewed, and when.

Now here’s the problem. Most audit trails are garbage. Maybe they are human readable by some vague definition of readable (or even human). But they don’t have filters, or search, or templates. So companies need to be evaluating their audit trails (again, based on a risk-based approach) system by system to see if they are up to the task. You then end up with one or more solutions:

• Rebuild the audit trail to make it human readable and give it filters and search criteria. For example, on a deviation record there is one view for “disposition” and another for “closure.”
• Add reports (such as a set of Crystal Reports) to make it human readable and give it filters and search criteria. You probably end up with a report for “disposition” and another report for “closure.”
• Utilize an export function to Excel (or a similar program) and use its functions to filter and search (see the sketch after this list). Remember to ensure you have a data verification process in place.
• The best solution is to ensure the audit trail review is a step in your workflow and the review is captured as part of the audit trail. Ideally this is part of an exception reporting process driven by the system.
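For the export route, the filtering can also live outside Excel entirely. Here is a minimal pandas sketch, assuming a hypothetical CSV export with user, timestamp, field, old/new value, and reason columns, that pulls out just the disposition decisions for review:

```python
import pandas as pd

# Hypothetical audit trail export; the column names are illustrative assumptions.
trail = pd.read_csv("deviation_audit_trail.csv", parse_dates=["timestamp"])

# Focus the review on the key activity: disposition decisions on the record.
dispositions = trail[trail["field"] == "disposition_status"]

# A reviewer-friendly view: who changed what, when, and why.
review_view = dispositions[["timestamp", "user", "old_value", "new_value", "reason"]]
print(review_view.sort_values("timestamp").to_string(index=False))
```

Any such script is itself performing a GxP function, so it needs the same data verification (and, depending on risk, validation) noted above.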

What risk-based questions should drive this?

• Overall risk of the system
• Capabilities of the audit trail
• Can the data be modified after entry? Can it be modified prior to approval?
• Is the result qualitative or quantitative?
• Are changes to data visible on the record itself?
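These questions can feed a simple scoring model that sets the review frequency per system. A minimal sketch, assuming 1-to-3 scores and made-up frequency bands:

```python
# Toy scoring of the risk questions above into a review frequency.
# Weights and bands are illustrative assumptions, not from any guidance.

def review_frequency(system_risk: int, trail_capability: int,
                     modifiable_after_entry: bool,
                     changes_visible_on_record: bool) -> str:
    """system_risk and trail_capability are scored 1 (low) to 3 (high)."""
    score = system_risk + (4 - trail_capability)  # weak audit trails raise the score
    if modifiable_after_entry:
        score += 2
    if not changes_visible_on_record:
        score += 1
    if score >= 7:
        return "Review at each record decision point (e.g., disposition, closure)"
    if score >= 4:
        return "Periodic review (e.g., monthly) plus at record closure"
    return "Review during periodic system evaluation"

# Example: a high-risk system with a weak audit trail and invisible changes.
print(review_frequency(3, 1, modifiable_after_entry=True, changes_visible_on_record=False))
```

However it is scored, the output belongs in the system’s risk assessment, and the review itself belongs in a procedure, so it becomes part of the validated state.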