Draft Annex 11, Section 13: What the Proposed Electronic Signature Rules Mean

Ready or not, the EU’s draft revision of Annex 11 is moving toward finalization, and its brand-new Section 13 on electronic signatures is a wake-up call for anyone still treating digital authentication as just Part 11 with an accent. In this post I will take a deep dive into what’s changing, why it matters, and how to keep your quality system out of the regulatory splash zone.

Section 13 turns electronic signatures from a check-the-box formality into a risk-based, security-anchored discipline. Think multi-factor authentication, time-zone stamps, hybrid wet-ink safeguards, and explicit “non-repudiation” language—all enforced at the same rigor as system login. If your current SOPs still assume username + password = done, it’s time to start planning some improvements.

Why the Rewrite?

  1. Tech has moved on: Biometric ID, cloud PaaS, and federated identity management were sci-fi when the 2011 Annex 11 dropped.
  2. Threat landscape: Ransomware and credential stuffing didn’t exist at today’s scale. Regulators finally noticed.
  3. Global convergence: The FDA’s Computer Software Assurance (CSA) draft and PIC/S data-integrity guides pushed the EU to level up.

For the bigger regulatory context, see my post on EMA GMP Plans for Regulation Updates.

What’s Actually New in Section 13?

| Topic | 2011 Annex 11 | Draft Annex 11 (2025) | 21 CFR Part 11 | Why You Should Care |
|---|---|---|---|---|
| Authentication at Signature | Silent | Must equal or exceed login strength; first sign = full re-auth, subsequent signs = pwd/biometric; smart-card-only = banned | Two identification components | Forces MFA or biometrics; goodbye “remember me” shortcuts |
| Time & Time-Zone | Date + time (manual OK) | Auto-captured and time-zone logged | Date + time (no TZ) | Multisite ops finally get defensible chronology |
| Signature Meaning Prompt | Not required | System must ask user for purpose (approve, review…) | Required but less prescriptive | Eliminates “mystery clicks” that auditors love to exploit |
| Manifestation Elements | Minimal | Full name, username, role, meaning, date/time/TZ | Name, date, meaning | Closes attribution gaps; boosts ALCOA+ “Legible” |
| Indisputability Clause | “Same impact” | Explicit non-repudiation mandate | Equivalent legal weight | Sets the stage for eIDAS/federated ID harmonization |
| Record Linking After Change | Permanent link | If record altered post-sign, signature becomes void/flagged | Link cannot be excised | Ends stealth edits after approval |
| Hybrid Wet-Ink Control | Silent | Hash code or similar to break link if record changes | Silent | Lets you keep occasional paper without tanking data integrity |
| Open Systems / Trusted Services | Silent | Must comply with “national/international trusted services” (read: eIDAS) | Extra controls, but legacy wording | Validates cloud signing platforms out of the box |

The Implications

Multi-Factor Authentication (MFA) Is Now Table Stakes

Because the draft explicitly bars any authentication method that relies solely on a smart card or a static PIN, every electronic signature now has to be confirmed with an additional, independent factor—such as a password, biometric scan, or time-limited one-time code—so that the credential used to apply the signature is demonstrably different from the one that granted the user access to the system in the first place.
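To make that concrete, here is a minimal Python sketch of a signature gate that demands a second, independent factor (a standard RFC 6238 time-based one-time code) on top of a password re-check. The credential fields and demo values are placeholders, not anything your eDMS or MES actually exposes; treat this as an illustration of the control, not a reference implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp_now(base32_secret: str, digits: int = 6, step: int = 30) -> str:
    """Current RFC 6238 time-based one-time password for a shared secret."""
    key = base64.b32decode(base32_secret, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // step)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def may_sign(stored_pw_hash: str, password: str, totp_secret: str, submitted_otp: str) -> bool:
    """Signature gate: require a password re-check AND an independent one-time code,
    so the credential used to sign is demonstrably different from the login session."""
    # Real systems should use a dedicated password-hashing scheme, not bare SHA-256.
    pw_ok = hmac.compare_digest(stored_pw_hash, hashlib.sha256(password.encode()).hexdigest())
    otp_ok = hmac.compare_digest(submitted_otp, totp_now(totp_secret))
    return pw_ok and otp_ok

# Demo with throwaway credentials (never hard-code secrets in a real system).
secret = base64.b32encode(b"demo-shared-secret").decode()
pw_hash = hashlib.sha256(b"correct-horse-battery").hexdigest()
print(may_sign(pw_hash, "correct-horse-battery", secret, totp_now(secret)))  # True
print(may_sign(pw_hash, "correct-horse-battery", secret, "000000"))          # almost certainly False
```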

Time-Zone Logging Kills Spreadsheet Workarounds

One of the more subtle but critical updates in Draft Annex 11’s Section 13.4 is the explicit requirement for automatic logging of the time zone when electronic signatures are applied. Unlike previous guidance—whether under the 2011 Annex 11 or 21 CFR Part 11—that only mandated the capture of date and time (often allowing manual entry or local system time), the draft stipulates that systems must automatically capture the precise time and associated time zone for each signature event. This seemingly small detail has monumental implications for data integrity, traceability, and regulatory compliance.

Why does this matter? For global pharmaceutical operations spanning multiple time zones, manual or local-only timestamps often create ambiguous or conflicting audit trails, leading to discrepancies in event sequencing. Companies relying on spreadsheets or legacy systems that do not incorporate time zone information effectively invite errors where a signature in one location appears to precede an earlier event simply due to zone differences. This ambiguity can undermine the “Contemporaneous” and “Enduring” principles of ALCOA+, principles the draft Annex 11 explicitly reinforces throughout its electronic signature requirements.

By mandating automated, time zone-aware timestamping, Draft Annex 11 Section 13.4 ensures that electronic signature records maintain a defensible and standardized chronology across geographies, eliminating the need for cumbersome manual reconciliation or retrospective spreadsheet corrections. This move not only tightens compliance but also supports modern, centralized data review and analytics where uniform timestamping is essential.

If your current systems or SOPs rely on manual date/time entry or overlook time zone logging, prepare for significant system and procedural updates to meet this enhanced expectation once the draft Annex 11 is finalized.
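For teams wondering what “automatic, time zone-aware” capture could look like in code, here is a minimal standard-library Python sketch. The field names and the example site zone are my own choices, not anything prescribed by the draft.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # standard library in Python 3.9+

def signature_timestamp(site_tz: str = "Europe/Berlin") -> dict:
    """Capture a signature timestamp automatically, with the time zone made explicit.
    Storing UTC plus the local zone keeps multisite chronology unambiguous."""
    now_utc = datetime.now(timezone.utc)
    now_local = now_utc.astimezone(ZoneInfo(site_tz))
    return {
        "utc": now_utc.isoformat(timespec="seconds"),      # e.g. 2025-07-01T12:30:05+00:00
        "local": now_local.isoformat(timespec="seconds"),  # e.g. 2025-07-01T14:30:05+02:00
        "time_zone": site_tz,
    }

print(signature_timestamp())
```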

Hybrid Records Are Finally Codified

If you still print a batch record for wet-ink QA approval, Section 13.9 lets you keep the ritual—but only if a cryptographic hash or similar breaks when someone tweaks the underlying PDF. Expect a flurry of DocuSign-scanner-hash utilities.
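A minimal sketch of that hash-based link, assuming the record lives as a PDF file. The file name is hypothetical, and a real implementation would store the reference hash with the signature metadata (and on the printed cover sheet), not in a script.

```python
import hashlib
from pathlib import Path

def file_hash(path: str) -> str:
    """SHA-256 fingerprint of the record as it existed at signature time."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

# At wet-ink approval: record the hash alongside the signature (hypothetical file name).
reference_hash = file_hash("batch_record_BR-1234.pdf")

# At any later review: recompute and compare. Any edit to the PDF breaks the link.
if file_hash("batch_record_BR-1234.pdf") != reference_hash:
    raise ValueError("Electronic record changed after wet-ink approval; signature link is broken")
```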

Open-System Signatures Shift Liability

Draft Annex 11’s Section 13.2 represents perhaps the most strategically significant change in electronic signature liability allocation since 21 CFR Part 11 was published in 1997. The provision states that “Where the system owner does not have full control of system accesses (open systems), or where required by other legislation, electronic signatures should, in addition, meet applicable national and international requirements, such as trusted services”. This seemingly simple sentence fundamentally reshapes liability relationships in modern pharmaceutical IT architectures.

Defining the Open System Boundary

The draft Annex 11 adopts the 21 CFR Part 11 definition of open systems—environments where system owners lack complete control over access—and extends it into contemporary cloud, SaaS, and federated identity scenarios. Unlike the original Part 11 approach, which merely required “additional measures such as document encryption and use of appropriate digital signature standards”, Section 13.2 creates a positive compliance obligation by mandating adherence to “trusted services” frameworks.

This distinction is critical: while Part 11 treats open systems as inherently risky environments requiring additional controls, draft Annex 11 legitimizes open systems provided they integrate with qualified trust service providers. Organizations no longer need to avoid cloud-based signature services; instead, they must ensure those services meet eIDAS-qualified standards or equivalent national frameworks.

The Trusted Services Liability Transfer

Section 13.2’s reference to “trusted services” directly incorporates European eIDAS Regulation 910/2014 into pharmaceutical GMP compliance, creating what amounts to a liability transfer mechanism. Under eIDAS, Qualified Trust Service Providers (QTSPs) undergo rigorous third-party audits, maintain certified infrastructure, and provide legal guarantees about signature validity and non-repudiation. When pharmaceutical companies use eIDAS-qualified signature services, they effectively transfer signature validity liability from their internal systems to certified external providers.

This represents a fundamental shift from the 21 CFR Part 11 closed-system preference, where organizations maintained complete control over signature infrastructure but also bore complete liability for signature failures. Draft Annex 11 acknowledges that modern pharmaceutical operations often depend on cloud service providers, federated authentication systems, and external trust services—and provides a regulatory pathway to leverage these technologies while managing liability exposure.
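To see why an altered record voids a signature, here is a small Python sketch using the third-party cryptography package. This is generic asymmetric signing, not an eIDAS-qualified service; the point is only to show the non-repudiation mechanics that a QTSP wraps in certified infrastructure and legal guarantees.

```python
# pip install cryptography  (generic asymmetric signing, not a qualified trust service)
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric import ed25519

record = b"Deviation DEV-2025-001: approved for closure"  # hypothetical record content

signer_key = ed25519.Ed25519PrivateKey.generate()  # in practice held only by the signer or the QTSP
signature = signer_key.sign(record)
public_key = signer_key.public_key()

# Verification succeeds only if the record is byte-for-byte unchanged since signing.
try:
    public_key.verify(signature, record)
    print("Signature valid: record unchanged since signing")
except InvalidSignature:
    print("Record altered after signing: signature is void")

# Tamper with the record and the same signature no longer verifies.
try:
    public_key.verify(signature, record + b" (edited)")
except InvalidSignature:
    print("Post-signature edit detected")
```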

Practical Implications for SaaS Platforms

The most immediate impact affects organizations using Software-as-a-Service platforms for clinical data management, quality management, or document management. Under current Annex 11 and Part 11, these systems often require complex validation exercises to demonstrate signature integrity, with pharmaceutical companies bearing full responsibility for signature validity even when using external platforms.

Section 13.2 changes this dynamic by validating reliance on qualified trust services. Organizations using platforms like DocuSign, Adobe Sign, or specialized pharmaceutical SaaS providers can now satisfy Annex 11 requirements by ensuring their chosen platforms integrate with eIDAS-qualified signature services. The pharmaceutical company’s validation responsibility shifts from proving signature technology integrity to verifying trust service provider qualifications and proper integration.

Integration with Identity and Access Management

Draft Annex 11’s Section 11 (Identity and Access Management) works in conjunction with Section 13.2 to support federated identity scenarios common in modern pharmaceutical operations. Organizations can now implement single sign-on (SSO) systems with external identity providers, provided the signature components integrate with trusted services. This enables scenarios where employees authenticate through corporate Active Directory systems but execute legally binding signatures through eIDAS-qualified providers.

The liability implications are significant: authentication failures become the responsibility of the identity provider (within contractual limits), while signature validity becomes the responsibility of the qualified trust service provider. The pharmaceutical company retains responsibility for proper system integration and user access controls, but shares technical implementation liability with certified external providers.

Cloud Service Provider Risk Allocation

For organizations using cloud-based LIMS, MES, or quality management systems, Section 13.2 provides regulatory authorization to implement signature services hosted entirely by external providers. Cloud service providers offering eIDAS-compliant signature services can contractually accept liability for signature technical implementation, cryptographic integrity, and legal validity—provided they maintain proper trust service qualifications.

This risk allocation addresses a long-standing concern in pharmaceutical cloud adoption: the challenge of validating signature infrastructure owned and operated by external parties. Under Section 13.2, organizations can rely on qualified trust service provider certifications rather than conducting detailed technical validation of cloud provider signature implementations.

Harmonization with Global Standards

Section 13.2’s “national and international requirements” language extends beyond eIDAS to encompass other qualified electronic signature frameworks, including Swiss ZertES standards and Canadian digital signature regulations. Organizations operating globally can implement unified signature platforms that satisfy multiple regulatory requirements through single trusted service provider integrations.

The practical effect is regulatory arbitrage: organizations can choose signature service providers based on the most favorable combination of technical capabilities, cost, and regulatory coverage, rather than being constrained by local regulatory limitations.

Supplier Assessment Transformation

Draft Annex 11’s Section 7 (Supplier and Service Management) requires comprehensive supplier assessment for computerized systems. However, Section 13.2 creates a qualified exception for eIDAS-certified trust service providers: organizations can rely on third-party certification rather than conducting independent technical assessments of signature infrastructure.

This significantly reduces supplier assessment burden for signature services. Instead of auditing cryptographic implementations, hardware security modules, and signature validation algorithms, organizations can verify trust service provider certifications and assess integration quality. The result: faster implementation cycles and reduced validation costs for signature-enabled systems.

Audit Trail Integration Considerations

The liability shift enabled by Section 13.2 affects audit trail management requirements detailed in draft Annex 11’s expanded Section 12 (Audit Trails). When signature events are managed by external trust service providers, organizations must ensure signature-related audit events are properly integrated with internal audit trail systems while maintaining clear accountability boundaries.

Qualified trust service providers typically provide comprehensive signature audit logs, but organizations remain responsible for correlation with business process audit trails. This creates shared audit trail management where signature technical events are managed externally but business context remains internal responsibility.
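One way to picture that shared responsibility is to export both logs and correlate them on a common record identifier. The event names and fields below are hypothetical; the draft does not prescribe a correlation format.

```python
from datetime import datetime
from itertools import chain

# Hypothetical exports: the trust service's signature log and the internal business audit trail.
provider_events = [
    {"record_id": "BR-1234", "event": "qualified_signature_applied",
     "actor": "j.doe", "utc": "2025-07-01T12:30:05+00:00"},
]
internal_events = [
    {"record_id": "BR-1234", "event": "batch_record_approved",
     "actor": "j.doe", "utc": "2025-07-01T12:30:07+00:00"},
]

def merged_trail(record_id: str) -> list[dict]:
    """Single chronological view of one record across both audit sources."""
    events = [e for e in chain(provider_events, internal_events) if e["record_id"] == record_id]
    return sorted(events, key=lambda e: datetime.fromisoformat(e["utc"]))

for e in merged_trail("BR-1234"):
    print(e["utc"], e["event"], e["actor"])
```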

Competitive Advantages of Early Adoption

Organizations that proactively implement Section 13.2 requirements gain several strategic advantages:

  • Reduced Infrastructure Costs: Elimination of internal signature infrastructure maintenance and validation overhead
  • Enhanced Security: Leverage specialized trust service provider security expertise and certified infrastructure
  • Global Scalability: Unified signature platforms supporting multiple regulatory jurisdictions through single provider relationships
  • Accelerated Digital Transformation: Faster deployment of signature-enabled processes through validated external services
  • Risk Transfer: Contractual liability allocation with qualified external providers rather than complete internal risk retention

Section 13.2 transforms open system electronic signatures from compliance challenges into strategic enablers of digital pharmaceutical operations. By legitimizing reliance on qualified trust services, the draft Annex 11 enables organizations to leverage best-in-class signature technologies while managing regulatory compliance and liability exposure through proven external partnerships. The result: more secure, cost-effective, and globally scalable electronic signature implementations that support advanced digital quality management systems.

How to Get Ahead (Instead of Playing Cleanup)

  1. Perform a gap assessment now—map every signature point to the new rules (a minimal sketch of such a mapping follows this list).
  2. Prototype MFA in your eDMS or MES. If users scream about friction, remind them that ransomware is worse.
  3. Update validation protocols to include time-zone, hybrid record, and non-repudiation tests.
  4. Rewrite SOPs to include signature-meaning prompts and periodic access-right recertification.
  5. Train users early. A 30-second “why you must re-authenticate” explainer video beats 300 deviations later.
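As promised in step 1, here is a minimal sketch of a signature-point inventory mapped against paraphrased Section 13 expectations. The checklist keys are my own shorthand, not official clause references, and the systems listed are illustrative.

```python
from dataclasses import dataclass

# Paraphrased draft Section 13 expectations used as checklist keys (not official clause text).
DRAFT_EXPECTATIONS = ["mfa_at_signature", "tz_logged", "meaning_prompt",
                      "full_manifestation", "void_on_change", "hybrid_hash"]

@dataclass
class SignaturePoint:
    system: str
    step: str
    controls: set[str]  # expectations the system already satisfies

    def gaps(self) -> set[str]:
        return set(DRAFT_EXPECTATIONS) - self.controls

inventory = [
    SignaturePoint("eDMS", "SOP approval", {"meaning_prompt", "full_manifestation"}),
    SignaturePoint("MES", "batch record review", {"tz_logged", "void_on_change"}),
]

for sp in inventory:
    print(f"{sp.system} / {sp.step}: remediate {sorted(sp.gaps())}")
```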

Final Thoughts

The draft Annex 11 doesn’t just tweak wording—it yanks electronic signatures into the 2020s. Treat Section 13 as both a compliance obligation and an opportunity to slash latent data-integrity risk. Those who adapt now will cruise through 2026/2027 inspections while the laggards scramble for remediation budgets.

Building a Part 11/Annex 11 Course

I have realized I need to build a Part 11 and Annex 11 course. I’ve evaluated some external offerings and decided they really lack that applicability layer, which I am going to focus on.

Here are my draft learning objectives.

21 CFR Part 11 Learning Objectives

  1. Understanding Regulatory Focus: Understand the current regulatory focus on data integrity and relevant regulatory observations.
  2. FDA Requirements: Learn the detailed requirements within Part 11 for electronic records, electronic signatures, and open systems.
  3. Implementation: Understand how to implement the principles of 21 CFR Part 11 in both computer hardware and software systems used in manufacturing, QA, regulatory, and process control.
  4. Compliance: Learn to meet the 21 CFR Part 11 requirements, including the USFDA interpretation in the Scope and Application Guidance.
  5. Risk Management: Apply the current industry risk-based good practice approach to compliant electronic records and signatures.
  6. Practical Examples: Review practical examples covering the implementation of FDA requirements.
  7. Data Integrity: Understand the need for data integrity throughout the system and data life cycles and how to maintain it.
  8. Cloud Computing and Mobile Applications: Learn approaches to cloud computing and mobile applications in the GxP environment.

EMA Annex 11 Learning Objectives

  1. General Guidance: Understand the general guidance on managing risks, personnel responsibilities, and working with third-party suppliers and service providers.
  2. Validation: Learn best practices for validation and what should be included in validation documentation.
  3. Operational Phase: During the operational phase, gain knowledge on data management, security, and risk minimization for computerized systems.
  4. Electronic Signatures: Understand the requirements for electronic signatures and how they should be permanently linked to the respective record, including time and date.
  5. Audit Trails: Learn about the implementation and review of audit trails to ensure data integrity.
  6. Security Access: Understand the requirements for security access to protect electronic records and electronic signatures.
  7. Data Governance: Evaluate the requirements for a robust data governance system.
  8. Compliance with EU Regulations: Learn how to align with Annex 11 to ensure compliance with related EU regulations.

Course Outline: 21 CFR Part 11 and EMA Annex 11 for IT Professionals

Module 1: Introduction and Regulatory Overview

  • History and background of 21 CFR Part 11 and EMA Annex 11
  • Purpose and scope of the regulations
  • Applicability to electronic records and electronic signatures
  • Regulatory bodies and enforcement

Module 2: 21 CFR Part 11 Requirements

  • Subpart A: General Provisions
    • Definitions of key terms
    • Implementation and scope
  • Subpart B: Electronic Records
    • Controls for closed and open systems
    • Audit trails
    • Operational and device checks
    • Authority checks
    • Record retention and availability
  • Subpart C: Electronic Signatures
    • General requirements
    • Electronic signature components and controls
    • Identification codes and passwords

Module 3: EMA Annex 11 Requirements

  • General requirements
    • Risk management
    • Personnel roles and responsibilities
    • Suppliers and service providers
  • Project phase
    • User requirements and specifications
    • System design and development
    • System validation
    • Testing and release management
  • Operational phase
    • Data governance and integrity
    • Audit trails and change control
    • Periodic evaluations
    • Security measures
    • Electronic signatures
    • Business continuity planning

Module 4: PIC/S Data Integrity Requirements

  • Data Governance System
    • Structure and control of the Quality Management System (QMS)
    • Policies related to organizational values, quality, staff conduct, and ethics
  • Organizational Influences
    • Roles and responsibilities for data integrity
    • Training and awareness programs
  • General Data Integrity Principles
    • ALCOA+ principles (Attributable, Legible, Contemporaneous, Original, Accurate, Complete, Consistent, Enduring, and Available)
    • Data lifecycle management
  • Specific Considerations for Computerized Systems
    • Qualification and validation of computerized systems
    • System security and access controls
    • Audit trails and data review
    • Management of hybrid systems
  • Outsourced Activities
    • Data integrity considerations for third-party suppliers
    • Contractual agreements and oversight
  • Regulatory Actions and Remediation
    • Responding to data integrity issues
    • Remediation strategies and corrective actions
  • Periodic System Evaluation
    • Regular reviews and re-validation
    • Risk-based approach to system updates and maintenance

Module 5: Compliance Strategies and Best Practices

  • Interpreting regulatory guidance documents
  • Conducting risk assessments
  • Our validation approach
  • Leveraging suppliers and third-party service providers
  • Implementing audit trails and electronic signatures
  • Data integrity and security controls
  • Change and configuration management
  • Training and documentation requirements

Module 6: Case Studies and Industry Examples

  • Review of FDA warning letters and 483 observations
  • Lessons learned from industry compliance initiatives
  • Practical examples of system validation and audits

Module 7: Future Trends and Developments

  • Regulatory updates and revisions
  • Impact of new technologies (AI, cloud, etc.)
  • Harmonization efforts between global regulations
  • Continuous compliance monitoring

The course will include interactive elements such as hands-on exercises, quizzes, and group discussions to reinforce the learning objectives, and it will focus on real-world examples from our company to provide practical insights for IT professionals.

The Audit Trail and Data Integrity

Each requirement below is paired with a description of what it demands of the audit trail.

Attributable (Traceable)

  • Each audit trail entry must be attributable to the individual responsible for the direct data input, so that all changes to or creation of data are associated with the person making those changes. When using a user’s unique ID, this must identify an individual person.
  • Each audit trail must be linked to the relevant record throughout the data life cycle.

Legible

  • The system should be able to print or provide an electronic copy of the audit trail.
  • The audit trail must be available in a meaningful format when viewed in the system or as a hardcopy.

Contemporaneous

  • Each audit trail entry must be date- and time-stamped according to a controlled clock which cannot be altered. The time should either be based on central server time or a local time, so long as it is clear in which time zone the entry was performed.

Original

  • The audit trail should retain the dynamic functionalities found in the computerized system, including search functionality to facilitate audit trail review activities.

Accurate

  • Audit trail functionality must be verified to ensure the data written to the audit trail equals the data entered or system generated.
  • Audit trail data must be stored in a secure manner and users cannot have the ability to amend, delete, or switch off the audit trail. Where a system administrator amends or switches off the audit trail, a record of that action must be retained.

Complete

  • The audit trail entries must be automatically captured by the computerized system whenever an electronic record is created, modified, or deleted.
  • Audit trails, at minimum, must record all end-user-initiated processes related to critical data. The following parameters must be included:
    • The identity of the person performing the action.
    • In the case of a change or deletion, the detail of the change or deletion, and a record of the original entry.
    • The reason for any GxP change or deletion.
    • The time and date when the action was performed.

Consistent

  • Audit trails are used to review, detect, report, and address data integrity issues.
  • Audit trail reviewers must have appropriate training, system knowledge and knowledge of the process to perform the audit trail review. The review of the relevant audit trails must be documented.
  • Audit trail discrepancies must be addressed, investigated, and escalated to JEB management and national authorities, as necessary.

Enduring

  • The audit trail must be retained for the same duration as the associated electronic record.

Available

  • The audit trail must be available for review at any time by inspectors and auditors during the required retention period.
  • The audit trail must be accessible in a human readable format.

21 CFR Part 11 Requirements

Definition: An audit trail is a secure, computer-generated, time-stamped electronic record that allows for the reconstruction of events related to the creation, modification, and deletion of an electronic record.

Requirements:

  • Availability: Audit trails must be easily accessible for review and copying by the FDA during inspections.
  • Automation: Entries must be automatically captured by the system without manual intervention.
  • Components: Each entry must include a timestamp, user ID, original and new values, and reasons for changes where applicable (a minimal sketch of such an entry follows this list).
  • Security: Audit trail data must be securely stored and not accessible for editing by users.
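Here is the minimal sketch referenced above: an audit trail entry carrying those components, modeled as a frozen dataclass so an instance cannot have its attributes changed after it is written. The field names are illustrative, not mandated by the rule text.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)  # frozen=True: instances reject attribute changes after creation
class AuditTrailEntry:
    user_id: str
    action: str                # e.g. created / modified / deleted
    record_id: str
    old_value: Optional[str]
    new_value: Optional[str]
    reason: Optional[str]      # required for GxP changes and deletions
    timestamp_utc: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat(timespec="seconds"))

entry = AuditTrailEntry(
    user_id="j.doe", action="modified", record_id="DEV-2025-001",
    old_value="Open", new_value="Closed", reason="Investigation complete")
print(entry)
```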

EMA Annex 11 (Eudralex Volume 4) Requirements

Definition: Audit trails are records of all GMP-relevant changes and deletions, created by the system to ensure traceability and accountability.

Requirements:

  • Risk-Based Approach: Building audit trails into the system for all GMP-relevant changes and deletions should be considered based on a risk assessment.
  • Documentation: The reasons for changes or deletions must be documented.
  • Review: Audit trails must be available, convertible into a generally readable form, and regularly reviewed.
  • Validation: The audit trail functionality must be validated to ensure it captures all necessary data accurately and securely.

Requirements from PIC/S GMP Data Integrity Guidance

Definition: Audit trails are metadata recorded about critical information such as changes or deletions of GMP/GDP relevant data to enable the reconstruction of activities.

Requirements:

  • Review: Critical audit trails related to each operation should be independently reviewed with all other records related to the operation, especially before batch release.
  • Documentation: Significant deviations found during the audit trail review must be fully investigated and documented.

Review of Audit Trails

One of the data integrity requirements that has changed in detail as the various guidances (FDA, MHRA, PIC/S) have gone through drafts is the review of audit trails. It will also probably be one of the more controversial requirements in certain corners, as some see it as going beyond what has traditionally been the focus of good documentation practices and computer system validation.

What the guidances say

Audit trail review is similar to assessing cross-outs on paper when reviewing data. Personnel responsible for record review under CGMP should review the audit trails that capture changes to data associated with the record as they review the rest of the record (e.g., §§ 211.22(a), 211.101(c) and (d), 211.103, 211.182, 211.186(a), 211.192, 211.194(a)(8), and 212.20(d)). For example, all production and control records, which includes audit trails, must be reviewed and approved by the quality unit (§ 211.192). The regulations provide flexibility to have some activities reviewed by a person directly supervising or checking information (e.g., § 211.188). FDA recommends a quality system approach to implementing oversight and review of CGMP records.

US FDA. “Who should review audit trails?” Data Integrity and Compliance With Drug CGMP Questions and Answers Guidance for Industry. Section 7, page 8.

If the review frequency for the data is specified in CGMP regulations, adhere to that frequency for the audit trail review. For example, § 211.188(b) requires review after each significant step in manufacture, processing, packing, or holding, and § 211.22 requires data review before batch release. In these cases, you would apply the same review frequency for the audit trail. If the review frequency for the data is not specified in CGMP regulations, you should determine the review frequency for the audit trail using knowledge of your processes and risk assessment tools. The risk assessment should include evaluation of data criticality, control mechanisms, and impact on product quality. Your approach to audit trail review and the frequency with which you conduct it should ensure that CGMP requirements are met, appropriate controls are implemented, and the reliability of the review is proven.

US FDA. “How often should audit trails be reviewed?” Data Integrity and Compliance With Drug CGMP Questions and Answers Guidance for Industry. Section 8, page 8.
Item 1

Expectations:

  • Consideration should be given to data management and integrity requirements when purchasing and implementing computerised systems. Companies should select software that includes appropriate electronic audit trail functionality.
  • Companies should endeavour to purchase and upgrade older systems to implement software that includes electronic audit trail functionality.
  • It is acknowledged that some very simple systems lack appropriate audit trails; however, alternative arrangements to verify the veracity of data must be implemented, e.g. administrative procedures, secondary checks and controls. Additional guidance may be found under section 9.9 regarding Hybrid Systems.
  • Audit trail functionality should be verified during validation of the system to ensure that all changes and deletions of critical data associated with each manual activity are recorded and meet ALCOA+ principles.
  • Audit trail functionalities must be enabled and locked at all times and it must not be possible to deactivate the functionality. If it is possible for administrative users to deactivate the audit trail functionality, an automatic entry should be made in the audit trail indicating that the functionality has been deactivated.
  • Companies should implement procedures that outline their policy and processes for the review of audit trails in accordance with risk management principles. Critical audit trails related to each operation should be independently reviewed with all other records related to the operation and prior to the review of the completion of the operation, e.g. prior to batch release, so as to ensure that critical data and changes to it are acceptable. This review should be performed by the originating department, and where necessary verified by the quality unit, e.g. during self-inspection or investigative activities.

Potential risk of not meeting expectations / items to be checked:

  • Validation documentation should demonstrate that audit trails are functional, and that all activities, changes and other transactions within the systems are recorded, together with all metadata.
  • Verify that audit trails are regularly reviewed (in accordance with quality risk management principles) and that discrepancies are investigated.
  • If no electronic audit trail system exists, a paper based record to demonstrate changes to data may be acceptable until a fully audit trailed (integrated system or independent audit software using a validated interface) system becomes available. These hybrid systems are permitted, where they achieve equivalence to integrated audit trail, such as described in Annex 11 of the PIC/S GMP Guide.
  • Failure to adequately review audit trails may allow manipulated or erroneous data to be inadvertently accepted by the Quality Unit and/or Authorised Person.
  • Clear details of which data are critical, and which changes and deletions must be recorded (audit trail), should be documented.

Item 2

Expectations:

  • Where available, audit trail functionalities for electronic-based systems should be assessed and configured properly to capture any critical activities relating to the acquisition, deletion, overwriting of and changes to data for audit purposes.
  • Audit trails should be configured to record all manually initiated processes related to critical data.
  • The system should provide a secure, computer generated, time stamped audit trail to independently record the date and time of entries and actions that create, modify, or delete electronic records.
  • The audit trail should include the following parameters:
    • Who made the change
    • What was changed, incl. old and new values
    • When the change was made, incl. date and time
    • Why the change was made (reason)
    • Name of any person authorising the change
  • The audit trail should allow for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. The system must be able to print and provide an electronic copy of the audit trail, and whether looked at in the system or in a copy, the audit trail should be available in a meaningful format.
  • If possible, the audit trail should retain the dynamic functionalities found in the computer system, e.g. search functionality and export to e.g. Excel.

Potential risk of not meeting expectations / items to be checked:

  • Verify the format of audit trails to ensure that all critical and relevant information is captured.
  • The audit trail must include all previous values, and record changes must not obscure previously recorded information.
  • Audit trail entries should be recorded in true time and reflect the actual time of activities. Systems recording the same time for a number of sequential interactions, or which only make an entry in the audit trail once all interactions have been completed, may not be in compliance with expectations to data integrity, particularly where each discrete interaction or sequence is critical, e.g. for the electronic recording of the addition of 4 raw materials to a mixing vessel. If the order of addition is a CPP, then each addition should be recorded individually, with time stamps. If the order of addition is not a CPP, then the addition of all 4 materials could be recorded as a single timestamped activity.

PIC/S. PI 041-1, “Good Practices for Data Management and Data Integrity in Regulated GMP/GDP Environments” (3rd draft), section 9.4 “Audit trail for computerised systems,” page 36.

Thoughts

It has long been the requirement that computer systems have audit trails and that these be convertible to a format that can be reviewed as appropriate. What these guidances are stating is:

  • There are key activities captured in the audit trail. These key activities are determined in a risk-based manner.
  • These key activities need to be reviewed when making decisions based on them (determine a frequency).
  • The audit trail needs to be able to show the reviewer the key activity.
  • These reviews need to be captured in the quality system (proceduralized, recorded).
  • This is part of the validated state of your system.

So, for example, my deviation system is evaluated and the key activity that needs to be reviewed is the decision to forward process. Quality makes this forward-processing determination at several points in the workflow. The audit trail review would thus be looking at who made the decision, when, and whether it met the criteria. The frequency might be established at the point of disposition for any deviation still in an open state, and again upon closure.

What we are being asked to do is evaluate all of our computer systems and figure out which parts of the audit trail need to be reviewed, and when.

Now here’s the problem. Most audit trails are garbage. Maybe they are human readable by some vague definition of readable (or even human). But they don’t have filters, or search, or templates. So companies need to evaluate their audit trails system by system (again, using a risk-based approach) to see if they are up to the task. You then end up with one or more solutions:

  • Rebuild the audit trail to make it human readable and give it filters and search criteria. For example, on a deviation record there is one view for “disposition” and another for “closure.”
  • Add reports (such as a set of Crystal Reports) to make it human readable and give filters and search criteria. You would probably end up with a report for “disposition” and another report for “closure.”
  • Utilize an export function to Excel (or a similar program) and use Excel’s functions to filter and search (a minimal filtering sketch follows this list). Remember to ensure you have a data verification process in place.
  • The best solution is to ensure the audit trail review is a step in your workflow and the review is captured as part of the audit trail. Ideally this is part of an exception reporting process driven by the system.
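Here is the filtering sketch referenced in the export bullet, using pandas on a hypothetical CSV export. The file name, column names, and action values are assumptions about what your system can export, not a standard.

```python
import pandas as pd

# Hypothetical export of a deviation system's audit trail.
trail = pd.read_csv("deviation_audit_trail.csv", parse_dates=["timestamp"])

# "Disposition" review view: who made the forward-processing decision, and when.
disposition_view = (
    trail[trail["action"] == "disposition_decision"]
    .loc[:, ["record_id", "user_id", "timestamp", "old_value", "new_value", "reason"]]
    .sort_values("timestamp")
)

print(disposition_view.to_string(index=False))
```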

What risk based questions should drive this?

  • What is the overall risk of the system?
  • What are the capabilities of the audit trail?
  • Can the data be modified after entry? Can it be modified prior to approval?
  • Is the result qualitative or quantitative?
  • Are changes to data visible on the record itself?