Retrospective Validation Doesn’t Really Exist

A recent FDA Warning Letter drives home the perils of ‘retrospective validation’ and how the term rarely means what folks want it to mean.

“In lieu of process validation studies, you attempted to retrospectively review past batches without scientifically establishing blend uniformity and other critical process performance indicators. You do not commit to conduct further process performance qualification studies that scientifically establish the ability of your manufacturing process to consistently yield finished products that meet their quality attributes.”

The FDA’s response here points to three truths:

  1. Validation needs to be done against critical quality attributes and critical process parameters to scientifically establish that the manufacturing process is consistent.
  2. Batch data on its own is rather useless.
  3. Validation is a continuous exercise; it is not once-and-done (or, in most people’s view, thrice-and-done).
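To make the second point concrete: a run of individually passing batches does not, by itself, scientifically establish a consistent process. A minimal sketch, using entirely hypothetical assay numbers and Cpk as one common capability statistic, shows how every batch can be "within spec" while the process still falls well short of a typical capability benchmark:

```python
# Hypothetical illustration only: assay values (% of label claim) for
# ten batches, every one of them individually inside the 95.0-105.0 spec.
batch_assays = [99.8, 101.2, 97.1, 103.9, 95.6, 104.2, 98.4, 96.3, 102.7, 100.1]

LSL, USL = 95.0, 105.0  # specification limits (hypothetical)

n = len(batch_assays)
mean = sum(batch_assays) / n
# sample standard deviation
var = sum((x - mean) ** 2 for x in batch_assays) / (n - 1)
sigma = var ** 0.5

# Cpk: distance from the mean to the nearest spec limit, in units of 3 sigma
cpk = min(USL - mean, mean - LSL) / (3 * sigma)

print(f"mean={mean:.2f}, sigma={sigma:.2f}, Cpk={cpk:.2f}")
# → mean=99.93, sigma=3.08, Cpk=0.53
```

Every batch passed, yet a Cpk of roughly 0.53 sits far below the 1.33 often cited as a minimum for a capable process. "All our historical batches met spec" and "the process is scientifically established as consistent" are two very different claims.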

I don’t think the current GMPs really allow for the concept of retrospective validation as most people understand it (including the recipient of that warning letter). It’s probably a term we should toss into the big box of Nope.


Retrospective validation, as most people mean it, is a type of process validation that evaluates historical data and records to demonstrate that an existing process consistently produces products meeting predetermined specifications.

The problem here is that this really just tells you what you were already hoping was true.

Retrospective validation has some major flaws:

  1. Limited control over data quality and completeness: Since retrospective validation relies on historical data, there may be gaps or inconsistencies in the available information. The data may not have been collected with validation in mind, leading to missing critical parameters or measurements. It rather throws out most of the principles of science.
  2. Potential bias in existing data: Historical data may be biased or incomplete, as it was not collected specifically for validation purposes. This can make it difficult to draw reliable conclusions about process performance and consistency.
  3. Difficulty in identifying and addressing hidden flaws: Since the process has been in use for some time, there may be hidden flaws or issues that have not been identified or challenged. These could potentially lead to non-conforming products or hazardous operating conditions.
  4. Difficulty in recreating original process conditions: It may be challenging to accurately recreate or understand the original process conditions under which the historical data was generated, potentially limiting the validity of conclusions drawn from the data.

What is truly called for is concurrent validation.

3 thoughts on “Retrospective Validation Doesn’t Really Exist”

  1. Agree completely on process validation — would be great to see your thoughts on retrospective validation for software. I have seen many instances of companies deploying some new tech for “non GMP use” then retrospectively validating the platform when they want to extend to GxP use cases.

    For example imagine an electronic form tool that has been implemented for various “back office” processes, and there is a desire to extend it for some GMP data collection without starting over from scratch with a separate instance of the software.

    They run some risk that the system is quietly used for GxP purposes prematurely but otherwise I think this can actually be a healthy approach to piloting/adopting new tech quickly.


    1. There are definitely times when a system moves from non-GXP to GXP, usually because of a bad judgment call, a better understanding of the regulations, or an expansion of intended use, or occasionally because the regulations have shifted.

      In these cases you now need to validate, and it should be an appropriate, risk-driven validation.

      In those cases where it is in use for GxP functions you need interim controls in place to ensure data is reliable and accurate, whether additional reviews or other mechanisms. It is not good enough to finish validation and say “Well all that data we entered while validating is good enough.”

