The Molecule That Changed Everything: How Insulin Rewired Drug Manufacturing and Regulatory Thinking

There’s a tendency in our industry to talk about “small molecules versus biologics” as if we woke up one morning and the world had simply divided itself into two neat categories. But the truth is more interesting—and more instructive—than that. The dividing line was drawn by one molecule in particular: insulin. And the story of how insulin moved from animal extraction to recombinant manufacturing didn’t just change how we make one drug. It fundamentally rewired how we think about manufacturing, quality, and regulation across the entire pharmaceutical landscape.

From Pancreases to Plasmids

For the first six decades of its therapeutic life, insulin was an extractive product. From the 1920s onward, producing it required enormous quantities of animal pancreases—primarily from cows and pigs—sourced from slaughterhouses. Eli Lilly began full-scale animal insulin production in 1923, using isoelectric precipitation to separate and purify the hormone, and that basic approach held for decades. Chromatographic advances in the 1970s improved purity and reduced the immunogenic reactions that had long plagued patients, but the fundamental dependency on animal tissue remained.

This was, in manufacturing terms, essentially a small-molecule mindset applied to a protein. You sourced your raw material, you extracted, you purified, you tested the final product against a specification, and you released it. The process was relatively well-characterized and reproducible. Quality lived primarily in the finished product testing.

But this model was fragile. Market forces and growing global demand revealed the unsustainable nature of dependency on animal sources. The fear of supply shortages was real. And it was into this gap that recombinant DNA technology arrived.

1982: The Paradigm Breaks Open

In 1978, scientists at City of Hope and Genentech developed a method for producing biosynthetic human insulin (BHI) using recombinant DNA technology, synthesizing the insulin A and B chains separately in E. coli. On October 28, 1982, after only five months of review, the FDA approved Humulin—the first biosynthetic human insulin and the first approved medical product of any kind derived from recombinant DNA technology.

Think about what happened here. Overnight, insulin manufacturing went from:

  • Animal tissue extraction → Living cell factory production
  • Sourcing variability tied to agricultural supply chains → Engineered biological systems with defined genetic constructs
  • Purification of a natural mixture → Directed expression of a specific gene product

The production systems themselves tell the story. Recombinant human insulin is produced predominantly in E. coli (where insulin precursors form inclusion bodies requiring solubilization and refolding) or in Saccharomyces cerevisiae (where soluble precursors are secreted into culture supernatant). Each system brings its own manufacturing challenges—post-translational modification limitations in bacteria, glycosylation considerations in yeast—that simply did not exist in the old extraction paradigm.

This wasn’t just a change in sourcing. It was a change in manufacturing identity.

“The Process Is the Product”

And here is where the real conceptual earthquake happened. With small-molecule drugs, you can fully characterize the molecule. You know every atom, every bond. If two manufacturers produce the same compound by different routes, you can prove equivalence through analytical testing of the finished product. The process matters, but it isn’t definitional.

Biologics are different. As the NIH Regulatory Knowledge Guide puts it directly: “the process is the product”—any changes in the manufacturing process can result in a fundamental change to the biological molecule, impacting the product and its performance, safety, or efficacy. The manufacturing process for biologics—from cell bank to fermentation to purification to formulation—determines the quality of the product in ways that cannot be fully captured by end-product testing alone.

Insulin was the first product to force the industry to confront this reality at commercial scale. When Lilly and Genentech brought Humulin to market, they weren’t just scaling up a chemical reaction. They were scaling up a living system, with all the inherent variability that implies—batch-to-batch differences in cell growth, protein folding, post-translational modifications, and impurity profiles.

This single insight—that for biologics, process control is product control—cascaded through the entire regulatory and quality framework over the next four decades.

The Regulatory Framework Catches Up

Insulin’s journey also exposed a peculiar regulatory gap. Despite being a biologic by any scientific definition, insulin was regulated as a drug under Section 505 of the Federal Food, Drug, and Cosmetic Act (FFDCA), not as a biologic under the Public Health Service Act (PHSA). This was largely a historical accident: when recombinant insulin arrived in 1982, the distinctions between FFDCA and PHSA weren’t particularly consequential, and the relevant FDA expertise happened to reside in the drug review division.

But this classification mismatch had real consequences. Because insulin was regulated as a “drug,” there was no pathway for biosimilar insulins—even after the Hatch-Waxman Act of 1984 created abbreviated pathways for generic small-molecule drugs. The “generic” framework simply doesn’t work for complex biological molecules where “identical” is the wrong standard.

It took decades to resolve this. The Biologics Price Competition and Innovation Act (BPCIA), enacted in 2010 as part of the Affordable Care Act, created an abbreviated regulatory pathway for biosimilars and mandated that insulin—along with certain other protein products—would transition from drug status to biologic status. On March 23, 2020, all insulin products were formally “deemed to be” biologics, licensed under Section 351 of the PHSA.

This wasn’t a relabeling exercise. It opened insulin to the biosimilar pathway for the first time, culminating in the July 2021 approval of Semglee (insulin glargine-yfgn) as the first interchangeable biosimilar insulin product. That approval—allowing pharmacy-level substitution of a biologic—was a moment the industry had been building toward for decades.

ICH Q5 and the Quality Architecture for Biologics

The regulatory thinking that insulin forced into existence didn’t stay confined to insulin. It spawned an entire framework of ICH guidelines specifically addressing the quality of biotechnological products:

  • ICH Q5A – Viral safety evaluation of biotech products derived from cell lines
  • ICH Q5B – Analysis of the expression construct in cell lines
  • ICH Q5C – Stability testing of biotechnological/biological products
  • ICH Q5D – Derivation and characterization of cell substrates
  • ICH Q5E – Comparability of biotechnological/biological products subject to changes in their manufacturing process

ICH Q5E deserves particular attention because it codifies the “process is the product” principle into an operational framework. It states that changes to manufacturing processes are “normal and expected” but insists that manufacturers demonstrate comparability—proving that post-change product has “highly similar quality attributes” and that no adverse impact on safety or efficacy has occurred. The guideline explicitly acknowledges that even “minor” changes can have unpredictable impacts on quality, safety, and efficacy.

This is fundamentally different from the small-molecule world, where a process change can often be managed through updated specifications and finished-product testing. For biologics, comparability exercises can involve extensive analytical characterization, in-process testing, stability studies, and potentially nonclinical or clinical assessments.

How This Changed Industry Thinking

The ripple effects of insulin’s transition from extraction to biologics manufacturing reshaped the entire pharmaceutical industry in several concrete ways:

1. Process Development Became a Core Competency, Not a Support Function.
When “the process is the product,” process development scientists aren’t just optimizing yield—they’re defining the drug. The extensive process characterization, design space definition, and control strategy work enshrined in ICH Q8 (Pharmaceutical Development) and ICH Q11 (Development and Manufacture of Drug Substances) grew directly from the recognition that biologics manufacturing demands a fundamentally deeper understanding of process-product relationships.

2. Cell Banks Became the Crown Jewels.
The master cell bank concept—maintaining a characterized, qualified starting point for all future production—became the foundational control strategy for biologics. Every batch traces back to a defined, banked cell line. This was a completely new paradigm compared to sourcing animal pancreases from slaughterhouses.

3. Comparability Became a Lifecycle Discipline.
In the small-molecule world, process changes are managed through supplements and updated batch records. In biologics, every significant process change triggers a comparability exercise that can take months and cost millions. This has made change control for biologics a far more rigorous discipline and has elevated the role of quality and regulatory functions in manufacturing decisions.

4. The Biosimilar Paradigm Created New Quality Standards.
Unlike generics, biosimilars cannot be “identical” to the reference product. The FDA requires a demonstration that the biosimilar is “highly similar” with “no clinically meaningful differences” in safety, purity, and potency. This “totality of evidence” approach, developed for the BPCIA pathway, requires sophisticated analytical, functional, and clinical comparisons that go well beyond the bioequivalence studies used for generic drugs.

5. Manufacturing Cost and Complexity Became Strategic Variables.
Biologics manufacturing requires living cell systems, specialized bioreactors, extensive purification trains (including viral clearance steps), and facility designs with stringent contamination controls. The average cost to develop an approved biologic is estimated at $2.6–2.8 billion, compared to significantly lower costs for small molecules. This manufacturing complexity has driven the growth of the CDMO industry and made facility design, tech transfer, and manufacturing strategy central to business planning.

The Broader Industry Shift

Insulin was the leading edge of a massive transformation. By 2023, the global pharmaceutical market was $1.34 trillion, with biologics representing 42% of sales (up from 31% in 2018) and growing three times faster than small molecules. Some analysts predict biologics will outstrip small molecule sales by 2027.

This growth has been enabled by the manufacturing and regulatory infrastructure that insulin’s transition helped build. The expression systems first commercialized for insulin—E. coli and yeast—remain workhorses, while mammalian cell lines (especially CHO cells) now dominate monoclonal antibody production. The quality frameworks (ICH Q5 series, Q6B specifications, Q8–Q11 development and manufacturing guidelines) provide the regulatory architecture that makes all of this possible.

Even the regulatory structure itself—the distinction between 21 CFR Parts 210/211 (drug CGMP) and 21 CFR Parts 600–680 (biologics)—reflects this historical evolution. Biologics manufacturers must often comply with both frameworks simultaneously, maintaining drug CGMP baselines while layering on biologics-specific controls for establishment licensing, lot release, and biological product deviation reporting.

Where We Are Now

Today, insulin sits at a fascinating intersection. It’s a relatively small, well-characterized protein—analytically simpler than a monoclonal antibody—but it carries the full regulatory weight of a biologic. The USP maintains five drug substance monographs and thirteen drug product monographs for insulin. Manufacturers must hold Biologics License Applications, comply with CGMP for both drugs and biologics, and submit to pre-approval inspections.

Meanwhile, the manufacturing technology continues to evolve. Animal-free recombinant insulin is now a critical component of cell culture media used in the production of other biologics, supporting CHO cell growth in monoclonal antibody manufacturing—a kind of recursive loop where the first recombinant biologic enables the manufacture of subsequent generations.

And the biosimilar pathway that insulin’s reclassification finally opened is beginning to deliver on its promise. Multiple biosimilar and interchangeable insulin products are now reaching patients at lower costs. The framework developed for insulin biosimilars is being applied across the biologics landscape—from adalimumab to trastuzumab to bevacizumab.

The Lesson for Quality Professionals

If there’s a single takeaway from insulin’s manufacturing evolution, it’s this: the way we make a drug is inseparable from what the drug is. This was always true for biologics, but it took insulin—the first recombinant product to reach commercial scale—to force the industry and regulators to internalize that principle.

Every comparability study you run, every cell bank qualification you perform, every process validation protocol you execute for a biologic product exists because of the conceptual framework that insulin’s journey established. The ICH Q5E comparability exercise, the Q5D cell substrate characterization, the Q5A viral safety evaluation—these aren’t bureaucratic requirements imposed from outside. They’re the rational response to a fundamental truth about biological manufacturing that insulin made impossible to ignore.

The molecule that changed everything didn’t just save millions of lives. It rewired how an entire industry thinks about the relationship between process and product. And in doing so, it set the stage for every biologic that followed.

The Jobs-to-Be-Done (JTBD): Origins, Function, and Value for Quality Systems

In the relentless march of quality and operational improvement, frameworks, methodologies, and tools abound, but true breakthroughs are rare. There is a persistent challenge: organizations often become locked into their own best practices, relying on habitual process reforms that seldom address the deeper “why” of operational behavior. This “process myopia”—where the visible sequence of tasks obscures the real purpose—runs in parallel with risk blindness, leaving many organizations vulnerable to the slow creep of inefficiency, bias, and, ultimately, quality failures.

The Jobs-to-Be-Done (JTBD) tool offers an effective method for reorientation. Rather than treating processes or systems as static routines, JTBD asks a deceptively simple question: What job are people actually hiring this process or tool to do? In deviation management, audit response, and even risk assessment itself, the answer to this question is the gravitational center around which effective redesign can be built.

What Does It Mean to Hire a Process?

To “hire” a process—even when it is a regulatory obligation—means viewing the process not merely as a compliance requirement, but as a tool or mechanism that stakeholders use to achieve specific, desirable outcomes beyond simple adherence. In Jobs-to-Be-Done (JTBD), the idea of “hiring” a process reframes organizational behavior: stakeholders (such as quality professionals, operators, managers, or auditors) are seen as engaging with the process to get particular jobs done—such as ensuring product safety, demonstrating control to regulators, reducing future risk, or creating operational transparency.

When a process is regulatory-mandated—such as deviation management, change control, or batch release—the “hiring” metaphor recognizes two coexisting realities:

Dual Functions: Compliance and Value Creation

  • Compliance Function: The organization must follow the process to satisfy legal, regulatory, or contractual obligations. Not following is not an option; it’s legally or organizationally enforced.
  • Functional “Hiring”: Even for required processes, users “hire” the process to accomplish additional jobs—like protecting patients, facilitating learning from mistakes, or building organizational credibility. A well-designed process serves both external (regulatory) and internal (value-creating) goals.

Implications for Process Design

  • Stakeholders still have choices in how they interact with the process—they can engage deeply (to learn and improve) or superficially (for box-checking), depending on how well the process helps them do their “real” job.
  • If a process is viewed only as a regulatory tax, users will find ways to shortcut, minimally comply, or bypass the spirit of the requirement, undermining learning and risk mitigation.
  • Effective design ensures the process delivers genuine value, making “compliance” a natural by-product of a process stakeholders genuinely want to “hire”—because it helps them achieve something meaningful and important.

Practical Example: Deviation Management

  • Regulatory “Must”: Deviations must be documented and investigated under GMP.
  • Users “Hire” the Process to: Identify real risks early, protect quality, learn from mistakes, and demonstrate control in audits.
  • If the process enables those jobs well, it will be embraced and used effectively. If not, it becomes paperwork compliance—and loses its potential as a learning or risk-reduction tool.

To “hire” a process under regulatory obligation is to approach its use intentionally, ensuring it not only satisfies external requirements but also delivers real value for those required to use it. The ultimate goal is to design a process that people would choose to “hire” even if it were not mandatory—because it supports their intrinsic goals, such as maintaining quality, learning, and risk control.

Unpacking Jobs-to-Be-Done: The Roots of Customer-Centricity

Historical Genesis: From Marketing Myopia to Outcome-Driven Innovation

JTBD’s intellectual lineage traces back to Theodore Levitt’s famous adage: “People don’t want to buy a quarter-inch drill. They want a quarter-inch hole.” This insight, long associated with his seminal 1960 Harvard Business Review article “Marketing Myopia,” underscores the fatal flaw of most process redesigns: overinvestment in features, tools, and procedures while neglecting the underlying human need or outcome.

This thinking resonates strongly with Peter Drucker’s core dictum that “the purpose of a business is to create and keep a customer”—and that marketing and innovation, not internal optimization, are the only valid means to this end. Drucker’s and Levitt’s insights together form the philosophical substrate for JTBD, framing the product, system, or process not as an end in itself, but as a means to enable desired change in someone’s “real world.”

Modern JTBD: Ulwick, Christensen, and Theory Development

Tony Ulwick, after experiencing firsthand the failure of IBM’s PCjr product, launched a search to discover how organizations could systematically identify the outcomes customers (or process users) use to judge new offerings. Ulwick formalized jobs-as-process thinking, and by marrying Six Sigma concepts with innovation research, developed the “Outcome-Driven Innovation” (ODI) method, later shared with Clayton Christensen at Harvard.

Clayton Christensen, in his disruption theory research, sharpened the framing: customers don’t simply buy products—they “hire” them to get a job done, to make progress in their lives or work. He and Bob Moesta extended this to include the emotional and social dimensions of these jobs, and added nuance on how jobs can signal category-breaking opportunities for disruptive innovation. In essence, JTBD isn’t just about features; it’s about the outcome and the experience of progress.

The JTBD tool is now well-established in business, product development, health care, and increasingly, internal process improvement.

What Is a “Job” and How Does JTBD Actually Work?

Core Premise: The “Job” as the Real Center of Process Design

A “Job” in JTBD is not a task or activity—it is the progress someone seeks in a specific context. In regulated quality systems, this reframing prompts a pivotal question: For every step in the process, what is the user actually trying to achieve?

JTBD Statement Structure:

When [situation], I want to [job], so I can [desired outcome].

  • “When a process deviation occurs, I want to quickly and accurately assess impact, so I can protect product quality without delaying production.”
  • “When reviewing supplier audit responses, I want to identify meaningful risk signals, so I can challenge assumptions before they become failures.”
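
For teams that keep JTBD statements alongside process documentation, it can help to treat the statement structure as data rather than free text. The sketch below is illustrative only; the dataclass and field names are assumptions, not part of any formal JTBD standard. It simply renders the two example statements above from structured fields, which makes them easier to review and reuse.

```python
from dataclasses import dataclass

@dataclass
class JTBDStatement:
    """A Jobs-to-Be-Done statement broken into its three parts (illustrative structure)."""
    situation: str   # "When [situation]"
    job: str         # "I want to [job]"
    outcome: str     # "so I can [desired outcome]"

    def render(self) -> str:
        return f"When {self.situation}, I want to {self.job}, so I can {self.outcome}."

examples = [
    JTBDStatement(
        situation="a process deviation occurs",
        job="quickly and accurately assess impact",
        outcome="protect product quality without delaying production",
    ),
    JTBDStatement(
        situation="reviewing supplier audit responses",
        job="identify meaningful risk signals",
        outcome="challenge assumptions before they become failures",
    ),
]

for stmt in examples:
    print(stmt.render())
```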

The Mechanics: Job Maps, Outcome Statements, and Dimensional Analysis

Job Map:

JTBD practitioners break the “job” down into a series of steps—the job map—outlining the user’s journey to achieve the desired progress. Ulwick’s “Universal Job Map” includes steps like: Define and plan, Locate inputs, Prepare, Confirm and validate, Execute, Monitor, Modify, and Conclude.

Dimension Analysis:
A full JTBD approach considers not only the functional needs (what must be accomplished), but also emotional (how users want to feel), social (how users want to appear), and cost (what users have to give up).

Outcome Statements:
JTBD expresses desired process outcomes in solution-agnostic language: To [achieve a specific goal], [user] must [perform action] to [produce a result].

The Relationship Between Job Maps and Process Maps

Job maps and process maps represent fundamentally different approaches to understanding and documenting work, despite both being visual tools that break down activities into sequential steps. Understanding their relationship reveals why each serves distinct purposes in organizational improvement efforts.

Core Distinction: Purpose vs. Execution

Job Maps focus on what customers or users are trying to accomplish—their desired outcomes and progress independent of any specific solution or current method. A job map asks: “What is the person fundamentally trying to achieve at each step?”

Process Maps focus on how work currently gets done—the specific activities, decisions, handoffs, and systems involved in executing a workflow. A process map asks: “What are the actual steps, roles, and systems involved in completing this work?”

Job Map Structure

Job maps follow a universal eight-step method regardless of industry or solution:

  1. Define – Determine goals and plan resources
  2. Locate – Gather required inputs and information
  3. Prepare – Set up the environment for execution
  4. Confirm – Verify readiness to proceed
  5. Execute – Carry out the core activity
  6. Monitor – Assess progress and performance
  7. Modify – Make adjustments as needed
  8. Conclude – Finish or prepare for repetition
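
One way to make the eight-step job map usable day to day is to hold it as a simple lookup that teams annotate with the jobs surfaced for a given process. A minimal sketch, assuming a plain Python dictionary keyed by the universal steps; the deviation-management annotations are illustrative, not prescriptive.

```python
# Ulwick's universal job map steps, with their generic descriptions.
UNIVERSAL_JOB_MAP = {
    "Define":   "Determine goals and plan resources",
    "Locate":   "Gather required inputs and information",
    "Prepare":  "Set up the environment for execution",
    "Confirm":  "Verify readiness to proceed",
    "Execute":  "Carry out the core activity",
    "Monitor":  "Assess progress and performance",
    "Modify":   "Make adjustments as needed",
    "Conclude": "Finish or prepare for repetition",
}

# Illustrative annotation: what a quality investigator is trying to achieve at each step.
deviation_jobs = {
    "Define":   "Clarify what 'effective investigation' means for this event",
    "Locate":   "Pull batch records, trends, and prior related deviations",
    "Execute":  "Run an unbiased, cross-functional root cause analysis",
    "Monitor":  "Track whether the investigation is still risk-focused",
    "Conclude": "Capture learning so the next event is detected earlier",
}

for step, generic in UNIVERSAL_JOB_MAP.items():
    note = deviation_jobs.get(step, "-")
    print(f"{step:<9} | {generic:<42} | {note}")
```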

Process Map Structure

Process maps vary significantly based on the specific workflow being documented and typically include:

  • Tasks and activities performed by different roles
  • Decision points where choices affect the flow
  • Handoffs between departments or systems
  • Inputs and outputs at each step
  • Time and resource requirements
  • Exception handling and alternate paths

Perspective and Scope

Job Maps maintain a solution-agnostic perspective. We can get remarkably close to universal job maps for an industry: whatever approach an individual organization takes, the job map remains the same, because it captures the underlying functional need rather than the method of fulfillment. A job map starts an improvement effort, helping us understand what needs to exist.

Process Maps are solution-specific. They document exactly how a particular organization, system, or workflow operates, including specific tools, roles, and procedures currently in use. The process map defines what is, and is an outcome of process improvement.

JTBD vs. Design Thinking, and Other Process Redesign Models

Most process improvement methodologies—including classic “design thinking”—center on incremental improvement, risk minimization, and stakeholder consensus. As previously critiqued, design thinking’s participatory workshops and empathy prototypes can reinforce conservative bias, indirectly perpetuating the status quo. The tendency to interview, ideate, and choose the “least disruptive” option can feed the “GI Joe Fallacy”: knowing is not enough; action emerges only through challenged structures and direct engagement.

JTBD’s strength?

It demands that organizations reframe the purpose and metrics of every step and tool: not “How do we optimize this investigation template?” but “Does this investigation process help users make actual progress toward safer, more effective risk detection?” JTBD uncovers latent needs, both explicit and tacit, that design thinking’s post-it note workshops often fail to surface.

Why JTBD Is Invaluable for Process Design in Quality Systems

JTBD Enables Auditable Process Redesign

In pharmaceutical manufacturing, deviation management is a linchpin process—defining how organizations identify, document, investigate, and respond to events that depart from expected norms. Classic improvement initiatives target cycle time, documentation accuracy, or audit readiness. But JTBD pushes deeper.

Example JTBD Analysis for Deviations:

  • Trigger: A deviation is detected.
  • Job: “I want to report and contextualize the event accurately, so I can ensure an effective response without causing unnecessary disruption.”
  • Desired Outcome: Minimized product quality risk, transparency of root causes, actionable learning, regulatory confidence.

By mapping out the jobs of different deviation process stakeholders—production staff, investigation leaders, quality approvers, regulatory auditors—organizations can surface unmet needs: e.g., “Accelerating cross-functional root cause analysis while maintaining unbiased investigation integrity”; “Helping frontline operators feel empowered rather than blamed for honest reporting”; “Ensuring remediation is prioritized and tracked.”

Revealing Hidden Friction and Underserved Needs

JTBD methodology surfaces both overt and tacit pain points, often ignored in traditional process audits:

  • Operators “hire” process workarounds when formal documentation is slow or punitive.
  • Investigators seek intuitive data access, not just fields for “root cause.”
  • Approvers want clarity, not bureaucracy.
  • Regulatory reviewers “hire” the deviation process to provide organizational intelligence—not just box-checking.

A JTBD-based diagnostic invariably shows where job performance is low but process compliance is high—a warning sign of process myopia and risk blindness.

Practical JTBD for Deviation Management: Step-by-Step Example

Job Statement and Context Definition

Define user archetypes:

  • Frontline Production Staff: “When a deviation occurs, I want a frictionless way to report it, so I can get support and feedback without being blamed.”
  • Quality Investigator: “When reviewing deviations, I want accessible, chronological data so I can detect patterns and act swiftly before escalation.”
  • Quality Leader: “When analyzing deviation trends, I want systemic insights that allow for proactive action—not just retrospection.”

Job Mapping: Stages of Deviation Lifecycle

  • Trigger/Detection: Event recognition (pattern recognition)—often leveraging both explicit SOPs and staff tacit knowledge.
  • Reporting: Document the event in a way that preserves context and allows for nuanced understanding.
  • Assessment: Rapid triage—“Is this risk emergent or routine? Is there unseen connection to a larger trend?” “Does this impact the product?”
  • Investigation: “Does the process allow multidisciplinary problem-solving, or does it force siloed closure? Are patterns shared across functions?”
  • Remediation: Job statement: “I want assurance that action will prevent recurrence and create meaningful learning.”
  • Closure and Learning Loop: “Does the process enable reflective practice and cognitive diversity—can feedback loops improve risk literacy?”

JTBD mapping reveals specific breakpoints: documentation systems that prioritize completeness over interpretability, investigation timelines that erode engagement, premature closure.

Outcome Statements for Metrics

Instead of “deviations closed on time,” measure:

  • Number of deviations generating actionable cross-functional insights.
  • Staff perception of process fairness and learning.
  • Time to credible remediation vs. time to closure.
  • Audit reviewer alignment with risk signals detected pre-close, not only post-mortem.
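
These outcome-oriented metrics can be computed directly from deviation records once the relevant dates and flags are captured. A minimal sketch, assuming hypothetical record fields (`opened`, `credible_remediation`, `closed`, `cross_functional_insight`) that your eQMS may or may not expose:

```python
from datetime import date
from statistics import mean

# Hypothetical deviation records; field names and values are assumptions for illustration.
deviations = [
    {"id": "DEV-001", "opened": date(2025, 1, 6),  "credible_remediation": date(2025, 1, 20),
     "closed": date(2025, 2, 14), "cross_functional_insight": True},
    {"id": "DEV-002", "opened": date(2025, 1, 9),  "credible_remediation": date(2025, 2, 25),
     "closed": date(2025, 2, 28), "cross_functional_insight": False},
    {"id": "DEV-003", "opened": date(2025, 2, 3),  "credible_remediation": date(2025, 2, 17),
     "closed": date(2025, 3, 30), "cross_functional_insight": True},
]

# Time to credible remediation vs. time to closure, plus the insight yield.
days_to_remediation = [(d["credible_remediation"] - d["opened"]).days for d in deviations]
days_to_closure = [(d["closed"] - d["opened"]).days for d in deviations]
insight_rate = sum(d["cross_functional_insight"] for d in deviations) / len(deviations)

print(f"Mean days to credible remediation: {mean(days_to_remediation):.1f}")
print(f"Mean days to closure:              {mean(days_to_closure):.1f}")
print(f"Deviations yielding cross-functional insight: {insight_rate:.0%}")
```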

JTBD and the Apprenticeship Dividend: Pattern Recognition and Tacit Knowledge

JTBD, when deployed authentically, actively supports the development of deeper pattern recognition and tacit knowledge—qualities essential for risk resilience.

  • Structured exposure programs ensure users “hire” the process to learn common and uncommon risks.
  • Cognitively diverse teams ensure that the job of “challenging assumptions” is not just theoretical.
  • True process improvement emerges when the system supports practice, reflection, and mentoring—outcomes unmeasurable by conventional improvement metrics.

JTBD Limitations: Caveats and Critical Perspective

No methodology is infallible. JTBD is only as powerful as the organization’s willingness to confront uncomfortable truths and challenge compliance-driven inertia:

  • Rigorous but Demanding: JTBD synthesis is non-“snackable” and lacks the pop-management immediacy of other tools.
  • Action Over Awareness: Knowing the job to be done is not sufficient; structures must enable action.
  • Regulatory Realities: Quality processes must satisfy regulatory standards, which are not always aligned with lived user experience. JTBD should inform, not override, compliance strategies.
  • Skill and Culture: Successful use demands qualitative interviewing skill, genuine cross-functional buy-in, and a culture of psychological safety—conditions not easily created.

Despite these challenges, JTBD remains unmatched for surfacing hidden process failures, uncovering underserved needs, and catalyzing redesign where it matters most.

Breaking Through the Status Quo

Many organizations pride themselves on their calibration routines, investigation checklists, and digital documentation platforms. But the reality is that these systems are often “hired” not to create learning—but to check boxes, push responsibility, and sustain the illusion of control. This leads to risk blindness: organizations systematically make themselves vulnerable when process myopia replaces real learning. That is zemblanity, the opposite of serendipity.

JTBD’s foundational question—“What job are we hiring this process to do?”—is more than a strategic exercise. It is a countermeasure against stagnation and blindness. It insists on radical honesty, relentless engagement, and humility before the complexity of operational reality. For deviation management, JTBD is a tool not just for compliance, but for organizational resilience and quality excellence.

Quality leaders should invest in JTBD not as a “one more tool,” but as a philosophical commitment: a way to continually link theory to action, root cause to remediation, and process improvement to real progress. Only then will organizations break free of procedural conservatism, cure risk blindness, and build systems worthy of trust and regulatory confidence.

When Water Systems Fail: Unpacking the LeMaitre Vascular Warning Letter

The FDA’s August 11, 2025 warning letter to LeMaitre Vascular reads like a masterclass in how fundamental water system deficiencies can cascade into comprehensive quality system failures. This warning letter offers lessons about the interconnected nature of pharmaceutical water systems and the regulatory expectations that surround them.

The Foundation Cracks

What makes this warning letter particularly instructive is how it demonstrates that water systems aren’t just utilities—they’re critical manufacturing infrastructure whose failures ripple through every aspect of product quality. LeMaitre’s North Brunswick facility, which manufactures Artegraft Collagen Vascular Grafts, found itself facing six major violations, with water system inadequacies serving as the primary catalyst.

The Artegraft device itself—a bovine carotid artery graft processed through enzymatic digestion and preserved in USP purified water and ethyl alcohol—places unique demands on water system reliability. When that foundation fails, everything built upon it becomes suspect.

Water Sampling: The Devil in the Details

The first violation strikes at something discussed extensively in previous posts: representative sampling. LeMaitre’s USP water sampling procedures contained what the FDA termed “inconsistent and conflicting requirements” that fundamentally compromised the representativeness of their sampling.

Consider the regulatory expectation here. As outlined in ISPE guidance, “sampling a POU [point of use] must include any pathway that the water travels to reach the process.” Yet LeMaitre was taking samples through methods that included purging, flushing, and disinfection steps that bore no resemblance to actual production use. This isn’t just a procedural misstep—it’s a fundamental misunderstanding of what water sampling is meant to accomplish.

The FDA’s criticism centers on three critical sampling failures:

  • Sampling Location Discrepancies: Taking samples through different pathways than production water actually follows. This violates the basic principle that quality control sampling should “mimic the way the water is used for manufacturing”.
  • Pre-Sampling Conditioning: The procedures required extensive purging and cleaning before sampling—activities that would never occur during normal production use. This creates “aspirational data”—results that reflect what we wish our system looked like rather than how it actually performs.
  • Inconsistent Documentation: Failure to document required replacement activities during sampling, creating gaps in the very records meant to demonstrate control.

The Sterilant Switcheroo

Perhaps more concerning was LeMaitre’s unauthorized change of sterilant solutions for their USP water system sanitization. The company switched sterilants sometime in 2024 without documenting the change control, assessing biocompatibility impacts, or evaluating potential contaminant differences.

This represents a fundamental failure in change control—one of the most basic requirements in pharmaceutical manufacturing. Every change to a validated system requires formal assessment, particularly when that change could affect product safety. The fact that LeMaitre could not provide documentation supporting this change during the inspection suggests a broader systemic issue with their change control processes.

Environmental Monitoring: Missing the Forest for the Trees

The second major violation addressed LeMaitre’s environmental monitoring program—specifically, their practice of cleaning surfaces before sampling. This mirrors issues we see repeatedly in pharmaceutical manufacturing, where the desire for “good” data overrides the need for representative data.

Environmental monitoring serves a specific purpose: to detect contamination that could reasonably be expected to occur during normal operations. When you clean surfaces before sampling, you’re essentially asking, “How clean can we make things when we try really hard?” rather than “How clean are things under normal operating conditions?”

The regulatory expectation is clear: environmental monitoring should reflect actual production conditions, including normal personnel traffic and operational activities. LeMaitre’s procedures required cleaning surfaces and minimizing personnel traffic around air samplers—creating an artificial environment that bore little resemblance to actual production conditions.

Sterilization Validation: Building on Shaky Ground

The third violation highlighted inadequate sterilization process validation for the Artegraft products. LeMaitre failed to consider bioburden of raw materials, their storage conditions, and environmental controls during manufacturing—all fundamental requirements for sterilization validation.

This connects directly back to the water system failures. When your water system monitoring doesn’t provide representative data, and your environmental monitoring doesn’t reflect actual conditions, how can you adequately assess the bioburden challenges your sterilization process must overcome?

The FDA noted that LeMaitre had six out-of-specification bioburden results between September 2024 and March 2025, yet took no action to evaluate whether testing frequency should be increased. This represents a fundamental misunderstanding of how bioburden data should inform sterilization validation and ongoing process control.
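
The warning letter does not prescribe a specific trigger, but the expectation is that bioburden data feed back into testing frequency. The sketch below is one hypothetical escalation rule—not FDA’s requirement and not LeMaitre’s procedure—that flags when the count of out-of-specification results in a rolling window warrants a frequency review:

```python
from datetime import date, timedelta

# Hypothetical bioburden results: (sample date, out-of-specification flag). Illustrative only.
results = [
    (date(2024, 9, 12), True), (date(2024, 10, 3), False), (date(2024, 11, 7), True),
    (date(2024, 12, 5), True), (date(2025, 1, 16), True),  (date(2025, 2, 6), True),
    (date(2025, 3, 13), True),
]

WINDOW = timedelta(days=180)   # rolling review window (assumption, not a regulatory value)
OOS_TRIGGER = 3                # OOS count that triggers a frequency review (assumption)

def frequency_review_due(results, as_of):
    """Return True if the OOS count within the rolling window meets the trigger."""
    recent_oos = [d for d, oos in results if oos and as_of - d <= WINDOW]
    return len(recent_oos) >= OOS_TRIGGER

print(frequency_review_due(results, as_of=date(2025, 3, 31)))  # True -> review testing frequency
```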

CAPA: When Process Discipline Breaks Down

The final violations addressed LeMaitre’s Corrective and Preventive Action (CAPA) system, where multiple CAPAs exceeded the company’s own established timeframes by significant margins. A high-risk CAPA took 81 days to complete, well beyond its required timeframe, while medium- and low-risk CAPAs exceeded their deadlines by 120-216 days.

This isn’t just about missing deadlines—it’s about the erosion of process discipline. When CAPA systems lose their urgency and rigor, it signals a broader cultural issue where quality requirements become suggestions rather than requirements.

The Recall That Wasn’t

Perhaps most concerning was LeMaitre’s failure to report a device recall to the FDA. The company distributed grafts manufactured using raw material from a non-approved supplier, with one graft implanted in a patient before the recall was initiated. This constituted a reportable removal under 21 CFR Part 806, yet LeMaitre failed to notify the FDA as required.

This represents the ultimate failure: when quality system breakdowns reach patients. The cascade from water system failures to inadequate environmental monitoring to poor change control ultimately resulted in a product safety issue that required patient intervention.

Gap Assessment Questions

For organizations conducting their own gap assessments based on this warning letter, consider these critical questions:

Water System Controls

  • Are your water sampling procedures representative of actual production use conditions?
  • Do you have documented change control for any modifications to water system sterilants or sanitization procedures?
  • Are all water system sampling activities properly documented, including any maintenance or replacement activities?
  • Have you assessed the impact of any sterilant changes on product biocompatibility?

Environmental Monitoring

  • Do your environmental monitoring procedures reflect normal production conditions?
  • Are surfaces cleaned before environmental sampling, and if so, is this representative of normal operations?
  • Does your environmental monitoring capture the impact of actual personnel traffic and operational activities?
  • Are your sampling frequencies and locations justified by risk assessment?

Sterilization and Bioburden Control

  • Does your sterilization validation consider bioburden from all raw materials and components?
  • Have you established appropriate bioburden testing frequencies based on historical data and risk assessment?
  • Do you have procedures for evaluating when bioburden testing frequency should be increased based on out-of-specification results?
  • Are bioburden results from raw materials and packaging components included in your sterilization validation?

CAPA System Integrity

  • Are CAPA timelines consistently met according to your established procedures?
  • Do you have documented rationales for any CAPA deadline extensions?
  • Is CAPA effectiveness verification consistently performed and documented?
  • Are supplier corrective actions properly tracked and their effectiveness verified?

Change Control and Documentation

  • Are all changes to validated systems properly documented and assessed?
  • Do you have procedures for notifying relevant departments when suppliers change materials or processes?
  • Are the impacts of changes on product quality and safety systematically evaluated?
  • Is there a formal process for assessing when changes require revalidation?

Regulatory Compliance

  • Are all required reports (corrections, removals, MDRs) submitted within regulatory timeframes?
  • Do you have systems in place to identify when product removals constitute reportable events?
  • Are all regulatory communications properly documented and tracked?

Learning from LeMaitre’s Missteps

This warning letter serves as a reminder that pharmaceutical manufacturing is a system of interconnected controls, where failures in fundamental areas like water systems can cascade through every aspect of operations. The path from water sampling deficiencies to patient safety issues is shorter than many organizations realize.

The most sobering aspect of this warning letter is how preventable these violations were. Representative sampling, proper change control, and timely CAPA completion aren’t cutting-edge regulatory science—they’re fundamental GMP requirements that have been established for decades.

For quality professionals, this warning letter reinforces the importance of treating utility systems with the same rigor we apply to manufacturing processes. Water isn’t just a raw material—it’s a critical quality attribute that deserves the same level of control, monitoring, and validation as any other aspect of your manufacturing process.

The question isn’t whether your water system works when everything goes perfectly. The question is whether your monitoring and control systems will detect problems before they become patient safety issues. Based on LeMaitre’s experience, that’s a question worth asking—and answering—before the FDA does it for you.

Meeting Worst-Case Testing Requirements Through Hypothesis-Driven Validation

The integration of hypothesis-driven validation with traditional worst-case testing requirements represents a fundamental evolution in how we approach pharmaceutical process validation. Rather than replacing worst-case concepts, the hypothesis-driven approach provides scientific rigor and enhanced understanding while fully satisfying regulatory expectations for challenging process conditions under extreme scenarios.

The Evolution of Worst-Case Concepts in Modern Validation

The concept of “worst-case” testing has undergone significant refinement since the original 1987 FDA guidance, which defined worst-case as “a set of conditions encompassing upper and lower limits and circumstances, including those within standard operating procedures, which pose the greatest chance of process or product failure when compared to ideal conditions”. The FDA’s 2011 Process Validation guidance shifted emphasis from conducting validation runs under worst-case conditions to incorporating worst-case considerations throughout the process design and qualification phases.

This evolution aligns perfectly with hypothesis-driven validation principles. Rather than conducting three validation batches under artificially extreme conditions that may not represent actual manufacturing scenarios, the modern lifecycle approach integrates worst-case testing throughout process development, qualification, and continued verification stages. Hypothesis-driven validation enhances this approach by making the scientific rationale for worst-case selection explicit and testable.

| Guidance/Regulation | Agency | Year Published | Page | Requirement |
| --- | --- | --- | --- | --- |
| EU Annex 15 Qualification and Validation | EMA | 2015 | 5 | PPQ should include tests under normal operating conditions with worst case batch sizes |
| EU Annex 15 Qualification and Validation | EMA | 2015 | 16 | Definition: Worst Case – a condition or set of conditions encompassing upper and lower processing limits and circumstances, within standard operating procedures, which pose the greatest chance of product or process failure |
| EMA Process Validation for Biotechnology-Derived Active Substances | EMA | 2016 | 5 | Evaluation of selected step(s) operating in worst case and/or non-standard conditions (e.g. impurity spiking challenge) can be performed to support process robustness |
| EMA Process Validation for Biotechnology-Derived Active Substances | EMA | 2016 | 10 | Evaluation of purification steps operating in worst case and/or non-standard conditions (e.g. process hold times, spiking challenge) to document process robustness |
| EMA Process Validation for Biotechnology-Derived Active Substances | EMA | 2016 | 11 | Studies conducted under worst case conditions and/or non-standard conditions (e.g. higher temperature, longer time) to support suitability of claimed conditions |
| WHO GMP Validation Guidelines (Annex 3) | WHO | 2015 | 125 | Where necessary, worst-case situations or specific challenge tests should be considered for inclusion in the qualification and validation |
| PIC/S Validation Master Plan Guide (PI 006-3) | PIC/S | 2007 | 13 | Challenge element to determine robustness of the process, generally referred to as a “worst case” exercise using starting materials on the extremes of specification |
| FDA Process Validation: General Principles and Practices | FDA | 2011 | Not specified | While not explicitly requiring worst case testing for PPQ, emphasizes understanding and controlling variability and process robustness |

Scientific Framework for Worst-Case Integration

Hypothesis-Based Worst-Case Definition

Traditional worst-case selection often relies on subjective expert judgment or generic industry practices. The hypothesis-driven approach transforms this into a scientifically rigorous process by developing specific, testable hypotheses about which conditions truly represent the most challenging scenarios for process performance.

For the mAb cell culture example, instead of generically testing “upper and lower limits” of all parameters, we develop specific hypotheses about worst-case interactions:

Hypothesis-Based Worst-Case Selection: The combination of minimum pH (6.95), maximum temperature (37.5°C), and minimum dissolved oxygen (35%) during high cell density phase (days 8-12) represents the worst-case scenario for maintaining both titer and product quality, as this combination will result in >25% reduction in viable cell density and >15% increase in acidic charge variants compared to center-point conditions.

This hypothesis is falsifiable and provides clear scientific justification for why these specific conditions constitute “worst-case” rather than other possible extreme combinations.
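
To keep the hypothesis falsifiable in practice, it helps to encode its predictions as explicit acceptance criteria that study data can be evaluated against. A minimal sketch, assuming hypothetical measurements of viable cell density (VCD) and acidic charge variants from center-point and worst-case runs; the function name and the interpretation of the acidic-variant increase as percentage points are assumptions.

```python
def evaluate_worst_case_hypothesis(center, worst):
    """Check the stated predictions: >25% VCD reduction and >15% increase in acidic variants
    at the proposed worst-case condition relative to the center point.
    The acidic-variant increase is interpreted here as percentage points (an assumption);
    a relative increase would also be defensible."""
    vcd_reduction = (center["vcd"] - worst["vcd"]) / center["vcd"] * 100
    acidic_increase = worst["acidic_variants_pct"] - center["acidic_variants_pct"]
    return {
        "vcd_reduction_pct": round(vcd_reduction, 1),
        "acidic_variant_increase_pct": round(acidic_increase, 1),
        "hypothesis_supported": vcd_reduction > 25 and acidic_increase > 15,
    }

# Illustrative run data (not real study results).
center_point = {"vcd": 22.0e6, "acidic_variants_pct": 18.0}
worst_case   = {"vcd": 15.5e6, "acidic_variants_pct": 35.5}

print(evaluate_worst_case_hypothesis(center_point, worst_case))
```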

Process Design Stage Integration

ICH Q7 and modern validation approaches emphasize that worst-case considerations should be integrated during process design rather than only during validation execution. The hypothesis-driven approach strengthens this integration by ensuring worst-case scenarios are based on mechanistic understanding rather than arbitrary parameter combinations.

Design Space Boundary Testing

During process development, systematic testing of design space boundaries provides scientific evidence for worst-case identification. For example, if our hypothesis predicts that pH-temperature interactions are critical, we systematically test these boundaries to identify the specific combinations that represent genuine worst-case conditions rather than simply testing all possible parameter extremes.
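
In practice, “systematically test these boundaries” often starts with enumerating the corner combinations of the implicated parameters and prioritizing the ones the interaction hypothesis flags. A small sketch, assuming illustrative operating ranges (not a recommendation for any specific process):

```python
from itertools import product

# Illustrative normal operating ranges (assumptions, not recommendations).
ranges = {
    "pH":     (6.95, 7.10),
    "temp_C": (36.0, 37.0),
    "DO_pct": (35, 50),
}

# Enumerate every corner of the design space (2^3 = 8 combinations).
corners = [dict(zip(ranges, combo)) for combo in product(*ranges.values())]

# If the hypothesis implicates a low-pH / high-temperature interaction,
# rank those corners first for boundary testing.
def interaction_priority(corner):
    return corner["pH"] == ranges["pH"][0] and corner["temp_C"] == ranges["temp_C"][1]

for corner in sorted(corners, key=interaction_priority, reverse=True):
    print(corner, "<- prioritized" if interaction_priority(corner) else "")
```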

Regulatory Compliance Through Enhanced Scientific Rigor

EMA Biotechnology Guidance Alignment

The EMA guidance on biotechnology-derived active substances specifically requires that “Studies conducted under worst case conditions should be performed to document the robustness of the process”. The hypothesis-driven approach exceeds these requirements by:

  1. Scientific Justification: Providing mechanistic understanding of why specific conditions represent worst-case scenarios
  2. Predictive Capability: Enabling prediction of process behavior under conditions not directly tested
  3. Risk-Based Assessment: Linking worst-case selection to patient safety through quality attribute impact assessment

ICH Q7 Process Validation Requirements

ICH Q7 requires that process validation demonstrate “that the process operates within established parameters and yields product meeting its predetermined specifications and quality characteristics”. The hypothesis-driven approach satisfies these requirements while providing additional value:

Traditional ICH Q7 Compliance:

  • Demonstrates process operates within established parameters
  • Shows consistent product quality
  • Provides documented evidence

Enhanced Hypothesis-Driven Compliance:

  • Demonstrates process operates within established parameters
  • Shows consistent product quality
  • Provides documented evidence
  • Explains why parameters are set at specific levels
  • Predicts process behavior under untested conditions
  • Provides scientific basis for parameter range justification

Practical Implementation of Worst-Case Hypothesis Testing

Cell Culture Bioreactor Example

For a CHO cell culture process, worst-case testing integration follows this structured approach:

Phase 1: Worst-Case Hypothesis Development

Instead of testing arbitrary parameter combinations, develop specific hypotheses about failure mechanisms:

Metabolic Stress Hypothesis: The worst-case metabolic stress condition occurs when glucose depletion coincides with high lactate accumulation (>4 g/L) and elevated CO₂ (>10%) simultaneously, leading to >50% reduction in specific productivity within 24 hours.

Product Quality Degradation Hypothesis: The worst-case condition for charge variant formation is the combination of extended culture duration (>14 days) with pH drift above 7.2 for >12 hours, resulting in >10% increase in acidic variants.

Phase 2: Systematic Worst-Case Testing Design

Rather than three worst-case validation batches, integrate systematic testing throughout process qualification:

| Study Phase | Traditional Approach | Hypothesis-Driven Integration |
| --- | --- | --- |
| Process Development | Limited worst-case exploration | Systematic boundary testing to validate worst-case hypotheses |
| Process Qualification | 3 batches under arbitrary worst-case | Multiple studies testing specific worst-case mechanisms |
| Commercial Monitoring | Reactive deviation investigation | Proactive monitoring for predicted worst-case indicators |

Phase 3: Worst-Case Challenge Studies

Design specific studies to test worst-case hypotheses under controlled conditions:

Controlled pH Deviation Study:

  • Deliberately induce pH drift to 7.3 for 18 hours during production phase
  • Testable Prediction: Acidic variants will increase by 8-12%
  • Falsification Criteria: If variant increase is <5% or >15%, hypothesis requires revision
  • Regulatory Value: Demonstrates process robustness under worst-case pH conditions

Metabolic Stress Challenge:

  • Create controlled glucose limitation combined with high CO₂ environment
  • Testable Prediction: Cell viability will drop to <80% within 36 hours
  • Falsification Criteria: If viability remains >90%, worst-case assumptions are incorrect
  • Regulatory Value: Provides quantitative data on process failure mechanisms
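
The falsification criteria above are what make these challenge studies genuine hypothesis tests, so it is worth evaluating results against the pre-stated bands mechanically rather than by inspection. A minimal sketch using the pH-deviation study’s numbers; the bands come from the bullets above, while the function and verdict wording are assumptions.

```python
def judge_challenge_result(observed_increase_pct,
                           predicted=(8.0, 12.0),
                           falsified_below=5.0,
                           falsified_above=15.0):
    """Classify an observed acidic-variant increase against the pre-stated bands.
    Bands are taken from the challenge-study bullets; verdict labels are illustrative."""
    if observed_increase_pct < falsified_below or observed_increase_pct > falsified_above:
        return "hypothesis falsified - revise the worst-case model"
    if predicted[0] <= observed_increase_pct <= predicted[1]:
        return "prediction confirmed - hypothesis supported"
    return "within tolerance but outside prediction - refine the hypothesis"

for observed in (3.2, 9.7, 13.8, 16.5):
    print(f"{observed:>5.1f}% increase -> {judge_challenge_result(observed)}")
```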

Meeting Matrix and Bracketing Requirements

Traditional validation often uses matrix and bracketing approaches to reduce validation burden while ensuring worst-case coverage. The hypothesis-driven approach enhances these strategies by providing scientific justification for grouping and worst-case selection decisions.

Enhanced Matrix Approach

Instead of grouping based on similar equipment size or configuration, group based on mechanistic similarity as defined by validated hypotheses:

Traditional Matrix Grouping: All 1000L bioreactors with similar impeller configuration are grouped together.

Hypothesis-Driven Matrix Grouping: All bioreactors whose oxygen mass transfer coefficient (kLa) falls within 15% of one another and whose mixing time is under 30 seconds are grouped together, because validated hypotheses demonstrate that these parameters control product quality variability.
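
A simple way to operationalize this grouping rule is to compare each bioreactor’s characterization data against a reference unit. The sketch below is illustrative: the 15% kLa window and 30-second mixing-time cutoff come from the text, while the reference-unit framing, reactor IDs, and values are invented for the example.

```python
# Hypothetical characterization data: reactor ID -> (kLa in 1/h, mixing time in s).
reactors = {
    "BR-1001": (12.0, 22), "BR-1002": (13.1, 26), "BR-1003": (10.4, 24),
    "BR-1004": (15.0, 28), "BR-1005": (12.6, 41),
}

REFERENCE = "BR-1001"   # assumed reference unit for the comparison
KLA_WINDOW = 0.15       # +/- 15% of the reference kLa
MAX_MIXING_TIME = 30    # seconds

ref_kla, _ = reactors[REFERENCE]
grouped, excluded = [], []
for rid, (kla, t_mix) in reactors.items():
    same_mechanism = abs(kla - ref_kla) / ref_kla <= KLA_WINDOW and t_mix < MAX_MIXING_TIME
    (grouped if same_mechanism else excluded).append(rid)

print("Grouped with reference for matrix validation:", grouped)
print("Requires separate qualification:", excluded)
```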

Scientific Bracketing Strategy

The hypothesis-driven approach transforms bracketing from arbitrary extreme testing to mechanistically justified boundary evaluation:

Bracketing Hypothesis: If the process performs adequately under maximum metabolic demand conditions (highest cell density with minimum nutrient feeding rate) and minimum metabolic demand conditions (lowest cell density with maximum feeding rate), then all intermediate conditions will perform within acceptable ranges because metabolic stress is the primary driver of process failure.

This hypothesis can be tested and potentially falsified, providing genuine scientific basis for bracketing strategies rather than regulatory convenience.

Enhanced Validation Reports

Hypothesis-driven validation reports provide regulators with significantly more insight than traditional approaches:

Traditional Worst-Case Documentation: Three validation batches were executed under worst-case conditions (maximum and minimum parameter ranges). All batches met specifications, demonstrating process robustness.

Hypothesis-Driven Documentation: Process robustness was demonstrated through systematic testing of six specific hypotheses about failure mechanisms. Worst-case conditions were scientifically selected based on mechanistic understanding of metabolic stress, pH sensitivity, and product degradation pathways. Results confirm process operates reliably even under conditions that challenge the primary failure mechanisms.

Regulatory Submission Enhancement

The hypothesis-driven approach strengthens regulatory submissions by providing:

  1. Scientific Rationale: Clear explanation of worst-case selection criteria
  2. Predictive Capability: Evidence that process behavior can be predicted under untested conditions
  3. Risk Assessment: Quantitative understanding of failure probability under different scenarios
  4. Continuous Improvement: Framework for ongoing process optimization based on mechanistic understanding

Integration with Quality by Design (QbD) Principles

The hypothesis-driven approach to worst-case testing aligns perfectly with ICH Q8-Q11 Quality by Design principles while satisfying traditional validation requirements:

Design Space Verification

Instead of arbitrary worst-case testing, systematically verify design space boundaries through hypothesis testing:

Design Space Hypothesis: Operation anywhere within the defined design space (pH 6.95-7.10, Temperature 36-37°C, DO 35-50%) will result in product meeting CQA specifications with >95% confidence.

Worst-Case Verification: Test this hypothesis by deliberately operating at design space boundaries and measuring CQA response, providing scientific evidence for design space validity rather than compliance demonstration.
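
Verification of that hypothesis ultimately comes down to whether CQA results from boundary runs meet their acceptance criteria. A minimal sketch of the bookkeeping: the CQA names, specifications, and run values are hypothetical, and demonstrating the stated >95% confidence would additionally require an appropriate statistical analysis (e.g., tolerance intervals) beyond this sketch.

```python
# Hypothetical CQA specifications: (lower limit, upper limit); None = not applicable.
cqa_specs = {
    "acidic_variants_pct": (None, 30.0),
    "titer_g_per_L":       (3.0, None),
}

# Hypothetical results from runs at design space boundaries.
boundary_runs = [
    {"run": "corner-lowpH-highT", "acidic_variants_pct": 27.4, "titer_g_per_L": 3.4},
    {"run": "corner-highpH-lowT", "acidic_variants_pct": 21.8, "titer_g_per_L": 3.9},
    {"run": "corner-lowDO",       "acidic_variants_pct": 29.1, "titer_g_per_L": 3.1},
]

def meets_spec(value, limits):
    lower, upper = limits
    return (lower is None or value >= lower) and (upper is None or value <= upper)

for run in boundary_runs:
    failures = [c for c, lim in cqa_specs.items() if not meets_spec(run[c], lim)]
    status = "PASS" if not failures else f"FAIL ({', '.join(failures)})"
    print(f"{run['run']:<22} {status}")
```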

Control Strategy Justification

Hypothesis-driven worst-case testing provides scientific justification for control strategy elements:

Traditional Control Strategy: pH must be controlled between 6.95-7.10 based on validation data.

Enhanced Control Strategy: pH must be controlled between 6.95-7.10 because validated hypotheses demonstrate that pH excursions above 7.15 for >8 hours increase acidic variants beyond specification limits, while pH below 6.90 reduces cell viability by >20% within 12 hours.
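
The enhanced control strategy is directly monitorable: the rule “pH above 7.15 for more than 8 hours” is a time-above-limit calculation on the pH trend. A minimal sketch, assuming hourly pH readings; the data and threshold handling are illustrative, not a validated alarm design.

```python
# Hourly pH readings for part of the production phase (hypothetical).
ph_trend = [7.05, 7.08, 7.12, 7.16, 7.18, 7.19, 7.17, 7.16, 7.18, 7.20, 7.19, 7.17, 7.14, 7.09]

PH_LIMIT = 7.15
MAX_HOURS_ABOVE = 8
SAMPLE_INTERVAL_H = 1.0

# Cumulative time above the limit; a consecutive-excursion rule is an alternative interpretation.
hours_above = sum(SAMPLE_INTERVAL_H for ph in ph_trend if ph > PH_LIMIT)

if hours_above > MAX_HOURS_ABOVE:
    print(f"ALERT: pH exceeded {PH_LIMIT} for {hours_above:.0f} h (limit {MAX_HOURS_ABOVE} h)")
else:
    print(f"pH above {PH_LIMIT} for {hours_above:.0f} h - within the justified excursion budget")
```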

Scientific Rigor Enhances Regulatory Compliance

The hypothesis-driven approach to validation doesn’t circumvent worst-case testing requirements—it elevates them from compliance exercises to genuine scientific inquiry. By developing specific, testable hypotheses about what constitutes worst-case conditions and why, we satisfy regulatory expectations while building genuine process understanding that supports continuous improvement and regulatory flexibility.

This approach provides regulators with the scientific evidence they need to have confidence in process robustness while giving manufacturers the process understanding necessary for lifecycle management, change control, and optimization. The result is validation that serves both compliance and business objectives through enhanced scientific rigor rather than additional bureaucracy.

The integration of worst-case testing with hypothesis-driven validation represents the evolution of pharmaceutical process validation from documentation exercises toward genuine scientific methodology. An evolution that strengthens rather than weakens regulatory compliance while providing the process understanding necessary for 21st-century pharmaceutical manufacturing.

Process Mapping to Process Modeling – The Next Step

In the last two posts (here and here) I’ve been talking about how process mapping is a valuable set of techniques for creating a visual representation of the processes within an organization. These are fundamental tools, and every quality professional should be fluent in them.

The next level of maturity is process modeling which involves creating a digital representation of a process that can be analyzed, simulated, and optimized. Way more comprehensive, and frankly, very very hard to do and maintain.

| Process Map | Process Model | Why Is This Important? |
| --- | --- | --- |
| Notation ambiguous | Standardized notation convention | Standardized notation conventions for process modeling, such as Business Process Model and Notation (BPMN), drive clarity, consistency, communication, and process improvement. |
| Precision usually lacking | As precise as needed | Precision drives model accuracy and effectiveness. Too often, process maps are all over the place. |
| Icons (representing process components) made up or loosely defined | Icons objectively defined and standardized | Common modeling conventions ensure that all process creators represent models consistently, regardless of who in the organization created them. |
| Relationship of icons portrayed visually | Icon relationships definite and explained in annotations, a process model glossary, and process narratives | Reducing ambiguity, improving standardization, and easing knowledge transfer are the whole goal here. And frankly, the average process map falls short. |
| Limited to portrayal of simple ideas | Can depict appropriate complexity | We need to represent complex workflows in a visually comprehensible manner, striking a balance between detail and clarity. The value of scalable detail cannot be overstated. |
| One-time snapshot | Can grow, evolve, and mature | How many times have you sat down to a project and started fresh with a process map? Enough said. |
| May be created with simple drawing tools | Created with a tool appropriate to the need | The right tool for the right job. |
| Difficult to use for even the simplest manual simulations | May provide manual or automated process simulation | In a world of ever more automation, being able to run a good process simulation is critical. |
| Difficult to link with related diagrams or maps | Vertical and horizontal linking, showing relationships among processes and across process levels | Processes don’t stand alone; they are interconnected in a variety of ways. Being able to move up and down in detail, and across the process family, is great for diagnosing problems. |
| Uses simple file storage with no inherent relationships | Uses a repository of related models within a BPM system | It is fairly common to create process maps and keep them separate, maybe in an SOP, but more often in a dozen different, unconnected places, making them hard to find when needed. Process modeling maturity moves us toward a library approach, which drives knowledge management. |
| Appropriate for quick capture of ideas | Appropriate for any level of process capture, analysis, and design | Processes are living and breathing; our tools should reflect that. |
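
To make the simulation row above concrete: once a process is held as a model rather than a drawing, even a crude Monte Carlo over step durations becomes possible. A minimal sketch, assuming a deviation-handling process expressed as ordered steps with triangular duration estimates; all numbers are invented for illustration, and real BPM suites do far more than this.

```python
import random

# A deviation-handling process as a model: step name -> (min, most likely, max) days (illustrative).
process_model = [
    ("Report event",        (0.1, 0.5, 2.0)),
    ("Triage & assessment", (0.5, 1.0, 3.0)),
    ("Investigation",       (3.0, 10.0, 30.0)),
    ("CAPA definition",     (1.0, 3.0, 10.0)),
    ("Approval & closure",  (0.5, 2.0, 7.0)),
]

def simulate_cycle_time(model, runs=10_000, seed=42):
    """Monte Carlo estimate of end-to-end cycle time from per-step triangular estimates."""
    rng = random.Random(seed)
    totals = []
    for _ in range(runs):
        totals.append(sum(rng.triangular(lo, hi, mode) for _, (lo, mode, hi) in model))
    totals.sort()
    return totals[len(totals) // 2], totals[int(len(totals) * 0.9)]  # median, 90th percentile

median_days, p90_days = simulate_cycle_time(process_model)
print(f"Median cycle time: {median_days:.1f} days; 90th percentile: {p90_days:.1f} days")
```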

This is all about moving to a process repository and away from a document mindset. I think it is a great shame that the eQMS players don’t consider this part of their core mission, largely because most quality units don’t see it as part of theirs. As quality leaders, we should treat process management as critical for future success. This is all about profound knowledge, and about utilizing it to drive true improvements.