Methodologies, Frameworks, and Tools in Systems Thinking and Quality by Design

We often encounter three fundamental concepts in quality management: methodologies, frameworks, and tools. Despite their critical importance in shaping how we approach challenges, these terms are frequently conflated. It is easy to confuse them, using them interchangeably or misapplying them in practice.

This confusion is not merely a matter of semantics. Misunderstandings or misapplications of methodologies, frameworks, and tools can lead to ineffective problem-solving, misaligned strategies, and suboptimal outcomes. When we fail to distinguish between a methodology’s structured approach, a framework’s flexible guidance, and a tool’s specific function, we risk applying the wrong solution to our challenges or missing the opportunities that their proper use creates.

In this blog post, I will provide clear definitions, illustrate their interrelationships, and demonstrate their real-world application. By doing so, we will clarify these often-confused terms and show how their proper understanding and application can significantly enhance our approach to quality management and other critical business processes.

Framework: The Conceptual Scaffolding

A framework is a flexible structure that organizes concepts, principles, and practices to guide decision-making. Unlike methodologies, frameworks are not rigidly sequential; they provide a mental model or lens through which problems can be analyzed. Frameworks emphasize what needs to be addressed rather than how to address it.

For example:

  • Systems Thinking Frameworks conceptualize problems as interconnected components (e.g., inputs, processes, outputs).
  • QbD Frameworks outline elements like Quality Target Product Profiles (QTPP) and Critical Process Parameters (CPPs) to embed quality into product design.

Frameworks enable adaptability, allowing practitioners to tailor approaches to specific contexts while maintaining alignment with overarching goals.

Methodology: The Structured Pathway

A methodology is a systematic, step-by-step approach to solving problems or achieving objectives. It provides a structured sequence of actions, often grounded in theoretical principles, and defines how tasks should be executed. Methodologies are prescriptive, offering clear guidelines to ensure consistency and repeatability.

For example:

  • Six Sigma follows the DMAIC (Define, Measure, Analyze, Improve, Control) methodology to reduce process variation.
  • 8D (Eight Disciplines) is a problem-solving methodology with steps like containment, root cause analysis, and preventive action.

Methodologies act as “recipes” that standardize processes across teams, making them ideal for regulated industries (e.g., pharmaceuticals) where auditability and compliance are critical.

Tool: The Tactical Instrument

A tool is a specific technique, model, or instrument used to execute tasks within a methodology or framework. Tools are action-oriented and often designed for a singular purpose, such as data collection, analysis, or visualization.

For example:

  • Root Cause Analysis Tools: Fishbone diagrams, Why-Why, and Pareto charts.
  • Process Validation Tools: Statistical Process Control (SPC) charts, Failure Mode Effects Analysis (FMEA).

Tools are the “nuts and bolts” that operationalize methodologies and frameworks, converting theory into actionable insights.

How They Interrelate: Building a Cohesive Strategy

Methodologies, frameworks, and tools are interdependent. A framework provides the conceptual structure for understanding a problem, the methodology defines the execution plan, and tools enable practical implementation.

Example in Systems Thinking:

  1. Framework: Systems theory identifies inputs, processes, outputs, and feedback loops.
  2. Methodology: A phased approach (e.g., problem structuring, dynamic modeling, scenario planning) guides analysis.
  3. Tools: Causal loop diagrams map relationships; simulation software models system behavior.
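
The balancing feedback loop named in step 1 can be sketched in a few lines of code. This is a minimal, illustrative simulation (the thermostat scenario and all numbers are my own, not from the systems theory literature): each step, a corrective action closes a fraction of the gap between the current state and a target, which is exactly the structure a causal loop diagram captures.

```python
# Minimal sketch of a balancing feedback loop: a heater adjusts output
# based on the gap between a temperature setpoint and the current
# reading. All values are illustrative.

def simulate_thermostat(setpoint, start_temp, gain=0.3, steps=20):
    """Each step closes a fraction of the gap -- a balancing loop."""
    temps = [start_temp]
    for _ in range(steps):
        gap = setpoint - temps[-1]            # feedback signal
        temps.append(temps[-1] + gain * gap)  # corrective action
    return temps

trajectory = simulate_thermostat(setpoint=37.0, start_temp=20.0)
print(round(trajectory[-1], 2))  # converges toward the setpoint
```

Simulation software for real systems does far more, but the loop structure — measure, compare, correct — is the same.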

In QbD:

  1. Framework: The ICH Q8 guideline outlines quality objectives.
  2. Methodology: Define QTPP → Identify Critical Quality Attributes → Design experiments.
  3. Tools: Design of Experiments (DoE) optimizes process parameters.
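
A flavor of what DoE looks like in practice: the sketch below builds a two-level full-factorial design and estimates a main effect the textbook way (average response at the high level minus the low level). The two factors and the yield numbers are hypothetical, chosen only to make the arithmetic visible.

```python
# Illustrative two-level full-factorial design for two hypothetical
# process parameters -- the kind of DoE grid used to explore a QbD
# design space. Coded levels: -1 = low, +1 = high.

from itertools import product

def full_factorial(levels):
    """All combinations of factor levels, e.g. {'temp': [-1, 1], ...}."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

def main_effect(runs, responses, factor):
    """Average response at the high level minus the low level."""
    hi = [r for run, r in zip(runs, responses) if run[factor] == 1]
    lo = [r for run, r in zip(runs, responses) if run[factor] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

runs = full_factorial({"temp": [-1, 1], "pH": [-1, 1]})
responses = [78, 85, 80, 93]   # made-up yields for the four runs
print(main_effect(runs, responses, "temp"))  # → 5.0
```

Real DoE work adds replication, interaction terms, and statistical significance testing, but the design grid and effect contrast are the core idea.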

In Commissioning, Qualification, and Validation (CQV)

  1. Framework: Regulatory guidelines (e.g., FDA’s Process Validation Lifecycle) define stages (Commissioning → Qualification → Validation).
  2. Methodology:
    • Commissioning: Factory Acceptance Testing (FAT) ensures equipment meets design specs.
    • Qualification: Installation/Operational/Performance Qualification (IQ/OQ/PQ) methodologies verify functionality.
    • Validation: Ongoing process verification ensures consistent quality.
  3. Tools: Checklists (IQ), stress testing (OQ), and Process Analytical Technology (PAT) for real-time monitoring.
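
The SPC charts mentioned in step 3 reduce to a simple calculation. Below is a sketch of control limits for a Shewhart individuals chart, with sigma estimated from the average moving range (the 1.128 divisor is the standard d2 constant for a moving range of 2); the readings are invented for illustration.

```python
# Sketch of individuals-chart control limits (mean ± 3 sigma estimated
# from the moving range), a common SPC tool in qualification work.

def control_limits(data):
    mean = sum(data) / len(data)
    moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
    # d2 = 1.128 for a moving range of size 2
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean, mean + 3 * sigma

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2]   # illustrative data
lcl, center, ucl = control_limits(readings)
out_of_control = [x for x in readings if not lcl <= x <= ucl]
print(out_of_control)  # → [] for this in-control series
```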

Without frameworks, methodologies lack context; without tools, methodologies remain theoretical.

Quality Management in the Model

Quality management is not inherently a framework, but rather an overarching concept that can be implemented through various frameworks, methodologies, and tools.

Quality management encompasses a broad range of activities aimed at ensuring products, services, and processes meet consistent quality standards. It can be implemented using different approaches:

  1. Quality Management Frameworks: These provide structured systems for managing quality, such as:
    • ISO 9001: A set of guidelines for quality management systems
    • Total Quality Management (TQM): An integrative system focusing on customer satisfaction and continuous improvement
    • Pharmaceutical Quality System: As defined by ICH Q10 and other regulations and guidance
  2. Quality Management Methodologies: These offer systematic approaches to quality management, including:
    • Six Sigma: A data-driven methodology for eliminating defects
    • Lean: A methodology focused on minimizing waste while maximizing customer value
  3. Quality Management Tools: There are too many tools to count (okay I have a few books on my shelf that try) but tools are usually built to meet the core elements that make up quality management practices:
    • Quality Planning
    • Quality Assurance
    • Quality Control
    • Quality Improvement

In essence, quality management is a comprehensive approach that can be structured and implemented using various frameworks, but it is not itself a framework.

Root Cause Analysis (RCA): Framework or Methodology?

Root cause analysis (RCA) functions as both a framework and a methodology, depending on its application and implementation.

Root Cause Analysis as a Framework

RCA serves as a framework when it provides a conceptual structure for organizing causal analysis without prescribing rigid steps. It offers:

  • Guiding principles: Focus on systemic causes over symptoms, emphasis on evidence-based analysis.
  • Flexible structure: Adaptable to diverse industries (e.g., healthcare, manufacturing) and problem types.
  • Tool integration: Accommodates methods like 5 Whys, Fishbone diagrams, and Fault Tree Analysis.

Root Cause Analysis as a Methodology

RCA becomes a methodology when applied as a systematic process with defined steps:

  1. Problem definition: Quantify symptoms and impacts.
  2. Data collection: Gather evidence through interviews, logs, or process maps.
  3. Causal analysis: Use tools like 5 Whys or Fishbone diagrams to trace root causes.
  4. Solution implementation: Design corrective actions targeting systemic gaps.

Approach | Classification | Key Characteristics
Six Sigma | Methodology (DMAIC/DMADV) | Structured phases (Define, Measure, Analyze, Improve, Control) for defect reduction.
8D | Methodology | Eight disciplines for containment, root cause analysis, and preventive action.
RCA Tools | Tools (e.g., 5 Whys, Fishbone) | Tactical instruments used within methodologies.

  • RCA is a framework when providing a scaffold for causal analysis (e.g., categorizing causes into human/process/systemic factors).
  • RCA becomes a methodology when systematized into phases (e.g., 5 Whys) or integrated into broader methodologies like Six Sigma.
  • Six Sigma and 8D are methodologies, not frameworks, due to their prescriptive, phase-based structures.

This duality allows RCA to adapt to contexts ranging from incident reviews to engineering failure analysis, making it a versatile approach for systemic problem-solving.
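
The 5 Whys chain that turns RCA into a methodology can be sketched as a simple data structure: each answer becomes the next question until no deeper cause is found. The example causal chain below is hypothetical, invented purely to show the mechanics.

```python
# 5 Whys as a walk down a causal chain: ask "why?" of each answer
# until a systemic cause (or the traditional five-level depth) is reached.

def five_whys(symptom, ask):
    """ask(question) -> cause or None; returns the chain from symptom to root."""
    chain = [symptom]
    while len(chain) <= 5:
        cause = ask(chain[-1])
        if cause is None:        # no deeper cause found: root reached
            break
        chain.append(cause)
    return chain

# Hypothetical chain for a batch record review finding
causes = {
    "Batch record had a missing signature": "Operator skipped the step",
    "Operator skipped the step": "Step is buried on page 14 of the SOP",
    "Step is buried on page 14 of the SOP": "SOP was never redesigned for use at the line",
}
chain = five_whys("Batch record had a missing signature", causes.get)
print(chain[-1])  # the deepest cause found: a systemic documentation gap
```

Note how the deepest answer points at an underlying structure (document design), not a person — the mark of RCA done as intended.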

Synergy for Systemic Excellence

Methodologies provide the roadmap, frameworks offer the map, and tools equip the journey. In systems thinking and QbD, their integration ensures holistic problem-solving—whether optimizing manufacturing validation (CQV) or eliminating defects (Six Sigma). By anchoring these elements in process thinking, organizations transform isolated actions into coherent, quality-driven systems. Clarity on these distinctions isn’t academic; it’s the foundation of sustainable excellence.

Aspect | Framework | Methodology
Structure | Flexible, conceptual | Rigid, step-by-step
Application | Guides analysis | Prescribes execution

The Building Blocks of Work-as-Prescribed

Work-as-Prescribed – how we translate desired activities into a set of processes and procedures – relies on an understanding of how people think and process information.

The format is pivotal. The difficulties we have in quality are not much different from those elsewhere in society: we are surrounded by confusing documentation and poorly presented explanations that provide information but not understanding. Oftentimes we rely on canards of “this is what is expected” or “this is what works” – but rarely is that based on anything more than anecdote. And as the high incidence of issues and the high cost of training show, it is less than adequate.

There is a huge body-of-knowledge out there on cognitive-friendly design of visuals, including documentation. This is an area we as a quality profession need to get comfortable with. Most important, we need to give ourselves permission to adapt, modify and transform the information we need into a shape that aids understanding and makes everyone a better thinker.

Work-as-Prescribed (and work-as-instructed) is the creation of tools and technologies to help us think better, understand more and perform at our peak.

Locus of Understanding

Looking at the process at the right level is key. Think of Work-as-Prescribed as a lens. Sometimes you need a high-powered lens so that you can zoom in on a single task. Other times, you need to zoom out to see a set of tasks, a whole process, or how systems interact.

This is the locus of understanding: the level at which understanding happens. Adopting the locus of understanding means going to the right level for the problem at hand. When we apply it to Work-as-Prescribed, we are applying the same principles we use in problem-solving to developing the right tools to govern the work.

We are conducting knowledge management as part of our continuous improvement.

An important lens here is distributed cognitive resources: anything that contributes to the cognitive work being done. Adjusting the locus of understanding means that you can, and should, treat an SOP as a cognitive resource. Some of the memory is in your head and some is in the SOP. Work-as-Prescribed is a cognitive resource that we distribute, routinely and casually, across the brain and our quality system in the form of documents and other execution aids.

Other tools, like my favorite whiteboard, also serve as distributed cognitive resources.

So, as our documents and other tools are distributed cognitive resources it behooves us to ensure they are based on the best cognitive principles possible to drive the most benefit.

As an aside, there is a whole line of thought about why some physical objects are better as distributed cognitive resources than electronic ones. Movement actually matters.

Taking it even further (shifting the locus) we can see the entire quality system as a part of a single distributed cognitive system where cognitive work is performed via the cognitive functions of communicating, deciding, planning, and problem-solving. These cognitive functions are supported by cognitive processes such as perceiving, analyzing, exchanging, and manipulating.

Cognitive Activity in Work-As-Prescribed

The tools we develop to support distributed cognitive activity strive to:

  • Provide short-term or long-term memory aids so that memory load can be reduced.
  • Provide information that can be directly perceived and used such that little effort is needed to interpret and formulate the information explicitly.
  • Provide knowledge and skills that are unavailable from internal representations.
  • Support perceptual operators that can recognize features easily and make inferences directly.
  • Anchor and structure cognitive behavior without conscious awareness.
  • Change the nature of a task by generating more efficient action sequences.
  • Stop time and support perceptual rehearsal to make invisible and transient information visible and sustainable.
  • Aid processibility by limiting abstraction.
  • Determine decision making strategies through accuracy maximization and effort minimization.

Driving Work-As-Prescribed

As we build our requirements documents, our process and procedure, there are a few principles to keep in mind to better tap into distributed cognitive resources.

Plan for the flow of information: Think about paths, relationships, seams, edges and other hand-offs. Focus on the flow of information. Remember that we learn in a spiral, and the content needed by a novice is different from that needed by an expert; build our documents and the information flow accordingly. This principle is called sequencing.

Break information down into pieces: Called chunking, this is the grouping of information into ideally sized pieces. When building Work-as-Prescribed, pay close attention to which of these chunks are reusable and build accordingly.

Think deeply about context: How a tool is used drives what the tool should be.

Think deeply about information structures: Not all information is the same, not every example of Work-as-Prescribed should have the same structure.

Be conscientious about the digital and physical divide: Look for opportunities to integrate or connect these two worlds. Be honest about how enmeshed they are at any point in the system.

We are building our Work-as-Prescribed through leveraging our quality culture, our framework for coordinating work. Pay attention to:

  1. Shared Standards – Ways we communicate
  2. Invisible Environments – Ways we align, conceptually
  3. Visible Environments – Ways we collaborate
  4. Psychological Safety – Ways we behave
  5. Perspectives – Ways we see (and see differently)

Principles in Practice

When designing process, procedure, and task documentation, leverage these principles by creating building blocks, or microcontent, that are:

  • about one primary idea, fact, or concept
  • easily scannable
  • labeled for clear identification and meaning, and
  • appropriately written and formatted for use anywhere and any time it is needed.
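
The four criteria above can be made concrete as a data structure. This sketch is my own illustration — the field names, the word-count proxy for scannability, and the example chunk are all assumptions, not a prescribed schema.

```python
# Microcontent as a data structure: a labeled block carrying one idea,
# checkable against the criteria above. Field names are illustrative.

from dataclasses import dataclass

@dataclass
class Microcontent:
    label: str        # clear identification and meaning
    idea: str         # one primary idea, fact, or concept
    audience: str     # who the chunk is written for
    reusable: bool    # can it be composed into other documents?

    def is_scannable(self, max_words=60):
        """A crude proxy: short enough to take in at a glance."""
        return len(self.idea.split()) <= max_words

chunk = Microcontent(
    label="Line clearance: why",
    idea="Line clearance prevents mix-ups by confirming the previous "
         "product and its documentation are removed before setup.",
    audience="operators",
    reusable=True,
)
print(chunk.is_scannable())  # → True
```

Treating chunks as structured objects like this is what makes reuse across documents tractable.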

There is a common misconception that simple means short. That just isn’t true. Simple means the content passes a test of appropriateness: is the piece the right size to provide sufficient detail to answer a specific question for the targeted audience? The size of the content must serve its intended purpose efficiently, stripping out any unnecessary components.

We need to strive to apply cognitive thinking principles to our practice. The day of judging a requirements document by its page length is long over.

Constituents of cognitive thinking applied to Work-As-Prescribed

Managing Events Systematically

Being good at problem-solving is critical to success in an organization. I’ve written quite a bit on problem-solving, but here I want to tackle the amount of effort we should apply.

Not all problems should be treated the same. There are also levels of problems. And these two aspects can contribute to some poor problem-solving practices.

It helps to look at problems systematically across our organization. The iceberg analogy is a pretty popular way to break this down, focusing on Events, Patterns, Underlying Structure, and Mental Model.

Iceberg analogy

Events

Events start with the observation or discovery of a situation that is different in some way. What is being observed is a symptom and we want to quickly identify the problem and then determine the effort needed to address it.

This is where Art Smalley’s Four Types of Problems comes in handy to help us take a risk-based approach to determining our level of effort.

Type 1 problems, troubleshooting, let us solve problems with a clear understanding of the issue and a clear pathway. Have a flat tire? Fix it. Have a document error? Fix it using good documentation practices.

It is valuable to work our way through common troubleshooting scenarios and ensure the appropriate linkages between the different processes, to ensure a system-wide approach to problem-solving.

Corrective maintenance is a great example of troubleshooting, as it involves restoring the original state of an asset. It includes documentation, a return to service, and analysis of data. From that analysis, problems are identified which require going deeper into problem-solving. It should have appropriate tie-ins to evaluate when the impact of an asset breaking leads to other problems (for example, impact to product), which can also require additional problem-solving.

It can be helpful for the organization to build decision trees that help folks decide if a given problem stays as troubleshooting or if it also requires going to type 2, “gap from standard.”
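
Such a decision tree can be as simple as a few ordered questions. The sketch below is a hypothetical routing rule, not a prescribed standard — the questions and their order are assumptions an organization would tailor to its own processes.

```python
# A hypothetical decision tree for routing a problem to troubleshooting
# (type 1) or deeper analysis (type 2). Questions are illustrative.

def route_problem(known_cause, clear_fix, meets_requirements):
    """Return 'type 1: troubleshooting' or 'type 2: gap from standard'."""
    if not meets_requirements:
        # Result misses core requirements: always a type 2 problem
        # (troubleshooting may still be applied immediately as a correction).
        return "type 2: gap from standard"
    if known_cause and clear_fix:
        return "type 1: troubleshooting"
    return "type 2: gap from standard"

print(route_problem(known_cause=True, clear_fix=True, meets_requirements=True))
# → type 1: troubleshooting
print(route_problem(known_cause=True, clear_fix=True, meets_requirements=False))
# → type 2: gap from standard
```

Encoding the tree — whether in code, a flowchart, or a laminated job aid — makes the routing consistent across people and shifts.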

Type 2 problems, gap from standard, means that the actual result does not meet the expected and there is a potential of not meeting the core requirements (objectives) of the process, product, or service. This is the place we start deeper problem-solving, including root cause analysis.

Please note that troubleshooting is often done within a type 2 problem. We call that a correction. If the bioreactor cannot maintain temperature during a run, that is a type 2 problem, but I am certainly going to apply troubleshooting immediately as well.

Take documentation errors. There is a practice in place, part of good documentation practices, for addressing troubleshooting around documents (how to correct, how to record a comment, etc.). By working through the various ways documentation can go wrong, and identifying which ones are solved through troubleshooting alone and don’t involve type 2 problems, we can avoid creating a lot of noise in our system.

Core to the quality system is trending, looking for possible signals that require additional effort. Trending can help determine where problems lie and can also drive up the level of effort necessary.

Underlying Structure

Root cause analysis is about finding the underlying structure of the problem, and it defines the work applied to a type 2 problem.

Not all problems require the same amount of effort, and type 2 problems sit on a scale based on consequences that can help drive the level of effort. This should be based on the impact to the organization’s ability to meet the quality objectives, the requirements behind the product or service.

For example, in the pharma world there are three major criteria:

  • safety, rights, or well-being of patients (including subjects and participants, human and non-human)
  • data integrity (includes confidence in the results, outcome, or decision dependent on the data)
  • ability to meet regulatory requirements (which stem from but can be a lot broader than the first two)

These three criteria can be sliced and diced a lot of ways, but serve our example well.

To these three criteria we add a scale of possible harm to derive our criticality, an example can look like this:

Classification | Description
Critical | The event has resulted in, or is clearly likely to result in, any one of the following outcomes: significant harm to the safety, rights, or well-being of subjects or participants (human or non-human), or patients; compromised data integrity to the extent that confidence in the results, outcome, or decision dependent on the data is significantly impacted; or regulatory action against the company.
Major | The event(s), were they to persist over time or become more serious, could potentially, though not imminently, result in any one of the following outcomes: harm to the safety, rights, or well-being of subjects or participants (human or non-human), or patients; compromised data integrity to the extent that confidence in the results, outcome, or decision dependent on the data is significantly impacted.
Minor | An isolated or recurring triggering event that does not otherwise meet the definitions of Critical or Major quality impacts.

Example of Classification of Events in a Pharmaceutical Quality System
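
The classification logic in the example table reduces to a small rule set. This sketch simplifies the criteria into three boolean inputs purely for illustration — a real classification procedure would assess each of the three criteria (safety, data integrity, regulatory) separately and in depth.

```python
# Simplified sketch of the Critical/Major/Minor classification above.
# Inputs are deliberately coarse; real assessments are richer.

def classify_event(resulted_in_harm, could_result_in_harm, regulatory_action):
    """Map event impact to a classification driving investigation effort."""
    if resulted_in_harm or regulatory_action:
        return "Critical"
    if could_result_in_harm:
        return "Major"
    return "Minor"

print(classify_event(False, True, False))   # → Major
print(classify_event(False, False, False))  # → Minor
```

The value of encoding the rule is consistency: two investigators facing the same facts should reach the same classification, and thus the same level of effort.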

This level of classification will drive the level of effort on the investigation, as well as determine whether the CAPA addresses underlying structures alone or reaches the mental models, driving culture change.

Mental Model

Here is where we address building a quality culture. In CAPA lingo this is usually more a preventive action than a corrective action. In the simplest terms, corrective actions address the underlying structures of the problem in the process/asset where the event happened. Preventive actions deal with underlying structures in other (usually related) processes/assets, or get to the mindsets that allowed the underlying structures to exist in the first place.

Solving Problems Systematically

By applying this system perspective to our problem solving, by realizing that not everything needs a complete rebuild of the foundation, by looking holistically across our systems, we can ensure that we are driving a level of effort to truly build the house of quality.

Risk Management is our Ability to Anticipate

Risk assessment is a pillar of the quality system because it gives us the ability to anticipate in a consistent manner. It is built on some fundamental criteria:

Criteria | Asks | Ensure
Expertise | What sort of expertise is relied upon to look into the future? | Diversity in expertise. Drive out subjectivity.
Frequency | How often are future threats and opportunities assessed? | Living risk assessments, cycles of review.
Communication | How are the expectations of future events communicated or shared within the system? | Risk register. Knowledge management. Continuous improvement. Management review.
Strategy | What is the model of the future? | Sensemaking.
Time horizon | How far ahead does the system look? Is the time horizon different for different organizational areas? | System building.
Acceptability of risks | Which risks are considered acceptable and which unacceptable? On what basis? | Controls.
Culture | Is risk awareness part of the organizational culture? | Risk-based thinking. Mindset.

Anticipation Criteria to Apply to Risk Management
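
The "Acceptability of Risks" criterion is often operationalized as a scored risk register. The sketch below is a minimal illustration under assumed conventions — the 1–5 scales, the likelihood × severity score, and the acceptability threshold of 8 are all hypothetical choices an organization would set for itself.

```python
# Illustrative risk-register scoring: likelihood x severity against an
# acceptability threshold. Scales and threshold are assumptions.

def risk_score(likelihood, severity):
    """Both on a 1-5 scale; the score drives review frequency and controls."""
    return likelihood * severity

def is_acceptable(score, threshold=8):
    return score < threshold

register = [
    {"risk": "Supplier single-sourced", "likelihood": 3, "severity": 4},
    {"risk": "Sensor drift between calibrations", "likelihood": 2, "severity": 2},
]
for entry in register:
    entry["acceptable"] = is_acceptable(risk_score(entry["likelihood"],
                                                   entry["severity"]))

print([e["acceptable"] for e in register])  # → [False, True]
```

Whatever the scoring scheme, the point of the table above stands: the basis for acceptability must be explicit, shared, and revisited in cycles of review.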

Documents and the Heart of the Quality System

A month back on LinkedIn I complained about a professional society pushing the idea of a document-free quality management system. This has got to be one of my favorite pet peeves that come from Industry 4.0 proponents, and it demonstrates a fundamental failure to understand core concepts. And frankly one of the reasons why many Industry/Quality/Pharma 4.0 initiatives truly fail to deliver. Unfortunately, I didn’t follow through with my idea of proposing a session to that conference, so instead here are my thoughts.

Fundamentally, documents are the lifeblood of an organization. But paper is not. This is where folks get confused, and this confusion is limiting us.

Let’s go back to basics, which I covered in my 2018 post on document management.

When talking about documents, we really should talk about function and not just by name or type. This allows us to think more broadly about our documents and how they function as the lifeblood.

There are three types of documents:

  • Functional Documents provide instructions so people can perform tasks and make decisions safely, effectively, compliantly, and consistently. This usually includes things like procedures, process instructions, protocols, methods, and specifications. Many of these need some sort of training decision. Functional documents should involve a process to ensure they are up-to-date, especially in relation to current practices and relevant standards (periodic review).
  • Records provide evidence that actions were taken, and decisions were made in keeping with procedures. This includes batch manufacturing records, logbooks and laboratory data sheets and notebooks. Records are a popular target for electronic alternatives.
  • Reports provide specific information on a particular topic in a formal, standardized way. Reports may include data summaries, findings, and actions to be taken.

The beating heart of our quality system brings us from functional to record to reports in a cycle of continuous improvement.

Functional documents are how we realize requirements – that is, the needs and expectations of our organization. There are multiple ways to serve up functional documents, the big three being paper, paper-on-glass, and some sort of execution system. That last, an execution system, unites function with record, which is a big chunk of the promise of an execution system.

The maturation path is to go from mostly paper execution, to paper-on-glass, to end-to-end integration and execution, to drive up reliability and drive out error. But at the heart, we still have functional documents, records, and reports. The paper goes, but the document is there.

So how is this failing us?

Any process is a way to realize a set of requirements. Those requirements come from external (regulations, standards, etc.) and internal (efficiency, business needs) sources. We then meet those requirements through People, Procedure, Principles, and Technology. They are interlinked and strive to deliver efficiency, effectiveness, and excellence.

So this failure to understand documents means we think we can solve everything through a single technology application: an eQMS will solve problems in quality events, a LIMS for the lab, an MES for manufacturing. Each of these is a lever for change but alone cannot drive the results we want.

Because of the limitations of this thought process we get systems designed for yesterday’s problems, instead of thinking through towards tomorrow.

We get documentation systems that treat functional documents pretty much the same way we thought of them 30 years ago: as discrete things. These discrete things then interact, across a gap, with our electronic systems. There is little traceability, which complicates change control and makes it difficult to train experts. The funny thing is, we have the pieces, but because of the limitations of our technology we aren’t leveraging them.

The v-model approach should be applied in a risk-based manner to the design of our full system, and not just its technical aspects.

System feasibility matches policy and governance; user requirements allow us to trace which elements are people, procedure, principles, and/or technology. Everything then stems from there.
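
That traceability can be sketched as a simple matrix from requirements to the four elements. Everything here — the requirement IDs, SOP numbers, and element entries — is hypothetical, invented only to show the shape of the mapping and how gaps surface from it.

```python
# Sketch of requirement traceability across People, Procedure,
# Principles, and Technology. All identifiers are hypothetical.

trace_matrix = {
    "UR-001 Record batch data contemporaneously": {
        "people": ["trained operators"],
        "procedure": ["SOP-123 batch record completion"],
        "principles": ["data integrity (ALCOA+)"],
        "technology": ["MES electronic batch record"],
    },
    "UR-002 Review records before release": {
        "people": ["QA reviewer"],
        "procedure": ["SOP-456 batch release"],
        "principles": ["independent review"],
        "technology": [],
    },
}

def untraced(matrix, element):
    """Requirements with no coverage for a given element -- a gap to assess."""
    return [req for req, links in matrix.items() if not links[element]]

print(untraced(trace_matrix, "technology"))
# → ['UR-002 Review records before release']
```

A gap is not automatically a problem — some requirements are legitimately met without technology — but the matrix makes the decision visible and assessable, which is the point.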