Beyond Documents: Embracing Data-Centric Thinking

We are at a fascinating inflection point in quality management, caught between traditional document-centric approaches and the data-centricity needed to fully realize the potential of digital transformation. For several decades we have been moving through a technology transition, one that continues to accelerate and that will deliver dramatic improvements in operations and quality. This transformation is driven by three interconnected trends: Pharma 4.0, the Rise of AI, and the shift from Documents to Data.

The History and Evolution of Documents in Quality Management

The history of document management can be traced back to the introduction of the file cabinet in the late 1800s, providing a structured way to organize paper records. Quality management systems have even deeper roots, extending back to medieval Europe when craftsman guilds developed strict guidelines for product inspection. These early approaches established the document as the fundamental unit of quality management—a paradigm that persisted through industrialization and into the modern era.

The document landscape took a dramatic turn in the 1980s with the increasing availability of computer technology. The development of servers allowed organizations to store documents electronically in centralized mainframes, marking the beginning of electronic document management systems (eDMS). Meanwhile, scanners enabled conversion of paper documents to digital format, and the rise of personal computers gave businesses the ability to create and store documents directly in digital form.

In traditional quality systems, documents serve as the backbone of quality operations and fall into three primary categories: functional documents (providing instructions), records (providing evidence), and reports (providing specific information). This document trinity has established our fundamental conception of what a quality system is and how it operates—a conception deeply influenced by the physical limitations of paper.


Breaking the Paper Paradigm: Limitations of Document-Centric Thinking

The Paper-on-Glass Dilemma

The maturation path for quality systems typically progresses from paper execution to paper-on-glass to end-to-end integration and execution. However, most life sciences organizations remain stuck in the paper-on-glass phase of their digital evolution: they generate digital records that closely mirror the structure and layout of a paper-based workflow. The wider industry remains reluctant to move away from paper-like records, out of process familiarity and uncertainty about regulatory scrutiny.

Paper-on-glass systems present several specific limitations that hamper digital transformation:

  1. Constrained design flexibility: Data capture is limited by the digital record’s design, which often mimics previous paper formats rather than leveraging digital capabilities. A pharmaceutical batch record system that meticulously replicates its paper predecessor inherently limits the system’s ability to analyze data across batches or integrate with other quality processes.
  2. Manual data extraction requirements: When data is trapped in digital documents structured like paper forms, it remains difficult to extract. This means data from paper-on-glass records typically requires manual intervention, substantially reducing data utilization effectiveness.
  3. Elevated error rates: Many paper-on-glass implementations lack sufficient logic and controls to prevent avoidable data capture errors that would be eliminated in truly digital systems. Without data validation rules built into the capture process, quality systems continue to allow errors that must be caught through manual review.
  4. Unnecessary artifacts: These approaches generate records with inflated sizes and unnecessary elements, such as cover pages that serve no functional purpose in a digital environment but persist because they were needed in paper systems.
  5. Cumbersome validation: Content must be fully controlled and managed manually, with none of the advantages gained from data-centric validation approaches.

Broader Digital Transformation Struggles

Pharmaceutical and medical device companies must navigate complex regulatory requirements while implementing new digital systems, which often stalls initiatives. Regulatory agencies have historically relied on document-based submissions and evidence, reinforcing document-centric mindsets even as technology evolves.

Beyond Paper-on-Glass: What Comes Next?

What comes after paper-on-glass? The natural evolution leads to end-to-end integration and execution systems that transcend document limitations and focus on data as the primary asset. This evolution isn’t merely about eliminating paper—it’s about reconceptualizing how we think about the information that drives quality management.

In fully integrated execution systems, functional documents and records become unified. Instead of having separate systems for managing SOPs and for capturing execution data, these systems bring process definitions and execution together. This approach drives up reliability and drives out error, but requires fundamentally different thinking about how we structure information.

A prime example of moving beyond paper-on-glass can be seen in advanced Manufacturing Execution Systems (MES) for pharmaceutical production. Rather than simply digitizing batch records, modern MES platforms incorporate AI, IIoT, and Pharma 4.0 principles to provide the right data, at the right time, to the right team. These systems deliver meaningful and actionable information, moving from merely connecting devices to optimizing manufacturing and quality processes.

AI-Powered Documentation: Breaking Through with Intelligent Systems

A dramatic example of breaking free from document constraints comes from Novo Nordisk’s use of AI to draft clinical study reports. The company has taken a leap forward in pharmaceutical documentation, putting AI to work where human writers once toiled for weeks. The Danish pharmaceutical company is using Claude, an AI model by Anthropic, to draft clinical study reports—documents that can stretch hundreds of pages.

This represents a fundamental shift in how we think about documents. Rather than having humans arrange data into documents manually, we can now use AI to generate high-quality documents directly from structured data sources. The document becomes an output—a view of the underlying data—rather than the primary artifact of the quality system.
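
As a concrete illustration of the document-as-output idea, here is a minimal Python sketch that generates a report section from structured data using Anthropic's Messages API. The study data, prompt, and model id are invented placeholders, not any company's actual pipeline.

```python
# A minimal sketch of generating a document section from structured data.
# The study data, prompt, and model id are illustrative placeholders, not
# any company's actual pipeline; the client call follows Anthropic's
# published Messages API.
import json

from anthropic import Anthropic

client = Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Structured data is the system of record; the document is a rendered view.
trial_summary = {
    "study_id": "ABC-123",  # hypothetical identifier
    "primary_endpoint": "HbA1c reduction at 26 weeks",
    "n_subjects": 482,
    "result": "-1.2% vs placebo, p < 0.001",
}

message = client.messages.create(
    model="claude-sonnet-4-20250514",  # substitute a current model id
    max_tokens=1024,
    messages=[{
        "role": "user",
        "content": "Draft the results narrative for a clinical study "
                   "report section using only this structured data:\n"
                   + json.dumps(trial_summary, indent=2),
    }],
)

print(message.content[0].text)  # the generated draft, pending human review
```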

Data Requirements: The Foundation of Modern Quality Systems in Life Sciences

Shifting from document-centric to data-centric thinking requires understanding that documents are merely vessels for data—and it’s the data that delivers value. When we focus on data requirements instead of document types, we unlock new possibilities for quality management in regulated environments.

At its core, any quality process is a way to realize a set of requirements. These requirements come from external sources (regulations, standards) and internal needs (efficiency, business objectives). Meeting these requirements involves integrating people, procedures, principles, and technology. By focusing on the underlying data requirements rather than the documents that traditionally housed them, life sciences organizations can create more flexible, responsive quality systems.

ICH Q9(R1) emphasizes that knowledge is fundamental to effective risk management, stating that “QRM is part of building knowledge and understanding risk scenarios, so that appropriate risk control can be decided upon for use during the commercial manufacturing phase.” We need to recognize the inverse relationship between knowledge and uncertainty in risk assessment. As ICH Q9(R1) notes, uncertainty may be reduced “via effective knowledge management, which enables accumulated and new information (both internal and external) to be used to support risk-based decisions throughout the product lifecycle.”

This approach ensures our tools account for the fact that our processes are living and breathing. It is all about moving to a process repository and away from a document mindset.

Documents as Data Views: Transforming Quality System Architecture

When we shift our paradigm to view documents as outputs of data rather than primary artifacts, we fundamentally transform how quality systems operate. This perspective enables a more dynamic, interconnected approach to quality management that transcends the limitations of traditional document-centric systems.

Breaking the Document-Data Paradigm

Traditionally, life sciences organizations have thought of documents as containers that hold data. This subtle but profound perspective has shaped how we design quality systems, leading to siloed applications and fragmented information. When we invert this relationship—seeing data as the foundation and documents as configurable views of that data—we unlock powerful capabilities that better serve the needs of modern life sciences organizations.

The Benefits of Data-First, Document-Second Architecture

When documents become outputs—dynamic views of underlying data—rather than the primary focus of quality systems, several transformative benefits emerge.

First, data becomes reusable across multiple contexts. The same underlying data can generate different documents for different audiences or purposes without duplication or inconsistency. For example, clinical trial data might generate regulatory submission documents, internal analysis reports, and patient communications—all from a single source of truth.

Second, changes to data automatically propagate to all relevant documents. In a document-first system, updating information requires manually changing each affected document, creating opportunities for errors and inconsistencies. In a data-first system, updating the central data repository automatically refreshes all document views, ensuring consistency across the quality ecosystem.

Third, this approach enables more sophisticated analytics and insights. When data exists independently of documents, it can be more easily aggregated, analyzed, and visualized across processes.

In this architecture, quality management systems must be designed with robust data models at their core, with document generation capabilities built on top. This might include:

  1. A unified data layer that captures all quality-related information
  2. Flexible document templates that can be populated with data from this layer
  3. Dynamic relationships between data entities that reflect real-world connections between quality processes
  4. Powerful query capabilities that enable users to create custom views of data based on specific needs

The resulting system treats documents as what they truly are: snapshots of data formatted for human consumption at specific moments in time, rather than the authoritative system of record.
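
To make "documents as views" concrete, here is a minimal Python sketch. The deviation record and its two views are invented for illustration; a real eQMS data model would be far richer.

```python
# A minimal sketch of "data first, documents second": one record in a
# unified data layer, two document views rendered from it. The schema is
# invented for illustration; a real eQMS data model would be far richer.
from dataclasses import dataclass

@dataclass
class DeviationRecord:
    """The authoritative data; never the rendered document."""
    dev_id: str
    product: str
    description: str
    root_cause: str
    status: str

def regulatory_summary(dev: DeviationRecord) -> str:
    """One view: a terse summary for a regulatory response."""
    return f"{dev.dev_id} ({dev.product}): {dev.description} [status: {dev.status}]"

def investigation_report(dev: DeviationRecord) -> str:
    """Another view: a fuller narrative for internal investigation."""
    return (
        f"Deviation {dev.dev_id}\n"
        f"Product: {dev.product}\n"
        f"Description: {dev.description}\n"
        f"Root cause: {dev.root_cause}\n"
        f"Status: {dev.status}"
    )

dev = DeviationRecord("DEV-0042", "Product X", "Fill weight out of range",
                      "Pump calibration drift", "Open")
print(regulatory_summary(dev))

# Update the data once; every view reflects it the next time it renders.
dev.status = "Closed"
print(investigation_report(dev))
```

Because both functions read the same record, the two "documents" can never drift apart; a new audience needs a new view function, not a new copy of the data.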

Electronic Quality Management Systems (eQMS): Beyond Paper-on-Glass

Electronic Quality Management Systems have been adopted widely across life sciences, but many implementations fail to realize their full potential due to document-centric thinking. When implementing an eQMS, organizations often attempt to replicate their existing document-based processes in digital form rather than reconceptualizing their approach around data.

Current Limitations of eQMS Implementations

Document-centric eQMS systems treat functional documents as discrete objects, much as they were conceived decades ago; they still think in terms of SOPs as discrete documents. They structure workflows, such as non-conformances, CAPAs, change controls, and design controls, with artificial gaps between these interconnected processes. When a manufacturing non-conformance impacts a design control, which then requires a change control, the connections between these events often remain manual and error-prone.

This approach leads to compartmentalized technology solutions. Organizations believe they can solve quality challenges through single applications: an eQMS for quality events, a LIMS for the lab, an MES for manufacturing. These isolated systems may digitize documents but fail to integrate the underlying data.

Data-Centric eQMS Approaches

We are in the process of reimagining eQMS as data platforms rather than document repositories. A data-centric eQMS connects quality events, training records, change controls, and other quality processes through a unified data model. This approach enables more effective risk management, root cause analysis, and continuous improvement.

For instance, when a deviation is recorded in a data-centric system, it automatically connects to relevant product specifications, equipment records, training data, and previous similar events. This comprehensive view enables more effective investigation and corrective action than reviewing isolated documents.
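
A rough sketch of what that connective tissue can look like, using an invented entity graph rather than any real eQMS data model:

```python
# A rough sketch of a unified data model in which a deviation automatically
# links to related records. Entity ids and link types are invented.
from collections import defaultdict

links: defaultdict[str, set[str]] = defaultdict(set)

def link(a: str, b: str) -> None:
    """Record a bidirectional relationship at the data layer."""
    links[a].add(b)
    links[b].add(a)

# Relationships are captured once, in data, not buried inside documents.
link("DEV-0042", "SPEC-PRODX-07")   # product specification
link("DEV-0042", "EQ-FILLER-3")     # equipment record
link("EQ-FILLER-3", "TRN-OP-117")   # operator training on that equipment
link("DEV-0042", "DEV-0017")        # similar prior event

def context(entity: str, depth: int = 2) -> set[str]:
    """Collect everything within `depth` hops for an investigation view."""
    seen, frontier = {entity}, {entity}
    for _ in range(depth):
        frontier = {n for e in frontier for n in links[e]} - seen
        seen |= frontier
    return seen - {entity}

print(sorted(context("DEV-0042")))
# ['DEV-0017', 'EQ-FILLER-3', 'SPEC-PRODX-07', 'TRN-OP-117']
```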

Looking ahead, AI-powered eQMS solutions will increasingly incorporate predictive analytics to identify potential quality issues before they occur. By analyzing patterns in historical quality data, these systems can alert quality teams to emerging risks and recommend preventive actions.
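
As a simple illustration of the idea, here is a sketch that flags a process when its monthly deviation counts rise for several consecutive months. Real systems would use far richer models; the counts are invented.

```python
# An illustrative sketch of a simple predictive alert: flag a process when
# monthly deviation counts rise for several consecutive months. Real
# systems would use richer models; the counts here are invented.
def trending_up(counts: list[int], window: int = 3) -> bool:
    recent = counts[-window:]
    return len(recent) == window and all(
        later > earlier for earlier, later in zip(recent, recent[1:])
    )

monthly_deviations = [4, 3, 5, 4, 6, 8, 11]  # hypothetical monthly history
if trending_up(monthly_deviations):
    print("Emerging risk: deviations have risen for 3 consecutive months")
```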

Manufacturing Execution Systems (MES): Breaking Down Production Data Silos

Manufacturing Execution Systems face similar challenges in breaking away from document-centric paradigms. Common MES implementation challenges highlight the limitations of traditional approaches and the potential benefits of data-centric thinking.

MES in the Pharmaceutical Industry

Manufacturing Execution Systems (MES) aggregate a number of the technologies deployed at the manufacturing operations management (MOM) level. MES has been successfully deployed within the pharmaceutical industry, has matured, and is fast becoming a recognized best practice across all regulated life sciences industries. This is borne out by the fact that green-field manufacturing sites are starting with an MES in place: paperless manufacturing from day one.

The amount of IT applied to an MES project depends on business needs. At a minimum, an MES should strive to replace paper batch records with an Electronic Batch Record (EBR). Other functionality includes automated material weighing and dispensing, and integration with ERP systems, which helps optimize inventory levels and production planning.

Beyond Paper-on-Glass in Manufacturing

In pharmaceutical manufacturing, paper batch records have traditionally documented each step of the production process. Early electronic batch record systems simply digitized these paper forms, creating “paper-on-glass” implementations that failed to leverage the full potential of digital technology.

Advanced Manufacturing Execution Systems are moving beyond this limitation by focusing on data rather than documents. Rather than digitizing batch records, these systems capture manufacturing data directly, using sensors, automated equipment, and operator inputs. This approach enables real-time monitoring, statistical process control, and predictive quality management.
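
For example, here is a minimal sketch of a statistical process control check against 3-sigma limits computed from baseline data. The baseline values and readings are illustrative, not from any real MES.

```python
# A minimal sketch of a statistical process control check on directly
# captured sensor data, the kind of rule a paper-on-glass record cannot
# enforce. Baseline values and readings are illustrative.
from statistics import mean, stdev

baseline = [100.1, 99.8, 100.0, 100.2, 99.9, 100.1, 99.7, 100.3]  # grams
center, sigma = mean(baseline), stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma  # 3-sigma control limits

def check(reading: float) -> str:
    """Flag any fill-weight reading outside the control limits."""
    if lcl <= reading <= ucl:
        return f"{reading:.1f} g: in control"
    return f"{reading:.1f} g: OUT OF CONTROL (limits {lcl:.2f}-{ucl:.2f} g)"

for reading in [100.0, 100.2, 101.4]:  # simulated live readings
    print(check(reading))
```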

An example of a modern MES solution fully compliant with Pharma 4.0 principles is the Tempo platform developed by Apprentice. It is a complete manufacturing system designed for life sciences companies that leverages cloud technology to provide real-time visibility and control over production processes. The platform combines MES, EBR, LES (Laboratory Execution System), and AR (Augmented Reality) capabilities to create a comprehensive solution that supports complex manufacturing workflows.

Electronic Validation Management Systems (eVMS): Transforming Validation Practices

Validation represents a critical intersection of quality management and compliance in life sciences. The transition from document-centric to data-centric approaches is particularly challenging—and potentially rewarding—in this domain.

Current Validation Challenges

Traditional validation approaches face several limitations that highlight the problems with document-centric thinking:

  1. Integration Issues: Many Digital Validation Tools (DVTs) remain isolated from electronic Document Management Systems (eDMS). The eDMS is typically the first place vendor engineering data is imported into a client system. However, this data is rarely validated just once; departments typically repeat the validation step multiple times, creating unnecessary duplication.
  2. Validation for AI Systems: Traditional validation approaches are inadequate for AI-enabled systems; they are geared towards demonstrating that products and processes will always achieve expected results. However, in the digital “intellectual” eQMS world, organizations will, at some point, experience the unexpected.
  3. Continuous Compliance: A significant challenge is remaining in compliance continuously during any digital eQMS-initiated change because digital systems can update frequently and quickly. This rapid pace of change conflicts with traditional validation approaches that assume relative stability in systems once validated.

Data-Centric Validation Solutions

Modern electronic Validation Management Systems (eVMS) solutions exemplify the shift toward data-centric validation management. These platforms introduce AI capabilities that provide intelligent insights across validation activities to unlock unprecedented operational efficiency. Their risk-based approach promotes critical thinking, automates assurance activities, and fosters deeper regulatory alignment.

We need to strive to leverage the digitization and automation of pharmaceutical manufacturing to link real-time data with both the quality risk management system and control strategies. This connection enables continuous visibility into whether processes are in a state of control.

The 11 Axes of Quality 4.0

LNS Research has identified 11 key components or “axes” of the Quality 4.0 framework that organizations must understand to successfully implement modern quality management:

  1. Data: In the quality sphere, data has always been vital for improvement. However, most organizations still face lags in data collection, analysis, and decision-making processes. Quality 4.0 focuses on rapid, structured collection of data from various sources to enable informed and agile decision-making.
  2. Analytics: Traditional quality metrics are primarily descriptive. Quality 4.0 enhances these with predictive and prescriptive analytics that can anticipate quality issues before they occur and recommend optimal actions.
  3. Connectivity: Quality 4.0 emphasizes the connection between operating technology (OT) used in manufacturing environments and information technology (IT) systems including ERP, eQMS, and PLM. This connectivity enables real-time feedback loops that enhance quality processes.
  4. Collaboration: Breaking down silos between departments is essential for Quality 4.0. This requires not just technological integration but cultural changes that foster teamwork and shared quality ownership.
  5. App Development: Quality 4.0 leverages modern application development approaches, including cloud platforms, microservices, and low/no-code solutions to rapidly deploy and update quality applications.
  6. Scalability: Modern quality systems must scale efficiently across global operations while maintaining consistency and compliance.
  7. Management Systems: Quality 4.0 integrates with broader management systems to ensure quality is embedded throughout the organization.
  8. Compliance: While traditional quality focused on meeting minimum requirements, Quality 4.0 takes a risk-based approach to compliance that is more proactive and efficient.
  9. Culture: Quality 4.0 requires a cultural shift that embraces digital transformation, continuous improvement, and data-driven decision-making.
  10. Leadership: Executive support and vision are critical for successful Quality 4.0 implementation.
  11. Competency: New skills and capabilities are needed for Quality 4.0, requiring significant investment in training and workforce development.

The Future of Quality Management in Life Sciences

The evolution from document-centric to data-centric quality management represents a fundamental shift in how life sciences organizations approach quality. While documents will continue to play a role, their purpose and primacy are changing in an increasingly data-driven world.

By focusing on data requirements rather than document types, organizations can build more flexible, responsive, and effective quality systems that truly deliver on the promise of digital transformation. This approach enables life sciences companies to maintain compliance while improving efficiency, enhancing product quality, and ultimately delivering better outcomes for patients.

The journey from documents to data is not merely a technical transition but a strategic evolution that will define quality management for decades to come. As AI, machine learning, and process automation converge with quality management, the organizations that successfully embrace data-centricity will gain significant competitive advantages through improved agility, deeper insights, and more effective compliance in an increasingly complex regulatory landscape.

The paper may go, but the document—reimagined as structured data that enables insight and action—will continue to serve as the foundation of effective quality management. The key is recognizing that documents are vessels for data, and it’s the data that drives value in the organization.

Stop it with the 4.0 stuff

Industry 4.0, Quality 4.0, Validation 4.0. It is all absurd, so cut it out. Old man rant out.

Seriously though, let’s have a chat about this and why it is a bad practice.

When we put a number after something, we denote a version number. Version numbers have meaning, and individuals react to them in a certain way.

Understanding Version Numbers

A version number is a unique identifier assigned to specific releases of software, hardware, firmware, or drivers. It helps developers and users track changes, improvements, and updates in the product over time. Version numbers are crucial for maintaining software, ensuring compatibility, and managing updates effectively.

Structure of Version Numbers

Version numbers typically follow a structured format, often in the form of major.minor.patch or major.minor.patch.build. Each segment of the version number conveys specific information about the changes made in that release.

Major Version

  • Indicates: Significant changes or overhauls.
  • Example: Going from version 1.0.0 to 2.0.0 might indicate a complete redesign or the addition of major new features.
  • Impact: These changes might not be backward compatible with previous versions.

Minor Version

  • Indicates: Addition of new features or significant improvements that are backward compatible.
  • Example: Updating from version 2.1.0 to 2.2.0 could mean new functionalities were added without altering existing ones.
  • Impact: Users can expect enhancements without losing compatibility with previous minor versions.

Patch Version

  • Indicates: Bug fixes and minor improvements.
  • Example: Moving from version 2.2.1 to 2.2.2 might mean minor bugs were fixed.
  • Impact: These updates are usually safe and recommended as they resolve issues without changing functionality.

Build Number

  • Indicates: Specific builds or iterations, often used internally.
  • Example: Version 2.2.2.45 could indicate the 45th build of this particular version.
  • Impact: Helps in identifying specific builds, useful for debugging and internal tracking.

Semantic Versioning

One of the most widely adopted systems for versioning is Semantic Versioning (SemVer). It uses a three-part version number: major.minor.patch. This system provides a clear and standardized way to communicate the nature of changes in each release; a short code sketch follows the list below.

  • Major: Incompatible API changes.
  • Minor: Backward-compatible functionality added.
  • Patch: Backward-compatible bug fixes.
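
A minimal sketch of SemVer in code, assuming plain major.minor.patch strings with no pre-release tags or build metadata:

```python
# A minimal sketch of SemVer comparisons, assuming plain major.minor.patch
# strings with no pre-release tags or build metadata.
def parse(version: str) -> tuple[int, int, int]:
    major, minor, patch = (int(part) for part in version.split("."))
    return major, minor, patch

def is_breaking_upgrade(old: str, new: str) -> bool:
    """Only a major bump signals incompatible changes."""
    return parse(new)[0] > parse(old)[0]

assert parse("2.1.0") < parse("2.2.0")    # minor: new, compatible features
assert parse("2.2.1") < parse("2.2.2")    # patch: bug fixes only
assert is_breaking_upgrade("1.9.3", "2.0.0")
assert not is_breaking_upgrade("2.1.0", "2.2.0")
```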

Importance of Version Numbers

  1. Tracking Changes: Helps developers and users keep track of what changes have been made and when.
  2. Compatibility: Ensures that users know whether new versions will work with their current setup.
  3. Support and Maintenance: Facilitates efficient troubleshooting and support by identifying the exact version in use.
  4. Update Management: Allows users to determine if they need to update their software to the latest version.

Why I Dislike Quality 4.0, Validation 4.0, and the Like

The number is meant to denote a major version, but it does not, for several reasons:

  1. These concepts are more incremental growth within existing design spaces than major changes. To use version control lingo, there is a lot of backward compatibility.
  2. They are not definitive. There are no absolutes, just best practices and onward progression.
  3. Each company tends to be in different places in different ways, and there are many maturity scales, not just one.

Maturity models are a better option. Each of these buckets has multiple scales, each of which needs to be evaluated and improved.

This is why I like cGMP

The “c” in cGMP stands for “current,” which signifies that the Good Manufacturing Practices (GMP) being referred to are up-to-date with the latest standards and technologies. This differentiation emphasizes that companies must use the most recent and advanced technologies and systems to comply with the regulations set forth by the FDA. The term cGMP ensures that manufacturing practices are not only good but also current, reflecting ongoing improvements and updates in the industry.

The Challenges Ahead for Quality

Discussions about Industry 4.0 and Quality 4.0 often focus on technology. However, technology is just one of the challenges that Quality organizations face. Many trends are converging to create constant disruption for businesses, and the Quality unit must be ready for these changes. Rapid changes in technology, work, business models, customer expectations, and regulations present opportunities to improve quality management but also bring new risks.

The widespread use of digital technology has raised the expectations of stakeholders beyond what traditional quality management can offer. As the lines between companies, suppliers, and customers become less distinct, the scope of quality management must expand beyond the traditional value chain. New work practices, such as agile teams and remote work, are creating challenges for traditional quality management governance and implementation strategies. To remain relevant, Quality leaders must adapt to these changes.

Each challenge below is broken out by what it means, its impact to quality management, and how to prepare.

Advanced Analytics

  • Means: The increase in data sources and improved data processing has led to higher expectations from customers, regulators, business leaders, and employees. They expect companies to use data analytics to provide advanced insights and improve decision-making.
  • Impact to Quality Management: Requires a holistic approach that allows quality professionals to access, analyze and apply insights from structured and unstructured data. Quality excellence will be determined by how quickly data can be captured, analyzed, shared and applied.
  • How to Prepare: Develop a talent strategy to recruit, develop, rent or borrow individuals with data analytics capabilities, such as data science, coding and data visualization.

Hyper-Automation

  • Means: To become more efficient and agile in a competitive market, companies will increasingly use technologies like RPA, AI, and ML. These technologies will automate or enhance tasks that were previously done by humans. In other words, if a task can be automated, it will be.
  • Impact to Quality Management: How to ensure these systems meet intended use and all requirements; algorithm-error-generated root causes.
  • How to Prepare: Develop a hyperautomation vision for quality management that highlights business outcomes and reflects the use cases of relevant digital technology. Perform a risk-based assessment with appropriate experts to identify critical failure points in machine and algorithm decision making.

Virtualization of Work

  • Means: The shift to remote work due to COVID-19, combined with advancements in cloud computing and AR/VR technology, will make work increasingly digital.
  • Impact to Quality Management: Rethink how quality is executed and governed in a digital environment.
  • How to Prepare: Evaluate current quality processes for flexibility and compatibility with virtual work and create an action plan. Uncover barriers to driving a culture of quality in a virtual working environment, and incorporate virtual work-relevant objectives, metrics and activities into your strategy.

Shift to Resilient Operations

  • Means: Prioritizing capabilities that improve resilience and agility.
  • Impact to Quality Management: Adapt in real time to changing and simultaneously varying levels of risk without sacrificing the core purpose of Quality.
  • How to Prepare: Enable employees to make faster decisions without sacrificing quality by developing training that builds quality-informed judgment and embedding quality guidance in employee workflows. Identify quality processes that may prevent operational resilience and reinvent them by starting from scratch, ruthlessly challenging the necessity of every step and requirement. Ensure employees and new hires have the right skill sets to design, build and operate a responsive network environment.

Rise of Interconnected Ecosystems

  • Means: The growth of interconnected networks of people, businesses, and devices allows companies to create value by expanding their systems to include customers, suppliers, partners, and other organizations.
  • Impact to Quality Management: Greater connectivity between customers, suppliers, and partners provides more visibility into the value chain. However, it also increases risk because it can be difficult to understand and manage different views of quality within the ecosystem.
  • How to Prepare: Map out the entire quality management ecosystem model and its participants, as well as their interactions with customers. Co-develop critical-to-quality behaviors with strategic partners. Strengthen relationships with partners across the ecosystem to capture and leverage relevant information and data, while addressing data privacy concerns.

Digitally Native Workforce

  • Means: Shift from digital immigrants (my generation and older) to digital natives: people who have grown up with computers and the internet and are comfortable with them. Unlike other generations, digital natives are so used to using technology in all areas of their lives that it is (and always has been) an integral, necessary part of their day-to-day.
  • Impact to Quality Management: Increased flexibility leads to a need to rethink the way we monitor, train, and incentivize quality. Connecting the 4 Ps: People, Processes, Policies and Platforms.
  • How to Prepare: Identify and target existing quality processes to digitize to offer desired flexibility. Adjust messages about the importance of quality to connect with values employees care about (e.g., autonomy, innovation, social issues).

Customer Expectation Multiplicity

  • Means: Customer expectations evolve quickly and expand into new-in-kind areas as access to information and global connectedness increases.
  • Impact to Quality Management: Develop product portfolios, internal processes and company cultures that can quickly adapt to rapidly changing customer expectations for quality.
  • How to Prepare: Identify where hyperautomation and predictive capabilities of quality management can enhance customer experience and prevent issues before they occur.

Increasing Regulatory Complexity

  • Means: The global regulatory landscape is becoming more complex as countries introduce new regulations at different rates, with an increased push for localization.
  • Impact to Quality Management: Need a strong system to efficiently implement changes across different systems, locations, and regions while maintaining consistent quality management throughout the ecosystem.
  • How to Prepare: Coordinate a structured regulatory tracking approach to monitor changing regulatory developments; highly regulated industries require a more comprehensive approach compared to organizations in a moderate regulatory environment.

Challenges to Quality Management

The traditional Value Proposition of quality management is no longer sufficient to meet the expectations of stakeholders. With the rise of a digitally native workforce, there are new expectations for how work is done and managed. Business leaders expect quality leaders to have full command of operational data, diagnosing and anticipating quality problems. Regulators also expect high data transparency and traceability.

The value proposition of quality management lies in predicting problems rather than reacting to them. The primary objective of quality management should be to find hidden value by addressing the root causes of quality issues before they manifest. Quality organizations who can anticipate and prevent operational problems will meet or exceed stakeholder expectations.

Our organizations are on a journey toward using predictive capabilities to unlock value, rather than retroactively solving problems. Our scope needs to be based on quality being predictive, connected, flexible, and embedded. For me, this is the heart of Quality 4.0.

Quality management should be applied across a multitude of systems, devices, products, and partners to create a seamless experience. This entails transforming quality from a function into an interdisciplinary, participatory process. The expanded scope will reach new risks in an increasingly complex ecosystem. The Quality unit cannot do this on its own; it’s all about breaking down silos and building autonomy within the organization.

To achieve this transformation, we need to challenge ourselves to move beyond top-down and regimented Governance Models and Implementation Strategies. We need to balance our core quality processes and workflows to achieve repeatability and consistency while continually adjusting as situations evolve. We need to build autonomy, critical thinking, and risk-based thinking into our organizational structures.

One way to achieve this is by empowering end-users to solve their own quality challenges through participatory quality management. This encourages personal buy-in and enables quality governance to adapt in real-time to different ways of working. By involving end-users in the process of identifying and solving quality issues, we can build a culture of continuous improvement and foster a sense of ownership over the quality of our products and services.

The future of quality management lies in being predictive, connected, flexible, and embedded.

  • Predictive: The value proposition of quality management needs to be predicting problems over problem-solving.
  • Connected: The scope of quality management needs to extend beyond the value chain and connect across the ecosystem.
  • Flexible: The governance model needs to be based on an open-source model, rather than top-down.
  • Embedded: The implementation strategy needs to shift from viewing quality as a role to quality as a skill.

By embracing these principles and involving all stakeholders in the process of continuous improvement, we can unlock hidden value and exceed stakeholder expectations.

Dealing with these challenges and implications requires the Quality organization to treat transformation like a Program. This program should have four main initiative areas:

  1. Build the capacity for targeted prevention through data insights. This includes building alliances with IT and other teams to make the right data available in flexible ways, but it also includes building the capacity to actually use the data.
  2. Expand quality management to cover the entire value network.
  3. Localize Risk Management to Make Quality Governance Flexible and Open Source.
  4. Distribute Tasks and Knowledge to Embed Quality Management in the Business.

Across these pillars the program approach will:

  1. Assess the current state: Identify areas requiring attention and improvement by examining existing People, Processes, Policies and Platforms. This comprehensive assessment will provide a clear understanding of the organization’s current situation and help pinpoint areas where projects can have the most significant impact.
  2. Establish clear objectives: Define specific objectives that provide a clear roadmap for success.
  3. Prioritize foundational elements: Build foundational elements first; avoid bells-and-whistles for their own sake.
  4. Develop a phased approach: This is not an overnight process. Develop a phased approach that allows for gradual implementation, with clear milestones and measurable outcomes. This ensures that the organization can adapt and adjust as needed while maintaining ongoing operations and minimizing disruptions.
  5. Collaborate with stakeholders: Engage stakeholders from across the organization to ensure alignment and buy-in. Create a shared vision for the initiative to ensure that everyone is working towards the same goals. Regular communication and collaboration among stakeholders will foster a sense of ownership and commitment to the transformation process.
  6. Continuously monitor progress: Regularly review the progress, measuring outcomes against predefined objectives. This enables organizations to identify any potential issues or roadblocks and make adjustments as necessary to stay on track. Establishing key performance indicators (KPIs) will help track progress and determine the effectiveness of the Program.
  7. Embrace a culture of innovation: Encourage a culture that embraces innovation and continuous improvement. This helps ensure that the organization remains agile and adaptive, making it better equipped to take advantage of new technologies and approaches as they emerge. Fostering a culture of innovation will empower employees to seek out new ideas and solutions, driving long-term success.
  8. Invest in employee training and development: It is crucial to provide employees with the necessary training and development opportunities to adapt to new technologies and processes. This will ensure that employees are well-equipped to handle the changes brought about by these challenges and contribute to the organization’s overall success.
  9. Evaluate and iterate: As the Program unfolds, it is essential to evaluate the results of each phase and make adjustments as needed. This iterative approach allows organizations to learn from their experiences and continuously improve their efforts, ultimately leading to greater success.

To do this, leverage the eight accelerators of change.

The Role of Mixed Reality in Quality 4.0

Last night I had the honor to speak at the ASQ Boston Section monthly meeting on some of the exciting work Thermo Fisher Scientific is doing in mixed reality and how it fits into the industrial transformation that we are all taking stabs at, as well as the broader concept of Quality 4.0.

A small group, but it was really fun to discuss some of the stuff I’ve gotten involved with in the 5 months I’ve been here, and where we see it going.

Slides are available here.

AI/ML-Based SaMD Framework

The US Food and Drug Administration’s proposed regulatory framework for artificial intelligence- (AI) and machine learning- (ML) based software as a medical device (SaMD) is fascinating in what it exposes about the uncertainty around the near-term future of many Industry 4.0 initiatives in pharmaceuticals and medical devices.

While focused on medical devices, this proposal is an interesting read for anyone interested in applying machine learning and artificial intelligence to other regulated areas, such as manufacturing.

We are seeing the early stages of consensus building around the concept of Good Machine Learning Practices (GMLP): the idea of applying quality system practices to the unique challenges of machine learning.