Subject matter experts have explicit knowledge from formal education and embedded in reports, manuals, websites, memos, and other corporate documents. But their implicit and tacit knowledge, based on their experience, is perhaps the source of their greatest value — whether the subject-matter expert with decades of experience who is lightning fast with a diagnosis and almost always spot-on or the manager whose team everyone wants to be on because she’s so good at motivating and mentoring.
Experts, no matter the domain, tend to have very similar attributes. Understanding these attributes allows us to start understanding how we build expertise.
Attributes of an Expert

Critical know-how and “know-what”
Managerial, technical, or both; superior, experience-based techniques and processes; extraordinary factual knowledge
Swift recognition of a phenomenon, situation, or process that has been encountered before
Building and maintaining an extensive network of professionally important individuals
Ability to deal with individuals, including motivating and leading them; comfort with intellectual disagreement
Ability to construct, tailor, and deliver messages through one or more media to build logical and persuasive arguments

Diagnosis and cue seeking
Ability to actively identify cues in a situation that would confirm or challenge a familiar pattern; ability to distinguish signal from noise
Ability to diagnose, interpret, or predict through appropriate senses
One of the critical parts of being a subject matter expert is being able to help others absorb knowledge and gain wisdom through learn-by-doing techniques: guided practice, observation, problem solving, and experimentation.
Think of this as an apprenticeship program that provides deliberate practice with expert feedback, which is fundamental to the development of expertise.
Does your organization have this sort of organized way to train experts? How does it work?
Grace Duffy is the keynote speaker. I’ve known Grace for years and consider her a mentor, and I’m always happy to hear her speak. Grace has been building on a theme around her Modular Kaizen approach and the use of the OODA Loop, and this presentation built nicely on what she presented at the Lean Six Sigma Conference in Phoenix, at WCQI, and in other places.
Audits as a form of sustainability is an important point to stress, and hopefully this will be a central theme throughout the conference.
The intended purpose is to build a systems view in preparation for an effective audit, using the OODA loop to approach both evolutionary and revolutionary change.
Grace starts with a brief overview of system and process, then moves from vision to strategy to daily work, and how that forms a Möbius strip of macro, meso, micro, and individual. She talks a little about the difference between Deming’s and Juran’s approaches and does a little what-if thinking about how Lean would have developed if Juran had gone to Japan instead of Deming.
She breaks down OODA (Observe, Orient, Decide, Act) as “Where am I and where is the organization?”, which then feeds into decision making. She stresses how Orient addresses culture and the importance of understanding it.
Her link to Lean is a little tenuous in my mind.
She then discusses Tom Pearson’s knowledge management model: Local Action; Management Action; Exploratory Analysis; Knowledge Building; Complex Systems; Knowledge Management; Scientific Creativity. She unites all this with systems thinking and psychology. “We’re going to share shamelessly because that’s how we learn.” “If we can’t have fun with this stuff it’s no good.”
Uniting the two, she describes the knowledge management model as part of Orient.
She puts revolutionary and evolutionary change in light of Juran’s breakthrough versus continuous improvement. From here she covers Modular Kaizen, starting with incremental change versus process redesign. From there she breaks it down into a DMAIC model and goes into how much she loves the Measure phase. She discusses how the human brain is better at connections, which is a good reinforcement of the OODA model.
She breaks down a culture model of culture/beliefs, visions/goals, and activities/plans-and-actions, influenced by external events, and how evolutionary improvements stem from compatibility with those. OODA is the tool to help determine that compatibility.
She briefly discusses how standardization fits into systems and pushes looking at it from a stability perspective.
She goes back to the culture model but now adds idea generation and quality testing, with decisions off of them that lead to revolutionary improvements. She links back to OODA.
Then she quickly covers DMAIC versus DMADV and how that is another way of thinking about these concepts.
She covers Gino Wickman’s concept of the visionary and the integrator from Traction.
She ties OODA back to effective auditing: focus on patterns and not just numbers, grasp the bigger picture, be adaptive.
This is a big, sprawling topic for a keynote, and at times it felt like a firehose. Keynotes often benefit from a lot more laser focus; OODA alone would have been enough. My head is reeling, and I am comfortable with this material. Grace is an amazing, passionate educator and she finds this material exciting. I hope most of the audience picked that up in this big-gulp approach. This systems approach, building on culture and strategy, is critical.
OODA as an audit tool is relevant, and it is a tool I think we should be teaching better. It might be a good topic for TWEF, as it ties into the team/workplace excellence approach. OODA and situational awareness are really united in my mind, and that deserves a separate post.
After the keynote come the breakout sessions. As always, I end up having too many options and must make some decisions. I can never complain about having too many options during a conference.
First Impressions: The Myth of the Objective & Impartial Audit
The first session is “First Impressions: The Myth of the Objective & Impartial Audit” by William Taraszewski. I met Bill back at the 2018 World Conference on Quality and Improvement.
Bill starts by discussing subjectivity and first impressions, and how they shape audits from the very start.
He covers the science of first impressions, pointing to research on bias, how negative behavior weighs more than positive, and how this can be contextual. He draws from Amy Cuddy’s work and lays a good foundation of trust and competence and their importance in work and life in general.
He brings this back to ISO 19011:2018, “Guidelines for auditing management systems,” and clause 7.2 on determining auditor competence, which places personal behavior over knowledge and skills.
He brings up video auditing: the impressions generated from video and in-person are pretty similar, but the magnitude of the bad impressions is greater and the magnitude of the positive is lower. That was an interesting point, and I will need to follow up on that research.
He moves to discussing impartiality in the context of ISO 19011:2018, pointing out the halo and horn effects.
He discusses prejudice versus experience as an auditor and covers confirmation bias, how selective exposure and selective perception fit into our psychology, and the need to be careful since negative outweighs positive.
Moves into objective evidence and how it fits into an audit.
He provides top tips for good auditor first impressions, including body language and eye contact. Most important: how to check your attitude.
This was a good session on the fundamentals, one that reinforces some basics and goes back to the research. Quality as a profession really needs to understand how objectivity and impartiality are virtually impossible and how we can overcome bias.
Auditing Risk Management
Barry Craner presented on “Are you ready for an audit of your risk management system?”
He starts with how risk management is here to stay and how it is present in most industries. The presenter is focused on medical devices, but the concepts are very general.
“As far as possible” as a concept is discussed, along with residual risk. He covers this at a high level.
He covers at a high level the standard risk management process (risk identification, risk analysis, risk control, risk monitoring, risk reporting), asking the question “Is the RM system acceptable? Can you describe and defend it?”
He provides an example of a risk management file sequence that matches the concept of living risk assessments. This is a flow that goes from preliminary hazard analysis to fault tree analysis (FTA) to FMEA. With the focus on medical devices, he talks about design and process versions of both the FTA and the FMEA. This all flows from the question “Can you describe and defend your risk management program?”
In laying out the risk management program he focuses on personnel qualification as pivotal. He discusses answering the question “Are these ready for audit?” When discussing the plan he asks: “Is your risk management plan documented and reasonable, ready to audit, and an SOP followed by your company?”
When discussing risk impact he breaks it down to “Is the risk acceptable or not?” He goes on to discuss how important it is to defend the scoring rubric, asking “Is it well defined, and can we defend it?”
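A scoring rubric is much easier to defend when the bands and the acceptability threshold are written down explicitly. Here is a minimal Python sketch of what a documented rubric might look like; the 1-5 scales and the acceptance threshold are hypothetical example values, not from the presentation:

```python
# Illustrative risk matrix: severity x probability -> acceptability.
# The rating scales and the acceptance threshold are example values only;
# each organization must define and justify its own rubric.

SEVERITY = {"negligible": 1, "minor": 2, "serious": 3, "critical": 4, "catastrophic": 5}
PROBABILITY = {"improbable": 1, "remote": 2, "occasional": 3, "probable": 4, "frequent": 5}

ACCEPTANCE_THRESHOLD = 8  # scores above this require mitigation

def risk_score(severity: str, probability: str) -> int:
    """Multiply the ordinal ratings to place the risk on the matrix."""
    return SEVERITY[severity] * PROBABILITY[probability]

def is_acceptable(severity: str, probability: str) -> bool:
    """Compare the score against the documented acceptance threshold."""
    return risk_score(severity, probability) <= ACCEPTANCE_THRESHOLD

print(risk_score("serious", "occasional"))   # 9 -> above threshold, not acceptable
print(is_acceptable("minor", "remote"))      # True
```

Because the rubric is explicit, any two assessors scoring the same severity and probability get the same answer, which is the heart of “can you describe and defend it?”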
He goes back and discusses some basic concepts of hazard and harm, asking “Did you do this hazard assessment with enough thoroughness? Were the right hazards identified?” He recommends building an example hazards table. This is good advice. From there, answer the question “Do your hazard analyses yield reasonable, useful information? Do you use it?”
He provides a nice example of how to build a mitigation plan out of a fault tree analysis.
The discussion on FMEAs faltered on detection; he probably could have gone into controls a lot deeper here.
With both the FTA and the FMEA, he discussed how the results need to be defendable.
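One way to make FMEA results defendable is to show the arithmetic behind the prioritization. The sketch below uses the classic risk priority number (RPN = severity × occurrence × detection); the scores and the before/after scenario are illustrative only:

```python
# Classic FMEA risk priority number: RPN = severity x occurrence x detection.
# A weak detection control (high D) inflates the RPN even when severity and
# occurrence are unchanged -- which is why detection deserves deeper treatment.

def rpn(severity: int, occurrence: int, detection: int) -> int:
    """Compute the risk priority number, enforcing the conventional 1-10 scales."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA scores are conventionally rated 1-10")
    return severity * occurrence * detection

# Same hypothetical failure mode, before and after adding an
# in-process detection control:
before = rpn(severity=7, occurrence=3, detection=8)  # 168
after = rpn(severity=7, occurrence=3, detection=3)   # 63
print(before, after)
```

Showing the scale definitions and the multiplication makes the ranking reproducible, rather than a judgment call an auditor has to take on faith.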
Risk management review, with the right metrics, is discussed at a high level. This easily could be a session on its own.
He asks the questions “Were there actionable tasks? Progress on these tasks?”
It is time to stop having such general overviews at conferences, especially at a conference that is not targeted to junior personnel.
Risk Management is a key enabler of any quality by design, whether of product, facility or equipment. We do living risk assessments to understand the scope of our ongoing risk. Inevitably we either want to implement that new or improved design or we want to mitigate the ongoing risks in our operation. So we turn to change management. And as part of that change management we do a risk assessment. Our change management then informs ongoing risk review.
Risk Management Leads to Change Management
At the end of your iterative design lifecycle there is a final design ready for introduction. Perhaps this is a totally new thing, perhaps it is a new set of equipment or processes, or just a modification.
All along through the iterative design lifecycle risk management has been applied to establish measurable, testable, unambiguous and traceable performance requirements. Now your process engages with change management to introduce the change.
And a new risk assessment is conducted.
This risk assessment is asking a different question. During the iterative design lifecycle the risk question is some form of “What are the risks from this design to the patient/process?” As part of change management, the question is “What are the risks to SISPQ/GMP from introducing the change?”
This risk assessment is narrower, in that it looks at the process of implementing, but broader in that it looks at the entirety of your operations: facility, supply chain, quality system, etc.
The design risk assessment and risk management activities inform the change management risk assessment, but they cannot replace it. They can, however, serve to lower the rigor of the change management risk assessment, allowing the use of a less formal tool.
Living Risk Reviews
In the third phase of risk management – risk review – we confirm that the risks identified were mitigated as planned and that the mitigations are functioning as intended. We also evaluate whether any additional, previously unpredicted risks have appeared. Risk review is the living part of the lifecycle, as we return to it on a periodic basis.
From this will come new mitigations, targeted to address the identified risks. These mitigations inevitably lead to change management.
We again do a new risk assessment focusing on the risk of implementing the change. Informed by the living risk assessment, we can often utilize a less formal tool to look at the full ramifications of introducing the mitigation (a change).
Each and every change requires a risk assessment to capture the risks of the change. This ICH Q10 requirement is the best way to determine if the change is acceptable.
This risk assessment evaluates the impact of the change on the facility, equipment, materials, supply chain, processes, testing, quality systems, and everything else. This is one of the critical reasons it is crucial to involve the right experts.
From this risk assessment come the appropriate actions to take before implementing the change, as well as appropriate follow-up activities, and it can help define the effectiveness review.
It depends. Sometimes one risk assessment looks at the individual implementations; other times you need to do separate ones. Many times the risk assessment leads you to break one change control up into many. Evaluate as follows:
Are the risks from the separate implementations appropriately captured?
Are the risks from pauses between implementations appropriately captured?
Are the ripples appropriately understood?
Change Management Leads back to Risk Management
Sometimes a change control requires a specific risk assessment to be updated, or requires specific risk management to happen.
What about HACCP?
Hazard Analysis Critical Control Point (HACCP) is a great tool for risk assessment. HACCP plans are often the catalyst for doing a change, and they are often the artifact of a change. But they should never be utilized for determining the impact of a change.
A hazard is any biological, chemical, or physical property that impacts human safety. The HACCP identifies hazards and establishes critical limits. But a HACCP is not the tool to use to determine if a change should move forward and what actions to take. It is too static.
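To make the “static” point concrete, here is a minimal sketch of the role a HACCP plays in ongoing control: checking a measurement at a critical control point against its documented critical limits. The parameter and limit values are hypothetical examples:

```python
# Monitoring a critical control point (CCP) against its documented critical
# limits -- the static, ongoing-control role a HACCP plays. The parameter
# and limit values below are hypothetical.

from dataclasses import dataclass

@dataclass
class CriticalLimit:
    parameter: str
    minimum: float
    maximum: float

    def in_control(self, measured: float) -> bool:
        """A measurement inside the critical limits means the CCP is in control."""
        return self.minimum <= measured <= self.maximum

# Hypothetical cook-step CCP: internal temperature must stay within limits.
cook_temp = CriticalLimit("internal temperature (C)", 72.0, 85.0)

print(cook_temp.in_control(75.0))  # True  -> within critical limits
print(cook_temp.in_control(68.0))  # False -> deviation; corrective action needed
```

The plan answers “is this process in control right now?” It has no machinery for weighing the risks of moving from one state of control to another, which is why change impact needs its own assessment.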
Risk Management is an enabler for change, a tenet enshrined in the ICH guidances. We are engaging in risk management activities throughout our organizations. It is critical to understand how the various risk management activities fit together and how they should be separated.
Microbiologists won’t be sequestered in the laboratory, running samples and conducting environmental testing, once the revisions proposed for Annex 1 of the EU and Pharmaceutical Inspection Cooperation Scheme (PIC/S) GMP guides take effect, Annex 1 rapporteur Andrew Hopkins said Oct. 15.
They will have a broader role that includes conducting risk assessments to ensure that sterile products are made as contamination-free as possible, said Hopkins, who is an inspector for the UK Medicines and Healthcare products Regulatory Agency.
Contamination Control is a fairly wide term used to mean “getting microbiologists out of the lab” and involved in risk management and compliance. Our organization splits that function off from the QC Microbiology organization but there are many models for making it work.
Risk management is a major part of the new Annex 1, and what they are driving at is good risk assessments with good risk mitigation that involve the microbiologists. A contamination control strategy includes elements such as:
Targeted/ risk based measures of contamination avoidance
Key performance indicators to assess status of contamination control
A defined strategy for deviation management (investigations) and CAPA
When it comes to change management, one of the easiest places to go wrong is forgetting to bring the microbiologist into changes. Based on your strategy you can determine which changes require their assessment and include it in the tool utilized to determine SMEs, for example:
Required if the change meets any of the following criteria:
The change impacts environment integrity, conditions or monitoring, including:
Changes to a controlled room or area that impact integrity
Changes in sampling methodology
Changes in personnel or material flow
The change will result in or modify exposure of product to the environment.
The change can impact microbiological control within a process stream, raw material or process equipment
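Criteria like these can be embedded directly in the change-management tool as a simple rules check, so the microbiology SME is never forgotten. A sketch, with hypothetical flag names that would map to your own change-record fields:

```python
# Sketch of coding the criteria above into a change-management tool to flag
# changes that need microbiology review. The flag names are hypothetical;
# map them to whatever fields your change records actually carry.

MICRO_REVIEW_CRITERIA = (
    "impacts_room_integrity",            # controlled room/area integrity
    "changes_environmental_monitoring",  # environment conditions or monitoring
    "changes_sampling_methodology",
    "changes_personnel_or_material_flow",
    "exposes_product_to_environment",    # new or modified product exposure
    "impacts_microbiological_control",   # process stream, raw material, equipment
)

def needs_micro_review(change: dict) -> bool:
    """Microbiology SME review is required if any criterion applies."""
    return any(change.get(flag, False) for flag in MICRO_REVIEW_CRITERIA)

change = {"changes_sampling_methodology": True}
print(needs_micro_review(change))  # True
```

The point is not the code but the discipline: the criteria live in one documented place, and the tool applies them the same way to every change.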
Data integrity has been, for the last few years, one of the hot topics of regulatory agency inspections, one that has often seemed, at times, to be a popular umbrella for a wide variety of related topics (that usually have a variety of root causes).
Data integrity is an interesting grab bag because it involves both paper and electronic data. While some of the principles overlap, it can sometimes seem nebulous. Luckily, the MHRA recently published a final guidance on GXP Data Integrity that ties together several threads. This is a great reference document that lays out some key principles:
Organizational culture should drive ALCOA
Data governance is part of the management review process
Data Risk Assessments with appropriate mitigations (full risk management approach)
I love the snarky comment about ALCOA+. More guidances should be this snarky.
The FDA so far this year has been issuing warning letters and 483s in more traditional GMP areas, such as testing and validation. It will be interesting to see whether this lessened focus reflects a subtle shift in inspections or just the sites inspected. Either way, building data integrity into your quality systems is a good thing.
Processes and tools for the prevention, detection, analysis, reporting, tracking, and remediation of noncompliance to data integrity principles should be integrated into the quality management system to enable:
Prevention of data integrity issues through governance, training, organizational controls, and the processes and systems underlying and supporting data integrity.
Detection of data integrity issues by leveraging existing quality systems, tools, and personnel.
Remediation of data integrity issues by leveraging existing quality systems that identify and track implementation of corrective/preventive action(s).
Some ways to integrate include:
Data integrity training for all employees
Include as an aspect of audits and self-inspections
Controls in place to ensure good documentation practices
Good validation practices
Computer system lifecycle management (include audit trail reviews)
Ensure your root cause investigators and CAPA people are trained on data integrity
Data integrity as a critical decision point in change management
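As an example of building detection into the quality system, parts of an audit trail review can be automated. The sketch below flags two common red flags (out-of-order timestamps, and record modifications logged without a documented reason); the record format is hypothetical, not from any particular system:

```python
# Minimal audit-trail review sketch: flag entries whose timestamps run
# backwards, and modifications logged without a documented reason.
# The record format here is hypothetical.

from datetime import datetime

def review_audit_trail(entries: list[dict]) -> list[str]:
    """Scan an audit trail and return a list of findings for follow-up."""
    findings = []
    previous = None
    for entry in entries:
        ts = datetime.fromisoformat(entry["timestamp"])
        if previous is not None and ts < previous:
            findings.append(f"out-of-order timestamp at {entry['timestamp']}")
        previous = ts
        if entry["action"] == "modify" and not entry.get("reason"):
            findings.append(f"modification without reason at {entry['timestamp']}")
    return findings

trail = [
    {"timestamp": "2021-10-01T09:00:00", "action": "create", "reason": "new batch"},
    {"timestamp": "2021-10-01T09:05:00", "action": "modify", "reason": ""},
    {"timestamp": "2021-10-01T08:55:00", "action": "modify", "reason": "typo fix"},
]
print(review_audit_trail(trail))
```

A script like this does not replace a trained reviewer; it just surfaces candidates so the human review time goes to judgment rather than scanning.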
Data integrity, like many other aspects of a quality culture, is a mindset and a set of tools applied throughout the organization. There really isn’t a single project or fix. By applying data integrity principles regularly and consistently you build and ensure it. As such, data integrity is really just an affirmation of good quality principles.