(a) Each person engaged in the manufacture, processing, packing, or holding of a drug product shall have education, training, and experience, or any combination thereof, to enable that person to perform the assigned functions. Training shall be in the particular operations that the employee performs and in current good manufacturing practice (including the current good manufacturing practice regulations in this chapter and written procedures required by these regulations) as they relate to the employee’s functions. Training in current good manufacturing practice shall be conducted by qualified individuals on a continuing basis and with sufficient frequency to assure that employees remain familiar with CGMP requirements applicable to them.
(b) Each person responsible for supervising the manufacture, processing, packing, or holding of a drug product shall have the education, training, and experience, or any combination thereof, to perform assigned functions in such a manner as to provide assurance that the drug product has the safety, identity, strength, quality, and purity that it purports or is represented to possess.
(c) There shall be an adequate number of qualified personnel to perform and supervise the manufacture, processing, packing, or holding of each drug product.
US FDA 21 CFR 211.25
All parts of the Pharmaceutical Quality system should be adequately resourced with competent personnel, and suitable and sufficient premises, equipment and facilities.
EU GMP EMA/INS/GMP/735037/2014, 2.1
The organization shall determine and provide the resources needed for the establishment, implementation, maintenance and continual improvement of the quality management system. The organization shall consider:
a) the capabilities of, and constraints on, existing internal resources; b) what needs to be obtained from external providers.
ISO 9001:2015 requirement 7.1.1
It is critical to have enough people with the appropriate level of training to execute their tasks.
It is fairly easy to define the individual training plan, stemming from the job description and the process training requirements. In the aggregate this gives us the ability to track overdue training and a forward look at what training is coming due. These are, frankly, lagging indicators: they show success at completing assigned training but give no insight into the central question – do we have enough qualified individuals to do the work?
To make this proactive, we start with the resource plan: what operations need to happen in a given time frame, and what resources do they require? We then compare that to the training requirements for those operations.
We can then evaluate current training status and retention levels, and determine how many instructors we will need to ensure adequate training.
We perform a gap assessment to determine what new training needs exist.
We then take a forward look at planned improvements and make sure the appropriate training is forecast.
Now we have a good picture of what an “adequate number” is. We can now set a leading KPI to ensure that training is truly proactive.
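As a sketch of what such a leading KPI could look like: for each planned operation, check what fraction of the required headcount will still hold a current qualification at the end of the planning horizon. All of the names, dates, and data structures below are invented for illustration; a real system would pull from the training records and the resource plan.

```python
from datetime import date, timedelta

# Hypothetical records: planned operations with the role and headcount
# they need, and each role's qualified people by qualification expiry
# date. Everything here is illustrative.
planned_operations = [
    {"operation": "aseptic filling", "role": "filling operator", "headcount": 4},
    {"operation": "lyophilization", "role": "lyo operator", "headcount": 2},
]

qualification_expiry = {
    "filling operator": [date(2020, 3, 1), date(2019, 11, 15), date(2020, 1, 10)],
    "lyo operator": [date(2020, 6, 1), date(2020, 2, 20)],
}

def coverage_kpi(operations, expiries_by_role, as_of, horizon_days=90):
    """Leading KPI: for each role, the fraction of required headcount
    whose qualification is still current at the end of the horizon."""
    horizon = as_of + timedelta(days=horizon_days)
    coverage = {}
    for op in operations:
        expiries = expiries_by_role.get(op["role"], [])
        still_qualified = sum(1 for expiry in expiries if expiry >= horizon)
        coverage[op["role"]] = still_qualified / op["headcount"]
    return coverage
```

Anything under 1.0 is an early warning that refresher training (or more instructors) is needed before the work arrives, rather than after training goes overdue.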
Starts with our “Heroes are gone” and “it is time to stand on our two feet.”
Focuses on the time and effort to train people on Lean and Six Sigma, and how many people never actually do projects. The basic point is that we use the tools in old ways that are not nimble or aligned to today’s needs – the tools we use versus the tools we are taught.
Hacking lean six sigma is along a similar line to Art Smalley’s four problems.
Covers value stream mapping and spaghetti diagrams with a focus on “the delays in between.” Talks about how control charts are not used more as standard practice.
Basic point is people don’t spend enough time with the tools of quality – a point I have opinions on that will end up in another post.
Overcooked data versus raw data – summarized data has little or no nutritional value.
Brings this back to the issue of lack of problem diagnosis and not problem solving. Comes back to a need for a few easy tools and not the long-tail of six sigma.
This talk is very focused on LSS and the use of very specific tools, which seems like an odd choice at an Audit conference.
“Objectives and Process Measures: ISO 13485:2016 and ISO 9001:2015” by Nancy Pasquan
I appreciate it when the session manager (person who introduces the speaker and manages time) does a safety moment. Way to practice what we preach. Seriously, it should be a norm at all conferences.
Connects with the audience with a confession that the speaker is here to share her pain.
Objective – where we are going. Provides a flow chart of mission/vision (scope) -> establish process -> right direction? -> monitor and measure.
Objectives should challenge the organization. Should not be too easy. References SMART. Covers objectives in very standard way. “Remember the purpose is to focus the effort of the entire organization toward these goals.” Links process objectives to the overall company objectives.
Process measures are harder. Uses training as an example, which tells me adult learning practice is not as embedded in the QBOK way of thinking as I would like. Kirkpatrick is a pretty well-known model.
“Process measures will not tell us if we have the right process” is a pretty loaded concept. Being careful about what you measure is good advice.
“Auditing Current Trends in Cleaning Validation” by Cathelene Compton
One of the trends in 2019 FDA Warning letters has been cleaning. While not one of the four big ones, cleaning validation always seems relevant and I’m looking forward to this presentation.
Starting with the fact that 15% of all observations on 483 forms relate to cleaning validation and documentation.
Reviews the three stages from the 2011 FDA Process Validation Guidance and then delves into a deeper validation lifecycle flowchart.
Stage 1 – choosing the right cleaning agent; different manufacturers of cleaning agents; long-term damage to equipment parts and cleaning agent compatibility. Vendor study for cleaning agent; concentration levels; challenge the cleaning process with different concentrations.
Delves more into cleaning acceptance limits and the importance of calculating them in multiple ways. Stresses the importance of involving a toxicologist. Stresses the use of the Permitted Daily Exposure (PDE) and how it can be difficult to get the F-factors.
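As an illustration of the PDE approach, here is a minimal sketch of the calculation: a PDE derived from a NOAEL and divided by the F-factors, then converted into a Maximum Allowable Carryover (MACO). Every number below is invented; selecting NOAELs and F-factors is exactly where the toxicologist is needed.

```python
# Illustrative PDE and MACO calculation. The NOAEL, body weight,
# F-factors, batch size and daily dose are all made up for this sketch.

def pde_mg_per_day(noael_mg_per_kg_day, body_weight_kg=50,
                   f1=5, f2=10, f3=1, f4=1, f5=1):
    """Permitted Daily Exposure: NOAEL scaled to body weight and
    divided by the F-factors (interspecies, interindividual,
    study duration, severity of effect, no-NOEL adjustment)."""
    return (noael_mg_per_kg_day * body_weight_kg) / (f1 * f2 * f3 * f4 * f5)

def maco_mg(pde, min_batch_size_mg_next, max_daily_dose_mg_next):
    """Maximum Allowable Carryover of the previous product into the
    next product's smallest batch."""
    return pde * min_batch_size_mg_next / max_daily_dose_mg_next

pde = pde_mg_per_day(noael_mg_per_kg_day=2.0)               # 2.0 mg/day
limit = maco_mg(pde, min_batch_size_mg_next=50_000_000,      # 50 kg batch
                max_daily_dose_mg_next=500)                  # 500 mg/day dose
```

Calculating the limit multiple ways (PDE-based, the traditional 10 ppm criterion, visually clean) and taking the most stringent is the point being stressed.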
Ensure that analytical methods meet ICH Q2(R1). Recovery studies on materials of construction. For the cleaning agent, look for a target marker and check whether other components in the laboratory also use this marker. A pitfall is a glassware washer that is not validated.
Trends around recovery factors, for example recoveries for stainless steel should be 90%.
Discusses matrix rationales from the Mylan 483, stressing the need to ensure all toxicity levels are determined and pharmacological potency is accounted for.
Stage 2 – all studies should include visual inspection, micro and analytical testing. Materials of construction and surface area calculations, and swabs on hard-to-clean or water hold-up locations. Chromatography must be assessed for extraneous peaks.
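The surface area calculations feed directly into the per-swab acceptance limit, and the recovery studies feed into how a measured result is corrected. A hedged sketch, with invented numbers:

```python
# Turning an overall carryover limit (MACO) into a per-swab acceptance
# limit using the shared surface area, and correcting a measured swab
# result for recovery. All numbers here are illustrative.

def per_swab_limit_ug(maco_mg, total_surface_cm2, swab_area_cm2=25):
    """Spread the MACO evenly over the shared product-contact surface
    and scale to the area covered by one swab (mg -> ug)."""
    return maco_mg / total_surface_cm2 * swab_area_cm2 * 1000

def recovery_corrected_ug(measured_ug, recovery=0.90):
    """Correct a measured swab result upward by the recovery factor
    established for that material of construction (e.g. ~90% recovery
    from stainless steel)."""
    return measured_ug / recovery

limit = per_swab_limit_ug(maco_mg=100, total_surface_cm2=50_000)  # 50.0 ug/swab
result = recovery_corrected_ug(measured_ug=9.0)                   # 10.0 ug
```

This is also why an unvalidated recovery factor (or glassware washer) undermines every result downstream of it.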
Verification vs validation – validation always preferred.
Training – qualify the individuals who swab. Qualify visual inspectors.
Should see campaign studies, clean hold studies and dirty equipment hold studies.
Stage 3 – continued verification is critical, and is where folks fall flat. Perform it every 6 months, and no more than a year out for manual cleaning. CIP should be under a periodic review of mechanical aspects, which means requalification can be 2-3 years out.
Grace Duffy is the keynote speaker. I’ve known Grace for years and consider her a mentor and I’m always happy to hear her speak. Grace has been building on a theme around her Modular Kaizen approach and the use of the OODA Loop, and this presentation built nicely on what she presented at the Lean Six Sigma Conference in Phoenix, at WCQI and in other places.
Audits as a form of sustainability is an important point to stress, and hopefully this will be a central theme throughout the conference.
The intended purpose is to build on a systems view in preparation for an effective audit, using the OODA loop for both evolutionary and revolutionary change.
Grace starts with a brief overview of system and process, then moves from vision to strategy to daily work, and how that forms a Möbius strip of macro, meso, micro and individual. She talks a little about the difference between Deming’s and Juran’s approaches and does a little what-if thinking about how Lean would have developed if Juran had gone to Japan instead of Deming.
Breaking down OODA (Observe, Orient, Decide, Act) as “Where am I and where is the organization,” which then feeds into decision making.
Stresses how Orient is about culture and the need to understand it.
Her link to Lean is a little tenuous in my mind.
She then discusses Tom Pearson’s knowledge management model: Local Action; Management Action; Exploratory Analysis; Knowledge Building; Complex Systems; Knowledge Management; Scientific Creativity. Unites all this with systems thinking and psychology. “We’re going to share shamelessly because that’s how we learn.” “If we can’t have fun with this stuff it’s no good.”
Uniting the two, she describes the knowledge management model as part of Orient.
Puts revolutionary and evolutionary change in light of Juran’s Breakthrough versus Continuous Improvement. From here she covers modular kaizen, starting with incremental change versus process redesign. From there she breaks it down into a DMAIC model and goes into how much she loves the Measure phase. She discusses how the human brain is better at connections, which is a good reinforcement of the OODA model.
Breaks down a culture model of Culture/Beliefs, Visions/Goals and Activities/Plans-and-actions influenced by external events, and how evolutionary improvements stem from compatibility with those. OODA is the tool to help determine that compatibility.
Briefly discusses how standardization fits into systems, looked at from the perspective of stability.
Goes back to the culture model but now adds idea generation and quality testing, with decisions off of it that lead to revolutionary improvements. Links back to OODA.
Then quickly covers DMAIC versus DMADV and how that is another way of thinking about these concepts.
Covers Gino Wickman’s concept of visionary and integrator from Traction.
Ties back OODA to effective auditing: focus on patterns and not just numbers, grasp the bigger picture, be adaptive.
This is a big, sprawling topic for a keynote and at times it felt like a firehose. Keynotes often benefit from a lot more laser focus; OODA alone would have been enough. My head is reeling, and I am comfortable with this material. Grace is an amazing, passionate educator and she finds this material exciting. I hope most of the audience picked that up in this big-gulp approach. This systems approach, building on culture and strategy, is critical.
OODA as an audit tool is relevant, and it is a tool I think we should be teaching better. Might be a good tool to do for TWEF as it ties into the team/workplace excellence approach. OODA and situational awareness are really united in my mind and that deserves a separate post.
After the keynote there are the breakout sessions. As always, I end up having too many options and must make some decisions. Can never complain about having too many options during a conference.
First Impressions: The Myth of the Objective & Impartial Audit
First session is “First Impressions: The Myth of the Objective & Impartial Audit” by William Taraszewski. I met Bill back at the 2018 World Conference on Quality and Improvement.
Bill starts by discussing subjectivity and first impressions, and how they shape audits from the very start.
Covers the science of first impressions, pointing to research on bias: negative behavior weighs more than positive, and this can be contextual. Draws from Amy Cuddy’s work and lays a good foundation of Trust and Competence and their importance in work and life in general.
Brings this back to ISO 19011:2018 “Guidelines for auditing management systems” and clause 7.2 on determining auditor competence, which places personal behavior ahead of knowledge and skills.
Brings up video auditing: the impressions generated from video vs in-person are pretty similar, but the magnitude of bad impressions is greater and the magnitude of positive ones lower. That was an interesting point and I will need to follow up on that research.
Moves to discussing impartiality in the context of ISO 19011:2018, pointing out the halo and horn effects.
Discusses prejudice vs experience as an auditor, and covers confirmation bias and how selective exposure and selective perception fit into our psychology, with the need to be careful since the negative outweighs the positive.
Moves into objective evidence and how it fits into an audit.
Provides top tips for good auditor first impressions, covering body language and eye contact. Most important: how to check your attitude.
This was a good fundamentals session that reinforces some basics and goes back to the research. Quality as a profession really needs to understand how objectivity and impartiality are virtually impossible, and how we can overcome bias.
Auditing Risk Management
Barry Craner presented on “Are you ready for an audit of your risk management system?”
Starts with how risk management is here to stay and present in most industries. The presenter focuses on medical devices but the concepts are very general.
“As far as possible” as a concept is discussed, along with residual risk. Covers this at a high level.
Covers at a high level the standard risk management process (risk identification, risk analysis, risk control, risk monitoring, risk reporting), asking the question “Is the RM system acceptable? Can you describe and defend it?”
Provides an example of a risk management file sequence that matches the concept of living risk assessments. This is a flow that goes from Preliminary Hazard analysis to Fault Tree Analysis (FTA) to FMEA. With the focus on medical devices talks about design and process for both the FTA and the FMEA. This is all from the question “Can you describe and defend your risk management program?”
In laying out the risk management program, focuses in on personnel qualification as pivotal. Discusses answering the question “Are these ready for audit?” When discussing the plan, asks the questions “Is your risk management plan: documented and reasonable; ready to audit; and an SOP followed by your company?”
When discussing risk impact, breaks it down to “Is the risk acceptable or not?” Goes on to discuss how important it is to defend the scoring rubric, asking the question “Well defined, can we defend?”
Goes back and discusses some basic concepts of hazard and harm. Asks the questions “Did you do this hazard assessment with enough thoroughness? Were the right hazards identified?” Recommends building an example-of-hazards table. This is good advice. From there, answer the question “Do your hazard analyses yield reasonable, useful information? Do you use it?”
Provides a nice example of how to build a mitigation plan out of a fault tree analysis.
Discussion on FMEAs faltered on detection; probably could have gone into controls a lot deeper here.
With both the FTA and FMEA, discussed how the results need to be defendable.
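The detection discussion maps onto the classic FMEA Risk Priority Number (severity × occurrence × detection). A minimal sketch with invented failure modes, 1–10 scales and an arbitrary action threshold – a real program has to be able to defend its own rubric:

```python
# A minimal FMEA risk-priority sketch. The failure modes, the 1-10
# scales and the action threshold are all invented for illustration.

failure_modes = [
    # (description, severity, occurrence, detection)
    ("seal leak", 9, 3, 2),
    ("sensor drift", 5, 6, 7),
    ("label mix-up", 10, 2, 4),
]

def rpn(severity, occurrence, detection):
    """Risk Priority Number: higher means more severe, more frequent,
    and harder to detect."""
    return severity * occurrence * detection

# Flag anything above an (arbitrary) action threshold of 100.
flagged = [(name, rpn(s, o, d))
           for name, s, o, d in failure_modes
           if rpn(s, o, d) > 100]
# flagged -> [('sensor drift', 210)]
```

Note how a weak detection score inflates the RPN fastest – which is exactly why skating past detection and controls leaves the scoring hard to defend.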
Risk management review, with the right metrics, is discussed at a high level. This could easily be a session on its own.
Asks the question “Were there actionable tasks? Progress on these tasks?”
It is time to stop having such general overviews at conferences, especially at a conference which is not targeted to junior personnel.
A Goal is generally described as an effort directed towards an end. In project management, for example, the term goal refers to three different target values: performance, time and resources. To be more specific, the project goal specifies the desired outcome (performance), the specific end date (time) and the assigned amount of resources (resources). A goal answers the “What”: what is the main aim of the project?
An Objective defines the tangible and measurable results the team delivers to support the agreed goal and meet the planned end time and other resource restrictions. It answers the “How”: how is something to be done?
I think many of us are familiar with the concept of SMART goals. Lately I’ve been using FAST objectives.
Transparency provides the connective tissue, and must be a primary aspect of any quality culture. Transparency means creating a free flow of information within an organization and between the organization and its many stakeholders. This flow of information is the central nervous system of an organization, and its effectiveness depends on it. Transparency influences the capacity to solve problems, innovate, meet challenges and, as shown above, meet goals.
This information flow simply means that critical information gets to the right person, at the right time, for the right reason. By making our goals transparent we can start that process and make a difference in our organizations.