2020 FDA 483s around change

The FDA has updated its yearly database of 483s. Plenty will be written about themes as we close out the year, so I decided to share a few of my immediate observations.

I find that what I focus on changes over time, as my jobs evolve and my interests shift. However, some things never change, so let us talk change.

Reference Number | Short Description | Long Description | 2020 Frequency | 2019 Frequency | 2018 Frequency
21 CFR 211.100(a) | Changes to Procedures Not Reviewed, Approved | Changes to written procedures are not [drafted, reviewed and approved by the appropriate organizational unit] [reviewed and approved by the quality control unit]. Specifically, *** | 8 | 13 | 9
21 CFR 211.160(a) | Lab controls established, including changes | The establishment of [specifications] [standards] [sampling plans] [test procedures] [laboratory control mechanisms], including any changes thereto, are not [drafted by the appropriate organizational unit] [reviewed and approved by the quality control unit]. Specifically, *** | 4 | 18 | 17
21 CFR 212.20(c) | Adverse effects of changes made | You did not demonstrate that any change does not adversely affect the [identity] [strength] [quality] [purity] of your PET drug. Specifically, *** | 1 | 1 | 1
483s related to changes

I think it’s fair to say the decreases are a result of the pandemic and the reduced number of inspections.
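If you want to build this kind of table yourself, here is a minimal Python sketch of tallying change-related citations from the FDA’s published inspection citation data. The file name, the column names ("Reference Number", "Short Description"), and the assumption that each row is one citation on one 483 are placeholders; adjust them to match the headers in the actual export you download.

```python
import pandas as pd

# Hypothetical file name and column names; change them to match the
# FDA inspection citation export you are working from.
CITATIONS_CSV = "fy2020_inspection_citations.csv"
CHANGE_REFS = ["21 CFR 211.100(a)", "21 CFR 211.160(a)", "21 CFR 212.20(c)"]

citations = pd.read_csv(CITATIONS_CSV)

# Assuming each row is one citation on one 483, counting rows per
# reference number gives the yearly frequency for that observation.
change_related = citations[citations["Reference Number"].isin(CHANGE_REFS)]
frequency = (
    change_related
    .groupby(["Reference Number", "Short Description"])
    .size()
    .rename("2020 Frequency")
    .reset_index()
    .sort_values("2020 Frequency", ascending=False)
)
print(frequency.to_string(index=False))
```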

Over on the device side of things we see:

Reference Number | Short Description | Long Description | Frequency
21 CFR 820.30(i) | Design changes – Lack of or Inadequate Procedures | Procedures for design change have not been [adequately] established. Specifically, *** | 26
21 CFR 820.40(b) | Document change records, maintained | Records of changes to documents were not [adequately] maintained. Specifically, *** | 6
21 CFR 820.70(b) | Production and Process Change Procedures, lack of or Inad. | Procedures for changes to a [specification] [method] [process] [procedure] have not been [adequately] established. Specifically, *** | 5
21 CFR 820.75(c) | Process changes – review, evaluation and revalidation | A validated process was not [reviewed and evaluated] [revalidated] when changes or process deviations occurred. Specifically, *** | 5
21 CFR 820.40(b) | Change records, content | Records of changes did not include [a description of the change] [identification of the affected documents] [the signature of the approving official(s)] [the approval date] [when the change became effective]. Specifically, *** | 3
21 CFR 820.50(b) | Supplier notification of changes | There is no agreement with [suppliers] [contractors] [consultants] to notify you of changes in the product or service. Specifically, *** | 3
21 CFR 820.75(c) | Documentation – review in response to changes or deviations | There is no documentation of the [review and evaluation of a process] [revalidation of a process] performed in response to changes or process deviations. Specifically, *** | 1
Device 483s around change

I’m a firm believer that pharma should always pay attention to the medical device side’s 483s. A lot of us make combination products now (or will), and there are always good trends to be aware of.

My key takeaways:

  1. Think change management and not just change control and document control
  2. Computer change controls need to be holistic and system-oriented
  3. Have a process that ensures changes are appropriately reviewed and approved (a minimal sketch of what such a record can look like follows this list)
  4. Be risk-based, and evaluate validation needs as part of every change
  5. A robust supplier management program is critical; plan for change
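On takeaway 3, here is a minimal sketch of a change record with an explicit review-and-approval gate. The `ChangeRecord` and `Approval` classes, the role names, and the `is_approved()` check are hypothetical illustrations, not any particular eQMS data model.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

# Hypothetical roles that must sign off before a change can be implemented.
REQUIRED_APPROVERS = {"quality", "process owner"}

@dataclass
class Approval:
    role: str                    # e.g. "quality", "process owner"
    approver: str                # person granting the approval
    approved_on: Optional[date]  # None until the approval is granted

@dataclass
class ChangeRecord:
    change_id: str
    description: str
    classification: str                      # e.g. "minor", "major", "like-for-like"
    approvals: list[Approval] = field(default_factory=list)

    def is_approved(self) -> bool:
        """True only when every required role has a dated approval."""
        signed_roles = {a.role for a in self.approvals if a.approved_on is not None}
        return REQUIRED_APPROVERS.issubset(signed_roles)

# Usage: a change cannot move to implementation until is_approved() is True.
change = ChangeRecord(
    change_id="CC-2020-0042",
    description="Replace fill-line peristaltic pump model",
    classification="major",
    approvals=[Approval("process owner", "A. Rivera", date(2020, 11, 3))],
)
assert not change.is_approved()  # quality has not yet approved
```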

Here’s a more detailed checklist to help you evaluate your change system.

PIC/S on Change Review and Effectiveness

Starting from the end, let’s review some of the requirements in the new draft PIC/S guidance.

Prior to change closure

Requirement: Changes meet their intended objectives and pre-defined effectiveness criteria. Any deviations from those criteria are adequately assessed, accepted and managed/justified. Whenever possible, quantitative data are leveraged to objectively determine change effectiveness (e.g. statistical confidence and coverage).
Important Points: Clearly delineating what “effective” means, and by what date, is critical to generating the data (see the sketch below). CQV activities can tell you if the intended objective is met. Effectiveness reviews must be made up of:

  • Sufficient data points, as described in the implementation plan, gathered to a described timeline, before an assessment of the change is made.
  • The success criteria should be achieved. If not, the reasons why should be assessed, along with the mitigation steps to address them, including reverting to the previous operating state where appropriate. This may require the proposal of a subsequent change or amendment of the implementation plan to ensure success.
  • Data and knowledge gathered from implementation of the change should be shared with the development function and other locations, as appropriate, to ensure that learning can be applied to products under development or to similar products manufactured at the same or other locations.

Requirement: As part of the quality risk management activities, residual risks are assessed and managed to acceptable levels, and appropriate adaptations of procedures and controls are implemented.
Important Points: These are action items in the change control. As part of the closure activities, revise the risk assessment, clearly delineating risk assessment in two phases.

Requirement: Any unintended consequences or risks introduced as a result of changes are evaluated, documented, accepted and handled adequately, and are subject to a pre-defined monitoring timeframe.
Important Points: Leverage the deviation system.
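The guidance’s mention of statistical confidence and coverage points naturally at tolerance intervals. Below is a minimal sketch, assuming roughly normal data, of a one-sided 95% confidence / 95% coverage lower tolerance bound that could be compared against a lower specification limit when judging effectiveness; the data values and the limit are invented for illustration.

```python
import numpy as np
from scipy import stats

def lower_tolerance_bound(data, coverage=0.95, confidence=0.95):
    """One-sided lower tolerance bound for roughly normal data.

    Returns the value above which at least `coverage` of the population
    is expected to lie, with the stated `confidence`, using the
    non-central t method.
    """
    x = np.asarray(data, dtype=float)
    n = x.size
    mean, sd = x.mean(), x.std(ddof=1)
    z_p = stats.norm.ppf(coverage)
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return mean - k * sd

# Invented example: assay results (% label claim) from the first
# post-change lots, against a lower specification limit of 95.0.
post_change = [98.7, 99.1, 98.4, 99.0, 98.8, 98.6, 99.2, 98.9, 98.5, 99.0]
bound = lower_tolerance_bound(post_change)
print(f"95/95 lower tolerance bound: {bound:.2f}")  # compare to LSL of 95.0
```

If the bound sits comfortably above the specification limit, the pre-defined effectiveness criterion is met with the stated confidence and coverage; if not, fall back on the mitigation steps described above.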

Prior to or after change closure

Requirement: Any post-implementation actions needed (including those for deviations from pre-defined acceptance criteria and/or CAPAs) are identified and adequately completed.
Important Points: If you waterfall into a CAPA system, it is important to include effectiveness reviews tied to the change, and not just to the root cause.

Requirement: Relevant risk assessments are updated post-effectiveness assessments. New product/process knowledge resulting from those risk assessments is captured in the appropriate Quality and Operations documents (e.g. SOPs, reports, product control strategy documents, etc.).
Important Points: Risk management is not a once-and-done activity for change management.

Requirement: Changes are monitored via ongoing monitoring systems to ensure maintenance of a state of control, and lessons learned are captured and shared/communicated.
Important Points: Knowledge management is critical as part of the product lifecycle.

Lessons learned are critical.

Regulatory Focus on Change Management

November was an exciting month for change management!

ICH Q12 “Technical and Regulatory Considerations for Pharmaceutical Product Lifecycle Management” was adopted by the ICH in Singapore, which means Q12 is now in Stage 5, Implementation. Implementation should be interesting, as concepts like “established conditions” and “product lifecycle management”, which sit at the core of Q12, are still open for interpretation as Q12 is implemented in specific regulatory markets.

And then, to end the month, PIC/S published draft 1 of PI 054-1 “Recommendation on How to Evaluate / Demonstrate the Effectiveness of a Pharmaceutical Quality System in relation to Risk-based Change Management.”

This draft guidance is now in a review period by regulatory agencies, which means no public comments. It will, however, be applied on a 6-month trial basis by PIC/S participating authorities, which include the US Food and Drug Administration and other regulators across Europe, Australia, Canada, South Africa, Turkey, Iran, Argentina and more.

This document is aligned to ICH Q10, and there should be few surprises in it. Given PIC/S’s concern that “ongoing continual improvement has probably not been realised to a meaningful extent. The PIC/S QRM Expert Circle, being well-placed to focus on the QRM concepts of the GMPs and of ICH Q10, is seeking to train GMP inspectors on what a good risk-based change management system can look like within the PQS, and how to assess the level of effectiveness of the PQS in this area”, it is a good idea to start aligning now and get ahead of the curve.

“Changes typically have an impact assessment performed within the change control system. However, an impact assessment is often not as comprehensive as a risk assessment for the proposed change.”

This is a critical thing that agencies have been discussing for years. There are a few key takeaways.

  1. The difference between impact and risk is critical. Impact is best thought of as “What do I need to do to make the change.” Risk is “What could go wrong in making this change?” Impact focuses on assessing the impact of the proposed change on various things such as on current documentation, equipment cleaning processes, equipment qualification, process validation, training, etc. While these things are very important to assess, asking the question about what might go wrong is also important as it is an opportunity for companies to try to prevent problems that might be associated with the proposed change after its implementation.
  2. This 8-page document really focuses on the absence of clear links between risk assessments, proposed control strategies and the design of validation protocols.
  3. The guidance is very concerned about appropriately classifying changes and using product data to drive decisions. While not saying it in so many words, one of the first things that popped to my mind was how we designate changes as like-for-like in the absence of supporting data. Changes assigned a like-for-like classification are often not risk-assessed and are given limited oversight from a GMP perspective. These can sometimes result in major problems for companies, and it is a classification I think people are way too quick to rush to.

Much of my thoughts on implementing this can be found in my presentation on change management and change control.

It is fascinating to look at appendix 1, which really lays out some critical goals of this draft guidance: better risk management, real time release, and innovative approaches to process validation. This is sort of the journey we are all on.

ASQ Audit Conference – Day 1 Afternoon

I presented on change management and then I spent the afternoon focusing more on ASQ member leader stuff. So not much to report on sessions.

My session, Lessons on Change Management, went well. I probably should have cut the slides way back instead of re-purposing slides from a longer presentation, but I think I hit a lot of key points, and hopefully it was valuable for folks.

I ended up working the FDC Division table after that, so I skipped the final session of the day. Probably for the best; after presenting it’s always hard for me to focus for a little while.

Tomorrow is a full day, and I present on data integrity.

Lessons Learned and Change Management

One of the hallmarks of a quality culture is learning from our past experiences, to eliminate repeat mistakes and to reproduce success. The more times you do an activity, the more you learn, and the better you get (within limits for simple activities).  Knowledge management is an enabler of quality systems, in part, to focus on learning and thus accelerate learning across the organization as a whole, and not just one person or a team.

This is where the “lessons learned” process comes in. There are a lot of definitions of lessons learned out there, but the definition I keep returning to is that a lesson learned is a change in personal or organizational behavior resulting from learning from experience. Ideally, this is a permanent, institutionalized change, and this is often where our quality systems can really drive continuous improvement.

Lessons learned flow: activity → lessons identified → updated processes

Part of Knowledge Management

The lessons learned process is an application of knowledge management.

Lessons identified covers generate, assess, and share.

Updated processes (and documents) covers contextualize, apply, and update.

Lessons Learned in the Context of Knowledge Management

Identify Lessons Learned

Identifying lessons needs to be done regularly; the closer to the actual change management and control activities, the better. The formality of this exercise depends on the scale of the change. There are basically a few major forms:

  • After action reviews: held daily (or other regular cycle) for high intensity learning. Tends to be very focused on questions of the day.
  • Retrospective: Held at specific periods (for example, project gates or change control status changes). Tends to have a specific focus on a single project.
  • Consistency discussions: Held periodically among a community of practice, such as quality reviewers or multiple site process owners. This form looks holistically at all changes over a period of time (weekly, monthly, quarterly). Very effective when linked to a set of leading and lagging indicators.
  • Incident and events: Deviations happen. Make sure you learn the lessons and implement solutions.

The chosen formality should be based on the level of change. A healthy organization will be utilizing all of these.

Level of Change | Form of Lesson Learned
Transactional | Consistency discussion; after action (when things go wrong)
Organizational | Retrospective; after action (weekly, daily as needed)
Transformational | Retrospective; after action (daily)

Successful lessons learned:

  • Are based on solid performance data: Based on facts and the analysis of facts.
  • Look at positive and negative experiences.
  • Refer back to the change management process, objectives of the change, and other success criteria
  • Separate experience from opinion as much as possible. A lesson arises from actual experience and is an objective reflection on the results.
  • Generate distinct lessons from which others can learn and take action. A good action avoids generalities.

In practice there are a lot of similarities between the techniques for facilitating a good lessons learned session and a root cause analysis. Start with a good core of questions, beginning with the what:

  • What were some of the key issues?
  • What were the success factors?
  • What worked well?
  • What did not work well?
  • What were the challenges and pitfalls?
  • What would you approach differently if you ever did this again?

From these what questions, we can continue to narrow in on the learnings by asking why and how questions. Ask open questions, and utilize all the techniques of root cause analysis here.

Then, once you are at (or close to) a defined issue for the learning (a root cause), ask a future-tense question to make it actionable, such as:

  • What would your advice be for someone doing this in the future?
  • What would you do next time?

Press for specifics. If it is not actionable, it is not really a learning.

Update the Process

Learning implies memory, and an organization’s memories usually require procedures, job aids and other tools to be updated and created. In short, lessons should evolve your process. This is often the responsibility of the change management process owner. You need to make sure the lesson actually takes hold.

Differences between effectiveness reviews and lessons learned

There are three things to answer in every change:

  1. Was the change effective – did it meet the intended purpose?
  2. Did the change have any unexpected effects?
  3. What can we learn from this change for the next change?

Effectiveness reviews answer 1 and 2 (using a risk-based approach), while lessons learned answer 3. Lessons learned contribute to the health of the system and drive continuous improvement in how we make changes.

Citations

  • Lesson learned management model for solving incidents. (2017). 2017 12th Iberian Conference on Information Systems and Technologies (CISTI), 1.
  • Fowlin, J., & Cennamo, K. (2017). Approaching Knowledge Management Through the Lens of the Knowledge Life Cycle: A Case Study Investigation. TechTrends: Linking Research & Practice to Improve Learning, 61(1), 55–64.
  • Michell, V., & McKenzie, J. (2017). Lessons learned: Structuring knowledge codification and abstraction to provide meaningful information for learning. VINE: The Journal of Information & Knowledge Management Systems, 47(3), 411–428.
  • Milton, N. J. (2010). The Lessons Learned Handbook: Practical Approaches to Learning From Experience. Burlington: Chandos Publishing.
  • Carlile, P. R. (2004). Transferring, Translating, and Transforming: An Integrative Framework for Managing Knowledge across Boundaries. Organization Science, 15(5), 555.
  • Secchi, P. (Ed.) (1999). Proceedings of Alerts and Lessons Learned: An Effective Way to Prevent Failures and Problems. Technical Report WPP-167. Noordwijk, The Netherlands: ESTEC.