Training Program

A reader asks what the FDA's expectations are for the "training program," along with tips on changing the culture and attitude toward positive training behaviors.

I’ve answered the FDA part before in “Site Training Needs,” “Training Plan,” “CVs and JDs and Training Plans,” and “HR and Quality, joined at the hip” but I think it is a good idea to revisit the topic and look at some 483 observations.

Employees are not given training in the particular operations they perform as part of their function, current good manufacturing practices and written procedures required by current good manufacturing practice regulations.

Specifically, your firm does not have a written training program. There are no records to demonstrate (b)(6) is qualified to perform filling operations. I observed (b)(4) perform a filling operation for Wart Control Extra Strength, 4 ml, lot USWCXx-4059 on 01/18/23. I observed (b)(7)(C) perform reprocessing and packing operations. There are no records to demonstrate (b)(7)(C) is qualified to perform these operations.

Employees are not given training in current good manufacturing practices and written procedures required by current good manufacturing practice regulations.

Specifically, your firm lacked training documentation for cGMP, SOPs, or specific job functions for all employees that perform analytical testing on finished OTC drug products. Additionally, your firm lacks a written procedure on employee training.

I could list a bunch more, but they all pretty much say the same thing:

  1. Have a documented training program.
  2. Conduct training on the operations an individual performs AND on general GxP principles appropriate to their job.

A documented training program should set out:

  1. Job Descriptions
  2. Organizational Charts
  3. Curriculum Vitae/Resume
  4. Training Needs Identification
    • Individual Learning Plans (Training Assignments)
  5. Training Program Execution
    • Development and Management of Training Materials
    • Training Execution
      • New Hire Orientation
      • On-the-Job Training (OJT)
      • Continuous Training
  6. Qualified Trainers
  7. Training Records of Personnel
  8. Periodic Review of Training System Performance
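
To make these elements concrete, here is a minimal sketch of how job descriptions, curricula, and training records might link together so that an individual learning plan falls out of the data. All class and field names are hypothetical, a sketch rather than a reference implementation:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Course:
    course_id: str
    title: str
    gxp_topic: bool = False  # general GxP content vs. role-specific operations

@dataclass
class JobDescription:
    role: str
    required_courses: list[Course] = field(default_factory=list)

@dataclass
class TrainingRecord:
    course_id: str
    completed_on: date
    trainer: str

@dataclass
class Employee:
    name: str
    role: JobDescription
    records: list[TrainingRecord] = field(default_factory=list)

    def learning_plan(self) -> list[Course]:
        """Individual learning plan: required courses not yet completed."""
        done = {r.course_id for r in self.records}
        return [c for c in self.role.required_courses if c.course_id not in done]

# Usage: an operator's outstanding training assignments
gmp = Course("GMP-101", "Intro to cGMP", gxp_topic=True)
fill = Course("OPS-210", "Aseptic Filling OJT")
operator = JobDescription("Filling Operator", [gmp, fill])
alex = Employee("Alex", operator, [TrainingRecord("GMP-101", date(2024, 3, 1), "J. Smith")])
print([c.title for c in alex.learning_plan()])  # ['Aseptic Filling OJT']
```

The point of the sketch is the linkage: the job description drives the curriculum, the records prove completion, and the learning plan is derived rather than maintained by hand.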

Conducting training starts with having a training plan to identify what appropriate training looks like.

The question is always what level of training is adequate. The honest answer is: whatever works, and, in all likelihood, you aren't training and educating your personnel enough. This is one of those things where the proof is in the pudding. Build ways to measure the effectiveness of your training and you will be golden.
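
As a deliberately simple illustration of what "build ways to measure" could look like, here is a sketch that rolls training data up into a few effectiveness indicators for periodic review. The metric names and the 80-point pass threshold are assumptions to adapt, not a standard:

```python
def training_metrics(assignments, assessments, deviations):
    """assignments: list of dicts with a 'completed' bool
    assessments: first-attempt scores, 0-100
    deviations: count of human-error deviations this period"""
    completion = sum(a["completed"] for a in assignments) / len(assignments)
    pass_rate = sum(s >= 80 for s in assessments) / len(assessments)
    return {
        "on_time_completion": round(completion, 2),
        "first_attempt_pass_rate": round(pass_rate, 2),
        "human_error_deviations": deviations,  # trend this over time
    }

print(training_metrics(
    assignments=[{"completed": True}, {"completed": True}, {"completed": False}],
    assessments=[95, 72, 88, 81],
    deviations=2,
))
# {'on_time_completion': 0.67, 'first_attempt_pass_rate': 0.75, 'human_error_deviations': 2}
```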

The Difference Between Education and Training, and Its Impact on Procedures

When we solve problems in the wrong way, we end up creating bigger problems. One of the biggest of these stems from the differences between education and training and how we try to address education deficiencies (real or perceived) in the procedure.

  • Training: The primary goal of training is to develop specific skills and behaviors that improve performance and productivity in a particular job or task. It is practical and hands-on, focusing on applying knowledge to perform specific tasks effectively. For example, training might involve learning how to use a particular software or operate machinery.
  • Education: Education aims to provide a broader understanding of concepts, theories, and principles. It is more about acquiring knowledge and developing critical thinking, reasoning, and judgment. Education prepares individuals for future roles and helps them understand the broader context of their work.

For example, in writing a procedure on good documentation practices (GDocP), we might include a requirement to show the work on all calculations except simple ones. Knowledge of the broader principles of mathematics is education, and a simple calculation is a fundamental building block of mathematics. We now have two choices: we can proceduralize a definition and provide examples of simple calculations, or we can treat a basic understanding of mathematics as a prerequisite for doing the work, part of the core competencies.

This example may seem minor, but it quickly builds up. Every time we add an item that should be education to a procedure, we increase the difficulty of using and training on the document. Good documentation practices are a great example because we take some basic ALCOA+ concepts and then give possible permutations, many of which rely on education premises.

Train-the-Trainer

A firm requirement throughout the GxP regulations and the various ISO standards is that individuals are appropriately trained and qualified to do their work.

Inevitably, "appropriately trained and qualified" comes down to the trainer who is conducting the training. How do we train our trainers to ensure that individuals are learning and acquiring all of the skills, knowledge, and insight they need to perform their roles well?

Inadequate training is a consistent finding across the GxPs, so you have to ask: are we training our trainers to an appropriate level to make the training effective? Equally important, you are spending a lot of time and money training people, so you want it to be effective and worth the resources spent.

There are really two options for trainers: (1) trainers who become qualified to teach a course, and (2) SMEs who are qualified to be trainers. In either case, you need a qualification mechanism to ensure your trainers can train. I'll be honest: for technical material, I prefer SMEs being trained as trainers, as the experience is usually a whole lot better for the trainee.
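
One way to make that qualification mechanism real is a hard gate in whatever system records training sessions. A minimal sketch, assuming trainer qualifications are tracked per course; the registry shape and all names are hypothetical:

```python
from datetime import date

# Hypothetical registry: course_id -> {trainer: qualification expiry}
TRAINER_QUALIFICATIONS = {
    "OPS-210": {"J. Smith": date(2026, 1, 31)},
}

def log_training_session(course_id: str, trainer: str, session_date: date) -> None:
    """Refuse to record a session unless the trainer holds a current
    qualification for this specific course."""
    expiry = TRAINER_QUALIFICATIONS.get(course_id, {}).get(trainer)
    if expiry is None or session_date > expiry:
        raise ValueError(f"{trainer} is not currently qualified to deliver {course_id}")
    print(f"Recorded {course_id} delivered by {trainer} on {session_date}")

log_training_session("OPS-210", "J. Smith", date(2025, 6, 1))  # ok
# log_training_session("OPS-210", "A. Jones", date(2025, 6, 1))  # raises
```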

Train-the-trainer training focuses on being able to deliver informal and formal learning solutions in a manner that is both engaging and effective. Trainers should be able to:

  • Manage the learning environment.
  • Prepare for training delivery.
  • Convey objectives.
  • Align learning solutions with course objectives and learner needs.
  • Establish credibility as an instructor.
  • Create a positive learning climate.
  • Deliver various learning methodologies.
  • Facilitate learning.
  • Encourage participation and build learner motivation.
  • Deliver constructive feedback.
  • Ensure learning outcomes.
  • Evaluate solutions.

This qualification path will prove its value. Through it we can ensure that our training meets its four objectives and that participants can demonstrate:

  1. Awareness: Participant says, “I’ve heard that!”
  2. Understanding: Participant recognizes the subject matter and then explains it.
  3. Practice: Participant actually uses the learning on the job.
  4. Mastery: Participant can use the acquired knowledge to teach others.
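
Because these four objectives form an ordered scale, it can help to track them as explicit competency levels rather than a pass/fail flag. A minimal sketch; the enum mirrors the list above and everything else is illustrative:

```python
from enum import IntEnum

class Competency(IntEnum):
    AWARENESS = 1      # "I've heard that!"
    UNDERSTANDING = 2  # can explain the subject matter
    PRACTICE = 3       # applies the learning on the job
    MASTERY = 4        # can teach others

def can_be_trainer(level: Competency) -> bool:
    """Only people at mastery should be candidate trainers for a topic."""
    return level >= Competency.MASTERY

print(can_be_trainer(Competency.PRACTICE))  # False
print(can_be_trainer(Competency.MASTERY))   # True
```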

Being a trainer is critical for true subject matter expertise and process ownership.

And as an aside, notice I didn't include instructional design; this is where your training team can really add value!

Designing Level 2 Training Effectiveness Assessments

In the Kirkpatrick model, a Level 2 assessment measures how much individuals learned. It asks: did the learners actually learn what we wanted them to learn? Did we actually advance knowledge?
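
One common Level 2 design is a pre-test/post-test comparison. A minimal sketch using Hake's normalized gain; the 0-100 score scale and the sample numbers are just for illustration:

```python
def normalized_gain(pre: float, post: float) -> float:
    """Hake's normalized gain: the fraction of the possible improvement
    actually achieved between pre-test and post-test (0-100 scale)."""
    if pre >= 100:
        return 0.0  # nothing left to gain
    return (post - pre) / (100 - pre)

# Example: a learner moves from 40 to 85 on the same assessment
print(round(normalized_gain(pre=40, post=85), 2))  # 0.75
```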

For many of us, the old go-to is the multiple-choice quiz.

If we actually want to assess a learner’s ability to do something or think critically about a topic, a multiple-choice quiz isn’t going to work. This isn’t to say that a multiple-choice quiz can’t be challenging, but the focus of a multiple-choice quiz is on the learner’s understanding of the content, not on the learner’s knowledge of how to apply the content to a variety of different contexts.

Say we are designing a root cause analysis course. By the end of the course, your learners should be able to understand some core principles of root cause analysis so that they can perform better investigations, find root causes and determine appropriate CAPAs. While there may be some inherently wrong approaches to root cause analysis that could be assessed in a multiple-choice quiz, a skilled investigator will likely not be dealing with obvious “right” and “wrong” ways to identify causes. Most investigations require complex interactions with people. As such, there may be multiple decisions an investigator needs to make and, within the scope of a course, it could be really hard to identify what skills a budding investigator needs to develop through multiple-choice quizzes alone.

So, what kinds of assessments could you use beyond multiple-choice quizzes, and when should you use them? There's a lot of complexity to these choices, which ultimately need to align what you want people in the course to learn with how you think they can best demonstrate evidence of that learning.

Assessment Types

  • Multiple-Choice Quiz or Exam
    • When to use it: To assess a learner's understanding of a concept, definition, or specific process. Could also be used to assess responses or reactions to a scenario-based question if there are clear "right" or "wrong" responses.
    • Example: Understanding of core concepts of root cause analysis; simple branching choices, for example which tool to use when.
  • Open-Ended Questions
    • When to use it: To assess a learner's ability to interpret and apply a new idea. Could also be used to assess a learner's ability to describe an approach to a process or problem.
    • Example: Demonstrate knowledge of root cause analysis techniques through various practice exercises.
  • Long-Form Written Assignment
    • When to use it: To assess a learner's ability to make an argument, analyze a text or current event, or use outside evidence to inform a particular claim. Could also be used to assess a learner's understanding of how to produce a piece of writing specific to a particular field or discipline (for example, a lab report in a lab sciences context or a policy memo in a public policy context).
    • Example: Write an analysis and investigation report from a worked example.
  • Project
    • When to use it: To assess a learner's ability to make a new product and apply skills learned to build an independent work. Could also be used to assess a learner's understanding of how to create a field-specific artifact.
    • Example: Conduct a root cause analysis from an exercise; on-the-job training.
  • Portfolio
    • When to use it: To assess a learner's ability to grow, revise, and create a body of work over a particular period of time.
    • Example: Review of investigations on a periodic basis.
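
For the open-ended and long-form instruments above, a scoring rubric keeps Level 2 evidence objective. Here is a minimal sketch for the root cause analysis example; the criteria, weights, and pass threshold are all hypothetical:

```python
# Hypothetical rubric for an open-ended root cause analysis exercise.
# Each criterion is scored 0-3 by the assessor; weights are illustrative.
RUBRIC = {
    "problem_statement_is_specific": 2,
    "evidence_supports_causal_chain": 3,
    "root_cause_not_just_symptom": 3,
    "capa_addresses_root_cause": 2,
}

def score(ratings: dict[str, int], pass_fraction: float = 0.7):
    """Weighted score as a fraction of the maximum, plus a pass flag."""
    earned = sum(RUBRIC[c] * r for c, r in ratings.items())
    maximum = sum(w * 3 for w in RUBRIC.values())
    fraction = earned / maximum
    return round(fraction, 2), fraction >= pass_fraction

print(score({
    "problem_statement_is_specific": 3,
    "evidence_supports_causal_chain": 2,
    "root_cause_not_just_symptom": 2,
    "capa_addresses_root_cause": 3,
}))  # (0.8, True)
```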

A lot of learning experiences will implement a combination of these assessment types in a course, and it's likely that at different phases of your course and for different purposes you will need to select more than one assessment or evaluation method.

Remember that an assessment serves two additional purposes: it helps the learners recognize where they are in the course so that they have an understanding of their progress, and it helps you, as the facilitator, see what challenges and triumphs the learners are experiencing throughout the course.

Microfeedback for Adjusting Behaviors

Previously I’ve talked about defining the values and behavior associated with quality culture. Once you’ve established these behaviors, a key way to make them happen is through microfeedback, a skill each quality professional, supervisor, and leader in your organization should be trained on.

We are all familiar with the traditional feedback loop: you receive feedback, reflect on it, make a plan, and then take action. This means feedback is given after a series of actions has taken place. Feedback addresses a few key observations for future improvements. In situations where actions and sequences are complicated and interdependent, feedback can fail to provide useful insights to improve performance. Microfeedback can potentially be leveraged to prevent critical mistakes and mitigate risks, which makes it a great way to build culture and drive performance.

Microfeedback is a specific, just-in-time dose of information or insight that can reduce the gap between desired behavioral goals and reality. Think of it as a microscope used to evaluate an individual's comprehension and behavior and prescribe micro-interventions to adjust performance and prevent mistakes.

Microfeedback, provided during the activity being observed, is a fundamental aspect of the Gemba walk. These small tweaks can be adapted and utilized to provide timely insights and easy-to-accomplish learning objectives, driving clarity and keeping individuals motivated to modify their performance.

Where and when the microfeedback happens is key:

1. Task-based microfeedback focuses corrective or suggestive insights on the content of a task. For higher impact, focus microfeedback on correct actions rather than incorrect performance. For example, "Report this issue as an incident…"

2. Process-based microfeedback focuses on the learning processes and works best to foster critical thinking in a complex environment. For example, "This issue can be further processed based on the decision tree strategies we talked about earlier."

3. Self-regulation-based microfeedback focuses on giving suggestive or directive insights that help individuals better manage and regulate their own learning. For example, "Pause once you have completed the task and ask yourself a set of questions following the 5W2H formula."
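
To keep those three levels visible during Gemba walks, observations can be captured in a lightweight structured form so trends are reviewable later. A sketch; all field names are hypothetical:

```python
from dataclasses import dataclass
from datetime import datetime
from enum import Enum

class FeedbackLevel(Enum):
    TASK = "task"                        # the content of the task itself
    PROCESS = "process"                  # the learning/decision process
    SELF_REGULATION = "self-regulation"  # managing one's own learning

@dataclass
class Microfeedback:
    observed_behavior: str
    feedback_given: str
    level: FeedbackLevel
    positive: bool  # reinforce correct actions where possible
    timestamp: datetime

log = [
    Microfeedback(
        observed_behavior="Operator paused the line and flagged an anomaly",
        feedback_given="Report this issue as an incident right away",
        level=FeedbackLevel.TASK,
        positive=True,
        timestamp=datetime(2025, 5, 2, 10, 15),
    ),
]
# Periodic review: share of positive feedback given this period
print(sum(f.positive for f in log) / len(log))  # 1.0
```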

For microfeedback to be truly successful, it needs to sit within a training program where clear behavioral goals have been set. This training program should include a specific track for managers that allows them to provide microfeedback to close the gap between where the learner is and where the learner aims to be. That training will provide specific cues or reinforcement toward a well-understood task and focus on the levels of task, process, or self-regulation.

During change management, provide positive microfeedback on correct, rather than incorrect, performance. This can be very valuable as you think about the sustainability of the change.

Leveraged successfully by well-trained observers and peers, microfeedback will provide incremental and timely adjustments to drive behavior.