The Difference Between Education and Training, and Its Impact on Procedures

When we solve problems in the wrong way, we end up creating bigger problems. One of the biggest of these stems from the differences between education and training and how we try to address education deficiencies (real or perceived) in our procedures.

  • Training: The primary goal of training is to develop specific skills and behaviors that improve performance and productivity in a particular job or task. It is practical and hands-on, focusing on applying knowledge to perform specific tasks effectively. For example, training might involve learning how to use a particular software application or how to operate machinery.
  • Education: Education aims to provide a broader understanding of concepts, theories, and principles. It is more about acquiring knowledge and developing critical thinking, reasoning, and judgment. Education prepares individuals for future roles and helps them understand the broader context of their work.

For example, in writing a procedure on good documentation practices (GDocP), we might include a requirement to show the work on all calculations except simple ones. Knowledge of the broader principles of mathematics is education, and a simple calculation is a fundamental building block of mathematics. We now have two choices: we can proceduralize a definition and provide examples of simple calculations, or we can make a basic understanding of mathematics a prerequisite for doing the work, part of the core competencies.
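
As a hypothetical illustration (the numbers here are invented): counting that 3 vials plus 2 vials gives 5 vials is a simple calculation and arguably needs no shown work. A percent recovery of (9.8 mg found ÷ 10.0 mg expected) × 100 = 98% is not; the record should capture the inputs, the formula, and the result so that a reviewer can reproduce the math.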

This example may seem minor, but these additions build up quickly. Every time we put material that should be education into a procedure, we increase the difficulty of using and training on the document. Good documentation practices are a great example: we take some basic ALCOA+ concepts and then spell out their possible permutations, many of which rest on educational premises.

Risk Management Is a Living Process

Living and ad hoc risk assessments

ISO 31000:2018 “Risk Management Guidelines” discusses ongoing monitoring and review of risk management activities. We see a similar requirement in ICH Q9(R1) for the pharmaceutical industry. Organizations can spend a lot of time performing risk assessments (hopefully effectively) and a lot of time mitigating risks (again, hopefully effectively), yet many struggle to maintain a lifecycle approach.

To manage the risk lifecycle appropriately, we should ensure three things:

  1. Planned review
  2. Continuous monitoring
  3. Incorporation into governance, improvement, and knowledge management activities

Reviews are a critical part of our risk management process framework.

This living risk management approach effectively drives work in Control Environment, Response, and Stress Testing.

At its heart lies the ongoing connection between risk management and knowledge management.
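
To make the planned-review element concrete, here is a minimal sketch of what a risk register entry with scheduled reviews could look like. The fields, ratings, and intervals are assumptions for illustration only; nothing here is prescribed by ISO 31000 or ICH Q9(R1).

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical review intervals by risk rating; set these per your own
# risk management procedure, as they are not prescribed by any standard.
REVIEW_INTERVALS = {
    "high": timedelta(days=180),
    "medium": timedelta(days=365),
    "low": timedelta(days=730),
}

@dataclass
class RiskRegisterEntry:
    risk_id: str
    description: str
    rating: str          # "high", "medium", or "low"
    last_reviewed: date

    def next_review_due(self) -> date:
        # Planned review: the next review is scheduled from the last one.
        return self.last_reviewed + REVIEW_INTERVALS[self.rating]

    def is_overdue(self, today: date) -> bool:
        # Continuous monitoring hook: flag entries whose review has lapsed.
        return today > self.next_review_due()

# Usage: a high-rated risk last reviewed in January is due again in July.
entry = RiskRegisterEntry("RSK-001", "Single-source supplier", "high",
                          date(2024, 1, 15))
print(entry.next_review_due())             # 2024-07-13
print(entry.is_overdue(date(2024, 8, 1)))  # True
```

The design point is that review dates live on the risk entry itself, so monitoring can flag lapses automatically rather than depending on someone remembering to look.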

Train-the-Trainer

A firm requirement throughout the GxP regulations and the various ISO standards is that individuals are appropriately trained and qualified to do their work.

Inevitably, “appropriately trained and qualified” comes down to the trainer conducting the training. How do we train our trainers to ensure that individuals learn and acquire all of the skills, knowledge, and insight they need to perform their roles well?

Inadequate training is a consistent finding across the GxPs, so we have to ask: are we training our trainers to a level that makes the training effective? Equally important, we spend a lot of time and money training people, so we want that training to be effective and worth the resources spent.

There are really two options for trainers: (1) trainers who become qualified to teach a specific course, and (2) subject matter experts (SMEs) who are qualified to be trainers. In either case, you need a qualification mechanism to ensure your trainers can train. I’ll be honest: for technical material, I prefer SMEs trained to be trainers, as the experience is usually a whole lot better for the trainee.

Trainer qualification focuses on the ability to deliver informal and formal learning solutions in a manner that is both engaging and effective. A qualified trainer should be able to:

  • Manage the learning environment.
  • Prepare for training delivery.
  • Convey objectives.
  • Align learning solutions with course objectives and learner needs.
  • Establish credibility as an instructor.
  • Create a positive learning climate.
  • Deliver various learning methodologies.
  • Facilitate learning.
  • Encourage participation and build learner motivation.
  • Deliver constructive feedback.
  • Ensure learning outcomes.
  • Evaluate solutions.

This qualification path will prove its value. Through it, we can ensure that our training meets its four objectives and that participants can demonstrate:

  1. Awareness: Participant says, “I’ve heard that!”
  2. Understanding: Participant recognizes the subject matter and then explains it.
  3. Practice: Participant actually uses the learning on the job.
  4. Mastery: Participant can use the acquired knowledge to teach others.

Being a trainer is critical for true subject matter expertise and process ownership.

And as an aside, notice I didn’t include instructional design; this is where your training team can really add value!

Designing Level 2 Training Effectiveness Assessments

In the Kirkpatrick model, a Level 2 assessment measures how much individuals learned. It asks: did the learners actually learn what we wanted them to learn? Did we actually advance knowledge?

For many of us, the old go-to is the multiple-choice quiz.

If we actually want to assess a learner’s ability to do something or think critically about a topic, a multiple-choice quiz isn’t going to work. This isn’t to say that a multiple-choice quiz can’t be challenging, but its focus is on the learner’s understanding of the content, not on the learner’s ability to apply that content across a variety of contexts.

Say we are designing a root cause analysis course. By the end of the course, your learners should understand some core principles of root cause analysis so that they can perform better investigations, find root causes, and determine appropriate CAPAs. While there may be some inherently wrong approaches to root cause analysis that could be assessed in a multiple-choice quiz, a skilled investigator will rarely face obvious “right” and “wrong” ways to identify causes. Most investigations require complex interactions with people. As such, there may be multiple decisions an investigator needs to make and, within the scope of a course, it can be very hard to assess the skills a budding investigator needs to develop through multiple-choice quizzes alone.

So, what kinds of assessments could you use beyond multiple-choice quizzes, and when should you use them? There’s a lot of complexity to these choices, which ultimately need to align what you want people in the course to learn with how you think they can best demonstrate evidence of that learning.

| Assessment Instrument | When to use it | Example |
| --- | --- | --- |
| Multiple-choice quiz or exam | To assess a learner’s understanding of a concept, definition, or specific process. Could also be used to assess responses to a scenario-based question if there are clear “right” or “wrong” responses. | Understanding of core concepts of root cause analysis; simple branching choices, for example which tool to use when. |
| Open-ended questions | To assess a learner’s ability to interpret and apply a new idea. Could also be used to assess a learner’s ability to describe an approach to a process or problem. | Demonstrate knowledge of root cause analysis techniques through various practice exercises. |
| Long-form written assignment | To assess a learner’s ability to make an argument, analyze a text or current event, or use outside evidence to inform a particular claim. Could also be used to assess a learner’s understanding of how to produce a piece of writing specific to a particular field or discipline (for example, a lab report in a lab sciences context or a policy memo in a public policy context). | Write up an analysis and investigation report from a worked example. |
| Project | To assess a learner’s ability to make a new product and apply skills learned to build an independent work. Could also be used to assess a learner’s understanding of how to create a field-specific artifact. | Conduct a root cause analysis from an exercise; on-the-job training. |
| Portfolio | To assess a learner’s ability to grow, revise, and create a body of work over a particular period of time. | Review of investigations on a periodic basis. |

Assessment Types

A lot of learning experiences will combine several of these assessment types in a course, and it’s likely that at different phases of your course, and for different purposes, you will need to select more than one assessment or evaluation method.

Remember that an assessment serves two additional purposes: it helps learners recognize where they are in the course so that they understand their progress, and it helps you, as the facilitator, see what challenges and triumphs the learners are experiencing throughout the course.

Build Your Knowledge Base

Engaging with knowledge, and with knowledge management, is a critical part of development. The ability to navigate the flood of available data to find accurate information is tied directly to individuals’ existing knowledge and their skill at distinguishing credible information from misleading content.

There is ample evidence that many individuals lack the ability to accurately judge their understanding or the quality and accuracy of their performance (i.e., calibration). To truly develop our knowledge, we need to engage in deliberate practice. But true calibration requires feedback, guidance, and coaching that we may not have ready access to within our organizations. This takes effort and the deliberate building of systems and processes.

Information can be found with little mental effort, but without critical analysis of its legitimacy or validity, that ease can actually work against the development of deeper-processing strategies. It is really easy to go online and get an answer, but unless learners put themselves in positions to struggle cognitively with an issue, and unless they have occasions to transform or reframe problems, their likelihood of progressing into competence is jeopardized.

The more learners forge principled knowledge in a professional domain, the greater their reported interest in and identity with that field. Therefore, without the active pursuit of knowledge, these individuals’ interest in professional development may wane and their progress toward expertise may stall. This is why I find professional societies so critical, and why I am always pushing people to step up.

My constant goal as a mentor is to help people do the following:

  • Refuse to be lulled into accepting a role as passive consumers of information, striving instead to be active producers of knowledge
  • Probe and critically analyze the information they encounter, rather than accepting quick, simple answers
  • Forge a meaningful interest in the profession and personal connections to members of professional communities, instead of relying on moment-by-moment stimulation and superficial relationships

If we are going to step up to the challenges ahead of us, to address the skill gaps we are seeing, we each need to be deliberate in how we develop and deliberate in how we build our organizations to support development.