ALCOA or ALCOA+

My colleague Michelle Eldridge recently shared this video on the differences between ALCOA and ALCOA+ from learnaboutgmp. It’s cute, it’s to the point, and it makes a nice primer.

As I’ve mentioned before, the MHRA, in its data integrity guidance, did take a dig at ALCOA+:

The guidance refers to the acronym ALCOA rather than ‘ALCOA +’. ALCOA being Attributable, Legible, Contemporaneous, Original, and Accurate and the ‘+’ referring to Complete, Consistent, Enduring, and Available. ALCOA was historically regarded as defining the attributes of data quality that are suitable for regulatory purposes. The ‘+’ has been subsequently added to emphasise the requirements. There is no difference in expectations regardless of which acronym is used since data governance measures should ensure that data is complete, consistent, enduring and available throughout the data lifecycle.

Two things should be drawn from this:

  1. Data integrity is a set of best practices that is still developing, so make sure you are pushing that development and not ignoring it. Much better to be pushing the boundaries of the “c” than to end up being surprised.
  2. I actually agree with the MHRA: complete, consistent, enduring, and available are really just subsets of the others. But, as they also say, the acronym means little; just make sure you are doing it.

Data integrity: it’s the new quality culture.

Likelihood of occurrence in risk estimation

People use imprecise words to describe the chance of events all the time — “It’s likely to rain,” or “There’s a real possibility they’ll launch before us,” or “It’s doubtful the nurses will strike.” Not only are such probabilistic terms subjective, but they also can have widely different interpretations. One person’s “pretty likely” is another’s “far from certain.” Our research shows just how broad these gaps in understanding can be and the types of problems that can flow from these differences in interpretation.

“If You Say Something Is ‘Likely,’ How Likely Do People Think It Is?” by Andrew Mauboussin and Michael J. Mauboussin

Risk estimation is based on two components:

  • The probability of the occurrence of harm
  • The consequences of that harm

A third element, detectability of the harm, is used in many tools.

Oftentimes we simplify the probability of occurrence into likelihood. The article quoted above is a good, simple primer on why we should be careful about that. It offers three recommendations that I want to talk about. Go read the article and then come back.

I. Use probabilities instead of words to avoid misinterpretation

Avoid simplified qualitative probability levels, such as “likely to happen”, “frequent”, “can happen, but not frequently”, “rare”, “remote”, and “unlikely to happen.” Instead, determine numeric probability levels. Even if you are relying heavily on expert opinion to drive probabilities, give ranges of numbers such as “<10% of the time”, “20–60% of the time”, and “greater than 60% of the time.”
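As a minimal sketch of what this looks like in practice, the numeric ranges above can be codified so everyone classifies against the same bands. The term names and cutoffs below are illustrative assumptions, not a standard scale.

```python
# Illustrative: qualitative terms backed by explicit probability bands.
# Terms and cutoffs are assumptions for demonstration only.
PROBABILITY_SCALE = {
    "rare":       (0.00, 0.10),  # "<10% of the time"
    "occasional": (0.20, 0.60),  # "20-60% of the time"
    "frequent":   (0.60, 1.00),  # "greater than 60% of the time"
}

def classify(probability: float) -> str:
    """Map a numeric probability onto the agreed scale."""
    for term, (low, high) in PROBABILITY_SCALE.items():
        if low <= probability <= high:
            return term
    return "unclassified"  # falls in a gap between bands

print(classify(0.05))  # rare
print(classify(0.45))  # occasional
print(classify(0.15))  # unclassified - forces a conversation about the gap
```

Making the bands explicit also exposes gaps and overlaps in the scale that words alone would hide.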

It helps to have several sets of scales.

The article has an awesome graph that is really telling about why we should avoid words.

[Figure: how people interpret probabilistic words, from Mauboussin and Mauboussin]

II. Use structured approaches to set probabilities

Ideally, pressure test these using a Delphi approach or something similar, like paired comparisons or absolute probability judgments. Using historical data and expert opinion, spend the time to make sure your probabilities actually capture reality.

Be aware that when using historical data, if there is a very low frequency of occurrence historically, any estimate of probability will be uncertain. In these cases it’s important to use predictive techniques and simulations. Monte Carlo, anyone?
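To make the Monte Carlo suggestion concrete, here is a small sketch that estimates the chance of at least one harm event in a year when the underlying per-batch rate is only known as a range. The batch count and rate bounds are assumptions for illustration, not real data.

```python
import random

random.seed(42)  # reproducible illustration

def simulate_year(batches_per_year: int = 250) -> bool:
    """One simulated year: does at least one failure occur?"""
    # Uncertain per-batch failure rate: we only "know" it lies somewhere
    # between 0.1% and 0.5% (an assumed range for this sketch).
    p_failure = random.uniform(0.001, 0.005)
    return any(random.random() < p_failure for _ in range(batches_per_year))

trials = 20_000
at_least_one = sum(simulate_year() for _ in range(trials))
print(f"Estimated P(>=1 failure per year): {at_least_one / trials:.2f}")
```

The point is not the specific numbers but that simulation turns a sparse history plus expert ranges into a defensible probability estimate.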

III. Seek feedback to improve your forecasting

Risk management is a lifecycle approach, and you need to apply good knowledge management to that lifecycle. Have a mechanism to learn from the risk assessments you conduct, and feed that back into your scales. These scales should never be once-and-done.
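One way to sketch that feedback loop is a simple Beta-Binomial update, where each review cycle’s observed outcomes refine the occurrence-rate estimate. The prior and the observation counts below are illustrative assumptions.

```python
# Sketch: feeding observed outcomes back into a likelihood estimate
# via a Beta-Binomial update. All numbers are illustrative.

def update_estimate(prior_alpha: float, prior_beta: float,
                    occurrences: int, opportunities: int) -> tuple:
    """Update a Beta(alpha, beta) belief about an occurrence rate."""
    alpha = prior_alpha + occurrences
    beta = prior_beta + (opportunities - occurrences)
    return alpha, beta

# Initial expert belief: roughly a 10% occurrence rate -> Beta(1, 9).
alpha, beta = 1.0, 9.0
# One review cycle later: 2 occurrences observed in 50 opportunities.
alpha, beta = update_estimate(alpha, beta, occurrences=2, opportunities=50)
print(f"Updated mean occurrence rate: {alpha / (alpha + beta):.3f}")
```

Each review cycle narrows the estimate, which is exactly the living-scale behavior the lifecycle approach calls for.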

In Conclusion

Risk management is not new. It’s been around long enough that many companies have the elements in place. What we need to be doing is driving toward consistency. Drive out the vague and build best practices that will give the best results. When it comes to likelihood, there is a wide body of research on the subject, and we should be drawing from it as we work to improve our risk management.

Move beyond setting your scales at the beginning of a risk assessment. Scales should exist as a living library that is drawn upon for specific risk evaluations. This will help ensure that all participants in the risk assessment have a working vocabulary of the criteria, and it will keep us honest, preventing any intentional or unintentional manipulation of the criteria based on an expected outcome.


Questions to ask when contemplating data integrity

Here are a set of questions that should be evaluated in any data integrity risk assessment/evaluation.

  1. Do you have a list of all GxP activities performed in your organization?
  2. Do you know which GxP activities involve intensive data handling tasks?
  3. Do you know the automation status of each GxP activity?
  4. Have you identified a list of GxP records that will be created by each GxP activity?
  5. Have you determined the format in which the official GxP records will be maintained?
  6. Have you determined if a signature is required for each GxP record?
  7. Do you have controls to ensure that observed, measured or processed GxP data is accurate?
  8. Do you have controls to ensure that GxP data is maintained in full without being omitted, discarded or deleted?
  9. Do you have controls to ensure that naming, measurement units, and value limits are defined and applied consistently during GxP data handling?
  10. Do you have controls to ensure that GxP data is recorded at the same time as the observation/measurement is made or shortly thereafter?
  11. Do you have controls to ensure that GxP data is recorded in a clear and human readable form?
  12. Do you have controls to ensure that data values represent the first recording of the GxP data or an exact copy of an original data?
  13. Do you have SOP(s) addressing management of GxP documents and records and good documentation practices?
  14. Do you have SOP(s) addressing the escalation of quality events that also cover data integrity breaches?
  15. Do you have SOP(s) addressing self-inspections/audits with provisions for data integrity?
  16. Do you have SOP(s) addressing management of third parties with provisions for the protection of data integrity?
  17. Do you have SOP(s) for Computerized Systems Compliance?
  18. Do you have SOP(s) for training and does it include training on data integrity for employees handling GxP data?
  19. For GxP activities that generate data essential for product quality, product supply or patient safety, do you have controls to prevent or minimize:
    • Process execution errors due to human inability, negligence or inadequate procedures?
    • Non-compliance due to unethical practices such as falsification?
  20. Do you have controls to ensure that only authorized employees are granted access to GxP data based on the requirements of their job role?
  21. Do you have controls to ensure that only the GxP activity owner or delegate can grant access to the GxP data?
  22. Do you have controls to eliminate or reduce audiovisual distractions for GxP activities with intensive data handling tasks?
  23. Do you assess the design and configuration of your computerized GxP activity to minimize manual interventions where possible?
  24. Do you have controls for review of audit trail data at relevant points in the process to support important GxP actions or decisions?
  25. Do you have controls, supervision or decision support aids to help employees who perform error-prone data handling activities?
  26. Do you have controls to ensure business continuity if a GxP record essential for product quality, product supply, or patient safety is not available? Both for when there is a temporary interruption to GxP activity or during a disaster scenario?
  27. Do you have a process for ensuring that data integrity requirements are included in the design and configuration of GxP facilities where data handling activities take place?
  28. Have you assessed the compliance status of computerized systems used to automate GxP activities?
  29. Do you have controls to prevent data capture and data handling errors during GxP data creation?
  30. Do you have controls to ensure the accuracy of date and time applied to GxP data, records and documents?
  31. Do you have controls to ensure that changes to GxP data are traceable to who did what, when and if relevant why during the lifecycle of the GxP data?
  32. Do you have controls to ensure that – when required – legally binding signatures can be applied to GxP records, and that their integrity is ensured during the retention period of the GxP record?
  33. Do you have controls to ensure that GxP computerized systems managing GxP data can:
    • Allow access only to employees with proper authorization?
    • Identify each authorized employee uniquely?
  34. Do you have controls to ensure that GxP data can be protected against accidental or willful harm?
  35. Do you have controls to keep GxP data in a human readable form for the duration of the retention period?
  36. Do you have controls to ensure that the process for offline retention and retrievals is fit for its intended purpose?
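As a rough sketch, a question set like this can be tracked as structured data so that open gaps surface automatically during the assessment. The abbreviated question texts and the yes/no answers below are hypothetical.

```python
# Illustrative: tracking a data integrity question set as data.
# Question texts are abbreviated; answers are invented for the example.
questions = {
    1: "List of all GxP activities?",
    7: "Controls to ensure GxP data is accurate?",
    13: "SOPs for GxP documents and good documentation practices?",
    28: "Compliance status of computerized systems assessed?",
}

answers = {1: True, 7: True, 13: False, 28: False}  # hypothetical assessment

gaps = [q for q in questions if not answers.get(q, False)]
print(f"Questions needing remediation: {gaps}")
```

Keeping the assessment as data (rather than prose) makes it easy to trend gaps across sites or repeat assessments over time.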

Barriers and root cause analysis

Barriers, or controls, are one of the (not-at-all) secret sauces of root cause analysis.

By understanding barriers, we can understand both why a problem happened and how it can be prevented in the future. Evaluating current process controls as part of root cause analysis helps determine whether all the barriers pertaining to the problem you are investigating were present, and whether they were actually effective.

At its simplest it is just a three-part brainstorm:

Barrier analysis:

  • Barriers that failed: The barrier was in place and operational at the time of the event, but it failed to prevent the event.
  • Barriers that were not used: The barrier was available, but workers chose not to use it.
  • Barriers that did not exist: The barrier did not exist at the time of the event. These are a source of potential corrective and preventive actions (depending on what they are).

The key to this brainstorming session is to try to find all of the failed, unused, or nonexistent barriers. Do not be concerned if you are not certain which category they belong in.
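As a sketch, the three-part brainstorm can be captured as structured data so every barrier lands in exactly one bucket, and the "did not exist" bucket becomes the CAPA candidate list. The barrier names below are hypothetical examples, not from a real investigation.

```python
from collections import defaultdict

# The three buckets from the barrier analysis brainstorm.
CATEGORIES = ("failed", "not_used", "did_not_exist")

# Hypothetical brainstorm output: (barrier, category) pairs.
brainstorm = [
    ("Second-person verification", "failed"),          # in place, missed the error
    ("Line clearance checklist", "not_used"),          # available, skipped
    ("Barcode check on components", "did_not_exist"),  # candidate CAPA
]

by_category = defaultdict(list)
for barrier, category in brainstorm:
    assert category in CATEGORIES, f"unknown category: {category}"
    by_category[category].append(barrier)

for category in CATEGORIES:
    print(f"{category}: {by_category[category]}")
```

Do not agonize over the initial categorization; the structure just makes it easy to revisit and re-sort barriers as the investigation matures.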

Most forms of barrier analysis look at two types, technical and administrative. My company breaks the administrative into human and organization, and I have to admit that breakdown has grown on me.

Technical – choose if a technical or engineering control exists. Examples:

  • Separation among manufacturing or packaging lines
  • Emergency power supply
  • Dedicated equipment
  • Barcoding
  • Keypad-controlled doors
  • Separated storage for components
  • Software that prevents a workflow from going further if a field is not completed
  • Redundant designs

Human – choose if the control relies on a human reviewer or operator. Examples:

  • Training and certifications
  • Use of checklists
  • Verification of a critical task by a second person

Organization – choose if the control involves a transfer of responsibility (for example, a document reviewed by both manufacturing and quality). Examples:

  • Clear procedures and policies
  • Adequate supervision
  • Adequate workload
  • Periodic process audits

These barriers are the same as the current controls in a risk assessment, a key input to a wide variety of risk assessment tools.

Change Control SIPOC

“Always start with a SIPOC” is a mantra many of us steeped in Six Sigma have heard a lot. There is real value in a good visual diagram that helps define a system or project. As this blog will be discussing change management and change control quite a bit, here is a SIPOC that governs change control.

[Figure: SIPOC diagram for change control]

This SIPOC represents change control from the perspective of a pharmaceutical manufacturing plant, but it will apply to many manufacturing industries, though the regulatory focus might shift.
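For readers who prefer data to diagrams, a SIPOC can also be kept as a simple structure alongside the visual. The entries below are generic illustrations of a change control SIPOC, not the contents of the diagram above.

```python
# Illustrative SIPOC for change control; all entries are generic examples.
sipoc = {
    "Suppliers": ["Initiating department", "Quality unit", "Regulatory affairs"],
    "Inputs": ["Change request", "Impact assessment", "Current procedures"],
    "Process": ["Initiate", "Assess impact", "Approve", "Implement", "Verify and close"],
    "Outputs": ["Approved and implemented change", "Updated documents", "Closure record"],
    "Customers": ["Manufacturing", "Quality", "Regulators", "Patients"],
}

for element, items in sipoc.items():
    print(f"{element}: {', '.join(items)}")
```

Keeping the SIPOC as data makes it easy to version alongside the procedure it describes, so the diagram and the process definition stay in sync.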