Layering metrics

We have these quality systems with lots of levers and interrelated components. And yet we select one or two metrics and realize that even if we meet them, we aren't really measuring the right stuff, nor are we driving continuous improvement.

One solution is to create layered metrics, which basically means drilling down into your process and identifying a metric at each step.

Lots of ways to do this. An easy way to start is to use the 5-why process, a tool most folks are comfortable with.

So for example, CAPA. It is pretty much agreed upon that CAPAs should be completed in a timely manner, which makes timely closure a top-level goal. Unfortunately, in this hypothetical example, we are falling short of a 100% on-time closure goal (or whatever target is appropriate in your organization based on maturity).

Why 1: Why was CAPA closure not 100%?
Because CAPA tasks were not closed on time.

Success factor needed for this step: CAPA tasks to be closed by due date.

Metric for this step: CAPA closure task success rate

Why 2: Why were CAPA tasks not closed on time?
Because individuals did not have appropriate time to complete CAPA tasks.

Metric for this step: Planned versus Actual time commitment

Why 3: Why did individuals not have appropriate time to complete CAPA tasks?
Because CAPA task due dates are guessed at.

Metric for this step: CAPA task adherence to target dates based on activity (e.g., if it takes 14 days to revise a document and another 14 days to train on it, the average document revision task should take 28 days; a small calculation sketch follows after this list)

Why 4: Why are CAPA task due dates guessed at?
Because appropriate project planning is not completed.

Metric for this step: Adherence to Process Confirmation

Why 5: Why is appropriate project planning not completed?
Because CAPAs are always determined on the last day the deviation is due.

Metric for this step: Adherence to Root Cause Analysis process
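
To make the arithmetic behind a couple of these layered metrics concrete, here is a minimal sketch. It assumes a simple CAPA task record with planned and actual dates and an activity-based target duration (like the 28-day document revision example above); the field names, activity labels, and example data are illustrative assumptions, not a prescribed data model.

```python
from dataclasses import dataclass
from datetime import date

# Activity-based target durations in days, per the illustration above:
# 14 days to revise a document plus 14 days to train on it = 28 days total.
TARGET_DAYS = {"document_revision": 28, "training_only": 14}

@dataclass
class CapaTask:
    # Hypothetical CAPA task record; field names are illustrative only.
    activity: str
    opened: date
    due: date
    closed: date | None

def on_time_closure_rate(tasks: list[CapaTask]) -> float:
    """Why 1 metric: share of tasks closed on or before their due date."""
    on_time = sum(1 for t in tasks if t.closed is not None and t.closed <= t.due)
    return on_time / len(tasks)

def target_date_adherence(tasks: list[CapaTask]) -> float:
    """Why 3 metric (one interpretation): share of tasks whose assigned due date
    fits the activity-based target duration."""
    adherent = sum(1 for t in tasks if (t.due - t.opened).days <= TARGET_DAYS.get(t.activity, 0))
    return adherent / len(tasks)

tasks = [
    CapaTask("document_revision", date(2018, 1, 2), date(2018, 1, 30), date(2018, 1, 28)),
    CapaTask("document_revision", date(2018, 1, 5), date(2018, 3, 1), None),
    CapaTask("training_only", date(2018, 2, 1), date(2018, 2, 15), date(2018, 2, 20)),
]
print(f"On-time closure rate: {on_time_closure_rate(tasks):.0%}")    # 33%
print(f"Target-date adherence: {target_date_adherence(tasks):.0%}")  # 67%
```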

I might report on the top-level CAPA closure rate and one or two of these, and keep the others in my process owner toolkit. Maybe we jump right to the last one as the thing we report on. It depends on what needs to be influenced in my organization, and it will change over time.

It helps to compare this output against the 12 system leverage points.

Donella Meadows' 12 System Leverage Points

These metrics go from leverage point 3, "goals of the system" (completing CAPA tasks effectively and on time), to 4, "self-organize," and 5, "rules of the system." The set also has nice feedback loops built in through the process confirmations. I'd view them as potentially pretty successful. Of course, we would test these, tinker, and basically experiment until we find the right set of metrics that improves our top-level goal.

Effective Organizations — Think Different

I recently had a bit of a wake-up call via Twitter. I asked the following question: "What's the one thing /above all/ that makes for an effective organisation?" My thanks to all those who took the time to reply with their viewpoint. The wake-up call for me was the variety of these responses. All […]

via Effectiveness — Think Different

Great thought-piece over on "Think Different" on effectiveness, with a nice tie-in to Donella Meadows' "Twelve Leverage Points to Intervene in a System."

In quality management systems, it is critical to look at effectiveness. If you do not measure, you do not know whether the system is working the way you expect and desire.

We often discuss lagging (output measurement) and leading (predictive) indicators, and this is a good place to start, but if we apply systems thinking and use Meadows' twelve leverage points, we can see that most metrics tend to sit around points 7 through 12, with the more effective levers being the least utilized.

I think there is a lot of value in finding metrics within these levers.

So for example, here are a few indicators of the effectiveness of lever 4, "The Power to Add, Change, Evolve, or Self-Organize System Structure":

Lagging: Effective CAPAs to the System; Deviation Reduction
Leading: Number of changes initiated by level of organization and scale of change
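
As a small illustration of how the leading indicator above might actually be counted, here is a sketch that tallies changes by the organizational level that initiated them and by the scale of the change. The record layout and the level/scale labels are assumptions for illustration only, not from any particular change control system.

```python
from collections import Counter

# Hypothetical change records: (initiating level of the organization, scale of the change).
changes = [
    ("line operator", "minor"),
    ("line operator", "minor"),
    ("supervisor", "moderate"),
    ("site quality", "major"),
    ("line operator", "moderate"),
]

# Leading indicator: number of changes initiated, broken out by level and scale.
for (level, scale), count in sorted(Counter(changes).items()):
    print(f"{level:>14} | {scale:<9} | {count}")
```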

 

Evolved Expendable Launch Vehicle (EELV) Quality Management

We determined that ULA, SpaceX, and AR were not performing adequate quality assurance management for the EELV program as evidenced by the 181 nonconformities to the AS9100C at the EELV contractor production facilities. This inadequate quality assurance management could increase costs, delay launch schedules, and increase the risk of mission failure.

From ”

Evaluation of the Evolved Expendable Launch Vehicle Program Quality Management System DODIG-2018-045, Department of Defense Office of Inspector General

It is useful to read audit reports and inspection findings from multiple industries. From these we can see trends, make connections, and learn.

I see a few things that stand out.

DOD findings

Risk Register

Our evaluation of the RIO database showed that 11 out of 26 risks related to either Atlas V or Delta IV launch vehicle were in “red” status, which indicates that risk mitigation was behind schedule.

It is not enough to identify risks (though that is a critical place to start), and it is not enough to just track them (though again, if you don't track it you don't see it). You actually have to have clear plans to mitigate and eliminate the risks, and that is where the program seemed to fall short.

Not a surprise. I think a lot of companies are having these difficulties. In the pharma world the regulatory agencies have been signaling pretty strongly that this is an issue.

Make sure you identify risks, track them, and have plans that are actually carried out to remediate them.
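
As a minimal sketch of what "track them and act on them" can look like, here is a hypothetical risk-register check that flags any open risk whose committed mitigation date has passed. The RiskItem fields and the example data are assumptions for illustration, not anything taken from the DoD report or from a particular tool.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskItem:
    # Hypothetical risk-register entry; field names are assumptions for illustration.
    risk_id: str
    description: str
    mitigation_due: date       # the date the mitigation plan commits to
    mitigation_complete: bool

def overdue_mitigations(register: list[RiskItem], today: date) -> list[RiskItem]:
    """Risks whose mitigation is incomplete and past its committed date ("red" status)."""
    return [r for r in register if not r.mitigation_complete and r.mitigation_due < today]

register = [
    RiskItem("R-001", "Single-source supplier for valve assembly", date(2018, 1, 15), False),
    RiskItem("R-002", "Aging test fixture needs requalification", date(2018, 3, 1), True),
    RiskItem("R-003", "Procedure and practice do not match", date(2017, 11, 30), False),
]
for risk in overdue_mitigations(register, today=date(2018, 2, 1)):
    print(f"RED: {risk.risk_id} - {risk.description}")
```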

Configuration Management

SpaceX failed to comply with AS9100C, section 7.1.3, which requires it to “establish, implement, and maintain a configuration management process.” Configuration management is a controlled process to establish the baseline configuration of a product and any changes to that product. This process should occur during the entire life cycle of a product to provide visibility and control of its physical, functional, and performance attributes.

First rule of reading inspection reports: Things probably went bad if a section starts with standard review 101 material.

That said, hello change management my dear friend.

This was the gist of my ASQ WCQI workshop last May: every industry needs good change management and change control.

Material Management

ULA and AR failed to comply with AS9100C, section 8.3, which requires them to “ensure that product which does not conform to product requirements are identified and controlled to prevent its unintended use or delivery.”

At ULA, we found 18 expired limited-life material items that were between 32 and 992 days past their expiration dates, but available for use on EELV flight hardware. This material should have been impounded and dispositioned. The use of expired limited-life items, such as glues and bonding agents, could result in product that does not meet specifications and may require costly rework.

I find it hard to believe that these companies aren’t tracking inventory. If they are tracking inventory and have any sort of cycle count process then the mechanism exists to ensure expired material is removed from the possibility of use. And yet we still see these observations across the pharma industry as well.
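
To make that mechanism concrete: if inventory records carry an expiration date for limited-life items, a cycle count (or a simple scheduled job) can flag anything expired and still in an available status so it gets impounded and dispositioned. This is only a sketch under the assumption of a simple record layout; the part numbers, field names, and statuses are illustrative.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class InventoryItem:
    # Hypothetical inventory record; field names and statuses are illustrative.
    part_number: str
    lot: str
    expiration: date
    status: str  # e.g. "AVAILABLE", "IMPOUNDED", "SCRAPPED"

def expired_but_available(items: list[InventoryItem], as_of: date) -> list[InventoryItem]:
    """Limited-life material past its expiration date yet still available for use."""
    return [i for i in items if i.expiration < as_of and i.status == "AVAILABLE"]

stock = [
    InventoryItem("ADH-113", "LOT-0042", date(2017, 6, 1), "AVAILABLE"),    # should be impounded
    InventoryItem("ADH-113", "LOT-0051", date(2018, 9, 1), "AVAILABLE"),
    InventoryItem("SEAL-77", "LOT-0007", date(2015, 10, 15), "IMPOUNDED"),
]
for item in expired_but_available(stock, as_of=date(2018, 2, 1)):
    days_over = (date(2018, 2, 1) - item.expiration).days
    print(f"Impound {item.part_number} lot {item.lot}: expired {days_over} days ago")
```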

Concluding Thoughts

Quality Management has at its core the same principles, no matter the industry. We use similar tools. Leverage the best practices out there. Read about the stresses other companies are having, learn from them, and remediate at your own organization.