Risk, Hazard and Harm

Risk Is….

The combination of the probability of occurrence of harm and the severity of that harm.

The effect of uncertainty on objectives

Often characterized by reference to potential events and consequences, or a combination of these

Often expressed in terms of a combination of the consequences of an event (including changes in circumstances) and the associated likelihood of occurrence

 

Hazard, harm and risk

| Hazard | Harm | Risk |
| --- | --- | --- |
| Enabling state that leads to the possibility of harm | Injury or damage | Probability of harm from a situation triggered by the hazard |

A hazard is defined in ISO 12100 as “The potential source of harm.” This definition is carried through other ISO standards and regulatory guidance. The hazard is what could go wrong, our “What If…”; it is where we start engaging the outcome identification loop to query uncertainty about the future.

Harms are the injuries or damages we should care about.

Every risk assessment is really asking “What could go wrong,” and then answering two questions:

  1. If it did go wrong, how bad would it be – the Harm
  2. How likely is it to go wrong – the Probability

Risk is then the combination of those things as a magnitude or priority.
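As a minimal sketch of that combination, severity and likelihood can be mapped to a risk level through a simple lookup. The category names and band assignments below are illustrative choices for the example, not taken from any particular standard.

```python
# Minimal sketch: risk as a combination of severity and likelihood.
# Category names and band assignments are illustrative only.

SEVERITY = ["negligible", "minor", "major", "critical"]   # how bad the harm would be
LIKELIHOOD = ["rare", "occasional", "frequent"]            # how likely it is to occur

# One illustrative assignment of risk levels: rows = severity, columns = likelihood.
RISK_MATRIX = [
    ["low",    "low",    "medium"],   # negligible
    ["low",    "medium", "medium"],   # minor
    ["medium", "high",   "high"],     # major
    ["high",   "high",   "high"],     # critical
]

def risk_level(severity: str, likelihood: str) -> str:
    """Combine severity and likelihood into a qualitative risk level."""
    return RISK_MATRIX[SEVERITY.index(severity)][LIKELIHOOD.index(likelihood)]

print(risk_level("major", "occasional"))  # -> high
```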

Risk assessment tools break down into two major camps: those that start with the hazards, asking how something can fail, and those that start with the harms, asking what bad things we want to avoid.

Qualitative Risk Analysis

Risk can be associated with a number of different types of consequences, impacting different objectives. The types of consequences to be analyzed are decided when planning the assessment. The context statement is checked to ensure that the consequences to be analyzed align with the purpose of the assessment and the decisions to be made. This can be revisited during the assessment as more is learned.

Methods used in analyzing risks can be qualitative, semiquantitative, or quantitative. The choice depends on the intended use, the availability of reliable data, and the decision-making needs of the organization. In ICH Q9 this choice is also framed as the level of formality.


Qualitative assessments define consequence (or severity), likelihood, and level of risk by significance levels, such as “high,” “medium,” or “low.” They work best when supporting analyses that have a narrow application or that sit within another quality system, such as change control.

Qualitative

Below is a good way to break down consequences and likelihood for a less formal assessment.

Consequence is rated by severity (0-5) across four categories: People, Assets, Requirements, and Ability to Meet Regulations.

| Severity | People | Assets | Requirements | Ability to Meet Regulations |
| --- | --- | --- | --- | --- |
| 0 | No Injury | No Damage | No Effect | No Impact |
| 1 | Slight Injury | Slight Damage | Slight Effect | Slight Impact |
| 2 | Minor Injury | Minor Damage | Limited Effect | Limited Impact |
| 3 | Major Injury | Localized Damage | Localized Effect | Considerable Impact |
| 4 | 1-3 Fatalities | Major Damage | Major Effect | National Impact |
| 5 | Multiple Fatalities | Extensive Damage | Massive Effect | International Impact |

Increasing likelihood is rated from A to D:

A. Never Heard of in Industry
B. Has Occurred in Industry
C. Occurs Several Times Per Year in Company
D. Occurs Several Times Per Year at Location

The combinations of consequence and likelihood map to three action bands: Manage for Continuous Improvement, Incorporate Risk-Reduction Measures, and Intolerable – Immediate Corrective Action.
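As a rough sketch, the scales above can be captured as data so a less formal assessment can reference them consistently. The dictionary layout, the shortened key name `Regulations`, and the `describe()` helper are illustrative choices, not part of the matrix itself.

```python
# Rough sketch: the qualitative consequence and likelihood scales above as data.
# Layout and helper names are illustrative only.

SEVERITY_DESCRIPTORS = {
    0: {"People": "No Injury", "Assets": "No Damage", "Requirements": "No Effect", "Regulations": "No Impact"},
    1: {"People": "Slight Injury", "Assets": "Slight Damage", "Requirements": "Slight Effect", "Regulations": "Slight Impact"},
    2: {"People": "Minor Injury", "Assets": "Minor Damage", "Requirements": "Limited Effect", "Regulations": "Limited Impact"},
    3: {"People": "Major Injury", "Assets": "Localized Damage", "Requirements": "Localized Effect", "Regulations": "Considerable Impact"},
    4: {"People": "1-3 Fatalities", "Assets": "Major Damage", "Requirements": "Major Effect", "Regulations": "National Impact"},
    5: {"People": "Multiple Fatalities", "Assets": "Extensive Damage", "Requirements": "Massive Effect", "Regulations": "International Impact"},
}

LIKELIHOOD = {
    "A": "Never Heard of in Industry",
    "B": "Has Occurred in Industry",
    "C": "Occurs Several Times Per Year in Company",
    "D": "Occurs Several Times Per Year at Location",
}

def describe(severity: int, likelihood: str, category: str = "People") -> str:
    """Turn a (severity, likelihood) pair into a plain-language statement."""
    return (f"Severity {severity} ({SEVERITY_DESCRIPTORS[severity][category]}), "
            f"likelihood {likelihood} ({LIKELIHOOD[likelihood]})")

print(describe(3, "C"))  # Severity 3 (Major Injury), likelihood C (Occurs Several Times Per Year in Company)
```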

 

Building the Risk Team

Good risk assessments are a team effort. Done right, this is a key way to reduce subjectivity, and it recognizes that none of us knows everything.

An effective risk team:

One of the core jobs of a process owner in risk assessment is assembling this team and ensuring it has the space to do its job. The process owner is often called the champion or sponsor for good reason.

It is important to keep in mind that membership of this team will change, gaining and losing members and bringing on people for specific subsections, depending on the scale and scope of the risk assessment.

The more complex the scope and the more involved the assessment tool, the more important it is to have a facilitator to drive the process. This allows someone to focus on the process of the risk assessment, and the reduction of subjectivity.

Success/Failure Space, or Why We Can Sometimes Seem Pessimistic

When evaluating a system we can look at it in two ways. We can identify ways a thing can fail or the various ways it can succeed.

Success/Failure Space

These are really just two sides of the same coin, with identifiable points in success space coinciding with analogous points in failure space. “Maximum anticipated success” in success space coincides with “minimum anticipated failure” in failure space.

Like everything, how we frame the question helps us find answers. Certain questions require us to think in terms of failure space, others in success. There are advantages in both, but in risk management, the failure space is incredibly valuable.

It is generally easier to attain concurrence on what constitutes failure than it is to agree on what constitutes success. We may desire a house that has great windows, high ceilings, and a nice yard. However, the one we buy can have a termite-infested foundation, bad electrical work, and a roof full of leaks. Whether the house is great is a matter of opinion, but we certainly know it is a failure based on the high repair bills we are going to accrue.

Success tends to be associated with the efficiency of a system, the amount of output, the degree of usefulness. These characteristics are described by continuous variables, which are not easily modeled in terms of the simple discrete events, such as “water is not hot,” that characterize the failure space. Failure, in particular complete failure, is generally easy to define, whereas the event of success may be more difficult to tie down.

Theoretically, the number of ways in which a system can fail and the number of ways in which it can succeed are both infinite. From a practical standpoint, though, there are generally more ways to succeed than to fail: the population of the failure space is smaller than the population of the success space. This leads risk management to focus on the failure space.

The failure space maps really well to nominal scales for severity, which can be helpful as you build your own scales for risk assessments.

For example, let’s look at a morning commute.

Example of the failure space for a morning commute

Evaluating Controls as Part of Risk Management

When I teach an introductory risk management class, I usually use an icebreaker: “What is the riskiest activity you can think of doing?” Inevitably you will get some version of skydiving, swimming with sharks, or jumping off bridges. This icebreaker is great because it starts the conversation around likelihood and severity. At heart, the question brings out the concept of risk important activities and the nature of controls.

The things people think of, such as skydiving, are great examples of activities that are surrounded by activities that control risk. The activity itself is based on reducing risk as low as possible, accepting what remains, and then proceeding along the safest possible pathway. These risk important activities are the mechanisms, applied just before a critical step, that:

  1. Ensure the appropriate transfer of information and skill
  2. Ensure the appropriate number of actions to reduce risk
  3. Influence the presence or effectiveness of barriers
  4. Influence the ability to maintain positive control of the moderation of hazards

Risk important activities are a concept central to safety thinking and sit at the center of a lot of human error reduction tools and practices. Risk important activities are all about thinking through the right set of controls, building them into the procedure, and successfully executing them before reaching the critical step of no return. Checklists are a great example of this mindset at work, but there are a ton of ways of doing this.

In hospitals, caregivers use a great thought process, the “Five Rights of Safe Medication Practices”: 1) right patient, 2) right drug, 3) right dose, 4) right route, and 5) right time. Next time you are getting medication in the doctor’s office or hospital, evaluate just what your caregiver is doing and how it fits into that process. Those are examples of risk important activities.

Assessing controls during risk assessment

Risk is affected by the overall effectiveness of any controls that are in place.

The key aspects of controls are:

  • the mechanism by which the controls are intended to modify risk
  • whether the controls are in place, are capable of operating as intended, and are achieving the expected results
  • whether there are shortcomings in the design of controls or the way they are applied
  • whether there are gaps in controls
  • whether controls function independently, or if they need to function collectively to be effective
  • whether there are factors, conditions, vulnerabilities or circumstances that can reduce or eliminate control effectiveness including common cause failures
  • whether controls themselves introduce additional risks.

A risk can have more than one control and controls can affect more than one risk.
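As one hedged sketch of how those aspects can be worked through in practice, the questions can be kept as a checklist that is walked for each risk/control pairing. The `review_control` helper and the example risk and control names below are hypothetical.

```python
# Illustrative sketch: the control aspects above as a reusable review checklist.
CONTROL_QUESTIONS = [
    "By what mechanism is the control intended to modify risk?",
    "Is the control in place, capable of operating as intended, and achieving the expected results?",
    "Are there shortcomings in the design of the control or the way it is applied?",
    "Are there gaps in the control?",
    "Does the control function independently, or must it function collectively with others to be effective?",
    "Are there factors, conditions, vulnerabilities or circumstances (including common cause failures) that can reduce or eliminate its effectiveness?",
    "Does the control itself introduce additional risks?",
]

def review_control(risk: str, control: str, answers: dict[str, str]) -> None:
    """Print a simple review record for one risk/control pair."""
    print(f"Risk: {risk} | Control: {control}")
    for question in CONTROL_QUESTIONS:
        print(f"  - {question}\n    {answers.get(question, 'NOT YET ASSESSED')}")

# A control can affect more than one risk, and a risk can have more than one control,
# so the same checklist gets walked for each pairing.
review_control(
    risk="Wrong material charged",
    control="Barcode verification at dispensing",
    answers={CONTROL_QUESTIONS[0]: "Reduces likelihood by detecting mismatches before charging."},
)
```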

We always want to distinguish between controls that change likelihood, consequences, or both, and controls that change how the burden of risk is shared between stakeholders.

Any assumptions made during risk analysis about the actual effect and reliability of controls should be validated where possible, with a particular emphasis on individual or combinations of controls that are assumed to have a substantial modifying effect. This should take into account information gained through routine monitoring and review of controls.

Risk Important Activities, Critical Steps and Process

Critical steps are the way we meet our critical-to-quality requirements: the activities that ensure our product or service meets the needs of the organization.

These critical steps are points of no return, the points where the work product is transformed into something else. Risk important activities are what we do to remove the danger from executing that critical step.

Beyond that critical step, you have rejection or rework. When I am cooking, there is a lot of prep work, which can include a mixture of critical steps from which there is no return. If I break an egg wrong and get eggshells in my batter, a degree of rework is necessary. This is true for all our processes.

The risk-based approach to the process is to understand the critical steps and put mitigating controls in place.

We are thinking through the following:

  • Critical Step: The action that triggers irreversibility. Think in terms of critical-to-quality attributes.
  • Input: What came before in the process
  • Output: The desired result (positive) or the possible difficulty (negative)
  • Preconditions: Technical conditions that must exist before the critical step
  • Resources: What is needed for the critical step to be completed
  • Local factors: Things that could influence the critical step. When human beings are involved, this is usually what can influence the performer’s thinking and actions before and during the critical step
  • Defenses: Controls, barriers and safeguards
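As a rough sketch of how this thinking can be captured, here is a simple record using the eggshell example from earlier. The `CriticalStep` structure and its field names are illustrative, not a prescribed format.

```python
# Rough sketch: the anatomy of a critical step as a record; field names mirror
# the bullets above and are illustrative rather than a prescribed format.
from dataclasses import dataclass, field

@dataclass
class CriticalStep:
    action: str                                              # the action that triggers irreversibility
    inputs: list[str] = field(default_factory=list)          # what came before in the process
    desired_output: str = ""                                  # the result we want
    possible_difficulty: str = ""                              # the negative outcome we want to avoid
    preconditions: list[str] = field(default_factory=list)    # technical conditions that must exist first
    resources: list[str] = field(default_factory=list)        # what is needed to complete the step
    local_factors: list[str] = field(default_factory=list)    # what can influence the performer before and during the step
    defenses: list[str] = field(default_factory=list)         # controls, barriers and safeguards

step = CriticalStep(
    action="Add the eggs to the batter",
    inputs=["Crack eggs"],
    desired_output="Smooth batter, no shells",
    possible_difficulty="Eggshell fragments in the batter require rework",
    preconditions=["Dry ingredients already mixed"],
    resources=["Fresh eggs", "Separate bowl"],
    local_factors=["Rushing", "Distraction"],
    defenses=["Crack eggs into a separate bowl first (checklist step)"],
)
print(step.action, "->", step.desired_output)
```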

Risk Management Mindset

Good risk management requires a mindset that includes the following attributes:

  • Expect to be surprised: Our processes are usually underspecified and there is a lot of hidden knowledge. Risk management serves to interrogate the unknowns.
  • Possess a chronic sense of unease: There is no such thing as a perfect process, procedure, training, design, or plan. Past performance is not a guarantee of future success.
  • Bend, not break: Everything is dynamic, especially risk. Quality comes from adaptability.
  • Learn: Learn from what goes well and from mistakes; build a learning culture.
  • Embrace humility: No one knows everything; bring in those who know what you do not.
  • Acknowledge differences between work-as-imagined and work-as-done: Work to reduce the differences.
  • Value collaboration: Seek diversity of input.
  • Drive out subjectivity: Understand how opinions are formed and decisions are made.
  • Systems thinking: Performance emerges from complex, interconnected and interdependent systems and their components.

The Role of Monitoring

One cannot control risk, or even successfully identify it, unless a system is able to flexibly monitor both its own performance (what happens inside the system’s boundary) and what happens in the environment (outside the system’s boundary). Monitoring improves the ability to cope with possible risks.

When performing the risk assessment, challenge existing monitoring and ensure that the right indicators are in place. But remember, monitoring itself is a low-effectiveness control.

Ensure that there are leading indicators, which can be used as valid precursors for changes and events that are about to happen.

For each monitoring control, ask yourself the following:

| Aspect | Questions |
| --- | --- |
| Indicator | How have the indicators been defined? (By analysis, by tradition, by industry consensus, by the regulator, by international standards, etc.) |
| Relevance | When was the list created? How often is it revised? On which basis is it revised? Who is responsible for maintaining the list? |
| Type | How many of the indicators are of the ‘leading’ type and how many are lagging? Do indicators refer to single or aggregated measurements? |
| Validity | How is the validity of an indicator established (regardless of whether it is leading or lagging)? Do indicators refer to an articulated process model, or just to ‘common sense’? |
| Delay | For lagging indicators, how long is the typical lag? Is it acceptable? |
| Measurement type | What is the nature of the measurements? Qualitative or quantitative? (If quantitative, what kind of scaling is used?) |
| Measurement frequency | How often are the measurements made? (Continuously, regularly, every now and then?) |
| Analysis | What is the delay between measurement and analysis/interpretation? How many of the measurements are directly meaningful and how many require analysis of some kind? How are the results communicated and used? |
| Stability | Are the measured effects transient or permanent? |
| Organization support | Is there a regular inspection scheme or schedule? Is it properly resourced? Where does this measurement fit into the management review? |

Key risk indicators come into play here.
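As an illustrative sketch only, the questions above can be captured per indicator so they are easy to revisit during management review. The `MonitoringIndicator` fields and the example indicators below are hypothetical.

```python
# Illustrative sketch: capturing the monitoring questions above per indicator.
from dataclasses import dataclass

@dataclass
class MonitoringIndicator:
    name: str
    defined_by: str          # analysis, tradition, industry consensus, regulator, standard...
    indicator_type: str      # "leading" or "lagging"
    measurement: str         # qualitative or quantitative (and scale, if quantitative)
    frequency: str           # continuous, regular, ad hoc
    lag: str = "n/a"         # for lagging indicators: typical delay and whether it is acceptable
    reviewed_in: str = ""    # where the measurement feeds management review

kris = [
    MonitoringIndicator(
        name="Deviation recurrence rate",
        defined_by="internal analysis",
        indicator_type="lagging",
        measurement="quantitative (rate per quarter)",
        frequency="monthly",
        lag="up to one quarter; acceptable for trend review",
        reviewed_in="quarterly management review",
    ),
    MonitoringIndicator(
        name="Overdue preventive maintenance",
        defined_by="industry consensus",
        indicator_type="leading",
        measurement="quantitative (count)",
        frequency="weekly",
        reviewed_in="site quality council",
    ),
]

leading = [k.name for k in kris if k.indicator_type == "leading"]
print("Leading indicators:", leading)
```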

Hierarchy of Controls

Not every control is the same. This principle applies both to evaluating current controls and to planning future controls.