Information Gaps

An information gap is a known unknown: a question that one is aware of but for which one is uncertain of the answer. It is a disparity between what the decision maker knows and what could be known. The attention paid to such an information gap depends on two key factors: salience and importance.

  • The salience of a question indicates the degree to which contextual factors in a situation highlight it. Salience might depend, for example, on whether there is an obvious counterfactual in which the question can be definitively answered.
  • The importance of a question is a measure of how much one's utility would depend on the actual answer. It is importance that is influenced by actions like gambling on the answer or taking on risk that the information gap would be relevant for assessing; a rough numerical sketch of this idea follows below.
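
As a rough illustration of importance (my own decision-theory sketch with made-up numbers, not part of the original framework), the value of closing an information gap can be framed as the expected gain in utility from deciding with the answer in hand versus deciding without it:

```python
# Hedged sketch: "importance" of an information gap framed as the expected
# value of closing it. All utilities and probabilities are illustrative.

# Unknown: will a supplier change degrade quality? (yes/no)
p_degrade = 0.3  # assumed probability the answer is "yes"

# Utility of each action under each possible answer (illustrative numbers only)
utility = {
    ("accept_change", "yes"): -100,  # accepted a change that hurts quality
    ("accept_change", "no"):    20,
    ("reject_change", "yes"):    0,
    ("reject_change", "no"):     0,
}

def expected_utility(action):
    """Expected utility of an action taken without knowing the answer."""
    return p_degrade * utility[(action, "yes")] + (1 - p_degrade) * utility[(action, "no")]

# Best we can do while the gap stays open:
best_without = max(expected_utility(a) for a in ("accept_change", "reject_change"))

# If we knew the answer, we would pick the best action for each case:
best_with = (p_degrade * max(utility[("accept_change", "yes")], utility[("reject_change", "yes")])
             + (1 - p_degrade) * max(utility[("accept_change", "no")], utility[("reject_change", "no")]))

importance = best_with - best_without  # how much utility hinges on the answer
print(f"Importance (expected value of closing the gap): {importance:.1f}")
```

The wider the swing between deciding blind and deciding informed, the more important the question; a question whose answer would not change the decision has low importance no matter how salient it is.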

Information gaps often dwell in the land of Knightian uncertainty.

Communicating these Known Unknowns

Communicating around known unknowns, and other forms of uncertainty, starts with understanding why they arise. A wide range of reasons for information gaps exist:

  • variability within a sampled population or repeated measures, leading to, for example, statistical margins of error
  • computational or systematic inadequacies of measurement
  • limited knowledge and ignorance about underlying processes
  • expert disagreement

Ambiguity

Ambiguity is present in virtually all real-life situations. It describes 'situations in which we do not have sufficient information to quantify the stochastic nature of the problem.' It is a lack of knowledge as to the 'basic rules of the game', where cause and effect are not understood and there is no precedent for making predictions as to what to expect.

Ambiguity is often used, especially in the context of VUCA, to cover situations that have:

  • Doubt about the nature of cause and effect
  • Little to no historical information to predict the outcome
  • Difficulty in forecasting or planning for the outcome

It is important to ask whether a lack of experience and predictability poses risks to the situation, and to interrogate our unknown unknowns.

People are ambiguity averse in that they prefer situations in which probabilities are perfectly known to situations in which they are unknown.

Ambiguity is best resolved by experimentation.

Review of Process/Procedure

Review of documents is a critical part of the document management lifecycle.

Document Lifecycle

As described in the post Process/Procedure Lifecycle, there are some fundamental stakeholders:

  • The Process Owner defines the process, including people, process steps, and technology, as well as the connections to other processes. They are accountable for change management, training, monitoring and control of the process and supporting procedure. The Process Owner owns the continuous improvement of the overall process.
  • Quality is ultimately responsible for the decisions made and for ensuring that they align, at a minimum, with all regulatory requirements and internal standards.
  • Functional Area Management represents the areas that have responsibilities in the process and has a vested interest or concern in the ongoing performance of a process. This can include stakeholders who are process owners in upstream or downstream processes.
  • A Subject Matter Expert (SME) is typically an expert on a narrow division of a process, such as a specific tool, system, or set of process steps. A process may have multiple subject matter experts associated with it, each with varying degrees of understanding of the over-arching process.

A Risk-Based Approach

The level of review of a new or revised process/procedure is guided by three fundamental risk questions (a rough scoring sketch follows the list):

  • What might go wrong with the associated process? (risk identification)
  • What is the likelihood that this will go wrong? (risk analysis)
  • What are the consequences? How severe are they if this goes wrong? (risk analysis)
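
As a minimal sketch, and not a mandated standard, the answers to the likelihood and severity questions can be combined in a simple risk matrix that suggests how rigorous the downstream review should be. The scales and thresholds here are illustrative assumptions:

```python
# Hedged sketch: combining likelihood and severity answers into a review rigor level.
# The scales, thresholds, and review combinations are illustrative assumptions.

LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
SEVERITY = {"minor": 1, "moderate": 2, "major": 3}

def review_rigor(likelihood: str, severity: str) -> str:
    score = LIKELIHOOD[likelihood] * SEVERITY[severity]  # classic L x S matrix
    if score >= 6:
        return "functional + expert + real-world challenge"
    if score >= 3:
        return "functional + expert review"
    return "functional review"

print(review_rigor("likely", "major"))   # -> functional + expert + real-world challenge
print(review_rigor("rare", "moderate"))  # -> functional review
```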

Conducting risk identification is really about understanding how complicated and complex the associated process is. This looks at the following criteria (a rough rating sketch follows the list):

  • Interconnectedness: the organization and interaction of system components and other processes
  • Repeatability: the amount of variance in the process
  • Information content: the amount of information needed to interact with the process
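
A hedged sketch of how the three criteria above might be folded into a rough complexity rating that informs the depth of review; the scoring scale and equal weighting are my own assumptions for illustration:

```python
# Hedged sketch: rating process complexity from the three criteria above.
# The 1-3 scoring scale, equal weighting, and thresholds are illustrative assumptions.

def complexity_rating(interconnectedness: int, repeatability: int, information_content: int) -> str:
    """Each criterion scored 1 (low) to 3 (high); a higher repeatability score means more variance."""
    total = interconnectedness + repeatability + information_content
    if total >= 7:
        return "high complexity - broad stakeholder and expert review"
    if total >= 5:
        return "medium complexity - targeted expert review"
    return "low complexity - functional review may suffice"

# Example: highly interconnected, fairly variable, modest information content
print(complexity_rating(3, 2, 1))  # -> medium complexity - targeted expert review
```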

What Happens During a Review of Process and Procedure

The review of a process/procedure ensures that the proposed changes add value to the process and attain the outcome the organization wants. There are three levels of review (which can and often do happen simultaneously):

  • Functional review
  • Expert review by subject matter experts
  • Step-by-step real-world challenge

Functional review is the vetting of the process/procedure. Process stakeholders, including functional area management affected by the change, have the opportunity to review the draft, suggest changes, and agree to move forward.

Functional review supplies the lowest degree of assurance. This review looks for potential impact of the change on the function – usually focused on responsibilities – but does not necessarily assure a critical review.

In the case of expert review, the SMEs will review the draft for both positive and negative elements. On the positive side, they will look for the best practices, value-adding steps, flexibility in light of changing demands, scalability in light of changing output targets, etc. On the negative side, they will look for bottlenecks in the process, duplication of effort, unnecessary tasks, non-value-adding steps, role ambiguities (i.e. several positions responsible for a task, or no one responsible for a task), etc.

Expert review provides a higher degree of assurance because it is a compilation of expert opinion and it is focused on the technical content of the procedure.

The real-world challenge tests the process/procedure's applicability by challenging it step-by-step under conditions as close as possible to actual use. This involves selecting seasoned employee(s) within the scope of the draft procedure – not necessarily a SME – and comparing the steps as drafted with the actual activities. It is important to ascertain if they align. It is equally important to consider evidence of resistance, repetition and human factor problems.

Sometimes it can be more appropriate to do the real-world test as a tabletop or simulation exercise.

As sufficient reviews are obtained, the comments received are incorporated as appropriate. Significant changes incorporated during the review process may require the procedure to be re-routed for review and may require additional reviewers.

Repeat this iterative process as necessary.

Design Lifecycle

The process/procedure lifecycle can be seen as an iterative design lifecycle.

Design Thinking: Determine process needs.

  • Collect and document business requirements.
  • Map current-state processes.
  • Observe and interview process workers.
  • Design process to-be.

Startup: Create process documentation, workflows, and support materials. Review as described above.

Continuous Improvement: Use the process; Collect, analyze, and report; Improve

Sensemaking, Foresight and Risk Management

I love the power of Karl Weick's future-oriented sensemaking – thinking in the future perfect tense – for supplying us with a framework to imagine the future as if it has already occurred. We do not spend enough time being forward-looking and shaping the interpretation of future events. But when you think about it, quality is essentially all about using existing knowledge of the past to project a desired future.

This making sense of uncertainty – which should be a part of every manager’s daily routine – is another name for foresight. Foresight can be used as a discipline to help our organizations look into the future with the aim of understanding and analyzing possible future developments and challenges and supporting actors to actively shape the future.

Sensemaking is mostly used as a retrospective process: we look back at action that has already taken place. While Weick himself acknowledged that people's actions may be guided by future-oriented thoughts, he nevertheless asserted that the understanding that derives from sensemaking occurs only after the fact, foregrounding the retrospective quality of sensemaking even when imagining the future.

“When one imagines the steps in a history that will realize an outcome, then there is more likelihood that one or more of these steps will have been performed before and will evoke past experiences that are similar to the experience that is imagined in the future perfect tense.”

R.B. MacKay went further in a fascinating way by considering the role that counterfactual and prefactual processes play in future-oriented sensemaking. He finds that sensemaking processes can be prospective when they include prefactual "what-ifs" about the past and the future. There is a whole line of thought stemming from this that looks at the meaning of the past as never static but always in a state of change.

Foresight concerns interpretation and understanding, while simultaneously being a process of thinking about the future in order to improve preparedness. In seeking to understand uncertainty, reduce unknown unknowns, and drive toward a future state, it is ultimately knowledge management fueling risk management.

Do Not Ignore Metaphor

A powerful tool in this reasoning, imagining and planning the future, is metaphor. Now I’m a huge fan of metaphor, though some may argue I make up horrible ones – I think my entire team is sick of the milk truck metaphor by now – but this underutilized tool can be incredibly powerful as we build stories of how it will be.

Think about phrases such as "had gone through", "had been through" and "up to that point": commonly used metaphors that frame emotional experiences as physical movement, a journey from one point to another. That set of journey metaphors shapes much of our thinking about process improvement.

Entire careers have been built on questioning the heavy use of sport or war metaphors in business thought and how they shape us. I don't even watch sports, and I find myself constantly using them as shorthand.

To make sense of the future, find a plausible answer to the question 'what is the story?' This brings a balance between thinking and acting, and allows us to see the future more clearly.

Bibliography

  • Cornelissen, J.P. (2012), "Sensemaking under pressure: the influence of professional roles and social accountability on the creation of sense", Organization Science, Vol. 23 No. 1, pp. 118-137, doi: 10.1287/orsc.1100.0640.
  • Greenberg, D. (1995), "Blue versus gray: a metaphor constraining sensemaking around a restructuring", Group and Organization Management, Vol. 20 No. 2, pp. 183-209, doi: 10.1177/1059601195202007.
  • Luscher, L.S. and Lewis, M.W. (2008), “Organizational change and managerial sensemaking: working through paradox”, Academy of Management Journal, Vol. 51 No. 2, pp. 221-240, doi: 10.2307/20159506.
  • MacKay, R.B. (2009), “Strategic foresight: counterfactual and prospective sensemaking in enacted environments”, in Costanzo, L.A. and MacKay, R.B. (Eds), Handbook of Research on Strategy and Foresight, Edward Elgar, Cheltenham, pp. 90-112, doi: 10.4337/9781848447271.00011
  • Tapinos, E. and Pyper, N. (2018), “Forward looking analysis: investigating how individuals “do” foresight and make sense of the future”, Technological Forecasting and Social Change, Vol. 126 No. 1, pp. 292-302, doi: 10.1016/j.techfore.2017.04.025.
  • Weick, K.E. (1979), The Social Psychology of Organizing, McGraw-Hill, New York, NY.
  • Weick, K.E. (1995), Sensemaking in Organizations, Sage, Thousand Oaks, CA.

Pandemics and the failure to think systemically

As it turns out, the reality-based, science-friendly communities and information sources many of us depend on also largely failed. We had time to prepare for this pandemic at the state, local, and household level, even if the government was terribly lagging, but we squandered it because of widespread asystemic thinking: the inability to think about complex systems and their dynamics. We faltered because of our failure to consider risk in its full context, especially when dealing with coupled risk—when multiple things can go wrong together. We were hampered by our inability to think about second- and third-order effects and by our susceptibility to scientism—the false comfort of assuming that numbers and percentages give us a solid empirical basis. We failed to understand that complex systems defy simplistic reductionism.

Zeynep Tufekci, "What Really Doomed America's Coronavirus Response", published 24-Mar-2020 in The Atlantic

On-point analysis. It hits many of the themes of this blog, including systems thinking, complexity, and risk, and makes some excellent points that all of us in quality should be thinking deeply upon.

COVID-19 is not a black swan. Pandemics like this have been well predicted. This event reflects a different set of failures, ones that, on a hopefully smaller scale, most of us are unfortunately familiar with in our organizations.

I certainly didn’t break out of the mainstream narrative. I traveled in February, went to a conference and then held a small event on the 29th.

The article stresses the importance of considering the trade-offs between resilience, efficiency, and redundancy within the system, and how the second- and third-order impacts can reverberate. It’s well worth reading for the analysis of the growth of COVID-19, and more importantly our reaction to it, from a systems perspective.