Goals, Objectives and Transparency

Organizations, projects and teams have goals and objectives, and these terms are often used interchangeably. When I’m trying to be careful about nomenclature, I use the following standard definitions:

A goal is generally described as an effort directed toward an end. In project management, for example, the term refers to three different target values: performance, time and resources. To be more specific, the project goal specifies the desired outcome (performance), the specific end date (time) and the assigned amount of resources (resources). A goal answers “what” the main aim of the project is.

An objective defines the tangible, measurable results the team must deliver to support the agreed goal while meeting the planned end date and other resource constraints. It answers “how” something is to be done.

I think many of us are familiar with the concept of SMART goals. Lately I’ve been using FAST objectives.

From “With Goals, FAST Beats SMART” by Donald Sull and Charles Sull

Transparency provides the connective tissue and must be a primary aspect of any quality culture. Transparency creates a free flow of information within an organization and between the organization and its many stakeholders. This flow of information is the central nervous system of an organization, and the organization’s effectiveness depends on it. Transparency influences the capacity to solve problems, innovate, meet challenges and, as shown above, meet goals.

Information flow simply means that critical information gets to the right person at the right time and for the right reason. By making our goals transparent we can start that process and make a difference in our organizations.

Uncertainty and Subjectivity in Risk Management

The July 2019 monthly gift to members of the ASQ is a collection of material on Failure Mode and Effects Analysis (FMEA). Reading through it got me thinking about subjectivity in risk management.

Risk assessments have a subjective core: they frequently include assumptions about the nature of the hazard, possible exposure pathways, and judgments about the likelihood that alternative risk scenarios might occur. Gaps in the data and information about hazards, uncertainty about the most likely projection of risk, and incomplete understanding of possible scenarios all contribute to uncertainty in risk assessment and risk management. You can go even further and say that risk is socially constructed, that risk is at once both objectively verifiable and whatever we perceive or feel it to be. Then again, the same can be said of most of science.
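To make the subjectivity concrete: in an FMEA, each failure mode gets ordinal ratings for severity, occurrence and detectability – each one a judgment call – which are multiplied into a Risk Priority Number (RPN) used to prioritize. A minimal sketch, with entirely hypothetical failure modes and ratings:

```python
# Hypothetical FMEA entries: each failure mode gets subjective 1-10 ratings
# for severity (S), occurrence (O), and detectability (D).
failure_modes = [
    {"mode": "seal leak",       "S": 8, "O": 3, "D": 4},
    {"mode": "mislabeled vial", "S": 9, "O": 2, "D": 2},
    {"mode": "pump drift",      "S": 5, "O": 6, "D": 7},
]

def rpn(entry):
    """Risk Priority Number: the product of the three subjective ratings."""
    return entry["S"] * entry["O"] * entry["D"]

# Rank failure modes by RPN; note how three judgment calls compound
# into a single number that drives prioritization.
for entry in sorted(failure_modes, key=rpn, reverse=True):
    print(entry["mode"], rpn(entry))
```

Three subjective ratings compound multiplicatively, so a one-point shift in any rating can reorder the priorities – which is exactly where uncontrolled subjectivity bites.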

Risk is a future chance of loss given exposure to a hazard. Risk estimates, or qualitative ratings of risk, are necessarily projections of future consequences, so the true probability of the risk event and its consequences cannot be known in advance. This creates a need for subjective judgments to fill in information about an uncertain future. In this way risk management is rightly seen as a form of decision analysis: making decisions under uncertainty.

Everyone has a mental picture of risk, but the formal mathematics of risk analysis is inaccessible to most, relying on probability theory with its two major schools of thought: the frequency school and the subjective probability school. The frequency school says probability is a count of the number of successes divided by the total number of trials. Uncertainty that is readily characterized using frequentist probability methods is “aleatory” – due to randomness (or random sampling in practice). Frequentist methods give an estimate of “measured” uncertainty; however, they are arguably trapped in the past because they do not lend themselves easily to predicting future successes.
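A minimal sketch of the frequentist view, using made-up batch data: the estimate and its interval quantify sampling uncertainty for the process as it was, and say nothing on their own about a process that is about to change:

```python
import math

# Frequentist probability: event count divided by number of trials.
failures, trials = 3, 200        # hypothetical batch record
p = failures / trials

# Rough 95% normal-approximation interval on the estimate; it captures
# aleatory (sampling) uncertainty only, clamped at zero for readability.
half_width = 1.96 * math.sqrt(p * (1 - p) / trials)
low, high = max(0.0, p - half_width), p + half_width
print(f"p = {p:.3f}, 95% CI ~ ({low:.3f}, {high:.3f})")
```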

In risk management we tend to measure uncertainty with a combination of frequentist and subjectivist probability distributions. For example, a manufacturing process risk assessment might begin with classical statistical control data and analyses. But projecting the risks from a process change might call for expert judgments of, for example, possible failure modes and the probability that failures might occur during a defined period. The risk assessors bring prior expert knowledge and, if we are lucky, some prior data, and start to focus the target of the risk decision using subjective judgments of probabilities.
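One standard way to blend the two is a Beta-Binomial update: encode the expert’s prior belief as a Beta distribution, then update it with observed counts. The prior and data below are hypothetical stand-ins for an elicited judgment and a small post-change run:

```python
# Expert judgment encoded as a Beta prior: roughly "2 failures in 50 runs"
# worth of belief about the changed process (hypothetical values).
prior_alpha, prior_beta = 2.0, 48.0

# New evidence from the modified process: 1 failure in 30 batches (hypothetical).
failures, trials = 1, 30

# Conjugate update: posterior is Beta(alpha + failures, beta + successes).
post_alpha = prior_alpha + failures
post_beta = prior_beta + (trials - failures)

posterior_mean = post_alpha / (post_alpha + post_beta)
print(f"Posterior mean failure probability: {posterior_mean:.4f}")
```

The appeal of this framing is that the subjective input is explicit: the prior parameters are visible, auditable, and open to challenge rather than buried in a rating scale.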

Some have argued that a failure to formally control subjectivity in probability judgments is the failure of risk management; some made this argument during WCQI, for example. But subjectivity can neither be eliminated, nor is it an inherent limitation. Rather, the “problem with subjectivity” more precisely concerns two elements:

  1. A failure to recognize where and when subjectivity enters and might create problems in risk assessment and risk-based decision making; and
  2. A failure to implement controls on subjectivity where it is known to occur.

Because risk is about the chance of adverse outcomes of events that have yet to occur, subjective judgments of one form or another will always be required in both risk assessment and risk management decision-making.

We control subjectivity in risk management by:

  • Raising awareness of where/when subjective judgments of probability occur in risk assessment and risk management
  • Identifying heuristics and biases where they occur
  • Improving the understanding of probability among the team and individual experts
  • Calibrating experts individually
  • Applying knowledge from formal expert elicitation
  • Using expert group facilitation when group probability judgments are sought
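As a taste of what calibrating experts involves, one common scoring rule is the Brier score: the mean squared difference between stated probabilities and realized outcomes, where lower is better. The judgments below are hypothetical:

```python
# Brier score: mean squared error between probability forecasts and outcomes.
# Each pair is (expert's stated probability, what actually occurred: 1 or 0).
judgments = [(0.9, 1), (0.7, 1), (0.8, 0), (0.3, 0), (0.6, 1)]  # hypothetical

brier = sum((p - outcome) ** 2 for p, outcome in judgments) / len(judgments)
print(f"Brier score: {brier:.3f}")  # 0 would be perfect foresight
```

Tracking a score like this over time lets an expert (or a team) see whether their “80% confident” really means 80%.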

Each of these deserves its own future post.

Topics of concern for collaboration

This is more a collection of topics for things I am currently exploring than a finished list. Please add additional topics and/or resources in the comments.

Trends and concerns

Trend: increasing collaborative modes of working, specifically more:

  • Matrix structures (Cross et al. 2013, 2016; Cross and Gray 2013)
  • (Distributed) teamwork (Cross et al. 2015)
  • (Multi-)project work (Zika-Viktorsson et al. 2006) and multiple team membership (O’Leary et al. 2011)
  • Interruptions, which are ‘normal’ or even seen as a necessary part of knowledge workers’ workdays (Wajcman and Rose 2011)
  • Collaboration, which is seen as an end in itself (Breu et al. 2005; Dewar et al. 2009; Gardner 2017; Randle 2017)

Associated concerns:

  • Collaborative work is highly demanding (Barley et al. 2011; Dewar et al. 2009; Eppler and Mengis 2004)
  • Perils of multitasking (Atchley 2010; Ophir et al. 2009; Turkle 2015)
  • Too many structurally unproductive and inefficient teams (Duhigg 2016)
  • Lack of accountability for meeting and conference call time (Fried 2016)
  • Overall, a lack of structural protection of employees’ productive time (Fried 2016)

Trend: growing impact of collaborative technology:

  • Growing share of social technologies in the workplace (Bughin et al. 2017)
  • ‘Always on’ mentality and the cycle of responsiveness (Perlow 2012)
  • Platforms designed to prime and nudge users to spend more time using them (Stewart 2017)
  • Unclear organizational expectations for how to use collaborative technology, and limited individual knowledge (Griffith 2014; Maruping and Magni 2015)

Associated concerns:

  • Technology exacerbates organizational issues (Mankins 2017)
  • Inability to ‘turn off’ (Perlow 2012)
  • Technology creates more complexity than productivity gains (Stephens et al. 2017)
  • Increasingly complex media repertoires: highly differentiated, with a vanishing common denominator (Greene 2017; Mankins 2017)
  • Specific to social technology: increased visibility (Treem and Leonardi 2013) and thus the ability to monitor behaviour; impression management and frustration (Farzan et al. 2008)
  • Overall, overload scenarios and fragmentation of work (Cross et al. 2015; Wajcman and Rose 2011)

Trend: increasing ratio of collaborative activities for managers (Mankins and Garton 2017; Mintzberg 1990) and employees (CEB 2013; Cross and Gray 2013), so that workdays are primarily characterized by communication and collaboration.

Associated concerns:

  • Managers at the intersections of matrix structures get overloaded (Feintzeig 2016; Mankins and Garton 2017)
  • Limited knowledge of how to shape collaboration at the managerial level (Cross and Gray 2013; Maruping and Magni 2015)
  • Experts and structurally exposed individuals (e.g. boundary spanners) easily get overburdened with requests (Cross et al. 2016; Cross and Gray 2013)
  • Behavioral traits (‘givers’) may push employees close to burnout (Grant 2013; Grant and Rebele 2017)
  • Diminishing ‘perceived control’ over one’s own schedule (Cross and Gray 2013)
  • Overall, managers and employees do not have enough uninterrupted time (Cross et al. 2016; Mankins and Garton 2017)

Resources

  • Atchley, P. 2010. “You Can’t Multitask, So Stop Trying,” Harvard Business Review
  • Barley, S. R., Meyerson, D. E., and Grodal, S. 2011. “E-mail as a Source and Symbol of Stress,” Organization Science (22:4), pp. 887–906.
  • Breu, K., Hemingway, C., and Ashurst, C. 2005. “The impact of mobile and wireless technology on knowledge workers: An exploratory study,” in Proceedings of the 13th European Conference on Information Systems, Regensburg, Germany.
  • Bughin, J., Chui, M., Harrysson, M., and Lijek, S. 2017. “Advanced social technologies and the future of collaboration,” McKinsey Global Institute.
  • CEB. 2013. “Driving the Strategic Agenda in the New Work Environment.”
  • Cross, R., Ernst, C., Assimakopoulos, D., and Ranta, D. 2015. “Investing in boundary-spanning collaboration to drive efficiency and innovation,” Organizational Dynamics (44:3), pp. 204–216.
  • Cross, R., and Gray, P. 2013. “Where Has the Time Gone? Addressing Collaboration Overload in a Networked Economy,” California Management Review (56:1), pp. 1–17.
  • Cross, R., Kase, R., Kilduff, M., and King, Z. 2013. “Bridging the gap between research and practice in organizational network analysis: A conversation between Rob Cross and Martin Kilduff,” Human Resource Management (52:4), pp. 627–644.
  • Cross, R., Rebele, R., and Grant, A. 2016. “Collaborative Overload,” Harvard Business Review (94:1), pp. 74–79.
  • Cross, R., Taylor, S.N., Zehner, D. 2018. “Collaboration without burnout“. Harvard Business Review. (96:4), pp. 134-137.
  • Dewar, C., Keller, S., Lavoie, J., and Weiss, L. M. 2009. “How do I drive effective collaboration to deliver real business impact?,” McKinsey & Company.
  • Duhigg, C. 2016. Smarter, Faster, Better – The Secrets of Being Productive in Life and Business, New York, USA: Penguin Random House.
  • Eppler, M. J., and Mengis, J. 2004. “The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines,” The Information Society (20:5), pp. 325–344.
  • Farzan, R., DiMicco, J. M., Millen, D. R., Brownholtz, B., Geyer, W., and Dugan, C. 2008. “Results from Deploying a Participation Incentive Mechanism within the Enterprise,” in Proceedings of the 26th SIGCHI Conference on Human Factors in Computing Systems, Florence, Italy.
  • Feintzeig, R. 2016. “So Busy at Work, No Time to Do the Job,” The Wall Street Journal
  • Fried, J. 2016. “Restoring Sanity to the Office,” Harvard Business Review.
  • Gardner, H. K. 2017. Smart Collaboration: How Professionals and Their Firms Succeed by Breaking Down Silos, Boston, USA: Harvard Business Review Press.
  • Grant, A. 2013. Give and Take: A Revolutionary Approach to Success, New York, USA: Penguin Group.
  • Grant, A., and Rebele, R. 2017. “Generosity Burnout,” Harvard Business Review
  • Greene, J. 2017. “Beware Collaboration-Tool Overload,” The Wall Street Journal
  • Griffith, T. L. 2014. “Are Companies Ready to Finally Kill Email?,” MIT Sloan Management Review
  • Lock Lee, L. 2017. “Enterprise Social Networking Benchmarking Report 2017,” SWOOP Analytics
  • Mankins, M. 2017. “Collaboration Overload Is a Symptom of a Deeper Organizational Problem,” Harvard Business Review
  • Mankins, M., and Garton, E. 2017. Time, Talent, Energy, Boston, USA: Harvard Business Review Press
  • Maruping, L. M., and Magni, M. 2015. “Motivating Employees to Explore Collaboration Technology in Team Contexts,” MIS Quarterly (39:1), pp. 1–16.
  • O’Leary, M. B., Mortensen, M., and Woolley, A. W. 2011. “Multiple Team Membership: a Theoretical Model of Its Effects on Productivity and,” Academy of Management Review (36:3), pp. 461–478.
  • Ophir, E., Nass, C., and Wagner, A. D. 2009. “Cognitive control in media multitaskers,” Proceedings of the National Academy of Sciences of the United States of America (106:37), pp. 15583–15587.
  • Perlow, L. A. 1999. “The time famine: Toward a sociology of work time,” Administrative Science Quarterly (44:1), pp. 57–81.
  • Perlow, L. A. 2012. Sleeping With Your Smartphone, Boston, USA: Harvard Business Review Press.
  • Perlow, L. A. 2014. “Manage Your Team’s Collective Time,” Harvard Business Review (92:6), pp. 23–25.
  • Perlow, L. A., and Porter, J. L. 2009. “Making time off predictable–and required,” Harvard Business Review (87:10), pp. 102–109.
  • Randle, C. 2017. “24/7: Managing Constant Connectivity,” in Work Pressures: New Agendas in Communication, D. I. Ballard and M. S. McGlone (eds.), New York, USA: Routledge, pp. 20–26.
  • Stephens, K. K. 2017. “Understanding Overload in a Contemporary World,” in Work Pressures: New Agendas in Communication, D. I. Ballard and M. S. McGlone (eds.), New York, USA: Routledge.
  • Stephens, K. K., Mandhana, D. M., Kim, J. J., and Li, X. 2017. “Reconceptualizing Communication Overload and Building a Theoretical Foundation,” Communication Theory (27:3), pp. 269–289.
  • Stewart, J. B. 2017. “Facebook Has 50 Minutes of Your Time Each Day. It Wants More.” The New York Times
  • Treem, J. W., and Leonardi, P. M. 2013. “Social Media Use in Organizations: Exploring the Affordances of Visibility, Editability, Persistence, and Association,” Annals of the International Communication Association (36:1), pp. 143–189.
  • Turkle, S. 2015. Reclaiming Conversation: The Power of Talk in a Digital Age, New York, USA: Penguin Press.
  • Wajcman, J., and Rose, E. 2011. “Constant Connectivity: Rethinking Interruptions at Work,” Organization Studies (32:7), pp. 941–961.
  • Zika-Viktorsson, A., Sundström, P., and Engwall, M. 2006. “Project overload: An exploratory study of work and management in multi-project settings,” International Journal of Project Management (24:5), pp. 385–394.

Competencies in Quality

Competence is the set of demonstrable characteristics and skills that enable, and improve the efficiency of, performance of a job. There are a ton of different models out there, but I like to think in terms of three kinds of competences: professional and methodological skills; social competence; and self-competence, which includes personal, activity-oriented and implementation-oriented skills. Another great way to look at these is as competencies for the inter-personal (maps to social competence), the intrapersonal (maps to self-competence), and the cognitive (maps to professional and methodological skills).

The ongoing digital transformation (Industry 4.0) leads to changing competence requirements, which means new ways of life-long teaching and learning are necessary to keep up.

We can look at these competencies across three different categories – human, organization and technology – with the individual skills within each competency spanning all three:

Professional & methodological expertise (cognitive):

  • System thinking
  • Process thinking
  • Results-oriented work
  • Complexity management
  • Business thinking
  • Problem solving
  • Sensitization to ergonomics
  • Structured, analytical thinking
  • Change management
  • Qualification/further education
  • Agile methods/tools
  • Lean enterprise
  • Client orientation
  • Workplace design
  • Software/hardware understanding
  • Cyber-physical system understanding
  • Usability
  • Human-machine interfaces

Social competence (inter-personal):

  • Inter-disciplinary thinking
  • Managerial competence
  • Ability to work as a team
  • Conflict management
  • Communication
  • Empathy
  • Employee satisfaction
  • Human centering
  • Social networking

Self-competence (intrapersonal):

  • Lifelong learning
  • Personal initiative
  • Innovativeness
  • Independent work
  • Sense of responsibility
  • Readiness for change
  • Process orientation

When it comes to the professional competencies there is a large spread depending on what our industries require. As a pharmaceutical quality professional I have different professional expertise than a colleague in the construction industry. What we do have in common is the methodological expertise listed above.

Understanding competencies is important: it allows us to determine which skills are critical and to mentor and develop our people. It also helps when you are thinking in terms of a body of knowledge and just what communities of practice should be focusing on.

FDA enforcement actions decline

An investigative report from Science’s news department on FDA enforcement under Trump shows a steep decline in enforcement actions.

I’ve noticed this, but it is good to see actual data behind it.

I’ll be frank: it would take a lot of data that does not exist to make me believe the companies under the FDA’s oversight have gotten better as a whole. Anecdotally, there are a lot of less-than-sterling players out there.

I have mostly questions:

  1. Have we seen this trend in previous Republican administrations, and is it more pronounced here?
  2. Is there any evidence that the increase under Obama was a reaction to the previous Republican administration? Are we in a cycle of lax and then tougher enforcement that maybe evens out? That sort of variance is not healthy.
  3. What data, if any, will we be able to see about impact? There are certainly concerns that the FDA has not done enough. Will this be exacerbated?
  4. What will it take for this to start affecting the mutual recognition agreements with the EU and other major bodies?