Choosing the Right Consultancy Firm

We’ve all been there. Crazy schedules, not enough resources, and looming deadlines. We need to bring in the consultants. So we bring them in, and then, “meh,” there is a widespread feeling that we would have been better off doing it ourselves.

One significant problem is that folks don’t look at consultancies as part of a system, instead assuming that bringing in an expert will just solve things. Another is picking the last company you used instead of properly vetting candidates for the job.

Choosing the right fit is critical. I always look for an excellent systemic consultancy, though they are hard to find.

| Aspect | Traditional Consultancy | Systemic Consultancy |
| --- | --- | --- |
| Approach | Linear, top-down approach relying on “expert” advice | Process-oriented, viewing organizations as complex, interconnected systems |
| Problem-solving | Analyzes problems and proposes solutions based on expertise | Helps clients discover solutions by examining the entire system and its dynamics |
| Focus | Focuses on specific issues or departments in isolation | Looks at the whole organization and how different parts interact |
| Client Involvement | Positions consultants as outside experts delivering solutions | Involves clients deeply in the process, seeing them as co-owners of projects |
| Consideration of Social Factors | May overlook social and psychological aspects | Heavily considers social interactions, culture, and psychological factors |
| Implementation | Provides recommendations but may leave implementation to the client | Assists with implementation and provides ongoing support |
| Timeframe | Engagements can be lengthy, lasting months or years | Can be more time-efficient, sometimes requiring fewer sessions |
| Perspective | Relies more on logical-rational analysis | Expands the view to include aspects neglected by purely rational analysis |
| Adaptability | May be less flexible in adapting to changing circumstances | Designed to be more adaptable to complex, evolving situations |
| Outcome Focus | Aims for specific, predefined outcomes | Focuses on improving overall system functioning and adaptability |
| Industry Specialization | Often works across various industries without deep specialization | Often has deep excellence within a narrow corridor of competencies and roles |
| Real Solutions | May take time to understand the problem and suggest solutions | Can start work on day one, providing immediate, insightful solutions |
| Personal Skills | May require time to build rapport and understand the organizational culture | Often possesses the maturity to navigate difficult situations and build immediate rapport |

To truly find a good fit, you must go through a robust selection process.

  1. Define your needs and project scope clearly. Before evaluating a consultancy, clearly understand your project goals, timeline, budget, and desired outcomes. This will help you find consultants with the right expertise and capabilities.
  2. Look for relevant experience and expertise. Seek out consultants with a proven track record in your industry and with similar types of projects. Ask for case studies and client references to verify their experience.
  3. Assess their approach and methodology. Look for consultants who have a structured yet flexible approach that can be tailored to your specific needs. The above table can really help. Their methodology should align with your company’s culture and ways of working.
  4. Evaluate their team and resources. Consider the qualifications of the specific team members working on your project. Also, assess whether the consulting firm has adequate resources and support to deliver successfully.
  5. Check their communication style and cultural fit. The consultants should be able to communicate clearly and work well with your team. Their working style and values should align with your company culture.
  6. Compare pricing and value. While cost is important, focus on the overall value the consultant can provide rather than just the lowest price. Consider their expertise, methodology, and potential ROI.
  7. Assess their thought leadership and innovation. Look for consultants demonstrating forward-thinking approaches and staying on top of industry trends and best practices.
  8. Consider the size and type of firm. Based on your project needs and budget, decide whether you need a large global firm, a boutique specialist, or an independent consultant.
  9. Review their technology and tools. Evaluate whether the consultant has access to relevant technologies, data, and analytical tools that can benefit your project.
  10. Trust your instincts. After evaluating all the factors, trust your judgment about which consultant you feel most confident partnering with for a successful outcome.

To select the right consultancy:

  1. Create a shortlist of potential consultants based on the above criteria.
  2. Request detailed proposals from your top choices.
  3. Conduct in-person interviews and presentations.
  4. Check references thoroughly.
  5. Evaluate proposals against your key selection criteria.
  6. Negotiate terms with your preferred consultant.
  7. Ensure alignment on project scope, deliverables, timeline, and budget before finalizing.
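Criteria like those above can be turned into a simple weighted scoring matrix for comparing shortlisted firms. A minimal sketch in Python; the criteria, weights, and scores below are illustrative assumptions, not a prescription:

```python
# Hypothetical weighted-scoring sketch for comparing consultancy proposals.
# Criteria, weights, and 1-5 scores are all illustrative.

CRITERIA = {  # criterion -> weight (weights sum to 1.0)
    "relevant_experience": 0.25,
    "methodology_fit": 0.20,
    "team_quality": 0.20,
    "cultural_fit": 0.15,
    "value_for_money": 0.20,
}

def score_proposal(scores: dict) -> float:
    """Weighted sum of 1-5 scores across the selection criteria."""
    return sum(CRITERIA[c] * scores[c] for c in CRITERIA)

proposals = {
    "Firm A": {"relevant_experience": 4, "methodology_fit": 5,
               "team_quality": 4, "cultural_fit": 3, "value_for_money": 4},
    "Firm B": {"relevant_experience": 5, "methodology_fit": 3,
               "team_quality": 4, "cultural_fit": 5, "value_for_money": 3},
}

# Rank firms by weighted score, highest first.
ranked = sorted(proposals, key=lambda p: score_proposal(proposals[p]), reverse=True)
for name in ranked:
    print(f"{name}: {score_proposal(proposals[name]):.2f}")
```

The point is not the arithmetic but the discipline: agreeing on weights before you see the proposals keeps the evaluation honest.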

A systematic approach to evaluating and selecting a consulting partner will help you find the right fit for your needs and set your project up for success.

Remember, criminal enterprises like McKinsey should be avoided.

How I would Organize a Meeting of a CoP

As I discussed in “A CoP is Collaborative Learning, not Lecture,” it is past time to stop treating professionals as college kids (it is also past time to stop teaching college kids that way, but that is another subject). Lectures have their place. There is undoubtedly a high need for information-transfer events (though even these can be better structured), and there will always be a need for GAMP5 workshops, training courses, and webinars on specific topics.

But that is not the place of a community of practice.

I’ve written in the past about ways I prefer to structure professional engagements, such as poster sessions and unconferences, and I have demonstrated some ways I think we can do this better. So, let’s turn our attention to what a better GAMP5 community of practice session could look like.

We aim to connect, communicate, share, collaborate, and dialogue. So, what would a six-hour event look like?

Noon to 1:00 – Networking and poster session. We have a lot of introverts in this industry, so help folks connect by structuring it. Posters are excellent because they serve as springboards for conversation. All the presentations that usually open these events, about ISPE and GAMP5, GAMP5’s plans for the next two years, and current regulatory trends, become posters instead.

1:00 to 2:00 – Think-Pair-Share. There will be three rounds of 15 minutes each, each with a different topic. Each participant will have an 11×17 sheet of paper to capture the other person’s thoughts. Post the sheets on the wall.

2:00 to 2:30 – Review the posted thoughts, brainstorm themes, and propose them.

2:30 to 2:45 – N/5 voting for the top themes (each participant gets one vote for every five proposed themes).

2:45 to 3:30 – Mock audit, fishbowl style. Deep dive on a particular issue, audit style.

3:30 to 4:30 – Unconference-style breakouts on the winning themes. Each working group comes out with a hand-drawn poster (or more, depending on how productive the group is).

4:30 to 5:00 – Present ideas

5:00 to 6:00 – Network, discuss ideas. Add to them.

Hit the bar/restaurant.

Publish the results, and continue to work on the online forum.

The Use of Glossaries

I’ve gone on record with my disdain for reference sections in documents, and similarly, I am not a huge fan of glossary sections. A glossary section is a point of failure: definitions of the same terms will inevitably start drifting across documents. A preferred practice is to maintain a common glossary instead, so there is one source of truth. Several eDMS platforms even have this as a feature.

Go a step further and use the glossaries that already exist. The WHO’s Quality Assurance of Medicines Terminology Database is an underutilized resource in the pharmaceutical quality world. Use it as a starting point for your glossary or, better yet, define only the terms not already in the database. Again, I know of at least one eDMS where you can point the glossary feature at this external database.
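If your eDMS cannot point at an external glossary, the “only define what the base glossary lacks” rule is easy to approximate. A hypothetical sketch; the function name and sample entries are invented for illustration:

```python
# Illustrative sketch: keep only the terms a base glossary (e.g., an export
# of the WHO terminology database) does not already define.

def local_terms_only(base: dict, local: dict) -> dict:
    """Return local glossary entries absent from the base glossary
    (case-insensitive match on the term), so documents define only what
    the shared source of truth does not."""
    base_keys = {term.lower() for term in base}
    return {t: d for t, d in local.items() if t.lower() not in base_keys}

base = {"Batch": "A defined quantity of material...", "Validation": "..."}
local = {"batch": "our redundant definition",
         "Widget Hold Time": "site-specific term"}
print(local_terms_only(base, local))  # only the site-specific term survives
```

Running a check like this during document review flags the redundant definitions that cause drift in the first place.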

Multi-Criteria Decision-Making to Drive Risk Control

To be honest, too often we perform a risk assessment not to make decisions but to justify a decision that has already been made. The risk assessment may help define a few additional action items and determine how rigorous to be about a few things, but it makes little impact on the already-decided path forward. That is pretty bad risk management and decision-making.

For important decisions with high uncertainty or complexity, it is useful to thoroughly identify the available options/alternatives and assess the benefits and risks of each before deciding on a path forward. Doing so supports the decision-making process and ultimately reduces risk.

An effective, highly structured decision-making process can help answer the question, “How can we compare the consequences of the various options before deciding?”

The most challenging risk decisions involve several different, important considerations, often with multiple stakeholders and multiple decision-makers.

In Multi-Criteria Decision-Making (MCDM), the aim is the structured consideration of the available alternatives (options) for achieving the objectives, in order to make the most informed decision and reach the best outcome.

In a Quality Risk Management context, the decision-making concerns making informed decisions in the face of uncertainty about risks related to the quality (and/or availability) of medicines.

Key Concepts of MCDM

  1. Conflicting Criteria: MCDM deals with situations where criteria conflict. For example, when purchasing a car, one might need to balance cost, comfort, safety, and fuel economy, which often do not align perfectly.
  2. Explicit Evaluation: Unlike intuitive decision-making, MCDM involves a structured approach to explicitly evaluate multiple criteria, which is crucial when the stakes are high, such as deciding whether to build additional manufacturing capacity for a product under development.
  3. Types of Problems:
  • Multiple-Criteria Evaluation Problems: These involve a finite number of alternatives known at the beginning. The goal is to find the best alternative or a set of good alternatives based on their performance across multiple criteria.
  • Multiple-Criteria Design Problems: In these problems, alternatives are not explicitly known and must be found by solving a mathematical model. The number of alternatives can be very large, often growing exponentially with the size of the problem.

Preference Information: The methods used in MCDM often require preference information from decision-makers (DMs) to differentiate between solutions. This can be done at various stages of the decision-making process, such as prior articulation of preferences, which transforms the problem into a single-criterion problem.

MCDM focuses on risk and uncertainty by explicitly weighing criteria and trade-offs between them. Multi-criteria decision-making (MCDM) differs from traditional decision-making methods in several key ways:

  1. Explicit Consideration of Multiple Criteria: Traditional decision-making often focuses on a single criterion like cost or profit. MCDM explicitly considers multiple criteria simultaneously, which may be conflicting, such as cost, quality, safety, and environmental impact. This allows for a more comprehensive evaluation of alternatives.
  2. Structured Approach: MCDM provides a structured framework for evaluating alternatives against multiple criteria rather than relying solely on intuition or experience. It involves techniques like weighting criteria, scoring alternatives, and aggregating scores to rank or choose the best option.
  3. Transparency and Consistency: MCDM methods aim to make decision-making more transparent, consistent, and less susceptible to individual biases. The criteria, weights, and evaluation process are explicitly defined, allowing for better justification and reproducibility of decisions.
  4. Quantitative Analysis: Many MCDM methods employ quantitative techniques, such as mathematical models, optimization algorithms, and decision support systems. This enables a more rigorous and analytical approach compared to traditional qualitative methods.
  5. Handling Complexity: MCDM is particularly useful for complex decision problems involving many alternatives, conflicting objectives, and multiple stakeholders. Traditional methods may struggle to handle such complexity effectively.
  6. Stakeholder Involvement: Some MCDM methods, like the Analytic Hierarchy Process (AHP), facilitate the involvement of multiple stakeholders and the incorporation of their preferences and judgments. This can lead to more inclusive and accepted decisions.
  7. Trade-off Analysis: MCDM techniques often involve analyzing trade-offs between criteria, helping decision-makers understand the implications of prioritizing certain criteria over others. This can lead to more informed and balanced decisions.

While traditional decision-making methods rely heavily on experience, intuition, and qualitative assessments, MCDM provides a more structured, analytical, and comprehensive approach, particularly in complex situations with conflicting criteria.
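As an illustration of that structured, quantitative side, here is a minimal sketch of the AHP weighting step: deriving criterion weights from a pairwise comparison matrix via the geometric-mean approximation, plus Saaty’s consistency ratio. The pairwise judgments in the matrix are illustrative:

```python
import math

# Minimal AHP sketch (geometric-mean approximation), plain Python.
criteria = ["cost", "quality", "safety"]
# Saaty-scale pairwise comparison matrix: A[i][j] = importance of i over j.
A = [
    [1,   1/3, 1/5],
    [3,   1,   1/2],
    [5,   2,   1],
]

# Geometric mean of each row, normalized, approximates the principal eigenvector.
gm = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gm) for g in gm]

# Consistency check: lambda_max from A.w, then CI and CR (RI = 0.58 for n = 3).
n = len(A)
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58  # judgments are usually deemed consistent when CR < 0.10
print([round(w, 3) for w in weights], round(CR, 3))
```

The consistency ratio is the useful part in practice: it catches stakeholders whose pairwise judgments quietly contradict each other.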

Multi-Criteria Decision-Making (MCDM) is typically performed following these steps:

  1. Define the Decision Problem: Clearly state the problem or decision to be made, identify the stakeholders involved, and determine the desired outcome or objective.
  2. Establish Criteria: Identify the relevant criteria that will be used to evaluate the alternatives. These criteria should be measurable, independent, and aligned with the objectives. Involve stakeholders in selecting and validating the criteria.
  3. Generate Alternatives: Develop a comprehensive list of potential alternatives or options that could solve the problem. Use techniques like brainstorming, benchmarking, or scenario analysis to generate diverse alternatives.
  4. Gather Performance Data: Assess how each alternative performs against each criterion. This may involve quantitative data, expert judgments, or qualitative assessments.
  5. Assign Criteria Weights: By assigning weights, determine each criterion’s relative importance or priority. This can be done through methods like pairwise comparisons, swing weighting, or direct rating. Stakeholder input is crucial here.
  6. Apply MCDM Method: Choose an appropriate MCDM technique based on the problem’s nature and the available data. Popular methods include: Analytic Hierarchy Process (AHP); Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS); ELimination and Choice Expressing REality (ELECTRE); Preference Ranking Organization METHod for Enrichment of Evaluations (PROMETHEE); and Multi-Attribute Utility Theory (MAUT).
  7. Evaluate and Rank Alternatives: Apply the chosen MCDM method to evaluate and rank the alternatives based on their performance against the weighted criteria. This may involve mathematical models, software tools, or decision support systems.
  8. Sensitivity Analysis: Perform sensitivity analysis to assess the robustness of the results and understand how changes in criteria weights or performance scores might affect the ranking or choice of alternatives.
  9. Make the Decision: Based on the MCDM analysis, select the most preferred alternative or develop an action plan based on the ranking of alternatives. Involve stakeholders in the final decision-making process.
  10. Monitor and Review: Implement the chosen alternative and monitor its performance. Review the decision periodically, and if necessary, repeat the MCDM process to adapt to changing circumstances or new information.

MCDM is an iterative process; stakeholder involvement, transparency, and clear communication are crucial. Additionally, the specific steps and techniques may vary depending on the problem’s complexity, the data’s availability, and the decision-maker’s preferences.
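Step 8 (sensitivity analysis) can be as simple as perturbing one criterion’s weight and checking whether the top-ranked alternative changes. A minimal sketch with illustrative numbers:

```python
# Simple one-at-a-time sensitivity check on a weighted-sum ranking.
# All weights and scores below are illustrative.

def rank(weights, scores):
    """Rank alternatives by weighted sum; weights are re-normalized first."""
    total = sum(weights.values())
    w = {c: v / total for c, v in weights.items()}
    return sorted(scores, key=lambda a: sum(w[c] * scores[a][c] for c in w),
                  reverse=True)

weights = {"cost": 0.4, "quality": 0.35, "risk": 0.25}
scores = {
    "Option A": {"cost": 5, "quality": 3, "risk": 4},
    "Option B": {"cost": 3, "quality": 5, "risk": 4},
}

baseline = rank(weights, scores)[0]
flips = []
for delta in (-0.15, 0.15):  # perturb the cost weight up and down
    perturbed = dict(weights, cost=weights["cost"] + delta)
    winner = rank(perturbed, scores)[0]
    if winner != baseline:
        flips.append((delta, winner))
print(baseline, flips)
```

Here the baseline winner flips when the cost weight drops, which is exactly the kind of fragility a sensitivity analysis is meant to surface before you commit to a decision.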

| MCDM Technique | Description | Application | Key Features |
| --- | --- | --- | --- |
| Analytic Hierarchy Process (AHP) | A structured technique for organizing and analyzing complex decisions, using mathematics and psychology. | Widely used in business, government, and healthcare for prioritizing and decision-making. | Pairwise comparisons, consistency checks, and hierarchical structuring of criteria and alternatives. |
| Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) | Based on the concept that the chosen alternative should have the shortest geometric distance from the positive ideal solution and the longest geometric distance from the negative ideal solution. | Frequently used in engineering, management, and human resource management for ranking and selection problems. | Compensatory aggregation, normalization of criteria, and calculation of geometric distances. |
| Elimination and Choice Expressing Reality (ELECTRE) | An outranking method that compares alternatives by considering both qualitative and quantitative criteria. It uses a pairwise comparison approach to eliminate less favorable alternatives. | Commonly used in project selection, resource allocation, and environmental management. | Use of concordance and discordance indices, handling of both qualitative and quantitative data, and ability to deal with incomplete rankings. |
| Preference Ranking Organization Method for Enrichment Evaluation (PROMETHEE) | An outranking method that uses preference functions to compare alternatives based on multiple criteria. It provides a complete ranking of alternatives. | Applied in various fields such as logistics, finance, and environmental management. | Preference functions, visual interactive modules (GAIA), and sensitivity analysis. |
| Multi-Attribute Utility Theory (MAUT) | Involves converting multiple criteria into a single utility function, which is then used to evaluate and rank alternatives. It takes into account the decision-maker’s risk preferences and uncertainties. | Used in complex decision-making scenarios involving risk and uncertainty, such as policy analysis and strategic planning. | Utility functions, probabilistic weights, and handling of uncertainty. |

Popular MCDM Techniques
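Of these, TOPSIS is the most mechanical to sketch. A minimal implementation in plain Python, treating all criteria as benefit criteria (higher is better) for simplicity; the decision matrix and weights are illustrative:

```python
import math

def topsis(matrix, weights):
    """Return closeness coefficients (0..1, higher is better) per alternative.
    Rows = alternatives, columns = criteria; all criteria treated as benefits."""
    n_alt, n_crit = len(matrix), len(matrix[0])
    # 1. Vector-normalize each column, then apply the criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # 2. Ideal (best) and anti-ideal (worst) points per criterion.
    best = [max(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    worst = [min(V[i][j] for i in range(n_alt)) for j in range(n_crit)]
    # 3. Euclidean distances to both points, then relative closeness.
    result = []
    for i in range(n_alt):
        d_best = math.sqrt(sum((V[i][j] - best[j]) ** 2 for j in range(n_crit)))
        d_worst = math.sqrt(sum((V[i][j] - worst[j]) ** 2 for j in range(n_crit)))
        result.append(d_worst / (d_best + d_worst))
    return result

matrix = [[7, 9, 9], [8, 7, 8], [9, 6, 8]]   # 3 alternatives x 3 criteria
weights = [0.5, 0.3, 0.2]
closeness = topsis(matrix, weights)
print([round(c, 3) for c in closeness])
```

A fuller version would flag cost criteria (lower is better) and invert them before normalizing; the skeleton above shows the geometric-distance idea the table describes.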

Quality Agreements with Cloud Providers

Having a quality agreement with a cloud provider is crucial for several reasons:

Ensure Regulatory Compliance

A quality agreement helps ensure the cloud provider’s services and processes comply with relevant regulations and guidelines, such as GxP (Good Practice) requirements from agencies like the FDA, EMA, and MHRA. It defines the roles, responsibilities, and expectations for maintaining data integrity, security, and quality standards throughout the product lifecycle.

Delineate Responsibilities

Cloud services often involve complex technology stacks and multiple subservice providers. A quality agreement clearly delineates the responsibilities of the regulated company and the cloud provider, ensuring that critical activities like change control, incident management, data governance, and security controls are properly addressed and assigned.

Establish Service Levels

The quality agreement specifies the agreed service levels, performance metrics, and key performance indicators (KPIs) that the cloud provider must meet, such as application availability, support response times, data security breach notification timelines, and system performance. This helps maintain the required quality of service.
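When reviewing an availability KPI, it helps to translate the percentage into concrete allowed downtime. A quick back-of-envelope helper; the SLA figures are illustrative:

```python
# Translate an availability SLA percentage into allowed downtime per period.

def allowed_downtime_minutes(availability_pct: float, days: int = 30) -> float:
    """Maximum downtime (in minutes) per `days`-day period at a given SLA."""
    total_minutes = days * 24 * 60
    return total_minutes * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99):
    print(f"{sla}% -> {allowed_downtime_minutes(sla):.1f} min per 30 days")
```

The jump from 99% (over seven hours of downtime a month) to 99.9% (about 43 minutes) is exactly the kind of difference worth negotiating explicitly in the agreement.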

Enable Oversight and Audits

The agreement outlines provisions for initial qualification audits, periodic audits, and inspections by the regulated company to assess the cloud provider’s compliance with the agreed terms. It also defines processes for managing audit findings and corrective actions.

Ensure Data Integrity and Security

Addressing data-related requirements, such as data ownership, privacy, protection controls, retention, archiving, and disposal processes, is critical to ensuring data integrity and security throughout the data lifecycle.

Manage Third-Party Risks

The agreement establishes guidelines for the approval process and compliance requirements when the cloud provider uses subcontractors or third-party services, mitigating associated risks.

Contents

A quality agreement between a regulated company (customer) and a Cloud (SaaS, PaaS, IaaS) provider should cover the following key elements:

Roles and Responsibilities

Clearly define the roles, responsibilities, and obligations of both parties regarding:

  • Regulatory compliance (GxP, data privacy, security, etc.)
  • Quality management system and processes
  • Change control and release management
  • Incident and deviation management
  • Data integrity, backup, and recovery
  • Performance monitoring and reporting

Service Levels and Performance Metrics

Specify the agreed service levels and key performance indicators (KPIs) for:

  • Application availability and uptime
  • Support response and resolution times
  • Data security and breach notification timelines
  • System performance and capacity

Audits and Assessments

Outline the provisions for:

  • Initial qualification audits of the SaaS provider
  • Periodic audits and inspections by the regulated company
  • Processes for managing audit findings and corrective actions

Data Management

Address data-related aspects such as:

  • Data ownership and usage rights
  • Data privacy and protection controls (as per applicable regulations)
  • Data retention, archiving, and disposal processes

Subcontracting and Third Parties

Establish guidelines for:

  • Approval process for use of subcontractors/third parties
  • Ensuring subcontractors comply with the quality agreement
  • Communication of changes impacting the regulated company

Term, Termination, and Offboarding

Specify conditions for:

  • Initial term and renewal of the quality agreement
  • Termination rights (e.g., for non-compliance, data breaches)
  • Responsibilities during offboarding and data transition

The quality agreement should be a comprehensive yet pragmatic document that ensures the cloud solution meets the regulated company’s quality and compliance requirements throughout the engagement.