The Practice Paradox: Why Technical Knowledge Isn’t Enough for True Expertise

When someone asks about your skills, they are often fishing for the wrong information. They want to know about your certifications, your knowledge of regulations, your understanding of methodologies, or your familiarity with industry frameworks. These questions barely scratch the surface of actual competence.

The real questions that matter are deceptively simple: What is your frequency of practice? What is your duration of practice? What is your depth of practice? What is your accuracy in practice?

Because here’s the uncomfortable truth that most professionals refuse to acknowledge: if you don’t practice a skill, competence doesn’t just stagnate—it actively degrades.

The Illusion of Permanent Competency

We persist in treating professional expertise like riding a bicycle, “once learned, never forgotten”. This fundamental misunderstanding pervades every industry and undermines the very foundation of what it means to be competent.

Research consistently demonstrates that technical skills begin degrading within weeks of initial training. In medical education, procedural skills show statistically significant decline between six and twelve weeks without practice. For complex cognitive skills like risk assessment, data analysis, and strategic thinking, the degradation curve is even steeper.

A meta-analysis examining skill retention found that half of initial skill acquisition performance gains were lost after approximately 6.5 months for accuracy-based tasks, 13 months for speed-based tasks, and 11 months for mixed performance measures. Yet most professionals encounter meaningful opportunities to practice their core competencies quarterly at best, often less frequently.
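Those half-lives can be made concrete with a toy model. The sketch below assumes simple exponential decay, which is my own simplifying assumption for illustration, not the model used in the meta-analysis:

```python
def retained_fraction(months_since_practice: float, half_life_months: float) -> float:
    """Fraction of initial skill gains retained, assuming exponential decay
    with the given half-life (a simplifying assumption for illustration)."""
    return 0.5 ** (months_since_practice / half_life_months)

# Half-lives reported in the meta-analysis cited above
ACCURACY_HALF_LIFE = 6.5   # months, accuracy-based tasks
SPEED_HALF_LIFE = 13.0     # months, speed-based tasks

# A professional who practices a skill only once per quarter, or once per year
print(round(retained_fraction(3, ACCURACY_HALF_LIFE), 2))   # ~0.73
print(round(retained_fraction(12, ACCURACY_HALF_LIFE), 2))  # ~0.28
```

Even under this generous model, a skill exercised only at annual-review time retains roughly a quarter of its trained accuracy.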

Consider the data analyst who completed advanced statistical modeling training eighteen months ago but hasn’t built a meaningful predictive model since. How confident should we be in their ability to identify data quality issues or select appropriate analytical techniques? How sharp are their skills in interpreting complex statistical outputs?

The answer should make us profoundly uncomfortable.

The Four Dimensions of Competence

True competence in any professional domain operates across four critical dimensions that most skill assessments completely ignore:

Frequency of Practice

How often do you actually perform the core activities of your role, not merely review or discuss them, but genuinely work through the systematic processes that define expertise?

For most professionals, the honest answer is: rarely. That infrequency creates competence gaps that compound over time. Skills that aren’t regularly exercised atrophy, leading to oversimplified problem-solving, missed critical considerations, and inadequate solution strategies. The cognitive demands of sophisticated professional work—considering multiple variables simultaneously, recognizing complex patterns, making nuanced judgments—require regular engagement to maintain proficiency.

Deliberate practice research shows that experts practice in longer sessions (87.9 minutes on average) than amateurs (46.0 minutes). But more importantly, they practice regularly. The frequency component isn’t just about total hours—it’s about consistent, repeated exposure to challenging scenarios that push the boundaries of current capability.

Duration of Practice

When you do practice core professional activities, how long do you sustain that practice? Minutes? Hours? Days?

Brief, superficial engagement with complex professional activities doesn’t build or maintain competence. Most work activities in professional environments are fragmented, interrupted by meetings, emails, and urgent issues. This fragmentation prevents the deep, sustained practice necessary to maintain sophisticated capabilities.

Research on deliberate practice emphasizes that meaningful skill development requires focused attention on activities designed to improve performance, typically requiring one to three practice sessions to master a specific sub-skill. But maintaining existing expertise requires a different duration pattern: sustained engagement with increasingly complex scenarios over extended periods.

Depth of Practice

Are you practicing at the surface level—checking boxes and following templates—or engaging with the fundamental principles that drive effective professional performance?

Shallow practice reinforces mediocrity. Deep practice—working through novel scenarios, challenging existing methodologies, grappling with uncertain outcomes—builds robust competence that can adapt to evolving challenges.

The distinction between deliberate practice and generic practice is crucial. Deliberate practice involves:

  • Working on skills that require 1-3 practice sessions to master specific components
  • Receiving expert feedback on performance
  • Pushing beyond current comfort zones
  • Focusing on areas of weakness rather than strengths

Most professionals default to practicing what they already do well, avoiding the cognitive discomfort of working at the edge of their capabilities.

Accuracy in Practice

When you practice professional skills, do you receive feedback on accuracy? Do you know when your analyses are incomplete, your strategies inadequate, or your evaluation criteria insufficient?

Without accurate feedback mechanisms, practice can actually reinforce poor techniques and flawed reasoning. Many professionals practice in isolation, never receiving objective assessment of their work quality or decision-making effectiveness.

Research on medical expertise reveals that self-assessment accuracy has two critical components: calibration (overall performance prediction) and resolution (relative strengths and weaknesses identification). Most professionals are poor at both, leading to persistent blind spots and competence decay that remains hidden until critical failures expose it.

The Knowledge-Practice Disconnect

Professional training programs focus almost exclusively on knowledge transfer—explaining concepts, demonstrating tools, providing frameworks. They ignore the practice component entirely, creating professionals who can discuss methodologies eloquently but struggle to execute them competently when complexity increases.

Knowledge is static. Practice is dynamic.

Professional competence requires pattern recognition developed through repeated exposure to diverse scenarios, decision-making capabilities honed through continuous application, and judgment refined through ongoing experience with outcomes. These capabilities can only be developed and maintained through deliberate, sustained practice.

A study of competency assessment found that deliberate practice hours predicted only 26% of skill variation in games like chess, 21% for music, and 18% for sports. The remaining variance comes from factors like age of initial exposure, genetics, and quality of feedback—but practice remains the single most controllable factor in competence development.

The Competence Decay Crisis

Industries across the board face a hidden crisis: widespread competence decay among professionals who maintain the appearance of expertise while losing the practiced capabilities necessary for effective performance.

This crisis manifests in several ways:

  • Templated Problem-Solving: Professionals rely increasingly on standardized approaches and previous solutions, avoiding the cognitive challenge of systematic evaluation. This approach may satisfy requirements superficially while missing critical issues that don’t fit established patterns.
  • Delayed Problem Recognition: Degraded assessment skills lead to longer detection times for complex issues and emerging problems. Issues that experienced, practiced professionals would identify quickly remain hidden until they escalate to significant failures.
  • Inadequate Solution Strategies: Without regular practice in developing and evaluating approaches, professionals default to generic solutions that may not address specific problem characteristics effectively. The result is increased residual risk and reduced system effectiveness.
  • Reduced Innovation: Competence decay stifles innovation in professional approaches. Professionals with degraded skills retreat to familiar, comfortable methodologies rather than exploring more effective techniques or adapting to emerging challenges.

The Skill Decay Research

The phenomenon of skill decay is well-documented across domains. Research shows that skills involving complex cognitive demands, tight time constraints, or significant motor control are highly likely to be lost almost entirely after six months without practice.

Key findings from skill decay research include:

  • Retention interval: The longer the period of non-use, the greater the probability of decay
  • Overlearning: Extra training beyond basic competency significantly improves retention
  • Task complexity: More complex skills decay faster than simple ones
  • Feedback quality: Skills practiced with high-quality feedback show better retention

A practical framework divides skills into three circles based on practice frequency:

  • Circle 1: Daily-use skills (slowest decay)
  • Circle 2: Weekly/monthly-use skills (moderate decay)
  • Circle 3: Rare-use skills (rapid decay)

Most professionals’ core competencies fall into Circle 2 or 3, making them highly vulnerable to decay without systematic practice programs.
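The three circles can be sketched as a simple classifier. The thresholds below (daily use, then up to roughly a month, then anything rarer) are my own illustrative encoding of the framework, not part of the original:

```python
def practice_circle(days_between_uses: float) -> int:
    """Classify a skill into the three-circle framework by how often it is used.
    Thresholds are illustrative assumptions: daily use -> Circle 1,
    weekly-to-monthly use -> Circle 2, anything rarer -> Circle 3."""
    if days_between_uses <= 1:
        return 1  # daily-use skills: slowest decay
    if days_between_uses <= 31:
        return 2  # weekly/monthly-use skills: moderate decay
    return 3      # rare-use skills: rapid decay

# Hypothetical examples of how core competencies land in the outer circles
skills = {
    "email triage": 1,                # daily
    "root cause investigation": 45,   # every six weeks or so
    "process validation": 180,        # twice a year
}
for name, gap in skills.items():
    print(name, "-> Circle", practice_circle(gap))
```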

Building Practice-Based Competence

Addressing the competence decay crisis requires fundamental changes in how individuals and organizations approach professional skill development and maintenance:

Implement Regular Practice Requirements

Professionals must establish mandatory practice requirements for themselves—not training sessions or knowledge refreshers, but actual practice with real or realistic professional challenges. This practice should occur monthly, not annually.

Consider implementing practice scenarios that mirror the complexity of actual professional challenges: multi-variable analyses, novel technology evaluations, integrated problem-solving exercises. These scenarios should require sustained engagement over days or weeks, not hours.

Create Feedback-Rich Practice Environments

Effective practice requires accurate, timely feedback. Professionals need mechanisms for evaluating work quality and receiving specific, actionable guidance for improvement. This might involve peer review processes, expert consultation programs, or structured self-assessment tools.

The goal isn’t criticism but calibration—helping professionals understand the difference between adequate and excellent performance and providing pathways for continuous improvement.

Measure Practice Dimensions

Track the four dimensions of practice systematically: frequency, duration, depth, and accuracy. Develop personal metrics that capture practice engagement quality, not just training completion or knowledge retention.

These metrics should inform professional development planning, resource allocation decisions, and competence assessment processes. They provide objective data for identifying practice gaps before they become performance problems.
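A minimal sketch of what tracking the four dimensions might look like in code. Everything here, the field names, the 1-to-5 depth scale, and the 90-day window, is a hypothetical illustration rather than an established metric set:

```python
from dataclasses import dataclass
from datetime import date
from statistics import mean

@dataclass
class PracticeSession:
    day: date
    minutes: int             # duration dimension
    depth: int               # 1 (templated) .. 5 (novel, principle-level); illustrative scale
    feedback_received: bool  # crude proxy for the accuracy dimension

def summarize(sessions: list[PracticeSession], window_days: int = 90) -> dict:
    """Summarize the four practice dimensions over a rolling window."""
    recent = [s for s in sessions if (date.today() - s.day).days <= window_days]
    if not recent:
        return {"frequency_per_month": 0.0, "avg_minutes": 0.0,
                "avg_depth": 0.0, "feedback_rate": 0.0}
    return {
        "frequency_per_month": round(len(recent) / (window_days / 30), 2),
        "avg_minutes": round(mean(s.minutes for s in recent), 1),
        "avg_depth": round(mean(s.depth for s in recent), 1),
        "feedback_rate": round(mean(1.0 if s.feedback_received else 0.0
                                    for s in recent), 2),
    }
```

A log like this makes practice gaps visible before they become performance problems: a high session count with a low average depth or feedback rate is exactly the shallow, uncorrected practice described above.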

Integrate Practice with Career Development

Make practice depth and consistency key factors in advancement decisions and professional reputation building. Professionals who maintain high-quality, regular practice should advance faster than those who rely solely on accumulated experience or theoretical knowledge.

This integration creates incentives for sustained practice engagement while signaling commitment to practice-based competence development.

The Assessment Revolution

The next time someone asks about your professional skills, here’s what you should tell them:

“I practice systematic problem-solving every month, working through complex scenarios for two to four hours at a stretch. I engage deeply with the fundamental principles, not just procedural compliance. I receive regular feedback on my work quality and continuously refine my approach based on outcomes and expert guidance.”

If you can’t make that statement honestly, you don’t have professional skills—you have professional knowledge. And in the unforgiving environment of modern business, that knowledge won’t be enough.

Better Assessment Questions

Instead of asking “What do you know about X?” or “What’s your experience with Y?”, we should ask:

  • Frequency: “When did you last perform this type of analysis/assessment/evaluation? How often do you do this work?”
  • Duration: “How long did your most recent project of this type take? How much sustained focus time was required?”
  • Depth: “What was the most challenging aspect you encountered? How did you handle uncertainty?”
  • Accuracy: “What feedback did you receive? How did you verify the quality of your work?”

These questions reveal the difference between knowledge and competence, between experience and expertise.

The Practice Imperative

Professional competence cannot be achieved or maintained without deliberate, sustained practice. The stakes are too high and the environments too complex to rely on knowledge alone.

The industry’s future depends on professionals who understand the difference between knowing and practicing, and organizations willing to invest in practice-based competence development.

Because without practice, even the most sophisticated frameworks become elaborate exercises in compliance theater—impressive in appearance, inadequate in substance, and ultimately ineffective at achieving the outcomes that stakeholders depend on our competence to deliver.

The choice is clear: embrace the discipline of deliberate practice or accept the inevitable decay of the competence that defines professional value. In a world where complexity is increasing and stakes are rising, there’s really no choice at all.

Building Deliberate Practice into the Quality System

Embedding genuine practice into a quality system demands more than mandating periodic training sessions or distributing updated SOPs. The reality is that competence in GxP environments is not achieved by passive absorption of information or box-checking through e-learning modules. Instead, you must create a framework where deliberate, structured practice is interwoven with day-to-day operations, ongoing oversight, and organizational development.

Start by reimagining training not as a singular event but as a continuous cycle that mirrors the rhythms of actual work. New skills—whether in deviation investigation, GMP auditing, or sterile manufacturing technique—should be introduced through hands-on scenarios that reflect the ambiguity and complexity found on the shop floor or in the laboratory. Rather than simply reading procedures or listening to lectures, trainees should regularly take part in simulation exercises that challenge them to make decisions, justify their logic, and recognize pitfalls. These activities should involve increasingly nuanced scenarios, moving beyond basic compliance errors to the challenging grey areas that usually trip up experienced staff.

To cement these experiences as genuine practice, integrate assessment and reflection into the learning loop. Every critical quality skill—from risk assessment to change control—should be regularly practiced, not just reviewed. Root cause investigation, for instance, should be a recurring workshop, where both new hires and seasoned professionals work through recent, anonymized cases as a team. After each practice session, feedback should be systematic, specific, and forward-looking, highlighting not just mistakes but patterns and habits that can be addressed in the next cycle. The aim is to turn every training into a diagnostic tool for both the individual and the organization: What is being retained? Where does accuracy falter? Which aspects of practice are deep, and which are still superficial?

Crucially, these opportunities for practice must be protected from routine disruptions. If practice sessions are routinely canceled for “higher priority” work, or if their content is superficial, their effectiveness collapses. Commit to building practice into annual training matrices alongside regulatory requirements, linking participation and demonstrated competence with career progression criteria, bonus structures, or other forms of meaningful recognition.

Finally, link practice-based training with your quality metrics and management review. Use not just completion data, but outcome measures—such as reduction in repeat deviations, improved audit readiness, or enhanced error detection rates—to validate the impact of the practice model. This closes the loop, driving both ongoing improvement and organizational buy-in.
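One of those outcome measures, the repeat-deviation rate, can be sketched quickly. The record layout and the `root_cause` field name are my own assumptions for illustration, not a standard schema:

```python
from collections import Counter

def repeat_deviation_rate(deviations: list[dict]) -> float:
    """Fraction of deviations whose root-cause category has already appeared
    earlier in the record set -- a rough proxy for 'repeat deviations'.
    The 'root_cause' key is a hypothetical field name."""
    seen = Counter()
    repeats = 0
    for dev in deviations:
        cause = dev["root_cause"]
        if seen[cause]:
            repeats += 1
        seen[cause] += 1
    return repeats / len(deviations) if deviations else 0.0
```

Tracking this rate before and after introducing recurring root-cause-investigation workshops is one way to tie the practice model to a quality metric the management review already cares about.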

A quality system rooted in practice demands investment and discipline, but the result is transformative: professionals who can act, not just recite; an organization that innovates and adapts under pressure; and a compliance posture that is both robust and sustainable, because it’s grounded in real, repeatable competence.

How I Would Organize a Meeting of a CoP

As I discussed in “A CoP is Collaborative Learning, not Lecture,” it is past time to stop treating professionals like college kids (it is also past time to stop teaching college kids that way, but that is another subject). Lectures have their place. There is undoubtedly a real need for information-transfer events (though even these can be better structured), and there will always be a need for GAMP5 workshops, training courses, and webinars on specific topics.

But that is not the place of a community of practice.

I’ve written in the past about some ways I prefer to structure professional engagements, such as poster sessions and unconferences, and I have demonstrated some ways I think we can do this better. So, let’s turn our attention to what a better GAMP5 community of practice session could look like.

We aim to connect, communicate, share, collaborate, and dialogue. So, what would a six-hour event look like?

Noon to 1:00 – Networking and poster session. We have a lot of introverts in this industry, so help folks connect by doing it in a structured way. Posters are excellent because they serve as springboards for conversation. The presentations that usually open these events, about ISPE and GAMP5, the GAMP5 plans for the next two years, and current regulatory trends, become posters instead.

1:00 to 2:00 – Think-Pair-Share. Three rounds of 15 minutes each, each on a different topic. Each participant gets an 11×17 sheet of paper to capture the other person’s thoughts, then posts it.

2:00 to 2:30 – Review the posted thoughts, brainstorm themes, and propose them.

2:30 to 2:45 – N/5 voting for the top themes.

2:45 to 3:30 – Mock audit, fishbowl style. A deep dive on a particular issue, audit style.

3:30 to 4:30 – Unconference-style breakouts on the winning themes. Each working group produces a hand-drawn poster (or more, depending on how productive the group is).

4:30 to 5:00 – Present ideas.

5:00 to 6:00 – Network and discuss the ideas. Add to them.

Then hit the bar/restaurant.

Afterwards, publish the results and continue the work on the online forum.

A CoP is Collaborative Learning, not Lecture

I was recently at an event for GAMP5 that billed itself as a community of practice. Instead, it was a bunch of lectures, a lot of being talked at, and no collaborative learning.

Collaborative learning is an educational approach where two or more individuals work together to understand a concept, solve a problem, or create a product. This method leverages the group members’ collective resources, skills, and knowledge, fostering an environment where participants actively engage with each other to achieve shared learning goals. It is the heart of a flourishing community of practice and something we should do much more as industry professionals.

Key Characteristics of Collaborative Learning

  1. Group Dynamics: Collaborative learning involves small groups, typically ranging from pairs to groups of no more than six members, where each member contributes to the group’s success. The interaction among group members is crucial, as it involves sharing ideas, evaluating each other’s contributions, and collectively solving problems.
  2. Active Engagement: Unlike traditional individual learning, collaborative learning requires active participation from all members. This engagement can take various forms, including face-to-face discussions, online forums, group projects, and peer reviews.
  3. Shared Responsibility: In collaborative learning, responsibility and authority are distributed among group members. Each participant is accountable not only for their own learning but also for helping their peers understand and succeed.
  4. Diverse Perspectives: Collaborative learning often brings together individuals from different backgrounds, promoting diversity of thought and fostering open-mindedness and acceptance.

Benefits of Collaborative Learning

  1. Enhances Problem-Solving Skills: Working in groups exposes participants to various perspectives and approaches, which can lead to more effective problem-solving strategies.
  2. Improves Communication Skills: Collaborative learning requires clear and effective verbal and written communication, which helps participants develop strong communication skills.
  3. Fosters Social Interaction: By working together, participants practice and enhance social skills such as active listening, empathy, and respect, essential for building strong personal and professional relationships.
  4. Promotes Critical Thinking: The need to discuss, debate, and defend ideas in a group setting encourages participants to think critically and deeply about the subject matter.
  5. Encourages Creativity: Exchanging diverse ideas and perspectives can inspire creative solutions and innovative thinking.

Theoretical Background

Collaborative learning is rooted in Lev Vygotsky’s zone of proximal development concept, which emphasizes the importance of social interaction and communication in learning. According to Vygotsky, learners can achieve higher levels of understanding and retain more information when they work collaboratively, as they can learn from each other’s experiences and insights.

Examples of Collaborative Learning Activities

  1. Think-Pair-Share: Participants think about a question individually, discuss their thoughts with a partner, and then share their conclusions with the larger group.
  2. Jigsaw Method: Participants are divided into “home” groups, and each member becomes an expert on a subtopic. They then teach their subtopic to their group members, ensuring everyone understands the topic.
  3. Fishbowl Debate: Small groups of participants debate a topic, with some members observing and taking notes. This method encourages active participation and critical thinking.
  4. Case Studies: Groups analyze and discuss real-world scenarios, applying theoretical knowledge to practical situations.
  5. Online Forums: Participants collaborate through discussion boards or live collaboration software, sharing ideas and working together on projects.

Communities of Practice

Knowledge management is a key enabler for quality and should be firmly part of our standards of practice and competencies. There is a host of practices to draw on, and one tool that belongs in every quality professional’s toolbox is the Community of Practice (CoP).

What is a Community of Practice?

Wenger, Trayner, and de Laat (2011) defined a Community of Practice as a “learning partnership among people who find it useful to learn from and with each other about a particular domain. They use each other’s experience of practice as a learning resource.” Etienne Wenger is the theoretical originator of the idea of a Community of Practice, as well as of a great deal of the subsequent development of the concept.

Communities of practice are groups of people who share a passion for something that they know how to do, and who interact regularly in order to learn how to do it better. As such, they are a great tool for continuous improvement.

These communities can be defined by disciplines, by problems, or by situations. They can be internal or external. A group of deviation investigators who want to perform better investigations, contamination control experts sharing across sites, the list is probably endless for whenever there is a shared problem to be solved.

The idea is to enable practitioners to manage knowledge. Practitioners have a special connection with each other because they share actual experiences. They understand each other’s stories, difficulties, and insights. This allows them to learn from each other and build on each other’s expertise.

There are three fundamental characteristics of communities:

  • Domain: the area of knowledge that brings the community together, gives it its identity, and defines the key issues that members need to address. A community of practice is not just a personal network: it is about something. Its identity is defined not just by a task, as it would be for a team, but by an “area” of knowledge that needs to be explored and developed.
  • Community: the group of people for whom the domain is relevant, the quality of the relationships among members, and the definition of the boundary between the inside and the outside. A community of practice is not just a Web site or a library; it involves people who interact and who develop relationships that enable them to address problems and share knowledge.
  • Practice: the body of knowledge, methods, tools, stories, cases, documents, which members share and develop together. A community of practice is not merely a community of interest. It brings together practitioners who are involved in doing something. Over time, they accumulate practical knowledge in their domain, which makes a difference to their ability to act individually and collectively.

The combination of domain, community, and practice is what enables communities of practice to manage knowledge. Domain provides a common focus; community builds relationships that enable collective learning; and practice anchors the learning in what people do. Cultivating communities of practice requires paying attention to all three elements.

Communities of Practice are different from workgroups or project teams.

Community of Practice
  • Purpose: To develop members’ capabilities; to build and exchange knowledge
  • Who belongs: Members who share the domain and community
  • What holds it together: Commitment from the organization, identification with the group’s expertise, and passion
  • How long it lasts: As long as there is interest in maintaining the group

Formal work group
  • Purpose: To deliver a product or service
  • Who belongs: Everyone who reports to the group’s manager
  • What holds it together: Job requirements and common goals
  • How long it lasts: Until the next reorganization

Project team
  • Purpose: To accomplish a specific task
  • Who belongs: Employees assigned by management
  • What holds it together: The project’s milestones and goals
  • How long it lasts: Until the project has been completed

Informal network
  • Purpose: To collect and pass on business information
  • Who belongs: Friends and business acquaintances
  • What holds it together: Mutual needs
  • How long it lasts: As long as people have a reason to connect

Types of organizing blocks

Establishing a Community of Practice

Sponsorship

For a Community of Practice to thrive, the organization must provide adequate sponsorship. Sponsors are those leaders who see that a community can deliver value and therefore make sure that the community has the resources it needs to function and that its ideas and proposals find their way into the organization. While there is often one specific sponsor, it is more useful to think about the sponsorship structure that enables communities to thrive and affect the performance of the organization. This includes high-level executive sponsorship as well as the sponsorship of line managers who control how employees spend their time. The role of sponsorship includes:

  • Translating strategic imperatives into a knowledge-centric vision of the organization
  • Legitimizing the work of communities in terms of strategic priorities
  • Channeling appropriate resources to ensure sustained success
  • Giving a voice to the insights and proposals of communities so they affect the way business is conducted
  • Negotiating accountability between line operations and communities (e.g., who decides which “best practices” to adopt)

Support Structure

Communities of Practice need organizational support to function. This support includes:

  • A few explicit roles, some of which are recognized by the formal organization and resourced with dedicated time
  • Direct resources for the nurturing of the community infrastructure including meeting places, travel funds, and money for specific projects
  • Technological infrastructure that enables members to communicate regularly and to accumulate documents

When you use communities of practice in a systematic way, it pays to put together a small “support team” of internal consultants who provide logistics and process advice for communities: coaching community leaders, running educational activities to raise awareness and skills, offering facilitation services, communicating with management, and coordinating across the various communities of practice. But this is not strictly necessary.

Process Owners and Communities of Practice go hand in hand. Often the Process Owner takes a governance or organizing role, or the community of practice is made up of process owners across the network.

Recognition Structure

Communities of Practice allow their participants to build reputation, a crucial asset in the knowledge economy. Such reputation building depends on both peer and organizational recognition.

  • Peer recognition: community-based feedback and acknowledgement mechanisms that celebrate community participation
  • Organizational recognition: a rubric in performance appraisals for community contributions, and career paths for people who take on community leadership

ASQ Technical Forums and Divisions as Knowledge Communities

I have been spending a lot of time lately thinking about how to best build and grow knowledge communities within quality. One of my objectives at WCQI this year was to get more involved in the divisions and technical forums and I, frankly, might have been overly successful in volunteering for the Team and Workplace Excellence Forum (TWEF) – more on that later when announcements have been made.

Stan Garfield provides 10 principles for successful Knowledge Management Communities. If you are interested in the topic of knowledge management, Stan is a great thinker and resource.

Below, each of Stan’s principles is followed by my thoughts for the ASQ divisions and technical forums.

Principle 1: Communities should be independent of organizational structure; they are built around areas upon which members wish to interact.

Thoughts: The divisions and technical forums are one part of the organizational structure of the ASQ, but they tend to be on the knowledge-generating side of things. The other major membership unit, sections, is geographical.

Divisions and forums basically break into two categories: industry type and activity band.

The Food, Drug, and Cosmetic and the Biomedical divisions are great examples of industry focus (by the nature of my work, these are the only two I’ve paid attention to), and they seem to be very focused on product integrity questions.

The activity bands are all over the place. For example, the People and Service technical committee contains a Quality Management division, a Human Development and Leadership division, and a Team Excellence Forum. Those three have serious overlap.

It is of interest to me that the other divisions in the People and Service technical committee, Education, Healthcare, Government, and Customer Supplier and Service Quality, are much more industry focused.

And then there is the Social Responsibility division. I have huge respect for those people, because they are basically trying to reinvent the definition of quality in a way that can be seen as anathema to the traditional product-integrity-focused viewpoint.

There is still so much to figure out about the TCCs.

Principle 2: Communities are different from teams; they are based on topics, not on assignments.

Thoughts: Easy enough in the ASQ, as this is a volunteer organization.

Principle 3: Communities are not sites, team spaces, blogs or wikis; they are groups of people who choose to interact.

Thoughts: As the ASQ tries to develop my.ASQ into something folks actually use, this is a critical principle. The site pages will grow and be used because people are interacting; the pages themselves will not drive interaction.

Ravelry seems like a great example of how to do this right. Anyone know of any white papers on Ravelry?

Principle 4: Community leadership and membership should be voluntary; you can suggest that people join, but should not force them to.

Thoughts: Divisions are voluntary to join, and people get involved if they choose to. Please volunteer…

Principle 5: Communities should span boundaries; they should cross functions, organizations, and geographic locations.

Thoughts: The ASQ has this mostly right. The industry-focused communities are made up of members across companies, with a wide spread of locations.

Principle 6: Minimize redundancy in communities; before creating a new one, check if an existing community already addresses the topic.

Thoughts: The ASQ hasn’t done a great job of this. One of my major thoughts is that the Quality Management Division has traditionally claimed ownership of the CMQ/OE body of knowledge, but frankly a good chunk of it should sit between the Team Excellence and Human Development divisions, which between them seem to have a fair bit of overlap.

Take change management, or project management, or program management. Which one of the three divisions should be focusing on that? All three? That seems a waste of effort. It’s even worse when I know the Lean Division spends a fair amount of time talking about this.

Principle 7: Communities need critical mass; take steps to build membership.

Thoughts: The major dilemma for professional associations. I would love to see your suggestions in the comments.

Principle 8: Communities should start with as broad a scope as is reasonable; separate communities can be spun off if warranted.

Thoughts: I’m going to say something radical and unpopular. If the ASQ were serious about transformation, it would have dissolved half of the divisions and rebuilt them from scratch. Too many are relics of the past and are not relevant in their current construction. Do you truly need both a Lean and a Six Sigma forum? A Team Excellence and a Human Development forum (and a Quality Management division)? Should Biomedical (medical devices) be part of the FDC?

Principle 9: Communities need to be actively nurtured; community leaders need to create, build, and sustain communities. To do this, community leaders need training, coaching, and mentoring.

Thoughts: I’m happy with the connections I’ve started building in headquarters and with a certain board member. Perhaps one of the focuses of the Team and Workplace Excellence Forum should be to help push the praxis on this.

Principle 10: Communities can be created, led, and supported using TARGETs:

  • Types (TRAIL — Topic, Role, Audience, Industry, Location)
  • Activities (SPACE — Subscribe, Post, Attend, Contribute, Engage)
  • Requirements (SMILE — Subject, Members, Interaction, Leaders, Enthusiasm)
  • Goals (PATCH — Participation, Anecdotes, Tools, Coverage, Health)
  • Expectations (SHAPE — Schedule, Host, Answer, Post, Expand)
  • Tools (SCENT — Site, Calendar, Events, News, Threads)

Thoughts: Okay. So much here. But this helps me build an agenda for a forthcoming meeting.

I may be jumping the gun, but if you are a member of the ASQ and interested in contributing to the Team and Workplace Excellence Forum, contact me.