Ensuring our practices are linked to science

There is a lot of poor-to-outright-bad science in business, leadership, and quality circles. We also have a tendency to place anecdotal evidence above objective evidence.

Here are some of the offenders I am always on the lookout for, on a scale from "horrible" to "I can live with it." It is by no means an exhaustive list. I tried to avoid "fads," as that is a debatable set.

Myers-Briggs (MBTI)

Corporate astrology, pure and simple. Once I see this, I know everything that comes after it is suspect. Books upon books have been written on how useless it is. So stop it already.

Learning Styles

The research is definitive: there is no such thing as a learning style, and focusing on them is a waste of time that will distract you from actually creating valuable training content.

70:20:10 Rule

This is no rule. It was a guideline thrown out on the fly that, because of its nice round numbers, has become widely used. No empirical evidence supports it in any way. The only study of any repute, a 2003 study by Enos, Kehrhahn and Bell, actually showed completely different ratios: 16% from experience on the job, 44% from learning from others, 30% from formal training, and a leftover 10% the authors couldn't quite attribute. But those numbers aren't round and cool-sounding.

It doesn't even work as a general principle: it is too rigid to be of any use and, again, doesn't represent how people actually learn and work.

Triune Brain

This model is simply outmoded; the idea that we have a "lizard brain" has been shown to be wildly inaccurate, and far better models exist. In most contexts where it appears, we could cut it out entirely and lose nothing. It's time to see this outdated model retired from quality circles. The tool-using crows will be less likely to plot our demise.

Nonverbal communication

Unfortunately, although thousands of peer-reviewed publications provide important insights into the impact of nonverbal communication in social interactions, we are exposed to a plethora of false beliefs, stereotypes, and pseudoscientific techniques for "reading" nonverbal behavior. Frankly, I assume that whatever is being presented is mostly untrue and work from there.

Brainstorming in a crowd

Group-driven brainstorming has diminishing value; we are better off using brainwriting activities, where participants generate ideas individually in writing before sharing them with the group.

Case Studies

I love reading about others' experiences, and I do enjoy a good case study. However, the belief that case studies of successful (or unsuccessful) organizations offer valid advice has never been tested, and it can create an illusion of causality. This constructivist sensemaking is useful, but we should always be careful about drawing wider parallels or establishing "facts." Call it the "Wisdom of Teams" effect if you want a name for the illusion.

Similar reasons exist to be careful with benchmarking, which is all opinion and no science.

Employee Engagement

There is little to no evidence that any of the vague concepts of employee engagement actually improve productivity, or even that interventions raise the scores. The only thing proven about employee engagement is the number of hours people spend on it.
