A critical skill of a quality professional (really, of any professional), and a fundamental part of Quality 4.0, is managing data: knowing how to acquire good data, analyze it properly, follow the clues those analyses offer, explore the implications, and present results in a fair, compelling way.
As we build systems, validate computer systems, and create processes, we need to ensure the quality of our data. Think about the data you generate, and continually work to make it better.
I am a big fan of tools like Thomas Redman's Friday Afternoon Measurement to determine where data has problems: pull the 100 most recent records, mark the obvious errors in each critical attribute, and count how many records come through clean.
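A minimal sketch of the scoring step, assuming your records sit in a pandas DataFrame; the column names and per-attribute checks here are hypothetical stand-ins for your own critical attributes:

```python
import pandas as pd

def friday_afternoon_measurement(records: pd.DataFrame, validators: dict) -> int:
    """Count how many of the last 100 records are error-free across
    all critical attributes; that count is the data quality score."""
    sample = records.tail(100)  # the 100 most recent records
    def record_is_clean(row) -> bool:
        return all(check(row[col]) for col, check in validators.items())
    return int(sample.apply(record_is_clean, axis=1).sum())

# Hypothetical checks for three critical attributes (yours will differ)
validators = {
    "batch_id":    lambda v: isinstance(v, str) and v.strip() != "",
    "ph":          lambda v: pd.notna(v) and 0.0 <= v <= 14.0,
    "recorded_by": lambda v: isinstance(v, str) and v.strip() != "",
}

# score = friday_afternoon_measurement(records_df, validators)
```

The real exercise is done by people who know the data, marking errors by hand; automation like this only helps once you know which attributes matter and what "obviously wrong" looks like.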
Have the tools to decide which data stands out: control charts and regression analysis will help you understand the data. “Looks Good To Me: Visualizations As Sanity Checks” by Michael Correll is a great overview of how data visualization can help us decide whether the data we are gathering makes sense.
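To make the control-chart idea concrete, here is a small sketch of an individuals (I) chart, assuming single measurements taken in time order; the fill-weight numbers are invented, and the limits use the standard moving-range estimate of sigma:

```python
import numpy as np

def individuals_chart_limits(x):
    """Center line and 3-sigma limits for an individuals (I) chart,
    estimating sigma from the average moving range (mR-bar / d2)."""
    x = np.asarray(x, dtype=float)
    center = x.mean()
    mr_bar = np.abs(np.diff(x)).mean()  # average moving range
    sigma = mr_bar / 1.128              # d2 constant for subgroups of size 2
    return center, center - 3 * sigma, center + 3 * sigma

# Hypothetical fill weights; flag any point beyond the control limits
weights = [49.8, 50.1, 50.0, 49.9, 50.3, 50.2, 49.7, 52.4, 50.0, 49.9]
center, lcl, ucl = individuals_chart_limits(weights)
signals = [w for w in weights if w < lcl or w > ucl]
print(f"CL={center:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, signals={signals}")
```

A point outside the limits (here, 52.4) is the chart telling you the process, or the data about it, deserves a closer look.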
Then root cause analysis (another core capability) allows us to determine what is truly going wrong with our data.
Throughout all your engagements with data, understand statistical significance: how to quantify whether a result is likely due to chance or to the factors you were measuring.
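As a sketch of what that quantification looks like, assuming two small samples of process yields (the numbers are invented), a two-sample t-test puts a number, the p-value, on how surprising the observed difference would be if chance alone were at work:

```python
from scipy import stats

# Hypothetical yields from the baseline and a modified process
baseline = [92.1, 93.4, 91.8, 92.7, 93.0, 92.5, 91.9, 92.8]
modified = [93.6, 94.1, 93.2, 94.5, 93.9, 94.0, 93.7, 94.3]

# Welch's t-test: does not assume the two groups share a variance
t_stat, p_value = stats.ttest_ind(modified, baseline, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Hard to explain by chance alone; the change may be real.")
else:
    print("Cannot rule out chance; gather more data before acting.")
```

A small p-value is evidence against chance, not proof of your favorite explanation; it still takes root cause work to say why the difference exists.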
In the past it was enough to understand a Pareto chart, a histogram, and maybe a basic control chart. Those days are long gone. What quality professionals need to bring to the table today is a deeper understanding of data: how to gather it, analyze it, and determine its relevance. Data integrity is a key concept, and to have integrity, you need to understand data.