Next Generation Quality: It’s All About the Data

As early as 2015, McKinsey’s “Digital America” report projected that adoption of Industry 4.0 technologies in manufacturing alone would increase domestic GDP by over $2 trillion by 2025. This estimate, developed from expectations surrounding productivity enhancements, waste reduction using methods from lean manufacturing, and new business models enabled by technologies like 3D printing and practices such as remanufacturing, is on track not only to be met, but exceeded.

Manufacturing is Being Revitalized

All of these sea changes are happening because of data – and the software used to collect, manipulate, and understand it. While traditional manufacturing jobs have relied on physical and mechanical skills, new manufacturing jobs require additional cognitive skills. As a result, manufacturers are scrambling to identify and roll out technology training for workers that will best support these emerging needs. At the same time, organizations recognize that institutional memory remains critical. Job shadowing and mentorship will be required to bridge the gap, especially as the people who entered the workforce in the 1970s and 1980s begin to retire.

The next generation of manufacturing work will revolve around the data: generating it, keeping track of it, and getting it where it needs to go to keep production processes in control and to capture new opportunities. It doesn’t matter whether that data is stored in a database, or on the web, or in someone’s head. Industry 4.0 technologies will create an environment where delays of information and delays of material (for example, from shipping and transport) will no longer be barriers to getting work done. Creating, sharing, analyzing, and acting based on real-time data will help us better control and optimize work processes.

The Widening Skills Gap

Why are manufacturers concerned about this skills gap? First of all, standardized, repetitive tasks are increasingly being automated. Instead of just following Standard Operating Procedures (SOPs), front-line staff will be asked to do more tasks that are cognitively intensive – like visualizing and interpreting data, or choosing (and using) recommendations made by artificial intelligence (AI) or machine learning algorithms. Skill sets related to using software, interpreting data, and taking action in response to that data will be required. Most of the entry- and mid-level workforce in manufacturing today were not trained with that orientation in mind.

Quality, productivity, and efficiency improvements in Industry 4.0 will be drawn from three areas of emphasis: increasing connectedness (between people, machines, and data), augmenting intelligence, and automating repetitive tasks. Retraining needs to address each of these three areas.

Taking Action

What can your organization do to prepare for these changes? Even if you’re in an industry other than manufacturing, there are three steps you can take to make the transition easier:

  1. Identify master data
  2. Establish systems of record
  3. Implement data governance

The best place to begin is to identify the most critical data your organization needs to survive: items like who your employees are, who your customers are (and how to get in touch with them), which products you offer, and the suppliers who provide the inputs that make them possible. This master data tends to change slowly over time, but should be carefully controlled by processes and guidelines, as well as metrics that provide insight into its evolution.
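As a minimal sketch of what "carefully controlled, slowly changing" can mean in practice (the entity, fields, and update rule below are illustrative examples, not a standard), master data is often versioned rather than overwritten, so every change leaves an audit trail:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass(frozen=True)
class SupplierRecord:
    """One version of a supplier master data record.

    Master data changes slowly, so instead of overwriting a record,
    each change closes the current version and adds a new one.
    """
    supplier_id: str                  # stable business key, never reused
    name: str
    contact_email: str
    valid_from: date
    valid_to: Optional[date] = None   # None = current version

def update_supplier(history: list[SupplierRecord],
                    changes: dict, effective: date) -> list[SupplierRecord]:
    """Close the current version and append a new one (a simple
    'slowly changing dimension' pattern)."""
    current = history[-1]
    closed = SupplierRecord(current.supplier_id, current.name,
                            current.contact_email,
                            current.valid_from, valid_to=effective)
    new = SupplierRecord(current.supplier_id,
                         changes.get("name", current.name),
                         changes.get("contact_email", current.contact_email),
                         valid_from=effective)
    return history[:-1] + [closed, new]
```

Because old versions are retained with validity dates, the history itself becomes the metric: you can see how often and when each record changed.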

Improving product and process quality requires that organizations make data-driven decisions about when, how much, and how often to adjust aspects of operations. This means data to make those decisions has to be available and accurate. Unfortunately, information is often siloed, living in Word docs and Excel files, and although that data might be on a publicly accessible network, finding it and knowing whether it will meet your needs can be impossible.

The next step is establishing systems of record: software systems intended to be a “single source of truth” for one or more collections of master data. Busting silos and encouraging collaboration facilitates systems integration and leads to better decisions. This way, your organization can save time, money, and effort while capturing valuable opportunities for growth and improvement.
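A toy illustration of why a system of record matters: when the same customer appears in several siloed files, a simple, agreed merge rule can produce one canonical version. (The field names and the "most recently updated non-empty value wins" rule here are assumptions for the example; a real consolidation would use whatever rules your organization agrees on.)

```python
from datetime import date

# The same customer, as recorded in three siloed spreadsheets.
silo_records = [
    {"customer_id": "C-42", "email": "jo@example.com",
     "phone": None, "last_updated": date(2019, 3, 1)},
    {"customer_id": "C-42", "email": "jo.smith@example.com",
     "phone": "555-0134", "last_updated": date(2021, 7, 15)},
    {"customer_id": "C-42", "email": None,
     "phone": "555-0199", "last_updated": date(2020, 1, 9)},
]

def merge_to_system_of_record(records):
    """Build one canonical record: take each field from the most
    recently updated source record that actually has a value."""
    ordered = sorted(records, key=lambda r: r["last_updated"], reverse=True)
    canonical = {"customer_id": ordered[0]["customer_id"]}
    for field_name in ("email", "phone"):
        canonical[field_name] = next(
            (r[field_name] for r in ordered if r[field_name] is not None),
            None)
    return canonical
```

Once the canonical record lives in one place, downstream systems read from it instead of from the spreadsheets, which is what turns a merge script into a system of record.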

Finally, a solid plan for data governance (strategic, high-level planning and control for data management tasks) is necessary to lock in the benefits from master data and systems of record. According to the Data Management Association (DAMA), a data governance framework describes “the exercise of authority, control, and shared decision making (planning, monitoring, and enforcement) over the management of data assets.” A data governance framework is a quality management system for your data, helping your organization keep its data and information assets in order. Establishing a data governance framework is an essential part of planning for Quality 4.0, as well as for large enterprises or other organizations that are drowning in data.
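In code terms, one small piece of that "monitoring" activity is automated data quality measurement: scoring records against agreed standards and flagging datasets that fall below a policy threshold. The completeness rule and the 90% threshold below are arbitrary examples, not a DAMA prescription:

```python
def completeness(records, required_fields):
    """Fraction of records in which every required field is populated.
    Completeness is one common data quality dimension a governance
    program might monitor; others include accuracy and timeliness."""
    if not records:
        return 1.0
    populated = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields))
    return populated / len(records)

customers = [
    {"customer_id": "C-1", "email": "a@example.com", "phone": "555-0101"},
    {"customer_id": "C-2", "email": "", "phone": "555-0102"},
    {"customer_id": "C-3", "email": "c@example.com", "phone": None},
]

score = completeness(customers, ["customer_id", "email", "phone"])
# Illustrative governance policy: flag the dataset if completeness < 90%.
needs_review = score < 0.90
```

The value of the governance framework is not the check itself but the shared agreement behind it: who defines "required," who sees the metric, and who is accountable for fixing the gaps.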

References

Khatri, V., & Brown, C. V. (2010). Designing data governance. Communications of the ACM, 53(1), 148-152.

McKinsey Global Institute. (2015, December). Digital America: A Tale of the Haves and Have-Mores. Available from https://www.mckinsey.com/~/media/McKinsey/Industries/High%20Tech/Our%20Insights/Digital%20America%20A%20tale%20of%20the%20haves%20and%20have%20mores/Digital%20America%20Full%20Report%20December%202015.ashx

Radziwill, N. M. (2018, October 5). Your Data is Your Most Valuable Asset: Getting Started with Quality 4.0. Intelex Blog. Available from https://blog.intelex.com/2018/10/05/data-valuable-asset-getting-started-quality-4-0/

Shah, A. (2018, May 17). Factory Workers Become Coders as Companies Automate. Wall Street Journal. Available from https://www.wsj.com/articles/factory-workers-become-coders-as-companies-automate-11558085401


About Nicole Radziwill

Nicole Radziwill is a quality manager and data scientist with more than 20 years leadership experience in software, telecommunications, research infrastructure, and higher education. Prior to joining Intelex, she was an associate professor of data science and production systems at James Madison University, Assistant Director for End to End Operations at the National Radio Astronomy Observatory (NRAO), managed software product development for the Green Bank Observatory (GBO), and managed client engagements for Nortel Networks and Clarify (CRM). She is an ASQ-certified manager of operational excellence (CMQ/OE), an ASQ-certified Six Sigma Black Belt (CSSBB), and contributed to the development of ISO 26000—“Guidance on Social Responsibility.”
