If you are well versed in Lean, and hence in the Toyota Production System (TPS), you know that signature Toyota tools such as the four-step rapid setup are engineering intensive and yield a high return on investment only in a repetitive manufacturing environment.
A Toyota press or machine tool produces only a dozen or so different part numbers over its lifetime – i.e., Toyota operates a highly repetitive manufacturing environment.
Force-fitting Lean principles as-is across all types of manufacturing environments and systems can therefore end up hurting the efficiency of manufacturing processes. Consequently, modifications have been made to adapt Lean principles to various manufacturing setups.
Since American manufacturers generally produce both new products and spare parts – i.e., they do both repetitive AND nonrepetitive manufacturing – they modified Lean methods and devised approaches like Lean Six Sigma.
But Lean Six Sigma has drawbacks as well. Let us consider an example.
The 20% data black box
Lean Six Sigma used Pareto analysis to find the repetitive 20% of part numbers that delivered 80% of revenue. The narrative was that if you made production of the parts behind 80% of revenue highly efficient, the remaining 20% would not move your numbers significantly. Companies embraced this methodology for years.
Thus, Lean Six Sigma neglected all the waste in the remaining 20% of revenue. And because Lean Six Sigma leveraged quantitative methods – and hence data – the data for that 20% was never examined either. Focusing on the top parts inadvertently turned most of the data for the remaining parts into dark data: data that is never looked at but could contain significant value.
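The Pareto cut described above is easy to see in a minimal sketch. The part numbers and revenue figures below are invented for illustration; the point is how few parts clear the 80%-of-revenue threshold, and how many are left as unexamined "dark data".

```python
# Rank part numbers by revenue, keep the parts that cover 80% of it,
# and count how many parts fall into the neglected tail.
# All part numbers and revenue figures are hypothetical.
parts = {
    "P-101": 500_000, "P-102": 300_000, "P-103": 120_000,
    "P-104": 40_000, "P-105": 20_000, "P-106": 12_000,
    "P-107": 5_000, "P-108": 2_000, "P-109": 800, "P-110": 200,
}

total = sum(parts.values())
running = 0
focus, tail = [], []
for part, revenue in sorted(parts.items(), key=lambda kv: kv[1], reverse=True):
    if running < 0.8 * total:
        focus.append(part)   # inside the 80%-of-revenue cut: gets analyzed
    else:
        tail.append(part)    # outside the cut: its data goes dark
    running += revenue

print(f"Analyzed ({len(focus)} parts): {focus}")
print(f"Unanalyzed 'dark data' tail ({len(tail)} parts): {tail}")
```

With these numbers, just two part numbers cover 80% of revenue, so eight of the ten parts – the bulk of the data – never enter the analysis at all.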
The data from that neglected long tail of parts contains valuable insights into many of the issues that plague manufacturing processes today. Many manufacturing strategies share common assets across parts, so no matter how small the revenue a part generates, it must be factored into the manufacturing process.
And if you are not capturing that data, and it is not available for analysis, you will never attain true optimization of your processes. Advances in technology both enable and require manufacturers to gain visibility into every data point in their manufacturing landscape.
Siloed value stream mapping
Another example is the siloed approach of value stream mapping.
Rather than evaluating cost and waste globally, Lean Six Sigma focused on value stream maps at the department level and had Black Belts remove local sources of waste.
As manufacturing environments become digitized, this siloed approach fails to capture the impact of interlinked processes and, in some cases, interfacing functions. Consider an example: the finished goods warehouse attached to a plant may be overflowing. Unless you look at interfacing processes such as transportation and customer management, a value stream map will not capture the challenge accurately.
AI is the Solution
Fortunately, the advent of easily accessible, exponentially growing computing power and the availability of cloud computing make the size of the data – and the time taken to process and analyze it – a non-issue.
Another critical aspect is the open-source availability of advanced Deep Learning tools whose analytics can significantly move many manufacturing metrics. I write frequently about applications of Deep Learning in manufacturing on this blog; in one post, I shared my perspective on how neural networks can significantly reduce setup times. Neural networks can process large amounts of structured and unstructured data, opening the door to another level of manufacturing analytics.
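To make the neural-network idea concrete, here is a minimal from-scratch sketch: a tiny one-hidden-layer network fit by stochastic gradient descent to predict setup time from two job features. The features (tool changes, a fixture-complexity score) and all data values are invented for illustration – a real setup-time model would be trained on captured shop-floor data with a proper framework.

```python
# Illustrative sketch only: a tiny one-hidden-layer neural network
# trained on invented setup-time data.
import math
import random

random.seed(0)

# Toy data (hypothetical): [tool_changes, fixture_complexity] -> setup minutes
X = [[1, 1], [2, 1], [2, 3], [4, 2], [5, 4], [6, 5]]
y = [12.0, 18.0, 26.0, 32.0, 48.0, 58.0]

H = 4  # hidden units
w1 = [[random.uniform(-0.5, 0.5) for _ in range(2)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 0.0

def forward(x):
    """tanh hidden layer, linear output."""
    h = [math.tanh(w[0] * x[0] + w[1] * x[1] + b) for w, b in zip(w1, b1)]
    return h, sum(wj * hj for wj, hj in zip(w2, h)) + b2

def mse():
    return sum((forward(x)[1] - t) ** 2 for x, t in zip(X, y)) / len(X)

loss_before = mse()
lr = 0.01
for _ in range(3000):  # stochastic gradient descent over the toy set
    for x, target in zip(X, y):
        h, pred = forward(x)
        err = pred - target
        # Backpropagate the squared-error gradient through both layers.
        for j in range(H):
            grad_h = err * w2[j] * (1 - h[j] ** 2)
            w1[j][0] -= lr * grad_h * x[0]
            w1[j][1] -= lr * grad_h * x[1]
            b1[j] -= lr * grad_h
            w2[j] -= lr * err * h[j]
        b2 -= lr * err

print(f"MSE before training: {loss_before:.1f}, after: {mse():.1f}")
```

The same pattern scales up: with enough captured data points per changeover, a larger network can learn which job characteristics drive setup time and flag where the biggest reductions are available.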
Manufacturing analytics is on the cusp of a revolution, powered by the right combination of people, processes, and technology finally being available. Leverage this perfect timing to innovate!

