From a display in Maine’s Owls Head Transportation Museum:
“That the motor truck is an excellent substitute for the horse has been proven in every instance where business men have given it a fair trial. But the man who uses his motor truck simply as a substitute for horses neglects to make the most of his opportunities. The horse is not a machine—five to six hours’ actual work—fifteen to twenty-five miles—is its maximum day’s work. A motor truck can be used twenty-four hours a day if necessary, and it will travel the last hour and the hundredth mile just as fast as the first.
“Business men who are using the motor truck in place of horse and wagon equipment with the greatest success are men who have given this problem careful study. In most instances it was necessary to change the plan of routing—delays which were necessary to give the horses rest were eliminated—plans were laid to keep the truck busy the entire day with as few delays as possible…”
Are we reinventing the wheel?
What do you interpret from the advertisement above? It is a brilliant representation of our struggle with new technologies, deep learning or not. We think about any new technology in the context of existing technologies and processes. Winners in this decade will be the ones who develop a breed of managers who embrace a new technology to redefine processes and offerings, and who do not try to force-fit it onto existing processes.
Recently, I saw an article in MIT Sloan Management Review about an experiment by professors from Michigan and MIT. They compared the accuracy of a regression model predicting credit card customer acquisitions with that of a deep learning model and found the difference was only about 3 percentage points. Deep learning is far more computationally intensive than regression, so the question was: is it worth using deep learning for a mere 3-point improvement in accuracy? Probably not.
But as the authors note, deep learning may not be the right method to use here at all. You can read the article here: Deep Learning in Marketing Analytics. Cutting through the rhetoric in the article, my (admittedly "manipulated") interpretation of the message is:
A new advance should not be blindly applied to existing initiatives or challenges. Understand the most effective way to leverage the new capability.
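To make that trade-off concrete, here is a minimal sketch, not the authors' actual experiment: the data are synthetic, and the model and parameter choices are illustrative. It simply compares a plain logistic regression against a small neural network on a made-up customer-acquisition task and reports the accuracy gap.

```python
# Minimal sketch (not the study described above): compare a logistic
# regression with a small neural network on a synthetic "will this prospect
# become a customer?" classification task and look at the accuracy gap.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for a credit-card acquisition dataset
X, y = make_classification(n_samples=20_000, n_features=30, n_informative=10,
                           random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25,
                                                    random_state=0)

simple = LogisticRegression(max_iter=1000).fit(X_train, y_train)
deep = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=500,
                     random_state=0).fit(X_train, y_train)

acc_simple = accuracy_score(y_test, simple.predict(X_test))
acc_deep = accuracy_score(y_test, deep.predict(X_test))
print(f"Regression: {acc_simple:.3f}  Neural net: {acc_deep:.3f}  "
      f"Gap: {(acc_deep - acc_simple) * 100:.1f} pct points")
```

If the gap comes out to only a few percentage points, the extra training and tuning effort the neural network demands is hard to justify, which is exactly the point the article makes.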
Where to use DL in the supply chain context?
The article mentioned above focused on marketing analytics, so you may ask: where can (or should) we use deep learning in supply chains? Not in forecasting, is my suggestion, unless you have very complex forecasting challenges such as fashion apparel.
So where should you use deep learning? To change the paradigms of your business. To erase the constraint boundaries of your operating model and create new boundaries.
Can real-time predictive and prescriptive analytics on shipments moving through your supply chain provide you with an ability you did not have before?
Can extracting predictions from trillions of transactions happening across hundreds of systems in your network provide you with a capability that is a game changer?
Can you capture inventory images on your website and translate them into real-time insights into your warehouse efficiency? (A rough sketch of this idea follows these questions.)
What if an algorithm-driven setup took over setting and calculating the time-standard metrics of your warehouse?
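To make the inventory-image question above a little more tangible, here is a hedged sketch: it runs an off-the-shelf object detector over a shelf photo and counts confident detections as a crude stock-level signal. The image path, score threshold, and the idea of counting generic detections are all illustrative assumptions; a real deployment would fine-tune a model on your own SKU imagery, and this snippet assumes a recent torchvision that supports the weights argument.

```python
# Hedged sketch of the "images to warehouse insight" idea: run a pretrained
# object detector over a shelf photo and count confident detections as a
# rough stock-level signal. The file name and threshold are illustrative.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
detector.eval()

# Hypothetical image of a pick face or shelf section
img = convert_image_dtype(read_image("shelf_photo.jpg"), torch.float)
with torch.no_grad():
    prediction = detector([img])[0]

confident = (prediction["scores"] > 0.7).sum().item()
print(f"Detected roughly {confident} items in this shelf section")
```

Fed by cameras at receiving docks or pick faces, even a rough count like this can be compared against system inventory in near real time, which is the kind of capability manual cycle counts never gave you.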
The examples above are just some broad, high-level illustrations. See the appendix for more suggested applications I have recently experimented with. The idea, as far as leveraging deep learning in the supply chain goes, is:
Use deep learning to develop capabilities you did not have before: performing some form of analytics that was previously extremely challenging, automating a process that required cognitive (human) input, taking control of basic-to-intermediate supply chain planning decisions, or generating real-time, end-to-end supply chain insights across your network.
Do not introduce complexity into analytical or automation processes that do not need it.
How to leverage it is equally important.
That also ties back to the complexity question. Within the same process, you may need a more granular application of deep learning for one part, while the remainder can be simple automation.
Remember that deep learning is computationally intensive, so optimizing where you apply it across the network not only saves money but also minimizes latency.

