Industry 4.0: Next-Gen Manufacturing Fueled By Advanced Analytics
Manufacturing is changing
What will human-machine interactions look like in the context of this new manufacturing paradigm?
From mass production to re-configurable manufacturing
The manufacturing industry is moving from 'push' to 'pull' business models, from unified to modular products and services, with customers and data at the centre.
Mass manufacturing introduced cost-focused strategies and the notion of interchangeable parts.
Lean Manufacturing drove improvements in operations and quality, with a view to reducing non-value-added activities.
Flexible manufacturing took advantage of IT-driven automation to enable production of a variable mix of products.
Re-configurable manufacturing takes all of the above to the next level: it combines the flexibility and adaptability of production systems, optimises adjustable production resources across all functions, and opens the door to new product and service platform strategies.
More regional, modular and personalised products
Today, flexibility means producing customised products of high quality that can be delivered quickly. This creates new opportunities for product regionalisation, more modularisation and more personalised offerings (both products and services) - which attract new customers and drive competitive advantage (Koren, 2010). Most of this is underpinned by increasingly complex and large volumes of business data that cannot be computed and interpreted in traditional ways. This is especially true when engineering and manufacturing complex products.
Data is the new oil
Manufacturers are now looking to make better and faster informed decisions, and to build feedback loops driven by data evidence. Big Data technologies can help them analyse and make sense of large volumes of data from a variety of sources, in different formats, serving different purposes, across multiple functions and organisational silos - all while delivering the analysis at greater velocity.
Big Data technologies include production reporting tools, supply chain analytics, distributed storage and distributed processing. They are enabled by machine learning solutions that leverage descriptive, diagnostic and predictive analytics, shop-floor automation and real-time data optimisation.
Predictive analytics is not just about understanding why you lost a customer, but about preventing you from losing one before it happens.
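To make that idea concrete, here is a minimal sketch of churn prediction in plain Python - a logistic regression trained by gradient descent on a hypothetical, illustrative dataset of customer activity (monthly orders and support tickets are assumed features, not from the original article):

```python
import math

# Hypothetical historical data: (monthly_orders, support_tickets) per customer,
# labelled 1 if the customer eventually churned, 0 if they stayed.
history = [
    ((12, 0), 0), ((10, 1), 0), ((11, 0), 0), ((9, 2), 0),
    ((3, 5), 1),  ((2, 6), 1),  ((4, 4), 1),  ((1, 7), 1),
]

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, lr=0.1, epochs=2000):
    """Fit a logistic-regression churn model with plain stochastic gradient descent."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), y in data:
            p = sigmoid(w[0] * x1 + w[1] * x2 + b)
            err = p - y  # gradient of the log-loss with respect to the logit
            w[0] -= lr * err * x1
            w[1] -= lr * err * x2
            b -= lr * err
    return w, b

def churn_risk(model, x1, x2):
    """Predicted probability that a customer with this profile will churn."""
    w, b = model
    return sigmoid(w[0] * x1 + w[1] * x2 + b)

model = train(history)
# A customer with few orders and many tickets scores a high churn probability,
# flagging them for intervention before they actually leave.
at_risk = churn_risk(model, 3, 5)
healthy = churn_risk(model, 11, 0)
```

The point is the direction of the question: rather than explaining a past loss, the model scores current customers so the business can act first.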
The analytics value chain
Analytics is the discovery and communication of meaningful patterns in data across all functional silos. It’s not just the data that matters, but also the signals in and across the data. Predictive models summarise large quantities of data to amplify its value. It is all about having the right people and the right models to make sense of data and leverage it to its full potential.
Predictive modeling mathematically represents underlying relationships in historical data in order to explain the data and make predictions, forecasts or classifications about future events. It is based on a combination of statistical analysis, historical review, pattern recognition, linear and non-linear mathematics, risk detection, deep learning and advanced neural networks. The value chain for predictive modeling relies on machine-to-machine communication, machine learning and data science, and includes instrumentation, data log capture, data storage, transformation and preparation, model development, application and deployment.
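In its simplest form, the idea above can be sketched in a few lines: fit a model to historical observations, then extrapolate to forecast a future event. The example below uses ordinary least squares on hypothetical tool-wear readings (the machine-hours, wear values and 0.8 mm limit are illustrative assumptions, not figures from the article):

```python
# Hypothetical historical observations: tool wear (mm) recorded at machine-hours.
hours = [100, 200, 300, 400, 500]
wear  = [0.11, 0.19, 0.32, 0.40, 0.51]

def fit_line(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

def predict(model, x):
    slope, intercept = model
    return slope * x + intercept

model = fit_line(hours, wear)
slope, intercept = model

# Forecast wear at 800 machine-hours, and the hour at which an assumed
# 0.8 mm wear limit would be reached - the trigger for a maintenance alert.
forecast = predict(model, 800)
limit_hour = (0.8 - intercept) / slope
```

Real predictive models in manufacturing are far richer than a straight line, but the value chain is the same: instrument, capture, prepare, fit, then deploy the prediction where a decision is made.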
Toward automated insight, real-time optimisation and artificial intelligence
Everyone would agree that asking questions of data is only effective when you know the right questions to ask.
Just because organisations have a lot of data, it doesn’t mean they’re doing a good job of acting on it.
Converting data into information, then knowledge, and finally into an informed and effective decision or action is the holy grail of analytics. Predictive analytics requires reverse-engineering the decisions that could or would need to be made, then searching for data evidence to support them. However, delivering real-time actionable intelligence and optimisation is not easy.
Closed-loop performance systems that deliver continuous innovation and insight are tricky to build and maintain - especially given the changing nature of the data and its growing number of potential applications. The modern data scientist needs data from all over the place. The typical analyst spends more than 50% of their time chasing data, which slows the delivery of analytic insights and limits the time available for thorough analysis.
Some refer to this conundrum as 'the data problem'. Every step towards solving it is likely to contribute to creating smart operations - and, a fortiori, to enabling smart manufacturing.
What are your thoughts?
Koren, Y. (2010). The Global Manufacturing Revolution. John Wiley & Sons.
This article was originally posted on LinkedIn.