Big Data plays (and will increasingly play) a fundamental role among the assets of the manufacturing industry.
For example, if you like to cook, you know how important it is to respect the quantities, times and temperatures required to guarantee a satisfying dish. If you have ever tried to make a pizza, you will have learned that the right mix of flour, water, yeast and salt, together with the oven temperature and cooking time, are all parameters that decisively influence the final result.
This is also true for any manufacturing process: whether you are baking a pizza or making an industrial product, the key process parameters must be controlled in detail for essentially two reasons:
- to ensure compliance with the final requirements
- to minimize the variability within the processes
To achieve high-quality processes it is therefore necessary:
- to identify the critical parameters
- to collect them efficiently and in real time
- to keep them under control
- to react quickly in case of process deviations
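As a minimal illustration, the last three steps above can be sketched in a few lines of code. The parameter names and specification limits below are purely hypothetical, chosen to echo the pizza example:

```python
# Sketch: check hypothetical process parameters against specification limits
# and report deviations. Names and limits are illustrative, not from a real process.

SPEC_LIMITS = {
    "oven_temperature_C": (240.0, 260.0),
    "baking_time_s": (540.0, 660.0),
    "dough_hydration_pct": (58.0, 65.0),
}

def check_parameters(reading: dict) -> list[str]:
    """Return a message for every parameter that falls outside its limits."""
    deviations = []
    for name, value in reading.items():
        low, high = SPEC_LIMITS[name]
        if not (low <= value <= high):
            deviations.append(f"{name}={value} outside [{low}, {high}]")
    return deviations

# A reading with one out-of-spec parameter: the oven is too hot.
reading = {"oven_temperature_C": 268.0, "baking_time_s": 600.0, "dough_hydration_pct": 61.0}
print(check_parameters(reading))
```

In a real plant the limits would come from the process specification, and a non-empty result would trigger the "react quickly" step (an alarm, a line stop, a corrective action).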
It is clear how fundamental it is to establish a data culture within companies, especially now that digital technologies make it possible to collect large amounts of data and analyze them at a time and cost accessible to organizations of any size, especially SMEs.
Implementing an effective data collection system
Data collection, especially in digital format, has become an increasingly consolidated practice. Although it may seem straightforward, this process is not always free of difficulties: the complexity of industrial systems, the variety of network protocols and, in many cases, the absence of standards mean that it requires time, investment and skills that are often external to the company.
However, it is important to note that not all data are equally important. It would be inefficient to collect a huge amount of data, and to build expensive infrastructure to hold it, if that data did not help make the company more competitive. For this reason, before investing time and money, it is essential to identify the fundamental data that must be collected.
Once the what is defined, the how must be chosen. Let’s forget paper: it would sit on our desks waiting for manual data entry into Excel! It is much better to implement a digital infrastructure that makes the data immediately usable for subsequent analyses. In this sense, the company will have to structure a corporate database on which the various applications (business analytics, ERP, MES, etc.) will draw.
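To make the idea of a corporate database concrete, here is a minimal sketch of recording process readings so that downstream tools can query them. SQLite stands in for the real corporate database, and the table and column names are hypothetical:

```python
import sqlite3
from datetime import datetime, timezone

# Sketch: store process readings in a database so that downstream applications
# (business analytics, ERP, MES) can query them. SQLite is used here as a
# stand-in for the real corporate database; table and column names are hypothetical.

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE process_readings (
        recorded_at TEXT NOT NULL,
        machine_id  TEXT NOT NULL,
        parameter   TEXT NOT NULL,
        value       REAL NOT NULL
    )
""")

def record_reading(machine_id: str, parameter: str, value: float) -> None:
    """Insert one timestamped reading into the database."""
    conn.execute(
        "INSERT INTO process_readings VALUES (?, ?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), machine_id, parameter, value),
    )

record_reading("oven-01", "temperature_C", 252.4)
record_reading("oven-01", "temperature_C", 249.8)

# A downstream application can now query aggregated values.
rows = conn.execute(
    "SELECT parameter, AVG(value) FROM process_readings GROUP BY parameter"
).fetchall()
print(rows)
```

The point is not the specific technology but the shape of the flow: machines write structured, timestamped records into one shared store, and every analysis tool reads from that store instead of from scattered spreadsheets.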
“Okay, but what do we do with all this data?” “Should I spend money to collect the data? In the end, the customer doesn’t ask for it!”
These questions, which I am often asked, are common among those unfamiliar with processes. As mentioned above, processes depend on parameters, and we cannot control processes without knowing those parameters, preferably measured promptly and in real time.
“You can’t improve what you don’t know” is my answer.
Data Analytics must be entrusted to professionals capable of extracting value-added information through analysis. Data scientists do not necessarily know industrial processes (although that would certainly help), but they have the mathematical, and often also the computing, skills necessary for the purpose.
In the industrial field these skills are mostly statistical: tools such as linear regression or hypothesis tests, together with the choice of a representative sample of adequate size, make it possible to identify the causes of process deviations on an objective basis and at a fixed confidence level (usually 95%).
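As a toy example of the linear regression mentioned above, the following sketch fits a least-squares line relating a process parameter to a quality score. The data points and variable names are invented for illustration only:

```python
from statistics import mean

# Sketch: fit a least-squares line y = slope * x + intercept relating a
# hypothetical process parameter (oven temperature) to a quality score.
# The data points are invented for illustration only.

temps  = [245.0, 250.0, 255.0, 260.0, 265.0]   # hypothetical parameter values
scores = [6.8, 7.4, 8.1, 8.7, 9.2]             # hypothetical quality scores

def least_squares(x, y):
    """Return (slope, intercept) of the least-squares fit of y on x."""
    x_bar, y_bar = mean(x), mean(y)
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, y_bar - slope * x_bar

slope, intercept = least_squares(temps, scores)
print(f"quality changes by {slope:.3f} points per degree")
```

A data scientist would go further, testing whether the slope is significantly different from zero at the chosen confidence level before concluding that the parameter really drives quality.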
Extracting information in real time
Artificial Intelligence, i.e. the implementation of algorithms for real-time analysis of big data, can nowadays perform many of the activities carried out by data scientists, with obvious advantages…
- …in terms of cost: a non-recurring cost that is overall lower than the recurring cost of data scientists
- …in terms of time: the analysis takes place essentially in real time, whereas data scientists obviously need more time to carry out their analyses
There are several solutions on the market that can help companies implement a robust system for real-time data analysis.
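A very simple instance of such real-time analysis is flagging readings in a data stream that drift far from the recent average (a classic 3-sigma rule). The window size and data below are illustrative choices, not a production algorithm:

```python
from collections import deque
from statistics import mean, stdev

# Sketch: flag readings in a stream that fall more than n_sigma standard
# deviations from the recent average (a simple 3-sigma rule). Window size
# and data are illustrative, not a production-grade detector.

def detect_anomalies(stream, window=10, n_sigma=3.0):
    """Yield (index, value) for readings outside mean +/- n_sigma * stdev."""
    recent = deque(maxlen=window)
    for i, value in enumerate(stream):
        if len(recent) >= 3:  # wait for a few points before testing
            m, s = mean(recent), stdev(recent)
            if s > 0 and abs(value - m) > n_sigma * s:
                yield i, value
                continue  # keep the outlier out of the reference window
        recent.append(value)

# A stable temperature stream with one spike at index 6.
readings = [250.1, 250.3, 249.8, 250.0, 250.2, 249.9, 263.5, 250.1, 250.0]
print(list(detect_anomalies(readings)))
```

Because each reading is tested the moment it arrives, the deviation is caught immediately rather than in a batch report days later, which is exactly the time advantage described above.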
Ensuring the exchange of information to all interested parties
Making sure that value-added information is shared with interested parties is the last step in capitalizing on the data collected. After all, what is the use of the data if the information does not quickly reach the person in charge of the process or of continuous improvement? In this sense, it will be necessary to build an adequate infrastructure and corporate IT system by integrating the various software packages (ERP, PLM and MES above all), which can run on company servers or in the cloud, now an increasingly common solution that offers significant advantages.
Finally, it will be necessary to ensure that the exchange of information, and therefore of data, takes place securely: all cybersecurity precautions must be taken to prevent this information from being stolen, or the system from being attacked and rendered non-functional.
Where to start?
In this short post we have highlighted the growing importance of data at all company levels, especially in production contexts. Data must be considered a real company asset, so it is important:
- to define which data to collect
- to plan an efficient data collection system
- to have an adequate data analysis system
- to ensure that the information extracted is shared with all interested parties
- to guarantee an adequate level of cybersecurity
If you want to implement an effective data collection and analysis system, we recommend having a look at the following courses in our catalogue first:
- Data Analytics Using Python
- Industry 4.0 Masterclass
- The 9 key technologies of Industry 4.0
- Diploma in C++ and Python Programming
Finally, don’t forget to subscribe to our newsletter to stay up to date on our services.