Big data – from data to knowledge

  • January 13, 2016
  • infoteam Software AG
Besides the Internet of Things, handling big data is a basic pillar of Industry 4.0 in the manufacturing industry. Especially in older businesses, there are multitudes of data that cannot be analysed, such as measurement data from the control system or conventional manual inputs. If these data were standardised, stored in a long-term database and accumulated over time, real-time analyses and data-mining applications could be realised to generate knowledge and make predictions. This would minimise downtime and optimise production efficiency – even across several links in the production chain.
Although big data, data mining and data analysis have been appearing in the media for years, many companies are still not tapping the new potential.
This is primarily due to the long service lives of industrial manufacturing machines before they are modernised or replaced. As a result, many production facilities still retain outdated data structures that inhibit analysis.
An increasing number of machines can be networked with one another thanks to technical innovations. Database systems, which were previously reserved for ERP systems, are also seeing more and more frequent use in production. They allow measurement and production data to be stored securely and transparently – the basis for the first steps towards networked production. As soon as the data are in a structured data pool, the current situation can be analysed using dedicated applications adapted to the specific ways the user thinks and works.
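Such a structured data pool can be sketched in a few lines. The following example uses SQLite as a stand-in for a production database; the table layout, machine names and readings are illustrative assumptions, not a real schema:

```python
# Sketch: a minimal structured data pool for measurement data.
# SQLite stands in for a production database; the schema is an
# illustrative assumption, not a real control-system layout.
import sqlite3

conn = sqlite3.connect(":memory:")  # in-memory DB for the sketch
conn.execute("""
    CREATE TABLE measurements (
        machine_id  TEXT,
        recorded_at TEXT,   -- ISO 8601 timestamp
        quantity    TEXT,   -- e.g. 'temperature_c'
        value       REAL
    )
""")

# Standardised inserts: control-system readings and manual inputs
# end up in the same long-term structure.
rows = [
    ("press_01", "2016-01-13T08:00:00", "temperature_c", 71.2),
    ("press_01", "2016-01-13T08:01:00", "temperature_c", 71.8),
    ("press_02", "2016-01-13T08:00:00", "temperature_c", 69.5),
]
conn.executemany("INSERT INTO measurements VALUES (?, ?, ?, ?)", rows)
conn.commit()

# A planner-facing query: the latest reading per machine.
# (SQLite returns the non-aggregated 'value' from the MAX row.)
for machine, ts, value in conn.execute("""
    SELECT machine_id, MAX(recorded_at), value
    FROM measurements GROUP BY machine_id
"""):
    print(machine, ts, value)
```

Once all sources write into one such structure, dedicated applications only need to query it; they no longer have to parse each machine's proprietary format.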
This way, a planner with no in-depth IT knowledge can very quickly draw conclusions about the process or product directly from their work station. 
As the data are up to date, free from errors and readily accessible, the production process can be analysed over an extended period. In addition, identical process steps or machines can be compared with one another, even in real time if the data are sufficiently current.
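Such a comparison can be as simple as computing summary statistics per machine. The sketch below compares two nominally identical presses using Python's standard library; the machine names and cycle times are dummy data, not real measurements:

```python
# Sketch: comparing two identical process steps over the same period.
# The machine names and readings are illustrative dummy data.
import statistics

cycle_times_s = {
    "line_a_press": [12.1, 12.4, 12.0, 12.6, 12.2],
    "line_b_press": [12.9, 13.1, 13.4, 12.8, 13.0],
}

for machine, samples in cycle_times_s.items():
    print(machine,
          "mean:", round(statistics.mean(samples), 2),
          "stdev:", round(statistics.stdev(samples), 2))

# A consistently higher mean on one line flags that machine
# for closer inspection.
```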
From information to knowledge
Continuous analyses of the current situation lead to larger volumes of data which can then be used for more in-depth analyses. To this end, the data have to be transformed from a production database into an analysis database. Depending on the purpose of the analysis, there are various ways of processing the data:
  • Online analytical processing (OLAP) is ideal for quickly analysing large volumes of data or testing hypotheses.
  • Statistical analyses produce new knowledge about the internal structures of the data.
  • Learning and regression algorithms make forecasts about future measurements.
Expansions to the data structures
As soon as the advantages of the in-house production data pool have been exhausted, a factory can change its data management, for example by moving to cloud solutions. The content can also be expanded – in this case, the result is a data warehouse.