Never Lose Data Again

07/09/2020

WZL Successfully Implements “Big Data Lake” with Semantic Data Management for Production Machines at the Edge


Every manufacturing facility, whether a small or medium-sized enterprise (SME), a large company or a research institution, has most likely already faced the question of which data is relevant and therefore needs to be collected. At the Chair of Manufacturing Technology of the Laboratory for Machine Tools and Production Engineering (WZL) of RWTH Aachen University, answering this question has now become remarkably simple. From now on, the answer is: all of it!

With the implementation of the central “Big Data Lake” concept, the first step of the digitization roadmap of the Chair of Manufacturing Technology of the WZL was successfully completed.

“Never lose a single data point of your own production again, whether from the machine, the tools or quality control,”

was the vision, states Prof. Thomas Bergs, Director of the WZL and head of the Chair of Manufacturing Technology.

“With the Big Data Lake concept we have now succeeded in doing this. After minimal pre-processing, production data is centrally persisted, in as raw a form as possible, on an easily scalable, distributed file system in our own network. In this way, we will be able to access valid historical data in the future, when research questions change or new perspectives on the data become relevant,” continues Dr. Daniel Trauth, CDO and Chief Engineer for Digital Transformation at the Chair of Manufacturing Technology.
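To illustrate the idea of persisting raw production data on a distributed file system, the following Python sketch writes one sensor reading to HDFS via the WebHDFS interface of the `hdfs` client library. It is a minimal example, not the WZL pipeline; the namenode address, user, path layout and payload format are assumptions.

```python
import json
import time

from hdfs import InsecureClient  # HdfsCLI: WebHDFS client for Python

# Hypothetical namenode address and user -- adjust to the actual cluster.
client = InsecureClient("http://namenode.example:9870", user="datalake")

def persist_raw_reading(machine_id: str, sensor_id: str, reading: dict) -> str:
    """Append one raw sensor reading, unchanged except for a timestamp,
    to a machine-specific path on the distributed file system."""
    timestamp = time.time()
    path = f"/datalake/raw/{machine_id}/{sensor_id}/{int(timestamp)}.json"
    payload = json.dumps({"ts": timestamp, "data": reading}).encode("utf-8")
    with client.write(path, overwrite=False) as writer:
        writer.write(payload)
    return path

# Example: persist a single force measurement from a hypothetical press.
persist_raw_reading("press_01", "force_z", {"value_kN": 312.4, "unit": "kN"})
```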

“Two problems of common manufacturing data are solved simultaneously with the Big Data Lake concept,” explains Joachim Stanke, Senior Solution Architect at the chair. “On the one hand, large amounts of data can be stored permanently in raw format, making them a perfect data basis for training extremely precise AI algorithms. On the other hand, these trained AI algorithms can in turn be applied to the fast data streams of manufacturing machines in order to derive approximately real-time decisions for processes, machines or peripherals.”
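The split Stanke describes corresponds to a classic lambda architecture: every reading is archived unchanged for batch training, while a speed layer applies an already trained model to the live stream. The sketch below illustrates that routing in plain Python; the threshold model and record format are illustrative assumptions, not the chair's actual algorithms.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class LambdaRouter:
    """Route each incoming reading into both layers of a lambda architecture:
    the batch layer keeps the raw record, the speed layer reacts immediately."""
    archive: Callable[[dict], None]   # batch layer: persist the raw record
    decide: Callable[[dict], str]     # speed layer: near real-time decision
    decisions: List[str] = field(default_factory=list)

    def ingest(self, record: dict) -> None:
        self.archive(record)          # raw data is never discarded
        self.decisions.append(self.decide(record))

# Illustrative speed-layer model: a simple threshold standing in for an
# AI algorithm that was trained offline on the batch layer.
def threshold_decision(record: dict) -> str:
    return "stop_feed" if record["force_kN"] > 350.0 else "continue"

router = LambdaRouter(archive=lambda r: None, decide=threshold_decision)
router.ingest({"machine": "press_01", "force_kN": 362.1})
print(router.decisions)  # ['stop_feed']
```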

Image: Insights into the scalable “Big Data Lake” infrastructure, incl. 5G infrastructure (Copyright: © WZL)

Stable Hardware and Metadata Information

On the hardware side, the WZL relies on stable and reliable hardware from Dell Technologies. The relationship between Dell Technologies and the WZL goes beyond that of supplier and customer: for years, the two have jointly been pushing the limits of storing, processing and analyzing large amounts of data.

On the software side, a lambda architecture based on the Apache Hadoop family is used, complemented by semantic data management from HotSprings GmbH. The semantic data management guarantees the complete and precise enrichment of the production data with decisive meta-information: an artificial intelligence (AI) recognizes typical data patterns already during data acquisition and suggests correlations and contextual information. “In this way, important relationships will remain traceable and reconstructable in the future, even if the current generation of employees is no longer at the institute,” says Dr. Max Haberstroh, CEO of HotSprings GmbH.
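HotSprings' semantic data management itself is not public, but the core idea of enriching raw data with metadata at acquisition time can be sketched as follows. The schema fields and the naive variance heuristic standing in for AI-based pattern suggestions are purely illustrative assumptions.

```python
import statistics
import time
from dataclasses import dataclass, field

@dataclass
class EnrichedRecord:
    """A raw signal bundled with the metadata that keeps it interpretable later."""
    values: list
    metadata: dict = field(default_factory=dict)

def enrich(values: list, machine: str, sensor: str, unit: str, operator: str) -> EnrichedRecord:
    """Attach descriptive metadata at acquisition time and add a simple,
    automatically derived hint (a stand-in for AI-based pattern suggestions)."""
    meta = {
        "machine": machine,
        "sensor": sensor,
        "unit": unit,
        "operator": operator,
        "acquired_at": time.time(),
        # Naive 'pattern recognition': flag unusually noisy signals so a human
        # (or a downstream model) is pointed at a possible correlation to check.
        "suggested_flag": "high_variance" if statistics.pstdev(values) > 10.0 else "nominal",
    }
    return EnrichedRecord(values=values, metadata=meta)

record = enrich([311.8, 312.4, 345.9], machine="press_01",
                sensor="force_z", unit="kN", operator="schmidt")
print(record.metadata["suggested_flag"])  # 'high_variance'
```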

As a next step, the central “Big Data Lake” concept will be supplemented by a decentralized edge computing network, which will allow process monitoring and data analysis to be carried out quickly and efficiently directly at the production machines. For computationally intensive tasks, the edge devices can then access the central “Big Data Lake”. This will be followed by the implementation of a WZL Machine Cloud, which, as a combined edge and cloud platform, enables data exchange across the various WZL locations and stakeholders.
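A hedged sketch of the planned edge/cloud split is shown below: lightweight monitoring runs locally on the edge device, and only computationally intensive jobs are handed to the central data lake. The endpoint URL, job format and threshold are hypothetical.

```python
import json
import urllib.request

CENTRAL_DATALAKE_URL = "http://datalake.example/jobs"  # hypothetical endpoint

def monitor_on_edge(window: list, limit: float = 350.0) -> bool:
    """Cheap, local process monitoring that runs directly on the edge device."""
    return max(window) <= limit

def offload_to_datalake(job: dict) -> None:
    """Hand a computationally intensive task (e.g. retraining a model) to the
    central 'Big Data Lake' cluster instead of running it on the edge."""
    request = urllib.request.Request(
        CENTRAL_DATALAKE_URL,
        data=json.dumps(job).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(request)

window = [311.8, 312.4, 362.1]
if not monitor_on_edge(window):
    offload_to_datalake({"task": "retrain_anomaly_model", "machine": "press_01"})
```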

Secured by a blockchain, the data integrity and data sovereignty of the originators can be guaranteed at all times, in a GAIA-X-compatible manner. “What seemed impossible until now will then be commonplace: the joint development of AI algorithms on different data sets from different stakeholders for maximum effectiveness in manufacturing supply chains,” says Prof. Thomas Bergs.
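The integrity guarantee rests on the basic blockchain principle of chaining cryptographic hashes, so that any later modification of a record becomes detectable. The minimal Python sketch below shows only this principle; it does not reproduce the actual GAIA-X-compatible infrastructure, and the record fields are invented for illustration.

```python
import hashlib
import json

def block_hash(record: dict, previous_hash: str) -> str:
    """Hash a data record together with its predecessor's hash, so tampering
    with any earlier block changes every later hash."""
    payload = json.dumps({"record": record, "prev": previous_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def build_chain(records: list) -> list:
    chain, prev = [], "0" * 64  # genesis hash
    for record in records:
        prev = block_hash(record, prev)
        chain.append({"record": record, "hash": prev})
    return chain

def verify_chain(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block_hash(block["record"], prev) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = build_chain([{"machine": "press_01", "force_kN": 312.4},
                     {"machine": "press_01", "force_kN": 362.1}])
print(verify_chain(chain))             # True
chain[0]["record"]["force_kN"] = 1.0
print(verify_chain(chain))             # False: tampering is detected
```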