Assistance Systems for Sheet Metal Forming







Within the research initiative "SME 4.0 Competence Center", the Laboratory for Machine Tools and Production Engineering (WZL) has developed a portable demonstrator that makes the vision of Industry 4.0 tangible for sheet metal forming SMEs. Using the example of fineblanking, the Material Scanner (MatS) demonstrates how implicit process knowledge can be made visible and used for optimized process control by merging production and information technology.


The demonstrator consists of three components:

  • a test bench for the simulation of typical fineblanking phenomena (a)
  • a graphics processor-based computing unit for the central processing of measurement data (b)
  • a wireless graphical user interface for decentralized visualization of measurement data (c).

In the current state of development, no genuine cutting process is performed; instead, the sheet metal separation process is simulated. This is done with the help of a cutting impact simulator, which can be set to strike a sheet metal strip at intervals and thereby generates both a mechanical impulse (cutting impact) and an acoustic signal (cutting sound). Corresponding sensors detect the cutting impact and record the cutting sound, and the measurement data is transmitted wirelessly to the central GPU computing unit. In addition, selected material properties of the sheet metal strip are recorded and correlated with the cutting impact and sound. With the help of algorithms from the field of artificial intelligence, it is then possible to react to process anomalies in real time.
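The pairing of sensor signals with material properties described above can be sketched as follows. This is an illustrative example only, not WZL code: the record fields and function names are assumptions, and the signals are synthetically generated stand-ins for the impact and sound measurements.

```python
# Illustrative sketch: condensing one simulated cutting impact and its
# cutting sound into a measurement record for wireless transmission.
# All names (Measurement, extract_features) are hypothetical.
import math
from dataclasses import dataclass, asdict

@dataclass
class Measurement:
    peak_impact: float     # maximum amplitude of the mechanical impulse
    sound_rms: float       # root-mean-square energy of the cutting sound
    sheet_hardness: float  # a selected material property of the strip

def extract_features(impact: list, sound: list,
                     sheet_hardness: float) -> Measurement:
    """Condense raw sensor samples into the features that would be
    sent to the central GPU computing unit."""
    peak = max(abs(s) for s in impact)
    rms = math.sqrt(sum(s * s for s in sound) / len(sound))
    return Measurement(peak, rms, sheet_hardness)

# One simulated cutting impact (decaying pulse) and its sound signal
impact = [2.0 * math.exp(-0.5 * t) for t in range(20)]
sound = [0.3 * math.sin(0.8 * t) * math.exp(-0.1 * t) for t in range(100)]
record = extract_features(impact, sound, sheet_hardness=180.0)
print(asdict(record))
```

In a real deployment, a record like this would be serialized and transmitted wirelessly after each simulated cutting impact.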


Real-time data analysis using artificial neural networks

The second component and core of the demonstrator is the central graphics processor-based computing unit. There, an artificial neural network correlates material properties, process parameters and the above-mentioned perceptive measured variables, the cutting impact and sound. Google impressively demonstrated the possibilities of artificial neural networks in 2016, when a computer program beat a human opponent in the board game "Go" for the first time. The remarkable point, however, was not that this succeeded, but how it succeeded: the artificial neural network derived moves that had remained undiscovered by humans throughout the more than 2,000-year-old tradition of the game. Consequently, there was no answer to these moves.

Transferred to the fineblanking process, the aim is now to make implicit process relationships visible with the aid of perceptive measured variables such as the cutting impact and sound, combined with artificial neural networks. This enables process optimizations that were previously unknown. Current research at the WZL is developing a software platform based on artificial neural networks that can analyze sensor data streams in real time. Among other things, it will be possible to detect anomalies in the cutting impact and cutting sound in real time and to identify their causes. Causes and effects are then made visible via the wireless visualization.
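The principle of flagging anomalies in a stream of cutting-impact features can be illustrated with a deliberately simplified sketch. The WZL platform uses artificial neural networks; as a minimal stand-in, the example below uses a running z-score check (Welford's online algorithm for mean and variance), so that every assumption is visible in a few lines:

```python
# Simplified stand-in for streaming anomaly detection: a running
# z-score check on one feature (e.g. peak cutting impact). The real
# platform uses artificial neural networks; this only illustrates
# the idea of reacting to untypical values as they arrive.
class StreamAnomalyDetector:
    def __init__(self, threshold: float = 3.0):
        self.threshold = threshold
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # sum of squared deviations (Welford's algorithm)

    def update(self, x: float) -> bool:
        """Return True if x deviates anomalously from the values seen
        so far, then fold x into the running statistics."""
        anomaly = False
        if self.n >= 2:
            std = (self.m2 / (self.n - 1)) ** 0.5
            if std > 0 and abs(x - self.mean) / std > self.threshold:
                anomaly = True
        # Welford update of running mean and variance
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)
        return anomaly

detector = StreamAnomalyDetector()
normal = [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.1, 0.9]
flags = [detector.update(x) for x in normal]   # typical cutting impacts
spike = detector.update(5.0)                   # a sudden, untypical impact
print(flags, spike)
```

A neural-network-based detector would replace the z-score with a learned model of normal behavior, but the streaming structure (update per incoming measurement, immediate flag) stays the same.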

Informed anytime, anywhere thanks to wireless, decentralized visualization

A wireless, decentralized visualization is the third component of the demonstrator. An exemplary user interface demonstrates how a machine operator can be informed in real time about the causes and effects of process anomalies in the real process. The development of this user interface is also the subject of current research. Particular importance is attached to its ease of comprehension, so that the machine operator can obtain the necessary information at any time to guide the process optimally.
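How a detected anomaly might reach the operator's interface can be sketched as a small serialized alert. The field names and severity levels below are purely hypothetical; the source does not specify the actual message format:

```python
# Hypothetical sketch of an alert payload that a decentralized user
# interface might receive when an anomaly is detected. Field names
# and severity levels are illustrative assumptions.
import json

def build_alert(cause: str, effect: str, severity: str) -> str:
    """Serialize an anomaly alert for wireless transmission to the
    operator's graphical user interface."""
    return json.dumps({
        "type": "process_anomaly",
        "cause": cause,        # e.g. a deviating material property
        "effect": effect,      # e.g. an untypical cutting sound
        "severity": severity,  # e.g. "info" | "warning" | "critical"
    })

alert = build_alert("sheet hardness above tolerance",
                    "cutting sound amplitude increased", "warning")
print(alert)
```

Presenting cause and effect together in one message mirrors the article's goal of making both visible to the operator at a glance.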