Micro-activity recognition in industrial assembly process with IMU data and deep learning
Language of the title:
English
Original book title:
PETRA '22: Proceedings of the 15th International Conference on PErvasive Technologies Related to Assistive Environments
Original abstract:
Automated understanding of work steps in industrial assembly is important for assistive guidance technologies in employee-machine collaboration. Our aim is to identify micro-activities of employees during the assembly of automated teller machines (ATMs), in order to assist them in their daily complex tasks using mobile wearable devices and hand-operated tools. Forgotten or incorrectly installed parts and missing or non-tightened screws during assembly, which are expensive and time-consuming to repair, are common mistakes addressed by this approach. This paper focuses on the seamless embedding of non-impeding Inertial Measurement Units (IMUs), worn on the body or integrated into tools and devices, allowing unobstructed monitoring of tool usage patterns and thereby understanding the activities that occurred and recognizing the assembly work steps. The hypothesis is that a system capable of high-level detection of micro-activities on an assembly line, utilizing IMUs and neural networks, will (i) reduce the error rate in the final product, (ii) assist workers in real-time scenarios by performing quality control, and (iii) understand the stages of the assembly workflow. The results of this study are evidenced with empirical observations of work-step executions for (i) hand screwing, (ii) screwdriver screwing, (iii) machine screwing, and (iv) wrench screwing, with the null class being disproportionately dominant in the data set. Deep learning models including Long Short-Term Memory (LSTM) and Convolutional Neural Network (CNN) architectures are evaluated, and the challenges encountered during our research and experiments are presented. The classification performance of our experiments is documented, and in the final step a recognition accuracy of 91.19% is achieved using a CNN.
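The abstract's pipeline — windowed IMU streams fed to a CNN that classifies each segment into one of the four screwing modes or the null class — can be sketched minimally as follows. This is not the authors' code: the 6-axis channel count, window length, stride, layer sizes, and class names are illustrative assumptions, and the forward pass uses random (untrained) weights purely to show the data shapes involved.

```python
import numpy as np

# Hypothetical class set: four screwing modes plus the dominant null class.
CLASSES = ["null", "hand", "screwdriver", "machine", "wrench"]

def sliding_windows(signal, win=100, step=50):
    """Split a (T, channels) IMU stream into overlapping fixed-length windows."""
    return np.stack([signal[i:i + win]
                     for i in range(0, len(signal) - win + 1, step)])

def conv1d_relu(x, kernels):
    """Valid-mode 1D convolution over the time axis, followed by ReLU.
    x: (win, in_ch), kernels: (k, in_ch, out_ch) -> (win - k + 1, out_ch)."""
    k = kernels.shape[0]
    out = np.stack([np.tensordot(x[t:t + k], kernels, axes=([0, 1], [0, 1]))
                    for t in range(x.shape[0] - k + 1)])
    return np.maximum(out, 0.0)

def classify(window, kernels, w_out):
    """Toy CNN forward pass: conv -> global average pooling -> softmax."""
    h = conv1d_relu(window, kernels)   # temporal feature maps
    pooled = h.mean(axis=0)            # global average pooling over time
    logits = pooled @ w_out            # linear classifier head
    p = np.exp(logits - logits.max())  # numerically stable softmax
    return p / p.sum()

rng = np.random.default_rng(0)
stream = rng.normal(size=(400, 6))           # 6-axis IMU: accel + gyro
wins = sliding_windows(stream)               # -> (7, 100, 6)
kernels = rng.normal(size=(5, 6, 8)) * 0.1   # 8 filters of width 5
w_out = rng.normal(size=(8, len(CLASSES))) * 0.1
probs = classify(wins[0], kernels, w_out)    # probability per class
print(probs.shape)                           # (5,)
```

In a real system the kernels and head weights would of course be trained, and the heavy null-class imbalance mentioned in the abstract would typically be countered with class weighting or resampling during training.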