
by Ralf Klamma (RWTH Aachen University), István Koren (RWTH Aachen University) and Matthias Jarke (Fraunhofer FIT and RWTH Aachen University)

The Horizon 2020 project WEKIT has created an industrial training platform, combining sensor technologies, artificial intelligence and mixed reality. The platform creates training materials on the fly and delivers them in a standardised manner.

With the digitisation of workplaces impacting many professions, industrial training needs to adapt accordingly. Wearables show enormous potential to help in this space, and we envisage three major shifts taking place. First, industrial training often includes training in manual labour processes. Quantitative digital ethnography based on big data from wearable devices could foster an understanding of these processes in digital workplaces. Second, once the big data are acquired, automated analysis will be needed. To achieve this, artificial intelligence and machine learning will enter industrial training. Finally, the interface between humans and machines will move away from flat computer screens towards more immersive forms of feedback in mixed reality environments.

The European H2020 project WEKIT (Wearable Experience for Knowledge Intensive Training, 2015-2019) aims to introduce wearables into industrial training scenarios. The project addresses three scenarios, shown in Figure 1: ground training for astronauts, training of aircraft maintenance personnel for Arctic rescue missions and training of medical personnel in the use of 4D ultrasound diagnostic devices.

Figure 1: Scenarios from the WEKIT project © WEKIT and Mikhail Fominykh.

From a technical perspective, the main outcome of the project is the WEKIT.ONE platform [1]. The platform consists of a self-developed hardware component: an array of sensors embedded in a custom-designed vest with an electronic board (PCB), connected to a Microsoft HoloLens and complemented by further sensors in mobile phones and wrist-worn devices.

The hardware is connected to a recorder and a player. The recorder enables the creation of learning materials by recording the sensor data of experts executing procedures for the training scenarios. The data are stored in a database and analysed by the WEKIT.ONE software. The player recognises the training situation either automatically or as instructed by a trainer. Subsequently, the player supports the training through a number of environmental augmentations delivered via an augmented reality head-mounted display, such as virtual traces on the ground leading from one station to the next, or “ghost hands” demonstrating the manipulation of devices, for example the cutting of a wire.
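Conceptually, the record-and-replay loop can be pictured as follows. The Python sketch below is purely illustrative: the class names, sensor labels and JSON storage format are assumptions made for exposition, not the WEKIT.ONE implementation.

```python
import json
import time
from dataclasses import dataclass, asdict

# Hypothetical sketch of the record-and-replay idea behind WEKIT.ONE;
# all names and structures here are illustrative, not the platform's API.

@dataclass
class SensorSample:
    timestamp: float  # seconds since the recording started
    sensor: str       # e.g. "imu_wrist", "gaze", "hand_pose"
    values: list      # raw sensor readings

class Recorder:
    """Captures timestamped sensor samples while an expert performs a task."""
    def __init__(self):
        self.t0 = time.time()
        self.samples = []

    def capture(self, sensor: str, values: list):
        self.samples.append(SensorSample(time.time() - self.t0, sensor, values))

    def save(self, path: str):
        with open(path, "w") as f:
            json.dump([asdict(s) for s in self.samples], f)

class Player:
    """Replays a recording as a sequence of augmentation cues."""
    def __init__(self, path: str):
        with open(path) as f:
            self.samples = [SensorSample(**s) for s in json.load(f)]

    def cues(self):
        for s in self.samples:
            # In the real platform this would drive mixed reality overlays
            # such as ghost hands; here we simply yield the recorded data.
            yield s

recorder = Recorder()
recorder.capture("hand_pose", [0.12, 0.80, 0.33])
recorder.save("expert_run.json")
for cue in Player("expert_run.json").cues():
    print(cue.sensor, cue.values)
```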

As a pedagogical innovation, WEKIT has developed a new instructional design model. Tasks, support information, procedural (manual) knowledge and practices have been categorised for use within the platform. In this context, WEKIT has been addressing the question of how we can describe, store, retrieve and exchange training scenarios together with the necessary contextual information in a standardised way. An IEEE-SA working group has begun the standardisation of our Augmented Reality Learning Experience Model (ARLEM). An open-source model editor is available at [L1]. Public project deliverables are available on the website [L2], and a start-up company, WEKIT ECS (Experience Capturing Solutions) [L3], is exploiting the results.
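To give a flavour of what such a standardised scenario description looks like, the sketch below encodes a single training procedure as a structured activity model tied to a workplace. The field names are deliberately simplified for illustration and do not follow the normative ARLEM schema.

```python
import json

# Illustrative ARLEM-style activity description; the structure and field
# names below are simplified assumptions, not the normative standard.
activity = {
    "id": "wire-cutting-procedure",
    "workplace": {
        "places": ["workbench"],                  # where the task happens
        "things": ["wire-cutter", "cable-harness"],  # objects involved
    },
    "actions": [
        {
            "step": 1,
            "instruction": "Pick up the wire cutter from the workbench.",
            "augmentation": "highlight:wire-cutter",
        },
        {
            "step": 2,
            "instruction": "Cut the marked wire on the cable harness.",
            "augmentation": "ghost-hands:cutting-motion",
        },
    ],
}

# A serialised description like this could be stored, retrieved and
# exchanged between authoring tools and players.
print(json.dumps(activity, indent=2))
```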

This new field of learning with wearable technologies is thoroughly explored in a new book published by Springer [2]. The interface between humans and machines is moving away from traditional cognitive tools, much as we once moved from typewriters to digital thinking tools. Humans are increasingly interacting with machines, and technology is increasingly being used in training scenarios [3]. Data fusion and artificial intelligence can bring together data from heterogeneous sources, helping us to recognise human activities in both human-human and human-robot collaboration scenarios. By making the analytical results available in mixed reality spaces, training, analytics and interventions can take place within the same space, without the media breaks and loss of context that occur in traditional training.
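As a rough illustration of such data fusion, the sketch below concatenates per-window statistics from two hypothetical wearable streams into feature vectors that a trained classifier could map to activities. The sensors, window size and features are assumptions for exposition, not the project's actual pipeline.

```python
import numpy as np

# Minimal sketch of early sensor fusion for activity recognition, assuming
# two synchronised wearable streams; all parameters are illustrative.

def window_features(imu: np.ndarray, gaze: np.ndarray, size: int = 50):
    """Concatenate per-window statistics from both streams (early fusion)."""
    feats = []
    for start in range(0, min(len(imu), len(gaze)) - size + 1, size):
        w_imu = imu[start:start + size]
        w_gaze = gaze[start:start + size]
        feats.append(np.concatenate([
            w_imu.mean(axis=0), w_imu.std(axis=0),
            w_gaze.mean(axis=0), w_gaze.std(axis=0),
        ]))
    return np.array(feats)

rng = np.random.default_rng(0)
imu = rng.normal(size=(200, 3))   # e.g. wrist accelerometer (x, y, z)
gaze = rng.normal(size=(200, 2))  # e.g. gaze direction (azimuth, elevation)

X = window_features(imu, gaze)
# A classifier trained on labelled windows would map each feature vector
# to an activity such as "cutting a wire".
print(X.shape)  # (4, 10): four windows, ten fused features each
```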

New forms of sensor data fusion and processing, including big data visual analytics, are enabling innovative research into the changing workplace. In conducting this research, data protection, privacy, ethical considerations and adherence to workplace legislation must have the highest priority. Like any new technology, these tools have the potential to be misused, and broad social acceptance is key to their success.

Links:
[L1] https://kwz.me/hEA
[L2] http://wekit.eu/
[L3] https://wekit-ecs.com

References:
[1] B. Limbu et al.: “WEKIT.One: A Sensor-Based Augmented Reality System for Experience Capture and Re-enactment,” in LNCS, Transforming Learning with Meaningful Technologies, M. Scheffel et al., Eds., Springer, 2019, pp. 158–171.
[2] I. Buchem, R. Klamma, F. Wild: “Perspectives on Wearable Enhanced Learning (WELL): Current Trends, Research, and Practice”. Cham, Switzerland: Springer Nature Switzerland AG, 2019.
[3] R. Klamma, R. Ali, and I. Koren: “Immersive Community Analytics for Wearable Enhanced Learning,” in LNCS, Learning and Collaboration Technologies. Ubiquitous and Virtual Environments for Learning and Collaboration, P. Zaphiris and A. Ioannou, Eds., Springer, 2019, pp. 162–174.

Please contact:
Ralf Klamma
RWTH Aachen University, Germany
