The rapid rise of Extended Reality (XR) technologies, encompassing Virtual Reality (VR), Augmented Reality (AR), and Mixed Reality (MR), is profoundly transforming the landscape of Human-Robot Interaction (HRI) in industrial settings. This evolution goes beyond the mere physical coexistence of humans and robots, moving towards deep integration through multimodal interaction. Such interaction extends beyond traditional communication channels, combining vision, auditory signals, tactile feedback, and even physiological understanding.
At the forefront of this transformative wave is the concept of the Human Digital Twin (HDT). HDTs go beyond mere virtual replicas of production systems [1], incorporating human characteristics and behaviors into digital models of complex systems (e.g., collaborative robotics work cells). Such characteristics and behaviors include, for instance, a worker's perceived fatigue, presence, and hand dominance, which can improve task allocation and planning [2]. Interpreting human intentions (e.g., during human-robot object handovers) makes exchanges and interactions safer and more efficient [3].
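To make this concrete, the sketch below shows how such worker characteristics might be represented and used in a toy task-allocation rule. The WorkerProfile and assign_task names, all field names, and the fatigue threshold are illustrative assumptions, not the schema of any specific HDT implementation.

```python
# Minimal sketch of a worker profile inside a Human Digital Twin (HDT).
# All names and thresholds are illustrative assumptions.
from dataclasses import dataclass, field
from typing import List


@dataclass
class WorkerProfile:
    worker_id: str
    present: bool = False            # is the worker at the work cell?
    hand_dominance: str = "right"    # "left" or "right"
    fatigue_level: float = 0.0       # 0.0 (rested) .. 1.0 (exhausted)
    certified_tasks: List[str] = field(default_factory=list)


def assign_task(profile: WorkerProfile, task: str, robot_can_do: bool) -> str:
    """Toy allocation rule: offload to the robot when the worker is
    absent, too fatigued, or not certified for the task."""
    if (not profile.present
            or profile.fatigue_level > 0.7
            or task not in profile.certified_tasks):
        return "robot" if robot_can_do else "postpone"
    return "human"


worker = WorkerProfile("op-042", present=True, fatigue_level=0.8,
                       certified_tasks=["screwdriving"])
print(assign_task(worker, "screwdriving", robot_can_do=True))  # -> "robot"
```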
The necessity for adaptive interfaces becomes apparent in XR applications, particularly when addressing the diverse roles within a manufacturing system. For instance, in a simulated production setting, a maintenance technician focuses on the equipment's condition and wear levels, a production manager monitors the production line's key performance indicators, and a quality manager examines the output's conformity parameters [5]. Within such a framework, HDTs enhance adaptiveness and personalisation, paving the way towards more dynamic and interactive collaborative spaces.
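As a rough illustration of such role-dependent views, the following sketch filters a single telemetry payload into the subset each role needs. The role names and telemetry fields are assumptions drawn from the example above, not an actual XR5.0 interface.

```python
# Illustrative sketch of role-dependent XR content selection: the same
# machine-status payload is filtered into the view each role needs.
from typing import Any, Dict

# Which telemetry fields each role's XR overlay should surface (assumed).
ROLE_VIEWS: Dict[str, list] = {
    "maintenance_technician": ["wear_level", "vibration", "next_service"],
    "production_manager": ["throughput", "oee", "downtime_minutes"],
    "quality_manager": ["defect_rate", "tolerance_deviation"],
}


def view_for(role: str, telemetry: Dict[str, Any]) -> Dict[str, Any]:
    """Return only the telemetry fields relevant to the given role."""
    fields = ROLE_VIEWS.get(role, [])
    return {k: v for k, v in telemetry.items() if k in fields}


telemetry = {"wear_level": 0.31, "throughput": 118, "defect_rate": 0.004,
             "oee": 0.87, "vibration": 2.1, "tolerance_deviation": 0.02,
             "downtime_minutes": 12, "next_service": "2024-07-01"}
print(view_for("quality_manager", telemetry))
# -> {'defect_rate': 0.004, 'tolerance_deviation': 0.02}
```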
The XR5.0 project, funded by the European Union's research and innovation programme (GA: 101135209), aims to develop, demonstrate, and validate a novel Person-Centric and AI-based XR paradigm tailored to the requirements and nature of Industry 5.0 (I5.0) applications.
SUPSI will contribute to XR5.0 by providing Clawdite [6], an IIoT platform that supports the easy and fast ramp-up of HDTs and serves as a centralized repository for worker-related data, fed by Artificial Intelligence (AI) modules that interpret human intentions, preferences, and contextual cues. Clawdite, originally developed within the STAR project, will be extended in XR5.0 to provide the contextual data and real-time sensory inputs needed to personalise XR applications and human-robot interactions.
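As a hedged sketch of how an XR client might consume such a repository, the snippet below polls a worker-context endpoint over HTTP. The base URL, route, payload shape, and bearer-token authentication are all hypothetical placeholders, not Clawdite's documented API.

```python
# Hypothetical sketch: an XR application pulling worker context from a
# centralized repository. Endpoint and auth scheme are assumptions.
import json
from urllib.request import Request, urlopen

BASE_URL = "https://iiot.example.org/api/v1"  # placeholder host


def fetch_worker_context(worker_id: str, token: str) -> dict:
    """GET the latest AI-enriched context for a worker (hypothetical route)."""
    req = Request(
        f"{BASE_URL}/workers/{worker_id}/context",
        headers={"Authorization": f"Bearer {token}"},
    )
    with urlopen(req) as resp:
        return json.load(resp)

# An XR client could poll this (or subscribe via a message broker) and feed
# fields such as estimated fatigue or current task into its rendering logic.
```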
SUPSI will target the design of faithful digital representations of workers to personalise XR functionalities. Additionally, specific mechanisms will be developed to dynamically adapt XR content to the individual needs and contexts of users (e.g., operators, technicians), leveraging advanced AI techniques for real-time content modification. These efforts underline the commitment to creating a workplace where technology enhances human capabilities and where XR technologies serve as a bridge towards more intuitive and human-centric environments.
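One minimal way to picture such adaptation is a context-to-presentation mapping, as in the rule-based sketch below. The context keys, thresholds, and presentation parameters are illustrative assumptions; in XR5.0 the mapping would be driven and refined by AI modules rather than hand-written rules.

```python
# Minimal sketch of rule-based XR content adaptation driven by HDT context.
# Keys, thresholds, and adaptation parameters are illustrative assumptions.
def adapt_xr_content(context: dict) -> dict:
    """Map worker context to XR presentation parameters."""
    settings = {"detail": "full", "text_scale": 1.0, "mirror_ui": False}
    if context.get("fatigue_level", 0.0) > 0.6:
        settings["detail"] = "essential"   # declutter when tired
        settings["text_scale"] = 1.3       # larger, easier-to-read labels
    if context.get("hand_dominance") == "left":
        settings["mirror_ui"] = True       # move controls to the left side
    if context.get("role") == "technician":
        settings["detail"] = "diagnostic"  # expose wear/vibration overlays
    return settings


print(adapt_xr_content({"fatigue_level": 0.75, "hand_dominance": "left"}))
# -> {'detail': 'essential', 'text_scale': 1.3, 'mirror_ui': True}
```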
The integration of HDTs and XR technologies marks a significant milestone in the evolution of HRI. This alliance signals a shift towards work environments where technology is in harmony with human needs, guiding industry towards new peaks of efficiency and satisfaction, in the spirit of Industry 5.0.
Authors: Emanuele Fagnano, Davide Matteri, Vincenzo Cutrona, Elias Montini. SUPSI, SPS lab
[1] Montini, E., Bonomi, N., Daniele, F., Bettoni, A., Pedrazzoli, P., Carpanzano, E., & Rocco, P. (2021). The human-digital twin in the manufacturing industry: Current perspectives and a glimpse of future. Trusted artificial intelligence in manufacturing: A review of the emerging wave of ethical and human centric AI technologies for smart production, 132-147.
[2] Montini, E., Cutrona, V., Dell’Oca, S., Landolfi, G., Bettoni, A., Rocco, P., & Carpanzano, E. (2023). A framework for human-aware collaborative robotics systems development. Procedia CIRP, 120, 1083-1088.
[3] Christen, S., Yang, W., Pérez-D’Arpino, C., Hilliges, O., Fox, D., & Chao, Y. (2023). Learning Human-to-Robot Handovers from Point Clouds. CVPR 2023.
[4] Sampieri, A., D’Amely, G., Avogaro, A., Cunico, F., Skenderi, G., Setti, F., Cristani, M., & Galasso, F. (2022). Pose Forecasting in Industrial Human-Robot Collaboration. ECCV 2022.
[5] Greci, L. (2022). XR for Industrial Training & Maintenance. In Roadmapping Extended Reality: Fundamentals and Applications (Chapter 13). Wiley.
[6] Montini, E., Cutrona, V., Bonomi, N., Landolfi, G., Bettoni, A., Rocco, P., & Carpanzano, E. (2022). An IIoT platform for human-aware factory digital twins. Procedia CIRP, 107, 661-667.