What if User Interfaces could adapt to User Characteristics and State?

Industry 5.0 (I5.0) places human well-being, safety, and productivity at the heart of technological advancement [1]. However, the transition to I5.0 introduces new challenges and, despite significant advances in automation, workers continue to play a crucial role in the control and oversight of production processes.

One of the key challenges is effective human-computer interaction (HCI) through User Interfaces (UIs). In high-pressure manufacturing environments, traditional static interfaces often fail to accommodate varying skill levels, task complexities, and real-time operational needs. This is where Adaptive User Interfaces (AUIs) come into play, dynamically adjusting to user needs to improve efficiency, usability, and safety.

The fundamental premise of AUIs is that the system continuously assesses contextual factors, such as the user’s skill level, the complexity of the task, environmental variables, and the user’s past interactions with the system, and alters the interface accordingly. By reducing the mismatch between the interface configuration and user expectations or abilities, AUIs enhance overall usability, promote learning, and mitigate the risks of user error, particularly in high-stakes, complex environments such as manufacturing or assembly lines.

This dynamic adaptation can take various forms and can include:

  • Adjusting the complexity of the interface, by reducing non-essential information to minimize distractions and help the user focus on critical interface elements.
  • Providing contextual assistance, by offering real-time guidance or tutorials tailored to the user’s skill level.
  • Modifying the interface, by rearranging different elements to prioritize the most relevant information based on the user’s current task.
  • Providing task-specific guidance, by delivering focused, step-by-step instructions tailored to the specific task at hand, ensuring that users are not overwhelmed with unnecessary information.
  • Streamlining complex procedures to minimize errors and improve decision-making, by navigating complex workflows and breaking down intricate processes into manageable stages.
  • Predicting the adaptation, by analysing past interactions and patterns to anticipate user needs and pre-emptively adjust the interface.
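As a minimal illustration of how such a rule-based adaptation loop might work, consider the sketch below. All field names, rule thresholds, and adaptation actions are hypothetical placeholders, not part of any specific AUI framework:

```python
from dataclasses import dataclass

@dataclass
class UserContext:
    """Contextual factors the AUI continuously assesses (hypothetical fields)."""
    skill_level: float      # 0.0 (novice) .. 1.0 (expert)
    cognitive_load: float   # 0.0 (idle) .. 1.0 (overloaded)
    task_complexity: float  # 0.0 (trivial) .. 1.0 (intricate)
    error_rate: float       # error frequency inferred from past interactions

def select_adaptations(ctx: UserContext) -> list[str]:
    """Map the current context to interface adaptations via simple threshold rules."""
    actions = []
    if ctx.cognitive_load > 0.7:
        actions.append("hide_non_essential_info")       # reduce interface complexity
    if ctx.skill_level < 0.3:
        actions.append("show_contextual_tutorial")      # contextual assistance
    if ctx.task_complexity > 0.6:
        actions.append("enable_step_by_step_guidance")  # task-specific guidance
    if ctx.error_rate > 0.2:
        actions.append("highlight_critical_elements")   # prioritize relevant info
    return actions

# Example: a novice under heavy cognitive load working on a complex task
novice = UserContext(skill_level=0.2, cognitive_load=0.8,
                     task_complexity=0.7, error_rate=0.1)
print(select_adaptations(novice))
# → ['hide_non_essential_info', 'show_contextual_tutorial', 'enable_step_by_step_guidance']
```

In practice each rule would be replaced by a learned model or a richer policy, and the predictive adaptation mentioned above would anticipate these actions from interaction histories rather than react to thresholds.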

In this context, Augmented Reality (AR) and Virtual Reality (VR) open even more opportunities for adaptation. These technologies not only provide immersive environments but also enable new types of interface adjustments.

When integrated with AI and Human Digital Twins, these technologies create truly personalized and adaptive work environments that enhance efficiency, safety, and user satisfaction.

Challenges in Adopting Adaptive User Interfaces

However, several challenges remain in fully harnessing the potential of AUIs. One significant issue is the lack of a unified framework: there is currently no standardized architecture governing the development of the AUI life cycle. A review of 212 studies from the last decade revealed that only 10 focused on frameworks and just 4 addressed methodologies, underscoring the pressing need for structured and systematic approaches in this domain, particularly regarding user experience (UX) and usability [2].

Another challenge lies in the insufficient application of human-centered design principles. Traditional interfaces often fail to present data in ways that align with the operator’s immediate tasks, leading to slower decision-making and potential mistakes. Operators need systems that are not only capable of presenting data but of presenting it in a way that is meaningful and actionable in real time.

A low level of human-centered design also leads to low worker acceptance. Early involvement of workers in the design process has been shown to improve adoption rates; however, designing solutions that are universally applicable remains difficult due to the wide variety of systems and socio-technical contexts in which they operate. This diversity complicates efforts to create adaptable solutions that meet the needs of all users effectively.

The XR5.0 Paradigm: Combining Human Digital Twins with Adaptive Interfaces

Human-centricity, a pillar of I5.0, emphasizes prioritizing human needs, including safety, health, self-actualization, and personal development, within production processes. In this context, Human Digital Twins (HDTs) are proposed as one of the critical methods for empowering operators in smart manufacturing systems. They integrate models fed by dynamic, real-time data with static or quasi-static data about the user, enabling a comprehensive representation of the human, the so-called ‘digital avatar’. This representation aims to revolutionize human-system integration by embedding human traits into the system’s architecture and functionality.

HDTs are emerging as a transformative solution in the manufacturing industry, particularly for addressing the challenges of physically and mentally demanding tasks. By creating real-time virtual replicas of workers, these systems enable continuous monitoring of physical and physiological conditions to proactively manage worker fatigue, prevent hazardous situations, and adapt workflows to individual needs.

Acting as a repository of each worker’s information and contextual factors, the HDT represents an ideal basis for designing interfaces with a clear human-centered view. Adaptation algorithms, which define the rules to be applied for an interface change, can directly interact with the HDT to enable effective, personalized modifications to the User Interface. For example, in a manufacturing training scenario, the HDT can be used to monitor and adapt the learning process for a worker being trained to operate a robotic assembly line. The information stored by the HDT, e.g., the worker’s current skills, learning pace, and cognitive load, is then exploited by the adaptation algorithm to modify the training interface in real time, helping the user when they struggle with a specific task. Adaptations might involve simplifying the user interface or providing more detailed step-by-step visual instructions. Additionally, if the HDT can automatically detect adverse conditions, e.g., high levels of fatigue or stress, targeted alerts can be shown to workers, for instance to recommend breaks or adjust the training intensity to ensure optimal learning conditions.
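The interplay between the HDT and an adaptation algorithm in the training scenario above could be sketched roughly as follows. This is an illustrative simplification, not the XR5.0 implementation; every class, field, and threshold here is a hypothetical stand-in:

```python
from dataclasses import dataclass, field

@dataclass
class HumanDigitalTwin:
    """Simplified HDT: a repository of one worker's state (hypothetical fields)."""
    skills: dict = field(default_factory=dict)  # task name -> proficiency 0..1
    fatigue: float = 0.0                        # 0.0 (rested) .. 1.0 (exhausted)
    stress: float = 0.0

    def update(self, task: str, success: bool, fatigue: float, stress: float) -> None:
        """Fold new sensor and interaction data into the twin."""
        current = self.skills.get(task, 0.5)          # unknown tasks start mid-scale
        delta = 0.1 if success else -0.1              # naive proficiency update
        self.skills[task] = min(1.0, max(0.0, current + delta))
        self.fatigue, self.stress = fatigue, stress

def adapt_training_ui(hdt: HumanDigitalTwin, task: str) -> list[str]:
    """Adaptation algorithm: query the HDT, then decide training-UI changes."""
    changes = []
    if hdt.skills.get(task, 0.5) < 0.5:               # worker struggling with the task
        changes.append("detailed_visual_instructions")
    if hdt.fatigue > 0.7 or hdt.stress > 0.7:         # adverse condition detected
        changes.append("recommend_break")
    return changes

# A worker fails a training step while wearables report high fatigue
twin = HumanDigitalTwin()
twin.update("robot_calibration", success=False, fatigue=0.8, stress=0.4)
print(adapt_training_ui(twin, "robot_calibration"))
# → ['detailed_visual_instructions', 'recommend_break']
```

The design choice worth noting is the separation of concerns: the HDT only aggregates worker data, while the adaptation logic lives outside it and merely queries the twin, so the same HDT can serve many different adaptation algorithms.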

The XR5.0 project, funded by the European Union’s research and innovation programme (GA: 101135209) and the Swiss State Secretariat for Education, Research and Innovation (SERI), aims to develop, demonstrate, and validate a novel Person-Centric and AI-based XR paradigm that will be tailored to the requirements and nature of I5.0 applications.

SUPSI will contribute to XR5.0 by providing Clawdite, an IIoT platform that supports the easy and fast ramp-up of HDTs and serves as a centralized repository for worker-related data, fed by Artificial Intelligence (AI) modules that infer human intentions, preferences, and contextual cues. Clawdite will be exploited to personalize the XR functionalities of the pilot applications, ensuring workers experience a complete and personalized environment built around them. Through the analysis of data collected from workers via sensors and wearable devices, specific mechanisms will be developed for dynamically adapting XR content to the individual needs and contexts of users (e.g., operators, technicians), leveraging advanced AI techniques for real-time content modification.

Authors:  Davide Matteri, Vincenzo Cutrona, Elias Montini, Samuele Dell’Oca, Sara Masiero, SUPSI, SPS lab


[1] Lu, Yuqian, et al. “Outlook on human-centric manufacturing towards Industry 5.0.” Journal of Manufacturing Systems 62 (2022): 612-627.

[2] Brdnik, Saša, Tjaša Heričko, and Boštjan Šumak. “Intelligent user interfaces and their evaluation: a systematic mapping study.” Sensors 22.15 (2022): 5830.