Human-Centered AI: The XR5.0 Project and the Future of Adaptive Systems 

Artificial Intelligence (AI) is rapidly transforming how we live, learn, and work. Yet, even as AI systems become more capable, they often lack real-time awareness of users’ cognitive and affective states. What if our AI systems could understand users’ signals—attention, stress, or fatigue—and adjust their behavior to help us perform better and feel better? 

At the Insight Research Center of Instituto Piaget, Portugal, this question lies at the heart of an ambitious initiative, the XR5.0 Project, a Horizon Europe collaboration involving 23 partners from nine countries. We are exploring how AI, multimodal sensing and assessment, and human factors research can enable systems that adapt in real time to the user’s cognitive and emotional states. 

From Industry 5.0 to Human-Centered AI 

Industry 5.0 envisions a future where technology and people work together harmoniously. Unlike the automation-driven vision of Industry 4.0, the new paradigm emphasizes human-centric, sustainable, and resilient systems. XR5.0 embodies this principle by developing extended reality (XR) technologies—virtual and augmented environments—that are not only functional but also responsive to human conditions, reducing mental workload and stress to sustain well-being. 

The project’s core idea is simple yet transformative: use AI-driven sensing and adaptation to make XR environments responsive to their users’ mental, emotional, and physical states. Whether a worker is fatigued, stressed, or overloaded, the system detects these cues and adjusts the interaction accordingly—simplifying interfaces, modulating training pace, or providing supportive feedback. 

Understanding and Measuring Human Factors 

To adapt the system to the user, XR5.0 assesses the user’s state across three complementary dimensions—cognitive workload, affective state, and physiological signals—grounded in cognitive ergonomics and applied psychophysiology: 

  • Cognitive factors – attention, workload, and decision fatigue – govern how users process information, allocate attention, and make decisions in complex environments. 
  • Affective factors – frustration, motivation, and trust – influence engagement, confidence, and readiness to learn or perform under varying emotional conditions. 
  • Physiological indicators – heart rate variability (HRV), eye movements, electrodermal activity, and pupil dilation – provide objective, real-time signals that support inference of cognitive load and emotional state. 

Integrating these multimodal signals yields a richer, more holistic picture of the user’s condition, enabling an AI-driven adaptive engine to tailor the XR environment in real time—modulating visual complexity, pacing, task difficulty, and information flow. 
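As a minimal illustration of this kind of fusion, the sketch below combines the three dimensions into a single strain score. All names, weights, and thresholds here are illustrative assumptions for exposition, not part of XR5.0’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class UserState:
    """Illustrative snapshot of the three dimensions described above."""
    workload: float      # cognitive: 0 (idle) .. 1 (overloaded), e.g. inferred from pupil dilation
    frustration: float   # affective: 0 .. 1, e.g. inferred from expression or self-report
    hrv_rmssd_ms: float  # physiological: heart rate variability (RMSSD, in milliseconds)

def estimate_strain(state: UserState, rest_hrv_ms: float = 50.0) -> float:
    """Fuse the three channels into a single 0..1 strain score.

    HRV below the resting baseline is read as elevated stress; the
    weighted average is a placeholder for a learned fusion model.
    """
    hrv_stress = max(0.0, min(1.0, 1.0 - state.hrv_rmssd_ms / rest_hrv_ms))
    return 0.4 * state.workload + 0.3 * state.frustration + 0.3 * hrv_stress
```

In practice the weights and baselines would be calibrated per user; the point of the sketch is only that heterogeneous signals reduce to a common scale the adaptive engine can act on.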

Use Case: Adaptive Training in Extended Reality 

One of XR5.0’s most promising use cases is adaptive worker training. In traditional XR-based training, every user follows the same scenario at the same pace, regardless of skill level or mental fatigue. XR5.0 changes that by adapting content and pacing to the user’s moment-to-moment state. 

Imagine a worker wearing a lightweight XR headset connected to wearable sensors such as a smartwatch and ring. As the worker performs a maintenance simulation, the system continuously monitors eye-tracking data and physiological signals. When the AI-driven adaptive engine estimates high cognitive load or stress, it reduces task complexity, slows the pace of instruction, and highlights key visual cues to maintain comprehension and comfort. 

Conversely, if the user shows signs of confidence and low cognitive strain, the system increases challenge—introducing more advanced steps, tighter tolerances, or time constraints—to sustain engagement and accelerate learning. This closed feedback loop—from sensing to inference to adaptation—delivers a personalized, effective, and well-being-aware training experience. 
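The closed loop described above can be sketched in a few lines. The thresholds, step size, and function names are hypothetical illustrations of the sense-infer-adapt pattern, not XR5.0’s implementation:

```python
def adapt_difficulty(difficulty: float, strain: float,
                     low: float = 0.3, high: float = 0.7,
                     step: float = 0.1) -> float:
    """One iteration of the sense -> infer -> adapt loop.

    `strain` is an inferred 0..1 load score; `difficulty` is the current
    scenario difficulty, also on a 0..1 scale. High strain eases the task
    (simpler interface, slower pacing); low strain raises the challenge
    (advanced steps, tighter tolerances); mid-range strain leaves it alone.
    """
    if strain > high:        # overloaded: simplify and slow down
        difficulty -= step
    elif strain < low:       # under-challenged: increase the challenge
        difficulty += step
    return max(0.0, min(1.0, difficulty))
```

A real system would run this loop continuously against streaming sensor data and smooth the strain estimate over time to avoid oscillating between difficulty levels.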

Why It Matters 

Human adaptability has always been our greatest strength, but in the digital age, the cognitive demands placed on us can exceed our limits. The XR5.0 Project shifts the paradigm toward technology that adapts to people, rather than the other way around. By integrating AI, human factors assessment, and immersive technologies, XR5.0 has the potential to: 

  • Improve learning outcomes through adaptive pacing and feedback. 
  • Reduce mental workload, fatigue, and stress in industrial and training environments. 
  • Improve safety and performance by aligning digital systems with human capabilities. 

These outcomes are not just technological goals—they reflect a deeper human aspiration: to build intelligent systems that respect our limits and amplify our strengths, supporting both performance and well-being. 

Toward a More Human Future 

The XR5.0 Project exemplifies a new era of Human-Centered Systems, where intelligent systems act as responsive collaborators, able to sense and adapt to our cognitive and emotional realities. It marks a concrete step toward AI that enhances not only productivity but also human well-being, empathy, and resilience. 

In a world increasingly mediated by algorithms and automation, XR5.0 reminds us that the most advanced technology is the one that adapts to humans, respecting our limits and amplifying our strengths. It is a tangible step toward the broader vision championed by Instituto Piaget’s Insight Research Center for Human and Ecological Development: AI systems that enhance human well-being, not just productivity. 

IPIA: António Rosinha, Joaquim Reis and Toacy Oliveira