KUKABot in Action: Voice‑Driven AI Support for KUKA Technicians Inside RoboSpace

As industrial automation and robotics systems grow in complexity, technicians must
handle large volumes of technical documentation, configuration data, troubleshooting
procedures, and machine‑specific knowledge, often under time pressure and directly on
the shop floor. Within the XR5.0 project’s Rapid Human‑Centric AI‑Enabled Product Design
pilot, KUKA is addressing this challenge by integrating the voice‑driven AI assistant
KUKABot directly into the immersive XR application RoboSpace. Together, these
technologies transform how technicians interact with robots, access information, and
execute tasks in real industrial environments.


KUKABot enhances technician workflows through natural‑language interaction,
Retrieval‑Augmented Generation (RAG) capabilities, and seamless integration within spatial
XR interfaces.


A Unified XR Workspace for Robot Interaction
The XR application RoboSpace is built as a high‑fidelity, spatially aligned interface for
interacting with KUKA robotic systems on the shop floor. It combines true‑to‑scale robot
visualization, real‑time OPC UA data streaming, and interactive engineering tools into a
single immersive environment based on the HoloSpace Software by Hololight.


Technicians can:

  • Walk around the full‑scale digital robot model
  • Inspect the robotic cells from any angle
  • Compare digital and physical configurations
  • Use advanced tools such as cross‑sections, measurements, annotations, and collision
    checks


This interactive 3D environment replaces the traditional reliance on multiple software
systems, terminals, printed documentation, and separate engineering tools. It also provides
the foundation on which KUKABot operates.
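The live data link described above can be sketched in simplified form: OPC UA node values are read and attached to the corresponding robot components as XR overlay labels. The node identifiers, component names, and the `read_sensor` stub below are illustrative assumptions, not the actual RoboSpace implementation (a real client would use an OPC UA library such as `asyncua`):

```python
# Illustrative sketch: mapping OPC UA node values to XR overlay labels.
# Node IDs, component names, and sample values are hypothetical.

from dataclasses import dataclass

# Hypothetical mapping from robot components to OPC UA node identifiers.
NODE_MAP = {
    "axis_1_motor": "ns=2;s=Robot.Axis1.Temperature",
    "axis_2_motor": "ns=2;s=Robot.Axis2.Temperature",
}

@dataclass
class Overlay:
    component: str
    label: str

def read_sensor(node_id: str) -> float:
    """Stub for a real OPC UA read (e.g. via the asyncua library)."""
    sample_values = {
        "ns=2;s=Robot.Axis1.Temperature": 41.5,
        "ns=2;s=Robot.Axis2.Temperature": 39.2,
    }
    return sample_values[node_id]

def build_overlays() -> list[Overlay]:
    """Attach the latest sensor reading to each component's XR label."""
    return [
        Overlay(component, f"{read_sensor(node_id):.1f} °C")
        for component, node_id in NODE_MAP.items()
    ]

for overlay in build_overlays():
    print(overlay.component, overlay.label)
```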


AI Assistant Embedded Directly Into XR
Integrated as part of RoboSpace’s AI Service Layer, KUKABot provides natural‑language
access to robot‑related documentation and technical knowledge. It allows technicians to
request information while keeping both hands free for inspection or operation tasks. This is
an essential capability in industrial environments where handheld interaction is often
impractical.


What KUKABot Can Deliver

Through simple voice commands, technicians can request:

  • Technical specifications
  • Motion range and workspace tables
  • Troubleshooting procedures
  • Datasheet excerpts
  • Visual diagrams displayed inside XR panels

This makes it possible to retrieve information without navigating complex menus or
returning to a workstation.

RAG for Industrial Documentation

KUKABot uses a Retrieval‑Augmented Generation (RAG) approach. It retrieves relevant
documentation and generates structured responses, including tables, images, and
summaries displayed as contextual XR overlays.

This architecture enables:

  • Faster access to information
  • Reduced cognitive load
  • Consistent and context‑aware guidance
  • Hands‑free task execution

Because KUKABot is implemented as an API‑based service, its knowledge base can be
updated continuously without requiring changes to the XR interface.
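The retrieve‑then‑generate flow can be illustrated with a minimal sketch. The toy keyword retriever, the sample documentation snippets, and the placeholder generation step below are assumptions for illustration; the article does not specify KUKABot’s actual document store, retriever, or model:

```python
# Minimal Retrieval-Augmented Generation sketch: retrieve the most
# relevant documentation snippets, then compose an answer from them.
# Snippets and the overlap-based scoring are toy examples.

DOCS = {
    "kr-210-range": "KR 210 axis 1 motion range: plus/minus 185 degrees.",
    "kr-210-payload": "KR 210 rated payload: 210 kg.",
    "safety-stop": "To clear a safety stop, acknowledge the fault on the pendant.",
}

def retrieve(query: str, k: int = 2) -> list[str]:
    """Score each snippet by word overlap with the query (toy retriever)."""
    words = set(query.lower().split())
    scored = sorted(
        DOCS.values(),
        key=lambda doc: len(words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def generate(query: str, snippets: list[str]) -> str:
    """Placeholder for the LLM step: compose an answer from retrieved context."""
    context = " ".join(snippets)
    return f"Q: {query}\nA (from docs): {context}"

print(generate("motion range axis 1", retrieve("motion range axis 1")))
```

In a production setting the retriever would query a vector or document index and the generation step would call a language model, but the overall shape, retrieve first, then generate grounded in the retrieved context, stays the same.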


Voice‑Driven Interaction for Real‑World Conditions
RoboSpace includes a voice‑driven interaction layer, combining speech‑to‑text for
recognizing commands and text‑to‑speech for delivering system responses. Users can
activate tools, switch views, request documentation, or navigate the interface without
needing to use physical controllers or hand menus.


This interaction model is essential for:

  • Environments with limited mobility
  • Situations requiring protective equipment
  • Tasks where technicians cannot remove their hands from the machine
  • Reducing interruptions and context switching

By integrating KUKABot with this layer, RoboSpace becomes a seamless conversational
and operational environment.
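The voice layer described above can be sketched as a simple intent router: a speech‑to‑text transcript is matched against trigger phrases, the matching handler runs, and its reply would be passed to text‑to‑speech. The trigger phrases and handlers below are illustrative assumptions, not RoboSpace’s actual command set:

```python
# Illustrative voice command router: map transcribed speech to actions.
# Trigger phrases and handler behavior are hypothetical.

def show_cross_section() -> str:
    return "Cross-section tool activated."

def show_documentation() -> str:
    return "Displaying documentation panel."

# A command matches if any of its trigger phrases appears in the utterance.
COMMANDS = [
    (("cross section", "cut view"), show_cross_section),
    (("documentation", "datasheet"), show_documentation),
]

def route(transcript: str) -> str:
    """Dispatch a speech-to-text transcript; the reply would go to TTS."""
    text = transcript.lower()
    for triggers, handler in COMMANDS:
        if any(trigger in text for trigger in triggers):
            return handler()
    return "Sorry, I did not understand that command."

print(route("Show me the cross section of axis two"))
```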

Impact on Technician Workflows

  1. Faster Troubleshooting
    With KUKABot available directly in XR, technicians no longer need to step away from the
    machine to search for commissioning instructions and specifications. The system fetches
    relevant troubleshooting steps and displays them.
  2. Reduced Cognitive Load
    Context‑dependent visualizations, such as OPC UA sensor values overlaid on the correct
    components, combined with on‑demand explanations from KUKABot, reduce the
    cognitive load.
  3. Improved Safety and Task Accuracy
    Technicians can confirm procedures verbally while observing the machine in XR. This
    minimizes errors that can occur when switching between devices or reading
    documentation away from the equipment. Combined with predictive safety alerts from
    the Human Digital Twin module, the overall system strengthens safe and effective
    human‑robot collaboration.
  4. Enhanced Training and Onboarding
    During training sessions inside RoboSpace, trainees can ask KUKABot for clarifications or
    additional instructions. This enables richer and more adaptive learning experiences.

Insights From Pilot Evaluation at KUKA
The XR5.0 pilot evaluation at KUKA’s facilities confirmed the strong operational relevance of
integrating KUKABot into RoboSpace. Technicians appreciated the intuitive interface,
hands‑free access to structured information, and the clear potential for long‑term efficiency
gains.


Toward AI‑Supported, Human‑Centric Robotics Workflows
The integration of KUKABot into RoboSpace demonstrates a powerful vision for the future
of human‑robot interaction: an environment where technicians engage directly with
physical and digital systems through natural language, contextual spatial visualization, and
AI‑driven support.


By embedding intelligence, safety awareness, and documentation access inside an
immersive XR workspace, the XR5.0 pilot showcases a scalable, Industry‑5.0‑aligned
approach to technician empowerment. As KUKA continues to refine performance and
expand capabilities, KUKABot and RoboSpace represent a major step toward highly
adaptive, efficient, and human‑centric robotics operations.