Explainable XR: Understanding User Behaviors of XR Environments using LLM-assisted Analytics Framework

📅 2025-01-23
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This study addresses key challenges in analyzing immersive interactions within multi-user, extended reality (XR) environments, including heterogeneous data, poor interpretability, and the difficulty of modeling behavior across augmented, virtual, and mixed reality (AR/VR/MR). The authors propose an end-to-end, interpretable XR analytics framework with three components: (i) User Action Descriptors (UADs), a novel unified representation encoding users' multimodal actions together with their intents and contexts; (ii) a platform-agnostic XR session recorder enabling virtuality-agnostic data capture; and (iii) an LLM-assisted visual analytics interface that generates interactive, semantically enriched insights tailored to the analyst's perspective. Evaluated across five individual and collaborative XR use cases, the framework proves highly usable and yields multifaceted, actionable behavioral insights, supporting XR human-factors research and system optimization.
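
To make the UAD idea concrete, the sketch below shows one plausible shape for such a record in Python. It is a minimal illustration only: every field name here is an assumption drawn from the description above (multimodal action, intent, context, virtuality), not the paper's actual schema.

```python
from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class UserActionDescriptor:
    """One recorded user action; all field names here are illustrative."""
    actor_id: str                 # who: the acting user (or agent) in the session
    virtuality: str               # which reality the actor was in: "AR", "VR", or "MR"
    action: str                   # what was done, e.g. "grab", "teleport", "speak"
    modality: str                 # how it was done: controller, hand, gaze, voice, ...
    position: List[float]         # where, as a 3D world-space coordinate
    intent: str                   # why: the inferred or self-reported purpose
    timestamp: float = 0.0        # when, in session-relative seconds
    context: Dict[str, Any] = field(default_factory=dict)  # surrounding scene/task state
```

Bundling who/what/where/when/why/how into one flat record is what would let a single representation describe actions from any device or virtuality.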

📝 Abstract
We present Explainable XR, an end-to-end framework for analyzing user behavior in diverse eXtended Reality (XR) environments by leveraging Large Language Models (LLMs) for data interpretation assistance. Existing XR user analytics frameworks face challenges in handling cross-virtuality (AR, VR, MR) transitions, multi-user collaborative application scenarios, and the complexity of multimodal data. Explainable XR addresses these challenges by providing a virtuality-agnostic solution for the collection, analysis, and visualization of immersive sessions. We propose three main components in our framework: (1) a novel user data recording schema, called User Action Descriptor (UAD), that can capture the users' multimodal actions along with their intents and contexts; (2) a platform-agnostic XR session recorder; and (3) a visual analytics interface that offers LLM-assisted insights tailored to the analysts' perspectives, facilitating the exploration and analysis of the recorded XR session data. We demonstrate the versatility of Explainable XR by presenting five use-case scenarios, in both individual and collaborative XR applications across virtualities. Our technical evaluation and user studies show that Explainable XR provides a highly usable analytics solution for understanding user actions and delivering multifaceted, actionable insights into user behaviors in immersive environments.
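
The second component, the session recorder, can be pictured as a thin logging layer that any AR/VR/MR host application calls, so that capture is independent of platform and virtuality. The sketch below is a minimal illustration under that assumption; `SessionRecorder`, its JSON Lines output, and all record fields are hypothetical, not the framework's actual API.

```python
import json
import time

class SessionRecorder:
    """Hypothetical platform-agnostic recorder: appends one action record per line."""

    def __init__(self, path: str):
        self.path = path
        self.t0 = time.monotonic()   # session start, for session-relative timestamps

    def record(self, uad: dict) -> None:
        entry = {**uad, "timestamp": round(time.monotonic() - self.t0, 3)}
        with open(self.path, "a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

# The same call works whether the host app runs in AR, VR, or MR,
# because each record carries its own virtuality tag.
recorder = SessionRecorder("session.jsonl")
recorder.record({
    "actor_id": "user-01",
    "virtuality": "MR",
    "action": "grab",
    "modality": "hand",
    "position": [0.4, 1.2, -0.7],
    "intent": "inspect artifact",   # illustrative; intents may also be inferred post hoc
    "context": {"scene": "museum", "task": "explore"},
})
```

An append-only, one-record-per-line log like this is a natural fit for session data: it survives crashes mid-session and streams directly into downstream analysis tools.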
Problem

Research questions and friction points this paper is trying to address.

Extended Reality (XR)
Interaction Analysis
Multi-user Scenarios
Innovation

Methods, ideas, or system contributions that make the work stand out.

Explainable XR
User Action Descriptor (UAD)
Platform-agnostic XR Session Recording
Yoonsang Kim
Center for Visual Computing at Stony Brook University, New York
Zainab Aamir
PhD Candidate, Stony Brook University
Augmented Reality · Human-Computer Interaction · Immersive Facilities
Mithilesh Singh
Center for Visual Computing at Stony Brook University, New York
Saeed Boorboor
Center for Visual Computing at Stony Brook University, New York
Klaus Mueller
Professor of Computer Science, Stony Brook University
Visualization · Visual Analytics · Data Science · Explainable AI · Medical Imaging
Arie E. Kaufman
Center for Visual Computing at Stony Brook University, New York