🤖 AI Summary
This study addresses the lack of systematic integration in current Affective Extended Reality (Affective XR) research, which hinders a clear understanding of its design paradigms and technical landscape for emotion recognition and sharing. Employing a scoping review methodology, the work systematically analyzes 82 human-computer interaction studies, synthesizing advances in biosignal sensing, XR platforms, and affective modeling to produce the first comprehensive research map of Affective XR. The analysis reveals the diversity of emotion-sharing objectives, distills key design dimensions, and categorizes prevailing system architectures and evaluation approaches. It also identifies underexplored research directions, offering a cohesive theoretical framework and actionable pathways to guide future work in this emerging interdisciplinary domain.
📝 Abstract
This paper introduces the notion of affective extended reality (XR) to characterise XR systems that use biodata to enable understanding of emotions. The HCI literature contains many such systems, but they have not yet been mapped into a coherent whole. To address this, we conducted a scoping review of 82 papers exploring the nexus of biodata, emotions, and XR. We analyse the technologies used in these systems, the interaction techniques employed, and the methods used to evaluate their effectiveness. Through our analysis, we contribute a mapping of the current landscape of affective XR, revealing diversity in the goals for enabling emotion sharing. We demonstrate how HCI researchers have explored the design of interaction flows in XR biofeedback systems, highlighting key design dimensions and challenges in understanding emotions. We discuss underused approaches for emotion sharing and highlight opportunities for future research on affective XR.