XR$^3$: An Extended Reality Platform for Social-Physical Human-Robot Interaction

📅 2026-01-18
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the limitations in social-physical human-robot interaction (HRI) research imposed by the high cost of physical robots and the lack of realistic haptic feedback in conventional virtual reality (VR) systems. To overcome these challenges, the authors present a co-located dual-headset VR platform built on a Wizard-of-Oz paradigm: a hidden operator drives a virtual robot avatar in real time, synchronously rendering body movements, facial expressions, and gaze behavior while enabling genuine tactile interaction. The system achieves spatiotemporal alignment between visual and haptic modalities in co-located dyadic VR, allowing multiple nonverbal social cues to be manipulated independently without altering physical contact. The platform substantially lowers the barrier to entry for embodied haptic HRI studies and offers an efficient route to rapid prototyping and rigorous experimental evaluation.

📝 Abstract
Social-physical human-robot interaction (spHRI) is difficult to study: building and programming robots that integrate multiple interaction modalities is costly and slow, while VR-based prototypes often lack physical contact, breaking users' visuo-tactile expectations. We present XR$^3$, a co-located dual-VR-headset platform for HRI research in which an attendee and a hidden operator share the same physical space while experiencing different virtual embodiments. The attendee sees an expressive virtual robot that interacts face-to-face in a shared virtual environment. In real time, the robot's upper-body motion, head and gaze behavior, and facial expressions are mapped from the operator's tracked limbs and face signals. Because the operator is co-present and calibrated in the same coordinate frame, the operator can also touch the attendee, enabling perceived robot touch synchronized with the robot's visible hands. Finger and hand motion is mapped to the robot avatar using inverse kinematics to support precise contact. Beyond motion retargeting, XR$^3$ supports social retargeting of multiple nonverbal cues that can be experimentally varied while keeping physical interaction constant. We detail the system design and calibration, and demonstrate the platform in a touch-based Wizard-of-Oz study, lowering the barrier to prototyping and evaluating embodied, contact-based robot behaviors.
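The abstract stresses that the operator must be "calibrated in the same coordinate frame" as the attendee so that real touch lands where the robot's virtual hands appear. The paper's exact calibration procedure is not given here; a common way to align two tracking frames is a least-squares rigid registration (the Kabsch algorithm) over corresponding tracked points, sketched below as a minimal, hypothetical illustration (function names and setup are assumptions, not the authors' implementation):

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Least-squares rigid transform (R, t) mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. the same
    physical landmarks observed in the operator's and attendee's
    headset tracking frames. Returns rotation R (3x3) and translation
    t (3,) such that dst ≈ src @ R.T + t (Kabsch algorithm).
    """
    src_c = src.mean(axis=0)
    dst_c = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

With three or more non-collinear shared landmarks (e.g. controller poses touched to common physical points), this yields the transform that re-expresses the operator's tracked hands in the attendee's frame, so visible robot hands and felt touch can coincide.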
Problem

Research questions and friction points this paper is trying to address.

human-robot interaction
social-physical interaction
virtual reality
touch-enabled interaction
visuo-tactile expectations
Innovation

Methods, ideas, or system contributions that make the work stand out.

co-located dual-headset VR
touch-enabled HRI
visuo-tactile synchronization
social retargeting
Wizard-of-Oz embodiment