MoCoMR: A Collaborative MR Simulator with Individual Behavior Modeling

📅 2025-03-12
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
MR collaboration research is hindered by high costs of real-world data collection and poor experimental controllability. To address this, we propose MoCoMR—the first multimodal MR collaborative simulation platform supporting fine-grained individual behavioral modeling. It integrates gaze, speech, and locomotion modeling with controllable virtual agents to generate high-fidelity synthetic data. Our key contributions are: (1) the first configurable and interpretable virtual experimental paradigm enabling causal analysis of group dynamics; and (2) a standardized API for declarative scene definition and synchronized multimodal data acquisition. Validated on real collaborative data from 48 participants, MoCoMR successfully reproduces critical behavioral patterns—e.g., turn-taking, spatial coordination, and attention alignment—while reducing data acquisition costs and experimental complexity by over 70%. This establishes a novel methodological foundation for investigating MR collaboration mechanisms.

Technology Category

Application Category

📝 Abstract
Studying collaborative behavior in Mixed Reality (MR) often requires extensive, challenging data collection. This paper introduces MoCoMR, a novel simulator designed to address this by generating synthetic yet realistic collaborative MR data. MoCoMR captures individual behavioral modalities such as speaking, gaze, and locomotion during a collaborative image-sorting task with 48 participants to identify distinct behavioral patterns. MoCoMR simulates individual actions and interactions within a virtual space, enabling researchers to investigate the impact of individual behaviors on group dynamics and task performance. This simulator facilitates the development of more effective and human-centered MR applications by providing insights into user behavior and interaction patterns. The simulator's API allows for flexible configuration and data analysis, enabling researchers to explore various scenarios and generate valuable insights for optimizing collaborative MR experiences.
Problem

Research questions and friction points this paper is trying to address.

Generates synthetic collaborative MR data for behavior analysis
Simulates individual actions and interactions in virtual spaces
Facilitates development of human-centered MR applications
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generates synthetic realistic collaborative MR data
Simulates individual actions in virtual space
Provides flexible API for scenario exploration
🔎 Similar Papers
No similar papers found.