🤖 AI Summary
Medical image segmentation is critical for clinical diagnosis and surgical planning, yet manual annotation remains slow and cognitively demanding. To address this, the authors propose an immersive XR segmentation system built around the Meta Quest 3 headset and the Logitech MX Ink stylus, introducing an "XR-stylus" interaction paradigm. The paradigm pairs paper-like handwriting intuition with real-time volume rendering (VTK), enabling dynamic 2D/3D slicing and spatially aware segmentation of CT data. The system is built on the Unity XR SDK and OpenCV, supports dual input modalities (hand gestures and stylus), and was evaluated with usability metrics based on ISO 9241-110 (ISONORM). Validation on a public craniofacial CT dataset yielded a System Usability Scale (SUS) score of 66 and a self-descriptiveness rating of 4.1/5, with participants reporting reduced cognitive load compared to desktop-based tools. The results position the XR-stylus paradigm as a foundation for more natural segmentation interaction and smoother integration into clinical workflows.
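The paper itself ships no code, but the rendering stack it names (VTK) has well-known Python bindings. As a rough, illustrative sketch of the volume-rendering step described above, assuming a DICOM CT series on disk and hand-picked transfer-function breakpoints (the path and Hounsfield-unit values below are placeholders, not the authors' settings), the pipeline might look like this:

```python
# Minimal sketch (not from the paper): interactive CT volume rendering with
# VTK's Python bindings. The DICOM path and transfer-function breakpoints
# are illustrative placeholders.
import vtk

# Load a CT series from a DICOM directory (path is hypothetical).
reader = vtk.vtkDICOMImageReader()
reader.SetDirectoryName("ct_series/")
reader.Update()

# GPU ray casting provides the interactive frame rates an XR viewer needs.
mapper = vtk.vtkGPUVolumeRayCastMapper()
mapper.SetInputConnection(reader.GetOutputPort())

# Map Hounsfield units to opacity and color; these breakpoints roughly
# separate air, soft tissue, and bone, and are illustrative only.
opacity = vtk.vtkPiecewiseFunction()
opacity.AddPoint(-1000.0, 0.0)   # air: fully transparent
opacity.AddPoint(300.0, 0.05)    # soft tissue: faint
opacity.AddPoint(1500.0, 0.85)   # bone: mostly opaque

color = vtk.vtkColorTransferFunction()
color.AddRGBPoint(-1000.0, 0.0, 0.0, 0.0)
color.AddRGBPoint(300.0, 0.8, 0.5, 0.4)
color.AddRGBPoint(1500.0, 1.0, 1.0, 0.9)

prop = vtk.vtkVolumeProperty()
prop.SetScalarOpacity(opacity)
prop.SetColor(color)
prop.ShadeOn()
prop.SetInterpolationTypeToLinear()

volume = vtk.vtkVolume()
volume.SetMapper(mapper)
volume.SetProperty(prop)

# Standard render window and interactor loop; in the actual system the
# rendered frames would instead be composited into the headset view.
renderer = vtk.vtkRenderer()
renderer.AddVolume(volume)
window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)
interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)
window.Render()
interactor.Start()
```

The dynamic 2D slicing the summary mentions would sit alongside this pipeline (e.g., VTK's reslicing filters feeding a planar view), driven by the tracked stylus pose.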
📝 Abstract
Medical image segmentation is essential in clinical settings for diagnosing disease, planning surgery, and guiding other procedures. Manual annotation, however, remains a cumbersome and effortful task. To address these challenges, this study implements an extended reality (XR)-based segmentation tool for anatomical CT scans, built on the Meta Quest 3 headset and Logitech MX Ink stylus, and evaluates its usability and clinical applicability. We develop an immersive interface enabling real-time interaction with 2D and 3D medical imaging data in a customizable workspace designed to reduce the workflow fragmentation and cognitive demands inherent to conventional manual segmentation tools. The platform combines stylus-driven annotation, mirroring traditional pen-on-paper workflows, with instant 3D volumetric rendering. A user study on a public craniofacial CT dataset demonstrated the tool's foundational viability, achieving a System Usability Scale (SUS) score of 66, within the expected range for medical applications. Participants praised the system's intuitive controls (scoring 4.1/5 for self-descriptiveness on ISONORM metrics) and spatial interaction design, and qualitative feedback noted strengths in hybrid 2D/3D navigation and realistic stylus ergonomics. While users identified opportunities to improve task-specific precision and error management, the platform's core workflow enabled dynamic slice adjustment and reduced cognitive load compared to desktop tools. These results position the XR-stylus paradigm as a promising foundation for immersive segmentation tools, with iterative refinements targeting haptic feedback calibration and workflow personalization to advance adoption in preoperative planning.
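For context on the headline number: the SUS score follows the standard Brooke (1996) scoring procedure, which is ten 1–5 Likert items, with odd-numbered (positively worded) items scored as (response − 1), even-numbered (negatively worded) items as (5 − response), summed and multiplied by 2.5 to give a 0–100 score. A minimal sketch with made-up responses (not the study's data):

```python
# Illustrative sketch of standard SUS scoring; the example responses
# below are invented and do NOT come from the study.
def sus_score(responses: list[int]) -> float:
    """Compute a 0-100 SUS score from ten 1-5 Likert responses."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses on a 1-5 scale")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, 7, 9 (index 0, 2, ...) are positively worded:
        # contribution is (response - 1). Items 2, 4, ... are negatively
        # worded: contribution is (5 - response).
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Hypothetical single questionnaire; a study averages per-participant scores.
print(sus_score([4, 2, 4, 3, 4, 2, 4, 3, 3, 2]))  # -> 67.5
```

A mean of 66 sits just below the commonly cited SUS benchmark of 68, consistent with the paper's framing of foundational viability with room for refinement.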