Interactive Holographic Visualization for 3D Facial Avatar

πŸ“… 2025-02-12
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Traditional medical training relies on flat-screen displays or static models for visualizing dynamic facial expressions, resulting in limited realism and interactivity. To address this, we propose a real-time interactive holographic virtual human system specifically designed for pain assessment training. Our approach pioneers the integration of 3D Gaussian Splatting into viewpoint-adaptive holographic facial projection, synergistically combining real-time viewpoint calibration, stereoscopic holographic rendering, and interactive expression-driven facial modeling. This enables high-fidelity, dynamic, nonverbal feedback rendering from arbitrary viewing angles. To our knowledge, this is the first method to bridge the technical gap in dynamic, interactive 3D facial holography. Evaluated in medical simulation education, it achieves >30 FPS real-time performance while significantly improving medical students’ accuracy in recognizing pain-related facial expressions and enhancing training immersion.

πŸ“ Abstract
Traditional methods for visualizing dynamic human expressions, particularly in medical training, often rely on flat-screen displays or static mannequins, which limit the realism of simulation. In response, we propose a platform built around a 3D interactive facial avatar capable of displaying non-verbal feedback, including pain signals. The avatar is projected onto a stereoscopic, view-dependent 3D display, offering a more immersive and realistic simulated-patient experience for pain assessment practice. No existing solution dynamically predicts and projects interactive 3D facial avatars in real time; to close this gap, we develop a 3D holographic projection system that renders the facial avatar so users can interact with it from any viewpoint. By incorporating 3D Gaussian Splatting (3DGS) and real-time view-dependent calibration, we significantly improve the training environment for accurate pain recognition and assessment.
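The abstract describes a per-frame loop combining viewer tracking, expression-driven facial modeling, and 3DGS rendering at >30 FPS. A minimal structural sketch of such a loop is below; all function names are hypothetical stubs standing in for components the paper does not detail, not the authors' actual API.

```python
import time

def track_viewer():
    """Stub: would return the tracked viewer head position (e.g. from a depth camera)."""
    return (0.5, 0.0, 2.0)

def drive_expression(t):
    """Stub: would map a simulation state to 3DGS facial expression parameters."""
    return {"pain_intensity": t}

def render_splats(viewer_pos, params):
    """Stub: would rasterize the 3D Gaussian avatar for the calibrated viewpoint."""
    return None

def run_frames(n=3, budget_s=1.0 / 30.0):
    """Run n frames and record per-frame time against the 30 FPS budget."""
    timings = []
    for i in range(n):
        t0 = time.perf_counter()
        viewer_pos = track_viewer()          # real-time viewpoint calibration input
        params = drive_expression(i / n)     # interactive expression driving
        render_splats(viewer_pos, params)    # stereoscopic holographic rendering
        timings.append(time.perf_counter() - t0)
    return timings

timings = run_frames()
```

In a real system, each iteration would have to complete within the ~33 ms frame budget to sustain the reported >30 FPS interactivity.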
Problem

Research questions and friction points this paper is trying to address.

Dynamic 3D facial avatar visualization
Real-time interactive holographic projection
Enhanced pain assessment training environment
Innovation

Methods, ideas, or system contributions that make the work stand out.

3D interactive facial avatar
holographic 3D display
real-time view-dependent calibration
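The "real-time view-dependent calibration" listed above amounts to converting a tracked viewer position into a camera pose for rendering. A minimal sketch of that step, assuming a standard look-at construction (not the paper's actual calibration pipeline):

```python
import numpy as np

def look_at(viewer_pos, target=np.zeros(3), up=np.array([0.0, 1.0, 0.0])):
    """Build a world-to-camera rotation R and translation t from a viewer position,
    so the rendered view matches what the viewer would see looking at the target."""
    forward = target - viewer_pos
    forward = forward / np.linalg.norm(forward)
    right = np.cross(forward, up)
    right = right / np.linalg.norm(right)
    true_up = np.cross(right, forward)
    R = np.stack([right, true_up, -forward])  # rows: camera axes in world frame
    t = -R @ viewer_pos                       # places the viewer at the camera origin
    return R, t

# Example: viewer standing 2 m in front of the display, 0.5 m to the right.
R, t = look_at(np.array([0.5, 0.0, 2.0]))
```

Each frame, the tracked head position would feed this pose into the 3DGS rasterizer so the hologram stays geometrically consistent from any viewing angle.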
πŸ”Ž Similar Papers
No similar papers found.
Tri Tung Nguyen Nguyen
Ritsumeikan University
Fujii Yasuyuki
Ritsumeikan University
D. Tran
Ritsumeikan University
Joo-Ho Lee
Professor, Ritsumeikan University
Intelligent Space · System Integration · Machine Learning