GazeFlow: Personalized Ambient Soundscape Generation for Passive Strabismus Self-Monitoring

📅 2026-02-23
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
This study addresses the lack of accessible, passive tools for self-monitoring eye alignment after strabismus surgery, a gap exacerbated by current therapies' reliance on active patient engagement and clinical oversight. The authors propose a browser-based passive monitoring system that leverages a personalized temporal autoencoder to detect ocular misalignment from low-frequency (30 Hz) webcam eye-tracking data. Drawing on calm computing principles, the system conveys deviation severity through ambient audio feedback within the user's peripheral awareness. Key innovations include Binocular Temporal-Frequency Disentanglement (BTFD), Contrastive Biometric Pre-training (CBP), and Gaze-MAML meta-learning, which together mitigate inter-subject variability and the domain shift from high-precision eye trackers to commodity webcams. Evaluated on the GazeBase dataset (N=50), the method achieves an F1 score of 0.84; a preliminary user study (N=6) reports high self-rated eye-movement awareness (5.8/7) and a strong preference for ambient audio feedback over alerts (6.2/7).
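The summary's core detection idea is a personalized autoencoder scored by reconstruction error: windows of a user's own aligned-gaze data define "normal," and unfamiliar patterns reconstruct poorly. The paper's model is a temporal autoencoder; the sketch below substitutes a linear (PCA-based) autoencoder purely to illustrate the reconstruction-error principle on synthetic data. All names (`fit_pca_autoencoder`, `drift_score`) and the window dimensions are illustrative assumptions, not from the paper.

```python
import numpy as np

def fit_pca_autoencoder(windows, k=4):
    # windows: (n, d) flattened gaze windows from one user's aligned baseline.
    # The top-k principal components act as a linear encoder/decoder pair.
    mu = windows.mean(axis=0)
    _, _, Vt = np.linalg.svd(windows - mu, full_matrices=False)
    return mu, Vt[:k]

def drift_score(window, mu, V):
    # Reconstruction error: large when the window leaves the user's
    # personalized "normal gaze" subspace.
    x = window - mu
    recon = V.T @ (V @ x)
    return float(np.linalg.norm(x - recon))

# Synthetic stand-in for baseline gaze windows (200 windows, 30 features each).
rng = np.random.default_rng(0)
baseline = rng.normal(0, 1, (200, 30)) @ rng.normal(0, 1, (30, 30)) * 0.1
mu, V = fit_pca_autoencoder(baseline)

normal = drift_score(baseline[0], mu, V)
drifted = drift_score(baseline[0] + 3.0, mu, V)  # constant offset mimics drift
assert drifted > normal
```

In the paper's setting the encoder is temporal (trained on sequences) and personalized per user; the threshold separating normal from drifted scores would be calibrated on each user's own baseline recordings.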

๐Ÿ“ Abstract
Strabismus affects 2-4% of the population, yet individuals recovering from corrective surgery lack accessible tools for monitoring eye alignment. Dichoptic therapies require active engagement and clinical supervision, limiting their adoption for passive self-awareness. We present GazeFlow, a browser-based self-monitoring system that uses a personalized temporal autoencoder to detect eye drift patterns from webcam-based gaze tracking and provides ambient audio feedback. Unlike alert-based systems, GazeFlow operates according to calm computing principles, morphing musical parameters in proportion to drift severity while remaining in peripheral awareness. We address the challenges of inter-individual variability and domain transfer (from 1000 Hz research-grade trackers to 30 Hz webcams) by introducing Binocular Temporal-Frequency Disentanglement (BTFD), Contrastive Biometric Pre-training (CBP), and Gaze-MAML. We validate our approach on the GazeBase dataset (N=50), achieving F1=0.84 for drift detection, and conduct a preliminary user study (N=6) with participants having intermittent strabismus. Participants reported increased awareness of their eye behaviour (M=5.8/7) and a preference for ambient feedback over alerts (M=6.2/7). We discuss the system's potential for self-awareness applications and outline directions for clinical validation.
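The abstract's calm-computing claim is that feedback should morph musical parameters in proportion to drift severity rather than fire discrete alerts. A minimal sketch of such a mapping is below; the specific parameters (tempo, filter cutoff, detune), their ranges, and the exponential smoothing constant are all illustrative assumptions, not values from the paper.

```python
def ambient_params(severity, prev=None, smoothing=0.9):
    """Map drift severity in [0, 1] to gradually morphing musical parameters.

    Parameter names and ranges are hypothetical stand-ins for whatever the
    synthesis engine exposes; the point is the continuous, proportional mapping.
    """
    s = min(max(severity, 0.0), 1.0)
    target = {
        "tempo_bpm": 60.0 + 20.0 * s,      # gently quicker as drift grows
        "cutoff_hz": 2000.0 - 1500.0 * s,  # darker timbre at higher severity
        "detune_cents": 15.0 * s,          # subtle roughness, never an alarm
    }
    if prev is None:
        return target
    # Exponential smoothing keeps transitions slow enough to stay in
    # peripheral awareness instead of grabbing attention like an alert.
    return {k: smoothing * prev[k] + (1.0 - smoothing) * v
            for k, v in target.items()}

calm = ambient_params(0.0)
step = ambient_params(1.0, prev=calm)  # one smoothed step toward "severe"
assert step["tempo_bpm"] > calm["tempo_bpm"]
```

In a browser deployment these values would feed continuously ramped synthesis parameters (e.g. via Web Audio gain and filter nodes) so that no single update is perceptible as an event.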
Problem

Research questions and friction points this paper is trying to address.

strabismus
self-monitoring
gaze tracking
passive awareness
eye alignment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Binocular Temporal-Frequency Disentanglement
Contrastive Biometric Pre-training
Gaze-MAML
Calm Computing
Ambient Feedback