🤖 AI Summary
Motion-induced degradation of 2D gaze estimation accuracy remains poorly quantified in real-world mobile scenarios. Method: We conducted two user studies, collecting extended facial videos and high-precision ground-truth gaze labels under diverse motion states (including supine rest and maze navigation), and applied statistical regression to disentangle and quantify the independent contributions of three dynamic error sources: eye-to-camera distance, head pose, and device orientation. Contribution/Results: Motion increases gaze estimation error by up to 48.91%. We thus propose a motion-robust evaluation paradigm for adaptive eye tracking, establishing an empirical benchmark for gaze model design, system calibration, and behavioral modeling in dynamic environments. This work provides actionable insights for improving gaze estimation reliability during natural user movement.
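The summary does not specify the exact regression formulation; the sketch below illustrates one plausible setup, assuming ordinary least squares regresses per-sample gaze error on the three dynamic factors named above. All variable names and the synthetic data are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical per-sample measurements; in the studies these would come
# from the recorded facial videos and ground-truth gaze labels.
rng = np.random.default_rng(0)
n = 500
head_distance_cm = rng.uniform(25, 45, n)   # eye-to-camera distance
head_pitch_deg = rng.uniform(-20, 20, n)    # one head-pose angle
device_tilt_deg = rng.uniform(0, 60, n)     # device orientation

# Synthetic gaze error for illustration only (not the reported results).
gaze_error_cm = (0.02 * head_distance_cm
                 + 0.03 * np.abs(head_pitch_deg)
                 + 0.01 * device_tilt_deg
                 + rng.normal(0.0, 0.1, n))

# Design matrix with an intercept column; ordinary least squares yields
# one coefficient per factor, quantifying its independent contribution
# to gaze error while holding the other factors fixed.
X = np.column_stack([np.ones(n), head_distance_cm,
                     np.abs(head_pitch_deg), device_tilt_deg])
coef, *_ = np.linalg.lstsq(X, gaze_error_cm, rcond=None)

for name, beta in zip(["intercept", "head distance (cm)",
                       "|head pitch| (deg)", "device tilt (deg)"], coef):
    print(f"{name}: {beta:+.4f}")
```

Each fitted coefficient estimates how much gaze error grows per unit change in one factor, which is one way the independent contributions could be disentangled.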
📝 Abstract
Mobile gaze tracking infers a user's gaze point or direction on a mobile device's screen from facial images captured by the device's front camera. While this technology inspires a growing number of gaze-interaction applications, achieving consistent accuracy remains challenging due to the dynamic user-device spatial relationships and varied motion conditions inherent in mobile contexts. This paper provides empirical evidence on how user mobility and behaviour affect mobile gaze tracking accuracy. We conduct two user studies collecting behaviour and gaze data under various motion conditions, from lying down to maze navigation, and during different interaction tasks. Quantitative analysis reveals behavioural regularities across daily tasks and identifies head distance, head pose, and device orientation as key factors affecting accuracy, with errors increasing by up to 48.91% under dynamic conditions compared to static ones. These findings highlight the need for more robust, adaptive eye-tracking systems that account for head movements and device deflection to maintain accuracy across diverse mobile contexts.