GazeSwipe: Enhancing Mobile Touchscreen Reachability through Seamless Gaze and Finger-Swipe Integration

📅 2025-03-27
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address thumb reach limitations and degraded interaction efficiency during one-handed operation of large-screen smartphones, this paper proposes a seamless gaze-touch collaborative interaction technique. Methodologically, the authors: (1) introduce a calibration-free gaze estimation method that uses the smartphone's front-facing camera, eliminating the need for specialized hardware or explicit user calibration; (2) design an imperceptible online auto-calibration mechanism that dynamically compensates for gaze estimation errors using swipe trajectory data; and (3) develop a real-time gaze-touch fusion model that fluently maps visual intent to touch actions. Experiments on smartphones and tablets show significant improvements: the task success rate increases notably, average task completion time decreases by 19.3%, user preference rises by 42%, and gaze accuracy improves by 37% after auto-calibration. The approach delivers a lightweight, robust, and deployable interaction paradigm tailored to large-screen mobile devices.

📝 Abstract
Smartphones with large screens provide users with increased display and interaction space but pose challenges in reaching certain areas with the thumb when using the device with one hand. To address this, we introduce GazeSwipe, a multimodal interaction technique that combines eye gaze with finger-swipe gestures, enabling intuitive and low-friction reach on mobile touchscreens. Specifically, we design a gaze estimation method that eliminates the need for explicit gaze calibration. Our approach also avoids the use of additional eye-tracking hardware by leveraging the smartphone's built-in front-facing camera. Considering the potential decrease in gaze accuracy without dedicated eye trackers, we use finger-swipe gestures to compensate for any inaccuracies in gaze estimation. Additionally, we introduce a user-unaware auto-calibration method that improves gaze accuracy during interaction. Through extensive experiments on smartphones and tablets, we compare our technique with various methods for touchscreen reachability and evaluate the performance of our auto-calibration strategy. The results demonstrate that our method achieves high success rates and is preferred by users. The findings also validate the effectiveness of the auto-calibration strategy.
Problem

Research questions and friction points this paper is trying to address.

Enhancing mobile touchscreen reachability with gaze and swipe
Eliminating explicit gaze calibration for user convenience
Compensating gaze inaccuracies using finger-swipe gestures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines eye gaze with finger-swipe gestures
Uses smartphone's front camera for gaze tracking
Auto-calibration improves gaze accuracy dynamically
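The innovations above suggest a simple control loop: gaze supplies a coarse target, the finger swipe refines it, and each confirmed selection feeds back into an online estimate of the gaze error. The sketch below illustrates that general idea only; the class, method names, and the exponential-moving-average update rule are illustrative assumptions, not the paper's actual model.

```python
# Minimal sketch of gaze-swipe fusion with user-unaware auto-calibration,
# in the spirit of GazeSwipe. All names and the update rule are
# illustrative assumptions, not the paper's implementation.

class GazeSwipeCursor:
    def __init__(self, learning_rate=0.1):
        self.bias = (0.0, 0.0)   # running estimate of gaze error (px)
        self.lr = learning_rate  # how quickly auto-calibration adapts

    def target(self, gaze_xy, swipe_dxy):
        """Coarse gaze point, bias-corrected, refined by the swipe offset."""
        gx, gy = gaze_xy
        bx, by = self.bias
        dx, dy = swipe_dxy
        return (gx - bx + dx, gy - by + dy)

    def auto_calibrate(self, gaze_xy, selected_xy):
        """After a confirmed selection, nudge the bias estimate toward the
        observed gaze error (raw gaze minus the target the user chose)."""
        ex = gaze_xy[0] - selected_xy[0]
        ey = gaze_xy[1] - selected_xy[1]
        bx, by = self.bias
        self.bias = (bx + self.lr * (ex - bx), by + self.lr * (ey - by))


cursor = GazeSwipeCursor()
# Gaze lands near the target; the swipe nudges the cursor the rest of the way.
print(cursor.target((120.0, 300.0), (5.0, -8.0)))    # (125.0, 292.0)
# The confirmed selection reveals the gaze error, updating the bias estimate.
cursor.auto_calibrate((120.0, 300.0), (110.0, 295.0))
print(cursor.bias)                                    # (1.0, 0.5)
```

The key property this sketch captures is that calibration data comes for free from normal use: every swipe-confirmed selection is an implicit ground-truth sample, so accuracy improves without an explicit calibration step.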
Zhuojiang Cai
Technical University of Munich
Human-Computer Interaction · Computer Vision
Jingkai Hong
State Key Lab. of VR Technology and Systems, Beihang University, Beijing, China
Zhimin Wang
State Key Lab. of VR Technology and Systems, Beihang University, Beijing, China
Feng Lu
State Key Lab. of VR Technology and Systems, Beihang University, Beijing, China