🤖 AI Summary
Existing micro-gesture recognition methods assume unconstrained hand motion and break down when users are holding objects. To address this, we propose a seamless interaction approach based on wrist-worn active acoustic sensing. By emitting ultrasonic signals and analyzing reflections from the coupled hand–object system, our method simultaneously captures micro-gestures, grasping poses, and object geometry, without modifying existing objects. This work presents the first single-device solution for jointly recognizing 30 micro-gesture classes under handheld conditions, achieved through an integrated time-frequency signal representation and a multi-task deep neural network. In experiments with 10 participants manipulating 25 everyday objects, the system achieves a mean accuracy of 92.0%; robustness is further validated on 10 deformable objects. The associated dataset is publicly released.
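To make the "time-frequency signal representation" concrete, here is a minimal sketch of how a received ultrasonic frame might be turned into a log-magnitude spectrogram image for the network. The summary does not specify the paper's actual encoding, so the sample rate, transmit band, and window length below are illustrative assumptions, not the authors' parameters.

```python
import numpy as np
from scipy.signal import stft

# Hypothetical parameters -- assumptions for illustration, not from the paper.
FS = 50_000        # microphone sample rate (Hz)
F_TX = 18_000      # lowest emitted ultrasonic frequency (Hz)
NPERSEG = 512      # STFT window length (samples)

def time_frequency_frame(rx: np.ndarray) -> np.ndarray:
    """Convert one window of received audio into a log-magnitude
    time-frequency image, keeping only the ultrasonic band where
    reflections off the hand-object system appear."""
    f, t, Z = stft(rx, fs=FS, nperseg=NPERSEG, noverlap=NPERSEG // 2)
    band = f >= F_TX                 # discard audible-band bins
    mag = np.abs(Z[band])            # |STFT| restricted to the ultrasonic band
    return np.log1p(mag)             # compress dynamic range

# Example: 0.5 s of synthetic received signal.
rx = np.random.randn(FS // 2)
img = time_frequency_frame(rx)
print(img.shape)  # (ultrasonic_bins, time_frames) -> input image for the network
```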
📝 Abstract
As computing devices become increasingly integrated into daily life, there is a growing need for intuitive, always-available interaction methods, even when users' hands are occupied. In this paper, we introduce Grab-n-Go, the first wearable device that leverages active acoustic sensing to recognize subtle hand microgestures while holding various objects. Unlike prior systems that focus solely on free-hand gestures or basic hand-object activity recognition, Grab-n-Go simultaneously captures information about hand microgestures, grasping poses, and object geometries using a single wristband, enabling the recognition of fine-grained hand movements within activities where the hands are occupied. A deep learning framework processes these complex signals to identify 30 distinct microgestures, 6 for each of the 5 grasping poses. In a user study with 10 participants and 25 everyday objects, Grab-n-Go achieved an average recognition accuracy of 92.0%. A follow-up study further validated Grab-n-Go's robustness on 10 more challenging, deformable objects. These results underscore the potential of Grab-n-Go to provide seamless, unobtrusive interactions without requiring modifications to existing objects. The complete dataset, comprising data from 18 participants performing 30 microgestures with 35 distinct objects, is publicly available at https://github.com/cjlisalee/Grab-n-Go_Data (DOI: https://doi.org/10.7298/7kbd-vv75).
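The 30 microgesture classes factor as 5 grasping poses × 6 microgestures, which suggests a multi-task formulation with a shared encoder and separate classification heads. The PyTorch-style sketch below illustrates that idea only; the backbone, layer sizes, and loss weighting are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

N_POSES, N_GESTURES = 5, 6   # 30 combined classes = 5 poses x 6 gestures

class MultiTaskNet(nn.Module):
    """Illustrative multi-task model: a shared CNN encoder over the
    time-frequency image, with separate heads for grasping pose and
    microgesture. Sizes are assumptions, not the paper's."""
    def __init__(self, in_ch: int = 1):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.pose_head = nn.Linear(32, N_POSES)
        self.gesture_head = nn.Linear(32, N_GESTURES)

    def forward(self, x):
        z = self.encoder(x)
        return self.pose_head(z), self.gesture_head(z)

# Joint loss over both tasks; pairing one pose prediction with one
# gesture prediction yields one of the 30 combined classes.
model = MultiTaskNet()
x = torch.randn(8, 1, 64, 64)  # batch of time-frequency images
pose_logits, gesture_logits = model(x)
loss = (nn.functional.cross_entropy(pose_logits, torch.randint(0, N_POSES, (8,)))
        + nn.functional.cross_entropy(gesture_logits, torch.randint(0, N_GESTURES, (8,))))
loss.backward()
```

Factoring the output this way, rather than training a flat 30-way classifier, lets the pose head share evidence across gestures performed under the same grasp; whether the paper uses this exact decomposition is not stated here.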