Grab-n-Go: On-the-Go Microgesture Recognition with Objects in Hand

📅 2025-08-15
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing microgesture recognition methods struggle when users' hands are occupied, since they assume unconstrained free-hand motion. To address this, we propose a seamless interaction approach based on wrist-worn active acoustic sensing. By emitting ultrasonic signals and analyzing reflections from the hand-object coupled system, the method simultaneously captures microgestures, grasping poses, and object geometric features, without modifying existing objects. This work presents the first single-device solution for joint recognition of 30 microgesture classes under handheld conditions, achieved through an integrated time-frequency signal representation and a multi-task deep neural network. In experiments with 10 participants manipulating 25 everyday objects, the system achieves a mean accuracy of 92.0%; robustness is further validated on 10 deformable objects. The associated dataset is publicly released.

📝 Abstract
As computing devices become increasingly integrated into daily life, there is a growing need for intuitive, always-available interaction methods, even when users' hands are occupied. In this paper, we introduce Grab-n-Go, the first wearable device that leverages active acoustic sensing to recognize subtle hand microgestures while holding various objects. Unlike prior systems that focus solely on free-hand gestures or basic hand-object activity recognition, Grab-n-Go simultaneously captures information about hand microgestures, grasping poses, and object geometries using a single wristband, enabling the recognition of fine-grained hand movements occurring within activities involving occupied hands. A deep learning framework processes these complex signals to identify 30 distinct microgestures, with 6 microgestures for each of the 5 grasping poses. In a user study with 10 participants and 25 everyday objects, Grab-n-Go achieved an average recognition accuracy of 92.0%. A follow-up study further validated Grab-n-Go's robustness against 10 more challenging, deformable objects. These results underscore the potential of Grab-n-Go to provide seamless, unobtrusive interactions without requiring modifications to existing objects. The complete dataset, comprising data from 18 participants performing 30 microgestures with 35 distinct objects, is publicly available at https://github.com/cjlisalee/Grab-n-Go_Data with the DOI: https://doi.org/10.7298/7kbd-vv75.
Problem

Research questions and friction points this paper is trying to address.

Recognizing hand microgestures while holding objects
Enabling interaction with occupied hands using wearables
Distinguishing fine-grained hand movements and object geometries
Innovation

Methods, ideas, or system contributions that make the work stand out.

Active acoustic sensing for microgesture recognition
Single wristband captures hand and object data
Deep learning identifies 30 distinct microgestures
👥 Authors
Chi-Jung Lee, Cornell University (HCI)
Jiaxin Li, Cornell University, Ithaca, New York, USA
Tianhong Catherine Yu, PhD Student, Cornell University
Ruidong Zhang, Cornell University (Ubiquitous computing, Wearable computing)
Vipin Gunda, Cornell University, Ithaca, New York, USA
François Guimbretière, Cornell University, Ithaca, New York, USA
Cheng Zhang, Cornell University, Ithaca, New York, USA