Leveraging GCN-based Action Recognition for Teleoperation in Daily Activity Assistance

📅 2025-04-09
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
To address the high physical workload and poor naturalness in remote robotic assistance for Activities of Daily Living (ADLs) in home-based elderly care, this paper proposes an intuitive teleoperation framework grounded in human motion recognition. Methodologically, it introduces a novel action-driven paradigm integrating a lightweight Spatio-Temporal Graph Convolutional Network (S-ST-GCN) with a Finite State Machine (FSM): S-ST-GCN enables efficient real-time pose estimation and motion classification, while FSM models semantic action states, filters misclassifications, and maps recognized actions directly to pre-defined robot trajectories—bypassing conventional kinematic constraints. Experiments demonstrate statistically significant reductions in operator limb fatigue (p < 0.01), a motion recognition accuracy of 92.3%, and ADL task success rates exceeding 89% (e.g., object handing, assisted standing). These results validate the framework’s robustness, low cognitive–physical load, and practical applicability in real-world assisted living scenarios.

Technology Category

Application Category

📝 Abstract
Caregiving of older adults is an urgent global challenge, with many older adults preferring to age in place rather than enter residential care. However, providing adequate home-based assistance remains difficult, particularly in geographically vast regions. Teleoperated robots offer a promising solution, but conventional motion-mapping teleoperation imposes unnatural movement constraints on operators, leading to muscle fatigue and reduced usability. This paper presents a novel teleoperation framework that leverages action recognition to enable intuitive remote robot control. Using our simplified Spatio-Temporal Graph Convolutional Network (S-ST-GCN), the system recognizes human actions and executes corresponding preset robot trajectories, eliminating the need for direct motion synchronization. A finite-state machine (FSM) is integrated to enhance reliability by filtering out misclassified actions. Our experiments demonstrate that the proposed framework enables effortless operator movement while ensuring accurate robot execution. This proof-of-concept study highlights the potential of teleoperation with action recognition for enabling caregivers to remotely assist older adults during activities of daily living (ADLs). Future work will focus on improving the S-ST-GCN's recognition accuracy and generalization, integrating advanced motion planning techniques to further enhance robotic autonomy in older adult care, and conducting a user study to evaluate the system's telepresence and ease of control.
Problem

Research questions and friction points this paper is trying to address.

Enabling intuitive remote robot control for elderly care assistance
Reducing operator fatigue via action recognition-based teleoperation
Improving teleoperation reliability using finite-state machine filtering
Innovation

Methods, ideas, or system contributions that make the work stand out.

GCN-based action recognition for robot control
Finite-state machine filters misclassified actions
Preset robot trajectories replace motion synchronization
🔎 Similar Papers
No similar papers found.
Thomas M. Kwok
Thomas M. Kwok
University of Waterloo
RoboticsExoskeletonHuman-robot interactionAI & Machine learningTeleoperation
J
Jiaan Li
Department of Mechanical and Mechatronics Engineering, University of Waterloo, Canada
Y
Yue Hu
Department of Mechanical and Mechatronics Engineering, University of Waterloo, Canada