Biomechanically consistent real-time action recognition for human-robot interaction

📅 2025-10-21
🤖 AI Summary
This work addresses the challenge of real-time, biomechanically plausible human action recognition in industrial settings using standard 2D cameras. We propose an end-to-end framework that takes joint angles—not joint coordinates—as input, integrating human kinematic modeling and biomechanical priors into a lightweight Transformer architecture equipped with a temporal smoothing mechanism. This design significantly enhances robustness against pose variations, inter-subject anatomical differences, and camera viewpoint shifts, enabling truly low-latency online interaction. Evaluated on a custom industrial action dataset comprising 11 subjects, our method achieves 88% classification accuracy, outperforming mainstream real-time baselines. Furthermore, it successfully enables real-time closed-loop control of a simulated robot, demonstrating practical applicability in industrial automation scenarios.
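The summary mentions a temporal smoothing mechanism that stabilizes per-frame predictions for low-latency online use. The paper's exact mechanism is not detailed here; as a hypothetical sketch, an exponential moving average over per-frame class logits illustrates the idea (the function name and α value are illustrative, not from the paper):

```python
import numpy as np

def smooth_predictions(frame_logits, alpha=0.8):
    """Exponential moving average over per-frame class logits.

    frame_logits: (T, C) array of raw classifier outputs, one row per frame.
    alpha: smoothing factor; larger values weight past frames more heavily.
    Returns the smoothed (T, C) trajectory.
    """
    smoothed = np.empty_like(frame_logits, dtype=float)
    smoothed[0] = frame_logits[0]
    for t in range(1, len(frame_logits)):
        smoothed[t] = alpha * smoothed[t - 1] + (1 - alpha) * frame_logits[t]
    return smoothed

# Noisy per-frame scores for 2 classes, with a single-frame glitch at t=2:
logits = np.array([[2.0, 0.1],
                   [2.1, 0.2],
                   [0.1, 2.5],   # spurious one-frame flip
                   [2.0, 0.1],
                   [2.2, 0.3]])
smoothed = smooth_predictions(logits, alpha=0.8)
labels = smoothed.argmax(axis=1)
print(labels)  # the transient flip at t=2 is suppressed
```

Causal smoothing like this preserves online operation: each frame's output depends only on past frames, so no look-ahead latency is introduced.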

📝 Abstract
This paper presents a novel framework for real-time human action recognition in industrial contexts using standard 2D cameras. We introduce a complete pipeline for robust, real-time estimation of human joint kinematics, fed to a temporally smoothed Transformer-based network for action recognition. We evaluate our approach on a new dataset of 11 subjects performing various actions. Unlike most of the literature, which relies on joint center positions (JCP) and operates offline, our method uses biomechanical priors, e.g., joint angles, for fast and robust real-time recognition. Moreover, joint angles make the proposed method agnostic to sensor and subject poses as well as to anthropometric differences, ensuring robustness across environments and subjects. Our learning model outperforms the best real-time baseline along various metrics, achieving 88% accuracy and strong generalization to subjects not facing the cameras. Finally, we demonstrate the robustness and usefulness of our technique through an online interaction experiment, with a simulated robot controlled in real-time via the recognized actions.
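The abstract argues that joint angles, unlike joint center positions, are invariant to sensor and subject pose and to anthropometric differences. A toy sketch of this property, with hypothetical keypoint names (the actual pipeline estimates full joint kinematics, not just 2D angles): the interior angle at a joint is unchanged when all 2D keypoints undergo an in-plane rotation and scaling, whereas the raw coordinates change completely.

```python
import numpy as np

def joint_angle(a, b, c):
    """Interior angle at joint b (radians) from keypoints a-b-c."""
    u, v = np.asarray(a) - np.asarray(b), np.asarray(c) - np.asarray(b)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Illustrative shoulder-elbow-wrist keypoints forming a right angle:
shoulder, elbow, wrist = [0.0, 1.0], [0.0, 0.0], [1.0, 0.0]
theta = joint_angle(shoulder, elbow, wrist)

# Rotate and scale all keypoints (camera rotation + taller subject):
ang = np.deg2rad(30)
R = np.array([[np.cos(ang), -np.sin(ang)],
              [np.sin(ang),  np.cos(ang)]])
transform = lambda p: 1.7 * (R @ np.asarray(p))
theta2 = joint_angle(*map(transform, (shoulder, elbow, wrist)))

assert np.isclose(theta, theta2)  # angle unchanged; coordinates are not
```

This invariance holds exactly for in-plane similarity transforms; robustness to out-of-plane viewpoint shifts additionally relies on the kinematic modeling described in the paper.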
Problem

Research questions and friction points this paper is trying to address.

Real-time human action recognition using biomechanical joint angles
Robust recognition across different subjects and environments
Enabling real-time human-robot interaction through action recognition
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses biomechanical joint angles for recognition
Implements Transformer network with temporal smoothing
Enables real-time human-robot interaction via actions
Wanchen Li
LIRMM, University of Montpellier, Montpellier, France
Kahina Chalabi
LAAS-CNRS, Université Paul Sabatier, CNRS, Toulouse, France
Maxime Sabbah
LAAS-CNRS, Gepetto-group
collaborative robotics · optimal control · human motion analysis
Thomas Bousquet
LIRMM, University of Montpellier, Montpellier, France; LAAS-CNRS, Université Paul Sabatier, CNRS, Toulouse, France
Robin Passama
LIRMM, University of Montpellier, Montpellier, France
Sofiane Ramdani
LIRMM, University of Montpellier, Montpellier, France
Andrea Cherubini
Nantes Université, École Centrale Nantes, CNRS, LS2N, UMR 6004, 1, rue de la Noe, 44321 Nantes, France
Vincent Bonnet
LAAS-CNRS, GEPETTO Team, University of Toulouse
Humanoid robot · Identification · Rehabilitation · Optimisation · Inertial Measurement Unit