Joint Moment Estimation for Hip Exoskeleton Control: A Generalized Moment Feature Generation Method

๐Ÿ“… 2024-10-01
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
Existing hip joint torque estimation methods lack personalization, resulting in poor cross-subject generalizability and limiting real-time exoskeleton assistance accuracy. To address this, we propose a subject-specific, instantaneous hip torque estimation method relying solely on encoder-based kinematic data. Our approach introduces a Generalized Moment Feature (GMF) generation framework that learns subject-invariant, decodable torque representations; critically, it is the first to embed anthropometric parameters in a differentiable manner within a GRU-based temporal model, enabling joint optimization of generalizability and personalization. The end-to-end architecture achieves an RMSE of 0.1180 Nm/kg on treadmill testsโ€”improving by 6.5% over non-parameter-fused and 8.3% over conventional parameter-fused baselines. Real-world validation demonstrates a statistically significant 20.5% reduction in average metabolic cost during level-ground walking (p < 0.01).

Technology Category

Application Category

๐Ÿ“ Abstract
Hip joint moments during walking are the key foundation for hip exoskeleton assistance control. Most recent studies have shown estimating hip joint moments instantaneously offers a lot of advantages compared to generating assistive torque profiles based on gait estimation, such as simple sensor requirements and adaptability to variable walking speeds. However, existing joint moment estimation methods still suffer from a lack of personalization, leading to estimation accuracy degradation for new users. To address the challenges, this paper proposes a hip joint moment estimation method based on generalized moment features (GMF). A GMF generator is constructed to learn GMF of the joint moment which is invariant to individual variations while remaining decodable into joint moments through a dedicated decoder. Utilizing this well-featured representation, a GRU-based neural network is used to predict GMF with joint kinematics data, which can easily be acquired by hip exoskeleton encoders. The proposed estimation method achieves a root mean square error of 0.1180 Nm/kg under 28 walking speed conditions on a treadmill dataset, improved by 6.5% compared to the model without body parameter fusion, and by 8.3% for the conventional fusion model with body parameter. Furthermore, the proposed method was employed on a hip exoskeleton with only encoder sensors and achieved an average 20.5% metabolic reduction (p<0.01) for users compared to assist-off condition in level-ground walking.
Problem

Research questions and friction points this paper is trying to address.

Estimating hip joint moments for exoskeleton control
Improving accuracy by addressing lack of personalization
Reducing metabolic cost in level-ground walking
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalized moment features for joint estimation
GRU-based network predicts moment features
Encoder-only sensors enable metabolic reduction
๐Ÿ”Ž Similar Papers
Y
Yuanwen Zhang
Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems and Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen 518055, China, and also with the Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
J
Jingfeng Xiong
Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems and Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen 518055, China, and also with the Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
H
Haolan Xian
Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems and Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen 518055, China, and also with the Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
C
Chuheng Chen
Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems and Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen 518055, China, and also with the Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
X
Xinxing Chen
Shenzhen Key Laboratory of Biomimetic Robotics and Intelligent Systems and Guangdong Provincial Key Laboratory of Human-Augmentation and Rehabilitation Robotics in Universities, Southern University of Science and Technology, Shenzhen 518055, China, and also with the Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China.
Chenglong Fu
Chenglong Fu
Southern University of Science and Technology
Yuquan Leng
Yuquan Leng
Harbin Institute of Technology
RoboticsLower limb exoskeleton