DogLegs: Robust Proprioceptive State Estimation for Legged Robots Using Multiple Leg-Mounted IMUs

📅 2025-03-06
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited robustness of legged-robot body-state estimation under extreme conditions where exteroceptive sensors (e.g., LiDAR, cameras) fail, this paper proposes a tightly coupled proprioceptive method that fuses a torso-mounted IMU, multiple foot-mounted IMUs, and joint encoders. Within an extended Kalman filter framework, the authors formulate a joint error model over all IMU frames and incorporate inter-frame relative position constraints. Leveraging foot-contact detection and leg-kinematic constraints, the method applies zero-velocity updates and suppresses drift in the individual IMU frames. Experimental results demonstrate that the approach significantly outperforms conventional leg odometry relying solely on a torso IMU and joint encoders across diverse unstructured terrains, achieving substantially improved body-pose estimation accuracy. All experimental datasets are publicly released.

📝 Abstract
Robust and accurate proprioceptive state estimation of the main body is crucial for legged robots to execute tasks in extreme environments where exteroceptive sensors, such as LiDARs and cameras, may become unreliable. In this paper, we propose DogLegs, a state estimation system for legged robots that fuses the measurements from a body-mounted inertial measurement unit (Body-IMU), joint encoders, and multiple leg-mounted IMUs (Leg-IMUs) using an extended Kalman filter (EKF). The filter system contains the error states of all IMU frames. The Leg-IMUs are used to detect foot contact, thereby providing zero-velocity measurements to update the states of the Leg-IMU frames. Additionally, we compute the relative position constraints between the Body-IMU and Leg-IMUs from the leg kinematics and use them to update the main body state and reduce the error drift of the individual IMU frames. Field experimental results show that our proposed system achieves better state estimation accuracy than the traditional leg odometry method (using only the Body-IMU and joint encoders) across different terrains. We make our datasets publicly available to benefit the research community.
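The zero-velocity update described in the abstract can be illustrated with a minimal EKF measurement step. This is a hedged sketch, not the paper's implementation: the 9-dimensional error-state layout, the function name `zupt_update`, and the noise value `sigma_v` are illustrative assumptions.

```python
import numpy as np

def zupt_update(x, P, v_meas, sigma_v=0.01):
    """Apply a zero-velocity pseudo-measurement to one Leg-IMU error state.

    Assumed (illustrative) state layout:
    x       : (9,) error state [position(3), velocity(3), attitude(3)]
    P       : (9, 9) error covariance
    v_meas  : (3,) estimated foot velocity at the detected contact
    sigma_v : std. dev. of the zero-velocity pseudo-measurement (m/s)
    """
    H = np.zeros((3, 9))
    H[:, 3:6] = np.eye(3)              # measurement observes the velocity block
    R = (sigma_v ** 2) * np.eye(3)     # pseudo-measurement noise
    z = np.zeros(3) - v_meas           # innovation: a foot in contact is static
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x = x + K @ (z - H @ x)            # state correction
    P = (np.eye(9) - K @ H) @ P        # covariance update
    return x, P
```

In a full filter this update would fire only while the Leg-IMU's contact detector reports a stable stance phase; applying it during swing would bias the estimate.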
Problem

Research questions and friction points this paper is trying to address.

Improves legged robot state estimation in extreme environments
Fuses Body-IMU, Leg-IMUs, and joint encoders using EKF
Reduces error drift and enhances accuracy across terrains
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fuses Body-IMU, Leg-IMUs, and joint encoders
Uses extended Kalman filter for error correction
Detects foot contact via Leg-IMUs for zero velocity
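The inter-frame constraint listed above (relative position between the Body-IMU and a Leg-IMU predicted by leg kinematics) can also be sketched as an EKF update. Everything here is a simplified assumption: the toy planar two-link `fk`, the reduced 6-state layout `[p_body, p_leg]`, and the noise value `sigma_k` stand in for the paper's full kinematic chain and error state.

```python
import numpy as np

def fk(joint_angles, link_lengths=(0.2, 0.2)):
    """Toy planar 2-link forward kinematics: foot position in the body frame."""
    q1, q2 = joint_angles
    l1, l2 = link_lengths
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    z = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return np.array([x, 0.0, z])

def relative_position_update(x, P, joint_angles, sigma_k=0.005):
    """EKF update: p_leg - p_body should match the kinematic prediction.

    x : (6,) reduced state [p_body(3), p_leg(3)]; P : (6, 6) covariance.
    """
    H = np.hstack([-np.eye(3), np.eye(3)])     # Jacobian of p_leg - p_body
    R = (sigma_k ** 2) * np.eye(3)             # kinematic measurement noise
    z = fk(joint_angles) - (x[3:6] - x[0:3])   # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ z
    P = (np.eye(6) - K @ H) @ P
    return x, P
```

Because the same constraint touches both the body and leg position blocks, each update transfers information between frames, which is how the shared kinematics suppress the drift of the individual IMUs.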
Yibin Wu
University of Bonn
SLAM, Inertial Navigation, Sensor Fusion, Deep Learning
Jian Kuang
GNSS Research Center, Wuhan University, China
Shahram Khorshidi
PhD Researcher, University of Bonn
Legged Robot Control, Optimization, System Identification, State Estimation
Xiaoji Niu
GNSS Research Center, Wuhan University, China
L. Klingbeil
Center for Robotics, University of Bonn, Germany; Institute of Geodesy and Geoinformation, University of Bonn, Germany
Maren Bennewitz
Professor of Computer Science, University of Bonn, Germany
Mobile Robotics, Humanoid Robots, Robot Learning, HRI
H. Kuhlmann
Center for Robotics, University of Bonn, Germany; Institute of Geodesy and Geoinformation, University of Bonn, Germany