🤖 AI Summary
This work addresses the degradation of localization accuracy in legged robots operating in geometrically sparse environments, such as narrow tunnels, where LiDAR measurements provide insufficient geometric constraints. To mitigate this issue, the authors propose an adaptive, tightly coupled fusion framework that integrates LiDAR, IMU, and legged odometry within an error-state Kalman filter. The method incorporates an online observability-aware mechanism that evaluates environmental degeneracy and adaptively adjusts each sensor's observation noise covariance accordingly. Experimental results demonstrate that, in degenerate scenarios such as long corridors, the proposed approach significantly outperforms existing methods in both localization accuracy and robustness.
📝 Abstract
This paper addresses accurate localization for quadrupedal robots operating in narrow, tunnel-like environments. Because such environments are long and geometrically homogeneous, LiDAR measurements often provide weak geometric constraints, leaving traditional sensor fusion methods susceptible to accumulated motion estimation error. To address this challenge, we propose AIMS, an adaptive LiDAR-IMU-leg odometry fusion method for robust quadrupedal robot localization in degenerate environments. The method is formulated within an error-state Kalman filtering framework: LiDAR and leg odometry measurements are fused with IMU-based state prediction, and the measurement noise covariance matrices are adaptively adjusted through an online, degeneracy-aware reliability assessment. Experiments in narrow corridor environments show that the proposed method improves localization accuracy and robustness over state-of-the-art approaches.
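To illustrate the core idea of degeneracy-aware covariance adaptation, the sketch below scales a LiDAR measurement noise covariance by the conditioning of the scan-matching information matrix. This is a minimal, hypothetical example under assumed conventions (point-to-plane residual Jacobians, an eigenvalue threshold, and a linear inflation law); the function names, threshold, and scaling rule are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def degeneracy_noise_scale(J, lam_thresh=100.0):
    """Assess geometric degeneracy of a LiDAR scan-matching problem.

    J: (N, 6) stacked Jacobian of point-to-plane residuals w.r.t. the
    6-DoF pose. In a long, featureless tunnel the column of J along
    the tunnel axis is weakly excited, so the Gauss-Newton information
    matrix H = J^T J has a near-zero eigenvalue in that direction.

    Returns a scale >= 1 used to inflate the LiDAR measurement noise
    covariance when the weakest direction is poorly constrained.
    (Threshold and inflation law are assumptions for illustration.)
    """
    H = J.T @ J                       # 6x6 information matrix
    lam_min = np.linalg.eigvalsh(H)[0]  # smallest eigenvalue (ascending order)
    # Inflate noise as the smallest eigenvalue drops below the threshold;
    # a well-constrained scan (lam_min >= lam_thresh) keeps scale = 1.
    return max(1.0, lam_thresh / max(lam_min, 1e-9))

def adapt_measurement_covariance(R_nominal, scale):
    """Inflate the nominal measurement noise covariance R for the
    ESKF update, down-weighting the degenerate sensor."""
    return scale * R_nominal
```

In an ESKF update loop, `adapt_measurement_covariance` would be applied to the LiDAR residual's `R` before computing the Kalman gain, so that in degenerate stretches the filter leans more on the IMU prediction and leg odometry.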