Robust Tightly-Coupled Filter-Based Monocular Visual-Inertial State Estimation and Graph-Based Evaluation for Autonomous Drone Racing

๐Ÿ“… 2026-03-03
๐Ÿ“ˆ Citations: 0
โœจ Influential: 0
๐Ÿ“„ PDF
๐Ÿค– AI Summary
This work addresses the challenge of achieving robust and computationally efficient state estimation for high-speed drone racing in GNSS-denied and motion-capture-free environments. The authors propose ADR-VINS, a tightly coupled monocular visual-inertial navigation system based on an error-state Kalman filter that directly incorporates pixel reprojection errors of gate corner features as update termsโ€”requiring only two visible corners and eliminating the need for conventional PnP solvers. A robust reweighting strategy replaces RANSAC to suppress outliers. Furthermore, they introduce ADR-FGO, a factor graph optimization framework that, for the first time, enables high-precision offline reference trajectory generation without pre-installed track infrastructure. Evaluated on the TII-RATM dataset, ADR-VINS achieves an average translational RMS error of 0.134 m, while ADR-FGO reaches 0.060 m, and the system was successfully deployed in the A2RL racing competition, maintaining stable estimation at speeds up to 20.9 m/s.

๐Ÿ“ Abstract
Autonomous drone racing (ADR) demands state estimation that is simultaneously computationally efficient and resilient to the perceptual degradation experienced during extreme velocities and maneuvers. Traditional frameworks typically rely on conventional visual-inertial pipelines with loosely-coupled gate-based Perspective-n-Point (PnP) corrections, which suffer from a rigid requirement for four visible features and from information loss in intermediate steps. Furthermore, the absence of GNSS and motion-capture systems in uninstrumented, competitive racing environments makes objective evaluation of such systems remarkably difficult. To address these limitations, we propose ADR-VINS, a robust monocular visual-inertial state estimation framework based on an Error-State Kalman Filter (ESKF) tailored for autonomous drone racing. Our approach integrates direct pixel reprojection errors from gate corner features as innovation terms within the filter. By bypassing intermediate PnP solvers, ADR-VINS maintains valid state updates with as few as two visible corners and uses robust reweighting instead of RANSAC-based schemes to handle outliers, enhancing computational efficiency. Furthermore, we introduce ADR-FGO, an offline Factor-Graph Optimization framework that generates high-fidelity reference trajectories, facilitating post-flight performance evaluation and analysis in uninstrumented, GNSS-denied environments. The proposed system is validated on the TII-RATM dataset, where ADR-VINS achieves an average RMS translation error of 0.134 m, while ADR-FGO yields 0.060 m as a smoothing-based reference. Finally, ADR-VINS was successfully deployed in the A2RL Drone Championship Season 2, maintaining stable and robust estimation despite noisy detections during high-agility flight at top speeds of 20.9 m/s. We further utilize ADR-FGO for post-flight evaluation in uninstrumented racing environments.
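The core idea described above is to feed gate-corner pixel reprojection residuals directly into the filter update and to suppress outliers with a robust weight rather than RANSAC. A minimal sketch of that pattern is below; the function names, Huber loss, and all numeric values are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: an EKF-style update driven by stacked gate-corner
# reprojection residuals, with per-corner Huber reweighting in place of
# RANSAC. All names and constants are assumptions for illustration.
import numpy as np

def huber_weights(residuals, delta=1.0):
    """Per-corner Huber weight in (0, 1]: 1 for small residuals,
    delta/|r| beyond the threshold, down-weighting outlier corners."""
    norms = np.linalg.norm(residuals.reshape(-1, 2), axis=1)
    w = np.ones_like(norms)
    mask = norms > delta
    w[mask] = delta / norms[mask]
    return np.repeat(w, 2)  # one weight per pixel coordinate (u, v)

def reweighted_update(x, P, z, h, H, R_pix=2.0, delta=3.0):
    """One robust filter update.
    x: state mean (n,), P: covariance (n, n),
    z: stacked detected corner pixels (2k,),
    h: predicted projections h(x) (2k,), H: Jacobian of h at x (2k, n)."""
    r = z - h                        # stacked reprojection residuals
    w = huber_weights(r, delta)      # robust reweighting, no RANSAC
    # Inflate measurement noise for down-weighted (outlier) corners.
    R = np.diag(R_pix / np.maximum(w, 1e-6))
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x + K @ r
    P_new = (np.eye(P.shape[0]) - K @ H) @ P
    return x_new, P_new
```

Because the residuals are stacked per corner, the update stays well-posed with only two visible corners (four residual rows), whereas a PnP correction would need at least four corners before producing any measurement at all.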
Problem

Research questions and friction points this paper is trying to address.

autonomous drone racing
visual-inertial state estimation
perceptual degradation
GNSS-denied environment
performance evaluation
Innovation

Methods, ideas, or system contributions that make the work stand out.

visual-inertial state estimation
Error-State Kalman Filter
robust reweighting
Factor-Graph Optimization
autonomous drone racing
Maulana Bisyir Azhari
Korea Advanced Institute of Science and Technology
Robotics · Computer Vision · SLAM
Donghun Han
Unmanned Systems Research Group (USRG), School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 305701, South Korea
SungJun Park
Unmanned Systems Research Group (USRG), School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon 305701, South Korea
David Hyunchul Shim
Professor, School of Electrical Engineering; Director, Korea RPAS Research Center, KAIST
Unmanned Systems · Robotic Systems