MILUV: A Multi-UAV Indoor Localization dataset with UWB and Vision

📅 2025-04-19
🤖 AI Summary
This work addresses the challenge of validating multi-UAV collaborative indoor localization algorithms. To this end, we introduce the first multimodal benchmark dataset specifically designed for this task, featuring synchronized acquisition of ultra-wideband (UWB) ranging measurements—including raw channel-impulse-response (CIR) data—stereo and downward-facing visual streams, inertial measurement unit (IMU) readings, laser-rangefinder height measurements, and ground-truth motion-capture trajectories. The dataset covers both line-of-sight (LOS) and non-line-of-sight (NLOS) conditions, as well as high-dynamics maneuvers at speeds up to 4.418 m/s. Notably, it is the first benchmark to jointly provide high-fidelity UWB CIR signals and multi-view visual streams for collaborative multi-UAV flight evaluation. We release an open-source, unified evaluation framework supporting diverse algorithms, including visual-inertial odometry (VIO), UWB-aided extended Kalman filter (EKF) localization, and CIR-based NLOS classification. The dataset comprises 36 experiments totaling 217 minutes of high-quality flight data, substantially lowering the barrier to developing and validating multi-sensor fusion localization systems.
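The CIR-based NLOS classification task mentioned above can be illustrated with a minimal sketch: extract a few hand-crafted features from each CIR (rise time and delay spread are common NLOS indicators) and classify with a simple learner. Everything below is illustrative — the synthetic CIR generator, feature choices, and nearest-centroid classifier are assumptions for demonstration, not the dataset's actual signals or the development kit's method.

```python
import numpy as np

rng = np.random.default_rng(0)

def cir_features(cir):
    """Hand-crafted features from a CIR magnitude vector: peak index
    (a rise-time proxy) and RMS delay spread."""
    mag = np.abs(cir)
    energy = np.sum(mag ** 2)
    t = np.arange(mag.size)
    mean_delay = np.sum(t * mag ** 2) / energy
    rms_spread = np.sqrt(np.sum((t - mean_delay) ** 2 * mag ** 2) / energy)
    return np.array([float(np.argmax(mag)), rms_spread])

def synth_cir(nlos):
    """Synthetic stand-in: LOS has a sharp early peak, NLOS a delayed,
    spread-out peak (real CIRs would come from the dataset)."""
    t = np.arange(128)
    peak = rng.integers(45, 60) if nlos else rng.integers(5, 15)
    spread = 8.0 if nlos else 2.0
    return np.exp(-0.5 * ((t - peak) / spread) ** 2) + 0.02 * rng.standard_normal(128)

labels = rng.integers(0, 2, 400)  # 0 = LOS, 1 = NLOS
X = np.array([cir_features(synth_cir(y)) for y in labels])

# Nearest-centroid classifier: assign each feature vector to the class
# whose mean feature vector is closest in Euclidean distance.
centroids = np.array([X[labels == c].mean(axis=0) for c in (0, 1)])
pred = np.argmin(np.linalg.norm(X[:, None] - centroids[None], axis=2), axis=1)
accuracy = (pred == labels).mean()
print(f"training accuracy: {accuracy:.2f}")
```

In practice, the dataset's real CIRs would replace the synthetic generator, and a stronger model (e.g., a random forest or small neural network) would replace the nearest-centroid rule; the feature-extraction step is the part that carries over.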

📝 Abstract
This paper introduces MILUV, a Multi-UAV Indoor Localization dataset with UWB and Vision measurements. This dataset comprises 217 minutes of flight time over 36 experiments using three quadcopters, collecting ultra-wideband (UWB) ranging data such as the raw timestamps and channel-impulse response data, vision data from a stereo camera and a bottom-facing monocular camera, inertial measurement unit data, height measurements from a laser rangefinder, magnetometer data, and ground-truth poses from a motion-capture system. The UWB data is collected from up to 12 transceivers affixed to mobile robots and static tripods in both line-of-sight and non-line-of-sight conditions. The UAVs fly at a maximum speed of 4.418 m/s in an indoor environment with visual fiducial markers as features. MILUV is versatile and can be used for a wide range of applications beyond localization, but the primary purpose of MILUV is for testing and validating multi-robot UWB- and vision-based localization algorithms. The dataset can be downloaded at https://doi.org/10.25452/figshare.plus.28386041.v1. A development kit is presented alongside the MILUV dataset, which includes benchmarking algorithms such as visual-inertial odometry, UWB-based localization using an extended Kalman filter, and classification of CIR data using machine learning approaches. The development kit can be found at https://github.com/decargroup/miluv, and is supplemented with a website available at https://decargroup.github.io/miluv/.
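The UWB-based EKF localization benchmark mentioned in the abstract rests on a standard measurement update: a range to a known anchor constrains the vehicle's position along a sphere (a circle in 2D). Below is a minimal static-position sketch of that update, assuming a 2D state and known anchor positions; the anchor layout, noise values, and the absence of an IMU-driven prediction step are simplifications for illustration, not the development kit's implementation.

```python
import numpy as np

# Hypothetical anchor layout (not the MILUV geometry).
anchors = np.array([[0.0, 0.0], [8.0, 0.0], [0.0, 6.0], [8.0, 6.0]])

def ekf_range_update(x, P, r, anchor, sigma_r=0.1):
    """One EKF measurement update for a single UWB range r to `anchor`.
    State x = [px, py], covariance P (2x2)."""
    d = x - anchor
    r_pred = np.linalg.norm(d)        # predicted range
    H = (d / r_pred)[None, :]         # Jacobian of range w.r.t. position
    S = H @ P @ H.T + sigma_r ** 2    # innovation covariance (1x1)
    K = P @ H.T / S                   # Kalman gain
    x = x + (K * (r - r_pred)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Toy run: fixed true position, noisy ranges to each anchor.
rng = np.random.default_rng(1)
true_pos = np.array([3.0, 2.0])
x, P = np.array([4.0, 4.0]), np.eye(2) * 4.0  # rough prior
for _ in range(20):
    for a in anchors:
        r = np.linalg.norm(true_pos - a) + 0.1 * rng.standard_normal()
        x, P = ekf_range_update(x, P, r, a)
print("estimate:", x)
```

A full UWB-aided EKF for a flying vehicle would interleave these range updates with an IMU-driven prediction step (and process noise), but the update equations are the same.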
Problem

Research questions and friction points this paper is trying to address.

Validating multi-UAV collaborative indoor localization algorithms requires realistic, synchronized multimodal flight data
Existing datasets rarely combine UWB ranging (with raw CIR) and vision for multiple UAVs
Developing multi-sensor fusion systems demands ground-truth data spanning LOS, NLOS, and high-speed flight conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

First benchmark to jointly provide UWB CIR data and multi-view vision for multi-UAV flight
Synchronized IMU, laser-rangefinder, and magnetometer data with motion-capture ground truth
Open-source development kit with VIO, UWB-EKF, and CIR-classification benchmarks