Time-Optimized Safe Navigation in Unstructured Environments through Learning Based Depth Completion

📅 2025-06-17
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Real-time, safe, and time-optimal autonomous navigation for size-, weight-, and power-constrained (SWaP-limited) quadrotors in unknown, unstructured environments remains highly challenging. Method: We propose a lightweight, tightly integrated visual SLAM and online trajectory planning framework. Specifically, we introduce a novel dense depth completion algorithm that jointly fuses stereo and monocular learned depth; construct an incrementally updated dense 3D occupancy map; and design an A*-iLQR hybrid online time-optimal trajectory replanner that ensures collision avoidance while meeting stringent embedded real-time constraints (<50 ms per frame). Contribution/Results: To our knowledge, this is the first end-to-end framework enabling real-time, safe navigation using only lightweight visual sensors—validated through robust flight experiments across diverse indoor and outdoor unknown environments. Our trajectory generation speed significantly surpasses state-of-the-art methods, guarantees strict collision-free execution, and supports long-duration autonomous operation.
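The summary describes fusing metric-but-sparse stereo depth with dense, scale-ambiguous monocular learned depth. A common way to do this (a minimal sketch, not the paper's actual algorithm; `fuse_depth` and the global scale/shift alignment model are assumptions) is to fit a least-squares scale and shift that aligns the monocular prediction to the stereo measurements where stereo is valid, then fill stereo holes with the aligned monocular depth:

```python
import numpy as np

def fuse_depth(stereo: np.ndarray, mono: np.ndarray) -> np.ndarray:
    """Hypothetical fusion of a sparse/noisy metric stereo depth map
    with a dense but scale-ambiguous monocular depth prediction.

    Fits a global scale/shift (s ~ a*m + b) on pixels where stereo is
    valid, then fills stereo holes with the aligned monocular depth.
    """
    valid = np.isfinite(stereo) & (stereo > 0)
    m = mono[valid].ravel()
    s = stereo[valid].ravel()
    # Least-squares fit of the affine alignment s ~ a*m + b
    A = np.stack([m, np.ones_like(m)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, s, rcond=None)
    aligned = a * mono + b
    # Keep trusted stereo where available; use aligned mono elsewhere
    return np.where(valid, stereo, aligned)
```

The paper's method likely performs a more sophisticated, learned per-pixel fusion; this affine-alignment baseline only illustrates why combining the two sources yields denser, longer-range depth than stereo alone.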

📝 Abstract
Quadrotors hold significant promise for several applications such as agriculture, search and rescue, and infrastructure inspection. Achieving autonomous operation requires systems to navigate safely through complex and unfamiliar environments. This level of autonomy is particularly challenging due to the complexity of such environments and the need for real-time decision making, especially for platforms constrained by size, weight, and power (SWaP), which limits flight time and precludes the use of bulky sensors like Light Detection and Ranging (LiDAR) for mapping. Furthermore, computing globally optimal, collision-free paths and translating them into time-optimized, safe trajectories in real time adds significant computational complexity. To address these challenges, we present a fully onboard, real-time navigation system that relies solely on lightweight onboard sensors. Our system constructs a dense 3D map of the environment using a novel visual depth estimation approach that fuses stereo and monocular learning-based depth, yielding longer-range, denser, and less noisy depth maps than conventional stereo methods. Building on this map, we introduce a novel planning and trajectory generation framework capable of rapidly computing time-optimal global trajectories. As the map is incrementally updated with new depth information, our system continuously refines the trajectory to maintain safety and optimality. Both our planner and trajectory generator outperform state-of-the-art methods in terms of computational efficiency and guarantee obstacle-free trajectories. We validate our system through robust autonomous flight experiments in diverse indoor and outdoor environments, demonstrating its effectiveness for safe navigation in previously unknown settings.
Problem

Research questions and friction points this paper is trying to address.

Achieving autonomous safe navigation in complex unstructured environments
Real-time depth completion using lightweight sensors for dense mapping
Computing time-optimal collision-free trajectories with onboard computational constraints
Innovation

Methods, ideas, or system contributions that make the work stand out.

Learning-based depth completion for dense 3D mapping
Real-time time-optimal global trajectory planning
Onboard lightweight sensor fusion for navigation
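The global stage of the hybrid planner searches the occupancy map for a collision-free path, which a local optimizer then refines into a time-optimal trajectory. The search stage can be sketched as a standard A* over a grid (a minimal 2D illustration under assumed conventions: 0 = free, 1 = occupied, 4-connected moves, Manhattan heuristic; the paper operates on a 3D map and pairs this with an iLQR-style refiner):

```python
import heapq

def astar(grid, start, goal):
    """A* over an occupancy grid (0 = free, 1 = occupied).
    4-connected neighbors, Manhattan-distance heuristic.
    Returns the cell path start..goal, or None if unreachable."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, None)]   # (f = g + h, g, cell, parent)
    came = {}                                 # closed set: cell -> parent
    g = {start: 0}
    while open_set:
        _, cost, cur, parent = heapq.heappop(open_set)
        if cur in came:
            continue                          # already expanded via a cheaper route
        came[cur] = parent
        if cur == goal:
            path = []                         # walk parents back to start
            while cur is not None:
                path.append(cur)
                cur = came[cur]
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                    and grid[nxt[0]][nxt[1]] == 0 and nxt not in came):
                ng = cost + 1
                if ng < g.get(nxt, float('inf')):
                    g[nxt] = ng
                    heapq.heappush(open_set, (ng + h(nxt), ng, nxt, cur))
    return None
```

Because the map is updated incrementally, a replanner like the one summarized above would rerun this search (or repair the previous solution) whenever new depth data invalidates the current trajectory.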
👥 Authors

Jeffrey Mao (New York University) · Robotics, Unmanned Aerial Vehicles
Raghuram Cauligi Srinivas (New York University, New York City, NY 11217 USA)
Steven Nogar (Army Research Lab)
Giuseppe Loianno (UC Berkeley) · Robotics, MAVs, Vision, Sensor Fusion