🤖 AI Summary
Existing autonomous 3D inspection approaches for UAVs suffer from fragmented perception-planning-control pipelines and inadequate long-horizon motion prediction. Method: This paper proposes a data-driven predictive control framework that requires no prior dynamical model, enabling plug-and-play integration with commercial off-the-shelf black-box UAVs. It incorporates back-face elimination (culling), a visibility-determination technique from 3D computer graphics, into the control loop to enable long-range, high-precision online trajectory generation, and it combines input-output data with visibility-aware analysis to ensure robustness and reliable long-horizon prediction in complex environments. Contribution/Results: Experiments demonstrate substantial improvements in inspection coverage and efficiency; the method generalizes well under unknown dynamics and dynamic obstacles while maintaining real-time performance.
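To make the borrowed graphics technique concrete: back-face elimination discards surface elements that face away from the viewpoint, since they cannot be observed (or inspected) from there. The Python sketch below is purely illustrative and not from the paper; the function name, the toy mesh, and the counter-clockwise winding convention are all our own assumptions.

```python
import numpy as np

def backfacing_faces(vertices, faces, viewpoint):
    """Boolean mask over triangles: True where the face points away from viewpoint.

    Assumes counter-clockwise winding, so the cross product gives the outward
    normal; a face is back-facing (hence not inspectable from this viewpoint)
    when dot(normal, viewpoint - centroid) <= 0.
    """
    v0, v1, v2 = vertices[faces[:, 0]], vertices[faces[:, 1]], vertices[faces[:, 2]]
    normals = np.cross(v1 - v0, v2 - v0)
    centroids = (v0 + v1 + v2) / 3.0
    to_viewer = viewpoint - centroids
    return np.einsum('ij,ij->i', normals, to_viewer) <= 0.0

# Toy mesh: one triangle facing +z, and the same triangle wound the other way.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
tris = np.array([[0, 1, 2], [0, 2, 1]])
print(backfacing_faces(verts, tris, np.array([0.0, 0.0, 5.0])))  # [False  True]
```

Embedding such a test in the control loop lets the planner score candidate trajectories by how many faces they render visible, rather than treating visibility as a separate post-processing step.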
📝 Abstract
Automated inspection with Unmanned Aerial Systems (UASs) is a transformative capability set to revolutionize various application domains. However, this task is inherently complex, as it demands the seamless integration of perception, planning, and control, which existing approaches often treat separately. Moreover, it requires accurate long-horizon planning to predict action sequences, in contrast to many current techniques, which tend to be myopic. To overcome these limitations, we propose a 3D inspection approach that unifies perception, planning, and control within a single data-driven predictive control framework. Unlike traditional methods that rely on known UAS dynamic models, our approach requires only input-output data, making it easily applicable to off-the-shelf black-box UASs. Our method incorporates back-face elimination, a visibility determination technique from 3D computer graphics, directly into the control loop, thereby enabling the online generation of accurate, long-horizon 3D inspection trajectories.
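The abstract does not spell out how predictions are obtained from input-output data alone. One standard model-free route, which we use here only as an illustrative stand-in, is behavioral/data-driven prediction in the style of DeePC (Willems' fundamental lemma): recorded input-output trajectories are stacked into Hankel matrices, and future outputs for any candidate input plan are read off as a linear combination of recorded columns. The scalar plant, horizons, and all variable names below are our own assumptions, not the paper's.

```python
import numpy as np

def block_hankel(w, L):
    """Columns are length-L sliding windows of the signal w (shape T x m)."""
    T = w.shape[0]
    return np.column_stack([w[i:i + L].ravel() for i in range(T - L + 1)])

rng = np.random.default_rng(0)
a, b = 0.9, 0.5  # hypothetical first-order plant; unknown to the "controller"

def rollout(x0, u_seq):
    """Simulate y_t = x_t, x_{t+1} = a*x_t + b*u_t (stands in for a black-box UAS)."""
    x, ys = x0, []
    for u in u_seq:
        ys.append(x)
        x = a * x + b * u
    return np.array(ys).reshape(-1, 1), x

# 1) One offline input-output experiment with a persistently exciting input.
T, Tini, N = 60, 4, 8
u_d = rng.standard_normal((T, 1))
y_d, _ = rollout(0.0, u_d[:, 0])

# 2) Split the Hankel matrices into "past" (pins down the current state via
#    the most recent Tini samples) and "future" (the N-step prediction).
L = Tini + N
Up, Uf = np.vsplit(block_hankel(u_d, L), [Tini])
Yp, Yf = np.vsplit(block_hankel(y_d, L), [Tini])

# 3) Predict: find g consistent with the recent window and a candidate input plan.
u_ini = rng.standard_normal(Tini)
y_ini, x_now = rollout(0.3, u_ini)
u_f = rng.standard_normal(N)
rhs = np.concatenate([u_ini, y_ini[:, 0], u_f])
g, *_ = np.linalg.lstsq(np.vstack([Up, Yp, Uf]), rhs, rcond=None)
y_pred = Yf @ g

y_true, _ = rollout(x_now, u_f)  # prediction matches the true noise-free rollout
```

No explicit model of `a` and `b` is ever identified; the recorded data itself serves as the predictor, which is what makes such schemes attractive for black-box platforms.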