🤖 AI Summary
To address the challenge of real-time autonomous navigation for physical robots in dynamic environments, this paper proposes a task-driven, reconfigurable brain-inspired navigation framework compatible with both ground (TurtleBot) and aerial (Parrot Bebop2) platforms. Methodologically, it integrates event-based vision from a dynamic vision sensor (DVS), spiking neural network (SNN)-inspired feature extraction, lightweight online replanning, and a hierarchical modular autonomy stack (perception–planning–control). It achieves, for the first time, end-to-end closed-loop event-driven flight navigation through real-world moving-gate scenarios; enables millisecond-scale local obstacle-avoidance replanning on the TurtleBot; and accomplishes high-speed gate traversal at 1.2 m/s on the Bebop2 with end-to-end latency under 15 ms and a 67% power reduction. The core contribution is the first cross-platform, reconfigurable brain-inspired navigation architecture, empirically validating the real-time performance, robustness, and energy efficiency of event-driven paradigms on resource-constrained physical robots.
📝 Abstract
Neuromorphic vision, inspired by biological neural systems, has recently gained significant attention for its potential to enhance robotic autonomy. This paper presents a systematic exploration of a proposed Neuromorphic Navigation framework that uses event-based neuromorphic vision to enable efficient, real-time navigation in robotic systems. We discuss the core concepts of neuromorphic vision and navigation, highlighting how they improve robotic perception and decision-making. The proposed reconfigurable Neuromorphic Navigation framework adapts to the specific needs of both ground robots (Turtlebot) and aerial robots (Bebop2 quadrotor), addressing the task-specific design requirements (algorithms) for optimal performance across the autonomous navigation stack -- Perception, Planning, and Control. We demonstrate the versatility and effectiveness of the framework through two case studies: a Turtlebot performing local replanning for real-time navigation and a Bebop2 quadrotor navigating through moving gates. Our work provides a scalable approach to task-specific, real-time robot autonomy leveraging neuromorphic systems, paving the way for energy-efficient autonomous navigation.
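To give a concrete feel for the event-driven perception-to-replanning pipeline described above, here is a minimal, hypothetical sketch (not the paper's actual implementation): an SNN-inspired leaky integrator accumulates DVS-style events into a decaying "membrane potential" grid, and cells that cross a spiking threshold mark salient motion that a toy local replanner then routes around. All class/function names, grid sizes, and thresholds are illustrative assumptions.

```python
import numpy as np

class EventFeatureExtractor:
    """SNN-inspired leaky integrator (illustrative, not the paper's model).

    DVS events increment per-pixel 'membrane potential'; the potential
    leaks each step, so only sustained event activity produces spikes.
    """
    def __init__(self, shape=(8, 8), decay=0.9, threshold=1.5):
        self.potential = np.zeros(shape)
        self.decay = decay
        self.threshold = threshold

    def step(self, events):
        # events: iterable of (row, col) pixel coordinates from the sensor
        self.potential *= self.decay              # leak
        for r, c in events:
            self.potential[r, c] += 1.0           # integrate
        return self.potential >= self.threshold   # spiking cells = salient motion

def replan(occupied, goal_col, n_cols=8):
    """Toy local replanner: steer toward the free column nearest the goal."""
    blocked = occupied.any(axis=0)
    free = [c for c in range(n_cols) if not blocked[c]]
    return min(free, key=lambda c: abs(c - goal_col)) if free else None

# Usage: repeated events in column 3 trigger spikes there, so the
# replanner steers to an adjacent free column instead.
extractor = EventFeatureExtractor()
spikes = extractor.step([(2, 3), (2, 3)])   # potential at (2,3) reaches 2.0
new_heading = replan(spikes, goal_col=3)    # avoids blocked column 3
```

Because only cells with recent event activity can spike, the replanner reacts to moving obstacles while static background (which generates no events) is ignored; this is the property that lets event-driven stacks replan at millisecond timescales.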