🤖 AI Summary
This work addresses the high computational cost and limited generalization in visual point-goal navigation by drawing inspiration from the efficient navigation mechanisms of insects. It proposes a lightweight, end-to-end navigation agent that uniquely integrates two key functional modules found in the insect brain—associative learning and path integration. Relying solely on visual inputs, the model achieves near state-of-the-art navigation performance in both the standard Habitat benchmark and more realistic simulated environments. Crucially, it reduces computational overhead by several orders of magnitude while demonstrating strong robustness to environmental perturbations, thereby significantly enhancing both efficiency and practical applicability.
📝 Abstract
In this work we develop a novel insect-inspired agent for visual point-goal navigation. It combines abstracted models of two insect brain structures that have been implicated, respectively, in associative learning and path integration. We draw an analogy between the formal benchmark of the Habitat point-goal navigation task and the ability of insects to learn and refine visually guided paths around obstacles between a discovered food location and their nest. We demonstrate that the simple insect-inspired agent achieves performance comparable to recent SOTA models at orders of magnitude lower computational cost. Testing in a more realistic simulated environment shows the approach is robust to perturbations.
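Path integration, one of the two insect-brain mechanisms the abstract names, can be illustrated with a minimal sketch: the agent accumulates per-step displacements (speed and heading) into a running position estimate, from which a "home vector" back to the start falls out directly. This is an illustrative toy, not the paper's model; the function names and step format are assumptions.

```python
import math

def integrate_path(steps):
    """Accumulate per-step displacements into a position estimate.

    Each step is a (speed, heading) pair, heading in radians.
    This is the core of path integration: summing movement vectors.
    """
    x, y = 0.0, 0.0
    for speed, heading in steps:
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
    return x, y

def home_vector(steps):
    """Return (distance, bearing) pointing from the agent back to its start."""
    x, y = integrate_path(steps)
    return math.hypot(x, y), math.atan2(-y, -x)

# Example: move 1 unit east, then 1 unit north; the home vector
# should have length sqrt(2) and point southwest (bearing -3*pi/4).
dist, bearing = home_vector([(1.0, 0.0), (1.0, math.pi / 2)])
```

In the insect literature this running estimate is thought to be maintained by the central complex; the sketch above only shows the arithmetic, not any neural implementation.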