🤖 AI Summary
This paper addresses the challenge of resource-constrained visual homing for outdoor robots in GPS-denied environments. Inspired by ant navigation, we propose a biologically inspired approach that integrates a mushroom body (MB) neural circuit model with a learned walking policy. The sign of the angular path integration signal is used to classify low-resolution (32×32-pixel) panoramic views into memory banks for homing, and a dedicated fifth output neuron enables precise docking. The system runs in real time at 8 Hz on a Raspberry Pi 4 using only 9 kB of memory. Key contributions include: (i) the first deployment of a lateralized MB architecture for visual homing in real-world outdoor settings; (ii) a novel memory-based view classification mechanism for homing; and (iii) full decoupling of the behavioral pipeline (learning to walk, random exploration, autonomous homing, and accurate docking), which significantly improves robustness and energy efficiency in natural environments.
📝 Abstract
Ants achieve robust visual homing with minimal sensory input and only a few learning walks, inspiring biomimetic solutions for autonomous navigation. While Mushroom Body (MB) models have been used in robotic route following, they have not yet been applied to visual homing. We present the first real-world implementation of a lateralized MB architecture for visual homing onboard a compact autonomous car-like robot. We test whether the sign of the angular path integration (PI) signal can categorize panoramic views, acquired during learning walks and encoded in the MB, into "goal on the left" and "goal on the right" memory banks, enabling robust homing in natural outdoor settings. We validate this approach through four incremental experiments: (1) simulation showing attractor-like nest dynamics; (2) real-world homing after decoupled learning walks, producing nest search behavior; (3) homing after random walks using noisy PI emulated with GPS-RTK; and (4) precise stopping-at-the-goal behavior enabled by a fifth MB Output Neuron (MBON) encoding goal views to control velocity. This mimics the accurate homing behavior of ants and functionally resembles waypoint-based position control in robotics, despite relying solely on visual input. Operating at 8 Hz on a Raspberry Pi 4 with 32×32-pixel views and a memory footprint under 9 kB, our system offers a biologically grounded, resource-efficient solution for autonomous visual homing.
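To make the lateralized mechanism concrete, the sketch below shows one plausible reading of the pipeline: panoramic views are encoded as sparse Kenyon-cell-like codes, the sign of the PI angle routes each learning-walk view into a "goal on the left" or "goal on the right" familiarity bank, steering turns toward the side whose bank finds the current view more familiar, and a goal-only bank (the fifth-MBON analog) throttles velocity for docking. All sizes, the random-projection encoder, and the anti-Hebbian learning rule are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 32x32 panoramic views, a KC expansion layer,
# and a fixed sparsity level (not taken from the paper).
N_PIX, N_KC, K_ACTIVE = 32 * 32, 2000, 100
W = rng.standard_normal((N_KC, N_PIX))  # random PN -> KC projection

def kc_code(view):
    """Sparse Kenyon-cell-like code: top-k units of a random projection."""
    a = W @ view.ravel()
    code = np.zeros(N_KC, dtype=bool)
    code[np.argsort(a)[-K_ACTIVE:]] = True
    return code

class MemoryBank:
    """Anti-Hebbian familiarity memory: KC->MBON synapses active during
    learning are depressed, so familiar views yield LOW novelty."""
    def __init__(self):
        self.w = np.ones(N_KC)

    def learn(self, view):
        self.w[kc_code(view)] = 0.0

    def novelty(self, view):
        return self.w @ kc_code(view)

left_bank, right_bank = MemoryBank(), MemoryBank()
goal_bank = MemoryBank()  # fifth-MBON analog, trained only on goal views

def store(view, pi_angle):
    """During a learning walk, the sign of the angular PI signal tags the
    view: goal-on-left views go left, goal-on-right views go right."""
    (left_bank if pi_angle > 0 else right_bank).learn(view)

def steering(view):
    """During homing, turn toward the side whose bank is more familiar
    with the current view (-1 = turn left, +1 = turn right)."""
    return -1.0 if left_bank.novelty(view) < right_bank.novelty(view) else 1.0

def velocity(view):
    """Goal views are highly familiar to the goal bank (novelty near 0),
    so the robot slows to a stop for docking."""
    return min(1.0, goal_bank.novelty(view) / K_ACTIVE)
```

The continuous steering/velocity loop this implies is closer to a reactive controller than to explicit waypoint tracking, which matches the abstract's point that the behavior only *resembles* waypoint-based position control.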