AI Summary
Current spiking neural networks (SNNs) predominantly rely on a single synaptic plasticity mechanism, limiting representational capacity and robustness and failing to emulate the brain's multi-mechanism collaborative learning. To address this, we propose the first brain-inspired multi-plasticity co-training framework, unifying spike-timing-dependent plasticity (STDP), heterosynaptic STDP (Hetero-STDP), and synaptic homeostasis within a single model. Crucially, we introduce an adaptive mechanism-weight allocation strategy that preserves the intrinsic dynamics of each plasticity rule while enabling dynamic, synergistic optimization. The framework supports end-to-end learning on both static image and dynamic event-camera data. Extensive experiments across multiple benchmark datasets demonstrate substantial improvements in accuracy and generalization over state-of-the-art SNN methods. These results validate the critical role of multi-mechanism collaboration in enhancing SNN expressivity and robustness, establishing a general-purpose, biologically grounded training paradigm for efficient, brain-like SNNs.
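The core idea above, several plasticity rules that keep their own update dynamics while an adaptive allocation mechanism weighs their contributions, can be sketched in a toy form. The rule definitions, the softmax allocation, and the alignment-based credit update below are all illustrative assumptions, not the paper's actual equations:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setting: one pre- and one post-synaptic spike time per synapse.
n_syn = 8
pre_t = rng.uniform(0, 50, n_syn)
post_t = rng.uniform(0, 50, n_syn)
w = rng.uniform(0.2, 0.8, n_syn)  # synaptic weights in [0, 1]

def stdp(w, tau=20.0):
    """Pairwise STDP: potentiate if pre fires before post, else depress
    (hypothetical exponential timing kernel)."""
    dt = post_t - pre_t
    return np.where(dt > 0, np.exp(-dt / tau), -np.exp(dt / tau))

def hetero_stdp(w):
    """Illustrative heterosynaptic term: depress synapses whose weight
    exceeds the population mean, redistributing synaptic resources."""
    return -(w - w.mean())

def homeostasis(w, target=0.5):
    """Synaptic homeostasis: pull each weight toward a target operating point."""
    return target - w

rules = [stdp, hetero_stdp, homeostasis]
alpha = np.zeros(len(rules))  # logits of the adaptive mechanism weights

lr = 0.05
for step in range(100):
    # Each rule computes its update independently, preserving its own dynamics.
    deltas = np.stack([r(w) for r in rules])
    # Softmax allocation converts logits into per-mechanism mixing weights.
    coef = np.exp(alpha) / np.exp(alpha).sum()
    combined = coef @ deltas
    w = np.clip(w + lr * combined, 0.0, 1.0)  # bounded joint update
    # Credit each mechanism by how well its update aligns with the combined one
    # (a stand-in for whatever allocation signal the framework actually uses).
    alpha += 0.1 * np.array([d @ combined for d in deltas]) / n_syn

print(np.round(coef, 3))  # final mechanism-weight allocation
```

The point of the sketch is the separation of concerns: rules never read each other's internals, and only the allocation vector `coef` mediates their cooperation, mirroring the "relatively independent update dynamics" described in the abstract.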
Abstract
Spiking Neural Networks (SNNs) are promising brain-inspired models known for low power consumption and strong potential for temporal processing, but identifying suitable learning mechanisms for them remains a challenge. Although the brain employs multiple coexisting learning strategies, current SNN training methods typically rely on a single form of synaptic plasticity, which limits their adaptability and representational capability. In this paper, we propose a biologically inspired training framework that incorporates multiple synergistic plasticity mechanisms for more effective SNN training. Our method enables diverse learning algorithms to cooperatively modulate the accumulation of information, while allowing each mechanism to preserve its own relatively independent update dynamics. We evaluate our approach on both static image and dynamic neuromorphic datasets, demonstrating that our framework significantly improves performance and robustness compared to models trained with a single conventional learning mechanism. This work provides a general and extensible foundation for developing more powerful SNNs guided by multi-strategy brain-inspired learning.