🤖 AI Summary
This work proposes a model-agnostic Global-guided Hebbian Learning (GHL) framework that overcomes the limitations of traditional Hebbian learning, which is constrained by purely local update rules and struggles to balance biological plausibility with performance in large-scale networks and complex tasks. By combining a task-dependent global sign signal with local updates based on Oja's rule and competitive learning, GHL introduces directional guidance from global objectives while preserving biological realism. The approach substantially improves Hebbian learning performance on challenging benchmarks such as ImageNet, significantly narrowing the gap with backpropagation-based methods and enabling biologically inspired learning to scale effectively to deep neural architectures.
📝 Abstract
The backpropagation algorithm has driven the remarkable success of deep neural networks, but its lack of biological plausibility and high computational costs have motivated the ongoing search for alternative training methods. Hebbian learning has attracted considerable interest as a biologically plausible alternative to backpropagation. Nevertheless, its exclusive reliance on local information, without consideration of global task objectives, fundamentally limits its scalability. Inspired by the biological synergy between neuromodulators and local plasticity, we introduce a novel model-agnostic Global-guided Hebbian Learning (GHL) framework, which seamlessly integrates local and global information to scale across diverse networks and tasks. Specifically, the local component employs Oja's rule with competitive learning to ensure stable and effective local updates. Meanwhile, the global component introduces a sign-based signal that guides the direction of local Hebbian plasticity updates. Extensive experiments demonstrate that our method consistently outperforms existing Hebbian approaches. Notably, on large-scale networks and complex datasets such as ImageNet, our framework achieves competitive results and significantly narrows the gap with standard backpropagation.
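To make the abstract's description of the update rule concrete, below is a minimal NumPy sketch of one plausible reading of a GHL-style step: a local Oja's-rule update with winner-take-all competition, whose direction is then set element-wise by a global `{-1, +1}` sign signal. The function name `ghl_update`, the winner-take-all choice of competition, and the way the global sign is obtained (here mocked as the sign of a task-loss gradient) are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy layer: 8 inputs -> 4 output neurons, weight matrix W of shape (4, 8).
n_in, n_out, lr = 8, 4, 0.01
W = rng.normal(scale=0.1, size=(n_out, n_in))

def ghl_update(W, x, global_sign, lr=0.01):
    """One hypothetical globally guided Hebbian step on a single input x.

    Local part: Oja's rule with winner-take-all competition.
    Global part: a {-1, +1} sign signal (assumed to be derived from the
    task objective) that sets the direction of the local update.
    """
    y = W @ x                                  # post-synaptic activations
    winner = np.argmax(y)                      # competitive learning: only the winner adapts
    y_comp = np.zeros_like(y)
    y_comp[winner] = y[winner]

    # Oja's rule: dW = y * (x - y * w), which keeps weight norms bounded.
    local_dW = np.outer(y_comp, x) - (y_comp ** 2)[:, None] * W

    # Global guidance: the sign signal redirects each local update.
    return W + lr * global_sign * local_dW

# Example step: random input, and a global sign mocked at random in place of
# a signal derived from the task loss.
x = rng.normal(size=n_in)
global_sign = np.sign(rng.normal(size=W.shape))
W = ghl_update(W, x, global_sign, lr)
```

The sketch only illustrates how a global sign can modulate an otherwise purely local plasticity rule; the paper's actual global signal, competition scheme, and per-layer details should be taken from the full text.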