🤖 AI Summary
This work addresses catastrophic forgetting in artificial neural networks (ANNs) during continual learning by proposing a brain-inspired adaptive synaptogenesis mechanism that emulates supervised Hebbian learning coordinated between the neocortex and hippocampus. Methodologically, it formalizes biological synaptogenesis for the first time as a hardware-deployable, sparse, gated learning framework, wherein dynamic hippocampal gating regulates synaptic formation—departing from conventional static-weight ANN paradigms. Integrating neuromorphic algorithm design, SPICE-level nanomagnetic device modeling, and a custom accelerator architecture, the approach enables high-density, ultra-low-power synaptic plasticity operations. Experimental results demonstrate a 37% reduction in error rate on continual learning benchmarks. A 16-nm hardware prototype achieves per-synapse energy consumption below 1 aJ and synaptic density exceeding 10¹²/cm².
📝 Abstract
The human brain functions very differently from artificial neural networks (ANNs) and possesses unique features that are absent in ANNs. An important one among them is "adaptive synaptogenesis," which modifies synaptic weights when needed to avoid catastrophic forgetting and promote lifelong learning. The key aspect of this algorithm is supervised Hebbian learning, where weight modifications in the neocortex driven by temporal coincidence are further accepted or vetoed by an added control mechanism from the hippocampus during the training cycle, making distant synaptic connections highly sparse and strategic. In this work, we discuss various algorithmic aspects of adaptive synaptogenesis tailored to edge computing, demonstrate its function using simulations, and design nanomagnetic hardware accelerators for specific functions of synaptogenesis.
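The gated learning rule described above can be sketched in a few lines: a standard Hebbian (coincidence-driven) update is computed for every synapse, and a hippocampus-like gate then accepts or vetoes each modification. This is a minimal illustrative sketch, not the paper's actual algorithm; the function name, the learning rate, and the random sparse gate are all assumptions made for the example.

```python
import numpy as np

def gated_hebbian_update(w, pre, post, gate, lr=0.1):
    """Hebbian update accepted or vetoed per synapse by a gating mask.

    w:    (n_post, n_pre) weight matrix
    pre:  presynaptic activity vector, shape (n_pre,)
    post: postsynaptic activity vector, shape (n_post,)
    gate: 0/1 mask, shape (n_post, n_pre); 1 accepts the update, 0 vetoes it
    """
    delta = lr * np.outer(post, pre)  # temporal-coincidence (Hebbian) term
    return w + gate * delta           # hippocampal-style veto keeps changes sparse

rng = np.random.default_rng(0)
w = np.zeros((3, 4))
pre = rng.random(4)
post = rng.random(3)
# Hypothetical sparse gate: only ~25% of candidate synapses accept the update
gate = (rng.random((3, 4)) < 0.25).astype(float)

w_new = gated_hebbian_update(w, pre, post, gate)
```

Because vetoed synapses receive no update, the resulting weight changes are as sparse as the gate itself, mirroring the "highly sparse and strategic" connectivity described in the abstract.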