Efficient ANN-Guided Distillation: Aligning Rate-based Features of Spiking Neural Networks through Hybrid Block-wise Replacement

πŸ“… 2025-03-20
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the low training efficiency of spiking neural networks (SNNs) and the difficulty of effectively transferring knowledge from artificial neural networks (ANNs) into the rate-coding domain, this paper proposes an ANN-guided, end-to-end differentiable distillation framework. The method replaces SNN components with corresponding ANN modules in a block-wise manner and embeds them directly into the SNN forward pass, preserving intrinsic spiking dynamics while enabling progressive alignment of rate-coded feature representations. Crucially, it is the first to seamlessly integrate rate-based backpropagation into such hybrid architectures, ensuring both gradient validity and structural consistency. Evaluated on multiple benchmark datasets, the approach significantly accelerates training convergence and improves generalization performance, consistently outperforming state-of-the-art ANN-to-SNN distillation methods.
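The block-wise replacement idea in the summary can be illustrated with a minimal NumPy sketch (an illustration under assumptions, not the authors' implementation): each SNN block is a linear layer of leaky integrate-and-fire neurons whose spike train is collapsed into a firing rate, and a hybrid model swaps the first `k` blocks for ANN counterparts that operate on those rates directly. The names `lif_block`, `ann_block`, and `hybrid_forward`, and the choice to replace blocks front-to-back, are hypothetical simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

def lif_block(x, w, T=4, v_th=1.0):
    """Run a linear layer of LIF neurons for T timesteps and return
    the firing rate (spike count / T) -- a rate-coded feature."""
    v = np.zeros(w.shape[1])              # membrane potentials
    spikes = np.zeros((T, w.shape[1]))
    for t in range(T):
        v = v + x @ w                     # integrate a constant input current
        spikes[t] = (v >= v_th).astype(float)
        v = v - spikes[t] * v_th          # soft reset after each spike
    return spikes.mean(axis=0)            # rate lies in [0, 1]

def ann_block(x, w):
    """ANN counterpart of the same block: linear + ReLU acting on rates."""
    return np.maximum(x @ w, 0.0)

def hybrid_forward(x, weights, k):
    """Block-wise replacement: the first k blocks run as ANN modules,
    the remaining blocks keep their spiking dynamics."""
    h = x
    for i, w in enumerate(weights):
        h = ann_block(h, w) if i < k else lif_block(h, w)
    return h

weights = [rng.standard_normal((8, 8)) * 0.5 for _ in range(3)]
x = rng.random(8)
full_snn = hybrid_forward(x, weights, k=0)  # pure SNN (no replacement)
hybrid = hybrid_forward(x, weights, k=1)    # first block replaced by its ANN module
```

Because every hybrid variant consumes and produces rate vectors, intermediate models with different `k` share one feature interface, which is what lets the framework align SNN and ANN feature spaces progressively.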

πŸ“ Abstract
Spiking Neural Networks (SNNs) have garnered considerable attention as a potential alternative to Artificial Neural Networks (ANNs). Recent studies have highlighted SNNs' potential on large-scale datasets. For SNN training, two main approaches exist: direct training and ANN-to-SNN (ANN2SNN) conversion. To fully leverage existing ANN models in guiding SNN learning, either direct ANN-to-SNN conversion or ANN-SNN distillation training can be employed. In this paper, we propose an ANN-SNN distillation framework from the ANN-to-SNN perspective, designed with a block-wise replacement strategy for ANN-guided learning. By generating intermediate hybrid models that progressively align SNN feature spaces to those of the ANN through rate-based features, our framework naturally incorporates rate-based backpropagation as a training method. Our approach achieves results comparable to or better than state-of-the-art SNN distillation methods, while demonstrating both training and learning efficiency.
Problem

Research questions and friction points this paper is trying to address.

Align SNN feature spaces to ANN using rate-based features
Improve ANN-to-SNN conversion via hybrid block-wise replacement
Enhance SNN training efficiency through ANN-guided distillation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid block-wise replacement strategy
Rate-based feature alignment
Rate-based backpropagation training
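The rate-based feature alignment and rate-based backpropagation bullets can be made concrete with a small NumPy sketch (a hedged illustration, not the authors' code): a binary spike train is averaged over time into a rate vector, and a mean-squared error against the teacher ANN's feature gives the alignment signal that rate-based backpropagation would differentiate. The helper names `firing_rate` and `alignment_loss` are hypothetical.

```python
import numpy as np

def firing_rate(spike_train):
    """Collapse a (T, N) binary spike train into an (N,) firing-rate vector."""
    return spike_train.mean(axis=0)

def alignment_loss(snn_spikes, ann_feat):
    """MSE between the SNN's rate-coded feature and the teacher ANN feature.
    Differentiating through the rate (rather than each timestep) is the core
    idea behind rate-based backpropagation."""
    rate = firing_rate(snn_spikes)
    return float(np.mean((rate - ann_feat) ** 2))

# A toy spike train over T=4 timesteps for N=6 neurons.
spikes = np.array([[1, 0, 1, 0, 1, 0],
                   [1, 0, 0, 0, 1, 0],
                   [1, 0, 1, 0, 0, 0],
                   [1, 0, 0, 0, 0, 0]], dtype=float)

# A teacher feature that matches the spike rates exactly.
ann_feat = np.array([1.0, 0.0, 0.5, 0.0, 0.5, 0.0])

loss = alignment_loss(spikes, ann_feat)  # -> 0.0 (rates match the teacher)
```

When the student's rates match the teacher's feature exactly, the loss is zero; any mismatch produces a positive penalty that pulls the SNN's rate-coded features toward the ANN's.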
Shu Yang
ZJU-UIUC Institute, Zhejiang University
Chengting Yu
Zhejiang University
Lei Liu
ZJU-UIUC Institute, Zhejiang University
Hanzhi Ma
ZJU-UIUC Institute, Zhejiang University; College of Information Science and Electronic Engineering, Zhejiang University
Aili Wang
ZJU-UIUC Institute, Zhejiang University; College of Information Science and Electronic Engineering, Zhejiang University
Erping Li
ZJU-UIUC Institute, Zhejiang University; College of Information Science and Electronic Engineering, Zhejiang University