Full Integer Arithmetic Online Training for Spiking Neural Networks

📅 2025-09-08
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the high training overhead and deployment challenges of Spiking Neural Networks (SNNs) on neuromorphic hardware, this paper proposes the first fully integer online training framework. The method combines mixed-precision (8-/12-bit) integer arithmetic with a synergistic BPTT-RTRL optimization strategy, eliminating floating-point operations entirely while supporting both convolutional and recurrent SNN architectures, and enabling end-to-end integer-only gradient updates. Experiments on MNIST and the Spiking Heidelberg Digits (SHD) dataset demonstrate that the approach matches or surpasses full-precision baselines in accuracy, reduces memory footprint by over 60%, and maintains stable performance with 8-/12-bit weight quantization during inference. To the authors' knowledge, this is the first work to achieve high-accuracy, low-overhead, hardware-friendly online learning for SNNs, establishing a scalable training paradigm for resource-constrained neuromorphic chips.

📝 Abstract
Spiking Neural Networks (SNNs) are promising for neuromorphic computing due to their biological plausibility and energy efficiency. However, training methods like Backpropagation Through Time (BPTT) and Real-Time Recurrent Learning (RTRL) remain computationally intensive. This work introduces an integer-only online training algorithm using a mixed-precision approach to improve efficiency and reduce memory usage by over 60%. The method replaces floating-point operations with integer arithmetic to enable hardware-friendly implementation. It generalizes to Convolutional and Recurrent SNNs (CSNNs, RSNNs), showing versatility across architectures. Evaluations on MNIST and the Spiking Heidelberg Digits (SHD) dataset demonstrate that mixed-precision models achieve accuracy comparable to or better than full-precision baselines using 16-bit shadow weights and 8- or 12-bit inference weights. Despite some limitations in low-precision and deeper models, performance remains robust. In conclusion, the proposed integer-only online learning algorithm presents an effective solution for efficiently training SNNs, enabling deployment on resource-constrained neuromorphic hardware without sacrificing accuracy.
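The shadow-weight scheme the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the paper's actual algorithm: all names, bit widths, and the power-of-two learning-rate shift are assumptions; the idea is only that wide integer shadow weights accumulate updates while narrow integer inference weights are derived from them, with no floating point anywhere.

```python
# Hypothetical sketch: 16-bit integer "shadow" weights accumulate gradient
# updates; 8-bit inference weights are derived by a rounding arithmetic shift.
# Constants (SHADOW_BITS, INFER_BITS, lr_shift) are illustrative assumptions.

SHADOW_BITS, INFER_BITS = 16, 8
SHIFT = SHADOW_BITS - INFER_BITS  # bits dropped when deriving inference weights

def clip_int(x, bits):
    """Saturate x to the signed range of the given bit width."""
    lo, hi = -(1 << (bits - 1)), (1 << (bits - 1)) - 1
    return max(lo, min(hi, x))

def update_shadow(shadow, int_grads, lr_shift=4):
    # Integer-only SGD step: a right shift of the integer gradient
    # acts as a power-of-two learning rate (here 2**-4).
    return [clip_int(w - (g >> lr_shift), SHADOW_BITS)
            for w, g in zip(shadow, int_grads)]

def to_inference(shadow):
    # Round-to-nearest by adding half a step before the arithmetic shift,
    # then saturate to the 8-bit inference range.
    return [clip_int((w + (1 << (SHIFT - 1))) >> SHIFT, INFER_BITS)
            for w in shadow]

shadow = [1200, -900, 31000, -32000]   # toy 16-bit shadow weights
grads = [400, -256, 64, -64]           # toy integer gradients
shadow = update_shadow(shadow, grads)
w8 = to_inference(shadow)              # 8-bit weights used at inference time
```

A power-of-two learning rate keeps the update a single shift on hardware; the shadow copy preserves small gradient contributions that would round away in 8 bits.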
Problem

Research questions and friction points this paper is trying to address.

Reducing computational intensity of SNN training methods
Enabling hardware-friendly implementation with integer arithmetic
Maintaining accuracy while improving efficiency for deployment
Innovation

Methods, ideas, or system contributions that make the work stand out.

Integer-only online training algorithm
Mixed-precision approach reduces memory usage by over 60%
Hardware-friendly implementation with integer arithmetic
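To make the "integer arithmetic" claim concrete, here is a minimal sketch of how spiking dynamics themselves can stay integer-only: a leaky integrate-and-fire (LIF) neuron whose leak is a power-of-two right shift. This is an illustrative assumption about the neuron model, not code from the paper; the threshold, leak shift, and reset rule are invented for the example.

```python
# Hypothetical integer-only LIF neuron: the leak multiplies the membrane
# potential by 1 - 2**-LEAK_SHIFT (here 0.875) using only a shift and a
# subtraction, so no floating-point operations are needed.

V_THRESH = 256     # firing threshold in integer membrane units (assumed)
LEAK_SHIFT = 3     # leak of 2**-3 per step, applied as an arithmetic shift

def lif_step(v, input_current):
    """One integer LIF update: leak, integrate, fire, hard reset."""
    v = v - (v >> LEAK_SHIFT) + input_current  # leaky integration, integers only
    spike = 1 if v >= V_THRESH else 0
    if spike:
        v = 0                                  # hard reset on spike
    return v, spike

# Drive the neuron with a constant integer input and record its spikes.
v, spikes = 0, []
for t in range(6):
    v, s = lif_step(v, 100)
    spikes.append(s)
```

With a constant input of 100, the membrane climbs for two steps, crosses the threshold on the third, resets, and repeats, producing a regular spike train from pure integer updates.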