SPARTA: Advancing Sparse Attention in Spiking Neural Networks via Spike-Timing-Based Prioritization

📅 2025-08-03
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing spiking neural networks (SNNs) predominantly rely on rate coding, neglecting the rich temporal dynamics encoded in precise spike timing—such as spike latency, inter-spike intervals, and firing patterns—leading to computational redundancy and limited representational capacity. To address this, we propose SPARTA, the first framework to explicitly incorporate fine-grained spike timing into attention mechanisms. SPARTA introduces time-aware priority gating and resource-adaptive sparse selection, jointly optimized with heterogeneous neuron dynamics modeling. Evaluated on DVS-Gesture, CIFAR10-DVS, and static CIFAR-10 (with event-to-frame conversion), SPARTA achieves 98.78%, 83.06%, and 95.3% accuracy, respectively, while attaining 65.4% synaptic sparsity. These results are state-of-the-art on DVS-Gesture and competitive on the other benchmarks, establishing a temporally driven, brain-inspired attention paradigm that balances efficiency and expressivity.

📝 Abstract
Current Spiking Neural Networks (SNNs) underutilize the temporal dynamics inherent in spike-based processing, relying primarily on rate coding while overlooking precise timing information that provides rich computational cues. We propose SPARTA (Spiking Priority Attention with Resource-Adaptive Temporal Allocation), a framework that leverages heterogeneous neuron dynamics and spike-timing information to enable efficient sparse attention. SPARTA prioritizes tokens based on temporal cues, including firing patterns, spike timing, and inter-spike intervals, achieving 65.4% sparsity through competitive gating. By selecting only the most salient tokens, SPARTA reduces attention complexity from O(N^2) to O(K^2) with K << N, while maintaining high accuracy. Our method achieves state-of-the-art performance on DVS-Gesture (98.78%) and competitive results on CIFAR10-DVS (83.06%) and CIFAR-10 (95.3%), demonstrating that exploiting spike timing dynamics improves both computational efficiency and accuracy.
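As a rough illustration of the token-selection idea described above (a sketch, not the authors' implementation), the snippet below scores tokens with a hypothetical priority function built from two of the timing cues the abstract names — first-spike latency and mean inter-spike interval — keeps the top-K tokens, and runs attention only among them, cutting the quadratic cost from N^2 to K^2. The function names, the scoring formula, and the shared query/key/value projection are all illustrative assumptions.

```python
import numpy as np

def priority_scores(spike_times, T):
    """Hypothetical priority: earlier first spikes and denser firing
    (shorter inter-spike intervals) score higher. `spike_times` is a
    list of per-token arrays of spike times within a window of length T."""
    scores = []
    for st in spike_times:
        if len(st) == 0:
            scores.append(0.0)          # silent token: lowest priority
            continue
        latency = st[0] / T             # early first spike -> small latency
        isi = np.mean(np.diff(st)) / T if len(st) > 1 else 1.0
        scores.append((1.0 - latency) + (1.0 - isi))
    return np.array(scores)

def sparse_attention(x, spike_times, K, T=100.0):
    """Attend only among the K highest-priority tokens: O(K^2) scores."""
    N, d = x.shape
    idx = np.argsort(-priority_scores(spike_times, T))[:K]
    q = k = v = x[idx]                  # shared projection, for brevity
    attn = np.exp(q @ k.T / np.sqrt(d))
    attn /= attn.sum(axis=1, keepdims=True)
    out = np.zeros_like(x)
    out[idx] = attn @ v                 # unselected tokens stay zero
    return out, idx
```

At the paper's reported 65.4% sparsity, e.g. N = 196 tokens would leave roughly K ≈ 68 selected, shrinking the attention-score computation by about (196/68)^2 ≈ 8x.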
Problem

Research questions and friction points this paper is trying to address.

Enhances SNNs by utilizing precise spike timing dynamics
Reduces attention complexity from O(N^2) to O(K^2)
Improves both computational efficiency and accuracy in SNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Leverages spike-timing for efficient sparse attention
Reduces complexity via competitive gating and prioritization
Achieves high accuracy with dynamic token selection
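One plausible reading of the "competitive gating" contribution (an assumption on my part — the paper's exact mechanism may differ) is a sharp softmax competition with a token budget: tokens compete for gate mass, and only those clearing a budget-derived threshold pass. The function name, the temperature parameter `tau`, and the thresholding rule are all hypothetical.

```python
import numpy as np

def competitive_gate(priorities, budget, tau=0.1):
    """Sketch of competitive gating: tokens compete through a sharp
    softmax over their priorities; keep the `budget` highest gates.
    Lower tau -> harder competition -> sparser selection."""
    p = np.asarray(priorities, dtype=float)
    gates = np.exp((p - p.max()) / tau)     # stabilized softmax
    gates /= gates.sum()
    # threshold at the budget-th largest gate value
    thresh = np.sort(gates)[-budget]
    mask = gates >= thresh
    return mask, gates
```

The temperature controls how winner-take-all the competition is: as tau shrinks, nearly all gate mass concentrates on the top tokens, which is one way to realize the dynamic, high-sparsity token selection the bullets describe.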