🤖 AI Summary
To address the high computational cost and poor deployability of Continuous Graph Neural Networks (CGNNs) on edge devices, this paper proposes the first continuous spiking graph learning framework, integrating ordinary differential equations (ODEs) with Spiking Neural Networks (SNNs). The method employs a second-order ODE to govern spiking graph representation and propagation, with a theoretical guarantee that it mitigates exploding/vanishing gradients and can capture long-range dependencies between nodes. By jointly encoding temporal and structural dynamics, it achieves efficient graph learning. Extensive experiments on multiple benchmark tasks demonstrate that the model surpasses state-of-the-art CGNNs and conventional GNNs in accuracy while significantly reducing energy consumption, validating its feasibility and advantages in resource-constrained edge scenarios.
📝 Abstract
Continuous graph neural networks (CGNNs) have garnered significant attention due to their ability to generalize existing discrete graph neural networks (GNNs) by introducing continuous dynamics. They typically draw inspiration from diffusion-based methods to introduce a novel propagation scheme, which is analyzed using ordinary differential equations (ODEs). However, the implementation of CGNNs requires significant computational power, making them challenging to deploy on battery-powered devices. Inspired by recent spiking neural networks (SNNs), which emulate a biological inference process and provide an energy-efficient neural architecture, we integrate SNNs with CGNNs in a unified framework, named Continuous Spiking Graph Neural Networks (COS-GNN). We employ SNNs for graph node representation at each time step, and these representations are further integrated into the ODE process over time. To enhance information preservation and mitigate information loss in SNNs, we introduce a high-order variant of COS-GNN, which utilizes a second-order ODE for spiking representation and continuous propagation. Moreover, we provide a theoretical proof that COS-GNN effectively mitigates the issues of exploding and vanishing gradients, enabling us to capture long-range dependencies between nodes. Experimental results on graph-based learning tasks demonstrate the effectiveness of the proposed COS-GNN over competitive baselines.
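To make the overall idea concrete, here is a minimal, illustrative sketch (not the authors' exact implementation) of the two ingredients the abstract describes: a leaky integrate-and-fire (LIF) spiking layer that encodes node features at each time step, and a second-order ODE, integrated with explicit Euler steps, that propagates the resulting spikes over the graph. All function names, the damping term, and the step sizes are assumptions for illustration only.

```python
import numpy as np

def lif_spikes(x, v, threshold=1.0, decay=0.5):
    """One LIF step: leaky membrane update, binary spike, hard reset.

    Illustrative spiking encoder, not the paper's exact neuron model.
    """
    v = decay * v + x                      # leaky integration of input current
    s = (v >= threshold).astype(x.dtype)   # binary spike where threshold crossed
    v = v * (1.0 - s)                      # reset membrane where a spike fired
    return s, v

def second_order_step(h, dh, s, A_norm, dt=0.1, damping=0.2):
    """Euler step of an assumed second-order dynamic:
    h'' = A_norm @ s - damping * h'  (spike-driven graph propagation).
    """
    d2h = A_norm @ s - damping * dh        # acceleration of node states
    dh = dh + dt * d2h                     # update first derivative
    h = h + dt * dh                        # update node states
    return h, dh

def run(X, A, steps=5):
    """Unroll spiking encoding and continuous propagation over time steps."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1.0)
    A_norm = A / deg                       # row-normalized adjacency
    h = np.zeros_like(X)                   # node states
    dh = np.zeros_like(X)                  # their time derivatives
    v = np.zeros_like(X)                   # membrane potentials
    for _ in range(steps):
        s, v = lif_spikes(X, v)            # SNN encoding at this time step
        h, dh = second_order_step(h, dh, s, A_norm)
    return h

# Tiny two-node example graph
A = np.array([[0.0, 1.0], [1.0, 0.0]])
X = np.array([[0.8], [0.3]])
H = run(X, A)
print(H.shape)  # (2, 1)
```

The second-order formulation carries both a state and its velocity, which is what allows the gradient-stability argument and the long-range propagation the abstract refers to; a first-order variant would drop `dh` and set `h' = A_norm @ s - damping * h` instead.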