🤖 AI Summary
This work addresses the oversmoothing problem in deep graph convolutional networks (GCNs), in which node representations collapse as network depth increases. It is the first to model oversmoothing as a depth-dependent phenomenon in the spectral domain, proposing Laplacian-LoRA, a lightweight, interpretable low-rank spectral adaptation mechanism. By introducing learnable low-rank correction terms in the spectral domain, the method adaptively adjusts the fixed Laplacian propagation operator, selectively weakening representation contraction while preserving the low-pass inductive bias inherent to GCNs. Notably, the approach significantly delays oversmoothing without requiring any redesign of the message-passing architecture. Experiments on multiple benchmark datasets show that the effective depth of GCNs extends by up to a factor of two, while embedding-variance diagnostics and spectral analysis confirm a substantial alleviation of representation collapse.
📝 Abstract
Oversmoothing is a fundamental limitation of deep graph convolutional networks (GCNs), causing node representations to collapse as depth increases. While many prior approaches mitigate this effect through architectural modifications or residual mechanisms, the underlying spectral cause of oversmoothing is often left implicit. We propose Laplacian-LoRA, a simple and interpretable low-rank spectral adaptation of standard GCNs. Rather than redesigning message passing, Laplacian-LoRA introduces a learnable, spectrally anchored correction to the fixed Laplacian propagation operator, selectively weakening contraction while preserving stability and the low-pass inductive bias. Across multiple benchmark datasets and depths, Laplacian-LoRA consistently delays the onset of oversmoothing, extending the effective depth of GCNs by up to a factor of two. Embedding variance diagnostics confirm that these gains arise from delayed representational collapse, while learned spectral analysis demonstrates that the correction is smooth, bounded, and well behaved. Our results show that oversmoothing is a depth-dependent spectral phenomenon that can be systematically delayed through modest, low-rank adaptation of the graph propagation operator.
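The abstract describes a learnable, spectrally anchored low-rank correction added to the fixed Laplacian propagation operator. The paper's exact parameterization is not given here, so the PyTorch sketch below shows one plausible reading: the class name `SpectralLowRankCorrection`, the use of eigenvectors of the normalized adjacency as the spectral anchor, the zero-initialized per-mode gains, and the default rank are all illustrative assumptions, not the authors' method.

```python
import torch
import torch.nn as nn


def normalized_adjacency(A):
    """Standard GCN propagation operator: D^{-1/2} (A + I) D^{-1/2}."""
    A = A + torch.eye(A.size(0))
    d_inv_sqrt = A.sum(dim=1).pow(-0.5)
    return d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]


class SpectralLowRankCorrection(nn.Module):
    """Hypothetical sketch: propagate with A_hat + U diag(theta) U^T,
    where U holds a few fixed eigenvectors of A_hat (the spectral
    anchor) and theta is a learnable per-mode gain, zero-initialized
    so training starts from plain GCN propagation."""

    def __init__(self, A_hat, rank=4):
        super().__init__()
        # eigh returns eigenvalues in ascending order; anchoring on the
        # `rank` largest-eigenvalue modes is an assumption for illustration.
        _, eigvecs = torch.linalg.eigh(A_hat)
        self.register_buffer("U", eigvecs[:, -rank:])  # fixed anchor
        self.theta = nn.Parameter(torch.zeros(rank))   # learnable gains

    def forward(self, A_hat, H):
        # (A_hat + U diag(theta) U^T) H, computed without materializing
        # the dense n x n low-rank correction.
        return A_hat @ H + self.U @ (self.theta[:, None] * (self.U.t() @ H))
```

Zero-initializing the gains keeps the operator identical to standard GCN propagation at the start of training, which is one concrete way a correction can stay "smooth, bounded, and well behaved" while preserving the low-pass inductive bias the abstract emphasizes.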