🤖 AI Summary
Spectral clustering suffers from poor generalizability, limited scalability, and suboptimal spectral embeddings that fail to approximate optimal graph cuts, largely because it relies on handcrafted affinity matrices and non-differentiable eigendecomposition. To address these limitations, we propose the first end-to-end differentiable spectral clustering framework: we reformulate the Normalized Cut (Ncut) objective in a fully differentiable manner using implicit differentiation and differentiable spectral graph theory, enabling deep neural networks to learn task-optimal spectral embeddings directly, without pre-defined similarity matrices or fixed graph structures. Our method supports cross-graph generalization and zero-shot transfer to unseen graph topologies. Extensive experiments on multiple benchmark datasets demonstrate superior clustering performance over conventional spectral clustering and GNN-based clustering methods, along with a threefold improvement in training efficiency.
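To make the idea of a differentiable Ncut objective concrete, here is a minimal sketch of a *soft* Normalized Cut relaxation in PyTorch. This is an illustrative relaxation commonly used in differentiable clustering (it replaces hard cluster indicators with softmax assignments), not the paper's exact implicit-differentiation formulation; the function name `soft_ncut_loss` and the toy graph are assumptions for the example.

```python
import torch

def soft_ncut_loss(W, S):
    """Differentiable relaxation of the Normalized Cut objective.

    W: (n, n) non-negative, symmetric affinity matrix.
    S: (n, k) soft cluster assignments (rows sum to 1, e.g. softmax output).
    Returns a scalar in [0, k]; minimising it encourages balanced,
    well-separated clusters, since Ncut = sum_c (1 - assoc_c / vol_c).
    """
    d = W.sum(dim=1)                               # node degrees
    assoc = torch.einsum('nc,nm,mc->c', S, W, S)   # S[:,c]^T W S[:,c]
    vol = S.t() @ d                                # S[:,c]^T d
    return (1.0 - assoc / vol.clamp_min(1e-12)).sum()

# Toy usage: a 4-node graph with two obvious clusters {0,1} and {2,3}.
W = torch.tensor([[0., 1., 0., 0.],
                  [1., 0., 0., 0.],
                  [0., 0., 0., 1.],
                  [0., 0., 1., 0.]])
logits = torch.randn(4, 2, requires_grad=True)     # learnable assignments
S = torch.softmax(logits, dim=1)
loss = soft_ncut_loss(W, S)
loss.backward()                                    # gradients reach the logits
```

Because the loss is a smooth function of the assignment logits, it can sit at the end of any neural network and be trained by standard gradient descent, which is the key property the end-to-end framework above relies on.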