🤖 AI Summary
Traditional graph Transformers struggle to model resting-state fMRI because their uniform node updates lack awareness of functional subnetworks and fail to capture the brain's modular architecture. To address this, we propose the Self-Clustering Graph Transformer (SCGT), which introduces a novel function-driven self-clustering attention mechanism. SCGT leverages static functional connectivity as an unsupervised prior to guide subnetwork clustering and enables cluster-aware node representation learning. This design jointly enhances interpretability and representational capacity while supporting end-to-end learning and identification of functional subnetworks. Evaluated on the large-scale ABCD dataset (N = 7,957), SCGT achieves state-of-the-art performance in both cognitive composite score prediction and sex classification, significantly outperforming standard graph Transformers and leading baseline models.
📝 Abstract
Resting-state functional magnetic resonance imaging (rs-fMRI) is a powerful tool for investigating the relationship between brain function and cognitive processes, as it captures the brain's functional organization without relying on a specific task or stimulus. In this study, we introduce a novel attention mechanism for graphs with subnetworks, named the Self-Clustering Graph Transformer (SCGT), designed to address the issue of uniform node updates in graph Transformers. Using static functional connectivity (FC) correlation features as input, SCGT captures the subnetwork structure of the brain by performing cluster-specific node updates, unlike the uniform updates of vanilla graph Transformers, and further allows the subclusters to be learned and interpreted. We validate our approach on the Adolescent Brain Cognitive Development (ABCD) dataset, comprising 7,957 participants, for the prediction of total cognitive score and gender classification. Our results demonstrate that SCGT outperforms the vanilla graph Transformer and other recent models, offering a promising tool for modeling brain functional connectivity and interpreting its underlying subnetwork structure.
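The core idea, cluster-aware attention in which the static FC prior biases node updates toward functionally related regions, can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all dimensions, the projection `W_c`, and the co-membership bias term are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 8 brain regions (nodes), 4 features, 2 subnetworks.
num_nodes, dim, num_clusters = 8, 4, 2

X = rng.standard_normal((num_nodes, dim))               # node features
fc = np.corrcoef(rng.standard_normal((num_nodes, 20)))  # stand-in static FC matrix

# Soft cluster assignment guided by the FC prior: project each node's FC
# row and apply a softmax over clusters (illustrative choice, not SCGT's).
W_c = rng.standard_normal((num_nodes, num_clusters))
logits = fc @ W_c
assign = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # (N, K)

# Cluster-aware attention: dot-product scores are biased by how strongly
# two nodes share cluster membership, so updates concentrate within
# functional subnetworks rather than mixing all nodes uniformly.
scores = (X @ X.T) / np.sqrt(dim)               # vanilla attention scores
co_membership = assign @ assign.T               # (N, N) shared-cluster weight
scores = scores + np.log(co_membership + 1e-9)  # bias toward same-cluster pairs
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn = attn / attn.sum(axis=1, keepdims=True)   # row-stochastic attention

X_out = attn @ X                                # cluster-specific node update
print(X_out.shape)  # (8, 4)
```

Because the assignment matrix is learned end-to-end in the actual model, the same mechanism that improves prediction also yields interpretable subnetwork memberships.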