Self-Clustering Graph Transformer Approach to Model Resting-State Functional Brain Activity

📅 2025-01-17
📈 Citations: 0
Influential: 0
🤖 AI Summary
Traditional graph Transformers struggle to model resting-state fMRI due to node updates lacking functional subnetwork awareness and failing to capture the brain’s modular architecture. To address this, we propose the Self-Clustering Graph Transformer (SCGT), which introduces a novel function-driven self-clustering attention mechanism. SCGT leverages static functional connectivity as an unsupervised prior to guide subnetwork clustering and enables cluster-aware node representation learning. This design jointly enhances interpretability and representational capacity while supporting end-to-end learning and identification of functional subnetworks. Evaluated on the large-scale ABCD dataset (N = 7,957), SCGT achieves state-of-the-art performance in both cognitive composite score prediction and sex classification—significantly outperforming standard graph Transformers and leading baseline models.

📝 Abstract
Resting-state functional magnetic resonance imaging (rs-fMRI) offers valuable insights into the human brain's functional organization and is a powerful tool for investigating the relationship between brain function and cognitive processes, as it captures the brain's functional organization without relying on specific tasks or stimuli. In this study, we introduce a novel attention mechanism for graphs with subnetworks, named the Self-Clustering Graph Transformer (SCGT), designed to address the issue of uniform node updates in graph transformers. Using static functional connectivity (FC) correlation features as input to the transformer model, SCGT effectively captures the subnetwork structure of the brain by performing cluster-specific node updates, unlike the uniform node updates in vanilla graph transformers, which further allows us to learn and interpret the subclusters. We validate our approach on the Adolescent Brain Cognitive Development (ABCD) dataset, comprising 7,957 participants, for the prediction of total cognitive score and gender classification. Our results demonstrate that SCGT outperforms the vanilla graph transformer and other recent models, offering a promising tool for modeling brain functional connectivity and interpreting the underlying subnetwork structures.
Problem

Research questions and friction points this paper is trying to address.

Model brain functional connectivity
Improve node updates in graph transformers
Interpret brain subnetwork structures
Innovation

Methods, ideas, or system contributions that make the work stand out.

Self-Clustering Graph Transformer
Cluster-specific node updates
Static functional connectivity input
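The cluster-specific node update described above can be sketched in a few lines. This is a minimal illustration under stated assumptions, not the paper's implementation: the soft-assignment projection `W_c`, the cluster count, and the within-cluster attention mask are all hypothetical choices standing in for the learned, function-driven clustering in SCGT.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_clustering_update(X, FC, n_clusters=4, rng=None):
    """Hypothetical sketch of a cluster-aware node update.

    X  : (n, d) node features (e.g., one row per brain region)
    FC : (n, n) static functional connectivity correlation matrix,
         used as an unsupervised prior for subnetwork assignment.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    n, d = X.shape
    # Soft cluster assignments derived from static FC; in SCGT this
    # projection would be learned end-to-end, here it is random.
    W_c = rng.standard_normal((n, n_clusters)) * 0.1
    S = softmax(FC @ W_c, axis=1)              # (n, n_clusters)
    labels = S.argmax(axis=1)                  # hard subnetwork labels
    # Cluster-masked self-attention: each node attends only to nodes
    # in the same subnetwork, instead of a uniform update over all nodes.
    mask = labels[:, None] == labels[None, :]
    scores = (X @ X.T) / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)      # block cross-cluster attention
    A = softmax(scores, axis=1)
    return A @ X, labels
```

The key contrast with a vanilla graph transformer is the mask: without it, every node's update mixes information from all nodes uniformly; with it, updates stay within the inferred subnetwork, and the `labels` themselves are available for interpretation.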
Bishal Thapaliya
1Department of Computer Science, Georgia State University, Atlanta, USA; 2Tri-Institutional Center for Translational Research in Neuroimaging and Data Science
Esra Akbas
Georgia State University
Graph Machine Learning · Data/Graph Mining · Network Science
Ram Sapkota
1Department of Computer Science, Georgia State University, Atlanta, USA; 2Tri-Institutional Center for Translational Research in Neuroimaging and Data Science
Bhaskar Ray
Georgia State University
Data Mining · Machine Learning · Deep Learning · Imaging Genetics
Vince D. Calhoun
Director, Translational Research in Neuroimaging and Data Science (TReNDS; GSU/GaTech/Emory)
brain imaging/MRI/EEG/MEG · data fusion · data science · image analysis · mental illness
Jingyu Liu
1Department of Computer Science, Georgia State University, Atlanta, USA; 2Tri-Institutional Center for Translational Research in Neuroimaging and Data Science