Efficient and Scalable Granular-ball Graph Coarsening Method for Large-scale Graph Node Classification

📅 2026-03-30
📈 Citations: 0
Influential: 0
🤖 AI Summary
Training graph convolutional networks (GCNs) on large-scale graphs is hindered by substantial computational costs and poor scalability. Existing graph coarsening methods often fall short due to either neglecting multi-granular structural information or incurring high computational complexity. To address these limitations, this work proposes an efficient and scalable granular-ball graph coarsening framework that leverages a linear-time multi-granularity granular-ball algorithm to produce structure-preserving subgraphs. Combined with random sampling and mini-batch training strategies, the approach significantly reduces graph size while maintaining representational fidelity. Extensive experiments on multiple large-scale node classification benchmarks demonstrate that the proposed method achieves superior trade-offs between training efficiency and predictive accuracy compared to state-of-the-art coarsening techniques, thereby validating its effectiveness in enhancing both the scalability and performance of GCNs.
📝 Abstract
Graph Convolutional Networks (GCNs) can effectively handle graph data tasks and have been successfully applied in practice. However, on large-scale graph datasets, GCNs still face the challenge of high computational overhead, especially when the number of graph convolutional layers is large. Many advanced methods use sampling or graph coarsening techniques to alleviate the computational burden during training. However, some of these methods ignore the multi-granularity information in the graph structure, and the time complexity of some coarsening methods remains relatively high. To address these issues, building on our previous work, we propose a new framework called Efficient and Scalable Granular-ball Graph Coarsening Method for Large-scale Graph Node Classification. Specifically, the method first applies a multi-granularity granular-ball graph coarsening algorithm to coarsen the original graph into many subgraphs. The time complexity of this stage is linear and much lower than that of existing graph coarsening methods. Subgraphs composed of these granular-balls are then randomly sampled to form mini-batches for training the GCN. Our algorithm adaptively and significantly reduces the scale of the original graph, thereby improving the training efficiency and scalability of GCNs. Finally, node classification experiments on multiple datasets demonstrate the superior performance of the proposed method. The code is available at https://anonymous.4open.science/r/1-141D/.
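The pipeline described in the abstract — partition the graph into granular-balls, then randomly sample ball-induced subgraphs as mini-batches for GCN training — can be sketched roughly as follows. This is a toy illustration, not the paper's actual method: the BFS-based partitioning below merely stands in for the multi-granularity granular-ball splitting criterion, and all function names are hypothetical.

```python
import random
from collections import defaultdict

def granular_ball_partition(adj, num_balls, seed=0):
    """Toy stand-in for granular-ball coarsening: grow balls outward
    from random centers via BFS until every reachable node is assigned.
    Runs in time linear in the number of edges, matching the linear
    complexity the paper claims, but the splitting rule is illustrative
    only (the real method splits balls by a quality criterion)."""
    rng = random.Random(seed)
    centers = rng.sample(list(adj), num_balls)
    label = {c: i for i, c in enumerate(centers)}  # node -> ball id
    frontier = list(centers)
    while frontier:
        nxt = []
        for u in frontier:
            for v in adj[u]:
                if v not in label:      # claim unvisited neighbors
                    label[v] = label[u]
                    nxt.append(v)
        frontier = nxt
    return label

def sample_minibatch(label, k, seed=0):
    """Randomly pick k granular-balls; the union of their nodes forms
    one mini-batch subgraph (the random sampling step of the paper)."""
    rng = random.Random(seed)
    balls = defaultdict(list)
    for node, ball_id in label.items():
        balls[ball_id].append(node)
    chosen = rng.sample(sorted(balls), k)
    return [n for b in chosen for n in balls[b]]
```

In a real training loop, each sampled node set would induce a subgraph whose adjacency and features are fed to the GCN for one gradient step, so no forward pass ever touches the full graph.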
Problem

Research questions and friction points this paper is trying to address.

Graph Convolutional Network
Graph Coarsening
Large-scale Graph
Node Classification
Computational Overhead
Innovation

Methods, ideas, or system contributions that make the work stand out.

Granular-ball Graph Coarsening
Multi-granularity
Scalable GCN
Linear Time Complexity
Large-scale Graph Node Classification
Guan Wang
CEO, Sapient Intelligence
Artificial General Intelligence, Reinforcement Learning, Large Language Models
Shuyin Xia
Professor, School of Computer Science, Chongqing University of Posts and Telecommunications
Granular computing, Clustering, Rough sets, Classifiers, Granular ball computing
Lei Qian
School of Computer Science and Technology, Chongqing Key Laboratory of Computational Intelligence, Key Laboratory of Cyberspace Big Data Intelligent Security, Ministry of Education, Sichuan-Chongqing Co-construction Key Laboratory of Digital Economy Intelligence and Key Laboratory of Big Data Intelligent Computing, Chongqing University of Posts and Telecommunications, 400065, Chongqing, China
Guoyin Wang
Chongqing University of Posts & Telecommunications
Artificial Intelligence, rough sets, data mining, knowledge technology
Yi Liu
Chongqing Ant Consumer Finance Co., Ltd., Ant Group
Yi Wang
Chongqing Ant Consumer Finance Co., Ltd., Ant Group
Wei Wang
Tongyi Lab, Alibaba Group
Generative Models