Scalable community detection in massive networks via predictive assignment

📅 2025-03-20
📈 Citations: 0
✨ Influential: 0
📄 PDF
🤖 AI Summary
To address the high memory and computational costs of community detection on ultra-large networks, this paper proposes a "predictive assignment" paradigm that decouples the task into two stages: (i) small-scale parameter learning on a subgraph under a degree-corrected stochastic block model, and (ii) parallel, node-level assignment across the rest of the network. By avoiding global matrix factorization, the method admits a distributed implementation while retaining statistical guarantees, and the authors prove strong consistency. Experiments on simulated networks and large real-world graphs, including DBLP and Twitch, show community partitioning accuracy comparable to state-of-the-art methods with near-linear scaling in both memory and runtime, substantially outperforming matrix factorization approaches. The core contribution is the first principled "learning then prediction" decoupling framework for community detection, jointly achieving scalability, accuracy, and theoretical rigor.

📝 Abstract
Massive network datasets are becoming increasingly common in scientific applications. Existing community detection methods encounter significant computational challenges for such massive networks for two reasons. First, the full network needs to be stored and analyzed on a single server, leading to high memory costs. Second, existing methods typically use matrix factorization or iterative optimization using the full network, resulting in high runtimes. We propose a strategy called *predictive assignment* to enable computationally efficient community detection while ensuring statistical accuracy. The core idea is to avoid large-scale matrix computations by breaking up the task into a smaller matrix computation plus a large number of vector computations that can be carried out in parallel. Under the proposed method, community detection is carried out on a small subgraph to estimate the relevant model parameters. Next, each remaining node is assigned to a community based on these estimates. We prove that predictive assignment achieves strong consistency under the stochastic blockmodel and its degree-corrected version. We also demonstrate the empirical performance of predictive assignment on simulated networks and two large real-world datasets: DBLP (Digital Bibliography & Library Project), a computer science bibliographical database, and the Twitch Gamers Social Network.
Problem

Research questions and friction points this paper is trying to address.

Scalable community detection in massive networks
High memory costs from single-server network storage
Slow runtimes due to matrix factorization methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predictive assignment avoids large-scale matrix computations
Breaks task into parallelizable vector computations
Uses subgraph for parameter estimation first
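The two-stage recipe above can be sketched on a toy two-block SBM. This is a minimal illustration, not the authors' implementation: the simulated network, its block probabilities, the subgraph size `m`, and the simple sign-based spectral step (which only works for K = 2) are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a small 2-block SBM network (hypothetical parameters).
n, K = 2000, 2
truth = rng.integers(0, K, size=n)
B = np.array([[0.15, 0.02],
              [0.02, 0.12]])            # within/between-block edge probabilities
P = B[truth][:, truth]
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                             # symmetric adjacency, no self-loops

# Stage 1: community detection on a small random subgraph.
m = 400                                 # subgraph size (assumption for the sketch)
sub = rng.choice(n, size=m, replace=False)
A_sub = A[np.ix_(sub, sub)]

# Spectral step: for K = 2, the sign of the second leading eigenvector
# (by absolute eigenvalue) splits the two communities; general K would
# need k-means on the leading eigenvectors.
vals, vecs = np.linalg.eigh(A_sub)
U = vecs[:, np.argsort(-np.abs(vals))[:K]]
labels_sub = (U[:, 1] > 0).astype(int)

# Estimate the block probability matrix from the fitted subgraph
# (block means of A_sub; the zero diagonal adds a small bias we ignore).
B_hat = np.empty((K, K))
for a in range(K):
    for b in range(K):
        B_hat[a, b] = A_sub[np.ix_(labels_sub == a, labels_sub == b)].mean()

# Stage 2: assign each remaining node by the SBM log-likelihood of its
# edges into the labelled subgraph. Each node's computation is an
# independent vector operation, so this loop parallelizes trivially.
counts = np.array([(labels_sub == k).sum() for k in range(K)])
rest = np.setdiff1d(np.arange(n), sub)
edges = A[np.ix_(rest, sub)]            # edges from new nodes into the subgraph
deg_to_k = np.stack([edges[:, labels_sub == k].sum(1) for k in range(K)], 1)
loglik = (deg_to_k @ np.log(B_hat).T
          + (counts - deg_to_k) @ np.log1p(-B_hat).T)
labels_rest = loglik.argmax(1)          # predicted community per remaining node
```

The point of the sketch is the cost profile: the only eigendecomposition is on the m-by-m subgraph, while every other node costs a single length-m vector computation, which is what makes the memory and runtime scale linearly in the network size.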