GLADMamba: Unsupervised Graph-Level Anomaly Detection Powered by Selective State Space Model

📅 2025-03-23
🤖 AI Summary
To address the inefficiency in modeling long-range dependencies and the insufficient exploitation of spectral graph information in unsupervised graph-level anomaly detection (UGLAD), this paper pioneers the integration of the selective state space model (Mamba) into UGLAD. The proposed framework has two modules: (i) View-Fused Mamba, which fuses structural information from multiple graph views, and (ii) Spectrum-Guided Mamba, which performs spectral-aware embedding refinement guided by the Rayleigh quotient. By combining Mamba's linear-complexity long-range modeling with frequency-domain priors from spectral graph theory, the approach avoids the quadratic complexity inherent in Transformer-based methods. Extensive experiments on 12 real-world graph datasets, spanning social networks, anticancer drug discovery, and toxic molecule identification, demonstrate significant improvements over existing state-of-the-art methods. The source code is publicly available.
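The "linear-complexity long-range modeling" claim rests on the selective state space recurrence: unlike a Transformer's all-pairs attention, a selective SSM makes one pass over the sequence, recomputing its step size and input/output matrices from each input token. The sketch below is a minimal, hypothetical illustration of that mechanism only; all weights and shapes are illustrative and do not reflect the paper's actual VFM/SGM modules.

```python
import numpy as np

def softplus(z):
    """Numerically simple softplus, used to keep the step size positive."""
    return np.log1p(np.exp(z))

def selective_ssm(u, w_dt, w_b, w_c, a):
    """Minimal diagonal selective SSM scan (the core idea behind Mamba).

    u    : (T,) scalar input sequence
    a    : (n,) negative real state decays (diagonal of A)
    w_dt : scalar projection for the step size
    w_b, w_c : (n,) projections for the input/output matrices

    Unlike a classic LTI SSM, the step size dt and the matrices B, C are
    recomputed from the current input u_t -- the "selection" mechanism that
    lets the state keep or discard information. One pass over the sequence,
    so the cost is O(T), linear in length.
    """
    n = a.shape[0]
    h = np.zeros(n)                           # hidden state
    y = np.empty(len(u))
    for t, u_t in enumerate(u):
        dt = softplus(w_dt * u_t)             # input-dependent step size
        B = w_b * u_t                         # input-dependent input matrix
        C = w_c * u_t                         # input-dependent output matrix
        h = np.exp(dt * a) * h + dt * B * u_t # ZOH-style discretized update
        y[t] = C @ h                          # readout
    return y
```

The input-dependent `dt` is the key design choice: a near-zero step lets the state coast past an irrelevant token, while a large step resets it toward the current input, which is how the model can "dynamically focus" on anomaly-related information.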

📝 Abstract
Unsupervised graph-level anomaly detection (UGLAD) is a critical and challenging task across various domains, such as social network analysis, anti-cancer drug discovery, and toxic molecule identification. However, existing methods often struggle to capture the long-range dependencies efficiently and neglect the spectral information. Recently, selective State Space Models (SSMs), particularly Mamba, have demonstrated remarkable advantages in capturing long-range dependencies with linear complexity and a selection mechanism. Motivated by their success across various domains, we propose GLADMamba, a novel framework that adapts the selective state space model to the UGLAD field. We design View-Fused Mamba (VFM) with a Mamba-Transformer-style architecture to efficiently fuse information from different views with a selective state mechanism. We also design Spectrum-Guided Mamba (SGM) with a Mamba-Transformer-style architecture to leverage the Rayleigh quotient to guide the embedding refining process. GLADMamba can dynamically focus on anomaly-related information while discarding irrelevant information for anomaly detection. To the best of our knowledge, this is the first work to introduce Mamba and explicit spectral information to UGLAD. Extensive experiments on 12 real-world datasets demonstrate that GLADMamba outperforms existing state-of-the-art methods, achieving superior performance in UGLAD. The code is available at https://github.com/Yali-F/GLADMamba.
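The spectral prior referenced in the abstract is the Rayleigh quotient of the graph Laplacian, x^T L x / x^T x, which measures how much a signal's energy lies in high-frequency components of the graph. A minimal sketch of that quantity, assuming an unnormalized Laplacian L = D - A (the paper's exact normalization and how SGM uses the quotient are not specified here):

```python
import numpy as np

def rayleigh_quotient(adj: np.ndarray, x: np.ndarray) -> float:
    """Rayleigh quotient x^T L x / x^T x of the graph Laplacian L = D - A.

    Small values mean the signal x is smooth over the graph (low frequency);
    large values mean it oscillates between neighbors (high frequency).
    """
    deg = np.diag(adj.sum(axis=1))   # degree matrix D
    lap = deg - adj                  # unnormalized graph Laplacian
    return float(x @ lap @ x) / float(x @ x)

# Triangle graph: a constant signal is perfectly smooth, so the quotient is 0.
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
print(rayleigh_quotient(A, np.ones(3)))  # → 0.0
```

Because the quotient is bounded by the Laplacian's extreme eigenvalues, it gives a cheap scalar summary of a graph's frequency content, which is the kind of signal a spectrum-guided module can use to steer embedding refinement.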
Problem

Research questions and friction points this paper is trying to address.

Detect graph-level anomalies without supervision efficiently
Capture long-range dependencies and spectral information effectively
Improve anomaly detection by dynamically focusing on relevant data
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses selective State Space Models for anomaly detection
Integrates Mamba-Transformer for multi-view information fusion
Leverages spectral guidance via Rayleigh quotient optimization
Yali Fu
Jilin University
LLMs, Reasoning, Multimodal Learning, Graph Learning, Anomaly Detection

Jindong Li
Hong Kong University of Science and Technology (Guangzhou), Guangzhou, China

Qi Wang
Jilin University, Changchun, China; Engineering Research Center of Knowledge-Driven Human-Machine Intelligence, Ministry of Education, China

Qianli Xing
Macquarie University
Data Mining, Deep Learning, Crowdsourcing