Learnable Kernel Density Estimation for Graphs

📅 2025-05-27
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Graph density estimation faces the challenge of simultaneously achieving effective structural modeling and theoretical guarantees. This paper proposes LGKDE, a learnable graph kernel density estimation framework: first, graphs are mapped to discrete distributions via graph neural networks; second, a multi-scale graph metric is learned end-to-end using maximum mean discrepancy (MMD), jointly modeling node features and spectral-domain perturbations to adaptively characterize normal density boundaries. LGKDE is the first end-to-end learnable graph KDE method with provable statistical consistency, robustness, and convergence guarantees. Its novel dual-domain perturbation generation mechanism significantly enhances boundary characterization capability. Experiments demonstrate that LGKDE consistently outperforms state-of-the-art methods on synthetic graph density recovery and graph anomaly detection across multiple benchmark datasets.

Technology Category

Application Category

📝 Abstract
This work proposes a framework LGKDE that learns kernel density estimation for graphs. The key challenge in graph density estimation lies in effectively capturing both structural patterns and semantic variations while maintaining theoretical guarantees. Combining graph kernels and kernel density estimation (KDE) is a standard approach to graph density estimation, but has unsatisfactory performance due to the handcrafted and fixed features of kernels. Our method LGKDE leverages graph neural networks to represent each graph as a discrete distribution and utilizes maximum mean discrepancy to learn the graph metric for multi-scale KDE, where all parameters are learned by maximizing the density of graphs relative to the density of their well-designed perturbed counterparts. The perturbations are conducted on both node features and graph spectra, which helps better characterize the boundary of normal density regions. Theoretically, we establish consistency and convergence guarantees for LGKDE, including bounds on the mean integrated squared error, robustness, and complexity. We validate LGKDE by demonstrating its effectiveness in recovering the underlying density of synthetic graph distributions and applying it to graph anomaly detection across diverse benchmark datasets. Extensive empirical evaluation shows that LGKDE demonstrates superior performance compared to state-of-the-art baselines on most benchmark datasets.
Problem

Research questions and friction points this paper is trying to address.

Learning graph density estimation with structural and semantic patterns
Overcoming limitations of fixed graph kernel features
Ensuring theoretical guarantees for density estimation robustness
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses graph neural networks for distribution representation
Learns graph metric via maximum mean discrepancy
Perturbs node features and graph spectra
🔎 Similar Papers
2021-12-14IEEE Transactions on Neural Networks and Learning SystemsCitations: 25
X
Xudong Wang
School of Data Science, The Chinese University of Hong Kong, Shenzhen (CUHK-Shenzhen), China
Ziheng Sun
Ziheng Sun
CUHKSZ
C
Chris Ding
School of Data Science, The Chinese University of Hong Kong, Shenzhen (CUHK-Shenzhen), China
Jicong Fan
Jicong Fan
The Chinese University of Hong Kong, Shenzhen
Artificial IntelligenceMachine Learning