Energy-based Epistemic Uncertainty for Graph Neural Networks

📅 2024-06-06
🏛️ Neural Information Processing Systems
📈 Citations: 0
Influential: 0
🤖 AI Summary
Quantifying the epistemic uncertainty of a Graph Neural Network (GNN) is challenging because uncertainty can arise at different structural scales. This paper proposes GEBM, a multi-scale energy-based aggregation framework. Leveraging the hierarchy naturally induced by graph diffusion, GEBM builds an Energy-Based Model (EBM) that combines structure-aware and structure-agnostic uncertainty into a single measure; regularizing the energy function provably yields an integrable density, and an evidential interpretation of the EBM improves predictive robustness. Evaluated across seven graph anomaly types, GEBM achieves the best separation of in-distribution and out-of-distribution (OOD) data on six of them and the best average rank over shifts on all datasets. Crucially, GEBM is a post-hoc, architecture-agnostic method: it requires no modification to the backbone and applies to any pre-trained GNN, substantially improving uncertainty discrimination under distribution shift.

📝 Abstract
In domains with interdependent data, such as graphs, quantifying the epistemic uncertainty of a Graph Neural Network (GNN) is challenging as uncertainty can arise at different structural scales. Existing techniques neglect this issue or only distinguish between structure-aware and structure-agnostic uncertainty without combining them into a single measure. We propose GEBM, an energy-based model (EBM) that provides high-quality uncertainty estimates by aggregating energy at different structural levels that naturally arise from graph diffusion. In contrast to logit-based EBMs, we provably induce an integrable density in the data space by regularizing the energy function. We introduce an evidential interpretation of our EBM that significantly improves the predictive robustness of the GNN. Our framework is a simple and effective post hoc method applicable to any pre-trained GNN that is sensitive to various distribution shifts. It consistently achieves the best separation of in-distribution and out-of-distribution data on 6 out of 7 anomaly types while having the best average rank over shifts on *all* datasets.
Problem

Research questions and friction points this paper is trying to address.

Quantifying epistemic uncertainty in GNNs across different structural scales
Combining structure-aware and structure-agnostic uncertainty into a single measure
Improving the predictive robustness of pre-trained GNNs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Post-hoc energy-based model applicable to any pre-trained GNN
Aggregates energy at the structural levels induced by graph diffusion
Energy regularization provably yields an integrable density; an evidential interpretation significantly improves robustness
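To make the core idea concrete, here is a loose conceptual sketch of logit-based energy scoring combined with multi-scale graph diffusion. This is not the authors' GEBM implementation (which additionally regularizes the energy for integrability and adds an evidential interpretation); the function names, the symmetric-normalized diffusion, and the uniform scale weights are illustrative assumptions only.

```python
import numpy as np

def energy_score(logits):
    # Logit-based energy: E(x) = -logsumexp(logits), computed stably.
    # Higher energy ~ lower model confidence ~ more likely OOD.
    m = logits.max(axis=-1, keepdims=True)
    return -(m.squeeze(-1) + np.log(np.exp(logits - m).sum(axis=-1)))

def diffuse(A, H, steps):
    # Propagate node features/logits with symmetric-normalized diffusion
    # (illustrative choice; GEBM's diffusion operator may differ).
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    A_norm = D_inv_sqrt @ A @ D_inv_sqrt
    for _ in range(steps):
        H = A_norm @ H
    return H

def multi_scale_energy(A, logits, scales=(0, 1, 2), weights=None):
    # Aggregate per-node energy across diffusion scales:
    # scale 0 is structure-agnostic; larger scales mix in more structure.
    weights = weights or [1.0 / len(scales)] * len(scales)
    energies = [energy_score(diffuse(A, logits, s)) for s in scales]
    return sum(w * e for w, e in zip(weights, energies))

# Tiny 3-node path graph with 2-class logits from a hypothetical GNN.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
logits = np.array([[2.0, -1.0],
                   [0.5, 0.5],
                   [-1.0, 2.0]])
uncertainty = multi_scale_energy(A, logits)  # one score per node
```

Because the score is computed purely from the logits of a fixed network, it is post hoc in the same sense as the paper's method: no retraining or architectural change is needed.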
Dominik Fuchsgruber
School of Computation, Information and Technology & Munich Data Science Institute, Technical University of Munich, Germany
Tom Wollschlager
School of Computation, Information and Technology & Munich Data Science Institute, Technical University of Munich, Germany
Stephan Günnemann
Professor of Computer Science, Technical University of Munich
Machine LearningGraphsGraph Neural NetworksRobustness