Scalable Hypergraph Structure Learning with Diverse Smoothness Priors

πŸ“… 2025-04-04
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ“„ PDF
πŸ€– AI Summary
Learning hypergraph structure when the high-order relationships are unknown remains challenging because diverse smoothness priors lead to inconsistent definitions of total variation (TV). Method: The paper proposes a scalable framework that infers a hypergraph topology from time-series signals. It introduces the first unified convex optimization framework encompassing multiple hypergraph total variation formulations, coupled with a dynamic hyperedge-search constraint mechanism that ensures both convergence and sparsity. The method jointly leverages hypergraph signal processing and heterogeneous smoothness-prior modeling, and employs an efficient forward-backward-forward algorithm for optimization. Results: Experiments demonstrate significant improvements in inference accuracy over state-of-the-art methods, robustness across different TV terms, and scalability to large hypergraph learning tasks.
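To make the "diverse smoothness priors" concrete, the sketch below (illustrative, not the paper's exact formulation) computes two commonly used hypergraph total variations for the same signal: a quadratic, Laplacian-style TV and the max-pairwise-difference TV in the style of Hein et al. The variable names and toy data are assumptions for illustration.

```python
import numpy as np

def tv_quadratic(x, hyperedges, weights):
    """Quadratic TV: each hyperedge contributes the weighted sum of
    squared pairwise differences among the signal values it connects."""
    tv = 0.0
    for e, w in zip(hyperedges, weights):
        vals = x[list(e)]
        diffs = vals[:, None] - vals[None, :]   # all pairwise differences
        tv += 0.5 * w * np.sum(diffs ** 2)      # 0.5 undoes double counting
    return tv

def tv_max_pairwise(x, hyperedges, weights):
    """Hein-style TV: each hyperedge contributes its weight times the
    largest absolute difference between any two of its member nodes."""
    return sum(w * (np.max(x[list(e)]) - np.min(x[list(e)]))
               for e, w in zip(hyperedges, weights))

# Toy signal on 5 nodes with two hyperedges: the two TVs score the same
# signal differently, which is why a unified framework is nontrivial.
x = np.array([0.1, 0.2, 0.15, 0.9, 1.0])
E = [(0, 1, 2), (3, 4)]
w = [1.0, 1.0]
print(tv_quadratic(x, E, w), tv_max_pairwise(x, E, w))
```

Because each TV choice induces a different notion of a "smooth" signal, a hypergraph learned under one prior need not match the one learned under another; the paper's contribution is to make these choices interchangeable inside a single convex program.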

πŸ“ Abstract
In graph signal processing, learning the weighted connections between nodes from a set of sample signals is a fundamental task when the underlying relationships are not known a priori. This task is typically addressed by finding a graph Laplacian on which the observed signals are smooth. With the extension of graphs to hypergraphs, where edges can connect more than two nodes, graph learning methods have similarly been generalized to hypergraphs. However, the absence of a unified framework for calculating total variation has led to divergent definitions of smoothness and, consequently, differing approaches to hyperedge recovery. We confront this challenge by generalizing several previously proposed hypergraph total variations so that they can be substituted directly into a vector-based optimization. To this end, we propose a novel hypergraph learning method that recovers a hypergraph topology from time-series signals based on a smoothness prior. Our approach addresses key limitations of prior works, such as hyperedge selection and convergence issues, by formulating the problem as a convex optimization solved via a forward-backward-forward algorithm with guaranteed convergence. Additionally, we introduce a process that simultaneously limits the span of the hyperedge search and maintains a valid hyperedge selection set, making our method scalable to increasingly complex network structures. The experimental results demonstrate improved accuracy over other state-of-the-art hypergraph inference methods; furthermore, we empirically show our method to be robust to the choice of total variation term, biased towards global smoothness, and scalable to larger hypergraphs.
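For orientation, the pairwise graph-learning problem that such hypergraph methods generalize is commonly written (in the style of Kalofolias and of Dong et al.; the symbols $\alpha$, $\beta$, and $Z$ are illustrative, not the paper's notation) as

$$\min_{\substack{W \ge 0,\ W = W^\top \\ \operatorname{diag}(W) = 0}} \ \|W \circ Z\|_{1,1} \;-\; \alpha\,\mathbf{1}^\top \log(W\mathbf{1}) \;+\; \beta\,\|W\|_F^2, \qquad Z_{ij} = \|x_i - x_j\|_2^2,$$

where $\|W \circ Z\|_{1,1}$ is the total variation of the observed signals on the graph with weights $W$, the log barrier keeps every node connected, and the Frobenius penalty controls edge density. Hypergraph methods replace this pairwise TV term with one of several competing hypergraph TVs, which is precisely the divergence in smoothness definitions the abstract describes.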
Problem

Research questions and friction points this paper is trying to address.

Learning hypergraph connections from sample signals without prior knowledge
Unifying smoothness definitions for hypergraph structure recovery
Ensuring scalable and convergent hyperedge selection in complex networks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Generalizes hypergraph total variations for optimization
Formulates a convex optimization solved by a convergence-guaranteed forward-backward-forward algorithm (sketched after this list)
Limits hyperedge search span for scalability
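The forward-backward-forward (FBF) step referenced above is Tseng's splitting: compared with plain forward-backward, it adds a second gradient evaluation per iteration and converges whenever the smooth term's gradient is Lipschitz, which is where the convergence guarantee comes from. A minimal generic sketch follows (not the paper's objective; the lasso-type toy problem and all names are assumptions):

```python
import numpy as np

def fbf(grad_f, prox_g, w0, step, n_iter=500):
    """Tseng's forward-backward-forward splitting for min f(w) + g(w).
    grad_f : gradient of the smooth term f (Lipschitz constant L)
    prox_g : proximal operator of the nonsmooth term, prox_g(v, step)
    step   : fixed step size, must satisfy step * L < 1
    """
    w = w0.copy()
    for _ in range(n_iter):
        y = w - step * grad_f(w)      # forward step
        p = prox_g(y, step)           # backward (proximal) step
        q = p - step * grad_f(p)      # second forward step
        w = w - y + q                 # correction
    return w

# Toy instance: min 0.5*||A w - b||^2 + lam*||w||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
b = rng.standard_normal(20)
lam = 0.1
grad_f = lambda w: A.T @ (A @ w - b)
prox_g = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t * lam, 0.0)
L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of grad_f
w_hat = fbf(grad_f, prox_g, np.zeros(10), step=0.9 / L)
```

In a hypergraph-learning instantiation, grad_f would differentiate the chosen TV term and prox_g would enforce the valid hyperedge-weight set (e.g., nonnegativity plus a sparsity penalty), which is how the restricted hyperedge search fits into the same iteration.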
πŸ”Ž Similar Papers
2023-12-15 · IEEE International Conference on Acoustics, Speech, and Signal Processing · Citations: 3
Benjamin T. Brown
Department of Electrical and Computer Engineering, University of Kentucky, Lexington, KY 40506, USA
Haoxiang Zhang
Queen’s University
Software Engineering · Empirical Software Engineering · Mining Software Repositories
Daniel L. Lau
Department of Electrical and Computer Engineering, University of Kentucky, Lexington, KY 40506, USA
Gonzalo R. Arce
Charles B. Evans Professor, JP Morgan-Chase Faculty Fellow, University of Delaware
Computational imaging · data science · machine learning · signal processing