Hypergraph p-Laplacian equations for data interpolation and semi-supervised learning

📅 2024-11-19
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work addresses the numerical challenges in hypergraph $p$-Laplacian regularization for data interpolation and semi-supervised learning—namely, non-smoothness of the objective functional and non-uniqueness of minimizers. We first rigorously derive a variational equation based on the subdifferential of the $p$-Laplacian energy and construct a mathematically well-posed, computationally efficient simplified model. This model circumvents both non-smoothness and solution non-uniqueness, effectively suppressing spurious spikes and oscillations in interpolation while improving classification accuracy in semi-supervised tasks. It exhibits low time complexity and strong scalability. Our core contribution is the systematic integration of subdifferential analysis into hypergraph signal processing, yielding the first numerically implementable $p$-Laplacian framework for large-scale hypergraph learning that simultaneously ensures theoretical rigor and practical efficiency.

Technology Category

Application Category

📝 Abstract
Hypergraph learning with $p$-Laplacian regularization has attracted a lot of attention due to its flexibility in modeling higher-order relationships in data. This paper focuses on its fast numerical implementation, which is challenging due to the non-differentiability of the objective function and the non-uniqueness of the minimizer. We derive a hypergraph $p$-Laplacian equation from the subdifferential of the $p$-Laplacian regularization. A simplified equation that is mathematically well-posed and computationally efficient is proposed as an alternative. Numerical experiments verify that the simplified $p$-Laplacian equation suppresses spiky solutions in data interpolation and improves classification accuracy in semi-supervised learning. The remarkably low computational cost enables further applications.
Problem

Research questions and friction points this paper is trying to address.

Develops fast numerical methods for hypergraph p-Laplacian learning
Addresses non-differentiability and non-uniqueness in optimization
Proposes simplified equation for efficient data interpolation and classification
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hypergraph p-Laplacian for higher-order data relationships
Simplified well-posed p-Laplacian equation for efficiency
Low-cost solution enhancing interpolation and classification
🔎 Similar Papers
No similar papers found.
K
Kehan Shi
Department of Mathematics, China Jiliang University, Hangzhou 310018, China; Computational Imaging Group and Helmholtz Imaging, Deutsches Elektronen-Synchrotron DESY, 22607 Hamburg, Germany
Martin Burger
Martin Burger
Deutsches Elektronen-Synchrotron DESY und Universität Hamburg
MathematicsImaging