Sharpness-aware Second-order Latent Factor Model for High-dimensional and Incomplete Data

📅 2025-12-18
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Optimizing Second-order Latent Factor (SLF) models for High-dimensional and Incomplete (HDI) data is challenging due to their bilinear structure and inherent non-convexity. Method: This paper proposes the Sharpness-Aware Second-order Latent Factor (SSLF) model, the first to integrate Sharpness-Aware Minimization (SAM) into second-order latent factor modeling. It introduces a differentiable Hessian-vector product path for efficient curvature estimation and explicitly incorporates a sharpness-aware regularization term to enable curvature-informed optimization. Contribution/Results: SSLF significantly improves generalization and convergence stability in non-convex low-rank representation learning. Extensive experiments on multiple industrial-scale sparse datasets show that SSLF consistently outperforms state-of-the-art methods, with substantial gains in both prediction accuracy and training robustness.
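The SAM update the summary refers to can be sketched in one dimension: first perturb the weight toward the locally worst-case direction, then descend using the gradient taken at the perturbed point. The toy non-convex loss f(w) = w⁴ − w², the learning rate, and the radius `rho` below are illustrative assumptions, not the paper's actual model or hyperparameters.

```python
def grad(w):
    # gradient of the toy non-convex loss f(w) = w**4 - w**2
    return 4 * w**3 - 2 * w

def sam_step(w, lr=0.05, rho=0.1):
    g = grad(w)
    # ascend toward the locally worst-case point (in 1-D this is
    # rho times the sign of the gradient)
    eps = rho * g / (abs(g) + 1e-12)
    # descend using the gradient evaluated at the perturbed point
    return w - lr * grad(w + eps)

w = 0.3
for _ in range(200):
    w = sam_step(w)
# w settles near the flat minimum at 1/sqrt(2) ~ 0.707
```

In the multi-dimensional case the perturbation is the gradient rescaled to norm `rho`; the two gradient evaluations per step are the main extra cost SAM adds over plain gradient descent.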

📝 Abstract
The Second-order Latent Factor (SLF) model, a class of low-rank representation learning methods, has proven effective at extracting node-to-node interaction patterns from High-dimensional and Incomplete (HDI) data. However, its optimization is notoriously difficult due to its bilinear and non-convex nature. Sharpness-aware Minimization (SAM) has recently been proposed to find flat local minima when minimizing non-convex objectives, thereby improving the generalization of representation-learning models. To address this challenge, we propose a Sharpness-aware SLF (SSLF) model. SSLF embodies two key ideas: (1) acquiring second-order information via Hessian-vector products; and (2) injecting a sharpness term into the curvature (Hessian) through the designed Hessian-vector products. Experiments on multiple industrial datasets demonstrate that the proposed model consistently outperforms state-of-the-art baselines.
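The Hessian-vector products mentioned in the abstract can be estimated without ever forming the full Hessian, for example with a central finite difference of the gradient: H(w)·v ≈ (∇L(w + h·v) − ∇L(w − h·v)) / (2h). A minimal sketch on an assumed quadratic loss (not the paper's SLF objective), whose Hessian [[2, 1], [1, 6]] makes the result easy to check by hand:

```python
def grad(w):
    # gradient of the assumed quadratic L(w) = w0**2 + 3*w1**2 + w0*w1
    return [2 * w[0] + w[1], 6 * w[1] + w[0]]

def hvp(grad_fn, w, v, h=1e-4):
    # central finite difference of the gradient along direction v:
    # H(w) @ v ~ (grad(w + h*v) - grad(w - h*v)) / (2*h)
    wp = [wi + h * vi for wi, vi in zip(w, v)]
    wm = [wi - h * vi for wi, vi in zip(w, v)]
    gp, gm = grad_fn(wp), grad_fn(wm)
    return [(a - b) / (2 * h) for a, b in zip(gp, gm)]

# probing with v = [1, 0] extracts the first column of the Hessian
Hv = hvp(grad, [0.5, -0.2], [1.0, 0.0])  # ~ [2.0, 1.0]
```

In autodiff frameworks the same quantity is obtained exactly via a double backward pass, at roughly the cost of two gradient evaluations per product.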
Problem

Research questions and friction points this paper is trying to address.

Optimizes second-order latent factor models for high-dimensional incomplete data
Addresses non-convex optimization difficulty via sharpness-aware minimization
Improves generalization by injecting sharpness into Hessian-based curvature
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses Hessian-vector products for second-order information
Injects sharpness term into curvature via Hessian-vector products
Applies Sharpness-aware Minimization to find flat minima
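One way the bullets above could fit together is to probe curvature at the SAM-perturbed point rather than at the current iterate, so the measured Hessian reflects the sharpness of the surrounding region. This is a 1-D toy sketch under an assumed loss and assumed hyperparameters, not the paper's actual SSLF update:

```python
def grad(w):
    # gradient of the toy loss f(w) = w**4 - w**2
    return 4 * w**3 - 2 * w

def sharpness_aware_curvature(w, rho=0.1, h=1e-4):
    g = grad(w)
    # SAM-style worst-case perturbation (sign of the gradient in 1-D)
    eps = rho * g / (abs(g) + 1e-12)
    # second derivative at the perturbed point via a finite-difference
    # Hessian-vector product (direction v = 1 in one dimension)
    return (grad(w + eps + h) - grad(w + eps - h)) / (2 * h)

# at w = 0.8 the perturbed point is 0.9, where f''(w) = 12*w**2 - 2 = 7.72
c = sharpness_aware_curvature(0.8)
```

A sharp iterate thus reports larger curvature than its unperturbed neighborhood would, which is the signal a curvature-informed optimizer can regularize against.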
Jialiang Wang
Research Scientist, Meta AI
Computer Vision, Generative AI
Xueyan Bao
College of Computer and Information Science, Southwest University, Chongqing, China
Hao Wu
College of Computer and Information Science, Southwest University, Chongqing, China