Guaranteed Noisy CP Tensor Recovery via Riemannian Optimization on the Segre Manifold

📅 2025-10-01
📈 Citations: 0
Influential: 0
🤖 AI Summary
This paper addresses low-rank tensor recovery from noisy linear measurements, aiming to improve accuracy and efficiency in high-dimensional data analysis tasks such as tensor PCA and tensor regression. The authors propose a Riemannian optimization framework built on the Segre manifold—the set of rank-one tensors—which naturally encodes the low-canonical-polyadic (CP) rank constraint through the manifold's geometric structure, ensuring that all iterates remain feasible. To the authors' knowledge, this is the first work to design and analyze Riemannian gradient descent (RGD) and Riemannian Gauss–Newton (RGN) algorithms for this problem with rigorous convergence guarantees: RGD achieves local linear convergence, while RGN attains local quadratic convergence when initialized sufficiently close to the ground truth. Experiments demonstrate that both methods are stable and efficient across diverse noise levels and achieve statistically optimal estimation rates.

📝 Abstract
Recovering a low-CP-rank tensor from noisy linear measurements is a central challenge in high-dimensional data analysis, with applications spanning tensor PCA, tensor regression, and beyond. We exploit the intrinsic geometry of rank-one tensors by casting the recovery task as an optimization problem over the Segre manifold, the smooth Riemannian manifold of rank-one tensors. This geometric viewpoint yields two powerful algorithms: Riemannian Gradient Descent (RGD) and Riemannian Gauss-Newton (RGN), each of which preserves feasibility at every iteration. Under mild noise assumptions, we prove that RGD converges at a local linear rate, while RGN exhibits an initial local quadratic convergence phase that transitions to a linear rate as the iterates approach the statistical noise floor. Extensive synthetic experiments validate these convergence guarantees and demonstrate the practical effectiveness of our methods.
Problem

Research questions and friction points this paper is trying to address.

Recovering low-CP-rank tensors from noisy linear measurements
Solving tensor recovery via Riemannian optimization on Segre manifold
Developing algorithms with guaranteed convergence under noise conditions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Optimization on Segre manifold for tensor recovery
Riemannian Gradient Descent with linear convergence rate
Riemannian Gauss-Newton with quadratic-linear convergence transition
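To make the geometric idea above concrete, here is a minimal, hypothetical sketch of gradient descent for rank-one tensor recovery in the factor parametrization of the Segre manifold. This is not the paper's algorithm: it handles only the rank-one case, and updating the factors (rather than projecting onto a tangent space and retracting) is a simplification that nonetheless keeps every iterate exactly rank one, mirroring the feasibility property the paper emphasizes. All names, dimensions, and the toy measurement ensemble are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rank1(a, b, c):
    """Form the rank-one (Segre) tensor a ⊗ b ⊗ c."""
    return np.einsum('i,j,k->ijk', a, b, c)

def gd_rank1(y, A, a, b, c, step=0.2, iters=500):
    """Gradient descent on the factors of a rank-one tensor.

    Minimizes f(a, b, c) = (1/2m) * sum_i (<A_i, a⊗b⊗c> - y_i)^2.
    Because the iterate is always rebuilt as a ⊗ b ⊗ c, it stays on
    the Segre variety, playing the role of a retraction in this sketch.
    """
    m = len(y)
    for _ in range(iters):
        X = rank1(a, b, c)
        r = np.einsum('mijk,ijk->m', A, X) - y          # residuals <A_i, X> - y_i
        ga = np.einsum('mijk,m,j,k->i', A, r, b, c) / m  # partial gradient in a
        gb = np.einsum('mijk,m,i,k->j', A, r, a, c) / m  # partial gradient in b
        gc = np.einsum('mijk,m,i,j->k', A, r, a, b) / m  # partial gradient in c
        a, b, c = a - step * ga, b - step * gb, c - step * gc
    return a, b, c

# Toy instance (assumed, not from the paper): recover a 5x5x5 rank-one
# tensor from m = 300 noisy Gaussian linear measurements.
n, m = 5, 300
a_true = rng.standard_normal(n); a_true /= np.linalg.norm(a_true)
b_true = rng.standard_normal(n); b_true /= np.linalg.norm(b_true)
c_true = rng.standard_normal(n); c_true /= np.linalg.norm(c_true)
T_true = rank1(a_true, b_true, c_true)

A = rng.standard_normal((m, n, n, n))
y = np.einsum('mijk,ijk->m', A, T_true) + 1e-3 * rng.standard_normal(m)

# The guarantees in the paper are local, so we start near the truth.
a0 = a_true + 0.1 * rng.standard_normal(n)
b0 = b_true + 0.1 * rng.standard_normal(n)
c0 = c_true + 0.1 * rng.standard_normal(n)
err0 = np.linalg.norm(rank1(a0, b0, c0) - T_true)

a, b, c = gd_rank1(y, A, a0, b0, c0)
err = np.linalg.norm(rank1(a, b, c) - T_true)
```

With a good initialization the error contracts toward the noise floor, which is the qualitative behavior the linear-convergence guarantee for RGD describes; the paper's RGN variant would instead apply a Gauss–Newton step in the tangent space for locally quadratic progress.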
Ke Xu
Department of Applied and Computational Mathematics and Statistics, University of Notre Dame
Yuefeng Han
University of Notre Dame
Tensor Learning · Stochastic Optimization · High-dimensional Statistics · Time Series · Deep Learning