Riemannian Optimization on Tree Tensor Networks with Application in Machine Learning

📅 2025-07-29
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Optimizing tree tensor networks (TTNs) for low-rank approximation and quantum many-body simulation remains challenging due to their non-Euclidean parameter space and complex interdependencies among tensors. Method: This work first systematically characterizes the quotient manifold geometry inherent in TTNs; based on this, it develops tailored Riemannian first- and second-order optimization algorithms and introduces an efficient gradient computation framework leveraging the quotient structure. Furthermore, it proposes an end-to-end differentiable backpropagation mechanism for kernel learning with TTNs, enabling fully differentiable tensor network modeling. Contribution/Results: Experiments on benchmark machine learning tasks demonstrate that the proposed methods significantly improve convergence speed and numerical stability compared to conventional Euclidean or heuristic optimization strategies. They further exhibit enhanced robustness and generalization capability. This work establishes a new paradigm for interpretable machine learning and quantum-inspired modeling using tensor networks.

Technology Category

Application Category

📝 Abstract
Tree tensor networks (TTNs) are widely used in low-rank approximation and quantum many-body simulation. In this work, we present a formal analysis of the differential geometry underlying TTNs. Building on this foundation, we develop efficient first- and second-order optimization algorithms that exploit the intrinsic quotient structure of TTNs. Additionally, we devise a backpropagation algorithm for training TTNs in a kernel learning setting. We validate our methods through numerical experiments on a representative machine learning task.
Problem

Research questions and friction points this paper is trying to address.

Analyzing differential geometry of tree tensor networks
Developing efficient optimization algorithms for TTNs
Training TTNs in kernel learning via backpropagation
Innovation

Methods, ideas, or system contributions that make the work stand out.

Riemannian optimization on tree tensor networks
Exploiting quotient structure for efficient algorithms
Backpropagation for TTN kernel learning
🔎 Similar Papers
No similar papers found.
M
Marius Willner
Chair of Mathematical Data Science, University of Augsburg, Augsburg, Germany
M
Marco Trenti
Tensor AI Solutions GmbH, Pfaffenhofen a.d. Roth, Germany
Dirk Lebiedz
Dirk Lebiedz
Professor for Mathematics, Ulm University, Germany
Scientific ComputingModelingSimulation and Optimization