A Structure-Preserving Kernel Method for Learning Hamiltonian Systems

📅 2024-03-15
🏛️ arXiv.org
📈 Citations: 1
Influential: 0
📄 PDF
🤖 AI Summary
This paper addresses the inverse problem of recovering a nonlinear Hamiltonian function from noisy Hamiltonian vector field data. We propose a symplectic-structure-preserving kernel ridge regression method. Our key contributions are: (i) establishing, for the first time, a theoretical framework for kernel regression under gradient-based loss, introducing the notion of *differential reproducibility* and its associated representer theorem; (ii) proving the equivalence between the structure-preserving kernel estimator and the posterior mean of a Gaussian process; and (iii) constructing a differential reproducing kernel Hilbert space, incorporating adaptive regularization and deriving optimal convergence rates under both fixed and adaptive regularization schemes. Both theoretical analysis and numerical experiments demonstrate that the proposed method significantly outperforms existing approaches in terms of recovery accuracy and symplectic structure fidelity.

Technology Category

Application Category

📝 Abstract
A structure-preserving kernel ridge regression method is presented that allows the recovery of nonlinear Hamiltonian functions out of datasets made of noisy observations of Hamiltonian vector fields. The method proposes a closed-form solution that yields excellent numerical performances that surpass other techniques proposed in the literature in this setup. From the methodological point of view, the paper extends kernel regression methods to problems in which loss functions involving linear functions of gradients are required and, in particular, a differential reproducing property and a Representer Theorem are proved in this context. The relation between the structure-preserving kernel estimator and the Gaussian posterior mean estimator is analyzed. A full error analysis is conducted that provides convergence rates using fixed and adaptive regularization parameters. The good performance of the proposed estimator together with the convergence rate is illustrated with various numerical experiments.
Problem

Research questions and friction points this paper is trying to address.

Recovering nonlinear Hamiltonian functions from noisy data
Extending kernel regression for gradient-based loss functions
Analyzing error convergence rates with regularization parameters
Innovation

Methods, ideas, or system contributions that make the work stand out.

Structure-preserving kernel ridge regression method
Closed-form solution for Hamiltonian recovery
Differential reproducing property and Representer Theorem
🔎 Similar Papers
J
Jianyu Hu
Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore
Juan-Pablo Ortega
Juan-Pablo Ortega
Nanyang Technological University
Learning of dynamic processesgeometric mechanics
D
Daiying Yin
Division of Mathematical Sciences, School of Physical and Mathematical Sciences, Nanyang Technological University, Singapore