🤖 AI Summary
This paper addresses the inverse problem of recovering a nonlinear Hamiltonian function from noisy observations of its Hamiltonian vector field. The authors propose a symplectic-structure-preserving kernel ridge regression method. The key contributions are: (i) establishing, for the first time, a theoretical framework for kernel regression under gradient-based losses, introducing a *differential reproducing property* and proving the associated representer theorem; (ii) proving the equivalence between the structure-preserving kernel estimator and the posterior mean of a Gaussian process; and (iii) carrying out a full error analysis in the associated differential reproducing kernel Hilbert space that yields optimal convergence rates under both fixed and adaptive regularization schemes. Both the theoretical analysis and numerical experiments show that the proposed method significantly outperforms existing approaches in recovery accuracy and in fidelity to the symplectic structure.
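For orientation, the gradient-based objective and the representer-theorem form of the estimator can be sketched as follows, in generic notation that is not taken verbatim from the paper (the exact statement, normalizations, and constants may differ):

```latex
% Sketch in generic notation, not the paper's exact statement.
% Data: samples (x_i, y_i) with y_i \approx J \nabla H(x_i) + noise,
% where J is the canonical symplectic matrix and \mathcal{H}_K is the
% RKHS of a smooth kernel K on 2d-dimensional phase space.
\hat{H} = \operatorname*{arg\,min}_{H \in \mathcal{H}_K}
  \frac{1}{N} \sum_{i=1}^{N} \left\| J \, \nabla H(x_i) - y_i \right\|^{2}
  + \lambda \, \| H \|_{\mathcal{H}_K}^{2},
\qquad
\hat{H}(x) = \sum_{i=1}^{N} \sum_{a=1}^{2d}
  c_{i,a} \, \partial_{x'_a} K(x, x') \Big|_{x' = x_i}
```

In this form the coefficients $c_{i,a}$ solve a finite linear system built from second mixed derivatives of $K$, which is what makes a closed-form solution possible.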
📝 Abstract
A structure-preserving kernel ridge regression method is presented that allows the recovery of nonlinear Hamiltonian functions from datasets of noisy observations of Hamiltonian vector fields. The method yields a closed-form solution whose numerical performance surpasses that of other techniques proposed in the literature for this setup. From the methodological point of view, the paper extends kernel regression to problems whose loss functions involve linear functions of gradients; in particular, a differential reproducing property and a representer theorem are proved in this context. The relation between the structure-preserving kernel estimator and the Gaussian process posterior mean estimator is analyzed. A full error analysis is conducted that provides convergence rates using both fixed and adaptive regularization parameters. The good performance of the proposed estimator and the stated convergence rates are illustrated with various numerical experiments. A minimal code sketch of the closed-form construction is given below.
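The following is a minimal sketch, not the authors' code, of how such a closed-form gradient-based kernel ridge estimator can be assembled. The Gaussian kernel, the harmonic-oscillator test Hamiltonian, the parameter values, and the helper name `H_hat` are all assumptions for illustration; the paper's normalizations may differ.

```python
# Sketch of structure-preserving kernel ridge regression from noisy
# Hamiltonian vector field data (assumed setup: Gaussian kernel,
# canonical symplectic matrix J, harmonic-oscillator ground truth).
import numpy as np

rng = np.random.default_rng(0)
d = 2                                     # phase-space dimension (q, p)
J = np.array([[0.0, 1.0], [-1.0, 0.0]])   # canonical symplectic matrix

# True Hamiltonian H(q,p) = (q^2 + p^2)/2, so X_H(x) = J @ grad H(x) = J @ x.
N = 200
X = rng.uniform(-2.0, 2.0, size=(N, d))             # sample points x_i
Y = X @ J.T + 0.05 * rng.standard_normal((N, d))    # noisy X_H(x_i)

# J is orthogonal, so ||J grad H - y|| = ||grad H - J^T y||: regress gradients.
Z = Y @ J                                 # rows z_i = J^T y_i

sigma = 1.0                               # kernel width (assumed value)
lam = 1e-5                                # ridge parameter; lam * N plays the
                                          # role of the GP noise variance

# Differential Gram matrix of second mixed kernel derivatives:
# d^2 k/(dx_a dx'_b) = k(x,x') * (delta_ab/s^2 - (x_a-x'_a)(x_b-x'_b)/s^4)
diff = X[:, None, :] - X[None, :, :]                         # (N, N, d)
k = np.exp(-np.sum(diff**2, axis=-1) / (2.0 * sigma**2))     # (N, N)
G = k[:, :, None, None] * (np.eye(d) / sigma**2
    - diff[:, :, :, None] * diff[:, :, None, :] / sigma**4)  # (N, N, d, d)
G = G.transpose(0, 2, 1, 3).reshape(N * d, N * d)

# Closed-form coefficients from the regularized linear system.
c = np.linalg.solve(G + lam * N * np.eye(N * d), Z.reshape(-1))

def H_hat(x):
    """Estimated Hamiltonian: linear combination of first kernel derivatives."""
    dx = x[None, :] - X                                      # (N, d)
    kx = np.exp(-np.sum(dx**2, axis=-1) / (2.0 * sigma**2))  # (N,)
    feats = kx[:, None] * dx / sigma**2   # d k(x,x')/dx'_b at x' = x_i
    return feats.reshape(-1) @ c

# H is identified only up to an additive constant; compare after centering.
grid = rng.uniform(-1.5, 1.5, size=(50, d))
est = np.array([H_hat(x) for x in grid])
true = 0.5 * np.sum(grid**2, axis=1)
print("max centered error:",
      np.max(np.abs((est - est.mean()) - (true - true.mean()))))
```

The substitution of `lam * N` for a noise variance in the linear system reflects the kernel-ridge/Gaussian-process correspondence analyzed in the paper: under that identification, the same coefficient vector defines the GP posterior mean.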