Learning Memory Kernels in Generalized Langevin Equations

📅 2024-02-18
🏛️ arXiv.org
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the nonparametric learning of memory kernels in generalized Langevin equations (GLEs). The authors propose a framework integrating regularized Prony analysis with Sobolev-norm regression. Unlike conventional L²-based regression or inverse-Laplace-transform estimation, the method formulates an RKHS-regularized loss functional in an exponentially weighted L² space, yielding theoretical guarantees on both approximation error control and generalization performance. Numerical experiments, benchmarked against L²-loss regression estimators and an inverse Laplace estimator, show that the proposed approach achieves significantly improved memory kernel reconstruction accuracy, is robust to the choice of weight parameter, and naturally accommodates joint modeling of force and drift terms. The framework thus offers a data-driven approach to non-Markovian stochastic dynamics that combines rigorous theoretical foundations with practical applicability.
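As a rough illustration of the first step, not the authors' implementation: a classical Prony (linear-prediction) fit represents a sampled correlation function as a sum of exponentials, with least-squares truncation (`rcond`) standing in for regularization. All names and parameters below are illustrative.

```python
import numpy as np

def prony_exponents(c, dt, n_modes, rcond=1e-8):
    """Fit c[j] ≈ sum_k w_k * exp(lam_k * j * dt) from uniform samples c.

    Classical Prony via linear prediction; the lstsq truncation
    threshold rcond acts as a simple regularizer.
    """
    N, p = len(c), n_modes
    # Linear prediction: c[j] = sum_{m=1..p} a_m * c[j-m]
    A = np.column_stack([c[p - m : N - m] for m in range(1, p + 1)])
    b = c[p:N]
    a, *_ = np.linalg.lstsq(A, b, rcond=rcond)
    # Roots z_k of the characteristic polynomial give z_k = exp(lam_k * dt)
    z = np.roots(np.concatenate(([1.0], -a)))
    lam = np.log(z.astype(complex)) / dt
    # Recover the weights from the (generalized) Vandermonde system
    V = np.exp(np.outer(np.arange(N) * dt, lam))
    w, *_ = np.linalg.lstsq(V, c.astype(complex), rcond=rcond)
    return lam, w
```

For noisy trajectory-estimated correlations, the truncation level and the number of modes become the key regularization choices.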

📝 Abstract
We introduce a novel approach for learning memory kernels in Generalized Langevin Equations. The approach first uses a regularized Prony method to estimate correlation functions from trajectory data, followed by regression over a Sobolev norm-based loss function with RKHS regularization. Our method guarantees improved performance within an exponentially weighted L² space, with the kernel estimation error controlled by the error in the estimated correlation functions. We demonstrate the superiority of our estimator over other regression estimators that rely on L² loss functions, as well as an estimator derived from the inverse Laplace transform, using numerical examples that highlight its consistent advantage across various weight parameter selections. Additionally, we provide examples that include the application of force and drift terms in the equation.
Problem

Research questions and friction points this paper is trying to address.

Estimating memory kernels in Generalized Langevin Equations
Improving kernel estimation via Sobolev norm-based regression
Comparing performance against L² loss and Laplace transform methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Uses regularized Prony method for correlation estimation
Employs Sobolev norm-based loss with RKHS regularization
Ensures kernel error control via correlation function accuracy
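The second ingredient, RKHS-regularized regression under an exponential weight, can be sketched as weighted kernel ridge regression. This is a generic stand-in for the paper's estimator, not its actual loss; the Gaussian kernel, the weight `exp(-a*t)`, and all parameter names are assumptions for illustration.

```python
import numpy as np

def weighted_krr(t, y, a=1.0, lam=1e-6, scale=0.5):
    """Kernel ridge regression minimizing
        sum_i exp(-a*t_i) * (f(t_i) - y_i)^2 + lam * ||f||_RKHS^2
    with a Gaussian (RBF) kernel; a toy analogue of exponentially
    weighted L^2 fitting with RKHS regularization.
    """
    # Gram matrix of the RBF kernel on the sample points
    K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * scale**2))
    W = np.exp(-a * t)  # exponential weights
    # Representer theorem: f = sum_j alpha_j k(., t_j);
    # stationarity gives (W K + lam I) alpha = W y
    alpha = np.linalg.solve(W[:, None] * K + lam * np.eye(len(t)), W * y)

    def predict(s):
        Ks = np.exp(-(np.asarray(s)[:, None] - t[None, :]) ** 2 / (2 * scale**2))
        return Ks @ alpha

    return predict
```

The exponential weight downweights the fit at large times, where trajectory-based correlation estimates are noisiest, which is consistent with the paper's reported robustness to the weight-parameter choice.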
Authors

Quanjun Lang, Jianfeng Lu