Multi-Fidelity Prediction and Uncertainty Quantification with Laplace Neural Operators for Parametric Partial Differential Equations

📅 2025-02-01
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Addressing the dual challenges of scarce high-fidelity data and the trade-off between prediction accuracy and uncertainty quantification in surrogate modeling of parametric partial differential equations (pPDEs), this paper proposes the Multi-Fidelity Laplace Neural Operator (MF-LNO) framework. MF-LNO fuses low- and high-fidelity data via a dynamically weighted correction mechanism to improve generalization, and introduces a modified replica-exchange stochastic gradient Langevin dynamics (RE-SGLD) algorithm for efficient Bayesian posterior approximation directly in function space. Across four canonical dynamical systems governed by pPDEs, MF-LNO reduces prediction error by 40–80% relative to single-fidelity baselines, improves data efficiency, and yields more reliable uncertainty estimates. By jointly advancing multi-fidelity learning and Bayesian neural operators, MF-LNO offers a path toward trustworthy, uncertainty-aware surrogate modeling of pPDEs.
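To picture the replica-exchange sampling idea mentioned above, here is a minimal sketch on a toy 1-D bimodal energy. Everything in it (the energy function, the two temperatures, the step size, and attempting a swap every step) is a generic illustrative assumption, not the paper's modified RE-SGLD variant:

```python
import numpy as np

rng = np.random.default_rng(1)

def U(theta):
    # Toy energy (negative log-posterior) with two modes at +/-2.
    return 0.25 * (theta ** 2 - 4.0) ** 2

def grad_U(theta):
    return (theta ** 2 - 4.0) * theta

lr = 1e-3
T_low, T_high = 1.0, 10.0        # exploitation / exploration replicas
temps = np.array([T_low, T_high])
theta = np.array([-2.0, 2.0])    # one state per replica

samples = []
for step in range(5000):
    # Langevin update for each replica at its own temperature.
    noise = rng.standard_normal(2)
    theta = theta - lr * grad_U(theta) + np.sqrt(2.0 * lr * temps) * noise

    # Metropolis-style swap between the two temperatures: hot-chain states
    # that found low-energy regions get handed to the cold chain.
    log_ratio = (1.0 / T_low - 1.0 / T_high) * (U(theta[0]) - U(theta[1]))
    if np.log(rng.uniform()) < log_ratio:
        theta = theta[::-1].copy()

    samples.append(theta[0])  # keep the low-temperature chain as posterior draws

samples = np.array(samples[1000:])  # discard burn-in
print("low-T chain visited both modes:",
      np.any(samples > 1), np.any(samples < -1))
```

The point of the exchange step is that the high-temperature replica crosses energy barriers easily, and swaps let the low-temperature replica inherit those crossings, so the posterior draws cover both modes instead of sticking to one.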

📝 Abstract
Laplace Neural Operators (LNOs) have recently emerged as a promising approach in scientific machine learning due to their ability to learn nonlinear maps between function spaces. However, this framework often requires substantial amounts of high-fidelity (HF) training data, which are prohibitively expensive to acquire. To address this, we propose multi-fidelity Laplace Neural Operators (MF-LNOs), which combine a low-fidelity (LF) base model with parallel linear/nonlinear HF correctors and dynamic inter-fidelity weighting. This allows us to exploit correlations between LF and HF datasets and achieve accurate inference of quantities of interest even with sparse HF data. We further incorporate a modified replica exchange stochastic gradient Langevin algorithm, which enables more effective posterior distribution estimation and uncertainty quantification in model predictions. Extensive validation across four canonical dynamical systems (the Lorenz system, Duffing oscillator, Burgers equation, and Brusselator reaction-diffusion system) demonstrates the framework's effectiveness. The results show significant improvements, with testing losses reduced by 40% to 80% compared to traditional approaches. This validates MF-LNO as a versatile tool for surrogate modeling in parametric PDEs, offering significant improvements in data efficiency and uncertainty-aware prediction.
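As one way to picture the LF-base-plus-correctors design described in the abstract, the sketch below fits a scalar toy problem. The stand-in LF/HF models, the linear and cubic corrector forms, and the sigmoid-parameterized weight are all assumptions for illustration, not the paper's operator-valued architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

def lf_model(x):
    # Stand-in low-fidelity base model: a cheap, coarse approximation.
    return np.sin(x)

def hf_truth(x):
    # Stand-in high-fidelity ground truth, correlated with the LF output.
    return 1.1 * np.sin(x) + 0.2 * np.sin(x) ** 3

# Parallel correctors acting on the LF prediction.
def linear_corrector(y_lf, a, b):
    return a * y_lf + b

def nonlinear_corrector(y_lf, c):
    return c * y_lf ** 3

def predict(x, theta):
    # Dynamic inter-fidelity weight in (0, 1) via a sigmoid of a learnable scalar.
    a, b, c, alpha = theta
    w = 1.0 / (1.0 + np.exp(-alpha))
    y_lf = lf_model(x)
    return w * linear_corrector(y_lf, a, b) + (1 - w) * nonlinear_corrector(y_lf, c)

# Only the few corrector/weight parameters are fit on sparse HF data,
# here with crude finite-difference gradient descent (illustration only).
x_hf = rng.uniform(-np.pi, np.pi, 16)
y_hf = hf_truth(x_hf)

def loss(theta):
    return np.mean((predict(x_hf, theta) - y_hf) ** 2)

theta = np.zeros(4)
for _ in range(2000):
    grad = np.array([(loss(theta + 1e-5 * e) - loss(theta - 1e-5 * e)) / 2e-5
                     for e in np.eye(4)])
    theta -= 0.1 * grad

print(f"final MSE on sparse HF data: {loss(theta):.4f}")
```

The design intuition carried over from the abstract: the LF model supplies the overall shape cheaply, so only a small correction map (here four scalars) must be learned from the sparse HF samples.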
Problem

Research questions and friction points this paper is trying to address.

Laplace Neural Operator
Parametric Partial Differential Equations
High-Cost Training Data
Innovation

Methods, ideas, or system contributions that make the work stand out.

MF-LNOs
Uncertainty Quantification
Data Efficiency