PMNO: A novel physics guided multi-step neural operator predictor for partial differential equations

📅 2025-06-02
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
Neural operators suffer from poor extrapolation capability and training instability under limited data in long-term PDE forecasting, due to constrained representational capacity and strong data dependence. To address this, we propose the Physics-Guided Multi-step Neural Operator (PMNO). Methodologically, PMNO is the first to integrate BDF-based implicit time discretization with multi-step historical input modeling, and introduces a causal masking training strategy for end-to-end optimization; it further supports arbitrary spatial resolution generalization. Technically, it unifies FNO/DeepONet architectures, implicit backward propagation, and sequential causal modeling. Experiments demonstrate that PMNO significantly outperforms state-of-the-art methods on diverse benchmarks—including 2D linear systems, irregular domains, complex-valued wave equations, and reaction-diffusion processes—achieving high-accuracy, long-horizon, and cross-resolution predictions.

Technology Category

Application Category

📝 Abstract
Neural operators, which aim to approximate mappings between infinite-dimensional function spaces, have been widely applied in the simulation and prediction of physical systems. However, the limited representational capacity of network architectures, combined with their heavy reliance on large-scale data, often hinder effective training and result in poor extrapolation performance. In this paper, inspired by traditional numerical methods, we propose a novel physics guided multi-step neural operator (PMNO) architecture to address these challenges in long-horizon prediction of complex physical systems. Distinct from general operator learning methods, the PMNO framework replaces the single-step input with multi-step historical data in the forward pass and introduces an implicit time-stepping scheme based on the Backward Differentiation Formula (BDF) during backpropagation. This design not only strengthens the model's extrapolation capacity but also facilitates more efficient and stable training with fewer data samples, especially for long-term predictions. Meanwhile, a causal training strategy is employed to circumvent the need for multi-stage training and to ensure efficient end-to-end optimization. The neural operator architecture possesses resolution-invariant properties, enabling the trained model to perform fast extrapolation on arbitrary spatial resolutions. We demonstrate the superior predictive performance of PMNO predictor across a diverse range of physical systems, including 2D linear system, modeling over irregular domain, complex-valued wave dynamics, and reaction-diffusion processes. Depending on the specific problem setting, various neural operator architectures, including FNO, DeepONet, and their variants, can be seamlessly integrated into the PMNO framework.
Problem

Research questions and friction points this paper is trying to address.

Enhancing long-horizon prediction of complex physical systems
Reducing reliance on large-scale data for neural operators
Improving extrapolation capacity and training stability
Innovation

Methods, ideas, or system contributions that make the work stand out.

Multi-step historical data in forward pass
Implicit time-stepping with BDF scheme
Resolution-invariant neural operator architecture
🔎 Similar Papers
No similar papers found.
Jin Song
Jin Song
Academy of Mathematics and Systems Science, Chinese Academy of Sciences
Applied MathematicsDeep learningNonlinear Waves
Kenji Kawaguchi
Kenji Kawaguchi
Presidential Young Professor, National University of Singapore
LLMsLarge language modelDeep learningAI
Z
Zhenya Yan
School of Mathematics and Information Science, Zhongyuan University of Technology, Zhengzhou 450007, China; School of Mathematical Sciences, University of Chinese Academy of Sciences, Beijing 100049, China