Predicting Change, Not States: An Alternate Framework for Neural PDE Surrogates

📅 2024-12-17
🏛️ Computer Methods in Applied Mechanics and Engineering
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the poor long-term stability and generalization of neural PDE surrogate models that directly predict system states. We propose a novel paradigm: predicting time derivatives followed by ODE integration. Methodologically, we build upon the neural ODE framework, jointly optimizing a physics-informed loss and a self-supervised temporal differencing regularization, enabling implicit differentiation and adaptive-step integration. Our key contribution lies in decoupling physical constraints from data fitting—shifting the learning objective from “predicting states” to “predicting dynamics”—thereby substantially mitigating error accumulation. Evaluated on canonical PDE benchmarks—including Burgers’ and Navier–Stokes equations—our approach achieves a 38% reduction in average relative error, improves extrapolation capability by 2.1×, and accelerates training convergence by 1.6×, while maintaining high accuracy, strong numerical stability, and flexible inference.
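The central training idea described above — learn the dynamics `du/dt` rather than the next state, supervised by a self-consistent temporal-differencing signal — can be illustrated with a minimal sketch. This is not the paper's implementation; the names `f_theta`, `temporal_difference_target`, and `derivative_loss` are hypothetical, and a scalar exponential-decay trajectory stands in for a PDE solution.

```python
import math

def temporal_difference_target(u_now, u_next, dt):
    """Self-supervised target: finite-difference estimate of du/dt
    built from two consecutive snapshots of the trajectory."""
    return (u_next - u_now) / dt

def derivative_loss(f_theta, traj, dt):
    """Mean squared error between the surrogate's predicted dynamics
    f_theta(u_t) and the differenced targets (u_{t+1} - u_t) / dt.
    The model is asked to match the *rate of change*, not the state."""
    errs = [
        (f_theta(traj[i]) - temporal_difference_target(traj[i], traj[i + 1], dt)) ** 2
        for i in range(len(traj) - 1)
    ]
    return sum(errs) / len(errs)

# Toy check: on a trajectory u(t) = exp(-t), the exact dynamics model
# f(u) = -u incurs only the O(dt) forward-difference error, so the
# loss is tiny; a wrong model is penalized heavily.
dt = 1e-3
traj = [math.exp(-i * dt) for i in range(1000)]
loss_exact = derivative_loss(lambda u: -u, traj, dt)
loss_wrong = derivative_loss(lambda u: +u, traj, dt)
```

A trained `f_theta` that drives this loss to zero has learned the vector field of the system, which is what makes the decoupling of "physics" from "data fitting" possible: physical residuals can be imposed on `f_theta` directly, independent of any particular time grid.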

Problem

Research questions and friction points this paper is trying to address.

Predict temporal derivatives instead of next state for PDEs
Improve neural surrogate accuracy and stability
Enable flexible time-stepping during inference
Innovation

Methods, ideas, or system contributions that make the work stand out.

Predict temporal derivatives instead of next states
Use ODE integrator for flexible time-stepping
Improve accuracy and stability under fine temporal discretization
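The flexible time-stepping point above follows from the design: once the surrogate outputs `du/dt`, the inference step size is a free parameter of the integrator rather than a property baked into the network. A minimal sketch with a classical fixed-step RK4 integrator (the paper uses adaptive-step integration; `rk4_step` and `rollout` are illustrative names, and the known dynamics `du/dt = -u` stands in for a learned model):

```python
def rk4_step(f, u, dt):
    """One classical fourth-order Runge-Kutta step using the learned
    derivative model f in place of an analytic right-hand side."""
    k1 = f(u)
    k2 = f(u + 0.5 * dt * k1)
    k3 = f(u + 0.5 * dt * k2)
    k4 = f(u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def rollout(f, u0, dt, steps):
    """Integrate the surrogate dynamics forward from u0 for `steps` steps.
    dt is chosen at inference time, unconstrained by the training grid."""
    u = u0
    for _ in range(steps):
        u = rk4_step(f, u, dt)
    return u

# Coarse and fine rollouts of du/dt = -u from u0 = 1 both land near
# exp(-1) at t = 1, despite using very different step sizes.
f = lambda u: -u
coarse = rollout(f, 1.0, 0.5, 2)     # 2 large steps to t = 1
fine = rollout(f, 1.0, 0.01, 100)    # 100 small steps to t = 1
```

A state-to-state surrogate, by contrast, is tied to the single `dt` it was trained on; here the same model serves coarse previews and fine, accurate rollouts alike.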
Anthony Zhou
PhD Candidate, Carnegie Mellon University
Scientific Machine Learning
A. Farimani
Department of Mechanical Engineering, Carnegie Mellon University, Pittsburgh, PA, USA; Machine Learning Department, Carnegie Mellon University, Pittsburgh, PA, USA.