🤖 AI Summary
Traditional numerical solvers for parametric partial differential equations (PDEs) incur prohibitive computational costs in multi-query scenarios. Method: This paper proposes a unified framework for systematically comparing physics-informed neural networks (PINNs) and neural operators (including DeepONet and the Fourier neural operator) in their ability to learn solution mappings over infinite-dimensional function spaces. It integrates soft-constraint physics embedding with operator approximation theory to characterize the fundamental differences in their generalization mechanisms. Contribution/Results: The analysis yields theoretically grounded guidelines for method selection. Experiments on canonical fluid and solid mechanics tasks show that neural operators enable real-time inference and inverse-problem solving with speedups of 10³–10⁵× over conventional solvers at comparable accuracy, substantially improving the efficiency of parameter-space exploration.
📝 Abstract
Partial differential equations (PDEs) arise ubiquitously in science and engineering, where solutions depend on parameters (physical properties, boundary conditions, geometry). Traditional numerical methods require re-solving the PDE for each parameter, making parameter-space exploration prohibitively expensive. Recent machine learning advances, particularly physics-informed neural networks (PINNs) and neural operators, have transformed parametric PDE solving by learning solution operators that generalize across parameter spaces. We critically analyze two main paradigms: (1) PINNs, which embed physical laws as soft constraints and excel at inverse problems with sparse data, and (2) neural operators (e.g., DeepONet, Fourier Neural Operator), which learn mappings between infinite-dimensional function spaces and generalize across entire families of inputs. Through comparisons across fluid dynamics, solid mechanics, heat transfer, and electromagnetics, we show that neural operators can achieve speedups of $10^3$–$10^5\times$ over traditional solvers in multi-query scenarios while maintaining comparable accuracy. We provide practical guidance for method selection, discuss theoretical foundations (universal approximation, convergence), and identify critical open challenges: high-dimensional parameters, complex geometries, and out-of-distribution generalization. This work establishes a unified framework for understanding parametric PDE solvers via operator learning, offering a comprehensive, incrementally updated resource for this rapidly evolving field.
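To make the soft-constraint idea concrete, here is a minimal NumPy sketch (a toy of our own construction, not code from the paper): a linear sine ansatz stands in for the network on the 1D Poisson problem $-u'' = \pi^2 \sin(\pi x)$ with homogeneous Dirichlet boundary conditions, and the physics enters only through the mean squared PDE residual at collocation points, minimized by plain gradient descent.

```python
import numpy as np

# Toy problem: -u''(x) = f(x) on [0, 1], u(0) = u(1) = 0,
# with f = pi^2 sin(pi x); the exact solution is u = sin(pi x).
x = np.linspace(0.0, 1.0, 64)          # collocation points
f = np.pi**2 * np.sin(np.pi * x)

# Linear "network": u_theta(x) = sum_k theta_k sin(k pi x).
# The sine basis satisfies the boundary conditions by construction,
# so only the PDE residual enters the soft-constraint loss.
K = 3
k = np.arange(1, K + 1)
basis = np.sin(np.outer(x, k) * np.pi)  # u columns, shape (64, K)
d2 = -(k * np.pi)**2 * basis            # u'' columns (analytic derivative)

theta = np.zeros(K)
lr = 1e-4
for _ in range(5000):
    residual = -(d2 @ theta) - f                 # PDE residual at collocation points
    grad = 2.0 / len(x) * (-d2).T @ residual     # gradient of mean(residual^2)
    theta -= lr * grad

u_pred = basis @ theta
err = np.max(np.abs(u_pred - np.sin(np.pi * x)))
print(f"theta = {np.round(theta, 3)}, max error = {err:.2e}")
```

In a full PINN the ansatz is a deep network and the derivatives in the residual come from automatic differentiation; boundary conditions are typically added as a second penalty term rather than built into the basis as done here.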
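The operator-learning paradigm can likewise be illustrated without a deep learning framework. The sketch below is a heavily simplified toy of ours, not the paper's implementation: it learns the antiderivative operator $G(f)(y) = \int_0^y f(x)\,dx$ using a linear branch over sensor values and fixed random-Fourier trunk features, so that fitting the DeepONet-style factorization $u(y) \approx \sum_p b_p(f)\, t_p(y)$ reduces to ordinary least squares. Once fitted, evaluating a new input function is a single matrix contraction, which is where the multi-query speedups come from.

```python
import numpy as np

rng = np.random.default_rng(0)

# Operator to learn: the antiderivative G(f)(y) = integral of f from 0 to y.
# f is observed at m fixed sensor locations; u = G(f) is queried at points ys.
m, p = 20, 30                          # number of sensors, trunk feature count
xs = np.linspace(0.0, 1.0, m)          # sensor locations
ys = np.linspace(0.0, 1.0, 32)         # query locations

# Fixed random-Fourier trunk features t(y); the branch is kept linear in the
# sensor values, so the model is u(y) = f_sensors^T B t(y) and fitting B
# is ordinary least squares (a linear-branch simplification of DeepONet).
omega = rng.normal(0.0, 3.0 * np.pi, p)
phase = rng.uniform(0.0, 2.0 * np.pi, p)
trunk = np.cos(np.outer(ys, omega) + phase)      # shape (len(ys), p)

def sample_pair(rng):
    """Random input f(x) = a0 + a1 cos(pi x) + a2 sin(pi x), with exact G(f)."""
    a0, a1, a2 = rng.normal(size=3)
    f = a0 + a1 * np.cos(np.pi * xs) + a2 * np.sin(np.pi * xs)
    u = (a0 * ys + a1 * np.sin(np.pi * ys) / np.pi
         + a2 * (1.0 - np.cos(np.pi * ys)) / np.pi)
    return f, u

# Each (f_j, y_i) training pair contributes one least-squares row: the
# outer product of sensor values and trunk features, flattened.
rows, targets = [], []
for _ in range(100):
    f, u = sample_pair(rng)
    rows.append(np.einsum("s,ip->isp", f, trunk).reshape(len(ys), -1))
    targets.append(u)
A = np.vstack(rows)                    # (100 * len(ys), m * p)
b = np.concatenate(targets)
B = np.linalg.lstsq(A, b, rcond=None)[0].reshape(m, p)

# Inference on an unseen input function is a single contraction.
f_test, u_true = sample_pair(rng)
u_pred = trunk @ (B.T @ f_test)
rel_err = np.linalg.norm(u_pred - u_true) / np.linalg.norm(u_true)
print(f"relative L2 error on held-out input: {rel_err:.1e}")
```

A real DeepONet trains both branch and trunk networks jointly by gradient descent; the fixed-trunk, linear-branch simplification here preserves only the architectural factorization, which is the point being illustrated.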