PINN Balls: Scaling Second-Order Methods for PINNs with Domain Decomposition and Adaptive Sampling

📅 2025-10-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
While second-order optimization enhances accuracy in Physics-Informed Neural Networks (PINNs), its high memory footprint and poor scalability hinder practical deployment. Method: the authors propose PINN Balls, a framework featuring a learnable domain decomposition and a sparse-coding-based Mixture of Experts (MoE) architecture for efficient, localized modeling of physical fields. It is integrated with Adversarial Adaptive Sampling (AAS) to improve training coverage in challenging regions and supports distributed second-order optimization. Contribution/Results: PINN Balls preserves theoretical rigor in solving PDEs while substantially reducing memory consumption and improving training scalability and computational efficiency. Experiments across diverse scientific machine learning benchmarks demonstrate state-of-the-art (SOTA) accuracy and overcome the scalability bottleneck that has historically limited large-scale adoption of second-order methods in PINNs.
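The localized MoE idea described above can be illustrated with a minimal sketch: each expert models the solution only inside a soft "ball" around a learnable center, and a distance-based gate blends the local predictions. The Gaussian gating, the centers/widths, and the toy experts here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

def gaussian_gate(x, centers, widths):
    """Soft assignment of points x to local 'balls' (hypothetical gating).

    x: (n, d) collocation points; centers: (k, d); widths: (k,)
    Returns (n, k) gate weights that sum to 1 over the experts.
    """
    # squared distance of each point to each ball center: (n, k)
    d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    logits = -d2 / (2.0 * widths ** 2)
    # softmax over experts, stabilized by subtracting the row max
    w = np.exp(logits - logits.max(axis=1, keepdims=True))
    return w / w.sum(axis=1, keepdims=True)

def moe_predict(x, centers, widths, experts):
    """Gate-weighted sum of local expert predictions."""
    gates = gaussian_gate(x, centers, widths)          # (n, k)
    preds = np.stack([f(x) for f in experts], axis=1)  # (n, k)
    return (gates * preds).sum(axis=1)

# toy 1-D example: two experts, each a simple local surrogate model
centers = np.array([[0.25], [0.75]])
widths = np.array([0.2, 0.2])
experts = [lambda x: np.sin(np.pi * x[:, 0]),
           lambda x: 1.0 - x[:, 0]]
x = np.linspace(0.0, 1.0, 5)[:, None]
u = moe_predict(x, centers, widths, experts)
```

Because each expert only needs to fit its local region, the experts can stay small, which is what makes memory-hungry second-order updates tractable per expert.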

📝 Abstract
Recent advances in Scientific Machine Learning have shown that second-order methods can enhance the training of Physics-Informed Neural Networks (PINNs), making them a suitable alternative to traditional numerical methods for Partial Differential Equations (PDEs). However, second-order methods induce large memory requirements, making them scale poorly with the model size. In this paper, we define a local Mixture of Experts (MoE) combining the parameter-efficiency of ensemble models and sparse coding to enable the use of second-order training. Our model -- PINN Balls -- also features a fully learnable domain decomposition structure, achieved through the use of Adversarial Adaptive Sampling (AAS), which adapts the DD to the PDE and its domain. PINN Balls achieves better accuracy than the state-of-the-art in scientific machine learning, while maintaining invaluable scalability properties and drawing from a sound theoretical background.
Problem

Research questions and friction points this paper is trying to address.

Scaling second-order training methods for PINNs
Reducing memory requirements of large PINN models
Developing adaptive domain decomposition for PDE solutions
Innovation

Methods, ideas, or system contributions that make the work stand out.

Mixture of Experts enables second-order training
Learnable domain decomposition adapts to PDE
Adversarial Adaptive Sampling optimizes domain structure
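The adaptive-sampling point above can be sketched with a simple residual-weighted resampler: score candidate collocation points by the magnitude of the PDE residual and redraw the training set proportionally, so points concentrate where the PDE is violated most. The ODE, candidate solution, and finite-difference residual below are illustrative assumptions, not the paper's AAS procedure (which is adversarial rather than importance-based).

```python
import numpy as np

rng = np.random.default_rng(0)

def pde_residual(u, x):
    """Residual magnitude of u'' + pi^2 * u = 0 for a candidate u,
    via finite differences on a uniform grid (interior points only)."""
    h = x[1] - x[0]
    upp = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h ** 2
    return np.abs(upp + np.pi ** 2 * u[1:-1])

# uniform candidate pool of collocation points
x = np.linspace(0.0, 1.0, 201)
# candidate solution with a deliberate error term growing toward x = 1
u = np.sin(np.pi * x) + 0.1 * x ** 2
r = pde_residual(u, x)

# resample collocation points proportionally to residual magnitude,
# concentrating training effort where the PDE is violated most
p = r / r.sum()
idx = rng.choice(len(r), size=50, p=p)
new_points = x[1:-1][idx]
```

With the error term above, the residual grows toward x = 1, so the resampled points cluster in that region, mimicking how adaptive sampling steers the domain decomposition toward difficult parts of the domain.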
Andrea Bonfanti
BMW AG, Digital Campus Munich, Basque Center for Applied Mathematics, University of the Basque Country
Ismael Medina
University of Göttingen, Campus Institute Data Science
Roman List
BMW AG, Digital Campus Munich
Björn Staeves
BMW AG, Digital Campus Munich
Roberto Santana
Intelligent Systems Group ISG, University of the Basque Country UPV/EHU
Estimation of Distribution Algorithms, Evolutionary Computation, Probabilistic Graphical Models, Machine Learning
Marco Ellero
Basque Center for Applied Mathematics, CFD Group, IKERBASQUE, Basque Foundation for Science, Swansea University, Complex Fluids Research Group