🤖 AI Summary
To address the challenge of modeling global interactions in Lagrangian fluid simulation, this paper proposes a particle-mesh dual-domain graph neural network architecture, CORGI (Convolutional Residual Global Interactions). Building upon a GNN backbone (e.g., GNS), it incorporates a lightweight Eulerian mesh component that aggregates global contextual information via convolutional operations, while enabling cross-domain feature interaction through particle-mesh projection and residual connections. This design significantly expands the receptive field with negligible computational overhead. Experiments on standard fluid benchmarks demonstrate that the method improves rollout accuracy by 57% over GNS with only a 13% increase in inference time; compared to SEGNN, it achieves 49% higher accuracy while reducing training and inference time by 30% and 48%, respectively. Under equivalent computational budgets, it attains an average performance gain of 47% over GNS. The core contribution is a low-overhead, high-efficiency paradigm for modeling global interactions in particle-based fluid simulation.
📝 Abstract
Partial differential equations (PDEs) are central to dynamical systems modeling, particularly in hydrodynamics, where traditional solvers often struggle with nonlinearity and computational cost. Lagrangian neural surrogates such as GNS and SEGNN have emerged as strong alternatives by learning from particle-based simulations. However, these models typically operate with limited receptive fields, which limits their ability to capture the inherently global interactions in fluid flows. Motivated by this observation, we introduce Convolutional Residual Global Interactions (CORGI), a hybrid architecture that augments any GNN-based solver with a lightweight Eulerian component for global context aggregation. By projecting particle features onto a grid, applying convolutional updates, and mapping them back to the particle domain, CORGI captures long-range dependencies without significant overhead. When applied to a GNS backbone, CORGI achieves a 57% improvement in rollout accuracy with only 13% more inference time and 31% more training time. Compared to SEGNN, CORGI improves accuracy by 49% while reducing inference time by 48% and training time by 30%. Even under identical runtime constraints, CORGI outperforms GNS by 47% on average, highlighting its versatility and performance across varied compute budgets.
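The abstract's particle-to-grid / convolve / grid-to-particle residual pathway can be sketched in NumPy. This is a minimal illustration under assumptions not stated in the abstract: the actual CORGI projection and convolution are presumably learned, whereas here the scatter is nearest-cell mean pooling, the "convolution" is a fixed 3x3 box filter, and all function names (`particles_to_grid`, `corgi_block`, etc.) are hypothetical.

```python
import numpy as np

def particles_to_grid(pos, feats, grid_size, domain=1.0):
    """Scatter particle features onto a 2D grid by nearest-cell mean pooling.

    pos:   (N, 2) particle positions in [0, domain)^2
    feats: (N, F) per-particle features
    Returns the (grid_size, grid_size, F) grid and each particle's flat cell index.
    """
    idx = np.clip((pos / domain * grid_size).astype(int), 0, grid_size - 1)
    flat = idx[:, 0] * grid_size + idx[:, 1]
    grid = np.zeros((grid_size * grid_size, feats.shape[1]))
    counts = np.zeros(grid_size * grid_size)
    np.add.at(grid, flat, feats)          # sum features per cell
    np.add.at(counts, flat, 1.0)          # particle count per cell
    grid /= np.maximum(counts, 1.0)[:, None]
    return grid.reshape(grid_size, grid_size, -1), flat

def conv_update(grid):
    """Stand-in for the learned Eulerian convolution: a 3x3 box filter
    built from shifted copies (periodic boundaries via np.roll)."""
    out = np.zeros_like(grid)
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            out += np.roll(np.roll(grid, dx, axis=0), dy, axis=1)
    return out / 9.0

def corgi_block(pos, feats, grid_size=8):
    """Particle -> grid -> convolution -> particle, merged residually,
    so each particle receives grid-scale context on top of its local features."""
    grid, flat = particles_to_grid(pos, feats, grid_size)
    ctx = conv_update(grid).reshape(grid_size * grid_size, -1)
    return feats + ctx[flat]              # residual connection in the particle domain
```

Even this toy version shows why the overhead is small: the grid has far fewer cells than the graph has edges, and the scatter/gather steps are linear in the particle count, while the convolution gives every particle a receptive field spanning the whole grid after a few layers.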