CORGI: GNNs with Convolutional Residual Global Interactions for Lagrangian Simulation

📅 2025-11-28
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the challenge of modeling global interactions in Lagrangian fluid simulation, this paper proposes a particle-mesh dual-domain graph neural network architecture. Building upon a GNN backbone (e.g., GNS), it incorporates a lightweight Eulerian mesh component that aggregates global contextual information via convolutional operations, while enabling cross-domain feature interaction through particle-mesh projection and residual connections. This design significantly expands the receptive field with negligible computational overhead. Experiments on standard fluid benchmarks demonstrate that our method improves rollout accuracy by 57% over GNS with only a 13% increase in inference time; compared to SEGNN, it achieves 49% higher accuracy while reducing training and inference time by 30% and 48%, respectively. Under equivalent computational budgets, it attains an average performance gain of 47% over GNS. The core contribution lies in a low-overhead, high-efficiency paradigm for modeling global interactions in particle-based fluid simulation.

📝 Abstract
Partial differential equations (PDEs) are central to dynamical systems modeling, particularly in hydrodynamics, where traditional solvers often struggle with nonlinearity and computational cost. Lagrangian neural surrogates such as GNS and SEGNN have emerged as strong alternatives by learning from particle-based simulations. However, these models typically operate with limited receptive fields, making them inaccurate for capturing the inherently global interactions in fluid flows. Motivated by this observation, we introduce Convolutional Residual Global Interactions (CORGI), a hybrid architecture that augments any GNN-based solver with a lightweight Eulerian component for global context aggregation. By projecting particle features onto a grid, applying convolutional updates, and mapping them back to the particle domain, CORGI captures long-range dependencies without significant overhead. When applied to a GNS backbone, CORGI achieves a 57% improvement in rollout accuracy with only 13% more inference time and 31% more training time. Compared to SEGNN, CORGI improves accuracy by 49% while reducing inference time by 48% and training time by 30%. Even under identical runtime constraints, CORGI outperforms GNS by 47% on average, highlighting its versatility and performance on varied compute budgets.
Problem

Research questions and friction points this paper is trying to address.

Capturing global fluid interactions in Lagrangian particle simulations.
Improving the accuracy of GNN-based PDE solvers without sacrificing efficiency.
Reducing computational cost while enhancing long-range dependency modeling.
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid GNN with Eulerian grid for global interactions
Project particle features to grid, apply convolution, map back
Improves accuracy significantly with minimal computational overhead
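The particle-to-grid-to-particle round trip described above can be sketched in a few lines. This is a minimal NumPy illustration under stated assumptions: the function name `corgi_global_pass`, the grid resolution, the averaging projection, and the fixed 3x3 smoothing kernel (standing in for the learned CNN in the actual model) are all illustrative choices, not details from the paper.

```python
import numpy as np

def corgi_global_pass(pos, feats, grid_size=8, domain=1.0):
    """Illustrative sketch of a particle->grid->particle global pass.

    `grid_size`, `domain`, and the fixed smoothing kernel are assumptions
    for illustration; the real model uses learned convolutions and
    learned projections.
    """
    n, d = feats.shape
    # 1) Project: average each particle's features into its enclosing grid cell.
    cells = np.clip((pos / domain * grid_size).astype(int), 0, grid_size - 1)
    idx = cells[:, 0] * grid_size + cells[:, 1]
    grid = np.zeros((grid_size * grid_size, d))
    counts = np.zeros(grid_size * grid_size)
    np.add.at(grid, idx, feats)
    np.add.at(counts, idx, 1.0)
    grid /= np.maximum(counts, 1.0)[:, None]
    grid = grid.reshape(grid_size, grid_size, d)
    # 2) Convolve on the grid: a 3x3 average with periodic wrap stands in for
    #    the learned CNN; each application widens the receptive field globally.
    smoothed = sum(np.roll(np.roll(grid, i, 0), j, 1)
                   for i in (-1, 0, 1) for j in (-1, 0, 1)) / 9.0
    # 3) Map back with a residual connection: each particle reads its cell's
    #    globally aggregated context and adds it to its local features.
    return feats + smoothed.reshape(-1, d)[idx]

rng = np.random.default_rng(0)
pos = rng.random((100, 2))       # particle positions in [0, 1)^2
feats = rng.standard_normal((100, 4))
out = corgi_global_pass(pos, feats)
```

The residual form in step 3 means the grid branch only has to learn a correction on top of the GNN backbone's local message passing, which is consistent with the "lightweight Eulerian component" framing above.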
Ethan Ji
Department of Computer Science, UCLA
Yuanzhou Chen
Department of Computer Science, UCLA
Arush Ramteke
Department of Computer Science, UCLA
Fang Sun
Department of Computer Science, UCLA
Tianrun Yu
Department of Computer Science, UCLA
Jai Parera
Department of Computer Science, UCLA
Wei Wang
Department of Computer Science, UCLA
Yizhou Sun
Professor, Computer Science, UCLA
Information Networks · Knowledge Graphs · Graph Neural Networks · Data Mining · Machine Learning