Parameter-Efficient Conditioning for Material Generalization in Graph-Based Simulators

📅 2025-11-07
📈 Citations: 0
Influential: 0
🤖 AI Summary
Existing Graph Network Simulators (GNS) generalize well to unseen geometries but transfer poorly across material constitutive behaviors, which severely limits their engineering applicability. To address this, we propose a parameter-efficient conditional fine-tuning framework: motivated by the observation that material sensitivity concentrates in the early message-passing layers, we fine-tune only those initial layers and incorporate Feature-wise Linear Modulation (FiLM) for adaptive feature conditioning. The method adapts rapidly to novel materials from as few as 12 short trajectory samples. It delivers accurate long-horizon predictions across diverse physical systems (fluids, deformable bodies, and granular flows) for unseen, interpolated, and moderately extrapolated material parameters, reducing data requirements by 5x. Furthermore, we extend the framework to inverse problems, successfully identifying unknown cohesive strength parameters.
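The inverse-problem use case amounts to searching for the material parameter whose simulated rollout best matches observed trajectories. The sketch below illustrates that idea only in miniature: `rollout_error` is a synthetic quadratic stand-in for the trajectory-mismatch loss (the real loss would come from GNS rollouts), and the golden-section search and its bounds are illustrative choices, not the paper's method.

```python
def rollout_error(cohesion, true_cohesion=0.5):
    """Stand-in for simulated-vs-observed trajectory mismatch.
    In the actual framework this would compare GNS rollouts,
    conditioned on `cohesion`, against measured trajectories."""
    return (cohesion - true_cohesion) ** 2

def identify(loss, lo, hi, iters=60):
    """Golden-section search for the scalar parameter minimizing `loss`
    on [lo, hi]; assumes the loss is unimodal on that interval."""
    phi = (5 ** 0.5 - 1) / 2
    a, b = lo, hi
    for _ in range(iters):
        c = b - phi * (b - a)
        d = a + phi * (b - a)
        if loss(c) < loss(d):
            b = d  # minimum lies in [a, d]
        else:
            a = c  # minimum lies in [c, b]
    return (a + b) / 2

est = identify(rollout_error, 0.0, 1.0)
print(round(est, 3))  # 0.5
```

With the toy loss the search recovers the hidden cohesion value; in practice each loss evaluation would require a (differentiable or repeated) forward rollout of the conditioned simulator.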

📝 Abstract
Graph network-based simulators (GNS) have demonstrated strong potential for learning particle-based physics (such as fluids, deformable solids, and granular flows) while generalizing to unseen geometries due to their inherent inductive biases. However, existing models are typically trained for a single material type and fail to generalize across distinct constitutive behaviors, limiting their applicability in real-world engineering settings. Using granular flows as a running example, we propose a parameter-efficient conditioning mechanism that makes the GNS model adaptive to material parameters. We identify that sensitivity to material properties is concentrated in the early message-passing (MP) layers, a finding we link to the local nature of constitutive models (e.g., Mohr-Coulomb) and their effects on information propagation. We empirically validate this by showing that fine-tuning only the first few (1-5) of 10 MP layers of a pretrained model achieves test performance comparable to fine-tuning the entire network. Building on this insight, we propose a parameter-efficient Feature-wise Linear Modulation (FiLM) conditioning mechanism designed to specifically target these early layers. This approach produces accurate long-term rollouts on unseen, interpolated, or moderately extrapolated material parameter values (e.g., up to 2.5 degrees for friction angle and 0.25 kPa for cohesion) when trained on as few as 12 short simulation trajectories from new materials, a 5-fold data reduction compared to a baseline multi-task learning method. Finally, we validate the model's utility by applying it to an inverse problem, successfully identifying unknown cohesion parameters from trajectory data. This approach enables the use of GNS in inverse design and closed-loop control tasks where material properties are treated as design variables.
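FiLM conditioning, as described in the abstract, applies a per-channel affine transform to intermediate features, with the scale and shift predicted from the material parameters. A minimal NumPy sketch of that mechanism follows; the linear conditioning map, feature sizes, and the `[friction angle, cohesion]` encoding are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def film_params(material, W_g, b_g, W_b, b_b):
    """Hypothetical conditioning map: material parameters ->
    per-channel scale (gamma) and shift (beta). The paper would use
    a small learned network; a single linear layer stands in here."""
    gamma = material @ W_g + b_g
    beta = material @ W_b + b_b
    return gamma, beta

def film(h, gamma, beta):
    """Feature-wise Linear Modulation: h' = gamma * h + beta,
    broadcast over nodes (rows), applied per feature channel."""
    return gamma * h + beta

rng = np.random.default_rng(0)
n_nodes, d = 4, 8
h = rng.normal(size=(n_nodes, d))      # node features from an early MP layer
material = np.array([30.0, 0.5])       # assumed [friction angle (deg), cohesion (kPa)]

W_g = 0.01 * rng.normal(size=(2, d)); b_g = np.ones(d)   # init gamma near 1
W_b = 0.01 * rng.normal(size=(2, d)); b_b = np.zeros(d)  # init beta near 0

gamma, beta = film_params(material, W_g, b_g, W_b, b_b)
h_mod = film(h, gamma, beta)
print(h_mod.shape)  # (4, 8)
```

Initializing gamma near one and beta near zero makes the modulated network start close to the pretrained, unconditioned model, so fine-tuning on a few trajectories only has to learn the material-dependent deviation.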
Problem

Research questions and friction points this paper is trying to address.

GNS models fail to generalize across different material constitutive behaviors
Existing models require retraining for each new material type
Where material-property sensitivity resides within the network had not been characterized
Innovation

Methods, ideas, or system contributions that make the work stand out.

Parameter-efficient conditioning mechanism for material generalization
Targeted Feature-wise Linear Modulation on early message-passing layers
Achieves accurate simulations with minimal training data requirements
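The parameter-efficiency claim rests on updating only the first few of the 10 MP layers while freezing the rest. A small sketch of that bookkeeping, with equal illustrative layer sizes (an assumption; the paper's layer widths are not given here):

```python
# Parameter-efficient fine-tuning: train only the first k of n
# message-passing layers and keep the remainder frozen.
def trainable_fraction(layer_param_counts, k):
    """Fraction of parameters updated when only the first k layers train."""
    total = sum(layer_param_counts)
    return sum(layer_param_counts[:k]) / total

mp_layers = [128 * 128] * 10  # 10 MP layers of equal size (illustrative)
for k in (1, 3, 5, 10):
    print(f"first {k} layers trainable -> {trainable_fraction(mp_layers, k):.0%}")
```

With equal layer sizes, fine-tuning the first 1-5 layers updates only 10-50% of the MP parameters, which is what makes adaptation from a dozen short trajectories feasible.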
Naveen Raj Manoharan
Department of Civil, Architecture and Environmental Engineering, The University of Texas at Austin, USA
Hassan Iqbal
The Oden Institute for Computational Engineering and Sciences, The University of Texas at Austin, USA
Krishna Kumar
Indian Institute of Technology Kharagpur