🤖 AI Summary
Inverse design of nonlinear spinodoid metamaterials under data scarcity remains challenging due to the high computational cost of simulating microstructure–property relationships.
Method: This paper proposes a highly data-efficient differentiable surrogate modeling framework. Leveraging only 75 training samples, we construct a neural network surrogate that explicitly encodes structural parameter permutation equivariance as an inductive bias, enabling accurate mapping from microstructural features to effective elasticity tensors. The framework integrates group-equivariant architectural constraints, differentiable optimization, and multi-objective gradient-based inversion.
Contribution/Results: Across three design tasks of varying complexity, our method achieves inverse design accuracy comparable to that of data-hungry approaches requiring thousands of simulations. It is the first to incorporate equivariance priors into spinodoid microstructure modeling, drastically reducing reliance on large-scale finite-element datasets. This establishes a scalable, efficient, and physics-informed paradigm for nonlinear metamaterial inverse design.
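The permutation-equivariance prior mentioned above can be illustrated with a Deep Sets-style layer, where each set element's output depends on its own features plus a symmetric pooling over the whole set. This is only a minimal numpy sketch with made-up weights, not the paper's actual architecture; it verifies the defining property that permuting the input rows permutes the output rows identically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy permutation-equivariant layer (Deep Sets style):
#   y_i = x_i @ W_self + (mean_j x_j) @ W_pool + b
# The weights are arbitrary placeholders, not learned parameters.
W_self = rng.normal(size=(2, 2))
W_pool = rng.normal(size=(2, 2))
b = rng.normal(size=(2,))

def equivariant_layer(X):
    """X: (n, d) set of structure-parameter features (e.g. embeddings of
    the spinodoid angles). Output rows permute consistently with input rows,
    because the pooling term is invariant to row order."""
    pooled = X.mean(axis=0, keepdims=True)
    return X @ W_self + pooled @ W_pool + b

# Equivariance check: permuting inputs then applying the layer gives the
# same result as applying the layer and then permuting the outputs.
X = rng.normal(size=(3, 2))
perm = np.array([2, 0, 1])
out_then_perm = equivariant_layer(X)[perm]
perm_then_out = equivariant_layer(X[perm])
print(np.allclose(out_then_perm, perm_then_out))  # True
```

Because the symmetry holds by construction, the network never has to spend training data learning it, which is one source of the data efficiency claimed above.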
📝 Abstract
We create a data-efficient and accurate surrogate model for structure-property linkages of spinodoid metamaterials with only 75 data points -- far fewer than the several thousand used in prior works -- and demonstrate its use in multi-objective inverse design. The inverse problem of finding a material microstructure that leads to given bulk properties is of great interest in mechanics and materials science. Such inverse design tasks often require a large dataset, which can become unaffordable when the material behavior of interest calls for more expensive simulations or experiments. We generate a data-efficient surrogate for the mapping between the characteristics of the local material structure and the effective elasticity tensor and use it to inversely design structures with multiple objectives simultaneously. The presented neural-network-based surrogate model achieves its data efficiency by inherently satisfying certain requirements, such as equivariance with respect to permutations of structure parameters, thereby avoiding having to learn them from data. The resulting surrogate of the forward model is differentiable, allowing its direct use in gradient-based optimization for the inverse design problem. We demonstrate on three inverse design tasks of varying complexity that this approach yields reliable results while requiring significantly less training data than previous approaches based on neural-network surrogates. This paves the way for inverse design involving nonlinear mechanical behavior, where data efficiency is currently the limiting factor.
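The gradient-based inverse design described in the abstract can be sketched as follows. A toy analytic map stands in for the trained differentiable surrogate; the matrix `A`, the property targets, and the weighted-sum scalarization of the objectives are illustrative assumptions, not the paper's actual setup.

```python
import numpy as np

# Toy differentiable "surrogate": maps 3 structure parameters p to 2
# effective properties. Placeholder for the trained neural network.
A = np.array([[1.0, 0.5, 0.0],
              [0.2, 1.0, 0.3]])

def surrogate(p):
    return np.tanh(A @ p)

def surrogate_jac(p):
    # Analytic Jacobian of tanh(A p): diag(1 - tanh(A p)^2) @ A
    s = np.tanh(A @ p)
    return (1.0 - s**2)[:, None] * A

# Multi-objective inverse design: drive both properties to their targets
# by gradient descent on a weighted sum of squared residuals.
targets = np.array([0.3, -0.2])
weights = np.array([1.0, 1.0])

p = np.zeros(3)
lr = 0.5
for _ in range(500):
    r = surrogate(p) - targets              # per-objective residuals
    grad = surrogate_jac(p).T @ (weights * r)
    p -= lr * grad

print(np.round(surrogate(p), 3))  # ≈ [0.3, -0.2] (the targets)
```

With an automatic-differentiation framework, the hand-coded Jacobian would come for free, which is exactly why differentiability of the surrogate makes the inverse problem tractable by plain gradient descent.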