A Deep Surrogate Model for Robust and Generalizable Long-Term Blast Wave Prediction

📅 2026-02-20
🤖 AI Summary
This work proposes RGD-Blast, a deep surrogate model for blast wave propagation that addresses the strong nonlinearity, steep gradients, and high computational cost of the problem, as well as the degradation of long-term prediction accuracy in existing machine learning surrogates, particularly under out-of-distribution conditions such as complex urban environments. By integrating multi-scale feature extraction with a dynamic-static feature coupling mechanism, RGD-Blast mitigates the error propagation inherent in autoregressive forecasting and generalizes across unseen building layouts and varying explosion parameters. The model runs roughly two orders of magnitude faster than conventional numerical simulations at comparable accuracy, maintaining an average RMSE below 0.01 and an R² above 0.89 over 280 time steps.
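The reported scores can be read against the standard definitions of RMSE and the coefficient of determination R². A minimal sketch with synthetic values (not the paper's data) shows how both are computed:

```python
import numpy as np

def rmse(pred, true):
    """Root mean squared error between predicted and reference fields."""
    return float(np.sqrt(np.mean((pred - true) ** 2)))

def r2(pred, true):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ss_res = np.sum((true - pred) ** 2)
    ss_tot = np.sum((true - true.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)

# Synthetic example: predictions offset from ground truth by a constant 0.005.
true = np.linspace(0.0, 1.0, 100)
pred = true + 0.005

print(rmse(pred, true))  # ≈ 0.005, i.e. below the paper's 0.01 threshold
print(r2(pred, true))    # close to 1, well above 0.89
```

In the paper's setting these metrics would be averaged over the 280 predicted time steps of the pressure field rather than a single 1-D vector.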

📝 Abstract
Accurately modeling the spatio-temporal dynamics of blast wave propagation remains a longstanding challenge due to its highly nonlinear behavior, sharp gradients, and burdensome computational cost. While machine learning-based surrogate models offer fast inference as a promising alternative, they suffer from degraded accuracy, particularly when evaluated on complex urban layouts or out-of-distribution scenarios. Moreover, autoregressive prediction strategies in such models are prone to error accumulation over long forecasting horizons, limiting their robustness for extended-time simulations. To address these limitations, we propose RGD-Blast, a robust and generalizable deep surrogate model for high-fidelity, long-term blast wave forecasting. RGD-Blast incorporates a multi-scale module to capture both global flow patterns and local boundary interactions, effectively mitigating error accumulation during autoregressive prediction. We introduce a dynamic-static feature coupling mechanism that fuses time-varying pressure fields with static source and layout features, thereby enhancing out-of-distribution generalization. Experiments demonstrate that RGD-Blast achieves a two-order-of-magnitude speedup over traditional numerical methods while maintaining comparable accuracy. In generalization tests on unseen building layouts, the model achieves an average RMSE below 0.01 and an R² exceeding 0.89 over 280 consecutive time steps. Additional evaluations under varying blast source locations and explosive charge weights further validate its generalization, substantially advancing the state of the art in long-term blast wave modeling.
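The abstract describes two ideas worth making concrete: an autoregressive rollout (each predicted pressure field feeds the next step, which is where long-horizon error accumulates) and dynamic-static coupling (the time-varying field is fused with fixed layout/source features at every step). The paper does not specify the architecture here, so the sketch below uses hypothetical grid sizes and random linear maps standing in for the learned multi-scale encoders; it illustrates the data flow, not the actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not from the paper).
H, W = 16, 16      # pressure-field grid
D_STATIC = 8       # embedding size for static layout + blast-source features

# Random linear maps standing in for learned dynamic/static encoders.
W_dyn = rng.normal(0.0, 0.01, size=(H * W, H * W))
W_sta = rng.normal(0.0, 0.01, size=(D_STATIC, H * W))

def step(pressure, static_emb):
    """One autoregressive step: couple the time-varying pressure field
    (dynamic branch) with fixed layout/source features (static branch)."""
    dyn = pressure.ravel() @ W_dyn           # dynamic branch
    sta = static_emb @ W_sta                 # static branch, same at every step
    return np.tanh(dyn + sta).reshape(H, W)  # coupled next-step field

# Initial pressure field and static features (building layout, charge weight).
p = rng.normal(size=(H, W))
static_emb = rng.normal(size=D_STATIC)

# Autoregressive rollout over the 280-step horizon used in the paper's
# evaluation: each output is fed back as the next input, so any per-step
# error compounds -- the failure mode RGD-Blast is designed to mitigate.
trajectory = [p]
for _ in range(280):
    p = step(p, static_emb)
    trajectory.append(p)

print(len(trajectory), trajectory[-1].shape)  # 281 (16, 16)
```

Re-injecting the static embedding at every step (rather than only at t=0) is what keeps layout and source information from washing out over long rollouts, which is one plausible reading of why the coupling mechanism helps out-of-distribution generalization.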
Problem

Research questions and friction points this paper is trying to address.

blast wave prediction
long-term forecasting
out-of-distribution generalization
error accumulation
spatio-temporal dynamics
Innovation

Methods, ideas, or system contributions that make the work stand out.

deep surrogate model
multi-scale module
dynamic-static feature coupling
out-of-distribution generalization
long-term blast wave prediction
Authors
Danning Jing
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
Xinhai Chen
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
Xifeng Pu
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
Jie Hu
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
Chao Huang
CAEP Software Center for High Performance Numerical Simulation, Beijing 100088, China; Institute of Applied Physics and Computational Mathematics, Beijing 100088, China
Xuguang Chen
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
Qinglin Wang
National University of Defense Technology
Parallel algorithms; High Performance Computing; Deep Learning; Machine Learning; GPU
Jie Liu
National Key Laboratory of Parallel and Distributed Computing, National University of Defense Technology, Changsha 410073, China; Laboratory of Digitizing Software for Frontier Equipment, National University of Defense Technology, Changsha 410073, China; College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China