🤖 AI Summary
This work addresses the challenge of data assimilation in compressible flows with shocks, where traditional ensemble Kalman filters (EnKF) suffer from non-Gaussian, bimodal forecast distributions caused by shock-position uncertainty, leading to unphysical oscillations. To overcome this limitation, the authors embed a deep neural network within the EnKF framework and, for the first time, perform the Kalman update directly in the neural network's parameter space. Physics-informed transfer learning ensures that the network parameters evolve smoothly across assimilation cycles. This approach effectively mitigates the filter divergence caused by non-Gaussianity near shocks, significantly improving accuracy and robustness in canonical test cases, including the inviscid Burgers' equation, Sod's shock tube, and a two-dimensional blast wave, while avoiding the numerical artifacts commonly observed with standard EnKF implementations.
📝 Abstract
Data assimilation (DA) for compressible flows with shocks is challenging because many classical DA methods generate spurious oscillations and nonphysical features near uncertain shocks. We focus here on the ensemble Kalman filter (EnKF). We show that the poor performance of the standard EnKF may be attributed to the bimodal forecast distribution that can arise in the vicinity of an uncertain shock location; this violates the assumptions underpinning the EnKF, which presume a near-Gaussian forecast. To address this issue, we introduce the new neural EnKF. The basic idea is to systematically embed neural function approximations within ensemble DA by mapping the forecast ensemble of shocked flows to the parameter space (weights and biases) of a deep neural network (NN) and subsequently performing DA in that space. The nonlinear mapping encodes both sharp and smooth flow features in an ensemble of NN parameters. Neural EnKF updates are well behaved only if the NN parameters vary smoothly within the neural representation of the forecast ensemble. We show that such smooth variation of the network parameters can be enforced via physics-informed transfer learning, and demonstrate that, in so doing, the neural EnKF avoids the spurious oscillations and nonphysical features that plague the standard EnKF. The applicability of the neural EnKF is demonstrated through a series of systematic numerical experiments with the inviscid Burgers' equation, Sod's shock tube, and a two-dimensional blast wave.
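The parameter-space Kalman update at the heart of the abstract can be illustrated with a minimal sketch. Instead of a deep NN fitted to each forecast member, a two-parameter tanh shock profile (position and width) stands in for the neural representation; the ensemble size, sensor locations, and noise level are illustrative assumptions, not values from the paper. The stochastic (perturbed-observation) EnKF analysis is then applied to the ensemble of smooth parameters rather than to the discontinuous flow field:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the neural representation: a smooth two-parameter
# shock profile u(x; theta) = 0.5 * (1 - tanh((x - theta0) / theta1)),
# where theta0 is the shock position and theta1 the shock width.
# In the paper's method, theta would be the weights and biases of a
# deep NN fitted to each forecast member; the update below is the same.
x_obs = np.array([0.3, 0.5, 0.7])   # hypothetical sensor locations

def H(theta):
    """Observation operator: evaluate the parameterized profile at sensors."""
    return 0.5 * (1.0 - np.tanh((x_obs - theta[0]) / theta[1]))

def enkf_update(theta, y_obs, R, rng):
    """Perturbed-observation EnKF analysis, performed in parameter space."""
    N = theta.shape[0]
    Y = np.array([H(t) for t in theta])       # (N, m) predicted observations
    A = theta - theta.mean(axis=0)            # parameter anomalies (N, p)
    Yp = Y - Y.mean(axis=0)                   # observation anomalies (N, m)
    C_ty = A.T @ Yp / (N - 1)                 # cross-covariance (p, m)
    C_yy = Yp.T @ Yp / (N - 1) + R            # innovation covariance (m, m)
    K = C_ty @ np.linalg.inv(C_yy)            # Kalman gain (p, m)
    pert = rng.multivariate_normal(np.zeros(len(y_obs)), R, size=N)
    return theta + (y_obs + pert - Y) @ K.T   # updated parameter ensemble

# Forecast ensemble: uncertain shock position, mildly uncertain width.
N = 50
theta_f = np.column_stack([
    rng.normal(0.40, 0.08, N),                    # shock positions
    0.05 + rng.normal(0.0, 0.005, N),             # shock widths
])

# Synthetic observation from a "true" shock at x = 0.55.
theta_true = np.array([0.55, 0.05])
R = 1e-4 * np.eye(len(x_obs))
y_obs = H(theta_true) + rng.multivariate_normal(np.zeros(len(x_obs)), R)

theta_a = enkf_update(theta_f, y_obs, R, rng)
print("prior mean shock position:    ", theta_f[:, 0].mean())
print("posterior mean shock position:", theta_a[:, 0].mean())
```

Because every ensemble member remains an exact tanh profile after the update, the analysis shifts the shock position without creating the intermediate, oscillatory states that a pointwise EnKF update produces when averaging members with shocks at different locations.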