🤖 AI Summary
Uncertainty propagation in neural network surrogate models of nonlinear dynamical systems remains challenging, particularly due to the computational burden and approximation errors inherent in Monte Carlo methods.
Method: This paper proposes a novel state estimation algorithm integrating analytical moment computation with assumed density filtering (ADF) and Rauch–Tung–Striebel (RTS) smoothing. It leverages newly derived closed-form expressions for the mean and covariance of deep neural network outputs under Gaussian input distributions, eliminating reliance on Monte Carlo sampling. Cross-entropy is adopted as a distribution-aware metric for evaluating filtering and smoothing performance, offering greater sensitivity to probabilistic fidelity than conventional RMSE.
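To give a flavor of what closed-form moment propagation means, here is a minimal sketch for the one case where it is exact and elementary: an affine layer y = Wx + b with Gaussian input x ~ N(μ, Σ). The paper's contribution extends such formulas through nonlinear activations of a deep network; the weights and dimensions below are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical layer (not from the paper): a single affine map y = W x + b.
W = rng.standard_normal((3, 2))
b = rng.standard_normal(3)

# Gaussian input distribution x ~ N(mu, Sigma).
mu = np.array([0.5, -1.0])
Sigma = np.array([[0.2, 0.05],
                  [0.05, 0.3]])

# Closed-form output moments for an affine layer:
#   E[y]   = W mu + b
#   Cov[y] = W Sigma W^T
mean_y = W @ mu + b
cov_y = W @ Sigma @ W.T

# Monte Carlo check -- exactly the sampling step the analytic
# formulas are designed to eliminate.
xs = rng.multivariate_normal(mu, Sigma, size=200_000)
ys = xs @ W.T + b
assert np.allclose(ys.mean(axis=0), mean_y, atol=0.01)
assert np.allclose(np.cov(ys, rowvar=False), cov_y, atol=0.01)
```

Inside an ADF step, moments like these are propagated layer by layer and the result is re-projected onto a Gaussian, which is what lets the filter run without any sampling.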
Contribution/Results: The method achieves significantly improved state estimation accuracy on stochastic Lorenz and Wiener systems. Moreover, enhanced uncertainty quantification directly translates into superior performance of downstream linear-quadratic regulator (LQR) control, demonstrating the practical benefits of analytically propagated uncertainties in closed-loop applications.
📝 Abstract
The Kalman filter and Rauch-Tung-Striebel (RTS) smoother are optimal for state estimation in linear dynamic systems. For nonlinear systems, the challenge lies in propagating uncertainty through the state transitions and the output function. For the case of a neural network model, we enable accurate uncertainty propagation using a recent state-of-the-art analytic formula for computing the mean and covariance of a deep neural network with Gaussian input. We argue that cross entropy is a more appropriate performance metric than RMSE for evaluating the accuracy of filters and smoothers. We demonstrate the superiority of our method for state estimation on a stochastic Lorenz system and a Wiener system, and find that our method improves linear quadratic regulation when the state estimate is used for feedback.