AI Summary
In variational data assimilation, inverting the Hessian matrix is computationally prohibitive, while the conjugate gradient (CG) method suffers from slow convergence and poor robustness in ill-conditioned scenarios. To address these challenges, this paper proposes the first meta-learning framework that uses Fourier Neural Operators (FNOs) to approximate the inverse Hessian operator, enabling fast, generalizable preconditioner learning across a family of problems. The framework jointly learns high-quality initial guesses and efficient preconditioners for CG, substantially improving the efficiency of the optimization's initialization. Experiments on the linear advection equation show that, compared to standard CG, the proposed method reduces the average relative error by 62% and the iteration count by 17%. Notably, it exhibits superior convergence stability and acceleration in ill-conditioned settings.
Abstract
Data assimilation (DA) is crucial for enhancing solutions to partial differential equations (PDEs), such as those in numerical weather prediction, by optimizing initial conditions using observational data. Variational DA methods are widely used in oceanic and atmospheric forecasting, but they become computationally expensive, especially when Hessian information is involved. To address this challenge, we propose a meta-learning framework that employs the Fourier Neural Operator (FNO) to approximate the inverse Hessian operator across a family of DA problems, thereby providing an effective initialization for the conjugate gradient (CG) method. Numerical experiments on a linear advection equation demonstrate that the resulting FNO-CG approach reduces the average relative error by 62% and the number of iterations by 17% compared to standard CG. These improvements are most pronounced in ill-conditioned scenarios, highlighting the robustness and efficiency of FNO-CG for challenging DA problems.
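The core idea above can be illustrated with a minimal sketch: solve the symmetric positive-definite system Hx = b with CG, once from a cold start and once from a warm start supplied by an approximate inverse-Hessian map. The paper's trained FNO is not available here, so a hypothetical `approx_inverse_hessian` stand-in (the exact inverse plus noise) plays its role; all names and parameters below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cg(H, b, x0, tol=1e-8, max_iter=500):
    """Plain conjugate gradient for the SPD system H x = b, started from x0.

    Returns the approximate solution and the number of iterations used.
    """
    x = x0.copy()
    r = b - H @ x          # initial residual
    p = r.copy()
    rs = r @ r
    k = 0
    for k in range(max_iter):
        if np.sqrt(rs) < tol:
            break
        Hp = H @ p
        alpha = rs / (p @ Hp)
        x = x + alpha * p
        r = r - alpha * Hp
        rs_new = r @ r
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, k

# Synthetic SPD "Hessian" of a quadratic variational cost (illustrative only).
rng = np.random.default_rng(0)
n = 50
A = rng.standard_normal((n, n))
H = A @ A.T / n + 0.1 * np.eye(n)
b = rng.standard_normal(n)

def approx_inverse_hessian(rhs):
    # Stand-in for the learned FNO: a noisy application of H^{-1}.
    return np.linalg.solve(H, rhs) + 1e-3 * rng.standard_normal(rhs.shape)

x_cold, k_cold = cg(H, b, np.zeros(n))            # standard CG
x_warm, k_warm = cg(H, b, approx_inverse_hessian(b))  # FNO-style warm start

print(k_cold, k_warm)
```

Because the warm start already lies near the minimizer, CG only needs to correct a small residual, which is the mechanism behind the iteration-count reduction reported in the abstract; the learned operator can equally be used as a preconditioner inside each CG iteration.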