🤖 AI Summary
Existing neural operators face three key bottlenecks in PDE solving: poor scalability on dense grids, error accumulation over long-horizon temporal rollouts, and weak generalization caused by task coupling. This paper proposes ECHO, a generative Transformer-based operator framework that combines a hierarchical convolutional encoder-decoder architecture with an adaptive training paradigm, mapping sparse inputs to high-resolution outputs while achieving ~100× spatiotemporal compression. By decoupling representation learning from task-specific fine-tuning, ECHO solves diverse PDE systems within a single unified model. It further incorporates both conditional and unconditional generative modeling to substantially suppress error drift. Evaluated on PDEs with complex geometries, high-frequency dynamics, and long time horizons, ECHO delivers high-fidelity simulation on million-node grids and sets new state-of-the-art results in computational efficiency and cross-task generalization.
📝 Abstract
We introduce ECHO, a transformer-operator framework for generating million-point PDE trajectories. While existing neural operators (NOs) have shown promise for solving partial differential equations, they remain limited in practice by poor scalability on dense grids, error accumulation during dynamic unrolling, and task-specific design. ECHO addresses these challenges through three key innovations. (i) It employs a hierarchical convolutional encoder-decoder architecture that achieves a $100\times$ spatio-temporal compression while preserving fidelity at mesh points. (ii) It incorporates a training and adaptation strategy that enables high-resolution PDE solution generation from sparse input grids. (iii) It adopts a generative modeling paradigm that learns complete trajectory segments, mitigating long-horizon error drift. The training strategy decouples representation learning from downstream task supervision, allowing the model to tackle multiple tasks such as trajectory generation, forward and inverse problems, and interpolation. The generative model further supports both conditional and unconditional generation. We demonstrate state-of-the-art performance on million-point simulations across diverse PDE systems featuring complex geometries, high-frequency dynamics, and long time horizons.
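To make the claimed ~100× spatio-temporal compression concrete, here is a minimal NumPy sketch (my own illustration, not the paper's actual architecture or code): average-pooling a dense `(T, H, W)` trajectory by a factor of 4 in time and 5 in each spatial dimension reduces the point count by exactly 4·5·5 = 100×. The pooling factors and grid sizes below are assumptions chosen only to make the arithmetic visible; ECHO's learned convolutional encoder would replace the fixed averaging.

```python
import numpy as np

def pool(traj: np.ndarray, ft: int, fh: int, fw: int) -> np.ndarray:
    """Average-pool a (T, H, W) trajectory by factors (ft, fh, fw).

    Stand-in for a learned hierarchical convolutional encoder:
    each latent cell summarizes an ft x fh x fw block of the input.
    """
    T, H, W = traj.shape
    return traj.reshape(T // ft, ft, H // fh, fh, W // fw, fw).mean(axis=(1, 3, 5))

# Toy dense spatiotemporal field: 40 frames on a 250 x 250 grid.
traj = np.random.rand(40, 250, 250)
latent = pool(traj, ft=4, fh=5, fw=5)   # coarse latent representation
ratio = traj.size / latent.size
print(latent.shape, ratio)              # (10, 50, 50) 100.0
```

A decoder would then upsample this latent back to the full grid; the point of the sketch is only that modest per-axis factors compound into the two-orders-of-magnitude compression the paper reports.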