Learning Gradient Flow: Using Equation Discovery to Accelerate Engineering Optimization

📅 2026-02-13
🤖 AI Summary
This work addresses the high computational cost of evaluating objective functions and their gradients, as well as slow convergence, in engineering optimization. It proposes the Learned Gradient Flow (LGF) optimizer, which employs a data-driven equation discovery approach to infer continuous-time dynamical systems from optimization trajectories—systems that correspond to algorithms such as gradient descent, Newton’s method, and ADAM. LGF constructs surrogate gradient flow models that can replace the original problem, adaptively generating polynomial surrogates of varying orders in either full-dimensional or reduced-dimensional spaces. This significantly reduces reliance on repeated evaluations of the original objective function and its gradients. Demonstrated across diverse forward and inverse problems in structural topology optimization and scientific machine learning, the method accelerates convergence while preserving essential features of the optimization trajectory.

📝 Abstract
In this work, we investigate the use of data-driven equation discovery for dynamical systems to model and forecast continuous-time dynamics of unconstrained optimization problems. To avoid expensive evaluations of the objective function and its gradient, we leverage trajectory data on the optimization variables to learn the continuous-time dynamics associated with gradient descent, Newton's method, and ADAM optimization. The discovered gradient flows are then solved as a surrogate for the original optimization problem. To this end, we introduce the Learned Gradient Flow (LGF) optimizer, which is equipped to build surrogate models of variable polynomial order in full- or reduced-dimensional spaces at user-defined intervals in the optimization process. We demonstrate the efficacy of this approach on several standard problems from engineering mechanics and scientific machine learning, including two inverse problems, structural topology optimization, and two forward solves with different discretizations. Our results suggest that the learned gradient flows can significantly expedite convergence by capturing critical features of the optimization trajectory while avoiding expensive evaluations of the objective and its gradient.
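The core idea described in the abstract — learning continuous-time gradient-flow dynamics from optimization trajectory data and then integrating the learned flow as a surrogate — can be illustrated with a minimal sketch. This is not the authors' implementation: the quadratic objective, the linear (order-1 polynomial) feature library, and the finite-difference estimate of the dynamics are all illustrative assumptions.

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x follows the gradient flow
# dx/dt = -A x. Sketch: (1) collect a short optimization trajectory,
# (2) regress the dynamics onto a linear feature library, (3) integrate
# the learned flow as a surrogate, with no further gradient evaluations.

A = np.array([[3.0, 0.5], [0.5, 1.0]])   # SPD Hessian of the toy objective
lr = 0.05                                 # gradient-descent step size

# (1) Short gradient-descent trajectory (the "expensive" phase).
x = np.array([2.0, -1.5])
traj = [x.copy()]
for _ in range(20):
    x = x - lr * (A @ x)                  # one true gradient evaluation per step
    traj.append(x.copy())
traj = np.array(traj)

# (2) Equation discovery: estimate dx/dt by finite differences and fit
# a linear model dx/dt ~ x @ W by least squares.
dxdt = (traj[1:] - traj[:-1]) / lr
W, *_ = np.linalg.lstsq(traj[:-1], dxdt, rcond=None)

# (3) Surrogate phase: integrate the learned flow with forward Euler,
# never touching the true objective or its gradient.
y = traj[-1].copy()
for _ in range(500):
    y = y + lr * (y @ W)

print(np.linalg.norm(y))  # learned flow drives the iterate toward the minimum at 0
```

On this exactly-linear problem the regression recovers `W ≈ -A`, so the surrogate flow matches the true gradient flow; the paper's method generalizes this with variable polynomial order and reduced-dimensional representations for nonlinear objectives.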
Problem

Research questions and friction points this paper is trying to address.

optimization
gradient flow
equation discovery
surrogate modeling
computational efficiency
Innovation

Methods, ideas, or system contributions that make the work stand out.

equation discovery
gradient flow
surrogate modeling
optimization acceleration
data-driven dynamics
Grant Norman
Smead Aerospace Engineering Sciences, 3775 Discovery Drive, Boulder, CO 80309
Conor Rowan
Smead Aerospace Engineering Sciences, 3775 Discovery Drive, Boulder, CO 80309
Kurt Maute
Professor of Aerospace Engineering, University of Colorado Boulder
Computational Mechanics, Design Optimization
Alireza Doostan
University of Colorado, Boulder
Uncertainty Quantification, Computational Mechanics