Inspired by machine learning optimization: can gradient-based optimizers solve cycle skipping in full waveform inversion given sufficient iterations?

📅 2025-09-18
📈 Citations: 0
Influential citations: 0
🤖 AI Summary
FWI suffers from cycle-skipping when the initial velocity model is inaccurate and low-frequency data (<3 Hz) are absent, causing gradient-based optimizers to converge to local minima. To address this, we propose a large-learning-rate gradient optimization strategy that circumvents conventional line-search constraints, endowing local optimizers with quasi-global search capabilities. Through multiple rounds of sufficiently long iterations, the method achieves progressive convergence—from shallow to deep subsurface layers—thereby mitigating cycle-skipping. Numerical and field-data experiments demonstrate that the approach robustly reconstructs high-fidelity velocity models even in the absence of frequencies below 5 Hz, gradually approaching the global optimum. This significantly enhances the robustness and accuracy of FWI imaging for complex geological structures.
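In update-rule terms, the strategy described above amounts to a plain gradient step with the step size held fixed and deliberately large, rather than chosen by a per-iteration line search (the notation here is ours, not the paper's):

m_{k+1} = m_k − α ∇J(m_k),  with α fixed at a relatively large value,

instead of setting α_k ≈ argmin_α J(m_k − α ∇J(m_k)) at every iteration via a line search; here m is the velocity model and J the data misfit.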

📝 Abstract
Full waveform inversion (FWI) iteratively updates the velocity model by minimizing the difference between observed and simulated data. Due to the high computational cost and memory requirements associated with global optimization algorithms, FWI is typically implemented using local optimization methods. However, when the initial velocity model is inaccurate and low-frequency seismic data (e.g., below 3 Hz) are absent, the mismatch between simulated and observed data may exceed half a cycle, a phenomenon known as cycle skipping. In such cases, local optimization algorithms (e.g., gradient-based local optimizers) tend to converge to local minima, leading to inaccurate inversion results. In machine learning, neural network training is also an optimization problem prone to local minima. It often employs gradient-based optimizers with a relatively large learning rate (beyond the theoretical limits of local optimization that are usually determined numerically by a line search), which allows the optimization to behave like a quasi-global optimizer. Consequently, after training for several thousand iterations, we can obtain a neural network model with strong generative capability. In this study, we also employ gradient-based optimizers with a relatively large learning rate for FWI. Results from both synthetic and field-data experiments show that FWI may initially converge to a local minimum; however, with sufficient additional iterations, the inversion can gradually approach the global minimum, progressing slowly from the shallow subsurface to depth, ultimately yielding an accurate velocity model. Furthermore, numerical examples indicate that, given sufficient iterations, reasonable velocity inversion results can still be achieved even when low-frequency data below 5 Hz are missing.
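As a toy illustration of this strategy, the Python sketch below runs plain gradient descent on a 1-D FWI-style misfit whose oscillatory shape mimics cycle skipping. It is a minimal sketch, not the authors' code: the single-parameter forward model, the 6 Hz wavelet, the starting velocity, and both step sizes are assumptions chosen purely for illustration, and the exact outcome depends on those choices.

```python
# Minimal sketch (not the authors' code): fixed-step gradient descent on an
# FWI-style misfit, with the step size held constant instead of being chosen
# by a line search.  All numbers below are illustrative assumptions.
import numpy as np

t = np.linspace(0.0, 1.0, 1000)        # time axis (s)
f0 = 6.0                               # dominant frequency (Hz); no energy below ~5 Hz
depth = 0.5                            # reflector depth (km)
v_true = 2.0                           # true velocity (km/s)

def trace(v):
    """Simulated trace: a wavelet whose arrival time depends on velocity v."""
    arrival = 2.0 * depth / v          # two-way travel time (s)
    return np.sin(2.0 * np.pi * f0 * (t - arrival)) * np.exp(-30.0 * (t - arrival) ** 2)

d_obs = trace(v_true)                  # "observed" data from the true model

def misfit(v):
    """L2 data misfit; oscillatory in v, which mimics cycle skipping."""
    r = trace(v) - d_obs
    return 0.5 * np.mean(r ** 2)

def grad(v, eps=1e-4):
    """Finite-difference gradient of the misfit (stand-in for the adjoint state)."""
    return (misfit(v + eps) - misfit(v - eps)) / (2.0 * eps)

def invert(step, n_iter, v0=1.6):
    """Fixed-step gradient descent: no line search, just many iterations."""
    v = v0
    for _ in range(n_iter):
        v -= step * grad(v)
    return v

# Illustrative comparison in the spirit of the paper: a conservative step with
# few iterations tends to stall near the starting model, while a deliberately
# large step run for many iterations can keep moving across misfit basins.
# The specific step sizes and iteration counts are assumptions, not tuned values.
print("small step, few iterations :", invert(step=1e-3, n_iter=200))
print("large step, many iterations:", invert(step=5e-2, n_iter=5000))
print("true velocity              :", v_true)
```

The single scalar velocity stands in for the full velocity model only to keep the example runnable; in actual FWI the gradient would come from the adjoint-state method and the model would be a 2-D or 3-D grid.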
Problem

Research questions and friction points this paper is trying to address.

Investigating gradient optimizers overcoming cycle skipping in seismic inversion
Testing large learning rate strategies for global convergence in FWI
Achieving accurate velocity models without low-frequency data via sufficient iterations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Gradient-based optimizers with large learning rate
Sufficient iterations to overcome local minima
Works without low-frequency data below 5 Hz
Xinru Mu
Physical Science and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia
Omar M. Saad
Research Scientist, KAUST
Artificial Intelligence · Seismology · Geophysics · Earthquake
Shaowen Wang
Professor, University of Illinois Urbana-Champaign
CyberGIS · Geospatial Data Science · Spatial AI · Spatial Analysis · Sustainability
Tariq Alkhalifah
Physical Science and Engineering Division, King Abdullah University of Science and Technology, Thuwal 23955, Saudi Arabia