🤖 AI Summary
This work addresses the limitation of traditional worst-case analysis in high-dimensional optimization, which fails to characterize practical convergence behavior. Methodologically, it adopts a distributional perspective: modeling the objective function as a stochastic process and capturing its statistical structure via exchangeability and non-stationary isotropic covariance kernels. Integrating Bayesian optimization, stochastic functional analysis, and distributional modeling, the approach uncovers mechanisms underlying the predictability of optimization progress in high dimensions. Theoretically, it derives an optimal step-size policy for gradient descent under the proposed distributional assumptions. Furthermore, it establishes a unified analytical framework for landscape formation of stochastic objectives in machine learning. The core contribution lies in replacing worst-case reasoning with distributional reasoning, thereby establishing, for the first time, a systematic theoretical connection between predictability and adaptive step-size control in high-dimensional optimization.
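To make the kernel terminology concrete: under a common convention, an *isotropic* kernel depends on its inputs only through quantities invariant under rotations (the norms and the inner product), while a *stationary* kernel is additionally invariant under translations. The toy sketch below (not from the thesis; the polynomial kernel is our own illustrative choice) checks numerically that a simple rotation-invariant kernel is isotropic yet non-stationary.

```python
import numpy as np

# Toy illustration (not the thesis's construction): a kernel that depends
# on x and y only through ||x||, ||y|| and <x, y> is invariant under
# rotations (isotropic) but need not be invariant under translations
# (non-stationary).  The polynomial kernel below is one such example.
def poly_kernel(x, y):
    return (1.0 + x @ y) ** 2

rng = np.random.default_rng(0)
x, y = rng.standard_normal(3), rng.standard_normal(3)

# Random rotation: orthogonalize a Gaussian matrix via QR.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))

rot_gap = abs(poly_kernel(Q @ x, Q @ y) - poly_kernel(x, y))
shift_gap = abs(poly_kernel(x + 1.0, y + 1.0) - poly_kernel(x, y))

print(f"rotation invariance gap: {rot_gap:.2e}")   # ~0: isotropic
print(f"translation gap:         {shift_gap:.2e}")  # >0: non-stationary
```

Since the rotation preserves inner products exactly, the first gap is zero up to floating-point error, while shifting both inputs changes the kernel value: isotropy without stationarity.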
📝 Abstract
This PhD thesis presents a distributional view of optimization in place of the classical worst-case perspective. We motivate this view by investigating where classical worst-case optimization analysis breaks down. Subsequently, we consider the optimization of a randomly drawn objective function, which is the setting of Bayesian optimization. After a review of Bayesian optimization, we outline how such a distributional view may explain the predictable progress of optimization in high dimensions. It further turns out that this distributional view provides insights into optimal step-size control for gradient descent. To enable these results, we develop mathematical tools for handling random inputs to random functions, along with a characterization of non-stationary isotropic covariance kernels. Finally, we outline how assumptions about the data, specifically exchangeability, can lead to random objective functions in machine learning, and we analyze their landscape.
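The "randomly drawn objective" setting can be sketched numerically. The toy below (an assumption-laden stand-in, not the thesis's construction) draws an approximate sample path of a Gaussian process with a squared-exponential kernel via random Fourier features and runs plain gradient descent with a small fixed step size on it; the lengthscale, feature count, and step size are arbitrary illustrative choices, not the optimal policy the thesis derives.

```python
import numpy as np

rng = np.random.default_rng(42)
d, m, ell = 10, 500, 3.0  # dimension, number of features, lengthscale

# Random Fourier features approximating a sample path of a GP with a
# squared-exponential kernel:  f(x) = sqrt(2/m) * sum_i c_i cos(w_i.x + b_i)
W = rng.standard_normal((m, d)) / ell
b = rng.uniform(0.0, 2.0 * np.pi, size=m)
c = rng.standard_normal(m)

def f(x):
    return np.sqrt(2.0 / m) * c @ np.cos(W @ x + b)

def grad_f(x):
    # d/dx of cos(Wx + b) contributes -sin(Wx + b) times the rows of W.
    return -np.sqrt(2.0 / m) * (c * np.sin(W @ x + b)) @ W

# Plain gradient descent with a small fixed step size (a placeholder for
# the step-size policies analyzed in the thesis).
x = rng.standard_normal(d)
f_start = f(x)
for _ in range(300):
    x = x - 0.05 * grad_f(x)
f_end = f(x)

print(f"f at start: {f_start:.4f}, f after descent: {f_end:.4f}")
```

With a step size small relative to the sample path's curvature, the objective value decreases monotonically along the run, which is the kind of typical-case progress the distributional view aims to characterize.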