🤖 AI Summary
This work addresses the high query cost and strong parameter dependence inherent in black-box function optimization. We propose a novel zeroth-order optimization method grounded in the quantum parameter-shift rule (PSR), marking the first integration of PSR into classical black-box optimization frameworks. Like other zeroth-order methods, our approach requires no gradient information, but unlike them it constructs accurate directional estimates from only a small number of function queries, avoiding the need for high query frequency or large parameter counts. Evaluated across diverse non-convex black-box tasks, the method converges significantly faster, reducing average query counts by approximately 40% and demonstrating superior parameter efficiency compared to state-of-the-art zeroth-order optimizers such as NES and ZOO. The proposed framework establishes a scalable, resource-efficient paradigm for black-box optimization under stringent computational constraints.
📝 Abstract
Machine learning is widely applied across many domains, yet training a machine learning model is becoming increasingly difficult. A growing number of optimization problems are "black-box" problems, in which the relationship between model parameters and outcomes is uncertain or hard to trace. Optimizing black-box models that require a large number of query observations and parameters remains challenging. To overcome the drawbacks of existing algorithms, in this study we propose a zeroth-order method based on a technique originally developed in quantum computing, the parameter-shift rule, which uses fewer parameters than previous methods.
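The abstract does not include code, but the core idea of the parameter-shift rule can be sketched briefly. For objectives of the form $a\cos\theta + b\sin\theta + c$ per coordinate, the rule gives an exact derivative from two function queries per parameter; for general black-box objectives it serves as a two-query directional estimate. The sketch below is illustrative and assumes this standard form of the rule; the function name `parameter_shift_gradient` and the default shift of $\pi/2$ are our choices, not specifics from the paper.

```python
import numpy as np

def parameter_shift_gradient(f, theta, shift=np.pi / 2):
    """Estimate the gradient of f at theta via the parameter-shift rule.

    Exact when f is of the form a*cos(theta_i) + b*sin(theta_i) + c in each
    coordinate; otherwise a two-query-per-parameter directional estimate.
    """
    theta = np.asarray(theta, dtype=float)
    grad = np.empty_like(theta)
    for i in range(theta.size):
        plus = theta.copy()
        minus = theta.copy()
        plus.flat[i] += shift   # query f at the positively shifted point
        minus.flat[i] -= shift  # query f at the negatively shifted point
        # General parameter-shift formula: (f(θ+s) - f(θ-s)) / (2 sin s)
        grad.flat[i] = (f(plus) - f(minus)) / (2.0 * np.sin(shift))
    return grad
```

For example, with `f = lambda x: np.sin(x[0])` the estimate at `theta = [0.3]` recovers `cos(0.3)` exactly, since the objective is trigonometric in each parameter.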