🤖 AI Summary
To address the limited flexibility, absence of uncertainty quantification, and inability to reuse historical observations in the parameter-shift rule (PSR) for variational quantum eigensolvers (VQE), this paper proposes the Bayesian Parameter-Shift Rule (Bayesian PSR). The method models the gradient function via a Gaussian process, enabling gradient evaluation at arbitrary points and principled uncertainty quantification, marking the first integration of Bayesian inference into the PSR framework. The paper further introduces the Gradient Confidence Region (GradCoRe) mechanism, which dynamically selects observation points to minimize quantum circuit evaluations per optimization step. In special cases, Bayesian PSR reduces to the generalized PSR. Numerical experiments demonstrate significant acceleration of VQE convergence, with superior gradient estimation efficiency compared to state-of-the-art approaches such as sequential minimal optimization.
📝 Abstract
Parameter shift rules (PSRs) are key techniques for efficient gradient estimation in variational quantum eigensolvers (VQEs). In this paper, we propose their Bayesian variant, in which Gaussian processes with appropriate kernels are used to estimate the gradient of the VQE objective. Our Bayesian PSR offers flexible gradient estimation from observations at arbitrary locations, provides uncertainty information, and reduces to the generalized PSR in special cases. In stochastic gradient descent (SGD), the flexibility of Bayesian PSR allows the reuse of observations from previous steps, which accelerates the optimization process. Furthermore, access to the posterior uncertainty, along with our proposed notion of the gradient confidence region (GradCoRe), enables us to minimize the observation costs in each SGD step. Our numerical experiments show that VQE optimization with Bayesian PSR and GradCoRe significantly accelerates SGD and outperforms state-of-the-art methods, including sequential minimal optimization.
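To make the core idea concrete, the sketch below illustrates GP-based gradient estimation on a toy one-parameter cost of the sinusoidal form assumed by the standard PSR, f(θ) = A sin(θ + φ) + C. Since this cost lies in span{1, sin θ, cos θ}, a Gaussian process with the matching (degenerate) kernel is equivalent to Bayesian linear regression on those features, which lets us compute a posterior gradient mean and standard deviation at an arbitrary query point. This is a simplified illustration under that assumption, not the paper's full Bayesian PSR or GradCoRe procedure; all names and values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy one-parameter VQE cost: f(theta) = A*sin(theta + phi) + C,
# the functional form the standard parameter-shift rule assumes.
A, phi, C = 0.8, 0.3, 0.1

def cost(theta, noise_std=0.02):
    # Noisy cost evaluation, mimicking shot noise on a quantum device.
    return A * np.sin(theta + phi) + C + rng.normal(0.0, noise_std)

def features(theta):
    # Basis in which the sinusoidal cost is linear.
    return np.array([1.0, np.sin(theta), np.cos(theta)])

def grad_features(theta):
    # Derivative of the features above with respect to theta.
    return np.array([0.0, np.cos(theta), -np.sin(theta)])

# Observations at arbitrary angles -- not restricted to the +/- pi/2 shifts
# of the classic parameter-shift rule.
thetas = np.array([0.0, 0.7, 1.9, 3.5, 5.0])
y = np.array([cost(t) for t in thetas])

Phi = np.vstack([features(t) for t in thetas])
sigma2 = 0.02 ** 2   # observation-noise variance
tau2 = 10.0 ** 2     # weak Gaussian prior variance on the weights

# Posterior over the weights (standard Bayesian linear regression).
S = np.linalg.inv(Phi.T @ Phi / sigma2 + np.eye(3) / tau2)
m = S @ Phi.T @ y / sigma2

# Gradient estimate with uncertainty at an arbitrary query point.
theta_q = 1.2
g = grad_features(theta_q)
grad_mean = float(g @ m)
grad_std = float(np.sqrt(g @ S @ g))
true_grad = A * np.cos(theta_q + phi)
print(f"estimated gradient: {grad_mean:.4f} +/- {grad_std:.4f}")
print(f"true gradient:      {true_grad:.4f}")
```

Because the posterior is available in closed form, the gradient can be queried anywhere, and its standard deviation is the kind of uncertainty information that a confidence-region criterion such as GradCoRe could use to decide whether more circuit evaluations are needed.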