Bayesian Parameter Shift Rule in Variational Quantum Eigensolvers

📅 2025-02-04
📈 Citations: 0
Influential: 0
🤖 AI Summary
To address the limited flexibility, absence of uncertainty quantification, and inability to reuse historical observations in the parameter-shift rule (PSR) for variational quantum eigensolvers (VQEs), this paper proposes the Bayesian parameter-shift rule (Bayesian PSR). The method models the VQE objective and its gradient with a Gaussian process, enabling gradient estimation at arbitrary points with principled uncertainty quantification; the authors present this as the first integration of Bayesian inference into the PSR framework. They further introduce the gradient confident region (GradCoRe) criterion, which selects observation points so as to minimize the number of quantum circuit evaluations per optimization step. In special cases, Bayesian PSR reduces to the generalized PSR. Numerical experiments show that Bayesian PSR with GradCoRe significantly accelerates VQE convergence and outperforms state-of-the-art approaches, including sequential minimal optimization.
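
For orientation, the closed-form machinery behind such a Gaussian-process gradient model is standard: given a GP prior on the objective f with kernel k and noisy energy observations y at parameter values X, the derivative f'(θ) has a Gaussian posterior. The identities below are textbook GP results in generic notation (K = k(X, X), σ² the observation noise), not necessarily the paper's exact formulation:

```latex
% Posterior mean and variance of the gradient f'(\theta),
% given observations y = f(X) + \epsilon, \epsilon \sim \mathcal{N}(0, \sigma^2 I):
\mu_{\nabla}(\theta)
  = \partial_{\theta} k(\theta, X)\,\bigl(K + \sigma^{2} I\bigr)^{-1} y,
\qquad
\sigma_{\nabla}^{2}(\theta)
  = \partial_{\theta}\partial_{\theta'} k(\theta, \theta')\big|_{\theta'=\theta}
  - \partial_{\theta} k(\theta, X)\,\bigl(K + \sigma^{2} I\bigr)^{-1}\,
    \partial_{\theta'} k(X, \theta).
```

Because the posterior conditions on all observations at once, any previously measured point tightens both the mean and the variance, which is what makes reuse across SGD steps possible.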

📝 Abstract
Parameter shift rules (PSRs) are key techniques for efficient gradient estimation in variational quantum eigensolvers (VQEs). In this paper, we propose a Bayesian variant, in which Gaussian processes with appropriate kernels are used to estimate the gradient of the VQE objective. Our Bayesian PSR offers flexible gradient estimation from observations at arbitrary locations, provides uncertainty information, and reduces to the generalized PSR in special cases. In stochastic gradient descent (SGD), the flexibility of Bayesian PSR allows observations from previous steps to be reused, which accelerates the optimization process. Furthermore, access to the posterior uncertainty, together with our proposed notion of a gradient confident region (GradCoRe), enables us to minimize the observation costs in each SGD step. Our numerical experiments show that VQE optimization with Bayesian PSR and GradCoRe significantly accelerates SGD and outperforms state-of-the-art methods, including sequential minimal optimization.
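
To make the abstract's "appropriate kernels" concrete, below is a minimal numerical sketch for a single VQE parameter. It relies on the standard fact that the cost along one rotation angle is a first-order trigonometric polynomial, f(t) = a cos t + b sin t + c, and pairs it with a kernel spanning exactly those functions; the kernel, noise level, and names (trig_kernel, gp_gradient_posterior) are illustrative choices, not the paper's implementation.

```python
import numpy as np

def trig_kernel(x1, x2):
    # Kernel matching the known structure of a single-parameter VQE cost,
    # f(t) = a*cos(t) + b*sin(t) + c (illustrative, not the paper's exact kernel).
    return 1.0 + np.cos(x1[:, None] - x2[None, :])

def dtrig_kernel(x1, x2):
    # Derivative of the kernel above with respect to its first argument.
    return -np.sin(x1[:, None] - x2[None, :])

def gp_gradient_posterior(theta, X, y, noise=1e-2):
    """Posterior mean/std of f'(theta) given noisy cost observations
    y = f(X) + eps, via the standard GP derivative identities."""
    K = trig_kernel(X, X) + noise**2 * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    k_d = dtrig_kernel(np.atleast_1d(theta), X)   # cov(f'(theta), f(X))
    mean = k_d @ alpha
    prior_var = 1.0                               # d^2 k / dx dx' at x = x'
    var = prior_var - k_d @ np.linalg.solve(K, k_d.T)
    return mean.item(), np.sqrt(max(var.item(), 0.0))

# Toy check against the exact derivative of f(t) = cos(t) + 0.5*sin(t):
rng = np.random.default_rng(0)
X = rng.uniform(-np.pi, np.pi, size=8)            # arbitrary observation points
f = lambda t: np.cos(t) + 0.5 * np.sin(t)
y = f(X) + 0.01 * rng.standard_normal(8)          # shot-noise stand-in
mu, sd = gp_gradient_posterior(0.3, X, y)
print(mu, sd, -np.sin(0.3) + 0.5 * np.cos(0.3))   # estimate vs. truth
```

With three or more well-spread observations, the posterior mean essentially recovers the exact derivative and the posterior standard deviation reflects the shot noise; crucially, the observations can sit anywhere, not just at the ±π/2 shifts of the standard PSR.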
Problem

Research questions and friction points this paper is trying to address.

Standard PSR evaluates gradients only at fixed shifted parameter values, limiting flexibility
PSR gradient estimates come without uncertainty quantification
Observations from earlier SGD steps cannot be reused, inflating measurement costs
Innovation

Methods, ideas, or system contributions that make the work stand out.

Bayesian PSR: a Gaussian-process variant of the parameter-shift rule
Closed-form gradient posteriors at arbitrary points, with uncertainty
GradCoRe criterion minimizes circuit evaluations per SGD step (sketched below)
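
Reading the GradCoRe bullet operationally: within each SGD step, keep all past measurements and request new shifted-point measurements only while the posterior standard deviation of the gradient exceeds a tolerance. The loop below is a hypothetical rendering of that idea (the candidate shift points, tolerance, and stopping rule are our assumptions), reusing gp_gradient_posterior from the sketch under the abstract.

```python
import numpy as np

def gradcore_gradient(theta, observe, X, y, tol=0.05):
    """Hypothetical GradCoRe-style loop: measure new shifted points only
    until the GP posterior std of f'(theta) drops below `tol`.
    X, y are lists of all past (parameter, energy) observations, so
    measurements from earlier SGD steps are reused for free.
    `observe(t)` runs the quantum circuit at parameter value t."""
    candidates = [theta + np.pi / 2, theta - np.pi / 2,
                  theta + np.pi / 4, theta - np.pi / 4]
    for t in candidates:
        if len(X) >= 2:  # need a minimally informative design first
            mu, sd = gp_gradient_posterior(theta, np.asarray(X), np.asarray(y))
            if sd < tol:  # inside the gradient confident region: stop measuring
                return mu, sd
        X.append(t)
        y.append(observe(t))
    return gp_gradient_posterior(theta, np.asarray(X), np.asarray(y))

# Usage inside SGD: observations accumulate across steps.
# X, y = [], []
# for step in range(100):
#     g, _ = gradcore_gradient(theta, observe=measure_energy, X=X, y=y)
#     theta -= 0.2 * g
```

Because X and y persist across SGD steps, early steps pay for measurements that later steps often get for free, which matches the paper's claim that observation reuse plus uncertainty control reduces measurement costs.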