Performance Guarantees for Quantum Neural Estimation of Entropies

📅 2025-11-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
Quantum neural estimators (QNEs) lack non-asymptotic error bounds and a systematic theory for hyperparameter design in quantum entropy and divergence estimation. Method: A hybrid architecture integrating classical neural networks with parameterized quantum circuits is analyzed, with sample complexity characterized for density operator pairs of bounded Thompson metric. Contribution/Results: This work establishes the first rigorous non-asymptotic upper bound on the estimation risk of QNEs for measured (Rényi) relative entropies, proving sub-Gaussian concentration of the estimation error and deriving minimax-optimal dependence on the accuracy ε. The copy complexity is bounded by O(|Θ(𝒰)|d/ε²), where |Θ(𝒰)| denotes the effective parameter count of the quantum circuit; when the input density operators are permutation invariant, the dimensional dependence improves to polylog(d). These results provide the first formal performance guarantees for QNEs, substantially reducing the burden of hyperparameter tuning and supporting their principled implementation.

📝 Abstract
Estimating quantum entropies and divergences is an important problem in quantum physics, information theory, and machine learning. Quantum neural estimators (QNEs), which utilize a hybrid classical-quantum architecture, have recently emerged as an appealing computational framework for estimating these measures. Such estimators combine classical neural networks with parametrized quantum circuits, and their deployment typically entails tedious tuning of hyperparameters controlling the sample size, network architecture, and circuit topology. This work initiates the study of formal guarantees for QNEs of measured (Rényi) relative entropies in the form of non-asymptotic error risk bounds. We further establish exponential tail bounds showing that the error is sub-Gaussian, and thus sharply concentrates about the ground truth value. For an appropriate sub-class of density operator pairs on a space of dimension $d$ with bounded Thompson metric, our theory establishes a copy complexity of $O(|\Theta(\mathcal{U})|d/\varepsilon^2)$ for QNE with a quantum circuit parameter set $\Theta(\mathcal{U})$, which has minimax optimal dependence on the accuracy $\varepsilon$. Additionally, if the density operator pairs are permutation invariant, we improve the dimension dependence above to $O(|\Theta(\mathcal{U})|\mathrm{polylog}(d)/\varepsilon^2)$. Our theory aims to facilitate principled implementation of QNEs for measured relative entropies and guide hyperparameter tuning in practice.
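To make the stated rates concrete, the sketch below evaluates the copy-complexity bound numerically. It is purely illustrative: the leading constant `c` and the choice of `log2(d)**2` for the polylog factor are placeholder assumptions, not values given by the paper, which only states the order of growth.

```python
import math

def qne_copy_complexity(num_circuit_params, d, eps, c=1.0,
                        permutation_invariant=False):
    """Illustrative evaluation of the paper's copy-complexity rates.

    General case: O(|Theta(U)| * d / eps^2).
    Permutation-invariant case: O(|Theta(U)| * polylog(d) / eps^2).
    The constant c and the polylog exponent (here log2(d)^2) are
    placeholders for illustration only, not constants from the paper.
    """
    dim_factor = math.log2(d) ** 2 if permutation_invariant else d
    return c * num_circuit_params * dim_factor / eps ** 2

# Example: a circuit with 50 parameters, dimension d = 1024, accuracy 0.1.
general = qne_copy_complexity(50, 1024, 0.1)
symmetric = qne_copy_complexity(50, 1024, 0.1, permutation_invariant=True)
```

With these placeholder choices the permutation-invariant bound is roughly a factor d/polylog(d) smaller, which is the dimension improvement the abstract highlights.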
Problem

Research questions and friction points this paper is trying to address.

Providing performance guarantees for quantum neural estimation of entropies
Establishing non-asymptotic error bounds for quantum entropy estimators
Analyzing sample complexity for quantum neural network implementations
Innovation

Methods, ideas, or system contributions that make the work stand out.

Hybrid classical-quantum neural network architecture
Non-asymptotic error risk bounds with sub-Gaussian tails
Minimax optimal copy complexity for entropy estimation