🤖 AI Summary
This work investigates the query complexity of quantum channel discrimination, that is, the least number of channel calls required to distinguish between classical or quantum channels at a given error probability. For binary classical channel discrimination, we establish the first exact characterization of the query complexity. Our method leverages the geometric, Holevo, and Uhlmann channel fidelities, together with three Rényi channel divergences (geometric, Petz, and Holevo types), to derive universal upper and lower bounds. The analysis integrates tools from information theory, non-asymptotic hypothesis testing, and channel distinguishability measures. Key contributions include: (i) showing that the query complexity of binary discrimination depends logarithmically on the inverse error probability and inversely on the negative logarithm of the channel fidelity; (ii) deriving an $O(\log M)$ upper bound for $M$-ary channel discrimination; and (iii) establishing bounds for asymmetric discrimination settings governed by geometric Rényi and Petz Rényi channel divergences.
📝 Abstract
Quantum channel discrimination has been studied from an information-theoretic perspective, wherein one is interested in the optimal decay rate of the error probability as a function of the number of accesses to the unknown channel. In this paper, we study the query complexity of quantum channel discrimination, wherein the goal is to determine the minimum number of channel uses needed to reach a desired error probability. To this end, we show that the query complexity of binary channel discrimination depends logarithmically on the inverse error probability and inversely on the negative logarithm of the (geometric and Holevo) channel fidelity. As a special case of these findings, we precisely characterize the query complexity of discriminating between two classical channels. We also provide lower and upper bounds on the query complexity of binary asymmetric channel discrimination and multiple quantum channel discrimination. For the former, the query complexity depends on the geometric Rényi and Petz Rényi channel divergences, while for the latter, it depends on the negative logarithm of the (geometric and Uhlmann) channel fidelity. For multiple channel discrimination, the upper bound scales as the logarithm of the number of channels.
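The stated scaling for the classical special case can be illustrated numerically. The sketch below is an illustration of the claimed scaling $n \approx \log(1/\varepsilon) / (-\log F)$, not the paper's actual bound (constants and the precise fidelity notion used in the paper are not reproduced here). For classical channels, the fidelity between two output distributions is the Bhattacharyya coefficient $\sum_i \sqrt{p_i q_i}$, and we take the channel fidelity as its minimum over input symbols; both choices are assumptions made for this toy example.

```python
import math

def fidelity(p, q):
    """Bhattacharyya fidelity sum_i sqrt(p_i * q_i) between two distributions."""
    return sum(math.sqrt(pi * qi) for pi, qi in zip(p, q))

def channel_fidelity(N, M):
    """Worst-case (minimum over inputs) output-distribution fidelity between
    two classical channels, each given as a list of rows (one output
    distribution per input symbol)."""
    return min(fidelity(p, q) for p, q in zip(N, M))

def query_estimate(N, M, eps):
    """Rough query-count estimate n ~ log(1/eps) / (-log F), per the
    logarithmic-in-1/eps, inverse-in-(-log F) scaling stated above."""
    F = channel_fidelity(N, M)
    return math.ceil(math.log(1 / eps) / (-math.log(F)))

# Two binary-input, binary-output classical channels (rows: output dists per input).
N = [[0.9, 0.1], [0.2, 0.8]]
M = [[0.6, 0.4], [0.5, 0.5]]
print(channel_fidelity(N, M))      # worst-case fidelity, about 0.935
print(query_estimate(N, M, 1e-3))  # estimated number of channel uses
```

Note how halving the error target only adds a constant number of queries (logarithmic dependence), while channels that are closer in fidelity ($F \to 1$) drive the estimate up through the $-\log F$ denominator.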