🤖 AI Summary
This work investigates the statistical efficiency of adaptive measurements in quantum shadow tomography under the physically motivated constraint that each measurement may entangle at most two copies of the state (e.g., Bell-type measurements on pairs of copies). Contrary to the recurring theme that adaptivity yields only marginal gains under such restricted measurements, we prove that adaptive two-copy measurements achieve an exponential reduction in sample complexity, namely $O(\log M)$ copies, whereas any nonadaptive two-copy scheme requires $\Omega(\sqrt{M})$ copies. Technically, the analysis combines information-theoretic lower-bound techniques with an explicit construction of an adaptive measurement sequence. This establishes the indispensable role of adaptivity even for low-order, experimentally feasible measurements, yielding both a theoretical separation and practical guidance for resource-constrained quantum learning protocols.
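To make the size of the claimed separation concrete, the sketch below tabulates the two asymptotic scalings from the summary for a few values of $M$ (the number of observables). This is purely illustrative: it plots the growth rates $\log M$ and $\sqrt{M}$ up to unspecified constants and says nothing about the actual protocols or lower-bound constructions in the paper.

```python
import math

def adaptive_two_copy_samples(M: int) -> float:
    """Upper-bound scaling for adaptive two-copy protocols: O(log M).
    Constants omitted; illustrative only."""
    return math.log2(M)

def nonadaptive_two_copy_samples(M: int) -> float:
    """Lower-bound scaling for any nonadaptive two-copy protocol: Omega(sqrt(M)).
    Constants omitted; illustrative only."""
    return math.sqrt(M)

# The gap widens exponentially as M grows.
for M in (10**3, 10**6, 10**9):
    print(f"M = {M:>10}: adaptive ~ {adaptive_two_copy_samples(M):8.1f}, "
          f"nonadaptive ~ {nonadaptive_two_copy_samples(M):10.1f}")
```

For $M = 10^9$ observables, the adaptive scaling is about $30$ while the nonadaptive lower bound is already above $3 \times 10^4$, which is the sense in which the separation is exponential.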
📝 Abstract
In recent years there has been significant interest in understanding the statistical complexity of learning from quantum data under the constraint that one can only make unentangled measurements. While a key challenge in establishing tight lower bounds in this setting is to deal with the fact that the measurements can be chosen in an adaptive fashion, a recurring theme has been that adaptivity offers little advantage over more straightforward, nonadaptive protocols. In this note, we offer a counterpoint to this. We show that for the basic task of shadow tomography, protocols that use adaptively chosen two-copy measurements can be exponentially more sample-efficient than any protocol that uses nonadaptive two-copy measurements.