🤖 AI Summary
This work investigates the scalability and memory retention of scrambling quantum systems for temporal information processing. We develop a quantum reservoir computing (QRC) framework based on high-order unitary designs and rigorously analyze its behavior in both noiseless and noisy settings, combining local noise modeling, concentration inequalities, and random matrix theory. In the noiseless case, we establish that measurement-based readouts concentrate exponentially with reservoir size yet, strikingly, do not degrade further with reservoir iterations, whereas memory of the initial state and early inputs decays exponentially in both reservoir size and iteration count. In the noisy case, we prove that local noise channels induce additional exponential memory decay with iterations. The proofs introduce new techniques for bounding concentration in temporal quantum learning models. Together, the results indicate that small, reusable reservoirs are viable, while scaling up the problem size deteriorates generalization unless one can afford an exponential shot overhead, providing theoretical foundations for the limits of quantum temporal learning and informing hardware-aware design.
📝 Abstract
Scrambling quantum systems have been demonstrated as effective substrates for temporal information processing. While their role in providing rich feature maps has been widely studied, a theoretical understanding of their performance in temporal tasks is still lacking. Here we consider a general quantum reservoir processing framework that captures a broad range of physical computing models with quantum systems. We examine the scalability and memory retention of the model with scrambling reservoirs modelled by high-order unitary designs in both noiseless and noisy settings. In the former regime, we show that measurement readouts become exponentially concentrated with increasing reservoir size, yet strikingly do not worsen with the reservoir iterations. Thus, while repeatedly reusing a small scrambling reservoir with quantum data might be viable, scaling up the problem size deteriorates generalization unless one can afford an exponential shot overhead. In contrast, the memory of early inputs and initial states decays exponentially in both reservoir size and reservoir iterations. In the noisy regime, we also prove exponential memory decays with iterations for local noisy channels. Proving these results required us to introduce new proof techniques for bounding concentration in temporal quantum learning models.
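As an illustrative numerical sketch (not taken from the paper): the exponential concentration of readouts can already be seen for Haar-random unitaries, which reproduce the low moments of a high-order unitary design. The snippet below samples Haar-random "reservoir" unitaries via the QR decomposition of a complex Ginibre matrix, applies them to a fixed input state, and estimates the variance of a single-qubit Z readout across samples; for a Haar-random pure state in dimension d this variance is 1/(d+1), so it roughly halves with each added qubit. All function names here are ours, chosen for the sketch.

```python
import numpy as np

def haar_unitary(dim, rng):
    """Sample a Haar-random unitary (QR of a complex Ginibre matrix, phase-corrected)."""
    z = (rng.standard_normal((dim, dim)) + 1j * rng.standard_normal((dim, dim))) / np.sqrt(2)
    q, r = np.linalg.qr(z)
    d = np.diag(r)
    return q * (d / np.abs(d))  # rescale columns so the distribution is exactly Haar

def readout_variance(n_qubits, n_samples=200, seed=0):
    """Variance of the <Z_1> readout over Haar-random reservoirs on n_qubits."""
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    # Z on the first qubit, identity on the rest
    z1 = np.kron(np.diag([1.0, -1.0]), np.eye(dim // 2))
    vals = []
    for _ in range(n_samples):
        psi = haar_unitary(dim, rng)[:, 0]  # U|0...0> is the first column of U
        vals.append(np.real(np.vdot(psi, z1 @ psi)))
    return np.var(vals)

for n in (2, 4, 6):
    # Variance shrinks ~ 1/(2^n + 1): exponential concentration in reservoir size
    print(f"n={n}: Var[<Z1>] ~ {readout_variance(n):.4f}")
```

Estimating a readout concentrated at this scale to fixed relative precision then requires a number of measurement shots growing exponentially with reservoir size, which is the shot-overhead bottleneck the abstract refers to.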