🤖 AI Summary
Stochastic reservoir computing, including quantum reservoir computing, has lacked a rigorous theoretical foundation, in particular formal universality results. Method: We propose using the probability distribution over the stochastic reservoir's states, rather than the states themselves, as the readout signal. We prove that classes of stochastic echo state networks form universal approximating classes. Because the number of joint states grows exponentially with the number of reservoir nodes, this probabilistic readout yields an exponentially large effective readout dimension from compact hardware. Results: When shot noise is small, our approach significantly outperforms a deterministic reservoir computer with comparable hardware on both classification and chaotic time-series prediction, circumventing the dimensionality bottleneck of conventional deterministic readouts.
📝 Abstract
Reservoir computing is a form of machine learning that utilizes nonlinear dynamical systems to perform complex tasks in a cost-effective manner when compared to typical neural networks. Recent advancements in reservoir computing, in particular quantum reservoir computing, use reservoirs that are inherently stochastic. In this paper, we investigate the universality of stochastic reservoir computers that use the probabilities of each stochastic reservoir state as the readout instead of the states themselves. This allows the number of readouts to scale exponentially with the size of the reservoir hardware, offering the advantage of compact device size. We prove that classes of stochastic echo state networks form universal approximating classes. We also investigate the performance of two practical examples in classification and chaotic time series prediction. While shot noise is a limiting factor, we show significantly improved performance compared to a deterministic reservoir computer with similar hardware when noise effects are small.
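To make the probabilistic-readout idea concrete, the sketch below builds a toy stochastic reservoir of `N_NODES` binary nodes, estimates the empirical distribution over its `2**N_NODES` joint states from repeated shots, and trains a linear (ridge) readout on those probability vectors. This is a minimal illustration under assumed dynamics (sigmoid flip probabilities, a one-step-prediction toy task), not the paper's construction; all weights and parameters here are hypothetical. Note how the readout dimension (`N_STATES`) scales exponentially with the hardware size (`N_NODES`), and how finite `N_SHOTS` introduces the shot noise the abstract identifies as the limiting factor.

```python
import numpy as np

rng = np.random.default_rng(0)

N_NODES = 3                 # reservoir hardware size (illustrative)
N_STATES = 2 ** N_NODES     # readout dimension grows exponentially
N_SHOTS = 2000              # repeated runs used to estimate probabilities

# Hypothetical stochastic dynamics: each binary node turns on with a
# probability set by the input and the previous joint state.
W_in = rng.normal(size=N_NODES)
W_rec = rng.normal(scale=0.5, size=(N_NODES, N_NODES))

def step(states, u):
    """Advance all shot copies one time step; states is (N_SHOTS, N_NODES)."""
    p_on = 1.0 / (1.0 + np.exp(-(states @ W_rec.T + u * W_in)))
    return (rng.random(states.shape) < p_on).astype(float)

def prob_readout(states):
    """Empirical distribution over the 2**N_NODES joint reservoir states."""
    idx = states @ (2 ** np.arange(N_NODES))   # encode each joint state as int
    counts = np.bincount(idx.astype(int), minlength=N_STATES)
    return counts / counts.sum()

# Drive the reservoir with a scalar input sequence and collect the
# probability vectors as features for a linear readout.
T = 200
u_seq = np.sin(0.3 * np.arange(T)) + 0.1 * rng.normal(size=T)
target = np.roll(u_seq, -1)                    # toy task: one-step prediction

states = np.zeros((N_SHOTS, N_NODES))
features = np.empty((T, N_STATES))
for t in range(T):
    states = step(states, u_seq[t])
    features[t] = prob_readout(states)

# Ridge-regression readout trained on the probability features.
lam = 1e-6
A = features.T @ features + lam * np.eye(N_STATES)
w = np.linalg.solve(A, features.T @ target)
pred = features @ w
mse = np.mean((pred - target) ** 2)
```

A deterministic reservoir with the same three nodes would expose only 3 readout channels; the probability readout exposes 8, at the cost of needing many shots per time step to keep the estimation (shot) noise in `features` small.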