Residual Reservoir Memory Networks

πŸ“… 2025-08-13
πŸ“ˆ Citations: 0
✨ Influential: 0
πŸ€– AI Summary
To address the limited ability of conventional reservoir computing (RC) models to capture long-term dependencies, this paper proposes the Residual Reservoir Memory Network (ResRMN), a training-free recurrent architecture that combines a linear memory reservoir with a nonlinear reservoir and introduces residual orthogonal connections along the temporal dimension to enhance long-range propagation of the input. Orthogonal constraints on the reservoir weights improve the linear stability of the reservoir state dynamics and enable flexible residual configurations, which the paper characterizes theoretically via linear stability analysis. Experiments on multiple time-series forecasting and pixel-level 1-D classification benchmarks show that ResRMN consistently outperforms classical RC variants, with notable gains in prediction accuracy, robustness to noise and hyperparameter variation, and long-range dependency capture.
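The state update described above can be sketched in NumPy. This is a minimal illustration, not the paper's exact formulation: it assumes a transition of the form x_{t+1} = Ξ± O x_t + Ξ² tanh(W x_t + W_in u_t), where O is an orthogonal residual matrix acting along the temporal dimension; the sizes and the coefficients Ξ± and Ξ² are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_orthogonal(n, rng):
    # The Q factor of a Gaussian matrix's QR decomposition is orthogonal.
    q, _ = np.linalg.qr(rng.standard_normal((n, n)))
    return q

n_units, n_in = 50, 1
alpha, beta = 0.9, 0.5  # residual / nonlinear mixing coefficients (illustrative values)

O = random_orthogonal(n_units, rng)            # orthogonal residual branch (norm-preserving)
W = 0.9 * random_orthogonal(n_units, rng)      # untrained recurrent reservoir weights
W_in = rng.uniform(-0.1, 0.1, (n_units, n_in))  # untrained input weights

def step(x, u):
    # Residual orthogonal connection along time plus a nonlinear reservoir update:
    # x_{t+1} = alpha * O @ x_t + beta * tanh(W @ x_t + W_in @ u_t)
    return alpha * O @ x + beta * np.tanh(W @ x + W_in @ u)

# Drive the untrained reservoir with a random input sequence.
x = np.zeros(n_units)
for u in rng.standard_normal((100, n_in)):
    x = step(x, u)
```

In the RC paradigm, only a linear readout trained on the collected states (e.g. by ridge regression) would be fitted; the recurrent weights above stay fixed.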

πŸ“ Abstract
We introduce a novel class of untrained Recurrent Neural Networks (RNNs) within the Reservoir Computing (RC) paradigm, called Residual Reservoir Memory Networks (ResRMNs). ResRMN combines a linear memory reservoir with a non-linear reservoir, where the latter is based on residual orthogonal connections along the temporal dimension for enhanced long-term propagation of the input. The resulting reservoir state dynamics are studied through the lens of linear stability analysis, and we investigate diverse configurations for the temporal residual connections. The proposed approach is empirically assessed on time-series and pixel-level 1-D classification tasks. Our experimental results highlight the advantages of the proposed approach over other conventional RC models.
Problem

Research questions and friction points this paper is trying to address.

How to endow untrained recurrent networks with long-term memory via residual connections
How the resulting reservoir state dynamics behave, studied via linear stability analysis
How the approach performs on time-series and pixel-level classification tasks
Innovation

Methods, ideas, or system contributions that make the work stand out.

Combines linear and nonlinear memory reservoirs
Uses residual orthogonal temporal connections
Enhances long-term input propagation dynamics
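The linear stability analysis mentioned above can be illustrated with a small sketch. Assuming the update x_{t+1} = Ξ± O x_t + Ξ² tanh(W x_t + W_in u_t), linearizing around the origin gives the Jacobian J = Ξ± O + Ξ² W (since tanh'(0) = 1), and the spectral radius of J governs local stability. All matrices and coefficients here are hypothetical, for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
alpha, beta = 0.9, 0.5  # hypothetical residual / nonlinear coefficients

# Orthogonal residual matrix O and untrained recurrent weights W.
O, _ = np.linalg.qr(rng.standard_normal((n, n)))
W = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)

# Jacobian of the update at x = 0: J = alpha * O + beta * W.
J = alpha * O + beta * W

# Spectral radius: the update is locally stable when rho < 1.
rho = np.max(np.abs(np.linalg.eigvals(J)))
```

Because O is orthogonal, the residual branch alone contributes eigenvalues of modulus exactly Ξ±, which is what makes the residual configuration easy to reason about and tune.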