🤖 AI Summary
This paper studies nonlinear filters on Euclidean spaces that are causal, time-invariant, and have fading memory, and proposes a universal Volterra reservoir kernel for learning them. Methodologically, it combines the state-space representation of the Volterra series with reservoir computing to construct a recursively computable kernel map on an infinite-dimensional tensor algebra, balancing theoretical universality with practical implementability. Theoretically, estimation with the kernel falls under the representer theorem, and its sections uniformly approximate any fading-memory filter. Empirically, the kernel significantly improves modeling accuracy and generalization in Bitcoin price forecasting, a challenging nonlinear time-series task. The core contributions are: (i) a unified Volterra-reservoir modeling paradigm; (ii) an efficient recursive implementation of an infinite-dimensional kernel; and (iii) a framework for learning dynamical systems that jointly ensures expressive power and computational feasibility.
📝 Abstract
A universal kernel is constructed whose sections approximate any causal, time-invariant filter in the fading-memory category with inputs and outputs in a finite-dimensional Euclidean space. The kernel is built from the reservoir functional associated with a state-space representation of the Volterra series expansion available for any analytic fading-memory filter; it is hence called the Volterra reservoir kernel. Although the state-space representation and the corresponding reservoir feature map are defined on an infinite-dimensional tensor algebra, the kernel map is characterized by explicit recursions that are readily computable for specific data sets when the kernel is used in estimation problems via the representer theorem. We showcase the performance of the Volterra reservoir kernel in a popular data science application: Bitcoin price prediction.
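To make the "explicit recursions plus representer theorem" idea concrete, the following is a minimal sketch, not the paper's exact construction. It assumes an illustrative recursion of the form K_t = 1 + λ²⟨z_t, z̄_t⟩K_{t−1} (one natural consequence of a tensor-algebra state map of the form x_t = 1 + λ z_t ⊗ x_{t−1}; the paper's precise recursions may differ), and all names (`volterra_kernel`, `fit_predict`), the value of λ, and the ridge regularizer are hypothetical choices for illustration.

```python
import numpy as np

def volterra_kernel(z, zbar, lam=0.5, K0=1.0):
    """Illustrative recursive sequence kernel: K_t = 1 + lam^2 <z_t, zbar_t> K_{t-1}.

    z, zbar: arrays of shape (T, d) holding two input sequences.
    lam should be small enough relative to the input magnitudes for the
    recursion to behave like a fading-memory functional.
    Returns the kernel value at the final time step.
    """
    K = K0
    for t in range(z.shape[0]):
        K = 1.0 + lam**2 * float(z[t] @ zbar[t]) * K
    return K

def fit_predict(train_seqs, y, test_seqs, lam=0.5, ridge=1e-6):
    """Kernel ridge regression via the representer theorem.

    Solves (G + ridge*I) alpha = y on the training Gram matrix G, then
    predicts k(test, train) @ alpha for each test sequence.
    """
    n = len(train_seqs)
    G = np.array([[volterra_kernel(a, b, lam) for b in train_seqs]
                  for a in train_seqs])
    alpha = np.linalg.solve(G + ridge * np.eye(n), y)
    k = np.array([[volterra_kernel(s, b, lam) for b in train_seqs]
                  for s in test_seqs])
    return k @ alpha
```

In a forecasting setup one would take `train_seqs` to be sliding windows of the (normalized) price series and `y` the next-step returns; only the scalar recursion is ever evaluated, so the infinite-dimensional feature space never has to be materialized.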