Infinite-dimensional next-generation reservoir computing

📅 2024-12-13
🏛️ arXiv.org
🤖 AI Summary
Next-generation reservoir computing (NG-RC) struggles with infinite-dimensional covariates in spatio-temporal forecasting of complex systems and requires manual specification of lag and polynomial orders, which limits its theoretical generality and practical robustness. Method: the paper proposes a kernel ridge regression (KRR) formulation that embeds NG-RC into a universal reproducing kernel Hilbert space (RKHS), enabling an implicit representation of infinite-dimensional dynamics without explicit parametrization of temporal lags or polynomial bases. Contribution/Results: this work presents the first infinite-dimensional generalization of NG-RC, achieving both lag-order independence and polynomial-dimension independence. The method is theoretically grounded and empirically validated, eliminates tedious hyperparameter tuning, and consistently outperforms standard NG-RC in training efficiency and generalization across diverse spatio-temporal forecasting benchmarks.

📝 Abstract
Next-generation reservoir computing (NG-RC) has attracted much attention due to its excellent performance in spatio-temporal forecasting of complex systems and its ease of implementation. This paper shows that NG-RC can be encoded as a kernel ridge regression that makes training efficient and feasible even when the space of chosen polynomial features is very large. Additionally, an extension to an infinite number of covariates is possible, which makes the methodology agnostic with respect to the lags into the past that are considered as explanatory factors, as well as with respect to the number of polynomial covariates, an important hyperparameter in traditional NG-RC. We show that this approach has solid theoretical backing and good behavior based on kernel universality properties previously established in the literature. Various numerical illustrations show that these generalizations of NG-RC outperform the traditional approach in several forecasting applications.
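The abstract's central idea, that NG-RC's polynomial features of lagged states can be handled implicitly through a kernel, can be sketched in a few lines. The following is a minimal illustration, not the paper's exact construction: the function names (`delay_embed`, `krr_forecast`), the choice of polynomial kernel `(1 + x·y)^p`, and the lag/degree/regularization values are all assumptions made for the example. The point is that the kernel trick evaluates the Gram matrix over raw lag windows, so the (possibly very large) space of monomial features never has to be built explicitly.

```python
import numpy as np

def delay_embed(series, lags):
    # Stack lag windows [x_t, x_{t-1}, ..., x_{t-lags+1}] as rows (newest first).
    T = len(series)
    return np.column_stack([series[lags - 1 - k : T - k] for k in range(lags)])

def krr_forecast(series, lags=2, degree=3, reg=1e-6):
    # One-step-ahead forecasting via kernel ridge regression.
    Z = delay_embed(series[:-1], lags)   # covariates: lagged state windows
    y = series[lags:]                    # targets: next value after each window
    # Polynomial kernel = implicit inner product of all monomial features
    # up to the given degree, without constructing them explicitly.
    K = (1.0 + Z @ Z.T) ** degree
    alpha = np.linalg.solve(K + reg * np.eye(len(K)), y)
    def predict(window):
        # window: the last `lags` observed values, newest first
        k = (1.0 + Z @ window) ** degree
        return alpha @ k
    return predict

# Toy usage: forecast the logistic map, whose update rule is quadratic
# and therefore lies exactly in the degree-3 polynomial feature space.
x = np.empty(200)
x[0] = 0.4
for t in range(199):
    x[t + 1] = 3.7 * x[t] * (1 - x[t])
predict = krr_forecast(x[:150], lags=2, degree=3)
```

Training cost scales with the number of samples (the Gram matrix) rather than with the number of polynomial covariates, which is what makes large feature spaces feasible.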
Problem

Research questions and friction points this paper is trying to address.

Efficient training for complex spatio-temporal forecasting
Extension to infinite covariates in NG-RC
Improved performance over traditional NG-RC methods
Innovation

Methods, ideas, or system contributions that make the work stand out.

Infinite-dimensional kernel ridge regression
Agnostic to past lags
Outperforms traditional NG-RC
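One way to picture the lag-agnostic extension listed above is a kernel that acts on entire histories with geometrically decaying weights, so that no lag cutoff or polynomial order needs fixing in advance. The kernel below is an illustrative stand-in chosen for this sketch, not necessarily the universal kernel used in the paper; the name `decayed_history_kernel` and the decay rate `lam` are assumptions.

```python
import numpy as np

def decayed_history_kernel(u, v, lam=0.6):
    # u, v: full observed histories, newest value first.
    # Geometric weights down-weight older lags, so every past lag
    # contributes, just with vanishing influence. Taking exp of the
    # weighted inner product implicitly sums monomial features of
    # all polynomial degrees (illustrative choice, not the paper's).
    n = min(len(u), len(v))
    w = lam ** np.arange(n)          # weights 1, lam, lam^2, ... into the past
    return np.exp(np.dot(w * u[:n], w * v[:n]))
```

Plugging such a kernel into the ridge regression above removes both hyperparameters that traditional NG-RC requires: the number of lags (all of them enter, discounted) and the polynomial order (all degrees enter through the exponential series).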