Fast Gaussian Process Approximations for Autocorrelated Data

📅 2025-12-02
📈 Citations: 0
Influential: 0
🤖 AI Summary
Gaussian process (GP) regression is computationally expensive at scale, and on autocorrelated data (e.g., time-series or spatial data) conventional fast approximations, which assume i.i.d. noise, can suffer from temporal overfitting. Method: This paper proposes a fast GP approximation framework for data preprocessed by blocking, so that the resulting blocks are de-correlated. Its core innovation is a systematic adaptation of sparse GP methods and structured covariance decomposition techniques to such blocked, de-correlated data, preserving theoretical consistency while mitigating temporal overfitting. Contribution/Results: On multiple real-world autocorrelated datasets, the method accelerates inference by one to two orders of magnitude relative to exact GP inference while attaining prediction errors comparable to those of the exact GP and significantly lower than those of state-of-the-art GP approximations, offering a scalable route to GP modeling in non-i.i.d. settings.
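For context on why approximation is needed at all: exact GP prediction requires factorizing the n-by-n training covariance matrix, which costs O(n^3). A minimal NumPy sketch of exact GP regression (not the paper's code; kernel, lengthscale, and noise values are illustrative) makes the bottleneck visible:

```python
import numpy as np

def rbf_kernel(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def exact_gp_predict(x_train, y_train, x_test, noise=0.1):
    """Exact GP posterior mean; the Cholesky factorization is the O(n^3) step."""
    K = rbf_kernel(x_train, x_train) + noise**2 * np.eye(len(x_train))
    L = np.linalg.cholesky(K)                         # O(n^3) bottleneck
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    return rbf_kernel(x_test, x_train) @ alpha

# Noisy sine data as a stand-in for an autocorrelated signal.
x = np.linspace(0, 10, 200)
y = np.sin(x) + 0.1 * np.random.default_rng(0).standard_normal(200)
mu = exact_gp_predict(x, y, np.array([2.5, 7.5]))
```

The cubic cost of the Cholesky step is what the fast approximations studied here are designed to avoid.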

📝 Abstract
This paper is concerned with how to speed up computation for Gaussian process models trained on autocorrelated data. The Gaussian process model is a powerful tool commonly used in nonlinear regression applications. Standard regression modeling assumes random samples and independent and identically distributed noise, and various fast approximations that speed up Gaussian process regression work under this standard setting. For autocorrelated data, however, failing to account for autocorrelation leads to a phenomenon known as temporal overfitting, which deteriorates model performance on new test instances. To handle autocorrelated data, existing fast Gaussian process approximations must be modified; one such approach is to segment the originally correlated data points into blocks so that the blocked data are de-correlated. This work explains how to make some of the existing Gaussian process approximations work with blocked data. Numerical experiments across diverse application datasets demonstrate that the proposed approaches can remarkably accelerate computation for Gaussian process regression on autocorrelated data without compromising model prediction performance.
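The blocking idea in the abstract — keeping correlated neighbors together so that distinct blocks are approximately de-correlated — also implies that train/test splits should be done by whole blocks rather than by individual points, which is how temporal overfitting is diagnosed. A rough sketch (the function name, block size, and fractions are illustrative, not from the paper):

```python
import numpy as np

def blocked_train_test_split(n, block_size, test_frac=0.2, seed=0):
    """Split autocorrelated data by whole contiguous blocks, not by
    individual points, so train and test sets do not share correlated
    neighbors (a random point-wise split would leak autocorrelation)."""
    rng = np.random.default_rng(seed)
    blocks = [np.arange(s, min(s + block_size, n))
              for s in range(0, n, block_size)]
    rng.shuffle(blocks)                      # shuffle blocks, not points
    n_test = max(1, int(test_frac * len(blocks)))
    test_idx = np.concatenate(blocks[:n_test])
    train_idx = np.concatenate(blocks[n_test:])
    return np.sort(train_idx), np.sort(test_idx)

# 1000 autocorrelated observations, blocks of 100, 2 blocks held out.
train_idx, test_idx = blocked_train_test_split(1000, 100)
```

With a point-wise random split, every test point would sit next to a correlated training point, making test error look deceptively low; the block-wise split avoids that.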
Problem

Research questions and friction points this paper is trying to address.

Speeds up Gaussian process computation for autocorrelated data
Modifies approximations to prevent temporal overfitting in models
Uses data blocking to maintain prediction performance while accelerating
Innovation

Methods, ideas, or system contributions that make the work stand out.

Blocking autocorrelated data for decorrelation
Modifying existing Gaussian process approximations
Accelerating computation without performance loss
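The sparse GP approximations that the paper adapts to blocked data typically reduce cost through a small set of inducing points. As a generic illustration of that family (a subset-of-regressors / Nyström-type predictor, not the paper's specific method; kernel and values are assumed for the example), cost drops from O(n^3) to O(n m^2) for m inducing points:

```python
import numpy as np

def rbf(a, b, ls=1.0):
    """Squared-exponential kernel on 1-D inputs."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ls) ** 2)

def sor_gp_predict(x, y, x_new, x_ind, noise=0.1):
    """Subset-of-regressors predictive mean using m inducing points:
    solves only an m-by-m system, O(n m^2) instead of O(n^3)."""
    Kmm = rbf(x_ind, x_ind)                  # m x m
    Knm = rbf(x, x_ind)                      # n x m
    A = noise**2 * Kmm + Knm.T @ Knm         # m x m system
    w = np.linalg.solve(A, Knm.T @ y)
    return rbf(x_new, x_ind) @ w

rng = np.random.default_rng(1)
x = np.linspace(0, 10, 500)
y = np.sin(x) + 0.05 * rng.standard_normal(500)
x_ind = np.linspace(0, 10, 20)               # m = 20 inducing points
pred = sor_gp_predict(x, y, np.array([5.0]), x_ind)
```

The paper's contribution can be read as making approximations of this kind behave correctly when the training data are blocked and de-correlated rather than i.i.d.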
Ahmadreza Chokhachian
H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA 30339
Matthias Katzfuss
Professor of Statistics, University of Wisconsin–Madison
Spatio-Temporal Statistics · Gaussian Processes · UQ · Probabilistic Machine Learning
Yu Ding
H. Milton Stewart School of Industrial and Systems Engineering, Georgia Institute of Technology, Atlanta, GA 30339