🤖 AI Summary
To address the high computational complexity and poor scalability of kernel matrix–vector multiplication (MVM) in Gaussian process inference—particularly for large-scale, low-dimensional datasets—this paper introduces the first exact fast MVM algorithm for multivariate Matérn kernels with half-integer smoothness parameters. Methodologically, the approach leverages an analytic decomposition of the Matérn kernel into weighted empirical cumulative distribution functions, combined with a divide-and-conquer strategy and a persistent sorted data structure, and it additionally supports linear fixed-effect predictors. This yields an overall time complexity of *O*(*N* log *N*). Experiments demonstrate substantial speedups over standard implementations on low-dimensional datasets comprising hundreds of thousands of points. The implementation is publicly available. The core contribution is the first exact, scalable MVM decomposition for Matérn kernels—gaining computational efficiency without sacrificing numerical accuracy or modeling expressivity.
📝 Abstract
To speed up Gaussian process inference, a number of fast kernel matrix-vector multiplication (MVM) approximation algorithms have been proposed over the years. In this paper, we establish an exact fast kernel MVM algorithm based on an exact decomposition of the kernel into weighted empirical cumulative distribution functions, compatible with a class of kernels that includes multivariate Matérn kernels with half-integer smoothness parameter. This algorithm uses a divide-and-conquer approach, during which sorting outputs are stored in a data structure. We also propose a new algorithm to account for a linear fixed-effects predictor function. Our numerical experiments confirm that our algorithm is very effective for low-dimensional Gaussian process inference problems with hundreds of thousands of data points. An implementation of our algorithm is available at https://gitlab.com/warin/fastgaussiankernelregression.git.
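To make the decomposition idea concrete, here is a minimal sketch of the simplest special case: the one-dimensional Matérn ν = 1/2 (exponential) kernel, where exp(-|x_i - x_j|/ℓ) splits into a forward and a backward exponentially weighted cumulative sum over the sorted inputs, giving an exact MVM in O(N log N) (the sort) plus O(N) work. This is only an illustration under those assumptions, not the paper's actual algorithm, which handles multivariate inputs, higher half-integer smoothness, and a persistent sorted data structure; the function name `matern12_mvm` is our own.

```python
import numpy as np

def matern12_mvm(x, v, ell=1.0):
    """Exact MVM out = K @ v for K_ij = exp(-|x_i - x_j| / ell),
    via two cumulative passes over the sorted points (illustrative sketch)."""
    order = np.argsort(x)                  # O(N log N)
    xs, vs = x[order], v[order]
    n = len(xs)
    gaps = np.exp(-np.diff(xs) / ell)      # decay factor between neighbours

    # fwd[i] = sum_{j <= i} exp(-(xs_i - xs_j)/ell) * vs_j
    fwd = np.empty(n)
    fwd[0] = vs[0]
    for i in range(1, n):
        fwd[i] = fwd[i - 1] * gaps[i - 1] + vs[i]

    # bwd[i] = sum_{j > i} exp(-(xs_j - xs_i)/ell) * vs_j
    bwd = np.empty(n)
    bwd[-1] = 0.0
    for i in range(n - 2, -1, -1):
        bwd[i] = (bwd[i + 1] + vs[i + 1]) * gaps[i]

    out = np.empty(n)
    out[order] = fwd + bwd                 # undo the sort
    return out
```

Note the recurrences avoid evaluating exp(±x_i/ℓ) directly, which keeps the computation numerically stable even when the inputs span a wide range.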