🤖 AI Summary
High-dimensional function approximation in reproducing kernel Hilbert spaces (RKHS) suffers from the “curse of dimensionality” when using regular grid sampling, leading to exponential growth in sample complexity.
Method: We propose a high-dimensional irregular sampling scheme based on tensor products of low-dimensional deterministic quasi-random sequences (e.g., Halton, Sobol’), leveraging the tensor-product structure of the underlying RKHS kernel. Unlike conventional grid-based approaches, this construction yields sparse, mathematically tractable, non-uniform sampling sets.
Contribution/Results: We establish theoretical guarantees for stable function reconstruction in RKHS under this sampling scheme. The method reduces computational complexity from exponential to nearly linear in dimension, while preserving analytical rigor. It provides a principled, scalable framework for modeling irregular high-dimensional data, sparse sampling, and efficient function approximation, bridging theoretical analysis with practical algorithmic design.
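As a rough illustration of the tensor construction (a sketch under assumptions not stated in the summary: a Gaussian kernel, Halton points built from van der Corput radical inverses, and arbitrary small sample sizes), two low-dimensional Halton sets can be composed into a higher-dimensional sampling set whose tensor-product Gram matrix factors as a Kronecker product:

```python
import numpy as np

def van_der_corput(n, base):
    """First n points of the radical-inverse (van der Corput) sequence."""
    seq = np.zeros(n)
    for i in range(n):
        f, k = 1.0, i + 1
        while k > 0:
            f /= base
            seq[i] += f * (k % base)
            k //= base
    return seq

def halton(n, primes):
    """n Halton points in len(primes) dimensions, one prime base per axis."""
    return np.column_stack([van_der_corput(n, p) for p in primes])

# Two low-dimensional irregular sample sets (sizes chosen for illustration)
X = halton(8, (2, 3))   # 8 quasi-random points in [0,1]^2
Y = halton(8, (5, 7))   # 8 quasi-random points in [0,1]^2

# Tensor (Cartesian) product: 64 points in [0,1]^4
XY = np.array([np.concatenate([x, y]) for x in X for y in Y])

def gauss_gram(P, gamma=1.0):
    """Gram matrix of the Gaussian kernel exp(-gamma * ||p - q||^2)."""
    d2 = ((P[:, None, :] - P[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# The Gaussian kernel is a tensor-product kernel, so the Gram matrix of
# the composed 4-D set factorizes into a Kronecker product of the Gram
# matrices of the two 2-D factor sets.
G_full = gauss_gram(XY)
G_kron = np.kron(gauss_gram(X), gauss_gram(Y))
```

This Kronecker factorization is where the complexity reduction comes from in the sketch: operations with the 64 × 64 Gram matrix reduce to operations with the two 8 × 8 factors, and the same factorization extends across many low-dimensional blocks.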
📝 Abstract
We develop sampling formulas for high-dimensional functions in reproducing kernel Hilbert spaces, where we rely on irregular samples that are taken at determining sequences of data points. We place particular emphasis on sampling formulas for tensor product kernels, where we show that determining irregular samples in lower dimensions can be composed to obtain a tensor of determining irregular samples in higher dimensions. This in turn reduces the computational complexity of sampling formulas for high-dimensional functions quite significantly.
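The abstract's sampling formulas are not reproduced here, but the basic setting can be sketched with standard kernel interpolation (an illustrative stand-in, not the paper's method): given irregular samples of a function, solve for coefficients in the Gram matrix and reconstruct the function anywhere in the domain. The jittered-grid sample set, Gaussian kernel, bandwidth, and test function below are all hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# An irregular sample set in [0,1]^2: a 4x4 grid of cell centers, each
# perturbed by a small random shift (so points stay well separated).
grid = np.mgrid[0:4, 0:4].reshape(2, -1).T / 4.0 + 0.125
X = grid + rng.uniform(-0.08, 0.08, size=(16, 2))

def gram(P, Q, gamma=50.0):
    """Cross-Gram matrix of the Gaussian kernel exp(-gamma * ||p - q||^2)."""
    d2 = ((P[:, None, :] - Q[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# A test function sampled at the irregular points
f = lambda P: np.sin(2 * np.pi * P[:, 0]) * np.cos(np.pi * P[:, 1])
coef = np.linalg.solve(gram(X, X), f(X))   # interpolation coefficients

# The interpolant s(x) = sum_j coef_j * k(x, x_j) reproduces f at every
# sample point and extends it to new points in the domain.
Xnew = np.array([[0.3, 0.6], [0.8, 0.2]])
s = gram(Xnew, X) @ coef
```

The stability question the abstract addresses is visible here in miniature: reconstruction hinges on the Gram matrix of the sample set being well conditioned, which is exactly what a "determining" irregular sample set guarantees.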