🤖 AI Summary
This paper investigates the convergence of kernel regression in reproducing kernel Hilbert spaces (RKHS) under irregular sampling. Under weak assumptions, namely Lipschitz or Hölder continuity of the kernel and of the input distribution, explicit error bounds in the RKHS norm are established. For the first time, uniform convergence on compact domains is proven without requiring uniform sampling or density regularity, and optimal algebraic convergence rates are derived. The analysis integrates tools from reproducing kernel theory, functional analysis, and non-uniform sampling theory. Key contributions include: (1) convergence guarantees in both the RKHS norm and the uniform norm; (2) explicit, computable convergence rates for Lipschitz continuous and for Hölder continuous kernels; and (3) a substantial expansion of the theoretical applicability of kernel methods to irregular, sparse, or heterogeneous sampling settings.
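For orientation, kernel regression from scattered samples is commonly formulated as minimum-norm interpolation in the RKHS; the notation below (kernel $k$, sample points $x_1,\dots,x_n$, interpolant $s_n$) is a standard convention assumed here for illustration, not taken verbatim from the paper:

$$
s_n \;=\; \sum_{j=1}^{n} \alpha_j\, k(\cdot, x_j), \qquad K\alpha = y, \quad K_{ij} = k(x_i, x_j), \quad y_j = f(x_j),
$$

and the quantities bounded in results of this type are the RKHS-norm error $\|f - s_n\|_{\mathcal{H}_k}$ and the uniform error $\sup_x |f(x) - s_n(x)|$ on the compact domain.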
📝 Abstract
We analyse the convergence of sampling algorithms for functions in reproducing kernel Hilbert spaces (RKHS). To this end, we discuss approximation properties of kernel regression under minimalistic assumptions on both the kernel and the input data. We first prove error estimates in the kernel's RKHS norm. This leads us to new results concerning uniform convergence of kernel regression on compact domains. For Lipschitz continuous and Hölder continuous kernels, we prove convergence rates.
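As a concrete toy illustration (not the paper's algorithm), the sketch below fits a Gaussian-kernel regressor to irregularly spaced samples on a compact interval and reports the sup-norm error; the kernel choice, lengthscale, ridge term, target function, and beta-distributed sampling density are all assumptions made for this example.

```python
# Minimal sketch of kernel regression on irregularly sampled data,
# assuming a Gaussian (RBF) kernel and a small ridge term for numerical
# stability. Illustrative only; not the estimator analysed in the paper.
import numpy as np

def gaussian_kernel(x, y, lengthscale=0.2):
    """Gaussian kernel k(x, y) = exp(-|x - y|^2 / (2 * lengthscale^2))."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * lengthscale ** 2))

def fit_kernel_regressor(x_train, y_train, lengthscale=0.2, ridge=1e-10):
    """Solve (K + ridge * I) alpha = y for the coefficient vector alpha."""
    K = gaussian_kernel(x_train, x_train, lengthscale)
    return np.linalg.solve(K + ridge * np.eye(len(x_train)), y_train)

def predict(x_eval, x_train, alpha, lengthscale=0.2):
    """Evaluate s(x) = sum_j alpha_j * k(x, x_j) at the query points."""
    return gaussian_kernel(x_eval, x_train, lengthscale) @ alpha

# Irregular (non-uniform) sample locations on [0, 1]:
# clustered near 0, sparse near 1.
rng = np.random.default_rng(0)
x_train = np.sort(rng.beta(2.0, 5.0, size=40))
y_train = np.sin(2 * np.pi * x_train)

alpha = fit_kernel_regressor(x_train, y_train)
x_eval = np.linspace(0.0, 1.0, 200)
sup_error = np.max(np.abs(predict(x_eval, x_train, alpha) - np.sin(2 * np.pi * x_eval)))
print(f"sup-norm error on [0, 1]: {sup_error:.3e}")
```

Increasing the number of (still irregularly placed) samples should drive the reported sup-norm error down, mirroring the uniform-convergence behaviour discussed above.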