🤖 AI Summary
This paper addresses parameter estimation and asymptotic inference for linear regression with Hilbert-space-valued covariates, such as two-dimensional brain images, when the reproducing kernel is unknown. To avoid estimating this kernel, we propose a kernel-free estimation method based on principal component analysis (PCA): first perform kernel-free PCA to reduce the dimensionality of the covariates, then fit a linear regression in the low-dimensional projected space. We establish the $\sqrt{n}$-consistency and asymptotic normality of the resulting regression coefficient estimator. This work is the first to systematically incorporate Hilbert-space covariates with unknown reproducing kernels into the linear regression framework, circumventing explicit RKHS kernel estimation and thereby substantially improving computational efficiency and robustness. Simulation studies and analyses of real brain imaging data confirm the consistency, predictive accuracy, and large-sample statistical validity of the proposed estimator.
📝 Abstract
We present a new method of linear regression based on principal components for Hilbert-space-valued covariates with unknown reproducing kernels. We develop a computationally efficient approach to estimation and derive asymptotic theory for the regression parameter estimates under mild assumptions. We demonstrate the approach in simulation studies as well as in a data analysis using two-dimensional brain images as predictors.
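The PCA-then-regression pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration under simplifying assumptions, not the authors' implementation: the Hilbert-space covariates (e.g., 2D images) are assumed to be observed on a grid and flattened into vectors, PCA is computed via an SVD of the centered design matrix, and ordinary least squares is fit on the leading principal component scores. The function name `pca_regression` and the choice of the truncation level `k` are illustrative.

```python
import numpy as np

def pca_regression(X, y, k):
    """Project covariates onto the top-k principal components, then fit OLS.

    X : (n, p) array of vectorized covariates (e.g., flattened 2D images).
    y : (n,) response vector.
    k : number of principal components to retain.
    Returns the (p,) coefficient vector mapped back to the original space.
    """
    # Center covariates and response (intercept is absorbed by centering)
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Kernel-free PCA via SVD of the centered design matrix
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V_k = Vt[:k].T              # (p, k): top-k principal directions
    scores = Xc @ V_k           # (n, k): low-dimensional projected covariates
    # Ordinary least squares in the k-dimensional projected space
    beta_k = np.linalg.lstsq(scores, yc, rcond=None)[0]
    # Map the coefficients back to the original covariate space
    return V_k @ beta_k
```

In practice `k` would be chosen data-adaptively (e.g., by an explained-variance threshold or cross-validation); the paper's asymptotic theory concerns the estimator obtained after such a projection step.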