Nonlinear functional regression by functional deep neural network with kernel embedding

📅 2024-01-05
🏛️ arXiv.org
📈 Citations: 4
Influential: 0
🤖 AI Summary
This paper addresses nonlinear regression with infinite-dimensional functional inputs by proposing a kernel-embedded functional deep neural network (KFDNN). Methodologically, KFDNN integrates a data-dependent smooth kernel integral transform, spectral projection onto feature-function bases, and a deep ReLU network to establish a discretization-invariant kernel embedding dimensionality reduction mechanism—preserving joint input–response structural information while enhancing noise robustness and generalization under sparse sampling. Theoretically, KFDNN achieves both low approximation error and favorable generalization bounds. Empirically, it significantly outperforms state-of-the-art functional data analysis (FDA) methods on noisy and sparsely sampled functional datasets. The core contribution lies in the first synergistic integration of kernel embedding and deep networks for functional dimensionality reduction and regression, thereby overcoming the classical limitations of discretization dependence and linear modeling assumptions.
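The discretization invariance claimed above can be illustrated with a small numerical sketch (this is an illustration of the general idea, not the paper's exact construction): the kernel integral transform v(s) = ∫ K(s, t) x(t) dt is approximated by a quadrature sum over the observation grid, so its value barely depends on which grid the input function was sampled on. The function `embed`, the Gaussian kernel, and the grid sizes below are all hypothetical choices.

```python
import numpy as np

def embed(x_vals, grid, s_points, width=0.1):
    """Quadrature approximation of a smooth-kernel integral transform."""
    K = np.exp(-(s_points[:, None] - grid[None, :]) ** 2 / (2 * width ** 2))
    w = np.gradient(grid)                      # per-point quadrature weights
    return K @ (w * x_vals)

f = lambda t: np.sin(2 * np.pi * t)            # toy input function
s = np.linspace(0.0, 1.0, 20)                  # evaluation points of the embedding

coarse = np.linspace(0.0, 1.0, 40)             # sparse observation grid
fine = np.linspace(0.0, 1.0, 400)              # dense observation grid
v_coarse = embed(f(coarse), coarse, s)
v_fine = embed(f(fine), fine, s)

# The two embeddings nearly coincide despite the 10x difference in grid density.
print(np.max(np.abs(v_coarse - v_fine)))
```

Because the smooth kernel averages the raw samples, the embedded representation is also less sensitive to observation noise than pointwise discretization, which is the intuition behind the robustness claim.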

📝 Abstract
With the rapid development of deep learning across fields of science and technology such as speech recognition, image classification, and natural language processing, it has recently also been applied to functional data analysis (FDA) with some empirical success. However, because the input is infinite-dimensional, functional learning tasks require a powerful dimension reduction method, especially for nonlinear functional regression. In this paper, based on the idea of a smooth kernel integral transformation, we propose a functional deep neural network with an efficient and fully data-dependent dimension reduction method. The architecture of our functional net consists of a kernel embedding step, an integral transformation with a data-dependent smooth kernel; a projection step, a dimension reduction by projection onto the eigenfunction basis of the embedding kernel; and finally an expressive deep ReLU neural network for prediction. The smooth kernel embedding makes our functional net discretization invariant, efficient, and robust to noisy observations; it exploits information in both the input functions and the response data, and it requires few discrete observation points to maintain generalization performance. We conduct theoretical analysis, including approximation error and generalization error analysis, and numerical simulations to verify these advantages of our functional net.
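The three-stage architecture described in the abstract can be sketched end to end. The sketch below is a minimal illustration under assumed choices (a Gaussian kernel on a uniform grid, eigenfunctions estimated from the kernel matrix, and a random untrained two-layer ReLU net standing in for the trained predictor); the paper's kernel is data-dependent rather than fixed.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: n input functions observed at m grid points on [0, 1].
n, m, d = 50, 100, 5                       # samples, grid points, reduced dimension
t = np.linspace(0.0, 1.0, m)
X = np.sin(2 * np.pi * np.outer(rng.uniform(1, 3, n), t))  # toy functional inputs
X += 0.1 * rng.standard_normal((n, m))                     # noisy observations

# Step 1 — kernel embedding: integral transform with a smooth (here Gaussian)
# kernel, approximated by a Riemann sum over the observation grid.
width = 0.1
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * width ** 2))  # m x m kernel matrix
E = X @ K / m                                                   # embedded functions

# Step 2 — projection: eigenfunctions of the embedding kernel (Mercer expansion),
# estimated from the discretized kernel matrix; keep the top-d components.
eigvals, eigvecs = np.linalg.eigh(K / m)
phi = eigvecs[:, ::-1][:, :d]              # leading eigenvectors ~ eigenfunctions
Z = E @ phi / m                            # d-dimensional coefficients per sample

# Step 3 — deep ReLU network on the coefficients (one untrained forward pass,
# for shape illustration only).
W1, b1 = rng.standard_normal((d, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 1)), np.zeros(1)
pred = np.maximum(Z @ W1 + b1, 0.0) @ W2 + b2

print(Z.shape, pred.shape)                 # (50, 5) (50, 1)
```

The key design point is that steps 1 and 2 compress each infinite-dimensional input into a fixed d-dimensional coefficient vector, so the downstream ReLU network is an ordinary finite-dimensional regressor.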
Problem

Research questions and friction points this paper is trying to address.

Develops adaptive dimension reduction for functional data
Proposes kernel embedding and neural network for regression
Analyzes approximation rates and generalization performance
Innovation

Methods, ideas, or system contributions that make the work stand out.

Kernel embedding with adaptive smooth kernel
Dimension reduction via projection onto Mercer kernel eigenfunctions
Deep ReLU network for nonlinear prediction
Zhongjie Shi
Department of Electrical Engineering, ESAT-STADIUS, KU Leuven, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium
Jun Fan
Department of Mathematics, Hong Kong Baptist University, Kowloon, Hong Kong
Linhao Song
School of Mathematical Science, Beihang University, Beijing, China; School of Data Science, City University of Hong Kong, Kowloon, Hong Kong
Ding-Xuan Zhou
University of Sydney
theory of deep learning, statistical learning, wavelets, approximation theory
J. Suykens
Department of Electrical Engineering, ESAT-STADIUS, KU Leuven, Kasteelpark Arenberg 10, B-3001 Leuven, Belgium