Two-Dimensional Deep ReLU CNN Approximation for Korobov Functions: A Constructive Approach

📅 2025-03-11
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work addresses the curse of dimensionality in high-dimensional function approximation by investigating the approximation capacity of two-dimensional deep ReLU convolutional neural networks (CNNs) for the Korobov function class. We propose a fully constructive approach: designing a CNN architecture comprising zero-padding, multi-channel convolutional layers, and a fully connected output layer, and explicitly constructing all network parameters under the continuous-weight model. We establish, for the first time, a near-optimal approximation rate for two-dimensional CNNs on the Korobov class—achieving an approximation error bound of $O(N^{-\alpha}\log N)$, where $N$ denotes the total number of parameters and $\alpha$ is determined by the function's smoothness. This result rigorously characterizes the expressive power of two-dimensional CNNs and reveals their efficient dimensional adaptivity to high-dimensional structured functions. It provides a foundational theoretical guarantee for deep convolutional models in approximating smooth multivariate functions.
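The summary describes an architecture built from zero-padded multi-channel convolutional layers with ReLU activations and a fully connected output layer. The following is a minimal, illustrative sketch of those three components in plain Python; it is not the paper's explicit parameter construction, and all sizes and weights below are hypothetical.

```python
# Minimal sketch of the described architecture pieces (illustrative only,
# not the paper's constructive parameter choice): a zero-padded 2D
# convolution, a ReLU activation, and a fully connected output layer.

def conv2d_zero_pad(x, kernel):
    """2D convolution with zero-padding so the output keeps x's shape."""
    h, w = len(x), len(x[0])
    kh, kw = len(kernel), len(kernel[0])
    ph, pw = kh // 2, kw // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            s = 0.0
            for a in range(kh):
                for b in range(kw):
                    ii, jj = i + a - ph, j + b - pw
                    if 0 <= ii < h and 0 <= jj < w:  # outside = zero padding
                        s += kernel[a][b] * x[ii][jj]
            out[i][j] = s
    return out

def relu(x):
    """Elementwise ReLU on a 2D feature map."""
    return [[max(0.0, v) for v in row] for row in x]

def fully_connected(x, weights, bias):
    """Flatten the feature map and produce one affine output."""
    flat = [v for row in x for v in row]
    return sum(w * v for w, v in zip(weights, flat)) + bias

# Tiny forward pass with arbitrary illustrative parameters.
x = [[1.0, 2.0], [3.0, 4.0]]
k = [[0.0, 1.0, 0.0], [1.0, -4.0, 1.0], [0.0, 1.0, 0.0]]  # Laplacian-style stencil
features = relu(conv2d_zero_pad(x, k))
y = fully_connected(features, [0.25] * 4, 0.1)
```

A real multi-channel network would stack several such convolution-plus-ReLU layers, each with many kernels, before the fully connected readout; this sketch keeps one channel to stay readable.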

📝 Abstract
This paper investigates approximation capabilities of two-dimensional (2D) deep convolutional neural networks (CNNs), with Korobov functions serving as a benchmark. We focus on 2D CNNs, comprising multi-channel convolutional layers with zero-padding and ReLU activations, followed by a fully connected layer. We propose a fully constructive approach for building 2D CNNs to approximate Korobov functions and provide rigorous analysis of the complexity of the constructed networks. Our results demonstrate that 2D CNNs achieve near-optimal approximation rates under the continuous weight selection model, significantly alleviating the curse of dimensionality. This work provides a solid theoretical foundation for 2D CNNs and illustrates their potential for broader applications in function approximation.
Problem

Research questions and friction points this paper is trying to address.

Investigates 2D CNN approximation for Korobov functions.
Proposes constructive approach for 2D CNN network design.
Demonstrates near-optimal rates, reducing dimensionality curse.
Innovation

Methods, ideas, or system contributions that make the work stand out.

2D CNNs with ReLU for Korobov functions
Constructive approach for network building
Near-optimal rates, reduces dimensionality curse
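The near-optimal rate claimed above has the form $O(N^{-\alpha}\log N)$ in the number of parameters $N$. As a quick sanity check of the shape of this bound (constants dropped, and with a purely hypothetical smoothness exponent $\alpha = 2$), one can tabulate how fast the bound decays as the network grows:

```python
import math

# Illustrative only: the shape of the stated bound N^(-alpha) * log N,
# with a hypothetical alpha = 2 and all constants ignored.
def bound(n, alpha=2.0):
    return n ** (-alpha) * math.log(n)

# The polynomial factor dominates: the bound shrinks rapidly with N,
# and the log N factor only mildly slows the decay.
rates = [bound(10 ** k) for k in range(1, 4)]  # N = 10, 100, 1000
```

The logarithmic factor is why the rate is called "near-optimal" rather than optimal: it matches the polynomial decay of the benchmark lower bound up to that log term.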
Qin Fang
Information and Engineering College, Dalian University, Dalian 116622, China
Lei Shi
School of Mathematical Sciences and Shanghai Key Laboratory for Contemporary Applied Mathematics, Fudan University, Shanghai 200433, China
Min Xu
School of Mathematical Sciences, Dalian University of Technology, Dalian 116024, China
Ding-Xuan Zhou
University of Sydney
theory of deep learning, statistical learning, wavelets, approximation theory