Sparse RBF Networks for PDEs and nonlocal equations: function space theory, operator calculus, and training algorithms

📅 2026-01-24
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work proposes SparseRBFnet, a novel framework for efficiently solving nonlinear partial differential equations involving nonlocal operators such as the fractional Laplacian. By integrating adaptive-width shallow kernel networks with anisotropic kernel parameterization, the method achieves sparse representations of solutions and quasi-analytical evaluation of operators within a unified Besov space setting. A key innovation lies in decoupling the solution space from kernel selection, enabling explicit and robust operator evaluation. The authors further introduce a three-stage training strategy that combines inner-weight optimization with second-order methods. Extensive benchmarks on high-dimensional, high-order, and fractional PDEs demonstrate that SparseRBFnet attains high accuracy while significantly enhancing sparsity and computational efficiency.
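The "quasi-analytical evaluation of operators" mentioned above exploits the fact that differential operators applied to a kernel expansion reduce to closed-form expressions per kernel. As a minimal sketch (not the paper's implementation), the Laplacian of a shallow network of isotropic Gaussian RBFs, u(x) = Σ_j w_j exp(-‖x - c_j‖²/ε_j²), can be evaluated exactly via Δφ_j = (4‖x - c_j‖²/ε_j⁴ - 2d/ε_j²) φ_j; the centers, widths, and weights below are illustrative placeholders:

```python
import numpy as np

def rbf_laplacian_demo():
    rng = np.random.default_rng(0)
    d, m = 2, 5                       # dimension, number of kernels
    centers = rng.normal(size=(m, d)) # kernel centers c_j
    widths = 0.5 + rng.random(m)      # per-kernel widths (adaptive in the paper)
    weights = rng.normal(size=m)      # outer weights w_j

    def u(x):
        # u(x) = sum_j w_j * exp(-||x - c_j||^2 / eps_j^2)
        r2 = np.sum((x - centers) ** 2, axis=1)
        return weights @ np.exp(-r2 / widths ** 2)

    def lap_u(x):
        # closed-form Laplacian of the Gaussian expansion (no autodiff needed)
        r2 = np.sum((x - centers) ** 2, axis=1)
        phi = np.exp(-r2 / widths ** 2)
        return weights @ ((4 * r2 / widths ** 4 - 2 * d / widths ** 2) * phi)

    # sanity check against a central finite-difference estimate
    x = np.array([0.3, -0.1])
    h = 1e-4
    fd = sum((u(x + h * e) - 2 * u(x) + u(x - h * e)) / h ** 2
             for e in np.eye(d))
    return lap_u(x), fd
```

The same per-kernel closed-form idea is what makes nonlocal operators such as the fractional Laplacian tractable for radial kernels, since the operator acts on each kernel independently of the learned weights.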

📝 Abstract
This work presents a systematic analysis and extension of the sparse radial basis function network (SparseRBFnet) previously introduced for solving nonlinear partial differential equations (PDEs). Based on its adaptive-width shallow kernel network formulation, we further investigate its function-space characterization, operator evaluation, and computational algorithm. We provide a unified description of the solution space for a broad class of radial basis functions (RBFs). Under mild assumptions, this space admits a characterization as a Besov space, independent of the specific kernel choice. We further demonstrate how the explicit kernel-based structure enables quasi-analytical evaluation of both differential and nonlocal operators, including fractional Laplacians. On the computational side, we study the adaptive-width network and related three-phase training strategy through comparisons with variants that differ in modeling and algorithmic details. In particular, we assess the roles of second-order optimization, inner-weight training, network adaptivity, and anisotropic kernel parameterizations. Numerical experiments on high-order, fractional, and anisotropic PDE benchmarks illustrate the empirical insensitivity to kernel choice, as well as the resulting trade-offs between accuracy, sparsity, and computational cost. Collectively, these results consolidate and generalize the theoretical and computational framework of SparseRBFnet, supporting accurate sparse representations with efficient operator evaluation and offering theory-grounded guidance for algorithmic and modeling choices.
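The anisotropic kernel parameterization assessed in the abstract can be illustrated by replacing the scalar width with a matrix. As a hedged sketch (the exact parameterization used by the authors is not specified here), a kernel φ(x) = exp(-‖L(x - c)‖²) with a learnable matrix L generalizes the isotropic Gaussian, recovering it when L is a scaled identity:

```python
import numpy as np

def anisotropic_gaussian(x, c, L):
    """phi(x) = exp(-||L (x - c)||^2): the matrix L controls the
    kernel's orientation and per-direction width."""
    z = L @ (x - c)
    return np.exp(-z @ z)

# isotropic special case: L = (1/eps) * I reproduces exp(-||x - c||^2 / eps^2)
x = np.array([0.5, -0.2])
c = np.zeros(2)
eps = 0.7
iso = anisotropic_gaussian(x, c, np.eye(2) / eps)
ref = np.exp(-np.sum((x - c) ** 2) / eps ** 2)
```

For anisotropic PDE benchmarks, letting each kernel learn its own L allows the representation to align with directionally varying solution features at the cost of extra inner weights per kernel.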
Problem

Research questions and friction points this paper is trying to address.

Sparse RBF Networks
Partial Differential Equations
Nonlocal Equations
Function Space Theory
Operator Calculus
Innovation

Methods, ideas, or system contributions that make the work stand out.

SparseRBFnet
Besov space
nonlocal operators
adaptive-width network
fractional Laplacian