Solving PDEs in One Shot via Fourier Features with Exact Analytical Derivatives

📅 2026-02-11
📈 Citations: 0
Influential: 0
📄 PDF
🤖 AI Summary
This work proposes an approach to solving partial differential equations (PDEs) that avoids the high computational cost and the reliance on iterative optimization or automatic differentiation inherent in physics-informed neural networks (PINNs). Sine-based random Fourier features admit analytically tractable, cyclic higher-order derivatives: a derivative of any order is again a single sinusoid, computable in O(1) operations. This lets the method solve linear PDEs directly via a single least-squares solve over frozen features, and handle nonlinear PDEs through Newton–Raphson iteration in which each linearized step is itself a least-squares solve. Operator assembly thus requires no automatic differentiation, establishing a true “solve-once” framework. Evaluated on 17 benchmark PDEs in one to six dimensions, the approach attains relative L² errors of 10⁻⁷ on linear problems in 0.07 seconds and 10⁻⁸–10⁻⁹ on nonlinear problems in under 9 seconds, significantly outperforming existing PINN methods in both accuracy and speed.
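The cyclic derivative property the summary refers to is the identity dⁿ/dxⁿ sin(wx + b) = wⁿ sin(wx + b + nπ/2): each differentiation multiplies by the frequency and shifts the phase by π/2. A minimal sketch (illustrative only, not the paper's code) checks it against finite differences:

```python
import numpy as np

def sin_deriv(x, w, b, n):
    """n-th derivative of sin(w*x + b) via the cyclic identity:
    d^n/dx^n sin(w*x + b) = w**n * sin(w*x + b + n*pi/2),
    a single sinusoid computable in O(1) operations for any order n."""
    return w**n * np.sin(w * x + b + n * np.pi / 2)

# Sanity check against central finite differences for orders 1 and 2.
w, b, x, h = 1.7, 0.3, 0.5, 1e-4
fd1 = (np.sin(w*(x + h) + b) - np.sin(w*(x - h) + b)) / (2 * h)
fd2 = (np.sin(w*(x + h) + b) - 2*np.sin(w*x + b) + np.sin(w*(x - h) + b)) / h**2
assert abs(sin_deriv(x, w, b, 1) - fd1) < 1e-6
assert abs(sin_deriv(x, w, b, 2) - fd2) < 1e-5
```

By contrast, the n-th derivative of tanh(wx + b) expands into a growing sum of terms, which is why tanh-based one-shot methods fall back on automatic differentiation.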

📝 Abstract
Recent random feature methods for solving partial differential equations (PDEs) reduce computational cost compared to physics-informed neural networks (PINNs) but still rely on iterative optimization or expensive derivative computation. We observe that sinusoidal random Fourier features possess a cyclic derivative structure: the derivative of any order of $\sin(\mathbf{W}\cdot\mathbf{x}+b)$ is a single sinusoid with a monomial prefactor, computable in $O(1)$ operations. Alternative activations such as $\tanh$, used in prior one-shot methods like PIELM, lack this property: their higher-order derivatives grow as $O(2^n)$ terms, requiring automatic differentiation for operator assembly. We propose FastLSQ, which combines frozen random Fourier features with analytical operator assembly to solve linear PDEs via a single least-squares call, and extend it to nonlinear PDEs via Newton--Raphson iteration where each linearized step is a FastLSQ solve. On a benchmark of 17 PDEs spanning 1 to 6 dimensions, FastLSQ achieves relative $L^2$ errors of $10^{-7}$ in 0.07\,s on linear problems, three orders of magnitude more accurate and significantly faster than state-of-the-art iterative PINN solvers, and $10^{-8}$ to $10^{-9}$ on nonlinear problems via Newton iteration in under 9\,s.
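The "single least-squares call" idea can be sketched on a toy problem. The snippet below is a hedged illustration, not the paper's FastLSQ implementation: the feature count, frequency scale, and the 1D Poisson problem are all assumptions chosen for the sketch. Frozen features sin(wx + b) and their exact second derivatives −w² sin(wx + b) (the cyclic identity with n = 2) are assembled into one linear system over collocation and boundary points, then solved once:

```python
import numpy as np

rng = np.random.default_rng(0)
M = 200                                  # number of frozen features (assumed)
w = rng.normal(0.0, 10.0, M)             # frozen random frequencies
b = rng.uniform(0.0, 2 * np.pi, M)       # frozen random phases

phi    = lambda x: np.sin(x * w + b)             # features; x has shape (N, 1)
phi_xx = lambda x: -(w**2) * np.sin(x * w + b)   # exact 2nd derivative (cyclic identity)

# Toy linear PDE (not one of the paper's benchmarks):
#   u''(x) = -pi^2 sin(pi x) on [0, 1], u(0) = u(1) = 0, exact u(x) = sin(pi x).
xc = np.linspace(0.0, 1.0, 400)[:, None]         # interior collocation points
A = np.vstack([phi_xx(xc),                       # operator rows, assembled analytically
               phi(np.array([[0.0]])),           # boundary condition rows
               phi(np.array([[1.0]]))])
rhs = np.concatenate([-np.pi**2 * np.sin(np.pi * xc[:, 0]), [0.0], [0.0]])

coef, *_ = np.linalg.lstsq(A, rhs, rcond=None)   # the single least-squares solve

xt = np.linspace(0.0, 1.0, 1000)[:, None]
u_exact = np.sin(np.pi * xt[:, 0])
rel_err = np.linalg.norm(phi(xt) @ coef - u_exact) / np.linalg.norm(u_exact)
print(f"relative L2 error: {rel_err:.1e}")
```

No gradient descent and no automatic differentiation appear anywhere: the derivative columns of `A` are written down in closed form, so the whole solve is one `lstsq` call. For a nonlinear PDE, the abstract's Newton–Raphson extension would repeat a linearized version of this solve at each iteration.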
Problem

Research questions and friction points this paper is trying to address.

partial differential equations
random Fourier features
derivative computation
iterative optimization
high-order derivatives
Innovation

Methods, ideas, or system contributions that make the work stand out.

Fourier features
analytical derivatives
FastLSQ
one-shot PDE solver
least-squares method