🤖 AI Summary
This paper addresses two closely related approximation problems: (1) the optimal rate at which an arbitrary zonoid in $\mathbb{R}^{d+1}$ can be approximated in the Hausdorff distance by a sum of $n$ line segments; and (2) the optimal approximation rates in the uniform norm for shallow ReLU$^k$ neural networks on their variation spaces. Methodologically, the work combines geometric analysis, the theory of approximation of convex bodies, and the variation-space framework for ReLU$^k$ networks. For problem (1), it closes the long-standing logarithmic gap between the best upper and lower bounds in the cases $d = 2,3$, thereby establishing tight approximation rates in every dimension. For problem (2), it significantly improves the known approximation rates for all $k \geq 1$ and, crucially, achieves simultaneous uniform approximation of both the target function and its derivatives up to order $k$. The results constitute the strongest uniform approximation guarantees currently known for shallow ReLU$^k$ networks on their variation spaces.
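For concreteness, here is a standard formulation of the objects in problem (2); the notation is ours and not taken verbatim from the paper. A shallow ReLU$^k$ network of width $n$ on $\mathbb{R}^d$ is a superposition of truncated-power ridge functions,

$$
f_n(x) = \sum_{i=1}^{n} a_i \, \sigma_k\big(\langle \omega_i, x \rangle + b_i\big), \qquad \sigma_k(t) := \max(0, t)^k,
$$

with coefficients $a_i, b_i \in \mathbb{R}$ and directions $\omega_i \in \mathbb{R}^d$. The associated variation space consists, roughly speaking, of functions admitting an integral representation $f(x) = \int \sigma_k(\langle \omega, x \rangle + b) \, d\mu(\omega, b)$ with finite total variation $\|\mu\|$, and the rates in question measure how fast $\inf_{f_n} \|f - f_n\|_{L^\infty}$ decays in $n$ over the unit ball of this space.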
📝 Abstract
We study the following two related problems. The first is to determine to what error an arbitrary zonoid in $\mathbb{R}^{d+1}$ can be approximated in the Hausdorff distance by a sum of $n$ line segments. The second is to determine optimal approximation rates in the uniform norm for shallow ReLU$^k$ neural networks on their variation spaces. The first of these problems has been solved for $d \neq 2,3$, but when $d = 2,3$ a logarithmic gap between the best upper and lower bounds remains. We close this gap, which completes the solution in all dimensions. For the second problem, our techniques significantly improve upon existing approximation rates when $k \geq 1$, and enable uniform approximation of both the target function and its derivatives.
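To make problem (1) precise, recall the standard convex-geometry definitions (paraphrased here, not quoted from the abstract): a zonotope with $n$ summands is a Minkowski sum of $n$ line segments,

$$
Z_n = \sum_{j=1}^{n} [0, v_j] = \Big\{ \sum_{j=1}^{n} t_j v_j : t_j \in [0,1] \Big\}, \qquad v_j \in \mathbb{R}^{d+1},
$$

a zonoid is a Hausdorff limit of such bodies, and the problem asks how small $d_H(K, Z_n)$ can be made uniformly over zonoids $K \subset \mathbb{R}^{d+1}$. The bridge between the two problems is the classical fact that the Hausdorff distance between convex bodies equals the uniform distance between their support functions on the sphere, and the support function of the segment $[0, v]$ is exactly the ReLU ridge function $x \mapsto \max(0, \langle v, x \rangle)$; thus approximating a zonoid by $n$ segments is a uniform approximation problem for shallow ReLU networks, which is how sharp zonoid rates feed into problem (2).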