Minimum Norm Interpolation via The Local Theory of Banach Spaces: The Role of $2$-Uniform Convexity

📅 2026-03-30
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This work investigates the generalization performance of minimum-norm interpolation (MNI) in overparameterized models under norms not induced by inner products. Focusing on settings where closed-form solutions are unavailable, the authors establish sharp generalization bounds for linear MNI with non-Gaussian covariates by leveraging the assumption of 2-uniform convexity, together with tools from the local theory of Banach spaces, the geometry of isotropic convex bodies, and the analysis of sub-Gaussian random vectors. The key contributions include elucidating the pivotal roles of isotropic position and 2-uniform convexity in MNI analysis, proving tightness of the upper bound on the bias of MNI in linear regression, and deriving optimal generalization bounds for $\ell_p$-MNI in the regime $p \in (1 + C/\log d, 2]$.
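For reference, the $2$-uniform convexity property invoked here is the standard notion from the Banach-space literature (this sketch is background, not a definition taken from the paper): a norm $\|\cdot\|$ is $2$-uniformly convex with constant $\beta \ge 1$ if for all $x, y$,

```latex
\left\| \frac{x+y}{2} \right\|^2 + \frac{1}{\beta^2} \left\| \frac{x-y}{2} \right\|^2 \;\le\; \frac{\|x\|^2 + \|y\|^2}{2}.
```

For $\ell_p$ with $p \in (1, 2]$, the Ball–Carlen–Lieb inequality gives this with $1/\beta^2 = p - 1$. The threshold $p > 1 + C/\log d$ is also where $\|x\|_p$ and $\|x\|_1$ differ only by a constant factor: $\|x\|_1 \le d^{1 - 1/p}\|x\|_p$ and $d^{1-1/p} \le d^{C/\log d} = e^C$ at $p = 1 + C/\log d$.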
πŸ“ Abstract
The minimum-norm interpolator (MNI) framework has recently attracted considerable attention as a tool for understanding generalization in overparameterized models, such as neural networks. In this work, we study the MNI under a $2$-uniform convexity assumption, which is weaker than requiring the norm to be induced by an inner product; in this setting the interpolator typically does not admit a closed-form solution. At a high level, we show that this condition yields an upper bound on the MNI bias in both linear and nonlinear models. We further show that this bound is sharp for overparameterized linear regression when the unit ball of the norm is in isotropic (or John's) position and the covariates are isotropic, symmetric, i.i.d. sub-Gaussian, such as vectors with i.i.d. Bernoulli entries. Finally, under the same assumption on the covariates, we prove sharp generalization bounds for the $\ell_p$-MNI when $p \in \bigl(1 + C/\log d, 2\bigr]$. To the best of our knowledge, this is the first work to establish sharp bounds for non-Gaussian covariates in linear models when the norm is not induced by an inner product. This work is deeply inspired by classical works on $K$-convexity and more modern work on the geometry of 2-uniform and isotropic convex bodies.
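As a concrete illustration of the object under study (a minimal numerical sketch, not the paper's method): for the one norm that does admit a closed form, the minimum $\ell_2$-norm interpolator of an underdetermined linear system is the pseudoinverse solution $\hat w = X^{+} y$. The covariate model below uses the i.i.d. Bernoulli ($\pm 1$) entries mentioned in the abstract; the sparse ground truth is hypothetical.

```python
import numpy as np

# Minimal sketch of minimum l2-norm interpolation in the
# overparameterized regime (d > n); names are illustrative.
rng = np.random.default_rng(0)
n, d = 20, 100                             # n samples, d features: overparameterized
X = rng.choice([-1.0, 1.0], size=(n, d))   # isotropic i.i.d. Bernoulli (+-1) covariates,
                                           # the sub-Gaussian example from the abstract
w_star = np.zeros(d)
w_star[:5] = 1.0                           # hypothetical sparse signal
y = X @ w_star                             # noiseless labels

# Minimum l2-norm solution of the underdetermined system X w = y.
w_mni = np.linalg.pinv(X) @ y

print(np.allclose(X @ w_mni, y))                           # True: interpolates the data exactly
print(np.linalg.norm(w_mni) <= np.linalg.norm(w_star))     # True: smallest l2 norm among interpolants
```

For the non-Hilbertian norms the paper treats (e.g. $\ell_p$ with $p < 2$), no such closed form exists, and the interpolator would instead be computed by constrained convex optimization.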
Problem

Research questions and friction points this paper is trying to address.

minimum-norm interpolation
2-uniform convexity
overparameterized models
non-Gaussian covariates
generalization bounds
Innovation

Methods, ideas, or system contributions that make the work stand out.

minimum-norm interpolation
2-uniform convexity
sharp generalization bounds
isotropic position
non-Gaussian covariates