On the damage of interpolation to adversarial robustness in regression

📅 2026-01-22
📈 Citations: 0
Influential: 0
🤖 AI Summary
This study investigates the impact of interpolation—i.e., achieving zero training error—on adversarial robustness in regression tasks, with a focus on performance degradation under future input perturbations (X-attacks). Within a nonparametric regression framework, the authors combine minimax convergence rate analysis with numerical experiments to uncover, for the first time, the theoretical mechanism by which interpolation severely undermines adversarial robustness in high-dimensional settings. They identify a “curse of sample size”: as the number of samples grows, interpolating estimators become increasingly suboptimal even under arbitrarily small X-attacks. This finding challenges the prevailing notion that interpolation is benign, demonstrating instead that perfect fit to training data can critically compromise a model’s resilience to input perturbations.
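The phenomenon described above can be illustrated with a small numerical sketch (not the paper's actual experiments): a 1-nearest-neighbor estimator, which interpolates the training data exactly, is compared against a k-NN smoother under a small worst-case input perturbation (an $X$-attack of radius `eps`). The toy regression function, noise level, and attack grid below are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonparametric regression: y = f(x) + noise, x in [0, 1]
f = lambda x: np.sin(2 * np.pi * x)
n = 200
X = rng.uniform(0, 1, n)
y = f(X) + rng.normal(0, 0.3, n)

def interp_1nn(x0):
    # 1-NN estimator: interpolates the training data (zero training error)
    return y[np.abs(X - x0[:, None]).argmin(axis=1)]

def knn(x0, k=15):
    # k-NN smoother: averages neighbors, so it does not interpolate the noise
    idx = np.abs(X - x0[:, None]).argsort(axis=1)[:, :k]
    return y[idx].mean(axis=1)

# X-attack: worst squared error over perturbations |delta| <= eps of the input
eps = 0.01
Xtest = rng.uniform(0.05, 0.95, 500)
deltas = np.linspace(-eps, eps, 21)

def adv_mse(est):
    preds = np.stack([est(Xtest + d) for d in deltas])   # (21, 500)
    worst = np.max((preds - f(Xtest)) ** 2, axis=0)      # worst case per test point
    return worst.mean()

print("1-NN (interpolating), adversarial MSE:", adv_mse(interp_1nn))
print("15-NN (smoothing),    adversarial MSE:", adv_mse(knn))
```

Under this setup the interpolating 1-NN estimator incurs a markedly larger adversarial risk than the smoother, since an arbitrarily small shift of the input can snap the prediction onto a noisy training label, which is the qualitative behavior the paper analyzes.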

📝 Abstract
Deep neural networks (DNNs) typically involve a large number of parameters and are trained to achieve zero or near-zero training error. Despite such interpolation, they often exhibit strong generalization performance on unseen data, a phenomenon that has motivated extensive theoretical investigation. Reassuringly, existing results show that interpolation may not affect the minimax rate of convergence under the squared error loss. Meanwhile, DNNs are well known to be highly vulnerable to adversarial perturbations of future inputs. A natural question then arises: can interpolation also escape suboptimal performance under a future $X$-attack? In this paper, we investigate the adversarial robustness of interpolating estimators in a nonparametric regression framework. A key finding is that interpolating estimators must be suboptimal even under a subtle future $X$-attack, and that achieving a perfect fit can substantially damage their robustness. An interesting phenomenon in the high interpolation regime, which we term the curse of sample size, is also revealed and discussed. Numerical experiments support our theoretical findings.
Problem

Research questions and friction points this paper is trying to address.

interpolation
adversarial robustness
regression
X-attack
nonparametric regression
Innovation

Methods, ideas, or system contributions that make the work stand out.

interpolation
adversarial robustness
nonparametric regression
X-attack
curse of sample size