Critical Points of Degenerate Metrics on Algebraic Varieties: A Tale of Overparametrization

📅 2025-12-24
📈 Citations: 0
Influential: 0
🤖 AI Summary
This work investigates the distribution of critical points of degenerate quadratic optimization problems over algebraic varieties, the setting that arises in overparameterized machine learning. When the number of training samples is smaller than the model's degrees of freedom, the quadratic form defining the empirical loss is rank-deficient. To handle this, the authors introduce a projection that reduces the degenerate problem to a nondegenerate one, and they identify the ramification locus of this projection as the structural feature governing critical points in the highly degenerate regime. Extending Euclidean distance degree (EDD) theory to degenerate metrics, they establish a correspondence between the ramification locus and the critical points of overparameterized models, provide tools for counting critical points over projective varieties, and discuss specific cases arising from deep learning, thereby broadening the applicability of EDD theory to modern optimization.

📝 Abstract
We study the critical points over an algebraic variety of an optimization problem defined by a quadratic objective that is degenerate. This scenario arises in machine learning when the dataset size is small with respect to the model, and is typically referred to as overparametrization. Our main result relates the degenerate optimization problem to a nondegenerate one via a projection. In the highly-degenerate regime, we find that a central role is played by the ramification locus of the projection. Additionally, we provide tools for counting the number of critical points over projective varieties, and discuss specific cases arising from deep learning. Our work bridges tools from algebraic geometry with ideas from machine learning, and it extends the line of literature around the Euclidean distance degree to the degenerate setting.
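As a toy illustration of the degenerate setting described above (our example, not one from the paper): minimize a rank-deficient quadratic, the squared distance measured only in the x-coordinate (metric diag(1, 0)), over the unit circle. In this sketch the critical points split into the fiber of the projection onto the x-axis over the critical point x = u₁ of the projected problem, plus the two points (±1, 0) where that projection ramifies, echoing the role the abstract assigns to the ramification locus. The circle, the data value u₁ = 1/2, and the choice of metric are illustrative assumptions.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam', real=True)

# Variety: the unit circle g = 0 (a toy stand-in for the algebraic
# varieties in the paper).
g = x**2 + y**2 - 1

# Degenerate quadratic objective: distance measured only in the
# x-coordinate, i.e. the quadratic form diag(1, 0) is rank-deficient.
u1 = sp.Rational(1, 2)  # data point, seen only through the x-axis projection
f = (x - u1)**2

# Lagrange conditions: grad f = lam * grad g, together with g = 0.
eqs = [sp.diff(f, x) - lam * sp.diff(g, x),
       sp.diff(f, y) - lam * sp.diff(g, y),
       g]
sols = sp.solve(eqs, [x, y, lam], dict=True)
for s in sols:
    print(s[x], s[y])
```

The system has four real solutions: the fiber (1/2, ±√3/2) over the projected critical point, and the ramification points (±1, 0) of the projection onto the x-axis; the nondegenerate distance to the circle would have only two critical points.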
Problem

Research questions and friction points this paper is trying to address.

Study critical points of degenerate quadratic objectives on algebraic varieties
Address overparametrization in machine learning with small datasets
Extend Euclidean distance degree theory to degenerate optimization settings
Innovation

Methods, ideas, or system contributions that make the work stand out.

A projection transforms the degenerate problem into a nondegenerate one
The ramification locus of the projection governs the highly degenerate regime
Algebraic-geometric tools count critical points over projective varieties
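For contrast with the degenerate case, a minimal sketch of the nondegenerate count that EDD theory classically provides: the Euclidean distance degree of an ellipse is 4, recoverable by writing the Lagrange conditions and eliminating the multiplier and one variable with resultants. The ellipse x²/4 + y² = 1 and the generic data point (1, 2) are our illustrative choices, not taken from the paper.

```python
import sympy as sp

x, y, lam = sp.symbols('x y lam')

# Variety: an ellipse (illustrative choice; any smooth conic works).
g = x**2/4 + y**2 - 1

# Nondegenerate objective: squared Euclidean distance to a generic point.
u1, u2 = 1, 2
f = (x - u1)**2 + (y - u2)**2

# Lagrange conditions grad f = lam * grad g; eliminate lam, then y,
# to get a univariate polynomial whose degree counts the complex
# critical points, i.e. the Euclidean distance degree.
eq1 = sp.diff(f, x) - lam * sp.diff(g, x)
eq2 = sp.diff(f, y) - lam * sp.diff(g, y)
r = sp.resultant(eq1, eq2, lam)  # eliminate the multiplier
q = sp.resultant(r, g, y)        # eliminate y against the ellipse
print(sp.degree(q, x))           # ED degree of the ellipse: 4
```

The degree counts complex critical points; how many of them are real depends on where the data point sits relative to the evolute of the ellipse.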